

“I see a pattern, but my imagination cannot picture the maker of that pattern. I see the clock, but I cannot envision the clockmaker. The human mind is unable to conceive of the four dimensions, so how can it conceive of God, before whom a thousand years and a thousand dimensions are as one?”

- Albert Einstein

If Machine Learning, Mobile Development, Software Engineering, etc. are different arts of sword fighting, Competitive Programming is the blade of your sword.

To use a Deep Neural Network for Image Recognition, you don’t need to understand Linear Algebra. But to understand a Deep Neural Network, you do need to understand Linear Algebra. The fun fact is, to do Image Recognition you don’t even need a Deep Neural Network at all: you can simply use AWS, Watson, or the Google Vision API, or maybe a GitHub repo. But then calling yourself a Data Scientist is a criminal offence.

Most of you will have heard of TensorFlow (Google’s Deep Learning library). A ‘Tensor’ is nothing but a higher-dimensional matrix. Linear Algebra gives a practical and scalable way of framing optimization algorithms like Gradient Descent and Limited-Memory BFGS (L-BFGS).
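As a minimal sketch of that framing (with made-up data, not taken from the article), Gradient Descent on a least-squares objective really is just a loop of matrix-vector operations:

```python
import numpy as np

# Minimal sketch (illustrative data): Gradient Descent on the
# least-squares objective f(x) = ||A @ x - b||^2, expressed purely
# as matrix-vector operations.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true                      # noiseless targets

x = np.zeros(3)                     # initial guess
lr = 0.005                          # step size, hand-tuned for this A
for _ in range(500):
    grad = 2 * A.T @ (A @ x - b)    # gradient of ||Ax - b||^2
    x -= lr * grad

print(np.round(x, 3))               # converges towards x_true
```

The step size and iteration count are hand-picked for this toy problem; L-BFGS would reach the same minimum in far fewer steps by approximating curvature information.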

Machine Learning is basically used to approximate a function, which enables us to build technologies that are otherwise not possible with conventional programming. In almost all practical scenarios, these functions take a list of inputs and generate a list of outputs. For an Image Classification problem, the input list contains all the pixels of the image (1024 × 768), which makes for quite a big list. And to approximate the function, we need thousands (sometimes millions) of such images as training examples. For computational efficiency we pack these input and output lists into vectors, and can thus represent the data in a multidimensional linear space.

In all of (Supervised) Machine Learning, the approximated function is doing nothing but applying a Linear Transformation to the input vector so that it lands on the output vector, which is Regression.
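A toy sketch of that idea, with illustrative shapes rather than real image data: pack each input into a vector, stack the examples, and recover the linear transformation that maps inputs onto outputs.

```python
import numpy as np

# Sketch (illustrative shapes, not real image data): each input is
# packed into a vector, and Regression learns a matrix W so that
# W @ x lands on the output y.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 4))     # 100 training examples, 4 features each
W_true = rng.standard_normal((2, 4))  # the "right" transformation, unknown in practice
Y = X @ W_true.T                      # paired outputs

# Least squares recovers the linear transformation from the data.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(np.allclose(W.T, W_true))       # the learned matrix matches
```

With noiseless data the recovery is exact; with real data, least squares finds the transformation that lands closest to the outputs.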

Or it is trying to find a **Hyperplane** (Decision Boundary) that can linearly separate the inputs into a given category (having or not having cancer), which is Classification.
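A tiny illustration of such a decision boundary, with hand-picked (not learned) weights; a real classifier such as logistic regression or a linear SVM would learn them from data:

```python
import numpy as np

# Sketch: a hyperplane w·x + b = 0 as a linear decision boundary.
# The weights below are made up for illustration, not learned.
w = np.array([2.0, -1.0])   # normal vector of the hyperplane
b = -1.0                    # offset

def classify(x):
    # Which side of the hyperplane does x fall on?
    return 1 if w @ x + b > 0 else 0

print(classify(np.array([3.0, 1.0])))  # 2*3 - 1 - 1 = 4 > 0, so class 1
print(classify(np.array([0.0, 2.0])))  # 0 - 2 - 1 = -3 <= 0, so class 0
```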

**So up till this, it’s all free.**
But if you stay, we will give you in-depth knowledge of Backtracking, Dynamic Programming and Graph Theory. You will learn that Recursion is much more than **“Function Calling Itself”**. You will learn that Backtracking is nothing but a depth-first search of a Recursion Tree, and that one generalisation of Backtracking is all you need to solve any backtracking problem. You will learn that the algorithm for solving the very famous N-Queens Problem is exactly the same as the algorithm for solving a Sudoku.
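As a sketch of that generalisation (assumed skeleton, not code from the article): backtracking is choose, recurse, undo, i.e. a depth-first search of the recursion tree. Only the safety check changes between N-Queens and Sudoku.

```python
# Sketch: backtracking as depth-first search of a recursion tree.
# The same choose -> recurse -> undo skeleton solves N-Queens, Sudoku,
# and most other backtracking problems; only the safety check changes.

def solve_n_queens(n, cols=None):
    """Return one placement of n queens (cols[row] = column), or None."""
    if cols is None:
        cols = []
    if len(cols) == n:                        # leaf: every row has a queen
        return list(cols)
    for col in range(n):                      # branch: try each column
        if all(col != c and abs(col - c) != len(cols) - r
               for r, c in enumerate(cols)):  # no column/diagonal attack
            cols.append(col)                  # choose
            result = solve_n_queens(n, cols)  # recurse (go deeper)
            if result:
                return result
            cols.pop()                        # undo (backtrack)
    return None

print(solve_n_queens(4))  # [1, 3, 0, 2]
```

For Sudoku, the loop would try digits 1–9 in the next empty cell and the safety check would test row, column, and 3×3 box; the skeleton stays identical.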


By the way, a linear transformation is nothing but a matrix-vector multiplication. The transformation that you can see in this image is nothing but the following matrix.
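Since the image and its matrix are not reproduced here, a stand-in example: a 90-degree rotation is a linear transformation, and applying it is just a matrix-vector multiplication.

```python
import numpy as np

# Stand-in example (the article's own matrix isn't shown here):
# rotating the plane 90 degrees counter-clockwise is a linear
# transformation, applied by a matrix-vector multiplication.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # 90-degree rotation matrix

v = np.array([1.0, 0.0])      # unit vector along the x-axis
print(R @ v)                  # [0. 1.], now pointing along the y-axis
```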

The above image shows that a Deep Neural Network can learn not only the right matrix to map the input to the output, but also the right representation (kernel) to facilitate learning that mapping.
**A fundamental understanding of the different techniques of Linear Algebra therefore plays a critical role in understanding the inner workings of a Deep Neural Network, and thus Deep Learning itself.**

**Why Numpy?**

It’s a great matrix-manipulation library for Python and will make your life very easy.
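A small taste (illustrative values): vectorised operations replace explicit Python loops.

```python
import numpy as np

# A taste of why NumPy makes matrix manipulation easy: vectorised
# operations replace explicit Python loops.
A = np.arange(6).reshape(2, 3)  # 2x3 matrix [[0, 1, 2], [3, 4, 5]]
x = np.ones(3)                  # vector [1, 1, 1]

print(A @ x)          # [ 3. 12.]  (row sums via matrix-vector product)
print(A.T.shape)      # (3, 2)     (transpose without copying data)
print((A * 2).sum())  # 30         (elementwise ops and reductions)
```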