Story Behind this post:
I have always wanted to write posts about Computer Science. I feel happy when I write about these topics. Last month I started exploring the Data Science field, and I am focusing on getting a job in it. The thing that attracted me to Data Science is Deep Learning.
Most Deep Learning models are based on Artificial Neural Networks, which came from the concept of neural networks in the human brain. The main reason I started this website is to explore fields related to the human brain, space, and computer science.
Deep Learning deals with two of my favourite topics (the human brain and computer science), so that is the main reason I started exploring it.
As part of exploring Deep Learning, I have taken a Specialization course on Coursera.
Course homepage: https://www.coursera.org/specializations/deep-learning
This Specialization is taught by Andrew Ng, a great person with excellent teaching skills.
I am going to explain the concepts I have learnt in this course. Images are taken from the course videos.
Today's topic is "What is a Neural Network?"
Let us consider a housing price prediction problem. Say we have a dataset of 6 houses with features such as the price of the house and the size of the house. After plotting these 6 data points on a graph, we need a function that fits these values and can predict the price of the next house given its size. We can fit a straight line through these values.
Using the concept of linear regression, we can fit a straight line through these values. What we have done, as you can see in the image above, is an example of the simplest neural network.
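To make this concrete, here is a small sketch in Python of fitting a straight line to a toy dataset with numpy. The sizes and prices below are made-up example numbers, not data from the course.

```python
import numpy as np

# Hypothetical dataset of 6 houses: size (in 1000 sq ft) and price (in $1000s).
sizes = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5])
prices = np.array([150.0, 200.0, 250.0, 300.0, 350.0, 400.0])

# Fit a straight line: price = slope * size + intercept (linear regression).
slope, intercept = np.polyfit(sizes, prices, deg=1)

# Predict the price of a new house from its size.
new_size = 4.0
predicted_price = slope * new_size + intercept
print(predicted_price)  # the fitted line's prediction for size 4.0
```

With these toy numbers the data happen to lie exactly on a line, so the fit is perfect; real housing data would be noisy and the line would only approximate it.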
Simplest Neural Network.
As you can see in the image, there is one input named "size" and it goes into a neuron (the little circle shown in the image), where the actual function resides, and the neuron gives the output in the form of a price. The neuron implements the function we drew in the image above: it computes a linear function of its input.
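That single neuron can be sketched as a tiny function: multiply the input by a weight, add a bias, and return the result. The weight and bias values below are illustrative numbers I picked; in a real network they are learnt from the training examples.

```python
def neuron(size, weight=100.0, bias=50.0):
    """A single 'neuron' computing a linear function of its input.

    weight and bias are made-up illustrative values; in practice
    they are learnt from training data.
    """
    return weight * size + bias

print(neuron(2.0))  # linear "price" prediction for a house of size 2.0
```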
Sometimes in the neural network literature, we come across a graph (or a function) which starts at zero and then continues as a straight line with a certain slope.
That particular type of function is called the "ReLU" function (Rectified Linear Unit).
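The ReLU function itself is very simple: it returns zero for negative inputs and the input itself otherwise. A minimal Python version:

```python
def relu(x):
    """Rectified Linear Unit: 0 for negative inputs, x otherwise."""
    return max(0.0, x)

print(relu(-3.0))  # 0.0
print(relu(5.0))   # 5.0
```

This matches the graph described above: flat at zero on the left, then a straight line with slope 1 on the right.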
A larger neural network is built by stacking small neurons (like the one shown in the image above).
As you can see in the image, this is an example of a larger neural network. This larger neural network takes more than one input (more than one feature).
**Here I had a doubt about whether we can use one activation function for some set of neurons and a different activation function for another set. (I googled this and found out that yes, we can.)
As you can see in the image, we just need to give the inputs and it will predict the price of the house using the training examples we have given.
As you can see in the image, each of the hidden units (neurons) is connected to every input feature, so that each neuron in the hidden layer has the opportunity to consider several aspects of the input when predicting the output.
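A forward pass through such a network can be sketched as a couple of matrix multiplications. Everything below is made up for illustration: the four input features, the choice of 3 hidden neurons, and the random (untrained) weights. It also shows mixing activation functions across layers, ReLU in the hidden layer and a plain linear output.

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    """Forward pass of a tiny fully connected network.

    Every input feature feeds every hidden neuron. The hidden
    layer uses ReLU; the output layer is linear (an example of
    using different activation functions in different layers).
    """
    hidden = np.maximum(0.0, W1 @ x + b1)  # ReLU activation
    return W2 @ hidden + b2                # linear output

# Made-up example: 4 input features and 3 hidden neurons.
# These weights are arbitrary random values, not trained ones.
rng = np.random.default_rng(0)
x = np.array([2.0, 3.0, 1.0, 0.5])   # e.g. size, bedrooms, etc.
W1 = rng.normal(size=(3, 4))
b1 = np.zeros(3)
W2 = rng.normal(size=(1, 3))
b2 = np.zeros(1)

y = forward(x, W1, b1, W2, b2)
print(y)  # a single (meaningless, untrained) predicted value
```

Training would adjust W1, b1, W2, b2 so that the output matches the prices in the training examples; that is what the course covers in later lessons.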
-> Given enough examples of (X, Y), a neural network will do a remarkable job of figuring out a function that accurately maps X to Y.
Everything explained so far about neural networks is an example of supervised learning, which means the system takes a certain input (X) and gives back the result (Y) as output.