A Neural Network is a type of Machine Learning model inspired by the way the human brain works. An Artificial Neural Network (ANN) works through an algorithm that allows the computer to learn by incorporating new data. An ANN is a group of interconnected nodes, similar to the neurons inside a brain.

Neural Networks are composed of layers of neurons, also known as computational units, which are interconnected across the layers. Data is transformed by these neurons until an output is produced. Each neuron multiplies its input values by the weights assigned to them, sums the results together with the other values arriving at it, adjusts the resulting number by the neuron's bias, and finally normalizes the output with an activation function.
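The steps above can be sketched as a single neuron in plain Python. This is a minimal illustration, not a library implementation; the sigmoid activation and the specific weight and bias values are assumptions chosen just to show the computation.

```python
import math

def neuron(inputs, weights, bias):
    # Multiply each input by its assigned weight and sum the results,
    # then adjust the sum by the neuron's bias
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Normalize the output with a sigmoid activation function
    return 1 / (1 + math.exp(-total))

# Hypothetical inputs, weights and bias for illustration
output = neuron([0.5, 0.3], [0.4, 0.7], bias=0.1)
```

Here the weighted sum is 0.5·0.4 + 0.3·0.7 + 0.1 = 0.51, and the sigmoid squashes it into the range (0, 1).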

A very important feature of a neural network is the **Iterative Learning Process**. Here, rows are presented to the network one at a time rather than simultaneously, and the weights associated with each input value are adjusted on every pass. During the learning process, the network trains itself by adjusting the weights to predict the correct class of the input data points.

Once we have designed the network, it is time for training. To initiate this process, some values for the weights are chosen randomly, and then the training begins.
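A minimal sketch of this process, using the classic perceptron update rule on a hypothetical toy dataset (learning the logical AND of two inputs). The dataset, learning rate, and epoch count are illustrative assumptions, not part of any particular library.

```python
import random

random.seed(0)

# Hypothetical toy data: learn the logical AND of two inputs
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

# Weights and bias start at random values, as described above
weights = [random.uniform(-1, 1) for _ in range(2)]
bias = random.uniform(-1, 1)
lr = 0.1  # learning rate

for epoch in range(50):
    # Rows are presented to the network one at a time
    for inputs, target in data:
        total = sum(x * w for x, w in zip(inputs, weights)) + bias
        prediction = 1 if total > 0 else 0
        error = target - prediction
        # Adjust each weight in proportion to its input and the error
        weights = [w + lr * error * x for w, x in zip(weights, inputs)]
        bias += lr * error
```

After a few epochs the weights settle so that the network predicts the correct class for every row, which is exactly the iterative adjustment described above.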

There are many types of neural networks; below is a beautifully designed cheat sheet by *Fjodor Van Veen* for a basic understanding.

We will go through a few of them one by one here and cover the others in later posts.

- *PERCEPTRON*: The simplest form of network; it sums the inputs, applies an activation function to them, and passes them on towards the output layer.
- *FEED FORWARD (FF)*: Here all nodes are fully connected, and there are no back loops involved. There is one layer between the input and output layers.
- *RADIAL BASIS NETWORK (RBF)*: These are a form of Feed Forward network. They use a *Radial Basis function* as their activation function instead of the Logistic function. The RBF activation function works perfectly for function approximation and also has an advantage over the Logistic function for continuous values.
- *DEEP FEED FORWARD (DFF)*: These are also a form of Feed Forward network, but with more than one hidden layer. While training a traditional FF, we passed only a small amount of error to the previous layer; that led to exponential growth of training times and made practical implementation of DFF quite difficult.
- *RECURRENT NEURAL NETWORKS (RNN)*: These networks introduce a different type of cell, the Recurrent Cell. They are used when decisions from past iterations can influence current ones; for example, a word can be analyzed only under the influence of the previous word. These networks can also be trained for sequence generation by processing real data sequences one step at a time and predicting what comes next. If we assume the predictions are probabilistic, sequences can be generated from the trained network by iteratively sampling from the output and feeding the sample back in as the input at the next step.
- *AUTOENCODER*: The main idea behind this type of network is that there is no random initialization of weights; rather, pre-training each layer with unsupervised learning algorithms can produce much better weights. It can be used for **data denoising** and **dimensionality reduction for data visualization**.
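To make the feed-forward idea concrete, here is a minimal sketch of a fully connected network with one hidden layer, assuming a sigmoid activation and hypothetical hand-picked weights (a real network would learn these during training).

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def layer(inputs, weights, biases):
    # One fully connected layer: each output neuron takes a weighted
    # sum of every input, adds its bias, and applies the activation
    return [sigmoid(sum(x * w for x, w in zip(inputs, ws)) + b)
            for ws, b in zip(weights, biases)]

# Hypothetical weights for a 2-input, 2-hidden-neuron, 1-output network
hidden = layer([0.5, 0.3], [[0.4, 0.7], [0.2, 0.9]], [0.1, -0.1])
output = layer(hidden, [[0.6, 0.8]], [0.05])
```

Data flows strictly forward, layer by layer, with no back loops; stacking more calls to `layer` would give the deeper DFF variant described above.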

**Applications of Neural Networks in Industries** :

- Neural Networks are used for real-world business problems such as data validation, risk management and sales forecasting.
- They can be used by a marketing department for carrying out Market Segmentation, Targeting and Positioning of products. Unsupervised neural networks can segment customers based on the similarity of their characteristics, while supervised neural networks can be trained to learn the boundaries between customer segments.
- A great ability of Neural Networks is that they can easily consider multiple variables simultaneously. Forecasting sales and predicting profit are some scenarios where this can be of great use.