Machine Learning With Neural Networks
Anuj Saxena
Software Consultant
Knoldus Software LLP
Artificial Intelligence: Brief History
Agenda
• Machine Learning – what and why?
• ANN - Introduction
• Activation Function
• Train & Error
• Gradient Descent
• Importance of layers
• Backpropagation
• Cons
• Demo
SKYNET
Machine learning
● Machine learning is the subfield of computer science that gives computers the ability to learn without being explicitly programmed.
Machine Learning techniques
● Decision Trees
● Random Forests
● K-means Clustering
● Naive Bayes Classifier
● Artificial Neural Networks
Artificial Neural Network
How the brain works
At a granular level
Perceptron
What is a perceptron?
● A perceptron is an artificial unit that mimics a
biological neuron.
● Using multiple perceptrons we create an
Artificial Neural Network.
● In an ANN, each single unit in every layer
(except input layer) is a perceptron.
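A minimal sketch of a single perceptron in Python (an illustration, not from the original slides; the hand-picked weights and the AND example are assumptions):

def step(y, threshold):
    # Fire (1) if the weighted sum crosses the threshold, else 0.
    return 1 if y > threshold else 0

def perceptron(inputs, weights, threshold=0.0):
    # summation = sum(input * weight), then a step activation decides the output.
    summation = sum(x * w for x, w in zip(inputs, weights))
    return step(summation, threshold)

# Example: weights and threshold chosen by hand so the unit computes logical AND.
print(perceptron([1, 1], [0.5, 0.5], threshold=0.7))  # 1 (activated)
print(perceptron([1, 0], [0.5, 0.5], threshold=0.7))  # 0 (not activated)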
Perceptron
A simple neural network
A bit simpler
Self-Driving Car: ALVINN
● Stands for Autonomous Land Vehicle In a Neural Network
● Steers a vehicle
● Takes input from a 30x32 sensor
● Hence, 30x32 units in the input layer
● These inputs are fed to the neural net, and the output tells us which of the output neurons to fire (where each neuron defines a steering direction)
Activation Function
● The activation function is the last step of processing in a perceptron.
● It takes the weighted sum of the inputs (each input multiplied by its corresponding weight) and maps it to the perceptron's output.
Need for activation
• Consider a neuron's raw output: Y = sum(input * weight) + bias
• The value of Y ranges from -inf to +inf
• So how do we decide whether the neuron should be fired (activated) or not?
• That is what activation functions are for:
• Step Function
• Linear Function
• Sigmoid Function
Step function
• A threshold-based activation function
• "Activated" if Y > threshold, else not
• In the picture, the output is 1 (activated) when the value > 0 (the threshold), and 0 (not activated) otherwise
• Drawbacks:
• Can go wrong with more than two classes (if more than one neuron outputs "activated", there is no way to rank them)
• Does not support multiple layers
Linear Function
● Y = c * (summation + bias), where summation = sum(input * weight)
● A linear function of the form y = mx
● Not binary in nature
● Drawbacks:
– Unbounded
– Cannot use multiple layers with this either (a stack of linear layers collapses into a single linear layer)
Sigmoid
● A smooth, step-like curve
● The most widely used activation function
● Benefits:
– Nonlinear
– Bounded values
Sigmoid contd.
● Since we want bounded activations, the sigmoid's range of (0, 1) fits: our activations are bounded
● Bounded, but not binary in nature, i.e. we can take the max (or softmax) when more than one neuron is activated
● And since it is nonlinear, we can stack multiple layers effectively
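The three activation functions side by side, as a small Python sketch (the constant c in the linear function is an arbitrary choice):

import math

def step(y, threshold=0.0):
    # Binary output; hard to rank when several neurons are activated.
    return 1 if y > threshold else 0

def linear(y, c=0.25):
    # Proportional to the input but unbounded; stacked linear layers stay linear.
    return c * y

def sigmoid(y):
    # Nonlinear and bounded in (0, 1), so multiple layers work.
    return 1.0 / (1.0 + math.exp(-y))

for y in (-6.0, -1.0, 0.0, 1.0, 6.0):
    print(y, step(y), linear(y), round(sigmoid(y), 3))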
What is bias?
● The main function of a bias is to provide every node with a trainable constant value (in addition to the normal inputs that the node receives)
● Let's consider a simple network with 1 input and 1 output
● The output of the network is computed by multiplying the input (x) by the weight (w0) and passing the result through an activation function (e.g. a sigmoid)
Bias(contd.)
● If we change the value of w0, the graph fluctuates like this
● Changing the weight w0 essentially changes the "steepness" of the sigmoid
● But what if you wanted the network to output 0 when the input (x) is 2?
● Changing the steepness of the sigmoid won't really work; we need to shift the entire curve to the right
Bias(contd.)
● Now consider this network with an added bias unit
● The output of the network becomes sig(w0*x + w1*1.0)
● Here the input of the bias unit is fixed at 1.0 (its weight w1 is trainable)
Bias(contd.)
● Now the graph shifts like this as the bias weight changes
● Having a weight of -5 for w1 shifts the curve to the right, which allows us to have a network that outputs 0 when x is 2
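A quick numeric check of the bias example above (a sketch; w0 = 1.0 is an assumed steepness, w1 = -5 follows the slide):

import math

def sigmoid(y):
    return 1.0 / (1.0 + math.exp(-y))

def network(x, w0, w1):
    # sig(w0*x + w1*1.0): one input plus a bias unit fixed at 1.0
    return sigmoid(w0 * x + w1 * 1.0)

# Without the bias weight, the output at x = 2 stays high;
# with w1 = -5 the curve shifts right and the output drops toward 0.
print(round(network(2.0, 1.0, 0.0), 3))   # 0.881
print(round(network(2.0, 1.0, -5.0), 3))  # 0.047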
Train & Error
● We now know that each perceptron depends on its weight vector to produce an output
● In the training phase we shift the weights for each input until we get the desired output
● In simple cases with a small number of inputs, we can change the weights manually until the training data satisfies the outputs
● But what if the inputs are very large and the training data is really big too (a real-world scenario)?
Error
● Finding the error means checking, once the weights in our ANN model are set, whether they are correct or not
● The ideal case, with no error in the weight vector, can never be reached; there will always be some error in our model
● i.e. Error = (expected output – obtained output)
● Here comes the tolerance (how much error is acceptable)
● i.e. it decides until when we need to keep updating the weights
Minimizing Error through Gradient Descent
● What is a gradient?
Ans: an increase or decrease in the magnitude of a property observed in passing from one point or moment to another.
Or: in mathematics, the gradient is a multi-variable generalization of the derivative.
● Error on one training example: Error = t – y (target output minus the network's output)
● Squared error function: E(w) = 1/2 * Σ_d (t_d – y_d)², summed over the training examples d
● Gradient: ∇E(w) = [∂E/∂w_0, ∂E/∂w_1, ..., ∂E/∂w_n]
● Weight update: w_i ← w_i + Δw_i, where Δw_i = –η * ∂E/∂w_i (η is the learning rate)
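A minimal sketch of this update rule for a single linear unit (the data, learning rate, and epoch count are made-up illustrations; the rule itself is the one above):

# Gradient descent for a linear unit y = w0 + w1*x with squared error
# E(w) = 1/2 * sum_d (t_d - y_d)^2, so dE/dw_i = -sum_d (t_d - y_d) * x_i.
training_data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (x, t) pairs with t = 2x
w0, w1 = 0.0, 0.0
eta = 0.05  # learning rate

for epoch in range(500):
    grad_w0 = grad_w1 = 0.0
    for x, t in training_data:
        y = w0 + w1 * x
        grad_w0 += -(t - y)      # x_0 is the constant input 1
        grad_w1 += -(t - y) * x
    # w_i <- w_i + delta_w_i, where delta_w_i = -eta * dE/dw_i
    w0 -= eta * grad_w0
    w1 -= eta * grad_w1

print(round(w0, 3), round(w1, 3))  # approaches w0 = 0, w1 = 2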
Issue with gradient descent
● Gradient descent as described works fine only for single-layer models. Why? Because we only know the target outputs of the output layer, so the error can be computed there directly
● But for a multilayer network, what is the "target" of a hidden neuron?
● Here comes backpropagation
Leftovers
Layers
● Problems that require two hidden layers are rarely encountered, as neural networks with two hidden layers can represent functions of any shape
● There is currently no theoretical reason to use neural networks with more than two hidden layers
● Most problems can be solved using only one hidden layer
Standards
● The number of neurons in a hidden layer: common rules of thumb place it between the size of the input layer and the size of the output layer, e.g. roughly two-thirds of the input layer size plus the output layer size, and less than twice the input layer size
Backpropagation
● We can find the error in the weights between the hidden layer and the output layer directly
● The problem is finding the error in the weights between the input layer and the hidden layer (and between one hidden layer and another in case of multiple hidden layers)
● For that we have backpropagation
● In backpropagation we find the error at the output layer and then use that error to calculate the error at the hidden layer
Algorithm
● Initialize all network weights to small random numbers
● Until the termination condition is met: for each training example, propagate the input forward through the network, then propagate the errors backward
Algorithm contd.
● In the forward pass, compute the output o_u of every unit u; in the backward pass, compute an error term δ for every output and hidden unit (following Tom Mitchell's formulation for sigmoid units)
Output layer
● For each output unit k: δ_k = o_k * (1 – o_k) * (t_k – o_k)
Hidden Layer
● For each hidden unit h: δ_h = o_h * (1 – o_h) * Σ_k (w_kh * δ_k)
Weight Change
● Update every weight: w_ji ← w_ji + Δw_ji, where Δw_ji = η * δ_j * x_ji
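A compact sketch of these equations for one hidden layer, learning XOR (the network size, seed, learning rate, and epoch count are illustrative assumptions; the three update rules follow the slides):

import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(1)
eta = 0.5  # learning rate
# 2 inputs -> 2 hidden units -> 1 output; index 2 of each row is the bias weight.
w_hid = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(2)]
w_out = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(1)]
data = [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])]  # XOR

for epoch in range(10000):
    for x, t in data:
        # Forward pass
        h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_hid]
        o = [sigmoid(w[0] * h[0] + w[1] * h[1] + w[2]) for w in w_out]
        # Output layer: delta_k = o_k * (1 - o_k) * (t_k - o_k)
        d_out = [ok * (1 - ok) * (tk - ok) for ok, tk in zip(o, t)]
        # Hidden layer: delta_h = o_h * (1 - o_h) * sum_k w_kh * delta_k
        d_hid = [h[j] * (1 - h[j]) * sum(w_out[k][j] * d_out[k]
                 for k in range(len(o))) for j in range(len(h))]
        # Weight change: w_ji <- w_ji + eta * delta_j * x_ji
        for k in range(len(o)):
            for j in range(len(h)):
                w_out[k][j] += eta * d_out[k] * h[j]
            w_out[k][2] += eta * d_out[k]          # bias input is 1.0
        for j in range(len(h)):
            for i in range(2):
                w_hid[j][i] += eta * d_hid[j] * x[i]
            w_hid[j][2] += eta * d_hid[j]          # bias input is 1.0

for x, t in data:
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_hid]
    o = sigmoid(w_out[0][0] * h[0] + w_out[0][1] * h[1] + w_out[0][2])
    # Usually close to XOR; 2 hidden units can occasionally get stuck
    # in a local minimum.
    print(x, t[0], round(o, 2))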
Cons
● Google's Photos app mistakenly tagged two black people in a photograph as "gorillas."
● Flickr's smart new image recognition tool, powered by Yahoo's neural network, also tagged a black man as an "ape."
Demo
References
● Machine Learning – Tom Mitchell
● http://www.theprojectspot.com/tutorial-post