Artificial Neural Networks
Fundamentals of Computer Vision – Week 6
Assist. Prof. Özge Öztimur Karadağ
ALKÜ – Department of Computer Engineering
Alanya
Inspired by Biology
• The brain has a huge number of neurons (on the order of 10 billion neurons and
about 60 trillion interconnections)
• The brain is a very complex, nonlinear, parallel computer
• What is a neuron?
• Dendrites accept inputs from other neurons
• Axon transmits impulses to other neurons
• Synapses are structures that connect neurons and pass
impulses (electrical signals) from one neuron to another
Neural Networks
• Biological neural networks
• Biological organisms
• Human and animal brains
• High complexity and parallelism
• Artificial neural networks
• Motivated by biological neural networks
• Much simpler and more primitive than biological networks
• Implementation on general purpose digital computers or using specialized
hardware
Artificial Neural Networks (ANN)
• Computing systems inspired by the biological neural networks that constitute animal brains.
• An ANN is based on a collection of connected units or nodes called artificial neurons, which
loosely model the neurons in a biological brain.
• Each connection, like the synapses in a biological brain, can transmit a signal to other
neurons.
• An ANN is a massively parallel distributed processor that is well suited to storing knowledge
• An ANN is similar to a biological NN in the following respects:
• Knowledge is acquired through a learning process
• Knowledge is encoded in the connections (weights) between neurons
ANN Properties
• Nonlinearity
• Input to output mapping (supervised learning)
• Adaptivity
• Fault tolerance
• Possible VLSI implementation
• Neurobiological analogy
Neuron model
• Neuron model elements:
• A set of synapses, i.e. inputs with respective weights. (Notation: signal xj at input j of neuron k
has weight wkj )
• An adder that sums the weighted inputs. Together, these two operations compute the weighted
sum of the inputs.
• A non-linear activation function that typically limits the neuron's output to an interval such as [0,1]
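As a minimal sketch, the three elements above (synaptic weights, adder, and activation function) can be combined into a single artificial neuron in plain Python; the input values and weights here are hypothetical, and a bias term is included as is conventional:

```python
import math

def neuron_output(x, w, b):
    """Weighted sum of inputs plus bias, passed through a sigmoid activation."""
    v = sum(xj * wj for xj, wj in zip(x, w)) + b  # adder: weighted sum of inputs
    return 1.0 / (1.0 + math.exp(-v))             # activation: squashes into (0, 1)

# Example: neuron k with 3 inputs and hypothetical weights wkj
print(neuron_output([1.0, 0.5, -1.0], [0.2, 0.4, 0.1], 0.0))
```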
ANN
• Activation functions:
• Threshold
• Linear
ANN
• Activation functions:
• Sigmoid
• Rectified Linear Unit (ReLU)
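The four activation functions named above can be sketched as plain Python functions (a minimal illustration, not tied to any particular library):

```python
import math

def threshold(v):   # step function: fires (1) once the input reaches 0
    return 1.0 if v >= 0 else 0.0

def linear(v):      # identity: passes the weighted sum through unchanged
    return v

def sigmoid(v):     # smooth squashing of any real input into (0, 1)
    return 1.0 / (1.0 + math.exp(-v))

def relu(v):        # Rectified Linear Unit: clips negative inputs to 0
    return max(0.0, v)

for f in (threshold, linear, sigmoid, relu):
    print(f.__name__, f(-1.0), f(1.0))
```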
ANN
• An artificial neuron receives a signal, processes it, and can
signal the neurons connected to it.
• Neuron Model summary:
• The "signal" at a connection is a real number, and the output of each
neuron is computed by some non-linear function of the sum of its inputs.
• The connections are called edges. Neurons and edges typically have
a weight that adjusts as learning proceeds.
• The weight increases or decreases the strength of the signal at a
connection.
• Neurons may have a threshold such that a signal is sent only if the
aggregate signal crosses that threshold.
• ANN Architecture
• Typically, neurons are aggregated into layers. Different layers may perform
different transformations on their inputs. Signals travel from the first layer
(the input layer), to the last layer (the output layer), possibly after
traversing the layers multiple times.
ANN
• Architectures
• Single Layer Networks
• Has a single layer of neurons (the output layer)
• The input layer is not counted because it performs no processing
• Network inputs are connected to the neuron inputs
• Neuron outputs are also the network outputs
• No feedback from outputs to inputs
ANN
• Architectures:
• Multilayer Networks
• Multilayer networks have one or more hidden layers, in
addition to input and output layers
• Outputs from the n-th layer are inputs to the (n+1)-th layer
• Connectedness:
• A network is fully connected when each neuron in a
layer is connected to all neurons in the next layer
• If some connections are missing the network is partially
connected
• An example: a network with one hidden layer of four
neurons
• The network has four input neurons
• There are two output neurons
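A minimal forward pass through the example architecture (four inputs, one fully connected hidden layer of four neurons, two output neurons) might look like this in plain Python; the weights are random placeholders and the input values are hypothetical:

```python
import math
import random

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def layer_forward(inputs, weights):
    """One fully connected layer: each row of `weights` drives one neuron."""
    return [sigmoid(sum(x * w for x, w in zip(inputs, row))) for row in weights]

random.seed(0)
W_hidden = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(4)]  # 4 inputs -> 4 hidden neurons
W_output = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(2)]  # 4 hidden -> 2 output neurons

x = [0.5, -0.2, 0.1, 0.9]            # hypothetical network inputs
hidden = layer_forward(x, W_hidden)  # hidden-layer outputs feed the next layer
output = layer_forward(hidden, W_output)
print(output)                        # the two network outputs
```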
ANN
• Learning: Learning is the adaptation of the network to better handle a
task by considering sample observations.
• Learning involves adjusting the weights (and optional thresholds) of the
network to improve the accuracy of the result.
• This is done by minimizing the observed errors. Learning is complete when
examining additional observations does not usefully reduce the error rate.
• Learning rate: the size of the corrective steps that the model takes to adjust
for errors in each observation.
• A cost function is evaluated periodically during learning; as long as its output
continues to decline, learning continues.
ANN
• Backpropagation:
• Backpropagation is a method used to adjust the connection weights to
compensate for each error found during learning. The error amount is
effectively divided among the connections. Technically, backprop calculates
the gradient (the derivative) of the cost function associated with a given state
with respect to the weights.
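A minimal backpropagation sketch for a two-weight chain (one input feeding one hidden sigmoid neuron feeding one output sigmoid neuron), applying the chain rule to a squared-error cost; all values here are illustrative:

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# One input -> hidden sigmoid neuron -> output sigmoid neuron, squared-error cost
x, target = 1.0, 0.0
w1, w2, lr = 0.5, 0.5, 0.1

for _ in range(100):
    # forward pass
    h = sigmoid(w1 * x)
    y = sigmoid(w2 * h)
    # backward pass: chain rule from the cost back to each weight
    dC_dy = 2.0 * (y - target)                                # derivative of (y - target)^2
    dC_dw2 = dC_dy * y * (1.0 - y) * h                        # through the output neuron
    dC_dw1 = dC_dy * y * (1.0 - y) * w2 * h * (1.0 - h) * x   # through both neurons
    w2 -= lr * dC_dw2
    w1 -= lr * dC_dw1

print(sigmoid(w2 * sigmoid(w1 * x)))  # output has been pushed toward the target 0
```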
Example
• Let’s say we have a dataset of N rows, 3 features and 1 binary (0/1) target
variable:
• Just like in every other machine learning use case, we are going to
train a model to predict the target using the features row by row. Let’s
start with the first row:
Example…
• ”Training a model” means searching for the best parameters of a mathematical
formula that minimize the error of our predictions.
• E.g., in regression models (such as linear regression) you have to find the best weights
• Usually, the weights are randomly initialized and then adjusted as learning
proceeds. Here I’ll just set them all to 1:
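With hypothetical values for the first row's three features and all weights set to 1, the neuron's weighted sum reduces to the plain sum of the features:

```python
row = [0.2, 0.7, 0.1]        # hypothetical feature values for the first row
weights = [1.0, 1.0, 1.0]    # all weights initialized to 1, as in the text

weighted_sum = sum(x * w for x, w in zip(row, weights))
print(weighted_sum)  # the activation function then maps this sum to the neuron's output
```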
Example…
• Different from a linear model:
• The activation function defines the output of that node.
Example…
• Training: compare the output with the target, calculate the error and
optimize the weights, then repeat the whole process again and again.
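The loop described above (compute the output, compare it with the target, adjust the weights, repeat) can be sketched for a single sigmoid neuron; the toy OR dataset and hyperparameters here are illustrative, not from the slides:

```python
import math
import random

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

random.seed(1)
# Toy dataset: (features, binary target) pairs for the OR function
data = [([0.0, 0.0], 0), ([0.0, 1.0], 1), ([1.0, 0.0], 1), ([1.0, 1.0], 1)]
w = [random.uniform(-1, 1), random.uniform(-1, 1)]  # randomly initialized weights
b, lr = 0.0, 0.5

for epoch in range(1000):                            # reiterate the whole process
    for x, target in data:
        y = sigmoid(w[0] * x[0] + w[1] * x[1] + b)   # current output
        err = y - target                             # compare with the target
        grad = err * y * (1.0 - y)                   # squared-error gradient at the output
        w[0] -= lr * grad * x[0]                     # optimize the weights
        w[1] -= lr * grad * x[1]
        b -= lr * grad

preds = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
print(preds)
```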
Example…
• Training
Example
• Image Classification using SIFT features.
• Classify images in the CIFAR-10 image dataset by ANN using SIFT
features.
References
• Mauro Di Pietro, Deep Learning with Python: Neural Networks,
https://towardsdatascience.com/deep-learning-with-python-neural-
networks-complete-tutorial-6b53c0b06af0
• Prof. Sven Lončarić, Lecture Notes on Neural Networks.
