Date: 14ᵗʰ December 2023
Kathmandu Model Secondary School
Artificial Neural Networking
By: Rij Amatya, Pragyan Shrestha, Kritika Silwal
INTRODUCTION
Artificial Neural Networking (ANN)
A branch of machine learning models, first proposed in 1944
An information-processing model
A computational model inspired by the human brain's structure and function
Follows principles of neuronal organization
Makes predictions based on what it has learned
Applied in the field of AI
Learns by example
Artificial Neural
Networks
Core of deep learning
Image recognition, NLP, Robotics, etc.
Composed of 3 different layers:
1. Input Layer
2. Hidden Layer
3. Output Layer
Sometimes referred to as the MLP (Multi-Layer Perceptron).
The amount of data produced has increased, and "big data" has become a buzzword. This abundance of data made it practical to train ANNs: while classical machine learning algorithms fell short when analyzing big data, artificial neural networks performed well on it.
Biological Neural Networks
A biological neuron consists of a cell body containing the nucleus, many branched dendrites, and a long axon. As you can see in the image above, the axon is much longer than the cell body. The axon divides into many branches that connect with the dendrites or cell bodies of other neurons.
Biological neurons generate electrical signals that travel along their axons; if a neuron receives enough stimulation, it fires. In general, this is how biological neurons work. The mechanism may seem simple, but by connecting billions of neurons, enormously complex networks can be formed.
Artificial Neurons – associated with weights that carry information about the input signal
An ANN comprises interconnected nodes (neurons) organized in layers
The nodes are connected to each other by weighted links
Perceptron
A perceptron is the smallest unit of a neural network. The architecture was developed by Frank Rosenblatt in 1957 as a simple model of a biological neuron in an artificial neural network.
As you can see, the inputs and output are numbers, and each input has a weight in this architecture. The weighted inputs are summed first, and then a bias is added. This sum is passed through a step function, which can be, for example, a sign function.
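The computation just described can be sketched in a few lines of Python. This is a minimal illustration, not the presentation's own code; the input values, weights, and bias below are arbitrary examples.

```python
# A single artificial neuron: weighted inputs are summed, a bias is
# added, and the result passes through a step function (here, sign).

def step(z):
    """Sign step function: 1 for non-negative input, -1 otherwise."""
    return 1 if z >= 0 else -1

def artificial_neuron(inputs, weights, bias):
    """Weighted sum of the inputs plus a bias, passed through the step function."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return step(z)

# Example with two inputs and illustrative weights:
# 1.0*0.5 + 2.0*(-0.25) + 0.1 = 0.1, which is >= 0, so the neuron fires.
print(artificial_neuron([1.0, 2.0], [0.5, -0.25], 0.1))  # 1
```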
The perceptron can be used as an algorithm for binary classification. A simple threshold logic unit (TLU) computes a single linear function of its inputs. This simple approach paved the way for developing AI tools like ChatGPT.
The image shows a perceptron with two inputs and three outputs, connected via a dense layer; such a layout can be used for multi-label classification.
How to train a perceptron?
If you understand how a perceptron is trained, you will have a much better picture of how ANNs work. Let's now discuss how to train one.
First, assign a random weight to each input. The weighted inputs are summed, and a bias is added to this sum. Note that when one neuron triggers another frequently, the connection between them becomes stronger. Inputs passing through the neuron produce an output, which is a prediction. The actual value is compared with the prediction, the error is calculated, and the weights are updated to make predictions with fewer errors.
The perceptron was a promising approach, but it failed to solve some simple problems, such as XOR. To overcome this limitation, the multilayer perceptron was developed. Let's dive into multilayer perceptrons.
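The training steps above can be sketched with the classic perceptron learning rule. This is an illustrative sketch, not the presentation's code: the weights start at zero (rather than random) for reproducibility, and the dataset (logical AND, which is linearly separable) and learning rate are arbitrary choices.

```python
# Perceptron training: predict, compare with the actual value,
# and nudge the weights and bias to reduce the error.

def train_perceptron(samples, epochs=10, lr=0.1):
    n = len(samples[0][0])
    weights = [0.0] * n
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            # weighted sum plus bias, passed through a 0/1 step function
            pred = 1 if sum(x * w for x, w in zip(inputs, weights)) + bias >= 0 else 0
            error = target - pred
            # update rule: shift each weight in the direction that reduces the error
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

# Logical AND is linearly separable, so a single perceptron can learn it.
and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(and_data)
predict = lambda xs: 1 if sum(x * wi for x, wi in zip(xs, w)) + b >= 0 else 0
print([predict(xs) for xs, _ in and_data])  # [0, 0, 0, 1]
```

Running the same loop on XOR never converges, which is exactly the limitation that motivated the multilayer perceptron.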
Multi-layer Perceptron
A multilayer perceptron consists of an input layer, one or more hidden layers, and an output layer. As you can see in the image above, there is a hidden layer. If there is more than one hidden layer, the network is called a deep neural network; this is where deep learning comes into play. Deep learning became popular with the development of modern AI architectures.
In short, the inputs pass through the neural network and a prediction is made. But how do we improve that prediction? This is where the backpropagation algorithm comes in: it takes the network's output error and propagates it backward through the network so that the weights can be updated. This cycle continues until the best prediction is obtained.
APPLICATION
Natural Language Processing (NLP): ANNs power language translation, sentiment analysis, and chatbots.
Medical Diagnosis: ANNs analyze medical images for disease detection, assisting diagnostics with techniques like computer-aided diagnosis.
Financial Forecasting: ANNs predict stock prices and market trends, and assess financial risk by analyzing historical data.
Autonomous Vehicles: ANNs contribute to the development of self-driving cars by enabling object detection, lane keeping, and decision-making.
Image and Speech Recognition: ANNs are used for image and speech recognition, enabling applications like facial recognition and voice assistants.
CHALLENGES AND LIMITATIONS
CHALLENGES
ANNs often require large labeled datasets for training, and obtaining such data can be challenging.
Training complex neural networks may demand significant computational power.
Neural networks can overfit to training data, capturing noise and hindering generalization to new data.
Issues of bias, fairness, and accountability arise, especially in sensitive applications or high-stakes decision-making.
LIMITATIONS
ANNs require large amounts of labeled data for effective training, and performance may suffer with insufficient or biased datasets.
Training deep networks can be computationally demanding.
Models may memorize training-data noise rather than learning general patterns.
Neural networks are often perceived as black-box models, making it challenging to interpret their decision-making processes.