Quantum Neural Network
Presented By :
D Surat
M.Sc. Physics
What is a Neural Network?
The simplest definition of a neural network, more properly referred to as an 'artificial' neural network (ANN), is provided by the inventor of one of the first neurocomputers, Dr. Robert Hecht-Nielsen. He defines a neural network as:
"...a computing system made up of a number of simple, highly interconnected processing elements, which process information by their dynamic state response to external inputs."
Artificial Neural Network
An Artificial Neural Network (ANN) is an information processing paradigm inspired by the way biological nervous systems, such as the brain, process information. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. ANNs, like people, learn by example. An ANN is configured for a specific application, such as pattern recognition or data classification, through a learning process. Learning in biological systems involves adjustments to the synaptic connections that exist between the neurons; this is true of ANNs as well.
Basics of Neural Network
Neural networks are typically organized in layers. Layers are made up of a number of interconnected 'nodes', each of which contains an 'activation function'. Patterns are presented to the network via the 'input layer', which communicates to one or more 'hidden layers' where the actual processing is done via a system of weighted 'connections'. The hidden layers then link to an 'output layer' where the answer is output, as shown in the graphic below.
[Figure: a layered network with input, hidden, and output layers connected by weighted links]
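The layered flow described above can be sketched in a few lines of Python. This is a minimal illustration, not the method of any particular package: the two-input, two-hidden-node, one-output network and all of its weights are made-up values chosen only to show how a pattern passes from layer to layer.

```python
import math

def sigmoid(x):
    # A common activation function: squashes any input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # Each node takes the weighted sum of its inputs, then applies the activation
    return [sigmoid(sum(w * i for w, i in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# Hypothetical 2-input, 2-hidden-node, 1-output network (arbitrary weights)
hidden_w = [[0.5, -0.6], [0.3, 0.8]]   # one weight list per hidden node
hidden_b = [0.1, -0.2]
output_w = [[0.7, -0.4]]
output_b = [0.05]

pattern = [1.0, 0.0]                    # pattern presented at the input layer
hidden = layer(pattern, hidden_w, hidden_b)
answer = layer(hidden, output_w, output_b)
print(answer)                           # a single value in (0, 1)
```

The 'knowledge' of such a network lives entirely in the weight lists; changing them changes the answer without changing a single line of the program.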
Cont’d
Most ANNs contain some form of 'learning rule' which modifies the weights of the connections according to the input patterns they are presented with. In a sense, ANNs learn by example just as their biological counterparts do; a child learns to recognize dogs from examples of dogs.
Although there are many different kinds of learning rules used by neural networks, this demonstration is concerned only with one: the delta rule. The delta rule is often utilized by the most common class of ANNs, called 'backpropagational neural networks' (BPNNs). Backpropagation is an abbreviation for the backward propagation of error.
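For a single linear node, the delta rule can be sketched directly: each weight is nudged in proportion to the error (target minus output) and the input on that connection. The pattern, target, and learning rate below are illustrative values, not from any particular package.

```python
# Delta rule for one linear node: delta_w_i = lr * (target - output) * input_i
def delta_rule_step(weights, inputs, target, lr=0.1):
    output = sum(w * x for w, x in zip(weights, inputs))
    error = target - output
    return [w + lr * error * x for w, x in zip(weights, inputs)]

weights = [0.0, 0.0]
# Repeatedly present the pattern [1, 1] with target output 1.0
for _ in range(50):
    weights = delta_rule_step(weights, [1.0, 1.0], 1.0)
print(weights)   # both weights approach 0.5, so the output approaches 1.0
```

Backpropagation generalizes this idea by propagating each output error backward through the hidden layers, so that every weight in the network receives its own correction.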
How do neural networks differ from conventional computing?
To better understand artificial neural computing it is important to know first how a conventional 'serial' computer and its software process information. A serial computer has a central processor that can address an array of memory locations where data and instructions are stored. Computations are made by the processor reading an instruction, as well as any data the instruction requires, from memory addresses; the instruction is then executed and the results are saved in a specified memory location as required. In a serial system (and a standard parallel one as well) the computational steps are deterministic, sequential, and logical, and the state of a given variable can be tracked from one operation to another.
Cont’d
In comparison, ANNs are not sequential or necessarily deterministic. There are no complex central processors; rather, there are many simple ones which generally do nothing more than take the weighted sum of their inputs from other processors. ANNs do not execute programmed instructions; they respond in parallel (either simulated or actual) to the pattern of inputs presented to them. There are also no separate memory addresses for storing data. Instead, information is contained in the overall activation 'state' of the network. 'Knowledge' is thus represented by the network itself, which is quite literally more than the sum of its individual components.
Applications
Neural networks are universal approximators, and they work best if the system you are using them to model has a high tolerance for error. One would therefore not be advised to use a neural network to balance one's cheque book! However, they work very well for:
● capturing associations or discovering regularities within a set of patterns;
● problems where the volume, number of variables, or diversity of the data is very great;
● problems where the relationships between variables are only vaguely understood; or
● relationships that are difficult to describe adequately with conventional approaches.
Limitations
There are many advantages and limitations to neural network analysis and to
discuss this subject properly we would have to look at each individual type of
network, which isn't necessary for this general discussion. In reference to
backpropagational networks however, there are some specific issues potential
users should be aware of.
● Backpropagational neural networks (and many other types of networks) are in a sense the ultimate 'black boxes'. Apart from defining the general architecture of a network and perhaps initially seeding it with random numbers, the user has no role other than to feed it input, watch it train, and await the output. In fact, it has been said that with backpropagation, "you almost don't know what you're doing". Some freely available software packages (NevProp, bp, Mactivation) do allow the user to sample the network's 'progress' at regular time intervals, but the learning itself progresses on its own. The final product of this activity is a trained network that provides no equations or coefficients defining a relationship (as in regression) beyond its own internal mathematics. The network 'IS' the final equation of the relationship.
● Backpropagational networks also tend to be slower to train than other types of networks and sometimes require thousands of epochs. If run on a truly parallel computer system this is not really a problem, but if the BPNN is being simulated on a standard serial machine (e.g. a single SPARC, Mac or PC) training can take some time. This is because the machine's CPU must compute the function of each node and connection separately, which can be problematic in very large networks with a large amount of data. However, the speed of most current machines is such that this is typically not much of an issue.
Advantages over conventional techniques
Depending on the nature of the application and the strength of the internal data patterns, you can generally expect a network to train quite well. This applies to problems where the relationships may be quite dynamic or non-linear. ANNs provide an analytical alternative to conventional techniques, which are often limited by strict assumptions of normality, linearity, variable independence, etc. Because an ANN can capture many kinds of relationships, it allows the user to quickly and relatively easily model phenomena which may otherwise have been very difficult or impossible to explain.
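The linearity limitation is easy to see concretely. The sketch below (an illustrative example, with y = x² standing in for any non-linear relationship) fits an ordinary least-squares line and shows that even the best straight line leaves large errors, which is exactly the kind of situation where an ANN's flexibility pays off.

```python
# A conventional linear fit assumes y = a*x + b; on a non-linear relationship
# (here y = x^2, a made-up example) the best line still misses badly.
xs = [float(x) for x in range(-5, 6)]
ys = [x * x for x in xs]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
# Closed-form least-squares slope and intercept
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - a * mean_x

residuals = [y - (a * x + b) for x, y in zip(xs, ys)]
print(a, b)                            # slope ~0: the best line is flat at the mean
print(max(abs(r) for r in residuals))  # large residual despite a "best" fit
```

A network with a hidden layer, by contrast, is not committed to any fixed functional form and can approximate the curve itself.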
Reference
https://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol4/cs11/report.html
http://pages.cs.wisc.edu/~bolo/shipyard/neural/local.html
Thank You !