Neural networks are composed of simple, interconnected processing units (neurons) that can learn from data. Biological neural networks in the brain contain billions of neurons that communicate via electrochemical signals. Early artificial neural networks modeled each neuron as a simple unit that computes a weighted sum of its inputs and applies an activation function to determine its output. Such single-layer networks can represent only linearly separable functions; a classic example is XOR, which no single-layer perceptron can compute. Multi-layer perceptrons overcame this limitation by introducing hidden layers, which greatly increase the network's computational and representational power.
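To make this concrete, here is a minimal Python sketch of a single artificial neuron and a tiny two-layer network. The function names, weights, and thresholds are illustrative choices, not from the original text; the hand-set weights show one classic construction in which hidden units computing OR and NAND are combined by an AND unit to produce XOR, a function no single neuron can represent.

```python
def step(x):
    """Heaviside step activation: output 1 if the weighted sum crosses zero, else 0."""
    return 1 if x >= 0 else 0

def neuron(inputs, weights, bias):
    """A single artificial neuron: weighted sum of inputs plus a bias, passed through an activation."""
    return step(sum(w * x for w, x in zip(weights, inputs)) + bias)

# A single neuron can realize linearly separable functions such as AND...
def logical_and(x1, x2):
    return neuron([x1, x2], [1.0, 1.0], -1.5)

# ...but XOR is not linearly separable. A two-layer network with hand-set
# weights computes it: one hidden unit acts as OR, another as NAND, and the
# output unit takes the AND of the two hidden activations.
def xor_mlp(x1, x2):
    h_or   = neuron([x1, x2], [1.0, 1.0], -0.5)      # hidden unit 1: OR
    h_nand = neuron([x1, x2], [-1.0, -1.0], 1.5)     # hidden unit 2: NAND
    return neuron([h_or, h_nand], [1.0, 1.0], -1.5)  # output unit: AND

for a in (0, 1):
    for b in (0, 1):
        print(f"AND({a},{b}) = {logical_and(a, b)}   XOR({a},{b}) = {xor_mlp(a, b)}")
```

Running the loop prints the full truth tables, confirming that the single neuron handles AND while XOR requires the hidden layer; in practice such weights are learned from data rather than set by hand.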