This chapter discusses several supervised learning networks: the perceptron, Adaline, Madaline, backpropagation networks, and radial basis function networks. The perceptron is the simplest neural network, classifying inputs with a single linear threshold unit; it can therefore learn only linearly separable problems. Backpropagation networks overcome this limitation by training multiple layers with gradient descent, allowing them to solve problems that are not linearly separable. Radial basis function networks use Gaussian kernel functions for classification and function approximation.
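As a minimal sketch of the ideas above, the following example trains a single linear threshold unit with the classic perceptron learning rule on the AND function, which is linearly separable. The function and variable names (`train_perceptron`, `step`, the learning rate of 0.1, and the epoch count) are illustrative choices, not taken from the chapter.

```python
def step(net):
    # Linear threshold unit: output 1 if the weighted sum exceeds 0
    return 1 if net > 0 else 0

def train_perceptron(samples, lr=0.1, epochs=20):
    w = [0.0, 0.0]   # weights, one per input
    b = 0.0          # bias
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = step(w[0] * x1 + w[1] * x2 + b)
            err = target - out
            # Perceptron rule: move weights toward the target
            # only when the unit misclassifies (err != 0)
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# AND gate: linearly separable, so the perceptron can learn it
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
for (x1, x2), target in data:
    print((x1, x2), step(w[0] * x1 + w[1] * x2 + b))
```

Running the same procedure on XOR, which is not linearly separable, never converges; that is the limitation the multilayer backpropagation networks in this chapter address.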
Related topics: