This document presents a lecture on linearly separable functions in neural networks, focusing on the perceptron learning rule and methods for solving two-class classification problems. It details how weight vectors are updated in both pattern (online) and batch modes, with worked iterations and convergence examples. The graphical representation of class separation, along with the influence of the learning rate and the bias, is also discussed.
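As a companion to the summary above, the pattern-mode (online) perceptron update it mentions can be sketched as follows. This is a minimal illustration, not the lecture's own code: the function name `perceptron_train`, the {-1, +1} label convention, and the use of an augmented input for the bias are assumptions chosen for the example.

```python
import numpy as np

def perceptron_train(X, y, lr=1.0, max_epochs=100):
    """Pattern-mode (online) perceptron training for a two-class problem.

    X: (n_samples, n_features) inputs; y: labels in {-1, +1}.
    The bias is handled by augmenting each input with a constant 1,
    so the last weight component plays the role of the bias.
    Returns the weight vector once every pattern is classified
    correctly, or after max_epochs passes over the data.
    """
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])  # augment inputs with bias term
    w = np.zeros(Xa.shape[1])
    for _ in range(max_epochs):
        errors = 0
        for x, t in zip(Xa, y):
            if t * np.dot(w, x) <= 0:   # pattern misclassified (or on the boundary)
                w += lr * t * x         # perceptron rule: w <- w + eta * t * x
                errors += 1
        if errors == 0:                 # converged: the classes are linearly separated
            break
    return w

# The AND function is linearly separable, so the rule converges:
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([-1, -1, -1, 1])
w = perceptron_train(X, y)
```

Batch mode differs only in that the updates from all misclassified patterns in a pass are accumulated and applied once at the end of the pass, rather than immediately after each pattern.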