Part 6/9:
The real pivot towards modern neural networks came in the 1980s, when researchers combined the gradient-descent idea behind LMS with back-propagation. Notably, David Rumelhart, Geoffrey Hinton, and Ronald Williams expanded on the concept by replacing the all-or-nothing activation function of the traditional perceptron with smooth, differentiable activations such as the sigmoid, which allowed the back-propagation algorithm to train multi-layer networks effectively.
This new architecture offered significant advantages for solving complex problems: networks could now stack multiple layers of interconnected units, with each layer transforming the previous one's output so that the final predictions are refined progressively during training.
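As a concrete illustration (not from the original text), the sketch below trains a tiny two-layer sigmoid network on XOR, a problem a single perceptron cannot solve; the layer sizes, learning rate, and iteration count are assumptions chosen purely for demonstration. The key point is that the sigmoid's derivative lets the error signal propagate back through the hidden layer, which the perceptron's hard step function does not permit.

```python
# Minimal sketch: a two-layer sigmoid network trained by back-propagation on XOR.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_deriv(a):
    # Derivative written in terms of the activation itself: a * (1 - a).
    return a * (1.0 - a)

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))   # input -> hidden
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))   # hidden -> output
lr = 0.5  # illustrative learning rate

for _ in range(10_000):
    # Forward pass through both layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: the error signal is scaled by the sigmoid's derivative
    # at each layer; a hard step activation has no usable derivative here.
    grad_out = (out - y) * sigmoid_deriv(out)
    grad_h = (grad_out @ W2.T) * sigmoid_deriv(h)

    W2 -= lr * h.T @ grad_out
    b2 -= lr * grad_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ grad_h
    b1 -= lr * grad_h.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # should move toward [[0], [1], [1], [0]]
```

The hidden layer is what makes the non-linearly-separable XOR mapping representable at all; the smooth activation is what makes its weights trainable by gradient descent.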