RE: LeoThread 2025-02-01 10:54

in LeoFinance · 4 months ago

Part 6/9:

The real pivot toward modern neural networks occurred when researchers integrated the LMS idea with back-propagation in the 1980s. Notably, David Rumelhart and Geoffrey Hinton expanded on the concept by replacing the all-or-nothing activation function of traditional perceptrons with smooth activation functions such as the sigmoid, which gave the back-propagation algorithm the usable gradients it needs to train multi-layer networks effectively.

This new architecture provided significant advantages in solving complex problems, allowing neural networks to stack multiple layers of interconnected units, with each layer transforming the previous layer's output into a progressively more refined prediction.
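The role of the smooth activation described above can be sketched in a few lines. This is an illustrative example, not code from the original discussion: the network sizes, weights, and learning rate are hypothetical. The key point is that the sigmoid's derivative is nonzero everywhere, so the chain rule can propagate an error signal through hidden layers, whereas a step function's derivative is zero almost everywhere and blocks the gradient.

```python
import numpy as np

def sigmoid(x):
    """Smooth activation: differentiable everywhere, unlike a step function."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    """Derivative of the sigmoid; strictly positive for every input."""
    s = sigmoid(x)
    return s * (1.0 - s)

# Tiny 2-layer network on a single example (hypothetical sizes and weights).
rng = np.random.default_rng(0)
x = rng.normal(size=3)           # input vector
W1 = rng.normal(size=(4, 3))     # hidden-layer weights
W2 = rng.normal(size=(1, 4))     # output-layer weights
y = np.array([1.0])              # target

# Forward pass through both layers.
z1 = W1 @ x
h = sigmoid(z1)
z2 = W2 @ h
out = sigmoid(z2)

# Backward pass: the chain rule multiplies by sigmoid_grad at each layer.
# With a step activation these factors would be zero, and no learning occurs.
d_out = (out - y) * sigmoid_grad(z2)      # output-layer error signal
d_h = (W2.T @ d_out) * sigmoid_grad(z1)   # error propagated to the hidden layer

# One gradient-descent update on each weight matrix (hypothetical rate).
lr = 0.1
W2 -= lr * np.outer(d_out, h)
W1 -= lr * np.outer(d_h, x)
```

The sketch is only meant to show why smoothness matters: every factor `sigmoid_grad(...)` in the backward pass is nonzero, so the error signal reaches the hidden layer instead of vanishing at a hard threshold.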

Modern Applications