
Part 5/10:

To further explore this phenomenon, researchers found that most weights in large neural networks are redundant: in some experiments, up to 96% of weights could be pruned without noticeably degrading the model's performance. This suggests that the real capacity lies in small sub-networks buried within the larger network, and it hints at why scaling up helps: a bigger network offers more chances of containing an effective, trainable sub-network among the noise of its random initialization.
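To make the pruning idea concrete, here is a minimal sketch of magnitude-based pruning, the simplest scheme used in this line of work: zero out the smallest-magnitude weights and keep the rest. The `magnitude_prune` helper and the 96% sparsity setting are illustrative assumptions echoing the figure above, not code from the research being summarized.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float = 0.96) -> np.ndarray:
    """Zero out the smallest-magnitude weights, keeping roughly
    the top (1 - sparsity) fraction by absolute value."""
    flat = np.abs(weights).ravel()
    k = int(len(flat) * sparsity)            # number of weights to remove
    threshold = np.partition(flat, k)[k]     # magnitude at the cutoff rank
    mask = np.abs(weights) >= threshold      # keep only weights at or above the cutoff
    return weights * mask

# Example: prune a random 256x256 weight matrix down to ~4% density
rng = np.random.default_rng(0)
w = rng.standard_normal((256, 256))
w_pruned = magnitude_prune(w, sparsity=0.96)
print(f"Remaining nonzero weights: {np.count_nonzero(w_pruned) / w.size:.1%}")
```

In the actual studies, a pruned network like this is then retrained (or rewound to its original initialization) to test whether the surviving sub-network alone can match the full model's accuracy.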

The Lottery Ticket Hypothesis