RE: LeoThread 2025-11-05 15-48

in LeoFinance · 21 days ago

Part 4/15:

  • If this scaling trend continues, the next leap could be even larger, but claims of a thousand-fold jump in a single step seem overstated without additional breakthroughs.

The Role of Sparsity: Dense vs. Sparse Networks

A significant technical question is whether the next generation of models will be built as dense or sparse networks.

  • The human brain is extremely sparse: it is organized into microcolumns with mostly local connections, and although it holds around 90 billion neurons and trillions of synapses, each neuron connects to only a tiny fraction of the others (on the order of thousands out of tens of billions).

  • Transformers like GPT are traditionally dense, meaning every parameter participates in processing every token, which drives compute and memory costs up in step with model size (see the sketch below).
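
To make the dense/sparse distinction concrete, here is a minimal NumPy sketch contrasting a dense feed-forward layer with a sparse mixture-of-experts (MoE) layer. The layer sizes, expert count, and top-2 routing scheme are illustrative assumptions, not details from the post; it only shows the core idea that a dense layer uses all of its parameters for every token, while an MoE layer activates just a few experts per token.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model = 64    # hidden size (illustrative)
d_ff = 256      # feed-forward width (illustrative)
n_experts = 8   # experts in the sparse layer
top_k = 2       # experts activated per token

x = rng.standard_normal(d_model)  # one token's hidden state

# --- Dense feed-forward layer: every parameter is used for every token ---
W1 = rng.standard_normal((d_ff, d_model)) * 0.02
W2 = rng.standard_normal((d_model, d_ff)) * 0.02
dense_out = W2 @ np.maximum(W1 @ x, 0.0)  # simple ReLU MLP
dense_active = W1.size + W2.size          # all parameters active per token

# --- Sparse MoE layer: a router picks top-k experts, the rest stay idle ---
experts_W1 = rng.standard_normal((n_experts, d_ff, d_model)) * 0.02
experts_W2 = rng.standard_normal((n_experts, d_model, d_ff)) * 0.02
router = rng.standard_normal((n_experts, d_model)) * 0.02

logits = router @ x                      # one score per expert
top = np.argsort(logits)[-top_k:]        # indices of the top-k experts
gates = np.exp(logits[top]) / np.exp(logits[top]).sum()  # softmax weights

sparse_out = np.zeros(d_model)
for gate, e in zip(gates, top):
    sparse_out += gate * (experts_W2[e] @ np.maximum(experts_W1[e] @ x, 0.0))

# Parameter accounting (ignoring the small router matrix)
total_params = experts_W1.size + experts_W2.size
active_params = top_k * (experts_W1[0].size + experts_W2[0].size)

print(f"dense layer: {dense_active} params, all active per token")
print(f"MoE layer:   {total_params} total params, "
      f"{active_params} active per token ({active_params / total_params:.0%})")
```

The point of the comparison: with 8 experts and top-2 routing, the MoE layer stores four times as many parameters as the dense layer but touches only a quarter of them for any given token, which is how sparse models grow total parameter counts without a proportional rise in per-token compute. Real MoE routers are learned and load-balanced; the argsort here is only to keep the sketch short.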