
RE: LeoThread 2024-09-02 09:39

AI Models Scaled Up 10,000x Are Possible by 2030, Report Says

Recent progress in AI largely boils down to one thing: Scale.

Around the beginning of this decade, AI labs noticed that making their algorithms—or models—ever bigger and feeding them more data consistently led to enormous improvements in what they could do and how well they did it. The latest crop of AI models has hundreds of billions to over a trillion internal network connections and learns to write or code like we do by consuming a healthy fraction of the internet.

#ai #technology #scaling


It takes more computing power to train bigger algorithms. So, to get to this point, the computing dedicated to AI training has been quadrupling every year, according to the nonprofit AI research organization Epoch AI.

Should that growth continue through 2030, future AI models would be trained with 10,000 times more compute than today’s state-of-the-art algorithms, like OpenAI’s GPT-4.
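As a rough back-of-the-envelope check (illustrative arithmetic with assumed numbers, not figures from the Epoch report), compounding a 4x annual growth rate from a 2024 baseline shows how quickly the multiplier stacks up, and how many years of quadrupling a full 10,000x scale-up would take:

```python
import math

# Illustrative compounding of training-compute growth.
# Assumptions (not from the Epoch report): 4x growth per year, 2024 baseline.
GROWTH_PER_YEAR = 4
YEARS = 2030 - 2024

# Multiplier after compounding the annual growth rate over the horizon.
scale_up = GROWTH_PER_YEAR ** YEARS
print(f"Compute multiplier after {YEARS} years of 4x growth: {scale_up:,}x")  # 4,096x

# Years of quadrupling needed to reach a full 10,000x scale-up.
years_for_10000x = math.log(10_000) / math.log(GROWTH_PER_YEAR)
print(f"Years of 4x growth for 10,000x: {years_for_10000x:.1f}")  # ~6.6
```

Six years of quadrupling gives roughly 4,000x; a full 10,000x works out to a bit under seven years at that rate (or a slightly faster rate), which is the same ballpark as the end-of-decade figure.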

“If pursued, we might see by the end of the decade advances in AI as drastic as the difference between the rudimentary text generation of GPT-2 in 2019 and the sophisticated problem-solving abilities of GPT-4 in 2023,” Epoch wrote in a recent research report detailing how feasible this scenario is.

But modern AI already sucks in a significant amount of power, tens of thousands of advanced chips, and trillions of online examples. Meanwhile, the industry has endured chip shortages, and studies suggest it may run out of quality training data. Assuming companies continue to invest in AI scaling: Is growth at this rate even technically possible?

In its report, Epoch looked at four of the biggest constraints to AI scaling: Power, chips, data, and latency. TLDR: Maintaining growth is technically possible, but not certain.