RE: LeoThread 2024-09-02 09:39

Training larger models takes more computing power. To reach this point, the compute dedicated to AI training has been quadrupling every year, according to the nonprofit AI research organization Epoch AI.

Should that growth continue through 2030, future AI models would be trained with 10,000 times more compute than today's state-of-the-art models, such as OpenAI's GPT-4.
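The 10,000x figure follows directly from the compounding arithmetic. A minimal sketch, assuming a constant 4x annual growth rate (the baseline year is my assumption, not stated in the article):

```python
import math

# Assumption: training compute quadruples every year, per Epoch AI.
GROWTH_PER_YEAR = 4

def compute_multiplier(years: int) -> int:
    """Total growth in training compute after `years` of quadrupling."""
    return GROWTH_PER_YEAR ** years

# 2024 -> 2030 is six years of growth:
print(compute_multiplier(6))        # 4096

# Measuring from GPT-4's 2023 release gives seven years:
print(compute_multiplier(7))        # 16384

# Exactly 10,000x corresponds to about 6.6 years of quadrupling:
print(math.log(10_000, GROWTH_PER_YEAR))  # ~6.64
```

So the ~10,000x multiplier lands squarely between the six- and seven-year cases, which is why it works as a round-number summary of "quadrupling through 2030."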

“If pursued, we might see by the end of the decade advances in AI as drastic as the difference between the rudimentary text generation of GPT-2 in 2019 and the sophisticated problem-solving abilities of GPT-4 in 2023,” Epoch wrote in a recent research report assessing how plausible this scenario is.