RE: LeoThread 2025-01-28 12:24

in LeoFinance · 11 months ago

Much of the AI boom and the demand for Nvidia GPUs was driven by the "scaling law," a concept in AI development proposed by OpenAI researchers in 2020. That concept suggested that better AI systems could be developed by greatly expanding the amount of computation and data that went into building a new model, requiring more and more chips.

Since November, Huang and Altman have been focusing on a new wrinkle to the scaling law, which Huang calls "test-time scaling."

This concept holds that if a fully trained AI model spends extra compute at inference time to "reason" while making predictions or generating text or images, it will provide better answers than it would if it ran for less time.
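One intuition behind test-time scaling can be sketched with a toy example. The code below is not how any real model implements reasoning; it is a minimal illustration, with an assumed toy "model" that answers correctly only some of the time, showing that spending more compute at inference (sampling more attempts and taking a majority vote, a best-of-N-style strategy) tends to yield a better final answer than a single quick attempt.

```python
import random
from collections import Counter

def noisy_model(rng):
    # Hypothetical stand-in for one inference pass of a model:
    # returns the correct answer (42) only 40% of the time,
    # otherwise one of three wrong answers.
    if rng.random() < 0.4:
        return 42
    return rng.choice([41, 43, 44])

def answer(n_samples, seed=0):
    # "Test-time scaling" in miniature: spend more compute by
    # sampling the model n_samples times, then majority-vote.
    rng = random.Random(seed)
    votes = Counter(noisy_model(rng) for _ in range(n_samples))
    return votes.most_common(1)[0][0]
```

Because the correct answer is more likely than any single wrong one (40% vs. roughly 20% each), the majority vote over many samples converges on it, even though any individual sample is more often wrong than right. Real systems use far more sophisticated inference-time strategies, but the compute-for-accuracy trade-off is the same idea.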