You are viewing a single comment's thread from:

RE: LeoThread 2025-02-17 13:35

in LeoFinance · 9 months ago

There are real performance gains from test-time compute. The OpenAI o3 model used 30,000 H100 GPU hours to answer the toughest math and reasoning problems. Inference at that scale will only be used when there is great value in pushing the boundaries of AI capability, when a vastly superior answer is urgently needed. That calls for some form of question pre-analysis or routing to estimate how much effort a query is worth.
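A minimal sketch of what such routing could look like: score a question's difficulty with a crude heuristic, then pick a compute tier. Everything here, the difficulty score, the marker words, and the tier names, is a hypothetical illustration, not any real provider's API.

```python
# Hypothetical question pre-analysis / routing sketch: estimate how much
# inference effort a query is worth before committing expensive compute.
# Heuristic and tier names are illustrative assumptions only.

def estimate_difficulty(question: str) -> float:
    """Crude difficulty score in [0, 1] based on surface features."""
    hard_markers = ["prove", "optimize", "derive", "multi-step", "theorem"]
    score = 0.2 + 0.15 * sum(m in question.lower() for m in hard_markers)
    score += min(len(question) / 2000, 0.3)  # longer questions skew harder
    return min(score, 1.0)

def route(question: str) -> str:
    """Pick a compute tier: cheap local model, mid-size, or heavy reasoning."""
    d = estimate_difficulty(question)
    if d < 0.3:
        return "local-small"      # near-free local model
    elif d < 0.6:
        return "hosted-mid"       # standard hosted model
    return "reasoning-heavy"      # expensive test-time-compute run

print(route("What is 2 + 2?"))                                      # local-small
print(route("Prove the theorem and optimize each multi-step bound."))  # reasoning-heavy
```

In practice the difficulty estimate would itself be a small, cheap model, so that only queries where the extra compute is worth it get routed to the expensive reasoning tier.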

DeepSeek shows AI continues to improve rapidly: we are getting better results for less energy and less cost. It shows that AI will be profitable even as answers fall in price, eventually becoming virtually free with ever more capable local models. The world will change, and more AI data centers will be needed to deliver the best answers or agent actions for the most valuable and challenging needs.