RE: LeoThread 2025-02-16 13:19

Some technologists believe DeepSeek may have achieved such a high level of performance by training its models on the outputs of larger U.S. AI systems.

This technique, known as "distillation," involves training a newer, smaller model on the outputs of a more powerful model, so the smaller model learns to reproduce the larger one's answers.
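Loosely, the mechanics look like the sketch below: a small "student" network is trained to match the softened output distribution of a frozen "teacher." This is a minimal, generic illustration in PyTorch; the models, data, and hyperparameters are placeholders, not anything DeepSeek or OpenAI has disclosed about their actual pipelines.

```python
# Minimal knowledge-distillation sketch (generic illustration only).
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical stand-ins: a larger "teacher" and a smaller "student".
teacher = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 10))
student = nn.Sequential(nn.Linear(32, 10))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 2.0  # temperature: softens the teacher's output distribution

for step in range(100):
    x = torch.randn(64, 32)  # placeholder inputs; real setups use prompts/data
    with torch.no_grad():
        teacher_logits = teacher(x)  # the teacher's outputs are the target
    student_logits = student(x)
    # The student is trained to match the teacher's softened distribution.
    loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The temperature `T` softens both distributions so the student learns from the teacher's near-miss probabilities rather than only its top answer, which is what lets a much smaller model absorb a surprising amount of the larger model's behavior.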

It's a claim OpenAI itself has alluded to: the company told CNBC in a statement last month that it is reviewing reports that DeepSeek may have "inappropriately" used output data from its models to develop its own AI model.