RE: LeoThread 2025-01-30 12:14

in LeoFinance 9 months ago

That means the newer model can reap the benefits of the massive investments of time and computing power that went into building the initial model without the associated costs.

This form of distillation, which differs from how most academic researchers previously used the term, is a common technique in the AI field.
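In the classic sense, distillation means training a smaller "student" model to imitate a larger "teacher" by matching the teacher's softened output probabilities rather than hard labels. Below is a minimal illustrative sketch of that soft-label loss; the logit values and variable names are made up for demonstration, not taken from any real model.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: a higher temperature yields a
    softer (more uniform) probability distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between the teacher's and student's softened
    outputs -- the core objective in classic knowledge distillation.
    The student is trained to minimize this quantity."""
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy example (illustrative numbers): a student whose outputs are
# close to the teacher's incurs a lower distillation loss.
teacher = [3.0, 1.0, 0.2]
good_student = [2.8, 1.1, 0.3]  # roughly mimics the teacher
bad_student = [0.2, 1.0, 3.0]   # disagrees with the teacher

assert distillation_loss(teacher, good_student) < distillation_loss(teacher, bad_student)
```

The student inherits the teacher's learned behavior through these soft targets, which is why a new model can benefit from the teacher's training investment without repeating it.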

However, it violates the terms of service of some prominent models released by US tech companies in recent years, including OpenAI's.