
RE: LeoThread 2025-08-01 08:50

in LeoFinance · 2 months ago

What seems absurd at a $1M market cap could become crystal clear at $1B. That's the journey of a billion-dollar opportunity.


What is the smallest number of tokens that an AI model was trained on? Is that still DeepSeek?

Can you give me a list?

Not sure about the smallest token count for AI training, and I couldn't find a definitive list. Worth noting, though: DeepSeek's models weren't trained on unusually few tokens (DeepSeek-V3 reportedly used around 14.8 trillion). Their efficiency claims are about training compute and cost, not a small dataset. Wild stuff