
RE: LeoThread 2025-01-30 12:14

in LeoFinance · 9 months ago

DeepSeek's models are much smaller than many other large language models. V3 has a total of 671 billion parameters, or variables that the model learns during training. OpenAI doesn't disclose parameter counts, but experts estimate its latest model has at least a trillion.

In terms of performance, DeepSeek says its R1 model achieves results comparable to OpenAI's o1 on reasoning tasks, citing benchmarks including AIME 2024, Codeforces, GPQA Diamond, MATH-500, MMLU, and SWE-bench Verified.

In a technical report, the company said training its V3 model cost only $5.6 million, a fraction of the billions of dollars that notable Western AI labs such as OpenAI and Anthropic have spent to train and run their foundation models. It isn't yet clear how much DeepSeek's models cost to run, however.