
Part 3/11:

Training consumed approximately 30.84 million GPU hours on 16,000 Nvidia H100 GPUs, top-tier hardware suited to large-scale AI training. The environmental footprint was substantial as well: an estimated 11,390 tons of CO2-equivalent emissions, underlining both the model's massive resource appetite and the scale of the engineering achievement.
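
As a rough sanity check, those figures imply on the order of 80 days of wall-clock training. Here is a minimal back-of-the-envelope sketch of that arithmetic in Python, assuming the numbers above and ideal scaling (real runs never achieve perfect utilization, so treat these as lower bounds):

```python
# Rough wall-clock time and per-GPU-hour emissions implied by the
# reported figures. Assumes all 16,000 GPUs run concurrently at full
# utilization for the entire run, which is an idealization.
gpu_hours = 30.84e6   # total GPU hours reported for training
num_gpus = 16_000     # Nvidia H100 GPUs used concurrently
co2_tons = 11_390     # reported CO2-equivalent emissions, in metric tons

wall_clock_hours = gpu_hours / num_gpus
print(f"Wall-clock time: {wall_clock_hours:,.0f} h ≈ {wall_clock_hours / 24:.0f} days")

grams_per_gpu_hour = co2_tons * 1_000_000 / gpu_hours
print(f"CO2 per GPU-hour: {grams_per_gpu_hour:.0f} g")
```

This works out to roughly 1,930 hours (about 80 days) of continuous training and on the order of 370 g of CO2-equivalent per GPU-hour.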


Competitive Benchmarks and Capabilities

Given its colossal size, Meta claims that Llama 3.1 can rival industry leaders such as OpenAI's GPT-4 and Anthropic's Claude 3.5 Sonnet across a range of tasks, from text generation to conversational responses. These claims are bold, but Meta's internal assessments suggest that Llama 3.1 performs competitively across multiple benchmarks.
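
Because the weights are openly available, readers can probe these capabilities themselves with standard tooling. A minimal sketch using the Hugging Face transformers pipeline follows; it assumes the smaller meta-llama/Llama-3.1-8B-Instruct checkpoint (the 405B variant needs multi-GPU infrastructure), approved gated access on Hugging Face, and a GPU with sufficient memory:

```python
# Minimal text-generation sketch with a Llama 3.1 checkpoint.
# Assumptions: transformers and torch installed, gated-model access
# granted, and enough GPU memory for the 8B model in bfloat16.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.1-8B-Instruct",  # smallest Llama 3.1 variant
    torch_dtype=torch.bfloat16,
    device_map="auto",  # place model layers on available devices
)

messages = [
    {"role": "user", "content": "Summarize the trade-offs of open-weight LLMs."},
]
outputs = generator(messages, max_new_tokens=128)
# The pipeline returns the full conversation; the last message is the reply.
print(outputs[0]["generated_text"][-1]["content"])
```

Swapping in a larger checkpoint is a one-line change, which is part of the appeal of the open-weight release discussed next.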


Open Source: Democratizing AI Development