
RE: LeoThread 2025-04-09 04:20


Part 4/10:

In a significant leap for small-model efficiency, Mistral released an open-source small model, Mistral Small 3.1, that outperforms larger closed-source models. With just 24 billion parameters, this multimodal model promises strong performance on local machines, capable of running on a single RTX 4090 or a Mac with 32GB of RAM. It excels at advanced reasoning and offers a context window of 128,000 tokens. Users are encouraged to download it and explore its capabilities.
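For anyone who wants to try it locally, here is a minimal text-only sketch using the Hugging Face `transformers` pipeline. The model ID, chat-template support, and memory figures are assumptions (check the hub page for the exact checkpoint name; fitting the 24B weights on an RTX 4090 or a 32GB Mac generally requires a quantized variant, and the multimodal features may need a different pipeline):

```python
# Minimal local-inference sketch (assumptions: weights published under an ID
# like "mistralai/Mistral-Small-3.1-24B-Instruct-2503"; verify on the hub).
import torch
from transformers import pipeline

model_id = "mistralai/Mistral-Small-3.1-24B-Instruct-2503"  # assumed ID

# device_map="auto" spreads the weights across available GPU/CPU memory;
# bf16 needs ~48GB, so a 4090 or 32GB Mac would use a quantized build instead.
chat = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [{"role": "user", "content": "Explain what a 128k-token context window lets a model do."}]
result = chat(messages, max_new_tokens=256)

# With chat-style input, generated_text is the message list; the last entry
# is the assistant's reply.
print(result[0]["generated_text"][-1]["content"])
```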

Claude Updates: Enhanced Web Search Capability