RE: LeoThread 2024-10-22 21:22

Eye-popping numbers like these make it easy to forget size isn’t everything.

Some researchers, particularly those with fewer resources, are aiming to do more with less. AI scaling will continue, but the underlying algorithms will also become far more efficient as models grow.

Last week, researchers at the Allen Institute for Artificial Intelligence (Ai2) released a new family of open-source multimodal models competitive with state-of-the-art models like OpenAI's GPT-4o, but an order of magnitude smaller. Called Molmo, the models range from 1 billion to 72 billion parameters. GPT-4o, by comparison, is estimated to top a trillion parameters.