Better than DeepSeek?
@taskmaster4450 & @khaleelkazi Have you seen that "Ai2" (the Allen Institute for AI, based in Seattle) has released the open-source model "Tülu 3 405B"? They describe it as "The first application of fully open post-training recipes to the largest open-weight models. With this release, we demonstrate the scalability and effectiveness of our post-training recipe applied at 405B parameter scale".
Sharing in case it could be useful for LeoAI to follow that path, or to give us access to that model (if LeoAI becomes a "multi-model bridge").
Link to their announcement 👇
The flood gates are open!!
Absolutely!!
And it seems to be a fully open-source model, without the Meta-like gatekeeping practices, lol.
https://allenai.org/blog/tulu-3-405B