Who cares if the code is open source, or pre-training weights are released? Virtually every master's student in CS in 2024 is building this from scratch. The differentiator is the training dataset, or at worst, the weights after fine-tuning the model.
Who really wants an LLM trained on Musk’s Great Replacement blog?
Training something like this costs millions. It's not going to be useful to many people because of its size and the cost of doing anything with it.
I don't think that giving alms justified the existence of feudal lords. In the same way, I think the fact that nation states can't keep up with the research expenditure of a few rich men (think space travel) shows us that we have a problem. That said, it does represent a fairly generous donation to science, start-ups, or whatever.
Its size makes it basically useless. It underperforms models even in its active weight class. It's nice that it's available, but Grok-0 would have been far more interesting.
This is the best summary I could come up with:
It is not fine-tuned for applications such as natural language dialogue; it represents the raw base model checkpoint from the pre-training phase, which concluded in October 2023.
Grok will be familiar to users of Musk’s social media platform, X, and subscribers have been able to ask the chatbot questions and receive answers.
If a user flicks through a dog-eared copy of The Hitchhiker’s Guide to the Galaxy radio scripts, the following definition can be found lurking in Fit the Tenth: "The Hitchhiker’s Guide to the Galaxy is an indispensable companion to all those who are keen to make sense of life in an infinitely complex and confusing universe, for though it cannot hope to be useful or informative on all matters, it does make the reassuring claim that where it is inaccurate, it is at least definitively inaccurate."
The release comes on the first anniversary of the launch of OpenAI’s GPT-4 model, and Musk’s legal spat with his former AI pals remains in the background.
OpenAI responded by releasing a trove of emails, claiming Musk was fully aware of its plans and wanted it folded into Tesla.
By opening up the weights behind Grok-1, Musk is attempting to plant a flag in the opposite camp to the proprietary world of OpenAI.
The original article contains 639 words, the summary contains 210 words. Saved 67%. I’m a bot and I’m open source!