• Grok-1’s base model weights and network architecture are now available on GitHub.
• Developers can explore, modify, and redistribute the code under the Apache 2.0 license.
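As a minimal sketch of getting the release, the code lives at github.com/xai-org/grok-1, and the README points to a Hugging Face mirror of the weights; the repo id, the ckpt-0/ pattern, and the local path below are taken on that assumption rather than verified here.

```python
# Minimal sketch: fetch the released Grok-1 checkpoint with huggingface_hub.
# Assumes the weights are mirrored at the Hugging Face repo "xai-org/grok-1"
# under a ckpt-0/ directory, as the xai-org/grok-1 README describes.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="xai-org/grok-1",      # assumed mirror of the released weights
    repo_type="model",
    allow_patterns=["ckpt-0/*"],   # the released pre-training checkpoint
    local_dir="checkpoints",       # where the repo's example code looks
)
```

From there, the repository's own setup (per its README: `pip install -r requirements.txt`, then `python run.py`) runs a sample query against the checkpoint; note that loading the full 314B-parameter weights requires substantial multi-GPU memory.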
• Grok-1 is a 314-billion-parameter Mixture-of-Experts model trained from scratch by xAI.
• The released model represents Grok-1 during its pre-training phase in October 2023.
• While it has been trained on a large amount of text data, it hasn’t been fine-tuned for any specific task.
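To make the Mixture-of-Experts idea concrete, here is a toy top-k routing layer in plain NumPy. The expert count and top-2 routing mirror what has been reported for Grok-1 (8 experts, 2 active per token), but the sizes, the linear "experts", and the gating scheme are illustrative assumptions, not xAI's implementation.

```python
# Toy Mixture-of-Experts layer: a router scores experts per token, and only
# the top-k experts actually run. Sizes are illustrative, not Grok-1's;
# Grok-1 reportedly routes each token to 2 of 8 experts.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

# Each "expert" is a simple linear map; the router produces per-expert scores.
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
router = rng.normal(size=(d_model, n_experts))

def moe_forward(x):                       # x: (tokens, d_model)
    logits = x @ router                   # (tokens, n_experts) routing scores
    top = np.argsort(logits, axis=-1)[:, -top_k:]   # top-k expert ids per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        scores = logits[t, top[t]]
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()          # softmax over the selected experts only
        for w, e in zip(weights, top[t]):
            out[t] += w * (x[t] @ experts[e])   # only k of n_experts do work
    return out

tokens = rng.normal(size=(4, d_model))
print(moe_forward(tokens).shape)          # (4, 16): same shape, sparse compute
```

The design point is sparsity: each token touches only top_k of the n_experts weight matrices, so total parameter count can far exceed the compute spent per token; that is how a 314-billion-parameter model keeps per-token cost closer to that of a much smaller dense model.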
• In an apparent nod to the movie Bill & Ted’s Excellent Adventure, the code of conduct accompanying Grok’s release simply states: “Be excellent to each other.”
• Grok-1 competes with other chatbot models like OpenAI’s ChatGPT.
• Elon Musk co-founded and helped fund OpenAI, and has been pushing for open-source AI.
• While Grok-1’s out-of-the-box capabilities are limited (it is a raw pre-trained checkpoint, not an instruction-tuned assistant), its open-source availability encourages experimentation and innovation.
• Developers can contribute to its growth and explore new use cases.