Elon Musk's AI company, xAI, has announced plans to release two new versions of its large language model (LLM), Grok. Grok 2 is set to launch next month, while Grok 3 is expected in December.
Grok 2 was trained on approximately 15,000 GPUs, including NVIDIA's H100 chips. xAI claims that Grok 2 will perform on par with, or close to, GPT-4.
Grok 3 is being trained at xAI's supercomputing facility in Memphis, Tennessee, which the company describes as the world's most powerful AI training cluster, comprising 100,000 liquid-cooled H100 GPUs. xAI aims to complete Grok 3's training in about three to four months, followed by fine-tuning and bug fixing ahead of its anticipated December release.
Analyst QuickTake: This news comes two months after the company raised USD 6 billion to boost R&D for future technology ventures. In April, it also launched Grok-1.5V, a multimodal model capable of interpreting textual and visual data. xAI's flagship model, Grok, is available exclusively on the X platform and competes with OpenAI's ChatGPT.