What Happened: Musk recently disclosed that the upcoming version of his AI chatbot, Grok 3, will be trained on a massive 100,000 Nvidia H100 chips.
The H100 chips, built on Nvidia's Hopper architecture, are crucial for handling data processing in large language models (LLMs) and are highly sought after in Silicon Valley.
Each Nvidia H100 GPU chip is estimated to cost around $30,000, with some estimates reaching as high as $40,000.
That would put the AI chips used to train Grok 3 at roughly $3 billion to $4 billion in value, nearly three to four times the scale of the GPU clusters Mark Zuckerberg's Meta Platforms Inc. reportedly used to train its Llama models.
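The dollar figures above follow directly from the reported chip count and per-chip price estimates. A quick back-of-the-envelope sketch (the prices are the estimates cited in this article, not confirmed purchase prices):

```python
# Back-of-the-envelope cost estimate for Grok 3's reported training hardware.
CHIP_COUNT = 100_000   # H100 GPUs Musk says Grok 3 will train on
PRICE_LOW = 30_000     # low-end per-chip estimate, USD
PRICE_HIGH = 40_000    # high-end per-chip estimate, USD

low_total = CHIP_COUNT * PRICE_LOW    # 3,000,000,000
high_total = CHIP_COUNT * PRICE_HIGH  # 4,000,000,000

print(f"${low_total / 1e9:.1f}B to ${high_total / 1e9:.1f}B")  # prints "$3.0B to $4.0B"
```

Note that this is list-price arithmetic only; it excludes networking, data centers, power, and any cloud-rental discounts, so the true outlay could differ substantially.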
"Grok 3 end of year after training on 100k H100s should be really something special," Musk wrote in a post on X, replying to @BasedBeffJezos.
It is unclear if these chips were purchased outright by Musk's company or rented from cloud service providers.
In a previous interview, Musk mentioned that Grok 2 required around 20,000 H100s for training.
Musk's AI startup, xAI, has already released Grok-1 and Grok-1.5, with Grok 2 set to launch in August. Musk hinted that Grok 3 will be released by the end of the year.
Why It Matters: The use of AI in various industries is increasing rapidly, leading to a high demand for AI chips.
This has led to a race among tech companies to acquire these chips, as seen in the case of Musk's xAI and Meta, which is also stocking up on a large number of GPUs.
This has also contributed to the fierce competition for top AI talent, as highlighted by Aravind Srinivas, founder and CEO of AI startup Perplexity.
The development of AI technology and the resources invested in it are crucial for the future of various industries, including automotive and space exploration, where Musk's companies are major players.