Elon Musk rolls out 'Colossus' supercomputer to compete with OpenAI's GPT-4

Technology

Colossus is meant to train xAI's LLM Grok using Nvidia H200 chips

(Web Desk) - Elon Musk, the founder of multiple billion-dollar companies, has unveiled 'Colossus', a new supercomputer built by his AI startup xAI.

"The most powerful AI training system in the world,” as per Elon Musk and would compete with OpenAI's GPT-4.

Colossus is meant to train xAI's large language model (LLM), Grok, and is located in Memphis. Musk's third-generation AI model, Grok-3, will be trained on the supercomputer.

Billed as the most powerful AI system in the world, Colossus is anticipated to be released in December.

The project is developing at a rapid pace amid tough competition among tech rivals Microsoft, Google, and Amazon to harness AI and acquire AI chips.

Elon Musk also disclosed plans to acquire 50,000 of Nvidia's advanced H200 chips, which outperform the H100 models currently in use.

An early beta version of Grok-2, trained on 15,000 Nvidia H100 chips, was recently rolled out and is already being recognised as one of the most capable AI models.

Musk has committed to spending $3-4 billion on Nvidia hardware this year for both xAI and Tesla.

The project has also raised eyebrows in Memphis, as Colossus requires up to 1 million gallons of water per day for cooling and could consume as much as 150 megawatts of power, straining the city's resources.