Musk’s new Memphis data center reaches a milestone in artificial intelligence

Elon Musk’s new xAI data center in Memphis reached a major milestone this week, bringing all 100,000 advanced Nvidia chips online at once, according to people with knowledge of the matter.

The milestone makes the data center, nicknamed “Colossus,” the most powerful known AI training system ever built and represents a significant technical feat for xAI, a relatively young company that got the massive facility up and running in less than six months.

While Musk has tweeted that the facility is the largest in the world, industry experts questioned whether xAI had the energy supply or technical ability to run so many GPUs (in this case, Nvidia’s H100 chips) simultaneously.

“Musk may be overestimating how many GPUs are actually running on a single cluster,” The Information reported earlier this month. “No other company has been able to successfully assemble 100,000 GPUs due to the limitations of networking technology, which connects chips so they can act as a single computer.”

Hitting that milestone earlier this week allows the company to train an AI model with more computing power behind it than any known model in history. xAI is using the data center to train the artificial intelligence model behind Grok, the company’s chatbot, which bills itself as an uncensored alternative to ChatGPT.

Musk did not immediately respond to a request for comment.

xAI pursued its goal aggressively, going so far as to hook up natural gas turbines to supplement conventional power, a stopgap measure to keep the facility running even as utility officials work to bring more power to the site.

Energy has become one of the biggest challenges in the effort to build more powerful AI models. Bloomberg reported that OpenAI CEO Sam Altman has asked U.S. government officials for help building data centers that would each require five gigawatts of power, the output of as many as five nuclear power plants.

Microsoft, BlackRock and Abu Dhabi’s MGX are collaborating on a $30 billion investment fund targeting infrastructure projects for massive data centers used for artificial intelligence.

The race to build larger data centers is why OpenAI is pursuing billions of dollars in new financing and seeking to change its corporate structure to allow for larger investments.

Computing power alone does not guarantee a better AI model, but one school of thought in the industry holds that more compute generally yields more capable models.

It is also possible to train multiple AI models and then combine them into a larger one, an approach sometimes referred to as a “mixture of experts.”
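
In a mixture of experts, a small gating network decides how much each separately trained “expert” model contributes to the combined output for a given input. The snippet below is a minimal, illustrative sketch of that idea in NumPy, not xAI’s actual architecture; the sizes, weight matrices, and the mixture_of_experts function are invented for the example.

```python
# Toy mixture-of-experts sketch (illustrative only; all weights are random stand-ins).
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts = 8, 4

# Each "expert" stands in for a separately trained model (here, just a linear map).
expert_weights = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]

# A small gating network scores how much each expert should contribute per input.
gate_weights = rng.standard_normal((d_model, n_experts))

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def mixture_of_experts(x):
    """Blend the experts' outputs, weighted by the gate, into one combined output."""
    gate = softmax(x @ gate_weights)                     # (batch, n_experts)
    outputs = np.stack([x @ w for w in expert_weights])  # (n_experts, batch, d_model)
    return np.einsum("be,ebd->bd", gate, outputs)        # weighted sum over experts

x = rng.standard_normal((2, d_model))   # a batch of two toy inputs
print(mixture_of_experts(x).shape)      # -> (2, 8)
```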

Historically, the more GPUs connected under one roof, the more powerful the models produced.