A Chinese AI company’s more frugal approach to training large language models could point toward a less energy-intensive—and more climate-friendly—future for AI, according to some energy analysts.
“It raises the potential that electric demand from model training may end up being lower than previously thought,” John Larsen, a partner at the research firm Rhodium Group, told Newsweek via email.
Chinese startup DeepSeek claims to have developed its high-performing AI tool using a fraction of the computing power that U.S. tech companies have needed to train comparable large language models (LLMs).
The news roiled tech stocks this week because DeepSeek's apparent success undermined the presumption that U.S. tech companies would dominate AI through sheer scale, leveraging bigger data sets and building bigger data centers to supply that computing power.