Chinese AI startup DeepSeek has made headlines with its claim that it developed a highly sought-after chatbot at a significantly lower cost than those created by leading American tech firms.
This revelation raises concerns about the vast sums U.S. companies are investing in expanding energy-intensive data centers needed for the next generation of artificial intelligence.
If DeepSeek’s chatbot can operate effectively at a fraction of the cost, it raises questions about whether the tech world requires as much electricity as previously thought.
This has critical implications for climate change: AI technologies consume substantial energy, often sourced from fossil fuels, contributing to environmental degradation.
Tech companies are reporting rising energy consumption, running counter to their earlier commitments to cut electricity use and undermining their climate strategies.
“There’s been a reckless drive for investment in fossil fuel resources,” noted Eric Gimon, a senior fellow at Energy Innovation, suggesting this moment could prompt the industry to reconsider its priorities.
Experts say that improving AI efficiency could ease the environmental strain, even if the technology's overall energy demands remain high.
After DeepSeek launched its AI assistant, it quickly became the most downloaded free app on Apple’s iPhone, surpassing competitors such as ChatGPT and Google’s Gemini.
“It was astounding to wake up and find a new app emerging as the top player overnight; it could indeed change the game,” said Jay Woods, chief global strategist at Freedom Capital Markets, noting the immediate ripple in the market.
The app demonstrates impressive capabilities, such as generating software code, solving math problems, and answering complex questions while explaining its reasoning.
Analysts have scrutinized DeepSeek’s public research on its cutting-edge R1 model, particularly the claim that training its V3 model cost just $5.6 million, strikingly low compared with the billions spent developing systems like ChatGPT.
That figure, however, covers only the chatbot’s training and excludes the cost of earlier research.
DeepSeek also worked under constraints: U.S. export rules restrict the most advanced AI chips from being sold in China, so the company trained its model on a less powerful Nvidia chip that remains available there.
The energy consumption of data centers in the U.S. is anticipated to rise dramatically, possibly doubling or tripling by 2028.
Currently, data centers account for approximately 4.4% of the country’s electricity usage; projections from the Lawrence Berkeley National Laboratory suggest this could climb to between 6.7% and 12% within five years.
Traditionally, it has been assumed that U.S. technology giants must invest heavily to construct data centers to support their AI developments.
For instance, Meta Platforms, the parent company of Facebook and Instagram, intends to allocate up to $65 billion this year on infrastructure projects, including a large data center in Louisiana.
Microsoft has set its spending target at $80 billion, while recent announcements from executives at OpenAI, Oracle, and SoftBank indicated a joint investment effort aiming to reach $500 billion in data centers and electricity infrastructure for AI, beginning with a Texas project already underway.
Experts also note that greater AI efficiency could spur wider adoption, keeping energy demand on the rise.
Vic Shao, founder of DC Grid, which supplies off-grid power to data centers and electric vehicle charging stations, said that once a technology becomes widely accessible and affordable, usage is bound to soar.
That means new data centers will still be built, but they may operate more efficiently, said Travis Miller, an energy and utilities strategist at Morningstar Securities Research.
He believes that the projected increase in electricity demand will likely fall toward the lower end of most forecasts.
If DeepSeek’s claims hold up, some routine AI tasks may no longer require data centers at all and could instead run directly on smartphones, argued Rahul Sandil, MediaTek’s vice president and general manager for global marketing and communications.
That shift could reduce computing demands and buy time to expand renewable energy sources for data centers.
Still, energy stocks tied to the AI boom, such as Bloom Energy, fell on the news, with CEO KR Sridhar stressing that U.S. leadership in AI is critical, particularly in powering data centers with clean energy rather than the coal that other countries still rely on.
“We will keep improving our technologies,” he stated.
Rick Villars, an analyst with the market research firm IDC, said DeepSeek’s advances could shape the direction of AI research, but the need for data centers and energy remains unchanged.
“We anticipate that this could expedite the integration of AI into various facets of our daily lives, including work and healthcare,” he noted, while affirming that sufficient capacity will still be necessary.