There are a lot of indications that we're currently brute-forcing these models. There's honestly no reason they have to be 1T parameters and cost an insane amount to train and run inference on.
What we're going to see is that, as energy becomes a problem, they'll simply shift to more effective and efficient architectures, in both physical hardware and model design. I suspect they can also simply charge more for the service, which would reduce usage for senseless applications.
There are also elements of stock price hype and geopolitical competition involved.
The major U.S. tech giants are all tied to the same bandwagon — they have to maintain this cycle:
buy chips → build data centers → release new models → buy more chips.
It might only stop once the electricity problem becomes truly unsustainable.
Of course, I don’t fully understand the specific situation in the U.S.,
but I even feel that one day they might flee the U.S. altogether and move to the Middle East to secure resources.
I think the most interesting recent Chinese model may be MiniMax M2, which is just 200B parameters but benchmarks close to Sonnet 4, at least for coding. That's small enough to run well on ~$5,000 of hardware, as opposed to the 1T models which require vastly more expensive machines.
That number is as real as the "$5.5 million" to train DeepSeek. Maybe it's real if you're only counting the literal final training run, but with the huge number of failed runs and all the other costs accounted for, it's several hundred million to train a model that's usually still worse than Claude, Gemini, or ChatGPT. It took $1B+ ($500 million on energy and chips ALONE) for Grok to get into the "big 4".
By that logic, one could even argue that the real cost needs to include infrastructure: total investment in the semiconductor industry, the national electricity grid, education, and even defence.
> That's small enough to run well on ~$5,000 of hardware...
Honestly curious where you got this number, unless you're talking about extremely small quants. Even just a Q4 GGUF quant is ~130 GB. Am I missing out on a relatively cheap way to run models this large well?
I suppose you might be referring to a Mac Studio, but (while I don't have one to be a primary source of information) it seems like there's some argument to be made about whether they run models "well"?
Admittedly I haven't often tried running on system RAM, but every time I have, with something like KoboldCPP or Ollama, it's been abysmally slow (< 1 T/s). Is there any particular method required to run them faster, or is it just "get faster RAM"? I fully admit my DDR3 system has quite slow RAM...
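For what it's worth, what you're seeing is mostly a memory-bandwidth ceiling rather than a software problem: single-stream decoding has to read every active weight once per token. A minimal sketch of the back-of-envelope math (the bandwidth figures, the ~4.4-bit Q4 size, and the MoE active-parameter count are all my assumptions, not measurements):

```python
# Back-of-envelope: single-stream decoding is roughly memory-bandwidth-bound,
# so tokens/sec is capped at (usable bandwidth) / (bytes read per token).

Q4_BYTES_PER_PARAM = 0.55  # ~4.4 bits/param for a typical Q4 GGUF quant (assumption)

def max_tokens_per_sec(active_params: float, bandwidth_gb_s: float) -> float:
    """Upper bound on decode speed when every active weight is read per token."""
    bytes_per_token = active_params * Q4_BYTES_PER_PARAM
    return bandwidth_gb_s * 1e9 / bytes_per_token

# Dense 200B model: all 200B weights are touched for every token.
for name, bw in [("DDR3 dual-channel", 25),
                 ("DDR5 dual-channel", 80),
                 ("Mac Studio unified memory", 800)]:
    print(f"{name:28s} ~{max_tokens_per_sec(200e9, bw):5.2f} tok/s")

# A sparse MoE activating only ~10B params per token is ~20x faster
# on the same hardware, which is what makes big models usable on CPUs/Macs.
print(f"{'MoE (~10B active), DDR5':28s} ~{max_tokens_per_sec(10e9, 80):5.2f} tok/s")
```

Under those assumptions a dense 200B model lands well under 1 tok/s on DDR3, matching your experience; faster RAM helps linearly, but the big win is a sparse MoE that reads far fewer weights per token.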
Hard to be sure because the source of that information isn't known, but generally when people talk about training costs like this they include more than just the electricity but exclude staffing costs.
Other reported training costs tend to include rental of the cloud hardware (or equivalent if the hardware is owned by the company), e.g. NVIDIA H100s are sometimes priced out in cost-per-hour.
Citation needed on "generally when people talk about training costs like this they include more than just the electricity but exclude staffing costs".
It would be simply wrong to exclude staffing costs. When each engineer costs well over $1 million a year all-in, you sure as hell account for them.
If you have 1,000 researchers working for your company and you constantly have dozens of different training runs on the go, overlapping each other, how would you split those salaries between those runs?
Calculating the cost in terms of GPU-hours is a whole lot easier from an accounting perspective.
The papers I've seen that talk about training cost all do it in terms of GPU hours. The gpt-oss model card said 2.1 million H100-hours for gpt-oss:120b. The Llama 2 paper said 3.31M GPU-hours on A100-80G. They rarely give actual dollar costs and I've never seen any of them include staffing hours.
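For a rough sense of scale, here's a hedged conversion of those published GPU-hour figures to dollars; the $/hour rates are my ballpark cloud-rental assumptions, since the papers don't publish prices:

```python
# Convert published GPU-hour figures to approximate dollar costs.
# The $/hour rates are rental-market assumptions, not from the papers.

runs = {
    "gpt-oss:120b (H100-hours)": (2.1e6, 2.50),   # assume ~$2.50/hr per H100
    "Llama 2 (A100-80G hours)":  (3.31e6, 1.50),  # assume ~$1.50/hr per A100
}

for name, (gpu_hours, usd_per_hour) in runs.items():
    print(f"{name}: ~${gpu_hours * usd_per_hour / 1e6:.1f}M")
```

Both come out in the single-digit millions for the final run alone, which shows how far these numbers are from total program cost.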
No, they don't! That's why the "$5.5 million" DeepSeek V3 number as read by American investors was total bullshit (because investors ignored the asterisk saying "only the final training run").
Yeah, that's one of the most frustrating things about these published numbers. Nobody ever wants to share how much money they spent on runs that didn't produce a useful model.
As with staffing costs though it's hard to account for these against individual models. If Anthropic run a bunch of training experiments that help them discover a new training optimization, then use that optimization as part of the runs for the next Opus and Sonnet and Haiku (and every subsequent model for the lifetime of the company) how should the cost of that experimental run be divvied up?
No, because what people are generally trying to express with numbers like these is how much compute went into training. Perhaps another measure, like zettaFLOPs or something, would have made more sense.
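A sketch of what that could look like, using the standard 6ND approximation for training compute; the parameter count, token count, and per-GPU throughput below are hypothetical:

```python
# Training compute ~= 6 * N * D FLOPs (N = parameters, D = training tokens).
# N, D, and the sustained per-GPU throughput are illustrative assumptions.

N = 200e9   # 200B parameters (hypothetical)
D = 15e12   # 15T training tokens (hypothetical)

flops = 6 * N * D
print(f"~{flops:.1e} FLOPs (~{flops / 1e21:,.0f} zettaFLOPs)")

# Back out GPU-hours, assuming ~4e14 sustained FLOP/s per H100
# (roughly 40% utilization of peak dense BF16 throughput).
h100_hours = flops / (4e14 * 3600)
print(f"~{h100_hours / 1e6:.1f}M H100-hours")
```

The nice property is that FLOPs are hardware- and price-independent, so two labs' runs stay comparable even when their GPU fleets and rental rates differ.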
China's energy mix is more fragile than that of the US.
> Coal is by far China’s largest energy source, while the United States has a more balanced energy system, running on roughly one-third oil, one-third natural gas, and one-third other sources, including coal, nuclear, hydroelectricity, and other renewables.
Also, China's economy is a bit less efficient in terms of energy used per unit of GDP. China relies on coal and imports.
> However, China uses roughly 20% more energy per unit of GDP than the United States.
Remember, China still suffers from blackouts when manufacturing demand outstrips supply. The Fortune article seems like a fluff piece.
China has been adding something like a 1GW coal plant’s worth of solar generation every eight hours in the past year, and the rate is accelerating. The US is no longer a serious competitor for China when it comes to energy production.
The reason it happened in 2021, I think, might be that China absorbed the production-capacity gap caused by COVID shutdowns elsewhere in the world. The short-term surge in production led to a temporary imbalance between electricity supply and demand.
This was very surprising to me, so I just fact-checked this statement (using Kimi K2 Thinking, natch), and it's presently off by a factor of 2-4. In 2024 China installed 277 GW of solar, i.e. 0.25 GW per 8 hours. In the first half of 2025 they installed 210 GW, i.e. 0.39 GW per 8 hours.
Not quite at 1 GW / 8 hrs, but approaching that figure rapidly!
(I'm not sure where the coal plant comes in - really, those numbers should be derated relative to a coal plant, which can run 24/7)
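Here's the derating spelled out; both capacity factors are my ballpark assumptions, not official statistics:

```python
# Derate installed solar capacity by capacity factor to express it in
# "1 GW coal plant equivalents". Capacity factors are rough assumptions.

SOLAR_CF = 0.15   # typical utility solar capacity factor (assumption)
COAL_CF  = 0.55   # typical coal plant capacity factor (assumption)

installed_solar_gw = 277   # China's 2024 installs, from the numbers above
effective_gw = installed_solar_gw * SOLAR_CF

coal_equivalents = effective_gw / (1.0 * COAL_CF)   # 1 GW coal plants
hours_per_plant = 365 * 24 / coal_equivalents

print(f"~{effective_gw:.0f} GW of continuous-equivalent output")
print(f"~{coal_equivalents:.0f} one-GW coal-plant equivalents in 2024")
print(f"= one coal plant's worth every ~{hours_per_plant:.0f} hours")
```

On that like-for-like basis the 2024 rate comes out closer to one coal plant's worth every ~5 days than every 8 hours, though still an enormous and accelerating build-out.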
> (I'm not sure where the coal plant comes in - really, those numbers should be derated relative to a coal plant, which can run 24/7)
It works both ways: you have to derate the coal plant somewhat due to the transmission losses, whereas with a lot of solar power being generated and consumed on/in the same building the losses are practically nil.
Also, pricing for new solar-plus-battery is below the cost of building a new coal plant and still dropping; it's approaching the point where it's economical to demolish existing coal plants and replace them with solar.
China's breakneck development is difficult for many in the US to grasp (root causes: baselining on sluggish domestic growth, and a condescending view of China). This article offers a far more accurate picture of how China is doing right now: https://archive.is/wZes6
I don't remember many details about the situation in 2021. But China is in a period of technological explosion; many things are changing at incredible speed. In just a few years, China may have completely transformed across various fields.
Western media still carry strong biases against China's political system, and they have done far too little to portray the country's real situation. The narrative remains the same old one: "China succeeded because it's capitalist," or "China is doomed because it's communist."
But in reality, barely a few days go by without some new technological breakthrough or innovation happening in China. The pace of progress is so fast that even people inside the country don't always keep up with it. For example, just since the start of November, we've seen China's space station crew barbecuing in orbit, researchers in Hefei making new progress on an artificial sun, and a team discovering a safe and efficient method for preparing aromatic amines. Apart from the space station bit, which got some attention, the others barely made a ripple. Also, China's first aircraft carrier with an electromagnetic catapult has officially entered service.
About a year ago, I started using Reddit intensively. What I read most on Reddit are reports related to electricity, because the topic involves environmental protection, hatred of Trump, etc. There are too many leftists, so the discussions are somewhat biased, but the related news reports and nuclear data are real. China reached its carbon peak in 2025, and this year it has truly become a powerhouse in electricity. National data centers are continuously being built, yet residential electricity prices have never been and will never be affected.

China still has a lot of coal-fired power, but it keeps upgrading those plants technologically. At the same time, wind, solar, nuclear, and other sources are all advancing steadily. China is the only country that is not controlled by ideology and is increasing its electricity capacity in a scientific way.
(Maybe in the AI field people like to talk more. Not only did Kimi release a new model; XPeng also showed a new robot that drew some attention. These all happened within a few days.)
> China is the only country that is not controlled by ideology and is increasing its electricity capacity in a scientific way.
Have recently noticed a lot of pro-CCP propaganda on social media (especially Instagram and TikTok), but strangely also on HN; kind of interesting. To anyone making the (trivially false) claim that China is not controlled by ideology, I'm not quite sure how you'd convince them of the opposite. I'm not a doomer, but as China ramps up their aggression towards Taiwan (and the US will inevitably have to intervene), this will likely not end well in the next 5-10 years.
I also think that one claim is dubious, but do you really have to focus on only that part to the exclusion of everything else? All the progress made is real, regardless of your opinion on the existence of ideology.
I meant only on this specific topic: electricity. Arguing about other things is pointless, since HN has the same political leaning as Reddit, so I'll pass.
I don't have one now. I used to post lots of comments on China topics, but I got banned once, and every new account I registered was soon banned as well. I guess they banned all my IPs, so I only go anonymous now.
It's absolutely impressive to see China's development. I'm happy my country is slowly but surely moving to China's orbit of influence, especially economically.
"Not controlled by ideology" is a pretty bold statement to make about a self-declared Communist single-party country. There is always an ideology. You just happen to agree with whatever this one is (Controlled-market Communism? I don't know what the precise term is).
I cannot edit this now, so I want to add a clarification: I only meant on this specific topic, electricity. China doesn't act like the US or Germany, abandoning wind or nuclear; it acts purely based on the science.
Having larger models is nice because they have a much wider sphere of knowledge to draw on. Not in the sense of using them as encyclopedias. More in the sense that I want a model that is going to be able to cross reference from multiple domains that I might not have considered when trying to solve a problem.