Either you’re not going to get a choice, or you’ll need a 240v outlet installed (split-phase in US homes, not three phase) and possibly an expanded electrical panel.
Nvidia has been all about “as fast as possible, no compromise” for a very long time now. So was Intel, until the Pentium 4 forced a big reset.
At a certain point this stuff is just totally untenable. Throwing more amps at it stops being workable.
Apple seems to be able to get a decent fraction of the performance using drastically less power. I have the impression Intel is doing the same, though I’m less sure on that one.
Given how parallel graphics problems are, maybe it’s time to give up on 100 cores that go uber fast and behave like a space heater, and move to 250 cores that go quite fast but use 1/4 of the power each.
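A quick sanity check on that trade-off. Assuming (this is my assumption, not anything established in the thread) that dynamic power scales roughly with f·V² and voltage scales roughly linearly with frequency, per-core power goes as ~f³, and the numbers come out surprisingly well:

```python
# Back-of-envelope check of the "more, slower cores" idea.
# Assumption: per-core dynamic power ~ f^3 (P ~ f * V^2, V ~ f).
# Real silicon has leakage and voltage floors, so treat this as a sketch.

fast_cores = 100
slow_cores = 250
power_ratio = 1 / 4                  # each slow core uses 1/4 the power

# Frequency each slow core can sustain, relative to a fast core,
# under the cubic power assumption:
freq_ratio = power_ratio ** (1 / 3)  # ~0.63

relative_power = slow_cores * power_ratio / fast_cores       # 0.625
relative_throughput = slow_cores * freq_ratio / fast_cores   # ~1.57

print(f"relative total power: {relative_power:.3f}")
print(f"relative throughput:  {relative_throughput:.3f}")
```

Under those assumptions the 250-core part delivers roughly 1.5x the aggregate throughput of the 100-core part while drawing about 37% less total power, which is why wide-and-slow keeps coming up for embarrassingly parallel workloads.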
If GPUs requiring 2kW PSUs do become reality, at least us Europeans would have something new to flaunt over the US, along with our fast boil electric kettles.
Here in the States, I've already banished non-Metric "Freedom units" from my workshop.
I can do the same with 120v.
(And indeed: Back in the mining days, my somewhat diminutive rig was running from a dedicated 240v circuit. It made my already-efficient power supplies a tiny bit more efficient, and it saved a bit on copper.
I've already got plans for 240v outlets in the kitchen.)
Plenty of people are working hard to tackle the performance-per-watt problem while others simultaneously tackle the absolute performance problem. There's no single metric you can focus on exclusively that's going to satisfy everyone's use cases. Obviously not everyone needs a high-end enthusiast product like the 5090, and plenty of people are going to seek out a middle-of-the-road product that focuses on performance-per-watt instead, which will be accommodated by the lower-tier products in the lineup as well as their competitors' products.
> Given how parallel graphics problems are, maybe it’s time to give up on 100 cores that go uber fast and behave like a space heater, and move to 250 cores that go quite fast but use 1/4 of the power each.
It wasn't too long ago that people were saying the same things about single-core and dual-core NetBurst CPUs and how we need to start considering dual-core and quad-core for consumer CPUs. And now we have consumer CPUs with 64 cores and more, so aren't we moving in the direction you want already? Performance-per-watt is improving, parallelism is improving, and also absolute performance is improving too in addition to those other metrics.
Don't be ridiculous. Nobody is going to stop anything. If the only way to scale up is to build more and more power-hungry hardware, that's where the market will go. Even if some GPU of the future consumes 10kW, it's still just $1.50 per hour. Which is a lot cheaper than most other forms of entertainment.
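That figure checks out if you assume a US-average residential rate of around $0.15/kWh (the rate is my assumption; actual prices vary widely by region):

```python
# Cost of running a hypothetical 10 kW GPU for one hour,
# assuming ~$0.15/kWh (US residential ballpark; varies by region).
power_kw = 10
rate_per_kwh = 0.15
cost_per_hour = power_kw * rate_per_kwh
print(f"${cost_per_hour:.2f}/hour")  # $1.50/hour
```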
Consumers already have the choice of not buying the most powerful card.
Increasing die size to run cores in a more power efficient regime is not going to work, because a) the chips are already as big as can be made, and b) competition will still push companies to run the 250 cores uber fast and figure out some way to push enough power to it.
As long as there is customer demand for this, these things will get built. Given the amount of bad press these melted connectors create, possibly with better engineered power delivery systems.
> I have the impression Intel is doing the same, though I’m less sure on that one.
That may be true on the low-power end of their lineup, but certainly not for the high end i9 chips. Those will increase the power budget for minuscule performance gains.
The National Electrical Code says not to draw more than 80% of a circuit's maximum load continuously. So assuming you plan to game for a reasonable amount of time, a standard 15 amp circuit limits us to 12 amps.
That’s 1440 watts. And power supplies for PCs these days seem to be 80-95% efficient for good ones. Let’s say you’ve got a 90%er.
That’s 1300 watts. A top end CPU is 120 watts. Looks like 70-130 watts for a motherboard. Call it 100.
We’re at 1080 watts. Four sticks of DDR5 is another 60 or so. Let’s add 20 for two M.2 drives.
Wait, didn’t we have a 600 watt GPU? Down to 400 left. Good thing dual GPU gaming is dead, we can’t afford a second in our killer gamer system. Let’s add 20 watts for fans so everything doesn’t melt.
So 380 watts left out of 1440. That’s roughly 25%. But no spinning hard drives or SATA SSDs, no USB PD, no PCIe cards at all but our one graphics card.
Wait we need a monitor on the same circuit. Looks like that’s 50 watts for a high end LG, 330 now. You did only want one monitor right?
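Tallying the budget above in one place (same assumptions as the walkthrough: 15 A / 120 V circuit, the NEC 80% continuous-load rule, and a 90%-efficient PSU, with the ~1296 W PSU output rounded to 1300 in the prose):

```python
# Power budget for a single high-end gaming PC on one US circuit.
circuit_w = 120 * 15 * 0.80   # 1440 W continuous at the wall (80% rule)
psu_out_w = circuit_w * 0.90  # ~1296 W deliverable to components

loads_w = {
    "CPU": 120,
    "motherboard": 100,
    "GPU": 600,
    "RAM (4x DDR5)": 60,
    "M.2 drives": 20,
    "fans": 20,
}
remaining = psu_out_w - sum(loads_w.values())
print(f"{remaining:.0f} W of headroom")  # 376 W (the prose rounds to ~380)
```

The 50 W monitor plugs into the wall rather than the PSU, so it comes out of the 1440 W circuit budget directly, but either way the headroom left for a second GPU, more drives, or another monitor is thin.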
Is it really that hard to imagine hitting that limit soon?
Idk but if computers are drawing enough power to melt themselves just to play ray traced Minecraft or write boilerplate Python code, perhaps we've gone wrong somewhere.
Because we only need hours to write such things, rather than months or years. The market wants rapid development, not necessarily developers. I usually just see the old-heads implying younger devs can't hack it.
Add in a CPU and such and we’re quickly approaching the maximum power that can be drawn continuously from a standard 15 amp circuit (12 amps under the 80% rule).