
20W, 200W, what's the difference? Either way, you can't get that out of the battery on a mobile device. And consumers don't generally buy PCs based on power dissipation. Sad, but true.

And if you think your typical beige-box PC can handle a power supply that is spec'd for 150W-- go ahead and put one of those in there. I DARE you.



The lower figure of 20W is pretty standard: go take a look at the battery in your laptop. Most Dell laptops ship with 65 Wh batteries - i.e. about 20W for 3 hours of battery life (to a first approximation; Li-ion is a bit more complicated than that).
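A quick back-of-the-envelope check of that figure, as a minimal Python sketch. The 65 Wh capacity and 3-hour runtime are the numbers above; the usable-capacity derating is an assumed illustration of why real Li-ion is messier:

    # Rough average-power estimate from battery capacity and runtime.
    # 65 Wh and 3 h are the figures from the comment above; the usable-capacity
    # fraction is an assumption for illustration.
    def avg_draw_watts(capacity_wh, runtime_h, usable_fraction=1.0):
        """Average power the system can draw and still last runtime_h hours."""
        return (capacity_wh * usable_fraction) / runtime_h

    print(avg_draw_watts(65, 3))        # ~21.7 W -- roughly the 20 W figure
    print(avg_draw_watts(65, 3, 0.9))   # ~19.5 W if only 90% of capacity is usable (assumed)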

The ubiquitous small form factor PCs like the Optiplex 780 (http://www.dell.com/downloads/global/corporate/environ/compl...) use a 235W (max) power supply, which will be over-specced, trading a little manufacturing cost for lower failure rates. Those machines actually draw less than 150W flat out. And they're everywhere. A certain large e-tailer with an emphasis on frugality used to use them as developer desktops(!).

Who knows what's in a typical consumer beige box, but it isn't pulling 500W continuously, unless they're playing, say, Skyrim 24/7 with a big graphics card - in which case of course one would specify the correct (safe) component for the design. I'd argue that they're not typical by that point; most people won't spend £300 on a graphics card (I do).


So to recap:

* You point to a 235W power supply as an example of the bare minimum PC power supply-- not too far from my 300W round number.

* You point out that a power supply rated for X isn't drawing X continuously-- a true statement, but it's responding to an argument nobody made. You have to pick a power supply which is rated for your max load (a rough sizing sketch follows below)-- everyone knows that, or should. It still doesn't change the fact that both max load and average load for x86 are orders of magnitude greater than for most ARM devices.
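To put rough numbers on "rated for your max load", here is a minimal Python sizing sketch. Every figure in it -- the per-component draws and the 30% headroom -- is an assumption for illustration, not something taken from this thread:

    # Illustrative PSU sizing: sum worst-case component draws, add headroom,
    # and buy a supply rated at or above that. All figures are assumed examples.
    components_max_w = {
        "cpu": 95,                  # desktop CPU flat out (assumed)
        "gpu": 250,                 # big discrete graphics card (assumed)
        "board_drives_fans": 60,    # everything else (assumed)
    }

    max_load_w = sum(components_max_w.values())     # 405 W worst case
    headroom = 1.3                                  # ~30% margin (assumption)

    print(f"worst-case load: {max_load_w} W")
    print(f"recommended PSU rating: {max_load_w * headroom:.0f} W")  # about 527 W

Average draw for the same box sitting at a desktop will be far below that worst case, which is the distinction the two comments above are circling.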



