
> The power supply is WAY too oversized resulting in bad efficiency and thus a bigger power consumption

...but who gives a f? It's a desktop; there's no battery to drain. And of all the appliances in your home, the computers surely don't account for much of your power usage. Yeah, other things are probably wrong too, but I think you picked the wrong detail to care about here :) Better to have a bigger power supply and be sure it'll handle whatever you might add to the machine in the future than to worry about replacing it. And in my experience (and that of everyone else I've talked to), power-inefficient electronics tend to last way longer than efficient ones, so I prefer them (since the time I'd waste finding a replacement or a repair shop is worth more than the price of the extra power), even when it comes to fridges and washing machines.

(Commenting mostly because all this insane obsession with energy efficiency is getting on my nerves.)



I give a fuck :p And even if he overclocks his CPU and GPU, it's still oversized. It just doesn't make any sense. I think a good system is well balanced, with room for improvement. But we are talking about a 760W PSU! This system consumes around 300W to 350W under load, so the PSU is rated at roughly twice the needed load! A 600W PSU would also have done the job; this thing is just pure overkill.


It is better to have spare wattage than to run the PSU at full load. It's not like the PSU uses power by itself, and a larger one will last longer and be more future-proof...


He's saying he may upgrade to a GTX 1080 Ti - that's an extra 250W.

This is the first time I've heard that an oversized PSU leads to more power consumption - how come?


Somewhere in the power supply, power is converted via a switched inductance. The power transfer is controlled by the switching frequency, with roughly the same energy transferred for each cycle (constant I_peak). Losses are roughly f*I_peak^2. There are several factors limiting maximum f, so to get a bigger maximum power, bigger power supplies have a bigger I_peak. Thus, for the same transferred power, a bigger supply will have a bigger I_peak and a smaller f, but since I_peak appears squared, the loss is worse. (This is the rough explanation; the devil is in the details.)
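
As a rough numeric sketch of that argument (a toy flyback-stage model; the constants L, R, the 400V input, and the 300W load are made-up illustration values, and real PSU loss models are far more involved):

    # Toy model: each cycle moves E = 0.5*L*I_peak^2; the switching
    # frequency f is whatever delivers p_out; conduction loss is I(t)^2*R
    # over the ramp (mean square of a linear 0->I_peak ramp is I_peak^2/3).
    def stage_efficiency(p_out, i_peak, L=100e-6, R=2.0, v_in=400.0):
        e_cycle = 0.5 * L * i_peak ** 2   # energy per cycle (J)
        f = p_out / e_cycle               # required switching frequency (Hz)
        t_on = i_peak * L / v_in          # ramp time to reach I_peak (s)
        p_loss = f * (i_peak ** 2 / 3.0) * R * t_on
        return p_out / (p_out + p_loss)

    # Same 300 W load; the "bigger" supply is designed around a larger I_peak.
    for i_peak in (10.0, 20.0):
        print(f"I_peak = {i_peak:.0f} A -> "
              f"efficiency ~ {stage_efficiency(300.0, i_peak):.1%}")

Under this toy model the loss scales linearly with I_peak at fixed output power, which matches the parent's conclusion even though the exact exponents are only rough.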

Additionally, many power supplies are worse at load regulation if their load is too low.


I just looked at Corsair's efficiency graph, and it seems the best efficiency is at around 50% load on their PSUs.

Additionally, up to around 35% load the fan is off; passive cooling is enough.


From what I'm reading, the 1080 Ti and 280X both come in just under 250W, so the change shouldn't be significant.


Also, PSUs have an efficiency rating, i.e. only ~80% of the energy drawn from the wall is actually delivered to the components; the rest is dissipated as heat, like a toaster. There seem to be different certification tiers: 80 PLUS Gold, Platinum, etc.
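
For example (a back-of-the-envelope sketch, assuming a flat 80% efficiency, which real supplies don't actually have):

    # Wall draw vs. DC output at an assumed flat 80% efficiency.
    def wall_draw(dc_output_watts, efficiency=0.80):
        return dc_output_watts / efficiency

    print(wall_draw(300))  # 375.0 W from the wall, i.e. 75 W lost as heat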


You seem to assume that a big power supply somehow draws more power than a small power supply given the same workload. That is not the case.


Yeah, there's no efficiency curve for that power supply, but the fan is off under 30% load, and there's this: the 80 PLUS Platinum certification guarantees greater than 90%, 92%, and 89% efficiency at 20%, 50%, and 100% operating loads.

It's probably neither here nor there; shame there's no real curve though. Could be crapola at 10%, who knows.
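
To put rough numbers on those guarantees (a sketch; the 760W rating is the figure mentioned upthread, and efficiency between the three certified points is unknown without a real curve):

    # Wall draw at the three 80 PLUS Platinum checkpoints for an
    # assumed 760 W unit. Behaviour below 20% load is not certified.
    rating = 760.0
    for frac, eff in ((0.20, 0.90), (0.50, 0.92), (1.00, 0.89)):
        out = rating * frac
        print(f"{out:5.0f} W out -> {out / eff:.0f} W from the wall "
              f"({out / eff - out:.0f} W as heat)")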


All else being equal, that's usually true though: the 750W model of a PSU line will usually draw more at 75W output than the 650W model of the same line. But with 80 PLUS Gold and above, the difference will be negligible.


It is the case in most common usage scenarios. Power supplies tend to reach peak efficiency at around half of their rated maximum load. If your system spends most of its time below half the PSU's maximum, then a smaller PSU can be more efficient. If your PSU is oversized by something like 250W or more, you'll have trouble getting it to hit its peak efficiency even while gaming.
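
A quick sketch of that effect (the curve shape here is invented purely for illustration; real curves vary by model and are published by reviewers):

    # Assumed efficiency curve: poor at very light load, peaking near
    # 50% of rating, drooping slightly toward 100%. Purely illustrative.
    def efficiency(load_w, rating_w):
        frac = load_w / rating_w
        if frac < 0.05:
            return 0.60
        if frac <= 0.5:   # rises from (5%, 70%) to (50%, 92%)
            return 0.70 + (0.92 - 0.70) * (frac - 0.05) / 0.45
        return 0.92 - (0.92 - 0.89) * (frac - 0.5) / 0.5

    for rating in (450.0, 750.0):  # same 120 W desktop load on each
        eff = efficiency(120.0, rating)
        print(f"{rating:.0f} W PSU at 120 W: ~{eff:.0%}, "
              f"{120.0 / eff:.0f} W from the wall")

Under these made-up curves the 750W unit draws roughly 10W more at the wall for the same 120W load.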


> Power supplies tend to reach peak efficiency at around half of their rated maximum load.

You aren't the only one to post something like this. Where are y'all getting this idea from? Efficiency is very flat in switching power supplies once load gets above about 10% of rating. Plus, computer power supplies have multiple voltage outputs, so in theory one output could be running near full load while another is basically idle[1]. The power output rating is calculated by increasing load current until the voltage drops unacceptably, then calculating the power output at that point. A 750W power supply is just as efficient running at 250W as it is at 750W.

[1] Not that this happens much at all in practice, but it could.


> The power output rating is calculated by increasing load current until the voltage drops unacceptably, then calculating the power output at that point.

It really isn't. Sure, the engineers who design the power supply do that test, but that is not at all what determines the advertised rating of the final product. At 100% of the advertised load, most computer power supplies are still delivering nominal voltage or slightly above, not 5% below as allowed by the ATX spec.

> Efficiency is very flat in switching power supplies once load gets above about 10% of rating. [...] A 750W power supply is just as efficient running at 250W as it is at 750W.

Yes, on either side of the peak efficiency, you'll have points of equal efficiency. And in the middle, you'll have a few percentage points higher efficiency. But more importantly, at the ~30W a typical desktop will actually be drawing most of the time, a power supply with a smaller rating will be substantially more efficient.
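
For scale (a sketch; the 30W figure is from above, while the light-load efficiencies and the $0.12/kWh rate are assumptions):

    # Annual wall energy for a 30 W DC idle load at two assumed
    # light-load efficiencies.
    hours_per_year = 24 * 365
    for eff in (0.85, 0.70):
        wall_w = 30.0 / eff
        kwh = wall_w * hours_per_year / 1000.0
        print(f"eff {eff:.0%}: {wall_w:.1f} W at the wall, {kwh:.0f} kWh/yr, "
              f"~${kwh * 0.12:.0f}/yr at $0.12/kWh")

Even a 15-point efficiency gap at idle works out to roughly 65 kWh (about $8) per year in this example.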

Buying 750W and larger power supplies just doesn't make sense for single-CPU, single-GPU systems. Power supplies with lower ratings already have plenty of headroom, both built into their rating and in the difference between 500W and what a real desktop actually draws on real workloads. To the extent that having excess capacity helps longevity, a 550W or 650W model is already well past the point of diminishing returns, and going up to 750W is pure vanity. If you want reliability, shop for PSUs that use high-quality fans and capacitors; don't just stupidly add an extra 30% on top of what's already more PSU than you really need.


> It really isn't. Sure, the engineers who design the power supply do that test, but that is not at all what determines the advertised rating of the final product. At 100% of the advertised load, most computer power supplies are still delivering nominal voltage or slightly above, not 5% below as allowed by the ATX spec.

I assumed power supply manufacturers would want to slap the peak power number on their supplies for marketing purposes. If they are actually being conservative, then you are correct.

> Yes, on either side of the peak efficiency, you'll have points of equal efficiency. And in the middle, you'll have a few percentage points higher efficiency.

My point is that efficiency is a plateau, not a "peak". Once in the plateau, the fluctuations in efficiency from one load point to another are not significant. Below a certain minimum load, and past the peak-power "knee", it's a different story, but that plateau is very large.

> But more importantly, at the ~30W a typical desktop will actually be drawing most of the time, a power supply with a smaller rating will be substantially more efficient.

I never disputed this. I disputed the nonsense that power supplies have a meaningful "peak" efficiency, and that it is a function of its rating. 30W is probably not enough of a base load for efficient operation of larger computer power supplies, but once that point is reached, it no longer matters what the actual load is.

I'm also not suggesting it is a good idea to waste money on a larger supply than you really need.



