A while ago, a friend told me that the screen resolution and the type of video output you use on a console affects its power consumption.
Here's what he claimed:
- 480i consumes the least power, while 1080p draws the most.
- Composite consumes the least, VGA is a step up, component is in the middle, DVI (which can only be used through converters) is second highest, and HDMI takes the most energy.
Now, I'm not the brightest when it comes to electrical knowledge, but the connector claim sounded like nonsense to me. The resolution claim, though, seems somewhat plausible.
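One way to sanity-check the resolution claim is to compare how many pixels per second the video hardware has to process at each resolution. This is only a rough sketch (active pixels only, ignoring blanking intervals, and pixel throughput is a proxy for workload, not a direct power measurement):

```python
def pixels_per_second(width, height, fps, interlaced=False):
    """Active pixels the display pipeline must push each second."""
    per_frame = width * height
    if interlaced:
        per_frame //= 2  # each field carries only half the lines
    return per_frame * fps

# 480i at 60 fields/s vs. 1080p at 60 frames/s
rate_480i = pixels_per_second(720, 480, 60, interlaced=True)  # 10,368,000 px/s
rate_1080p = pixels_per_second(1920, 1080, 60)                # 124,416,000 px/s

print(rate_1080p / rate_480i)  # 1080p pushes 12x the pixels of 480i
```

So 1080p60 really does mean roughly twelve times the pixel throughput of 480i, which is why higher resolutions plausibly draw somewhat more power. Whether that difference is measurable at the wall is another question.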
So what do you think, Tech Talk: bullshit or not bullshit?