Question about console power consumption: screen resolution and cable output


#1

A while ago, a friend told me that the screen resolution and the type of output you’re using on a console affect its power consumption.

Here’s what he claimed:

- 480i consumes the least power, while 1080p draws more.

- Composite consumes the least, VGA is a step up the chain, component sits in the middle, DVI (which can only be used through converters) is second highest, and HDMI takes the most energy.

Now, I’m not the brightest when it comes to electrical knowledge, but I thought that was the stupidest claim ever made. However, I do find the claim about screen resolution a bit more plausible.

So what do you think, Tech Talk: bullshit or not bullshit?


#2

That is half true. Yes, there is a higher power draw as you go up that chain of cables, BUT the difference is so small that it is not relevant.
HDMI does not draw more power than DVI; HDMI is almost the same as DVI, since the HDMI protocol is built on the DVI standard. That’s why HDMI/DVI converters are so simple and cheap.

Yes, 1080p would draw more power than 480i on paper (in theory), in the same way your PC uses more power at a 1920 x 1080 screen resolution (1080p for an HD TV) than it does at 800 x 600. But again, the differences are so small that it shouldn’t matter. Any USB device connected to the console will draw much more power than you’d save by changing the video connection from HDMI to composite.
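To put rough numbers on it (a back-of-envelope sketch, not measurements: 55 mA is the minimum current an HDMI source must supply on its +5 V pin per the spec, 500 mA is the USB 2.0 per-port limit, and the 150 W total is just an assumed ballpark for a console):

```python
# Back-of-envelope comparison, not measured values.
# 55 mA  = minimum an HDMI source must supply on its +5 V pin (spec).
# 500 mA = USB 2.0 per-port current limit.
# 150 W  = assumed ballpark for total console draw.
VOLTS = 5.0

hdmi_w = VOLTS * 0.055   # ~0.28 W on the HDMI +5 V line
usb_w = VOLTS * 0.500    # ~2.5 W for one fully loaded USB 2.0 port
total_w = 150.0          # assumed whole-console draw

for name, watts in [("HDMI +5 V pin", hdmi_w), ("USB 2.0 port", usb_w)]:
    print(f"{name}: {watts:.2f} W ({100 * watts / total_w:.2f}% of total)")
```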

The differences are in the thousandths or millionths of an amp.
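As a quick sanity check on that scale (same assumed 5 V and 150 W figures as above), even a generous 5 mA gap between cable types only works out to milliwatts:

```python
# P = V * I: 5 mA at 5 V is 25 mW, a rounding error next to ~150 W.
delta_a = 0.005          # assume a generous 5 mA cable-to-cable gap
delta_w = 5.0 * delta_a  # 0.025 W
print(f"{delta_w * 1000:.0f} mW extra, i.e. {100 * delta_w / 150.0:.3f}% "
      f"of an assumed 150 W total draw")
```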