How Many Watts Does a Computer Use?
If you have ever wondered how much energy your computer uses, you may be surprised. While modern computers and monitors are more efficient than they were in the past, using the power-saving functions they come with, such as sleep mode, could save you $100 or more each year, depending on your computer and how often you use it. Your choice of peripherals also plays a big part in your annual energy bill. Newer technology and energy-saving habits will cut those costs.
A desktop computer uses between 60 and 250 watts, depending on whether it is idle or under load. A laptop consumes 15 to 45 watts. A 22-inch LCD monitor uses an average of 25 watts, while an older 17-inch cathode ray tube (CRT) monitor uses about 80 watts. When a computer is in sleep mode, it uses only about 20 percent of the power it draws when awake.
Calculating Annual Wattage
To determine your computer's annual energy use, multiply its wattage and that of the monitor by the number of hours you typically use it each day, then by 365 days. For example, a computer and monitor that draw 270 watts combined, used four hours each day, consume 270 × 4 × 365 = 394,200 watt-hours, or about 394 kilowatt-hours each year.
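The arithmetic above can be sketched as a short function. The 270-watt draw and four hours of daily use are the article's example figures, not measurements from a real machine.

```python
def annual_kwh(watts: float, hours_per_day: float, days: int = 365) -> float:
    """Convert a device's power draw into kilowatt-hours per year."""
    # watt-hours per year, divided by 1,000 to reach kilowatt-hours
    return watts * hours_per_day * days / 1000

# A computer and monitor drawing 270 watts, four hours a day:
print(round(annual_kwh(270, 4), 1))  # 394.2 kWh per year
```

Swap in your own wattage and usage hours to estimate a different setup.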
Calculating Annual Power Cost
Calculating your energy costs requires just three numbers and a simple formula. Add the watts consumed by your computer and monitor, multiply by the hours used each day and the days used each year, and divide by 1,000 to get kilowatt-hours. Then multiply the kilowatt-hours by the electricity cost per kilowatt-hour (found on your utility bill). Consuming 270 watts for four hours each day, at 8.5 cents per kilowatt-hour, works out to an annual energy bill of about $33.51.
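The cost formula can be sketched the same way. The 8.5-cents-per-kilowatt-hour rate is the article's example figure; check your own utility bill for your actual rate.

```python
def annual_cost(watts: float, hours_per_day: float, cents_per_kwh: float,
                days: int = 365) -> float:
    """Annual electricity cost in dollars for a device's typical use."""
    kwh = watts * hours_per_day * days / 1000  # divide by 1,000 for kWh
    return kwh * cents_per_kwh / 100           # convert cents to dollars

# 270 watts, four hours a day, at 8.5 cents per kilowatt-hour:
print(f"${annual_cost(270, 4, 8.5):.2f}")  # $33.51
```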
Energy Star Guidelines
To receive the EPA's Energy Star rating, desktop computers must use two watts or less in standby mode and four watts or less in sleep mode. Laptops must use one watt or less in standby and 1.7 watts or less in sleep mode. On-mode Energy Star ratings are based on a formula that scales the allowed wattage with the monitor's resolution in megapixels.