The cost of a microwave on standby

Much of it comes down to the difference between watts and watt-hours:

All electrical items, even something like a battery-powered wrist watch, consume power if they are connected to a power source. The power source has a voltage potential between its two terminals; in the UK, this is approximately 220 V for mains (I'll ignore the difference between AC and DC in this illustration).

If an electrical item is connected between the two power source terminals, it introduces a resistance between those terminals. The resistance limits the amount of current that can flow from the power source into the electrical item and back out again. The current, in conjunction with the voltage available from the power source, defines how much power the electrical item consumes. The circuitry inside the electrical item, often complex, needs a certain amount of current to make it work; if insufficient current is allowed into the electrical item, it won't work properly.
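For a purely resistive load (a simplification, since real appliances contain more complex circuitry), that relationship is just Ohm's law. A minimal sketch with illustrative figures, not taken from any real appliance:

```python
# Ohm's law for a purely resistive load: current = voltage / resistance.
# The resistance here is hypothetical, chosen so the numbers line up with
# the 220 V / 100 W illustration used later in this post.
supply_voltage_v = 220.0       # mains voltage used in the illustration
load_resistance_ohm = 484.0    # a made-up load resistance

current_a = supply_voltage_v / load_resistance_ohm   # ~0.45 A
power_w = supply_voltage_v * current_a               # ~100 W
print(f"{current_a:.3f} A drawn, {power_w:.0f} W consumed")
```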

So, a couple of examples: a battery-powered wrist watch requires very little current to operate, just a few millionths of an amp (the amp being the unit of current), while an electric car requires tens of amps. So a wrist watch needs a small power source and an electric car needs a large one.

Now, at any point in time we can measure the power required by our electrical device in watts. Put simplistically, it is the product of voltage and current. So my PC, needing 100 W from a 220 V supply, draws 0.454 amps of current. This current keeps the microprocessor running, keeps the memory intact, drives the graphics card and the sound card, and provides power to the various USB ports. The power taken by the PC, and by most electrical items, will vary depending upon what we are doing with them at any point in time. With my PC, if I am simply reading email, it isn't taxing the hardware and the power drain is quite low. If I decide to play video games or do something which stresses the PC hard, it takes more current, and thus consumes more power.
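As a quick sketch of that sum, using the 100 W and 220 V figures from the example above:

```python
# Power (watts) = voltage (volts) x current (amps), so current = power / voltage.
power_w = 100.0     # average PC draw from the example
voltage_v = 220.0   # mains voltage used in the illustration

current_a = power_w / voltage_v
print(f"Current drawn: {current_a:.3f} A")   # roughly 0.45 A
```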

When we pay for our electricity, the electricity companies charge us for accumulated usage. They measure it in kilowatt-hours, or kWh. The electricity meter samples the power each of our houses draws from the national grid at fixed intervals and accumulates the energy taken over time. So, in the case of my PC, consuming an average of 100 W, it would take ten hours to consume 1 kWh of energy (100 W average over ten hours). If my energy supplier was charging me, say, 25p per kWh, my PC would cost me 25p per ten hours. Over a year there are 876 blocks of ten hours, so my PC would cost me £219 to run.
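The same annual sum as a small sketch, using the example's 100 W average draw and the illustrative 25p per kWh tariff:

```python
# Energy (kWh) = power (kW) x time (hours); cost = energy x tariff.
power_kw = 0.1              # 100 W average PC draw from the example
tariff_gbp_per_kwh = 0.25   # the illustrative 25p per kWh
hours_per_year = 24 * 365   # 8,760 hours

energy_kwh = power_kw * hours_per_year          # 876 kWh
annual_cost = energy_kwh * tariff_gbp_per_kwh   # £219.00
print(f"{energy_kwh:.0f} kWh per year, costing £{annual_cost:.2f}")
```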

If I put my PC into standby all year, it would still cost me a bit to run, because some power is required to drive the basic system even though most of it is sleeping. When I made the measurement, I looked at the delta between it being on and being in sleep mode, and it's that delta that I used in the calculation.

My PC is a decent enough illustration as it's fairly average in terms of power consumption.

If I had a 1,000 W microwave running continuously for a year, it would be drawing 1,000 W the whole time. That's ten times as much as my PC, so the microwave would cost me £2,190 for the whole year! Obviously, nobody does that.
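The same sum again, for a hypothetical 1,000 W load left on all year at the illustrative 25p per kWh:

```python
# A constant 1 kW load, billed at the illustrative 25p per kWh.
energy_kwh = 1.0 * 24 * 365       # 8,760 kWh over the year
annual_cost = energy_kwh * 0.25   # £2,190
print(f"£{annual_cost:.2f}")
```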

The thing is, when the microwave is not heating up my lunch, the clock is running and some simple support circuitry is also monitoring the microwave for safety and sensing when I press some buttons. All of this housekeeping may only be taking a small fraction of an amp of current, so, because watts are the product of voltage and current, in standby mode it may be consuming one, two or half a dozen watts. If it were consuming 5 W, it could sit there for 200 hours before it consumed a single kWh and cost me 25p. (The supplier does charge for fractions of a kWh, in case you are wondering.)
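As a sketch, taking the 5 W standby figure above as an assumed example and the same 25p per kWh tariff:

```python
# How long a small standby load takes to use one kWh, and what it adds up to.
standby_w = 5.0   # assumed standby draw from the example above

hours_per_kwh = 1000.0 / standby_w                     # 200 hours per kWh
annual_cost = (standby_w / 1000.0) * 24 * 365 * 0.25   # ~£10.95 at 25p/kWh
print(f"{hours_per_kwh:.0f} hours per kWh, roughly £{annual_cost:.2f} a year")
```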

So, whether a device is mains powered or battery powered, the electronics industry has been put under massive pressure to minimise power usage. Lower power means that batteries need to be recharged (or replaced) less frequently, and that less power is consumed through our mains sockets.

If we unplug everything when we are not using it, it consumes nothing and we are not charged. It is, however, inconvenient (having to reset the microwave clock, for example). So life becomes a balance between the power we consume (and pay for) and convenience of use.

My large OLED TV, as an example, consumes an average of 132 W when I'm watching something on it. When I have turned it off with the remote control but not unplugged it, it consumes 0.5 W. I would need to leave it in standby for 2,000 hours before it cost me a single kWh of energy, but I'm not going to turn it off at the wall and have to wait for the damn thing to boot up each and every time I use it just to save that amount of energy. This is how modern electronics has been designed to reduce power consumption.
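The same comparison for the TV's quoted standby figure, again assuming the illustrative 25p per kWh:

```python
# Standby draw of the TV example: break-even time and rough annual cost.
standby_w = 0.5   # standby draw quoted above

hours_per_kwh = 1000.0 / standby_w                     # 2,000 hours per kWh
annual_cost = (standby_w / 1000.0) * 24 * 365 * 0.25   # ~£1.10 a year at 25p/kWh
print(f"{hours_per_kwh:.0f} hours per kWh, about £{annual_cost:.2f} a year in standby")
```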

(In case you are wondering - I spent most of my working life in the technology industry in electronic testing).
Ian, I've just read your post and it's extremely informative. Many thanks indeed for taking the trouble to do it, Steve.
 
Lots of EU stds are global standards.
Probably better to say that a number of standards are adopted, either in part or in full, by other standards bodies; e.g. BSI (UK), DIN (Germany), ANSI (USA), IEC (international), ISO (world), ITU (world) etc. all collaborate.

It's a horrible business. I once sat on a BSI standards body and nearly lost the will to live. Similarly, I was on an ANSI committee for the standardisation of one of the most popular commercial computer languages. Ghastly experience.
 