You know those ubiquitous “wall warts” that clutter up your outlets and power strips? Do you know what they’re doing? They’re reducing the voltage and converting AC to DC.
Imagine electricity as a flow of water–it starts in a pond, gets pumped out into pipes, gets used by people on the way, and returns eventually to the pond. Direct current is like water flowing in one direction all the time. Alternating current is like water flowing back and forth, in two directions, very quickly—60 times a second, in fact. Power is delivered to your house or building as alternating current (AC).
But most of the devices we use in daily life–computers, printers, stereo systems, any kind of small motors, any battery operated device–are designed to run on direct current (DC). Hence the “wall wart,” which converts AC to DC.
The wall wart gets warm in the process, sometimes hot. In electrical devices, heat is generally a sign of inefficiency: power that could be doing useful work is instead lost as heat. Converting AC to DC wastes a meaningful amount of power. If you install solar panels on your house, the panels generate direct current (DC). You need to install a device (an “inverter”) that converts the direct current to alternating current. Then once the alternating current is flowing into your house, you need wall warts to convert it back to DC. This is one of the things that makes solar power highly inefficient: the power losses involved in converting back and forth.1
So why do we do it this way? The answer goes back to the origins of electrical power, and the famous fight between Edison’s General Electric Company and the Westinghouse Company and its main adviser, Nikola Tesla. Basically, Edison owned patents related to DC power and wanted to see DC adopted. Tesla and Westinghouse advocated for AC. Edison waged a war against AC, which included electrocuting an elephant with alternating current to show how deadly it could be.
The elephant in question, named Topsy, had killed three circus employees. Edison led the poor beast to a metal plate, applied AC current, and released the results as a grisly film. Edison figured he’d sell some films (he was right) and discredit Tesla and Westinghouse. Edison also helped electrocute people, via an electric chair designed specifically to discredit AC power. (DC could be just as deadly, a fact Edison neglected to stress.) In the end, Westinghouse and Tesla “won,” and AC generation became the standard.2
The reason is simple: AC can be sent economically over long distances, and (with the technology of the day) DC couldn’t. AC voltage is easy to step up and down with transformers, so you could have huge generating stations located hundreds of miles away and transmit power over high-voltage wires (high voltage means low current, and low current means little power lost as heat in the wires) to substations in cities and towns, which would step the voltage down to household levels and send it to individual houses and buildings. Low-voltage DC suffers badly from “voltage drop”: you can’t send it long distances, because the resistance of the wire steadily eats away the voltage as it travels. For DC to work, you would have to have many, many local generating stations, or each house would have to have its own generator.
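The arithmetic behind this is worth a quick sketch. The power lost as heat in a wire is I²R, and for a fixed amount of delivered power, raising the voltage lowers the current, so the loss falls off dramatically. The figures below (10 kW of power, a 5-ohm wire run, 240 V vs. 11,000 V) are illustrative assumptions, not real grid numbers:

```python
# Rough sketch of why high-voltage transmission matters.
# All numbers here are illustrative assumptions, not real grid data.

def line_loss_fraction(power_w, volts, line_resistance_ohms):
    """Fraction of the transmitted power lost as heat in the wire (I**2 * R)."""
    current = power_w / volts  # I = P / V
    loss_w = current ** 2 * line_resistance_ohms
    return loss_w / power_w

POWER = 10_000   # 10 kW to deliver (assumed figure)
LINE_R = 5       # ohms of resistance in a long wire run (assumed figure)

low = line_loss_fraction(POWER, 240, LINE_R)      # household-level voltage
high = line_loss_fraction(POWER, 11_000, LINE_R)  # a common distribution voltage

print(f"At 240 V:    {low:.0%} of the power becomes heat in the wire")
print(f"At 11,000 V: {high:.2%} of the power becomes heat in the wire")
```

With these assumed numbers, sending the power at household voltage burns most of it in the wire, while sending it at distribution voltage loses a tiny fraction of a percent. That gap is the whole case for transformers, and transformers only work with AC.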
Our modern power distribution and sales system, then, reflects the economic and political situation of the late nineteenth and early twentieth centuries: it favors centralization, monopoly control, and the large capital investments that only corporations can muster.
It works quite well, or it did until recently, when oil prices started to rise and global warming was identified as a problem. We’re now accustomed to—our entire economy depends on—methods of generation and distribution devised 100 years ago.
Putting solar panels on your roof seems like an eminently nice idea: no pollution, “free” power from the sun, no coal or oil being mined and burned. One of the problems with solar power is fitting it into the existing grid, and that problem grows partly from the need to convert the DC that solar panels generate into AC, so your wall warts and internal power supplies can convert it back into DC. If the power loss is as little as ten percent at each step, the two conversions together cost you nearly a fifth of your power (0.9 × 0.9 = 0.81). Using solar power involves some massive inefficiencies independent of the solar panels themselves, because of the nature of the grid system we have inherited.
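The compounding here is simple multiplication: each conversion stage keeps some fraction of the power, and the fractions multiply. A minimal sketch, assuming an illustrative 90% efficiency for each stage (the real figures vary by device):

```python
# Compound efficiency of a chain of power conversions.
# The 90%-per-stage figure is an assumption for illustration only.

def chain_efficiency(stage_efficiencies):
    """End-to-end efficiency of a conversion chain: the stages multiply."""
    result = 1.0
    for eff in stage_efficiencies:
        result *= eff
    return result

# Panel DC -> inverter -> household AC -> wall wart -> device DC
stages = [0.90, 0.90]  # inverter, then wall wart (assumed 90% each)
overall = chain_efficiency(stages)
print(f"End-to-end efficiency: {overall:.0%}")      # 81%
print(f"Lost to conversion:    {1 - overall:.0%}")  # 19%, nearly a fifth
```

Add more stages (battery charging, a second power supply inside the device) and the losses keep compounding; three 90% stages keep only about 73% of the original power.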
A good example is LED (light-emitting diode) lighting, which is rapidly growing in popularity. The incandescent bulbs Edison invented could run on AC or DC. But they are very inefficient, and one sign of this is how hot they get. An LED bulb, on the other hand, is extremely efficient: it generates a lot of light on very little current and produces very little heat in the process. LED bulbs last a long time, up to twenty-five years or more. A single solar panel could power LEDs far, far more effectively than it could power incandescents. But LEDs run only on DC, which means, again, that you need to convert the solar panel’s DC output to household AC, then back to DC. There’s a middleman in between who has to be appeased: that middleman is history, and George Westinghouse.
You can certainly imagine an alternative. Just as each house today has a furnace for generating heat, each house could have a generator for generating electricity. The generator could be powered by the same fuel that powers your gas furnace; your house could be supplied by a gas-fired generator plus a combination of wind and solar power and batteries; or it could be powered by wood you chop yourself. Apartment and office buildings could have their own generators. There’s no reason at all that electrical generation couldn’t be localized in exactly the same way as household heat generation.
This might not be better—there are other problems with DC not mentioned here, and it would take quite a lot of wind and solar to run a house the way we’re accustomed to. But it’s certainly imaginable. And if one of the problems of solar or wind power is inefficiency, eliminating the need to convert from DC to AC and back to DC would dramatically decrease local inefficiencies. Who would not like cutting energy consumption by nearly 20%? I’m not an electrical engineer, and I may have overstated some of the inefficiencies involved. But the argument makes increasing sense in a world of resource scarcity, high fuel prices, and global warming. History, in this case, is in the way.
History does no good at all in helping you avoid past mistakes or predict the future. Past mistakes only look superficially like present ones: most of the time, comparing the past to the present works only in a very general sense.
But history does explain why what looks like an inevitability, a fact of nature, something that’s common sense and “had to be,” wasn’t inevitable and didn’t have to be: that things might have turned out very differently. In that sense, understanding the history of power generation makes it easier to imagine alternatives.
- It’s hard (for me, not an electrical engineer) to find reliable data on how much power is lost in the conversion. The question is hard to answer partly because our entire consumer electrical system was designed around the idea of AC at the wall outlet. One of the other advantages of AC is that you can easily step the voltage up or down as needed, which means we have a lot of devices designed to run at very specific voltages, which is why there are so many kinds of wall warts. ↩
- The further you get from the children’s-lit version of Edison as an eccentric attic tinkerer, the less attractive he seems. Edison ran the turn-of-the-century equivalent of a commercial research lab, and he directed as much of his energy to controlling markets through the patent process as he did to inventing things. ↩