There was no one lightbulb moment that led to the creation of the lightbulb. From ancient texts to the wizards of the electrical age, the story of electricity has been unfolding for thousands of years – and it’s not finished yet.
Electricity is fundamental to every facet of modern life. Even the Amish, known for their resistance to modern technology, use electricity to power their machinery and appliances. But despite the ubiquity of electricity in our lives, most of us never give a second thought to where it came from, or how different it must have been to live in a world without it.
Electricity emerged after years of theorising, research and development, rather than any single flash of inspiration – but there were some key milestones along the way.
The first sparks
The earliest recorded history of electricity has more to do with river currents than electric currents. Ancient Egyptian texts dating back to 2750 BCE describe the shocks administered to people by electric fish – the “Thunderers of the Nile”, hailed as the “protectors” of all other fish.
Electric fish were later written about by ancient Greek, Roman and Arabic scribes, who described the shocks delivered by electric catfish and electric rays, and even understood that these electric shocks could travel along conducting objects. Patients suffering from headaches were advised to touch electric fish in the hope that the resulting jolt would cure them; a primitive form of shock treatment that you’re unlikely to find recommended on WebMD today.
Greek mathematician and philosopher Thales of Miletus made a ground-breaking series of observations on friction and static electricity around 600 BCE. While not all of his theories were proven correct, later science would support Thales’ gut instinct that there was a link between electricity and magnetism.
But the modern study of electricity began in earnest in 1600, when English scientist William Gilbert wrote De Magnete. Gilbert, too, explored the link between electricity and magnetism, and distinguished naturally occurring magnetic minerals from the static electricity produced by rubbing amber.
Gilbert came up with the word ‘electricus’ – ‘of amber’ or ‘like amber’, from ‘elektron’, the Greek word for amber – to describe the ability of certain materials to bear a charge of static electricity. Gilbert’s word-smithery eventually led to ‘electric’ and ‘electricity’ making their first appearances in print in Pseudodoxia Epidemica, Thomas Browne’s book challenging the common errors and superstitions of the age, in 1646.
Shortly after this, in the early 1700s, English scientist Francis Hauksbee – Isaac Newton’s lab assistant – discovered that he could make a glass ball glow by placing a small amount of mercury in it, evacuating the air from it to create a mild vacuum, and rubbing the ball with his hands to build up a charge. The glow was bright enough to read by, and later became the basis for the gas-discharge lamp that led to neon lighting.
A key and a kite
Benjamin Franklin – the famed American politician, philosopher and Founding Father – was also a scientist and a prodigious inventor. Among other things, he came up with the Franklin stove, bifocal glasses and the flexible urinary catheter, none of which he ever patented, because he believed it was his duty to invent “freely and generously”. (Alas, he was also a slave owner, but later became a staunch abolitionist.)
Franklin’s most notable invention was the lightning rod, which he came to after years of extensive research into electricity, even going so far as to sell his possessions to fund his work. In 1752, Franklin – the first person to describe electric charges as ‘positive’ and ‘negative’ – conducted his most famous experiment, tying a key to a wet kite string during a thunderstorm to collect an electrical charge. Franklin moved his hand near the key and witnessed an electric spark, proving that lightning was electrical in nature.
Franklin’s discovery was followed by a slew of breakthroughs, as scientists and engineers attempted to harness the power of electricity and demystify its workings.
In 1791, Luigi Galvani published his discovery of bioelectricity, using his experiments with applying electric currents to dead frogs’ legs – making them twitch as if briefly reanimated – to demonstrate that electrical signals were what caused our own muscles to move. Galvani was encouraged to name this phenomenon ‘Galvanism’ by his friend Alessandro Volta.
In 1799, a game of professional one-upmanship between the two men led Volta to invent his own ‘voltaic pile’, which used alternating layers of zinc and copper to produce a steady electric current. Volta’s invention came to be recognised as the first electric battery, and the ‘volt’ was later named in his honour.
Throughout the early 1800s, English scientist Michael Faraday made significant advancements in the fields of electromagnetism and electrochemistry. In 1831, he demonstrated that an electric current could be induced by passing magnets through coils of copper wire, a discovery that formed the basis of how power plants would come to generate electricity on a much larger scale.
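Faraday’s insight is captured by what we now call Faraday’s law of induction: the voltage induced in a coil is proportional to the number of turns and to how quickly the magnetic flux through it changes. A minimal sketch of the averaged form – the numbers are purely illustrative, not anything Faraday measured:

```python
def induced_emf(turns, flux_change_wb, time_s):
    """Faraday's law of induction (average form): EMF = -N * dPhi/dt."""
    return -turns * flux_change_wb / time_s

# A magnet pushed through a 100-turn coil, changing the flux
# by 0.02 Wb over 0.1 s, induces an average EMF of -20 volts:
print(induced_emf(100, 0.02, 0.1))  # -> -20.0
```

The negative sign is Lenz’s law: the induced current opposes the change in flux that created it.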
In 1878, American engineer Charles Brush developed an arc lamp that could be powered by a generator. That same year, Englishman Joseph Swan had the first ‘lightbulb moment’ when he demonstrated the incandescent lightbulb – but it burned out almost immediately, making it an impractical replacement for oil and gas lamps.
Enter Thomas Edison. In 1878, Edison built his iconic laboratory in Menlo Park, New Jersey, and bought a number of patents related to electric lighting in the hope that they would set him on the path to a practical, long-lasting lightbulb.
The following year, he succeeded, producing an incandescent electric lightbulb that lasted about 40 hours before burning out. By 1880, his bulbs lasted up to 1,200 hours.
“When Edison… snatched up the spark of Prometheus in his little pear-shaped glass bulb,” German historian Emil Ludwig later wrote, “it meant that fire had been discovered for the second time, that mankind had been delivered again from the curse of night.”
The current war
In 1882, Edison opened the first central power plant, Pearl Street Station, in lower Manhattan. Pearl Street’s generators were connected to homes and businesses throughout the area, including the New York Times, through a network of buried copper wires, shifting electricity generation from a small-scale, on-site operation to an industrial-scale enterprise.
Pearl Street used the ‘direct current’ (DC) system pioneered by Edison, which was soon challenged by the ‘alternating current’ (AC) system – sparking a ‘war of the currents’ that would shape the future of electric power transmission.
In 1885, American physicist William Stanley Jr – George Westinghouse’s chief engineer – built the first practical AC transformer, the precursor to the modern transformer. In 1886, Stanley demonstrated the first complete system for high-voltage AC transmission, including generators, transformers and high-voltage transmission lines, and in 1888, Westinghouse purchased the patent rights to the first polyphase AC induction motor, developed by Edison’s former employee, Nikola Tesla.
Essentially, the main difference between AC and DC was (and still is) how electrons move inside the conducting material. In an AC system, the electrons periodically reverse direction, oscillating back and forth many times per second. In a DC system, the electrons flow steadily in a single direction.
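The two behaviours can be sketched as simple voltage functions of time – the amplitude, frequency and DC level below are illustrative placeholders, not grid specifications:

```python
import math

def ac_voltage(t, amplitude=325.0, frequency=50.0):
    """AC: a sine wave whose sign (and so the direction of current) flips every half-cycle."""
    return amplitude * math.sin(2 * math.pi * frequency * t)

def dc_voltage(t, level=12.0):
    """DC: a constant voltage, so current always flows the same way."""
    return level

# A quarter-cycle into a 50 Hz wave, the AC voltage is positive;
# three quarter-cycles in, it has reversed sign. DC never changes.
print(ac_voltage(0.005), ac_voltage(0.015), dc_voltage(0.005))
```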
AC couldn’t be stored, while DC could be stored in cells and batteries. On the other hand, AC was easier to convert to higher and lower voltages. It could be ‘stepped up’, allowing it to be transmitted far greater distances than DC with negligible loss of energy, and then ‘stepped down’ at the end of the line, making it safe for use. AC was therefore more economical to transmit over long distances than DC, and required fewer power stations to be built.
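The economics of ‘stepping up’ come down to a back-of-the-envelope calculation: resistive loss in a line is I²R, and for a fixed power delivered, current falls as voltage rises – so raising the voltage 100-fold cuts the loss 10,000-fold. The figures below are illustrative, not historical:

```python
def line_loss(power_w, voltage_v, resistance_ohm):
    """Resistive loss in a transmission line: P_loss = I^2 * R, where I = P / V."""
    current = power_w / voltage_v
    return current ** 2 * resistance_ohm

power = 10_000        # 10 kW delivered to customers
resistance = 0.5      # ohms of line resistance (illustrative)

low = line_loss(power, 240, resistance)      # transmitted at household voltage
high = line_loss(power, 24_000, resistance)  # 'stepped up' 100-fold first

print(f"Loss at 240 V:  {low:.0f} W")   # ~868 W wasted as heat
print(f"Loss at 24 kV:  {high:.2f} W")  # ~0.09 W
```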
The battle between Edison’s DC system and Westinghouse’s AC system raged throughout the late 1880s and early 1890s. Edison attempted to discredit AC by claiming that its higher voltages were hazardous to humans and animals, and pushed for legislation to limit AC installations and voltages.
Ultimately, the ‘current war’ came to a head at the 1893 Chicago World’s Fair, when Westinghouse won the bid to power the Fair with an AC system, beating out Edison General Electric’s DC system. That high-profile victory led to Westinghouse winning the bid later that year to build an AC power station at Niagara Falls. With Edison leaving the electric power business behind to concentrate on his inventions in other fields, General Electric dropped its opposition to AC.
AC continues to be the form in which electricity is delivered to businesses and residences around the world, including Australia. But there are now transformers capable of converting DC from low to high voltage; electric cars, which are becoming increasingly common, run on DC power; and solar panels produce DC electricity, a mismatch that currently requires solar inverters to connect them to the AC grid.
The grid-stability issues caused by integrating solar and wind energy could also be addressed by a transition to a DC system, which avoids the strict synchronisation requirements of an AC network – the need to match the speed and frequency of every generator to the grid.
After all these years, the story of electricity is still being written…