10 Electrifying Facts for Michael Faraday's Birthday


This world-changing genius was born into poverty on September 22, 1791. Fortunately for us, Michael Faraday refused to let his background stand in his way.



In Faraday's boyhood home, money was always tight. His father, James, was a sickly blacksmith who struggled to support a wife and four children in one of London's poorer outskirts. At age 13, young Faraday started helping the family make ends meet. Bookseller George Ribeau (sometimes spelled Riebau) took him on as an errand boy in 1804, with the teen's primary job being the delivery and recovery of loaned-out newspapers.

Shortly after Faraday's 14th birthday, Ribeau offered him a free apprenticeship. Over the next seven years, he mastered the trade of bookbinding. After hours, Faraday remained in Ribeau's store, hungrily reading many of the same volumes he'd bound together.

Like most lower-class boys, Faraday received very limited formal schooling. Between those bookshelves, however, he taught himself a great deal—especially about chemistry, physics, and a mysterious force called "electricity."


Wikimedia Commons // CC BY 4.0 

Sir Humphry Davy (above) left a huge mark on science. In the year 1808 alone, the man discovered no fewer than five elements, including calcium and boron. An excellent public speaker, Davy consistently drew huge crowds to his lectures at the Royal Institution. 

Twenty-year-old Faraday attended four of these presentations in 1812, having received tickets from a customer. As Davy spoke, Faraday jotted down detailed notes, which he then compiled and bound into a little book. Faraday sent his 300-page transcript to Davy. Duly impressed, the seasoned scientist eventually hired him as a lab assistant. Later in life, Davy was asked to name the greatest discovery he'd ever made. His answer: "Michael Faraday."

Tension would nevertheless erupt between mentor and protégé. As Faraday's accomplishments began to eclipse his own, Davy accused the younger man of plagiarizing another scientist's work (this rumor was swiftly discredited) and tried to block his admission to the Royal Society.


Ramblersen, Wikimedia

On September 3, 1821, Faraday built a device that ushered technology into the modern era. One year earlier, Danish physicist Hans Christian Ørsted had demonstrated that when an electric current flows through a wire, a magnetic field is created around it. Faraday capitalized on this revelation. Inside the Royal Institution basement, he began what was arguably his most groundbreaking experiment by placing a magnet in the bottom of a mercury-filled glass container. Dangling overhead was a wire, which Faraday connected to a battery. Once an electric current flowed through the wire, the wire began rotating around the magnet.

Faraday had just built the world's first electric motor. How could he possibly top himself? By building the world's first electric generator. His first experiment consisted of a simple ring of wire and cotton through which he passed a magnet. By doing so, he found that a current was generated. To this day, most electricity is made using the same principles.
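In modern notation, the principle Faraday uncovered is written as Faraday's law of induction, which relates the voltage induced in a loop of wire to the rate of change of the magnetic flux passing through it:

```latex
% Faraday's law of induction: the electromotive force (EMF)
% induced around a loop equals the negative rate of change
% of the magnetic flux through the loop.
\[
  \mathcal{E} = -\frac{d\Phi_B}{dt},
  \qquad
  \Phi_B = \int_S \mathbf{B} \cdot d\mathbf{A}
\]
% Moving a magnet through a coil changes the flux \Phi_B,
% which drives a current -- exactly what Faraday observed.
```

This is the same relationship exploited by nearly every power-plant generator today: something (steam, water, wind) spins a coil relative to a magnetic field, the flux through the coil changes, and a current flows.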



By today's standards, his early models would look shabby. Faraday made his balloons by pressing two sheets of rubber together, and he used them to contain hydrogen during his experiments. He created his first in 1824 and was quick to praise the bag's “considerable ascending power.” Toy manufacturers started distributing these the following year.



In 1823, Faraday sealed a sample of chlorine hydrate inside a V-shaped tube. As he heated one end and cooled the other simultaneously, the scientist noticed that a peculiar yellow liquid was starting to form. Curious, he broke open the tube. Without warning, a sudden, violent explosion sent glass shards flying everywhere. Mercifully uninjured, he smelled a strong scent of chlorine in the air.

It didn't take him long to figure out what had happened. Inside the tube, pressure had built up, liquefying the gas. Upon puncturing the glass, he'd released this pressure, and the liquid reverted to its gaseous state. This sudden evaporation came with an interesting side effect: it cooled the surrounding air. Quite unintentionally, Faraday had set the stage for the very first ice-making machines and refrigeration units.



Britain's industrialization came at a malodorous price. As London grew more crowded during the mid-1800s, garbage and fecal matter were dumped into the River Thames with increasing regularity. Naturally, the area didn't smell like a rose. In 1855, Faraday penned an oft-reproduced open letter about the problem, imploring the authorities to take action. “If we neglect this subject,” he wrote, “we cannot expect to do so with impunity; nor ought we to be surprised if, ere many years are over, a hot season give us sad proof of the folly of our carelessness.”

Just as Faraday predicted, a broiling summer forced Londoners of all stripes to hold their noses. Dubbed “the Great Stink,” the warmer months of 1858 sent the Thames' rancid odor wafting all over the city. Parliament hastily responded with a comprehensive sewage reform bill. Gradually, the putrid stench began to dissipate.


Alexander Blaikley, Wikimedia Commons // Public Domain

Faraday understood the importance of making science accessible to the public. In 1825, while employed by the Royal Institution, he spearheaded an annual series that's still going strong today. That holiday season, engineer John Millington delivered a set of layman-friendly lectures on “natural philosophy.” Every year thereafter (excluding 1939–1942 because of WWII), a prominent scientist has been invited to follow in his footsteps. Well-known Christmas lecturers include David Attenborough (1973), Carl Sagan (1977), and Richard Dawkins (1991). Faraday himself was the presenter on no fewer than 19 occasions.


GianniG46, Wikimedia Commons // Public Domain

Towards the end of his life, Faraday's lack of formal education finally caught up with him. An underprivileged childhood had rendered him mathematically illiterate, a severe handicap for a professional scientist. In 1846, he hypothesized that light itself is an electromagnetic phenomenon, but because Faraday couldn't support the notion with mathematics, it wasn't taken seriously. Salvation for him came in the form of a young physicist named James Clerk Maxwell. Familial wealth had enabled Maxwell to pursue math and—in 1864—he released equations that helped prove Faraday's hunch.



When Faraday was 48, his once-sharp memory started faltering. Stricken by an illness that rendered him unable to work for three years, he wrestled with vertigo, unsteadiness, and other symptoms. Following this "extended vacation," he returned to the Royal Institution, where he experimented away until his early 70s.

However, Faraday was still prone to inexplicable spurts of sudden giddiness, depression, and extreme forgetfulness. “[My] bad memory,” he wrote, “both loses recent things and sometimes suggests old ones as new.” Nobody knows what caused this affliction, though some blame it on overexposure to mercury.


Triggerhippie4, Wikimedia Commons // Public Domain

Fittingly, the father of modern physics regarded Faraday as a personal hero. Once, upon receiving a book about him, Einstein remarked, “This man loved mysterious Nature as a lover loves his distant beloved.”

Dean Mouhtaropoulos/Getty Images
Essential Science
What Is a Scientific Theory?

In casual conversation, people often use the word theory to mean "hunch" or "guess": If you see the same man riding the northbound bus every morning, you might theorize that he has a job in the north end of the city; if you forget to put the bread in the breadbox and discover chunks have been taken out of it the next morning, you might theorize that you have mice in your kitchen.

In science, a theory is a stronger assertion. Typically, it's a claim about the relationship between various facts; a way of providing a concise explanation for what's been observed. The American Museum of Natural History puts it this way: "A theory is a well-substantiated explanation of an aspect of the natural world that can incorporate laws, hypotheses and facts."

For example, Newton's theory of gravity—also known as his law of universal gravitation—says that every object, anywhere in the universe, responds to the force of gravity in the same way. Observational data from the Moon's motion around the Earth, the motion of Jupiter's moons around Jupiter, and the downward fall of a dropped hammer are all consistent with Newton's theory. So Newton's theory provides a concise way of summarizing what we know about the motion of these objects—indeed, of any object responding to the force of gravity.

A scientific theory "organizes experience," James Robert Brown, a philosopher of science at the University of Toronto, tells Mental Floss. "It puts it into some kind of systematic form."


A theory's ability to account for already known facts lays a solid foundation for its acceptance. Let's take a closer look at Newton's theory of gravity as an example.

In the late 17th century, the planets were known to move in elliptical orbits around the Sun, but no one had a clear idea of why the orbits had to be shaped like ellipses. Similarly, the movement of falling objects had been well understood since the work of Galileo a half-century earlier; the Italian scientist had worked out a mathematical formula that describes how the speed of a falling object increases over time. Newton's great breakthrough was to tie all of this together. According to legend, his moment of insight came as he gazed upon a falling apple in his native Lincolnshire.

In Newton's theory, every object is attracted to every other object with a force that's proportional to the masses of the objects, but inversely proportional to the square of the distance between them. This is known as an “inverse square” law. For example, if the distance between the Sun and the Earth were doubled, the gravitational attraction between them would be cut to one-quarter of its current strength. Newton, using his theory and a bit of calculus, was able to show that an inverse-square attraction between the Sun and a planet requires an elliptical orbit.
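The inverse-square scaling is easy to verify numerically. Here is a minimal sketch (the function name and the unit masses are illustrative, not from any particular library):

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2.
# Doubling the distance r should cut the force to one-quarter.
G = 6.674e-11  # gravitational constant, in N·m²/kg²

def gravity(m1, m2, r):
    """Magnitude of the gravitational force between two point masses."""
    return G * m1 * m2 / r**2

f_near = gravity(1.0, 1.0, 1.0)  # force at distance r
f_far = gravity(1.0, 1.0, 2.0)   # force at distance 2r
print(f_far / f_near)  # → 0.25, one-quarter of the original force
```

The same two-line function, fed the masses of the Sun and a planet and their separation, is the entire force law behind every orbit in the solar system.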

Newton's theory is powerful because it explains so much: the falling apple, the motion of the Moon around the Earth, and the motion of all of the planets—and even comets—around the Sun. All of it now made sense.


A theory gains even more support if it predicts new, observable phenomena. The English astronomer Edmond Halley used Newton's theory of gravity to calculate the orbit of the comet that now bears his name. Taking into account the gravitational pull of the Sun, Jupiter, and Saturn, in 1705, he predicted that the comet, which had last been seen in 1682, would return in 1758. Sure enough, it did, reappearing in December of that year. (Unfortunately, Halley didn't live to see it; he died in 1742.) The predicted return of Halley's Comet, Brown says, was "a spectacular triumph" of Newton's theory.

In the early 20th century, Newton's theory of gravity would itself be superseded—as physicists put it—by Einstein's, known as general relativity. (Where Newton envisioned gravity as a force acting between objects, Einstein described gravity as the result of a curving or warping of space itself.) General relativity was able to explain certain phenomena that Newton's theory couldn't account for, such as an anomaly in the orbit of Mercury, which slowly rotates—the technical term for this is "precession"—so that while each loop the planet takes around the Sun is an ellipse, over the years Mercury traces out a spiral path similar to one you may have made as a kid on a Spirograph.

Significantly, Einstein’s theory also made predictions that differed from Newton's. One was the idea that gravity can bend starlight, which was spectacularly confirmed during a solar eclipse in 1919 (and made Einstein an overnight celebrity). Nearly 100 years later, in 2016, the discovery of gravitational waves confirmed yet another prediction. In the century between, at least eight predictions of Einstein's theory have been confirmed.


And yet physicists believe that Einstein's theory will one day give way to a new, more complete theory. It already seems to conflict with quantum mechanics, the theory that provides our best description of the subatomic world. The way the two theories describe the world is very different. General relativity describes the universe as containing particles with definite positions and speeds, moving about in response to gravitational fields that permeate all of space. Quantum mechanics, in contrast, yields only the probability that each particle will be found in some particular location at some particular time.

What would a "unified theory of physics"—one that combines quantum mechanics and Einstein's theory of gravity—look like? Presumably it would combine the explanatory power of both theories, allowing scientists to make sense of both the very large and the very small in the universe.


Let's shift from physics to biology for a moment. It is precisely because of its vast explanatory power that biologists hold Darwin's theory of evolution—which allows scientists to make sense of data from genetics, physiology, biochemistry, paleontology, biogeography, and many other fields—in such high esteem. As the biologist Theodosius Dobzhansky put it in an influential essay in 1973, "Nothing in biology makes sense except in the light of evolution."

Interestingly, the word evolution can be used to refer to both a theory and a fact—something Darwin himself realized. "Darwin, when he was talking about evolution, distinguished between the fact of evolution and the theory of evolution," Brown says. "The fact of evolution was that species had, in fact, evolved [i.e. changed over time]—and he had all sorts of evidence for this. The theory of evolution is an attempt to explain this evolutionary process." The explanation that Darwin eventually came up with was the idea of natural selection—roughly, the idea that an organism's offspring will vary, and that those offspring with more favorable traits will be more likely to survive, thus passing those traits on to the next generation.


Many theories are rock-solid: Scientists have just as much confidence in the theories of relativity, quantum mechanics, evolution, plate tectonics, and thermodynamics as they do in the statement that the Earth revolves around the Sun.

Other theories, closer to the cutting edge of current research, are more tentative, like string theory (the idea that everything in the universe is made up of tiny, vibrating strings or loops of pure energy) or the various multiverse theories (the idea that our entire universe is just one of many). String theory and multiverse theories remain controversial because of the lack of direct experimental evidence for them, and some critics claim that multiverse theories aren't even testable in principle. They argue that there's no conceivable experiment that one could perform that would reveal the existence of these other universes.

Sometimes more than one theory is put forward to explain observations of natural phenomena; these theories might be said to "compete," with scientists judging which one provides the best explanation for the observations.

"That's how it should ideally work," Brown says. "You put forward your theory, I put forward my theory; we accumulate a lot of evidence. Eventually, one of our theories might prove to obviously be better than the other, over some period of time. At that point, the losing theory sort of falls away. And the winning theory will probably fight battles in the future."

This Just In
Yes, Parents Do Play Favorites—And Often Love Their Youngest Kid Best

If you have brothers or sisters, there was probably a point in your youth when you spent significant time bickering over—or at least privately obsessing over—whom Mom and Dad loved best. Was it the oldest sibling? The baby of the family? The seemingly forgotten middle kid?

As much as we'd like to believe that parents love all of their children equally, some parents do, apparently, love their youngest best, according to The Independent. A recent survey from the parenting website Mumsnet and its sister site, the grandparent-focused Gransnet, found that favoritism affects both parents and grandparents.

Out of 1185 parents and 1111 grandparents, 23 percent of parents and 42 percent of grandparents admitted to having a favorite among their children or grandchildren. For parents, that tended to be the youngest—56 percent of those parents with a favorite said they preferred the baby of the family. Almost 40 percent of the grandparents with a favorite, meanwhile, preferred the oldest. Despite these numbers, half of the respondents thought having a favorite among their children and grandchildren is "awful," and the majority think it's damaging for that child's siblings.

Now, this isn't to say that youngest children experience blatant favoritism across all families. This wasn't a scientific study, and with only a few thousand users, the number of people with favorites is actually not as high as it might seem—23 percent is only around 272 parents, for instance. But other studies with a bit more scientific rigor have indicated that parents do usually have favorites among their children. In one study, 70 percent of fathers and 74 percent of mothers admitted to showing favoritism in their parenting. "Parents need to know that favoritism is normal," psychologist Ellen Weber Libby, who specializes in family dynamics, told The Wall Street Journal in 2017.
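The percentages above translate into fairly small absolute numbers, which a quick back-of-the-envelope calculation makes concrete (the percentages are from the survey; the rounding here is ours):

```python
# Converting the Mumsnet/Gransnet survey percentages into head counts.
parents = 1185
grandparents = 1111

parents_with_favorite = parents * 0.23            # ≈ 272.5, i.e. "around 272 parents"
grandparents_with_favorite = grandparents * 0.42  # ≈ 466.6 grandparents
prefer_youngest = parents_with_favorite * 0.56    # ≈ 152.6 parents favoring the baby

print(round(parents_with_favorite), round(grandparents_with_favorite))
```

So the headline claim that parents "love their youngest kid best" ultimately rests on roughly 150 self-selected website users.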

But youngest kids don't always feel the most loved. A 2005 study found that oldest children tended to feel like the preferred ones, while youngest children felt their parents were biased toward their older siblings. Another study, released in 2017, found that when youngest kids did feel there was preferential treatment in their family, their relationships with their parents were affected more strongly than their older siblings' were, either for better (if they sensed they were the favorite) or for worse (if they sensed their siblings were). Feeling like the favorite or the lesser sibling didn't tend to affect older siblings' relationships with their parents.

However, the author of that study, Brigham Young University professor Alex Jensen, noted in a press release at the time that whether or not favoritism affects children tends to depend on how that favoritism is shown. "When parents are more loving and they're more supportive and consistent with all of the kids, the favoritism tends to not matter as much," he said, advising that “you need to treat them fairly, but not equally.” Sadly for those who don't feel like the golden child, a different study in 2016 suggests that there's not much you can do about it—mothers, at least, rarely change which child they favor most, even over the course of a lifetime.

[h/t The Independent]

