10 Paradoxes That Will Boggle Your Mind


A paradox is a statement or problem that either appears to produce two entirely contradictory (yet possible) outcomes, or provides proof for something that goes against what we intuitively expect. Paradoxes have been a central part of philosophical thinking for centuries, and are always ready to challenge our interpretation of otherwise simple situations, turning what we might think to be true on its head and presenting us with provably plausible situations that are in fact just as provably impossible. Confused? You should be.


The Paradox of Achilles and the Tortoise is one of a number of theoretical discussions of movement put forward by the Greek philosopher Zeno of Elea in the 5th century BC. It begins with the great hero Achilles challenging a tortoise to a footrace. To keep things fair, he agrees to give the tortoise a head start of, say, 500m. When the race begins, Achilles unsurprisingly starts running at a speed much faster than the tortoise, so that by the time he has reached the 500m mark, the tortoise has walked only a further 50m. But by the time Achilles has reached the 550m mark, the tortoise has walked another 5m. And by the time he has reached the 555m mark, the tortoise has walked another 0.5m, then 0.05m, then 0.005m, and so on. This process continues again and again over an infinite series of smaller and smaller distances, with the tortoise always moving forwards while Achilles always plays catch up.

Logically, this seems to prove that Achilles can never overtake the tortoise—whenever he reaches somewhere the tortoise has been, he will always have some distance still left to go no matter how small it might be. Except, of course, we know intuitively that he can overtake the tortoise. The trick here is not to think of Zeno’s Achilles Paradox in terms of distances and races, but rather as an example of how any finite value can always be divided an infinite number of times, no matter how small its divisions might become.
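The finite answer hiding inside the infinite series can be made concrete by summing it. A minimal sketch, assuming (per the distances above) that Achilles runs ten times faster than the tortoise:

```python
def total_catchup_distance(head_start, speed_ratio):
    """Closed-form sum of the infinite series of gaps Achilles must close.

    speed_ratio is tortoise speed / Achilles speed (must be < 1).
    The gaps 500, 50, 5, ... form a geometric series summing to
    head_start / (1 - speed_ratio).
    """
    return head_start / (1 - speed_ratio)

def partial_sum(head_start, speed_ratio, terms):
    """Add the first few gaps explicitly: 500 + 50 + 5 + ..."""
    return sum(head_start * speed_ratio**k for k in range(terms))

# Achilles overtakes at the 555.55...m mark: a finite point on the track,
# even though it is approached through infinitely many stages.
catch_point = total_catchup_distance(500, 0.1)
```

However many gap-closing stages you add up, the total never exceeds this finite catch-up point, which is why the infinite subdivision poses no obstacle to overtaking.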


The Bootstrap Paradox is a paradox of time travel that questions how something that is taken from the future and placed in the past could ever come into being in the first place. It’s a common trope used by science fiction writers and has inspired plotlines in everything from Doctor Who to the Bill and Ted movies, but one of the most memorable and straightforward examples—by Professor David Toomey of the University of Massachusetts and used in his book The New Time Travellers—involves an author and his manuscript.

Imagine that a time traveller buys a copy of Hamlet from a bookstore, travels back in time to Elizabethan London, and hands the book to Shakespeare, who then copies it out and claims it as his own work. Over the centuries that follow, Hamlet is reprinted and reproduced countless times until finally a copy of it ends up back in the same original bookstore, where the time traveller finds it, buys it, and takes it back to Shakespeare. Who, then, wrote Hamlet?


Imagine that a family has two children, one of whom we know to be a boy. What then is the probability that the other child is a boy? The obvious answer is to say that the probability is 1/2—after all, the other child can only be either a boy or a girl, and the chances of a baby being born a boy or a girl are (essentially) equal. In a two-child family, however, there are actually four possible combinations of children: two boys (MM), two girls (FF), an older boy and a younger girl (MF), and an older girl and a younger boy (FM). We already know that one of the children is a boy, meaning we can eliminate the combination FF, but that leaves us with three equally possible combinations of children in which at least one is a boy—namely MM, MF, and FM. This means that the probability that the other child is a boy—MM—must be 1/3, not 1/2.
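The counting argument above can be checked by brute enumeration. A small sketch, assuming (as the article does) that all four birth orders are equally likely:

```python
from itertools import product

# Enumerate the four equally likely two-child families: MM, MF, FM, FF
# (first letter = older child, second = younger).
families = list(product("MF", repeat=2))

# Condition on "at least one child is a boy", which eliminates only FF.
at_least_one_boy = [f for f in families if "M" in f]

# Of the three remaining families (MM, MF, FM), only MM has a boy
# as the other child.
both_boys = [f for f in at_least_one_boy if f == ("M", "M")]

probability = len(both_boys) / len(at_least_one_boy)
```

The conditioning step is the whole trick: eliminating FF leaves three equally likely families, not two, so the answer is 1/3 rather than 1/2.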


Imagine you’re holding a postcard in your hand, on one side of which is written, “The statement on the other side of this card is true.” We’ll call that Statement A. Turn the card over, and the opposite side reads, “The statement on the other side of this card is false” (Statement B). Trying to assign any truth to either Statement A or B, however, leads to a paradox: if A is true then B must be as well, but for B to be true, A has to be false. Conversely, if A is false then B must be false too, which must ultimately make A true.

Invented by the British logician Philip Jourdain in the early 1900s, the Card Paradox is a simple variation of what is known as a “liar paradox,” in which assigning truth values to statements that purport to be either true or false produces a contradiction. An even more complicated variation of a liar paradox is the next entry on our list.


A crocodile snatches a young boy from a riverbank. His mother pleads with the crocodile to return him, to which the crocodile replies that he will only return the boy safely if the mother can guess correctly whether or not he will indeed return the boy. There is no problem if the mother guesses that the crocodile will return him—if she is right, he is returned; if she is wrong, the crocodile keeps him. If she answers that the crocodile will not return him, however, we end up with a paradox: if she is right and the crocodile never intended to return her child, then the crocodile has to return him, but in doing so breaks his word and contradicts the mother’s answer. On the other hand, if she is wrong and the crocodile actually did intend to return the boy, the crocodile must then keep him even though he intended not to, thereby also breaking his word.

The Crocodile Paradox is such an ancient and enduring logic problem that in the Middle Ages the word "crocodilite" came to be used to refer to any similarly brain-twisting dilemma where you admit something that is later used against you, while "crocodility" is an equally ancient word for captious or fallacious reasoning.
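The dilemma can be phrased as a consistency check, encoding the crocodile's promise as "return the boy if and only if the mother's guess is correct." A sketch:

```python
def promise_kept(returns_boy, guess_return):
    """Check the crocodile's promise for one possible action.

    returns_boy:  whether the crocodile actually returns the boy.
    guess_return: the mother's guess about whether he will return him.
    The promise: return the boy exactly when the guess is correct.
    """
    guess_correct = (guess_return == returns_boy)
    return returns_boy == guess_correct

# If the mother guesses "you will return him", both outcomes honor the deal.
safe_options = [a for a in (True, False) if promise_kept(a, guess_return=True)]

# If she guesses "you will NOT return him", no action honors it:
# returning him contradicts her correct guess, keeping him breaks his word.
paradox_options = [a for a in (True, False) if promise_kept(a, guess_return=False)]
```

The empty second list is the paradox in miniature: once the mother guesses "keep," the crocodile has no action left that satisfies his own promise.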


Imagine that you’re about to set off walking down a street. To reach the other end, you’d first have to walk half way there. And to walk half way there, you’d first have to walk a quarter of the way there. And to walk a quarter of the way there, you’d first have to walk an eighth of the way there. And before that a sixteenth of the way there, and then a thirty-second of the way there, a sixty-fourth of the way there, and so on.

Ultimately, in order to perform even the simplest of tasks like walking down a street, you’d have to perform an infinite number of smaller tasks—something that, by definition, is utterly impossible. Not only that, but no matter how small the first part of the journey is said to be, it can always be halved to create another task; the only way in which it cannot be halved would be to consider the first part of the journey to be of absolutely no distance whatsoever, and in order to complete the task of moving no distance whatsoever, you can’t even start your journey in the first place.
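The escape hatch, as with Achilles, is that these infinitely many sub-tasks cover only a finite distance. A quick sketch of the partial sums:

```python
def distance_covered(stages):
    """Sum the first n stages of the walk: 1/2 + 1/4 + ... + 1/2^n,
    as a fraction of the whole street."""
    return sum(0.5**k for k in range(1, stages + 1))

# However many stages we pile on, the total creeps toward 1 (the full
# street) and never exceeds it.
progress = [distance_covered(n) for n in (1, 2, 3, 10, 50)]
```

Fifty stages already bring you within a quadrillionth of the far end, which is the modern mathematical answer to Zeno: an infinite series of ever-halving tasks can still have a finite, completable sum.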


Imagine a fletcher (i.e. an arrow-maker) has fired one of his arrows into the air. For the arrow to be considered to be moving, it has to be continually repositioning itself from the place where it is now to any place where it currently isn’t. The Fletcher’s Paradox, however, states that throughout its trajectory the arrow is actually not moving at all. At any given instant of no real duration (in other words, a snapshot in time) during its flight, the arrow cannot move to somewhere it isn’t because there isn’t time for it to do so. And it can’t move to where it is now, because it’s already there. So, for that instant in time, the arrow must be stationary. But because all time is composed entirely of instants—in every one of which the arrow must also be stationary—then the arrow must in fact be stationary the entire time. Except, of course, it isn’t.
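Modern calculus dissolves the paradox by defining velocity at an instant not as motion *within* the instant, but as the limit of average velocities over ever-shorter surrounding intervals. A sketch, using a hypothetical arrow flying at a constant 60 m/s (the numbers are purely illustrative):

```python
def position(t):
    """Hypothetical position of the arrow (metres) t seconds after launch,
    assuming constant 60 m/s flight for illustration."""
    return 60 * t

def average_velocity(t, dt):
    """Average velocity over the window [t, t + dt]."""
    return (position(t + dt) - position(t)) / dt

# Shrink the window toward the "instant": the averages settle on 60 m/s.
estimates = [average_velocity(2.0, dt) for dt in (1.0, 0.1, 0.001, 1e-6)]
```

The arrow never needs to "move during an instant"; its instantaneous velocity is simply the value these shrinking averages converge to, which is why a nonzero speed at a durationless snapshot is coherent after all.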


In his final written work, Discourses and Mathematical Demonstrations Relating to Two New Sciences (1638), the legendary Italian polymath Galileo Galilei proposed a mathematical paradox based on the relationships between different sets of numbers. On the one hand, he proposed, there are square numbers—like 1, 4, 9, 16, 25, 36, and so on. On the other, there are those numbers that are not squares—like 2, 3, 5, 6, 7, 8, 10, and so on. Put these two groups together, and surely there have to be more numbers in general than there are just square numbers—or, to put it another way, the total number of square numbers must be less than the total number of square and non-square numbers together. However, because every positive number has to have a corresponding square and every square number has to have a positive number as its square root, there cannot possibly be more of one than the other.

Confused? You’re not the only one. In his discussion of his paradox, Galileo was left with no alternative than to conclude that numerical concepts like more, less, or fewer can only be applied to finite sets of numbers, and as there are an infinite number of square and non-square numbers, these concepts simply cannot be used in this context.
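Galileo's one-to-one pairing can be written out directly: match every positive integer with its square and the two collections never run out of partners. A minimal sketch:

```python
def squares_up_to(n):
    """The first n square numbers: 1, 4, 9, ..., n*n."""
    return [k * k for k in range(1, n + 1)]

naturals = list(range(1, 11))
squares = squares_up_to(10)

# Element for element, the counts match, even though the squares thin
# out dramatically along the number line.
pairing = list(zip(naturals, squares))
```

However far you extend both lists, the pairing stays perfect, which is the modern resolution: two infinite sets are "the same size" precisely when such a one-to-one correspondence exists.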


Imagine that a farmer has a sack containing 100 lbs of potatoes. The potatoes, he discovers, are 99% water and 1% solids, so he leaves them in the heat of the sun for a day to let the amount of water in them reduce to 98%. But when he returns to them the day after, he finds his 100 lb sack now weighs just 50 lbs. How can this be true? Well, if 99% of 100 lbs of potatoes is water, then the water must weigh 99 lbs. The 1% of solids must ultimately weigh just 1 lb, giving a ratio of solids to liquids of 1:99. But if the potatoes are allowed to dehydrate to 98% water, the solids must now account for 2% of the weight—a ratio of 2:98, or 1:49—even though the solids must still only weigh 1 lb. The water, ultimately, must now weigh 49 lbs, giving a total weight of 50 lbs despite just a 1% reduction in water content. Or must it?

Although not a true paradox in the strictest sense, the counterintuitive Potato Paradox is a famous example of what is known as a veridical paradox, in which a basic theory is taken to a logical but apparently absurd conclusion.
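The arithmetic above can be packaged into a two-line calculation: the solids never change, so the new total is whatever weight makes those fixed solids the right percentage. A sketch:

```python
def dried_weight(total, water_fraction, new_water_fraction):
    """Weight after drying: only water leaves, the solids are fixed.

    solids = total * (1 - water_fraction); afterwards those same solids
    must make up (1 - new_water_fraction) of the new total.
    """
    solids = total * (1 - water_fraction)
    return solids / (1 - new_water_fraction)

# 100 lbs at 99% water, dried to 98% water, really does halve (~50 lbs).
weight = dried_weight(100, 0.99, 0.98)
```

The counterintuitive jump comes from the denominator: moving from 99% to 98% water doubles the solids' share from 1% to 2%, so the total must halve for 1 lb of solids to fit.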


Also known as Hempel’s Paradox, for the German logician who proposed it in the mid-1940s, the Raven Paradox begins with the apparently straightforward and entirely true statement that “all ravens are black.” This is matched by a “logically contrapositive” (i.e. negative and contradictory) statement that “everything that is not black is not a raven”—which, despite seeming like a fairly unnecessary point to make, is also true given that we know “all ravens are black.” Hempel argues that whenever we see a black raven, this provides evidence to support the first statement. But by extension, whenever we see anything that is not black, like an apple, this too must be taken as evidence supporting the second statement—after all, an apple is not black, and nor is it a raven.

The paradox here is that Hempel has apparently proved that seeing an apple provides us with evidence, no matter how unrelated it may seem, that ravens are black. It’s the equivalent of saying that living in New York is evidence that you don’t live in L.A., or that being 30 years old is evidence that you are not 29. Just how much information can one statement actually imply, anyway?
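The logical equivalence Hempel relies on can be tested over a toy universe of colored objects: the statement and its contrapositive always stand or fall together. A small sketch (the objects are invented for illustration):

```python
def all_ravens_black(objects):
    """'All ravens are black' over a list of (color, kind) pairs."""
    return all(color == "black" for color, kind in objects if kind == "raven")

def all_nonblack_nonraven(objects):
    """The contrapositive: 'everything not black is not a raven'."""
    return all(kind != "raven" for color, kind in objects if color != "black")

# In any universe, the two statements agree, so whatever confirms one
# confirms the other, green apples included.
world = [("black", "raven"), ("green", "apple"), ("red", "apple")]
odd_world = world + [("white", "raven")]
```

Adding a single white raven falsifies both statements at once, illustrating why evidence for either formulation counts, at least formally, as evidence for the other.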

The Origins of 5 International Food Staples

Food is more than fuel. Cuisine and culture are so thoroughly intertwined that many people automatically equate tomatoes with Italy and potatoes with Ireland. Yet a thousand years ago those dietary staples were unheard of in Europe. How did they get to be so ubiquitous there—and beyond?


For years, the wonderful fruit that’s now synonymous with Italy was mostly ignored there. Native to South America and likely cultivated in Central America, tomatoes were introduced to Italy by Spanish explorers during the 1500s. Shortly thereafter, widespread misconceptions about the newcomers took root. In part due to their watery complexion, it was inaccurately thought that eating tomatoes could cause severe digestive problems. Before the 18th century, the plants were mainly cultivated for ornamental purposes.

Tomato-based sauce recipes wouldn’t start appearing in present-day Italy until 1692 (although even those recipes were more like a salsa or relish than a sauce). Over the next 150 years, tomato products slowly spread throughout the peninsula, thanks in no small part to the agreeable Mediterranean climate. By 1773, some cooks had taken to stuffing tomatoes with rice or veal. In Naples, the fruits were sometimes chopped up and placed onto flatbread—the beginnings of modern pizza.

But what turned the humble tomato into a national icon was the canning industry. Within Italy’s borders, this business took off in a big way during the mid-to-late 19th century. Because tomatoes do well stored inside metal containers, canning companies dramatically drove up the demand. The popularity of canned tomatoes was later solidified by immigrants who came to the United States from Italy during the early 20th century: Longing for Mediterranean ingredients, transplanted families created a huge market for Italian-grown tomatoes in the US.



An international favorite, curry is beloved in both India and the British Isles, not to mention the United States. And it turns out humans may have been enjoying the stuff for a very, very long time. The word “curry” was coined by European colonists and is something of an umbrella term. In Tamil, a language primarily found in India and Sri Lanka, “kari” means “sauce.” When Europeans started traveling to India, the term was eventually modified into “curry,” which came to designate any number of spicy foods with South or Southeast Asian origins. Nonetheless, a great number of curry dishes share two popular components: turmeric and ginger. In 2012, traces of both were discovered inside residue caked onto pots and human teeth at a 4500-year-old archaeological site in northern India. And where there’s curry, there’s usually garlic: A carbonized clove of this plant was also spotted nearby. “We don’t know they were putting all of them together in a dish, but we know that they were eating them at least individually,” Steve Weber, one of the archaeologists who helped make this astonishing find, told The Columbian. He and his colleagues have tentatively described their discovery as "proto-curry."



A quintessential Gallic food, baguettes are adored throughout France, where residents gobble up an estimated 10 billion every year. The name of the iconic bread ultimately comes from the Latin word for stick, baculum, and references its long, slender form. How the baguette got that signature shape is a mystery.

One popular yarn credits Napoleon Bonaparte: Supposedly, the military leader asked French bakers to devise a new type of skinny bread loaf that could be comfortably tucked into his soldiers’ pockets. Another origin story involves the Paris metro, built in the 19th century by a team of around 3500 workers who were apparently sometimes prone to violence during meal times. It’s been theorized that the metro foremen tried to de-escalate the situation by introducing bread that could be broken into pieces by hand—thereby eliminating the need for laborers to carry knives. Alas, neither story is supported by much in the way of historical evidence.

Still, it’s clear that lengthy bread is nothing new in France: Six-foot loaves were a common sight in the mid-1800s. The baguette as we know it today, however, didn’t spring into existence until the early 20th century. The modern loaf is noted for its crispy golden crust and white, puffy center—both traits made possible by the advent of steam-based ovens, which first arrived on France’s culinary scene in the 1920s.



Historical records show that potatoes reached Ireland by the year 1600. Nobody knows who first introduced them; the list of potential candidates includes everyone from Sir Walter Raleigh to the Spanish Armada. Regardless, Ireland turned out to be a perfect habitat for the tubers, which hail from the misty slopes of the Andes Mountains in South America. Half a world away, Ireland’s rich soils and rainy climate provided similar conditions—and potatoes thrived there.

They also became indispensable. For millennia, the Irish diet had mainly consisted of dairy products, pig meats, and grains, none of which were easy for poor farmers to raise. Potatoes, on the other hand, were inexpensive, easy to grow, required fairly little space, and yielded an abundance of filling carbs. Soon enough, the average Irish peasant was subsisting almost entirely on potatoes, and the magical plant is credited with almost single-handedly triggering an Irish population boom. In 1590, only around 1 million people lived on the island; by 1840, that number had skyrocketed to 8.2 million.

Unfortunately, this near-total reliance on potatoes would have dire consequences for the Irish people. In 1845, a disease caused by fungus-like organisms killed off somewhere between one-third and one-half of the country’s potatoes. Roughly a million people died as a result, and almost twice as many left Ireland in a desperate mass exodus. Yet potatoes remained a cornerstone of the Irish diet after the famine ended; in 1899, one magazine reported that citizens were eating an average of four pounds’ worth of them every day.

Expatriates also brought their love of potatoes with them to other countries, including the U.S. But by then, the Yanks had already developed a taste for the crop: The oldest record of a permanent potato patch on American soil dates back to 1719. That year, a group of farmers—most likely Scots-Irish immigrants—planted one in the vicinity of modern-day Derry, New Hampshire. From these humble origins, the potato steadily rose in popularity, and by 1796, American cookbooks were praising its “universal use, profit, and easy acquirement.”



In the 1930s, geneticist George W. Beadle exposed a vital clue about how corn—also known as maize—came into existence. A future Nobel Prize winner, Beadle demonstrated that the chromosomes found in everyday corn bear a striking resemblance to those of a Mexican grass called teosinte. At first glance, teosinte may not look very corn-like. Although it does have kernels, these are few in number and encased in tough shells that can easily chip a human tooth. Nonetheless, years of work allowed Beadle to prove beyond a shadow of a doubt that corn was descended from teosinte. Today, genetic and archaeological data suggests that humans began the slow process of converting this grass into corn around 8700 years ago in southwestern Mexico. If you're wondering why early farmers showed any interest in cultivating teosinte to begin with: while the plant is fairly unappetizing in its natural state, it does have a few key attributes. One of these is the ability to produce popcorn: If held over an open fire, the kernels will “pop” just as our favorite movie theater treat does today. It might have been this very quality that inspired ancient horticulturalists to tinker around with teosinte—and eventually turn it into corn.



The United Kingdom’s ongoing love affair with this hot drink began somewhat recently. Tea—which is probably of Chinese origin—didn’t appear in Britain until the 1600s. Initially, the beverage was seen as an exotic curiosity with possible health benefits. Shipping costs and tariffs put a hefty price tag on tea, rendering it quite inaccessible to the lower classes. Even within England’s most affluent circles, tea didn’t really catch on until King Charles II married Princess Catherine of Braganza. By the time they tied the knot in 1662, tea-drinking was an established pastime among the elite in her native Portugal. Once Catherine was crowned Queen, tea became all the rage in her husband’s royal court. From there, its popularity slowly grew over several centuries and eventually transcended socioeconomic class. At present, the average Brit drinks an estimated three and a half cups of tea every day.


Warner Home Video
10 Filling Facts About A Charlie Brown Thanksgiving

Though it may not be as widely known as It’s the Great Pumpkin, Charlie Brown or A Charlie Brown Christmas, A Charlie Brown Thanksgiving has been a beloved holiday tradition for many families for more than 40 years now. Even if you've seen it 100 times, there’s still probably a lot you don’t know about this Turkey Day special.


We all know the trombone “wah wah wah” sound that Charlie Brown’s teacher makes when speaking in a Peanuts special. But A Charlie Brown Thanksgiving, which was released in 1973, made history as the first Peanuts special to feature a real, live, human adult voice. It’s not a speaking voice, though—it’s heard in the song “Little Birdie.”


Being the first adult to lend his or her voice to a Peanuts special was kind of a big deal, so it makes sense that the honor wasn’t bestowed on just any old singer or voice actor. The song was performed by composer Vince Guaraldi, whose memorable compositions have become synonymous with Charlie Brown and the rest of the gang.

“Guaraldi was one of the main reasons our shows got off to such a great start,” Lee Mendelson, the Emmy-winning producer who worked on many of the Peanuts specials—including A Charlie Brown Thanksgiving—wrote for The Huffington Post in 2013. “His ‘Linus and Lucy,’ introduced in A Charlie Brown Christmas, set the bar for the first 16 shows for which he created all the music. For our Thanksgiving show, he told me he wanted to sing a new song he had written for Woodstock. I agreed with much trepidation as I had never heard him sing a note. His singing of ‘Little Birdie’ became a hit."


While Peanuts specials are largely populated by children, there’s usually at least an adult or two seen or heard somewhere. That’s not the case with A Charlie Brown Thanksgiving. “Charlie Brown Thanksgiving may be the only Thanksgiving special (live or animated) that does not include adults,” Mendelson wrote for HuffPo. “Our first 25 specials honored the convention of the comic strip where no adults ever appeared. (Ironically, our Mayflower special does include adults for the first time.)”


Though early on in the special, viewers get that staple scene of Lucy pulling a football away from Charlie Brown at the last minute, that’s all we see of Chuck’s nemesis in A Charlie Brown Thanksgiving. (Lucy's brother, Linus, however, is still a main character.)


Though they only had a single scene together, Todd Barbee, who voiced Charlie Brown, told Noblemania that he and Robin Kohn, who voiced Lucy in the Thanksgiving special, still keep in touch. “We actually went to high school together,” Barbee said. “We still live in Marin County, are Facebook friends, and occasionally see each other.”


One unique aspect of the Peanuts specials is that the bulk of the characters are voiced by real kids. In the case of A Charlie Brown Thanksgiving, 10-year-old newcomer Todd Barbee was tasked with giving a voice to Charlie Brown—and it wasn’t always easy.

“One time they wanted me to voice that ‘AAAAAAARRRRRGGGGG’ when Charlie Brown goes to kick the football and Lucy yanks it away,” Barbee recalled to Noblemania in 2014. “Try as I might, I just couldn’t generate [it as] long [as] they were looking for … so after something like 25 takes, we moved on. I was sweating the whole time. I think they eventually got an adult or a kid with an older voice to do that one take."


While Barbee got a crash course in the downside of celebrity at a very early age—“seeing my name printed in TV Guide made everyone around me go bananas … everybody … just thought I was some big movie star or something,” he told Noblemania—Stephen Shea, who voiced Linus, still gets a pretty big reaction.

"I don't walk around saying 'I'm the voice of Linus,'" Shea told the Los Angeles Times in 2013. "But when people find out one way or another, they scream 'I love Linus. That is my favorite character!'"


As is often the case in a Peanuts special, Linus gets to play the role of philosopher in A Charlie Brown Thanksgiving and remind his friends (and the viewers) about the history and true meaning of whatever holiday they’re celebrating. His speech about the Pilgrims’ first Thanksgiving eventually led to This is America, Charlie Brown: The Mayflower Voyagers, a kind of spinoff adapted from that Thanksgiving Day prayer, which sees the Peanuts gang becoming a part of history.


In writing for HuffPo for A Charlie Brown Thanksgiving’s 40th anniversary, Mendelson admitted that one particular scene in the special led to “a rare, minor dispute during the creation of the show. Mr. Schulz insisted that Woodstock join Snoopy in carving and eating a turkey. For some reason I was bothered that Woodstock would eat a turkey. I voiced my concern, which was immediately overruled.”


Though Mendelson lost his original argument against seeing Woodstock eating another bird, he was eventually able to right that wrong. “Years later, when CBS cut the show from its original 25 minutes to 22 minutes, I sneakily edited out the scene of Woodstock eating,” he wrote. “But when we moved to ABC in 2001, the network (happily) elected to restore all the holiday shows to the original 25 minutes, so I finally have given up.”

