11 Inventions That Came Before the Wheel

ira_paradox Flickr // CC BY-NC-ND 2.0

The wheel is the classic example of early human invention—a quintessential innovation that distinguishes Homo sapiens from all other animals. But in the scope of human history, the wheel is actually a rather young creation. Ancient Mesopotamians in modern-day Iraq became the first people to adopt the wheel only around 5500 years ago, and fairly recent cultures in other parts of the world achieved impressive technical feats without wheels at all. (The wheel-less people of Easter Island, for example, transported and erected their towering moai statues less than 1000 years ago.) From booze to the bow and arrow, here are 11 innovations that predate the wheel.

1. BOOZE // 7000 BCE

variety of cocktails on a bar
iStock

Some archaeologists are starting to think that the world's first farmers domesticated grains to make beer, not bread. While the extent of alcohol's influence on human civilization is still debated, its antiquity is not. The oldest evidence for booze so far comes from 9000-year-old chemical traces of a fermented cocktail found on a drinking vessel in Jiahu, China.

2. CLOTHING // 150,000 BCE

A dress discovered in Egypt that is more than 5000 years old
UCL Petrie Museum of Egyptian Archaeology

We're all born naked, but most of us are forced to wear clothes shortly afterwards. Since textiles, leathers, and furs tend to disintegrate over time, scientists have had to get creative in their quest to pinpoint the origin of clothing. The dress above, discovered in Egypt, is at least 5100 years old, but that makes it pretty recent. Clothes actually date back much further: A stone tool from a site in Germany has traces of tanned animal skin, which suggests that humans' Neanderthal cousins were wearing hides 100,000 years ago, and a study from 2011 proposed that the origin of clothes can be traced to the evolution of clothing lice, around 170,000 years ago.

3. JEWELRY // 110,000 BCE

K. Gavrilov in Antiquity Publications Ltd, 2018

Garments certainly helped humans to compensate for lost body fur and to move into colder climates, but clothes may have also been a cultural invention. As archaeological evidence of jewelry can attest, humans have also been adorning their bodies for decorative purposes for a very long time. Among the oldest surviving pieces of jewelry are 82,000-year-old pierced shells covered in red pigment from a cave in Morocco and a 130,000-year-old eagle-claw necklace found in a Neanderthal cave in Croatia. The above burial, found in Russia at a site called Sunghir, is younger, but still ancient: The man was buried more than 30,000 years ago with an elaborate array of mammoth ivory beads and arm bands, a headband of pierced fox teeth, and a pendant. (Some of the items may once have been sewn onto clothing.)

4. BOATS // 43,000–8000 BCE

Dugout boats at Kierikki Stone Age Centre

Before animal-drawn carts became a preferred mode of transport, there were rafts and boats. The 10,000-year-old Pesse canoe found in the Netherlands is thought to be the world's oldest surviving boat. But humans likely figured out how to navigate the seas for fishing and exploration even earlier. After all, people somehow crossed the seas to populate Australia, Indonesia, and islands in the Pacific at least 45,000 years ago.

5. CALENDARS // 8000 BCE

An illustration of how a 10,000-year-old lunar calendar may have worked
© Google Earth, Plan based on Murray et al. 2009, fig. 3, in Internet Archaeology // CC BY 3.0

Long before the gear-wheels of clocks were invented, humans used sophisticated methods to track the passage of time. One group of archaeologists has claimed that the oldest known calendar could be a 10,000-year-old series of 12 pits found in Scotland that appear to mimic the lunar cycle. You can see in the image above how the researchers imagine the system to have worked.

6. GEOGRAPHIC MAPS // 12,000 BCE

12000-year-old
Utrilla et al. in Journal of Human Evolution

Just as they had to invent ways to track time, so, too, did humans have to figure out how to represent space so that they could navigate their world. Archaeologists still debate the meaning of the earliest rock art, but some of the oldest examples of possible prehistoric maps come from Abauntz Cave in Spain. The 14,000-year-old stone tablets are thought to depict mountains, rivers, and ponds, intersected with routes and hunting game-plans. You can see the top and bottom of one tablet above.

7. COOKING // 1.8 MILLION–500,000 BCE

iStock

Sometime after humans learned to control fire, they invented cooking. When you start breaking down meat and plants over an open flame, you don't have to expend as much energy chewing and digesting those foods. A conservative estimate for the rise of cooking would be 500,000 years ago, but according to a recent article in Scientific American, some researchers argue that Homo erectus, a direct ancestor of Homo sapiens, began cooking as early as 1.8 million years ago. They propose that this development in human evolution is what allowed our brain size to increase.

8. MUSICAL INSTRUMENTS // 41,000 BCE

bone flute
Sascha Schuermann, AFP/Getty Images

The darkened passageways inside Germany's Hohle Fels cave get even spookier when you imagine the sounds of flutes echoing through the caverns. This is the archaeological site where the world's oldest musical instruments—43,000-year-old bone flutes made of vulture wing and mammoth tusk—have been found. Want to hear what they might have sounded like? One researcher made a replica of the vulture-wing flute, and NPR has the tune.

9. GLUE // 200,000 BCE

glue spilling from bottle onto wood table
iStock

The superglue in your toolbox and Elmer's in your kid's classroom have a long pedigree. About 200,000 years ago, Neanderthals roaming Europe used adhesive tar from birch bark to fix their stone spear tips to handles. Recent experiments suggest this type of glue was complex and difficult to make.

10. POTTERY // 18,000 BCE

archaeologist with ancient pottery
Marvin Recinos, AFP/Getty Images

Thousands of years before the invention of the wheel, people were making vessels for drinking, eating, and storage by pinching, rolling, or coiling clay into shape and baking it until hard. The oldest crude ceramic vessels come from China and date back 20,000 years. The invention of the wheel allowed for the rise of wheel-thrown pottery. Some even argue that the potter's wheel was probably the first type of wheel ever created.

11. BOW AND ARROW // 7000 BCE

rock art of hunters using bows and arrows
iStock

The remains of five bows crafted 9000 years ago were found at the Stone Age settlement of Holmegårds Mose in Denmark. But bows and arrows may have been invented far earlier by savvy hunters who wanted an efficient weapon to kill prey from a distance. Some archaeologists have argued that Sibudu Cave in South Africa contains evidence of 64,000-year-old stone-tipped arrows and bows.

Why the Filet-O-Fish Sandwich Has Been on the McDonald's Menu for Nearly 60 Years

McDonald's has introduced and quietly killed many dishes over the years (remember McDonald's pizza?), but there's a core group of items that have held their spot on the menu for decades. Listed alongside the Big Mac and McNuggets is the Filet-O-Fish—a McDonald's staple you may have forgotten about if you're not the type of person who orders seafood from fast food restaurants. But the classic sandwich, consisting of a fried fish filet, tartar sauce, and American cheese on a bun, didn't get on the menu by mistake—and thanks to its popularity around Lent, it's likely to stick around.

According to Taste of Home, the inception of the Filet-O-Fish can be traced back to a McDonald's franchise that opened near Cincinnati, Ohio in 1959. Back then the restaurant offered beef burgers as its only main dish, and for most of the year, diners couldn't get enough of them. Things changed during Lent: Many Catholics abstain from eating meat and poultry on Fridays during the holy season as a form of fasting, and in the early 1960s, Cincinnati was more than 85 percent Catholic. Fridays are typically among the busiest days of the week for restaurants, but sales at the Ohio McDonald's took a nosedive every Friday leading up to Easter.

Franchise owner Lou Groen went to McDonald's founder Ray Kroc with the plan of adding a meat alternative to the menu to lure back Catholic customers. He proposed a fried halibut sandwich with tartar sauce (though meat is off-limits for Catholics on Fridays during Lent, seafood doesn't count as meat). Kroc didn't love the idea, citing his fears of stores smelling like fish, and suggested a "Hula Burger" made from a pineapple slice with cheese instead. To decide which item would earn a permanent place on the menu, they put the two sandwiches head to head at Groen's McDonald's one Friday during Lent.

The restaurant sold 350 Filet-O-Fish sandwiches that day—clearly beating the Hula Burger (though exactly how many pineapple burgers sold, Kroc wouldn't say). The basic recipe has received a few tweaks, switching from halibut to the cheaper cod and from cod to the more sustainable Alaskan pollock, but the Filet-O-Fish has remained part of the McDonald's lineup in some form ever since. Today 300 million of the sandwiches are sold annually, and about a quarter of those sales are made during Lent.

Other seafood products McDonald's has introduced haven't had the same staying power as the Filet-O-Fish. In 2013, the chain rolled out Fish McBites, a chickenless take on McNuggets, only to pull them from menus that same year.

[h/t Taste of Home]

The Disturbing Reason Schools Tattooed Their Students in the 1950s

Kurt Hutton, Hulton Archive/Getty Images

When Paul Bailey was born at Beaver County Hospital in Milford, Utah on May 9, 1955, it took less than two hours for the staff to give him a tattoo. Located on his torso under his left arm, the tiny marking was rendered in indelible ink with a needle gun and indicated Bailey’s blood type: O-Positive.

“It is believed to be the youngest baby ever to have his blood type tattooed on his chest,” reported the Beaver County News, coolly referring to the infant as an “it.” A hospital employee was quick to note parental consent had been obtained first.

The permanent tattooing of a child who was only hours old was not met with any hysteria. Just the opposite: In parts of Utah and Indiana, local health officials had long been hard at work instituting a program that would facilitate potentially life-saving blood transfusions in the event of a nuclear attack. By branding children and adults alike with their blood type, officials could immediately identify donors and use them as “walking blood banks” for the critically injured.

Taken out of context, it seems unimaginable. But in the 1950s, when the Cold War was at its apex and atomic warfare appeared not only possible but likely, children willingly lined up at schools to perform their civic duty. They raised their arm, gritted their teeth, and held still while the tattoo needle began piercing their flesh.


The practice of subjecting children to tattoos for blood-typing has appropriately morbid roots. Testifying at the Nuremberg Tribunal on War Crimes in the 1940s, American Medical Association physician Andrew Ivy observed that members of the Nazi Waffen-SS carried body markings indicating their blood type [PDF]. When he returned to his hometown of Chicago, Ivy carried with him a solution for quickly identifying blood donors—a growing concern due to the outbreak of the Korean War in 1950. The conflict was depleting blood banks of inventory, and it was clear that reserves would be necessary.

School children sit next to one another circa the 1950s
Reg Speller, Fox Photos/Getty Images

If the Soviet Union targeted areas of the United States for destruction, it would be vital to have a protocol for blood transfusions to treat radiation poisoning. Matches would need to be found quickly. (Transfusions depend on matching blood to avoid the adverse reactions that come from mixing different types. When a person receives blood different from their own, the body will create antibodies to destroy the red blood cells.)
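The matching rules described in that parenthetical can be sketched as a simple compatibility check. This is a deliberately simplified model for illustration only—real transfusion medicine relies on laboratory crossmatching and many more antigen systems than ABO and Rh:

```python
# Which ABO groups each recipient can safely receive red cells from:
# a recipient's antibodies attack any ABO antigen they lack.
ABO_DONORS = {
    "O": {"O"},
    "A": {"A", "O"},
    "B": {"B", "O"},
    "AB": {"AB", "A", "B", "O"},
}

def can_receive(recipient: str, donor: str) -> bool:
    """Return True if donor red cells are compatible with the recipient.

    Blood types are strings like "O+" or "AB-". Rh-negative recipients
    should not receive Rh-positive blood.
    """
    r_abo, r_rh = recipient[:-1], recipient[-1]
    d_abo, d_rh = donor[:-1], donor[-1]
    if d_abo not in ABO_DONORS[r_abo]:
        return False  # recipient's anti-A or anti-B antibodies would react
    return not (r_rh == "-" and d_rh == "+")

print(can_receive("O+", "O-"))  # True: O-negative is the universal donor
print(can_receive("A-", "B-"))  # False: anti-B antibodies would react
```

This is exactly why the tattoos recorded both the ABO group and the Rh factor—both values are needed before a match can even be considered.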

In 1950, the Department of Defense placed the American Red Cross in charge of blood donor banks for the armed forces. In 1952, the Red Cross was the coordinating agency [PDF] for obtaining blood from civilians for the National Blood Program, which was meant to replenish donor supply during wartime. Those were both measures for soldiers. Meanwhile, local medical societies were left to determine how best to prepare their civilian communities for a nuclear event and its aftermath.

As part of the Chicago Medical Civil Defense Committee, Ivy promoted the use of the tattoos, declaring them as painless as a vaccination. Residents would get blood-typed by having their finger pricked and a tiny droplet smeared on a card. From there, they would be tattooed with the ABO blood group and Rhesus factor (or Rh factor), which denotes whether or not a person has a certain type of blood protein present.

The Chicago Medical Society and the Board of Health endorsed the program and citizens voiced a measure of support for it. One letter to the editor of The Plainfield Courier-News in New Jersey speculated it might even be a good idea to tattoo Social Security numbers on people's bodies to make identification easier.

Despite such marked enthusiasm, the project never entered into a pilot testing stage in Chicago.

Officials with the Lake County Medical Society in nearby Lake County, Indiana were more receptive to the idea. In the spring of 1951, 5000 residents were blood-typed using the card method. But, officials cautioned, the cards could be lost in the chaos of war or even the relative quiet of everyday life. Tattoos and dog tags were encouraged instead. When 1000 people lined up for blood-typing at a county fair, two-thirds agreed to be tattooed as part of what the county had dubbed "Operation Tat-Type." By December 1951, 15,000 Lake County residents had been blood-typed. Roughly 60 percent opted for a permanent marking.

The program was so well-received that the Lake County Medical Society quickly moved toward making children into mobile blood bags. In January 1952, five elementary schools in Hobart, Indiana enrolled in the pilot testing stage. Children were sent home with permission slips explaining the effort. If parents consented, students would line up on appointed tattoo days to get their blood typed with a finger prick. From there, they’d file into a room—often the school library—set up with makeshift curtains behind which they could hear a curious buzzing noise.

When a child stepped inside, they were greeted by a school administrator armed with indelible ink and wielding a Burgess Vibrotool, a medical tattoo gun featuring 30 to 50 needles. The child would raise their left arm to expose their torso (since arms and legs might be blown off in an attack) and was told the process would only take seconds.

A child raises his hand in class circa the 1950s
Vecchio/Three Lions/Getty Images

Some children were stoic. Some cried before, during, or after. One 11-year-old recounting her experience with the program said a classmate emerged from the session and promptly fainted. All were left with a tattoo less than an inch in diameter on their left side, intentionally pale so it would be as unobtrusive as possible.

At the same time that grade schoolers—and subsequently high school students—were being imprinted in Indiana, kids in Cache and Rich counties in Utah were also submitting to the program, despite potential religious obstacles for the region's substantial Mormon population. In fact, Bruce McConkie, a representative of the Church of Jesus Christ of Latter-Day Saints, declared that blood-type tattoos were exempt from the typical prohibitions on Mormons defacing their bodies, giving the program a boost among the devout. The experiment would not last much longer, though.


By 1955, 60,000 adults and children had gotten tattooed with their blood types in Lake County. In Milford, health officials persisted in promoting the program widely, offering the tattoos for free during routine vaccination appointments. But despite the cooperation exhibited by communities in Indiana and Utah, the programs never spread beyond their borders.

The Korean conflict had come to an end in 1953, reducing the strain put on blood supplies and, along with it, the need for citizens to double as walking blood banks. More importantly, outside of the program's avid boosters, most physicians were reluctant to rely solely on a tattoo for blood-typing. They preferred to do their own testing to make certain a donor was a match with a patient.

There were other logistical challenges that made the program less than useful. The climate of a post-nuclear landscape meant that bodies might be charred, burning off tattoos and rendering the entire operation largely pointless. With the Soviet Union’s growing nuclear arsenal—1600 warheads were ready to take to the skies by 1960—the idea of civic defense became outmoded. Ducking and covering under desks, which might have shielded some from the immediate effects of a nuclear blast, would be meaningless in the face of such mass destruction.

Programs like tat-typing eventually fell out of favor, yet tens of thousands of adults consented to participate even after the flaws in the program were publicized, and a portion allowed their young children to be marked, too. Their motivation? According to Carol Fischler, who spoke with the podcast 99% Invisible about being tattooed as a young girl in Indiana, the paranoia over the Cold War in the 1950s drowned out any thought of the practice being outrageous or harmful. Kids wanted to do their part. Many nervously bit their lip but still lined up with the attitude that the tattoo was part of being a proud American.

Perhaps equally important, children who complained of the tattoo leaving them particularly sore received another benefit: They got the rest of the afternoon off.
