When a 1986 Meeting Between Ronald Reagan and Mikhail Gorbachev Wreaked Havoc on Iceland

World History Archive, Alamy

With its Blue Lagoon thermal spa and unrivaled views of the Northern Lights, Iceland is one of the world's top tourist destinations, drawing over 2 million visitors last year alone. A few decades ago, however, it was a different story. In 1986, when the island nation—population 240,000—was asked to host an important summit between the U.S. and the Soviet Union, its emergence on the global stage that autumn was swift and chaotic. The planned meeting between U.S. President Ronald Reagan and Soviet leader Mikhail Gorbachev was the largest international event that Iceland had ever been asked to host—and the country had been given just 10 days to prepare.

At the time, Iceland was one of the “world’s most isolated nations,” according to The New York Times, and White House officials chose to host the summit in its capital city, Reykjavík, for precisely that reason. Reagan and Gorbachev planned to discuss the reduction of their nuclear arsenals—a continuation of a conversation held the previous year in Geneva, Switzerland—and hoped to reach an arms-control agreement. White House officials said Reykjavík would afford them a greater degree of privacy than London, the other proposed option. It was also a slightly shorter flight from the U.S.

In the '80s, few Americans knew much about Iceland, which was derisively referred to as "a gallows of slush" and a "place of fish." The country’s U.S.-educated prime minister at the time, Steingrimur Hermannsson, told a reporter that Americans had asked him if Icelanders lived in igloos.

A "CRITICAL SHORTAGE" OF BEDS

Still, Icelandic officials were all too happy to host the summit, which coincided with Reykjavík's 200th anniversary as a city. “What a wonderful anniversary gift for Reykjavík,” Hermannsson said after the announcement. His enthusiasm soon turned to doubt when he “began to think of all the problems"—the inevitable traffic jams and heightened security demands, as well as the country's shortage of hotel rooms.

Reykjavík didn’t exactly have the infrastructure to support such a large gathering. About 2,000 officials and journalists would fly in to attend the summit—roughly the same as the number of hotel rooms in the entire city. As the arrangements were being made, many officials worried they'd have no choice but to shack up together in cramped rooms.

White House staffer William Henkel likely felt something akin to déjà vu. In 1973, when Richard Nixon met French President Georges Pompidou in the Icelandic capital to discuss trade policy, Henkel said there was a similar “critical shortage” of beds. "It's not even room we're worried about, it's beds," Henkel said prior to the 1986 summit. "We're counting every bed we have. That's the engine that's driving this summit."

To make matters worse, there was a small brouhaha when the U.S. learned that Gorbachev would be bringing a plus-one to the summit. The White House’s spokesman learned while watching Icelandic television that Raisa M. Gorbachev, wife of the Soviet leader, would be tagging along on the trip. Nancy Reagan was reportedly peeved at her Russian counterpart’s last-minute change of plans—the First Lady didn't want to be upstaged—and she ultimately decided to stay home. Another White House official dismissed the drama. “We don’t have a bilateral agreement where one First Lady has to show up when the other one does,” he said.

"IT'S GOING TO BE GREAT ... WHEN IT'S OVER"

A press pass for the Reykjavík Summit in Iceland, photographed in London on January 22, 2017
John Voos, Alamy

Iceland did its best to accommodate the leaders, though. "What doesn't anybody do for guys like Gorbachev and Reagan?" Kjartan Larusson, Iceland's Director of Tourism, said before the summit. "If something bad happens this weekend, Iceland may as well pack up and go all the way back to the North Pole."

The chances of that actually happening were alarmingly high. Gentle, law-abiding Iceland was unprepared for the world's sudden attention: Reykjavík had a population of just 85,000, and the city rarely made international news. Unemployment was at 1 percent, and crime was so seldom reported that many citizens left their front doors unlocked. The country didn’t see its first bank robbery (and first armed robbery in general) until 1985. There was only one television station, which shut down on Thursdays, and Hermannsson, the prime minister, got his news the same way a nosy neighbor in a small town would: by walking down to the local public pool and chatting with the swimmers. "We sit around the pool and talk," Hermannsson said. "That's how I find out what's going on."

Before the world leaders arrived, Hermannsson put the logistical plan into play. First, the government “seized” four of the capital’s largest hotels and reserved them for U.S. and Soviet officials. Icelandair cut short the holidays of vacationing pilots and flight attendants and added 15 flights from the U.S. to Iceland. Two separate conventions scheduled for the capital were rerouted to other locations at the government's urging. "Unfortunately, we had to kick them out to make room," Icelandair president Sigurdur Helgason said. Two schools called off classes so that the buildings could serve as a press center for the gaggle of international journalists expected to descend upon the city. "It will be a real problem, since both parents in most families work," one teacher told the Times. "But for me, it's a 10-day holiday."

The government also asked Icelanders to eat at home that weekend, so that the diplomats would be able to reserve restaurant tables. "There will be inconveniences, but I cannot imagine any Icelander will mind," Hermannsson said. He may have slightly overestimated their patience, though. One taxi driver, who had been hired to wait around for CBS executives at the summit, told the Times, "This is ridiculous!" Another driver, Petur Sviensson, said of the whole ordeal, "It's going to be great ... when it's over."

Iceland did manage to survive the summit—and some citizens had commemorative T-shirts to prove it. Shirts bearing "the likeness" of the world leaders and the words "Reagan-Gorbachev Reykjavik October '86" had been making the rounds. (No word on whether you can still score one of these tees today, though.)

So was the summit worth all the fuss? That depends on whom you ask. Although no deal was reached, it was later lauded by some historians and politicians as a “turning point” in the Cold War because it started a discussion that later led to nuclear weapons reform. The following year, the two countries signed the Intermediate-Range Nuclear Forces Treaty (INF Treaty), in which they agreed to eliminate their medium-range missiles.

For its part, Iceland has since gone on to host other global gatherings over the years, although none have been as large, as historic, or as disruptive as the one in 1986.

Why the Filet-O-Fish Sandwich Has Been on the McDonald's Menu for Nearly 60 Years

McDonald's has introduced and quietly killed many dishes over the years (remember McDonald's pizza?), but there's a core group of items that have held their spot on the menu for decades. Listed alongside the Big Mac and McNuggets is the Filet-O-Fish—a McDonald's staple you may have forgotten about if you're not the type of person who orders seafood from fast food restaurants. But the classic sandwich, consisting of a fried fish filet, tartar sauce, and American cheese on a bun, didn't get on the menu by mistake—and thanks to its popularity around Lent, it's likely to stick around.

According to Taste of Home, the inception of the Filet-O-Fish can be traced back to a McDonald's franchise that opened near Cincinnati, Ohio in 1959. Back then the restaurant offered beef burgers as its only main dish, and for most of the year, diners couldn't get enough of them. Things changed during Lent: Many Catholics abstain from eating meat and poultry on Fridays during the holy season as a form of fasting, and in the early 1960s, Cincinnati was more than 85 percent Catholic. Friday is typically one of the busiest days of the week for restaurants, but sales at the Ohio McDonald's took a nosedive every Friday leading up to Easter.

Franchise owner Lou Groen went to McDonald's founder Ray Kroc with the plan of adding a meat alternative to the menu to lure back Catholic customers. He proposed a fried halibut sandwich with tartar sauce (though meat is off-limits for Catholics on Fridays during Lent, fish is permitted). Kroc didn't love the idea, citing his fears of stores smelling like fish, and suggested a "Hula Burger" made from a pineapple slice with cheese instead. To decide which item would earn a permanent place on the menu, they put the two sandwiches head to head at Groen's McDonald's one Friday during Lent.

The restaurant sold 350 Filet-O-Fish sandwiches that day—handily beating the Hula Burger (though Kroc never said exactly how many pineapple burgers sold). The basic recipe has received a few tweaks, switching from halibut to the cheaper cod and from cod to the more sustainable Alaskan pollock, but the Filet-O-Fish has remained part of the McDonald's lineup in some form ever since. Today 300 million of the sandwiches are sold annually, and about a quarter of those sales are made during Lent.

Other seafood products McDonald's has introduced haven't had the same staying power as the Filet-O-Fish. In 2013, the chain rolled out Fish McBites, a chickenless take on McNuggets, only to pull them from menus that same year.

[h/t Taste of Home]

The Disturbing Reason Schools Tattooed Their Students in the 1950s

Kurt Hutton, Hulton Archive/Getty Images

When Paul Bailey was born at Beaver County Hospital in Milford, Utah on May 9, 1955, it took less than two hours for the staff to give him a tattoo. Located on his torso under his left arm, the tiny marking was rendered in indelible ink with a needle gun and indicated Bailey’s blood type: O-Positive.

“It is believed to be the youngest baby ever to have his blood type tattooed on his chest,” reported the Beaver County News, coolly referring to the infant as an “it.” A hospital employee was quick to note parental consent had been obtained first.

The permanent tattooing of a child who was only hours old was not met with any hysteria. Just the opposite: In parts of Utah and Indiana, local health officials had long been hard at work instituting a program that would facilitate potentially life-saving blood transfusions in the event of a nuclear attack. By branding children and adults alike with their blood type, donors could be immediately identified and used as “walking blood banks” for the critically injured.

Taken out of context, it seems unimaginable. But in the 1950s, when the Cold War was at its apex and atomic warfare appeared not only possible but likely, children willingly lined up at schools to perform their civic duty. They raised their arm, gritted their teeth, and held still while the tattoo needle began piercing their flesh.


The practice of subjecting children to tattoos for blood-typing has appropriately morbid roots. Testifying at the Nuremberg Tribunal on War Crimes in the 1940s, American Medical Association physician Andrew Ivy observed that members of the Nazi Waffen-SS carried body markings indicating their blood type [PDF]. When he returned to his hometown of Chicago, Ivy carried with him a solution for quickly identifying blood donors—a growing concern due to the outbreak of the Korean War in 1950. The conflict was depleting blood banks of inventory, and it was clear that reserves would be necessary.

School children sit next to one another circa the 1950s
Reg Speller, Fox Photos/Getty Images

If the Soviet Union targeted areas of the United States for destruction, it would be vital to have a protocol for blood transfusions to treat radiation poisoning. Matches would need to be found quickly. (Transfusions depend on matching blood to avoid the adverse reactions that come from mixing different types. When a person receives blood different from their own, the body will create antibodies to destroy the red blood cells.)

In 1950, the Department of Defense placed the American Red Cross in charge of blood donor banks for the armed forces. In 1952, the Red Cross was the coordinating agency [PDF] for obtaining blood from civilians for the National Blood Program, which was meant to replenish donor supply during wartime. Those were both measures for soldiers. Meanwhile, local medical societies were left to determine how best to prepare their civilian communities for a nuclear event and its aftermath.

As part of the Chicago Medical Civil Defense Committee, Ivy promoted the use of the tattoos, declaring them as painless as a vaccination. Residents would get blood-typed by having their finger pricked and a tiny droplet smeared on a card. From there, they would be tattooed with the ABO blood group and Rhesus factor (or Rh factor), which denotes whether or not a person has a certain type of blood protein present.

The Chicago Medical Society and the Board of Health endorsed the program and citizens voiced a measure of support for it. One letter to the editor of The Plainfield Courier-News in New Jersey speculated it might even be a good idea to tattoo Social Security numbers on people's bodies to make identification easier.

Despite such marked enthusiasm, the project never entered into a pilot testing stage in Chicago.

Officials with the Lake County Medical Society in nearby Lake County, Indiana were more receptive to the idea. In the spring of 1951, 5,000 residents were blood-typed using the card method. But, officials cautioned, the cards could be lost in the chaos of war or even the relative quiet of everyday life. Tattoos and dog tags were encouraged instead. When 1,000 people lined up for blood-typing at a county fair, two-thirds agreed to be tattooed as part of what the county had dubbed "Operation Tat-Type." By December 1951, 15,000 Lake County residents had been blood-typed. Roughly 60 percent opted for a permanent marking.

The program was so well-received that the Lake County Medical Society quickly moved toward making children into mobile blood bags. In January 1952, five elementary schools in Hobart, Indiana enrolled in the pilot testing stage. Children were sent home with permission slips explaining the effort. If parents consented, students would line up on appointed tattoo days to get their blood typed with a finger prick. From there, they’d file into a room—often the school library—set up with makeshift curtains behind which they could hear a curious buzzing noise.

When a child stepped inside, they were greeted by a school administrator armed with indelible ink and wielding a Burgess Vibrotool, a medical tattoo gun featuring 30 to 50 needles. The child would raise their left arm to expose their torso (since arms and legs might be blown off in an attack) and was told the process would take only seconds.

A child raises his hand in class circa the 1950s
Vecchio/Three Lions/Getty Images

Some children were stoic. Some cried before, during, or after. One 11-year-old recounting her experience with the program said a classmate emerged from the session and promptly fainted. All were left with a tattoo less than an inch in diameter on their left side, intentionally pale so it would be as unobtrusive as possible.

At the same time that grade schoolers—and subsequently high school students—were being imprinted in Indiana, kids in Cache and Rich counties in Utah were also submitting to the program, despite potential religious obstacles for the region's substantial Mormon population. In fact, Bruce McConkie, a representative of the Church of Jesus Christ of Latter-day Saints, declared that blood-type tattoos were exempt from the typical prohibitions on Mormons defacing their bodies, giving the program a boost among the devout. The experiment would not last much longer, though.


By 1955, 60,000 adults and children had gotten tattooed with their blood types in Lake County. In Milford, health officials persisted in promoting the program widely, offering the tattoos for free during routine vaccination appointments. But despite the cooperation exhibited by communities in Indiana and Utah, the programs never spread beyond their borders.

The Korean conflict had come to an end in 1953, reducing the strain put on blood supplies—and along with it, the need for citizens to double as walking blood banks. More importantly, outside of the program's avid boosters, most physicians were extremely reluctant to rely solely on a tattoo for blood-typing. They preferred to do their own testing to make certain a donor was a match with a patient.

There were other logistical challenges that made the program less than useful. The conditions of a post-nuclear landscape meant that bodies might be charred, burning off tattoos and rendering the entire operation largely pointless. With the Soviet Union’s growing nuclear arsenal—1,600 warheads were ready to take to the skies by 1960—the idea of civil defense became outmoded. Ducking and covering under desks, which might have shielded some from the immediate effects of a nuclear blast, would be meaningless in the face of such mass destruction.

Programs like tat-typing eventually fell out of favor, yet tens of thousands of adults consented to participate even after the flaws in the program were publicized, and a portion allowed their young children to be marked, too. Their motivation? According to Carol Fischler, who spoke with the podcast 99% Invisible about being tattooed as a young girl in Indiana, the paranoia over the Cold War in the 1950s drowned out any thought of the practice being outrageous or harmful. Kids wanted to do their part. Many nervously bit their lip but still lined up with the attitude that the tattoo was part of being a proud American.

Perhaps equally important, children who complained of the tattoo leaving them particularly sore received another benefit: They got the rest of the afternoon off.
