12 Technological Advancements of World War I

Erik Sass has been covering the events leading up to World War I exactly 100 years after they happened. But today he's here to discuss some of the inventions of the Great War.

1. Tanks

In 1914, the “war of movement” expected by most European generals settled down into an unexpected, and seemingly unwinnable, war of trenches. With machine guns reinforcing massed rifle fire from the defending trenches, attackers were mowed down by the thousands before they could even get to the other side of “no-man’s-land.”

A solution presented itself, however, in the form of the automobile, which took the world by storm after 1900. Powered by a small internal combustion engine burning gasoline, a heavily armored vehicle could advance even in the face of overwhelming small arms fire. Add some serious guns and replace the wheels with armored treads to handle rough terrain, and the tank was born.

The first tank, the British Mark I, was designed in 1915 and first saw combat at the Somme in September 1916. The French soon followed suit with the Renault FT, which established the classic tank look (turret on top). Despite their later prowess in tank combat in WWII, the Germans never got around to large-scale tank production in WWI, although they did produce 21 of the unwieldy A7V model.

2. Flamethrowers

Although the Byzantines and Chinese used weapons that hurled flaming material in the medieval period, the first design for a modern flamethrower was submitted to the German Army by Richard Fiedler in 1901, and the devices were tested by the Germans with an experimental detachment in 1911. Their true potential was only realized during trench warfare, however. After a massed assault on enemy lines, it wasn’t uncommon for enemy soldiers to hole up in bunkers and dugouts hollowed into the side of the trenches. Unlike grenades, flamethrowers could “neutralize” (i.e. burn alive) enemy soldiers in these confined spaces without inflicting structural damage (the bunkers might come in handy for the new residents). The flamethrower was first used by German troops near Verdun in February 1915.

3. Poison Gas

Poison gas was used by both sides with devastating results (well, sometimes) during the Great War. The Germans pioneered the large-scale use of chemical weapons with a gas attack on Russian positions on January 31, 1915, during the Battle of Bolimov, but low temperatures kept the poison (xylyl bromide) from vaporizing in the shells. The first successful use of chemical weapons occurred on April 22, 1915, near Ypres, when the Germans sprayed chlorine gas from large cylinders towards trenches held by French colonial troops. The defenders fled, but, typically for the First World War, this didn’t yield a decisive result: the Germans were slow to follow up with infantry attacks, the gas dissipated, and the Allied defenses were restored. Before long, of course, the Allies were using poison gas too, and over the course of the war both sides resorted to increasingly insidious compounds to beat gas masks, another new invention; the overall result was a huge increase in misery for not much change in the strategic situation (a recurring theme of the war).

4. Tracer Bullets

While the Great War involved a lot of futile activity, fighting at night was especially unproductive because there was no way to see where you were shooting. Night combat was made somewhat easier by the British invention of tracer bullets—rounds which emitted small amounts of flammable material that left a phosphorescent trail. The first attempt, in 1915, wasn’t actually that useful, as the trail was “erratic” and limited to 100 meters, but the second tracer model developed in 1916, the .303 SPG Mark VIIG, emitted a regular bright green-white trail and was a real hit (get it?). Its popularity was due in part to an unexpected side-benefit: the flammable agent could ignite hydrogen, which made it perfect for “balloon-busting” the German zeppelins then terrorizing England.

5. Interrupter Gear

Airplanes had been around for just a decade when WWI started, and while they had obvious potential for combat applications as an aerial platform for bombs and machine guns, it wasn’t quite clear how the latter would work, since the propeller blades got in the way. In the first attempt, the U.S. Army basically tied the gun to the plane (pointing towards the ground) with a leather strap, operated by a gunner who sat beside the pilot. This was hardly ideal for aerial combat, and inconvenient besides, since it required two airmen to operate. Another solution was mounting the gun well above the pilot, so the bullets cleared the propeller blades, but this made it hard to aim. After the Swiss engineer Franz Schneider patented his idea for an interrupter gear in 1913, a working version was presented by Dutch designer Anthony Fokker, whose “synchronizer,” centered on a cam attached to the propeller shaft, allowed a machine gun to fire between the blades of a spinning propeller. The Germans adopted Fokker’s invention in May 1915, and the Allies soon produced their own versions. Schneider later sued Fokker for patent infringement.
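To see why the timing matters, consider the numbers: a propeller turning at around 1200 rpm sweeps a two-bladed disc through the gun's line of fire dozens of times per second. Below is a minimal simulation sketch of the gating idea in Python; the rpm, blade width, and trigger rate are illustrative assumptions, not figures from Fokker's gear, and a real synchronizer delayed each shot to the next safe instant rather than simply suppressing it.

```python
def blade_blocks_muzzle(angle_deg, num_blades=2, half_width_deg=12.0):
    """True if a blade is currently sweeping the muzzle's line of fire.

    Blades are spaced evenly around the hub; each blocks fire within
    half_width_deg of its centerline. All geometry here is illustrative.
    """
    spacing = 360.0 / num_blades
    offset = angle_deg % spacing
    return min(offset, spacing - offset) <= half_width_deg

def simulate(rpm=1200.0, trigger_rate_hz=8.3, duration_s=1.0):
    """Count trigger pulls the cam passes (muzzle clear) vs. holds back."""
    deg_per_s = rpm / 60.0 * 360.0      # 1200 rpm -> 7200 degrees per second
    fired = held = 0
    for i in range(int(duration_s * trigger_rate_hz)):
        t = i / trigger_rate_hz         # time of this trigger pull
        angle = (deg_per_s * t) % 360.0
        if blade_blocks_muzzle(angle):
            held += 1                   # cam keeps the sear engaged; no shot
        else:
            fired += 1                  # round passes between the blades
    return fired, held

print(simulate())  # e.g. (7, 1): most pulls fire, a few are held back
```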

6. Air traffic control

In the first days of flight, once a plane left the ground the pilot was pretty much isolated from the terrestrial world, unable to receive any information aside from obvious signals using flags or lamps. This changed thanks to the efforts of the U.S. Army, which installed the first operational two-way radios in planes during the Great War (but prior to U.S. involvement). Development began in 1915 at San Diego, and by 1916 technicians could send a radiotelegraph message over a distance of 140 miles; messages were also exchanged between planes in flight. Finally, in 1917, for the first time a human voice was transmitted by radio from a plane in flight to an operator on the ground.

7. Depth Charges

The German U-boat campaign against Allied shipping sank millions of tons of cargo and killed tens of thousands of sailors and civilians, forcing the Allies to figure out a way to combat the submarine menace. The solution was the depth charge, basically an underwater bomb that could be lobbed from the deck of a ship using a catapult or chute. Depth charges were set to go off at a certain depth by a hydrostatic pistol that measured water pressure, ensuring the depth charge wouldn’t damage surface vessels, including the launch ship. After the idea was sketched out in 1913, the first practical depth charge, the Type D, was produced by the Royal Navy’s Torpedo and Mine School in January 1916. The first German U-boat sunk by depth charge was the U-68, destroyed on March 22, 1916.
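The hydrostatic pistol works on a simple relationship: gauge pressure grows linearly with depth (P = ρgh), so a preset pressure threshold corresponds to a preset detonation depth. Here's a small sketch of that trigger logic; the set depth is an illustrative number, not the Type D's actual setting.

```python
RHO_SEAWATER = 1025.0  # kg/m^3, nominal density of seawater
G = 9.81               # m/s^2

def hydrostatic_pressure_pa(depth_m):
    """Gauge pressure at depth: P = rho * g * h, linear in depth."""
    return RHO_SEAWATER * G * depth_m

def pistol_fires(current_depth_m, set_depth_m=40.0):
    """The pistol trips once ambient pressure reaches the preset threshold."""
    return hydrostatic_pressure_pa(current_depth_m) >= hydrostatic_pressure_pa(set_depth_m)

print(hydrostatic_pressure_pa(40.0))           # ~402,000 Pa, about 4 atmospheres
print(pistol_fires(10.0), pistol_fires(45.0))  # False True
```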

8. Hydrophones

Of course it was a big help if you could actually locate the U-boat using sound waves, which required a microphone that could work underwater, or hydrophone. The first hydrophone was invented in 1914 by Reginald Fessenden, a Canadian inventor who actually started working on the idea as a way to locate icebergs following the Titanic disaster; however, it was of limited use because it couldn’t tell the direction of an underwater object, only the distance. The hydrophone was further improved by the Frenchman Paul Langevin and the Russian Constantin Chilowsky, who invented an ultrasound transducer relying on piezoelectricity, the electric charge certain crystals produce under mechanical stress: a thin layer of quartz held between two metal plates responded to tiny changes in water pressure resulting from sound waves, allowing the user to determine both the distance and direction of an underwater object. The hydrophone claimed its first U-boat victim in April 1916. A later version perfected by the Americans could detect U-boats up to 25 miles away.
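Direction-finding with paired sensors comes down to arithmetic: sound from a distant source reaches one hydrophone slightly before the other, and that delay fixes the bearing via sin θ = c·Δt/d. The toy calculation below illustrates the geometry; the sensor spacing and delay are invented numbers, not parameters of the Langevin-Chilowsky hardware.

```python
import math

SPEED_OF_SOUND_WATER = 1500.0  # m/s, a standard approximation

def bearing_from_delay(delay_s, spacing_m):
    """Bearing of a distant source from the arrival-time difference at two
    hydrophones: sin(theta) = c * dt / d, with theta measured from
    broadside to the two-sensor baseline."""
    ratio = SPEED_OF_SOUND_WATER * delay_s / spacing_m
    if abs(ratio) > 1.0:
        raise ValueError("delay too large for this sensor spacing")
    return math.degrees(math.asin(ratio))

# A 0.5 ms delay across a 2 m baseline puts the source ~22 degrees off broadside.
print(round(bearing_from_delay(0.0005, 2.0), 1))  # 22.0
```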

9. Aircraft Carriers

The first time an airplane was launched from a moving ship was in May 1912, when Commander Charles Rumney Samson piloted a Short S.27 pontoon biplane from a ramp on the deck of the HMS Hibernia in Weymouth Bay. However, the Hibernia wasn’t a true aircraft carrier, since planes couldn’t land on its deck; they had to set down on the water and then be retrieved, slowing the whole process considerably. The first real aircraft carrier was the HMS Furious, which began life as a 786-foot-long battle cruiser equipped with two massive 18-inch guns—until British naval designers figured out that these guns were so large they might shake the ship to pieces. Looking for another use for the vessel, they built a long platform capable of both launching and landing airplanes. To make more room for takeoffs and landings, the airplanes were stored in hangars under the runway, as they still are on modern aircraft carriers. Squadron Commander Edward Dunning became the first person to land a plane on a moving ship when he landed a Sopwith Pup on the Furious on August 2, 1917.

10. Pilotless Drones

The first pilotless drone was developed for the U.S. Navy in 1916 and 1917 by two inventors, Elmer Sperry and Peter Hewitt, who originally designed it as an unmanned aerial bomb—essentially a prototype cruise missile. Measuring just 18.5 feet across, with a 12-horsepower motor, the Hewitt-Sperry Automatic Airplane weighed 175 pounds and was stabilized and directed (“piloted” is too generous) with gyroscopes and a barometer to determine altitude. The first unmanned flight in history occurred on Long Island on March 6, 1918. In the end, the targeting technique—point and fly—was too imprecise for it to be useful against ships during the war. Further development, including attempts to integrate remote radio control, continued for several years after the war, until the Navy lost interest in 1925.
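The stabilization scheme amounts to feedback control: the gyroscopes report attitude, the barometer reports altitude, and each error feeds a correction back to the control surfaces. Here is a deliberately simplified proportional-control sketch of that idea; the gains and readings are invented for illustration and bear no relation to Sperry's actual mechanism.

```python
def proportional_correction(measured, target, gain):
    """One step of proportional feedback: output opposes the error."""
    return gain * (target - measured)

# Invented readings and gains: a gyroscope reports pitch in degrees,
# a barometer reports altitude in meters; each error drives a correction.
pitch_cmd = proportional_correction(measured=3.0, target=0.0, gain=0.5)
altitude_cmd = proportional_correction(measured=280.0, target=300.0, gain=0.05)
print(pitch_cmd, altitude_cmd)  # -1.5 (push nose down), 1.0 (climb)
```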

11. Mobile X-Ray Machines

With millions of soldiers suffering grievous, life-threatening injuries, there was obviously a huge need during the Great War for the new wonder weapon of medical diagnostics, the X-ray—but the machines of the day were too bulky and too delicate to move. Enter Marie Curie, who set to work creating mobile X-ray stations for the French military immediately after the outbreak of war; by October 1914, she had installed X-ray machines in several cars and small trucks which toured smaller surgical stations at the front. By the end of the war there were 18 of these “radiologic cars” or “Little Curies” in operation. African-American inventor Frederick Jones developed an even smaller portable X-ray machine in 1919 (Jones also invented refrigeration units, air conditioning units, and the self-starting gasoline lawnmower).

12. Sanitary Napkins

Women traditionally improvised all kinds of disposable or washable undergarments to deal with their monthly period, all the way back to softened papyrus in ancient Egypt. But the modern sanitary napkin as we know it was made possible by the introduction of new cellulose bandage material during the First World War; it wasn’t long before French nurses figured out that clean, absorbent cellulose bandages were far superior to any predecessors. British and American nurses picked up on the habit, and corporate America wasn’t far behind: In 1920, Kimberly-Clark introduced the first commercial sanitary napkin, Kotex (that’s “cotton” + “texture”). But it was rough going at first, as no publications would carry advertisements for such a product. It wasn’t until 1926 that Montgomery Ward broke the barrier, carrying Kotex napkins in its popular catalogue.

12 Facts About Japanese Internment in the United States

Portrait of internee Tom Kobayashi at Manzanar War Relocation Center, Owens Valley, California, 1943. (Ansel Adams, Library of Congress)

On February 19, 1942, President Franklin Delano Roosevelt issued Executive Order 9066, which sanctioned the removal of Japanese immigrants and Americans of Japanese heritage from their homes to be imprisoned in internment camps throughout the country.

At the time, the move was sold to the public as a strategic military necessity. Following the attack on Pearl Harbor on December 7, 1941, the government argued that it was impossible to know where the loyalties of Japanese-Americans rested.

Between 110,000 and 120,000 people of Japanese ancestry were relocated to internment camps along the West Coast and as far east as Louisiana. Here are 12 facts about what former first lady Laura Bush has described as "one of the most shameful episodes in U.S. history."

1. The government was already discussing detaining people before the Pearl Harbor attack.

In 1936, President Franklin Roosevelt—who was concerned about Japan’s growing military might—instructed William H. Standley, his chief of naval operations, to clandestinely monitor "every Japanese citizen or non-citizen on the island of Oahu who meets these Japanese ships [arriving in Hawaii] or has any connection with their officers or men" and to secretly place their names "on a special list of those who would be the first to be placed in a concentration camp in the event of trouble."

This sentiment helped lead to the creation of the Custodial Detention List, which would later guide the U.S. in detaining 31,899 Japanese, German, and Italian nationals, separate from the 110,000-plus later interned, without charging them with a crime or offering them any access to legal counsel.

2. Initial studies of the “Japanese problem” proved that there wasn’t one.

In the fall of 1941, Curtis Munson, a special representative of the State Department, was tasked with interviewing West Coast-based Japanese-Americans to gauge their loyalty levels in coordination with the FBI and the Office of Naval Intelligence. Munson reported that there was extraordinary patriotism among Japanese immigrants, saying that “90 percent like our way best,” and that they were “extremely good citizen[s]” who were “straining every nerve to show their loyalty.” Lieutenant Commander K.D. Ringle’s follow-up report showed the same findings and argued against internment because only a small percentage of the community posed a threat, and most of those individuals were already in custody.

3. The general in charge of the Western Defense Command took nothing happening after Pearl Harbor as proof that something would happen.

Minidoka Relocation Center: the community store in Block 30. (National Archives at College Park)

Despite both Munson and Ringle debunking the concept of internment as a strategic necessity, the plan moved ahead—spurred largely by Western Defense Command head General John L. DeWitt. One month after Pearl Harbor, DeWitt laid the groundwork for mass incarceration by declaring: “The fact that nothing has happened so far is more or less ... ominous in that I feel that in view of the fact that we have had no sporadic attempts at sabotage that there is a control being exercised and when we have it, it will be on a mass basis.”

DeWitt, whose ancestors were Dutch, didn’t want anyone of Japanese descent on the West Coast, stating that “American citizenship does not necessarily determine loyalty.”

4. Almost no one protested internment.

Alongside General DeWitt, Wartime Civil Control Administration director Colonel Karl Bendetsen avowed that anyone with even “one drop of Japanese blood” should be incarcerated, and the country generally went along with that assessment. Some newspapers ran op-eds opposing the policy, and the American Baptist Home Mission Societies created pamphlets to push back, but as historian Eric Foner wrote in The Story of American Freedom, "One searches the wartime record in vain for public protests among non-Japanese." Senator Robert Taft was the only congressperson to condemn the policy.

5. Both support for and opposition to internment were matters of economics.

White farmers and landowners on the West Coast had great economic incentives to get rid of Japanese farmers who had come to the area only decades before and found success with new irrigation methods. They fomented deep hatred for their Japanese neighbors and publicly advocated for internment, which is one reason so many of the more than 110,000 Japanese individuals sent to camps came from the West Coast. In Hawaii, it was a different story. White business owners opposed internment, but not for noble reasons: They feared losing their workforce. Thus, only between 1200 and 1800 Japanese-Americans from Hawaii were sent to internment camps.

6. People were tagged for identification.

Children in a drawing class at Minidoka Relocation Center. (National Archives at College Park)

Moving entire communities of people to camps in California, Colorado, Texas, and beyond was a gargantuan logistical task. The military assigned tags with ID numbers to families, including the children, to ensure they would be transferred to the correct camp. In 2012, artist Wendy Maruyama recreated thousands of these tags for an art exhibition she titled "The Tag Project."

"The process of replicating these tags using government databases, writing thousands of names, numbers, and camp locations became a meditative process," Maruyama told Voices of San Diego. “And for the hundreds of volunteers, they could, for a minute or two as they wrote the names, contemplate and wonder what this person was thinking as he or she was being moved from the comforts of home to the spare and bare prisons placed in the foreboding deserts and wastelands of America. And could it happen again?”

7. Not everyone went quietly.

Directly combating the image of the “polite” Japanese-Americans who acquiesced to internment without protest, collections of resistance stories paint a disruptive picture of those who refused to go to the camps or made trouble once inside. Among those who were considered “problematic” were individuals who refused to register for the compulsory loyalty questionnaire, which asked questions about whether the person was a registered voter and with which party, as well as marital status and “citizenship of wife” and “race of wife.”

“A broadly understood notion of resistance represents a more complete picture of what happened during World War II,” David Yoo, a professor of Asian American Studies and History and vice provost at UCLA's Institute of American Cultures, told NBC News about collecting these resistance stories. “Because these stories touch upon human rights, they are important for all peoples.”

8. The government converted unused buildings into camp facilities.

For the most part, camps were set against desert scrub land or infertile Ozark hills bordered with barbed wire. Before getting on buses to be transported to their new "homes," detainees had to go through processing centers housed in converted racetracks and fairgrounds, where they might stay for several months. The largest and most noteworthy center was Santa Anita Park, a racetrack in Arcadia, California, which was shut down so that makeshift barracks could be assembled and horse stables could be used for sleeping quarters.

9. Ansel Adams took hundreds of photographs inside the most famous camp, as did an internee with a smuggled camera.

Wooden sign at the entrance to the Manzanar War Relocation Center, with a car at the gatehouse in the background. (Ansel Adams, Library of Congress)

Approximately 200 miles north of Santa Anita Park, at the foot of the Sierra Nevada mountain range, was Manzanar—which, with its 11,000 internees, was perhaps the most famous of America's 10 relocation centers. It was also the most photographed facility. In the fall of 1942, famed photographer Ansel Adams—who was personally outraged by the situation when a family friend was taken from his home and moved halfway across the country—shot more than 200 images of the camp. In a letter to a friend about a book being made of the photos, Adams wrote, “Through the pictures the reader will be introduced to perhaps 20 individuals ... loyal American citizens who are anxious to get back into the stream of life and contribute to our victory.”

While Adams may have successfully offered a small glimpse at life inside Manzanar, Tōyō Miyatake—a photographer and detainee who managed to smuggle a lens and film into the camp, which he later fashioned into a makeshift camera—produced a series of photos that offered a much more intimate depiction of what everyday life was like for the individuals who were imprisoned there between 1942 and 1945. Today, Manzanar is a National Historic Site.

10. Detainees were told they were in camps for their own protection.

Japanese-Hawaiian hula dancers on an improvised stage during one of the frequent talent shows at the Santa Anita (California) Assembly Center. (U.S. Signal Corps, Library of Congress)

Just as the justification for internment was an erroneous belief in mass disloyalty among a single racial group, the argument given to those incarcerated was that they were better off inside the barbed wire compounds than back in their own homes, where racist neighbors could assault them. When presented with that logic, one detainee rebutted, “If we were put there for our protection, why were the guns at the guard towers pointed inward, instead of outward?”

11. Internees experienced long-term health problems because of the camps, and children had it the worst.

Internment officially lasted through 1944, with the last camp closing in early 1946. In those years, Japanese-Americans did their best to make lives for themselves on the inside. That included jobs and governance, as well as concerts, religion, and sports teams. Children went to school, but there were also dances and comic books to keep them occupied. But the effects of their internment were long-lasting.

There have been multiple studies of the physical and psychological health of former internees. They found those placed in camps had a greater risk for cardiovascular disease and death, as well as traumatic stress. Younger internees experienced low self-esteem, as well as psychological trauma that led many to shed their Japanese culture and language. Gwendolyn M. Jensen’s The Experience of Injustice: Health Consequences of the Japanese American Internment found that younger internees “reported more post-traumatic stress symptoms of unexpected and disturbing flashback experiences than those who were older at the time of incarceration.”

12. A congressional panel called it a “grave injustice” ... 40 years later.

Japanese Americans going to Manzanar gather around a baggage car at the old Santa Fe Station, April 1942. (Russell Lee, Library of Congress)

It wasn’t until 1983 that a special Congressional commission determined that the mass internment was a matter of racism and not of military strategy. Calling the incarceration a “grave injustice,” the panel cited the ignored Munson and Ringle reports, the absence of any documented acts of espionage, and delays in shutting down the camps due to weak political leadership from President Roosevelt on down as factors in its conclusion. The commission paved the way for President Reagan to sign the Civil Liberties Act, which gave each surviving internee $20,000 and officially apologized. Approximately two-thirds of the more than 110,000 people detained were U.S. citizens.

This list first ran in 2018.

The Disturbing Reason Schools Tattooed Their Students in the 1950s

When Paul Bailey was born at Beaver County Hospital in Milford, Utah, on May 9, 1955, it took less than two hours for the staff to give him a tattoo. Located on his torso under his left arm, the tiny marking was rendered in indelible ink with a needle gun and indicated Bailey’s blood type: O-positive.

“It is believed to be the youngest baby ever to have his blood type tattooed on his chest,” reported the Beaver County News, coolly referring to the infant as an “it.” A hospital employee was quick to note parental consent had been obtained first.

The permanent tattooing of a child who was only hours old was not met with any hysteria. Just the opposite: In parts of Utah and Indiana, local health officials had long been hard at work instituting a program that would facilitate potentially life-saving blood transfusions in the event of a nuclear attack. By branding children and adults alike with their blood type, officials ensured that donors could be immediately identified and used as “walking blood banks” for the critically injured.

Taken out of context, it seems unimaginable. But in the 1950s, when the Cold War was at its apex and atomic warfare appeared not only possible but likely, children willingly lined up at schools to perform their civic duty. They raised their arm, gritted their teeth, and held still while the tattoo needle began piercing their flesh.


The practice of subjecting children to tattoos for blood-typing has appropriately morbid roots. Testifying at the Nuremberg Tribunal on War Crimes in the 1940s, American Medical Association physician Andrew Ivy observed that members of the Nazi Waffen-SS carried body markings indicating their blood type. When he returned to his hometown of Chicago, Ivy carried with him a solution for quickly identifying blood donors—a growing concern due to the outbreak of the Korean War in 1950. The conflict was depleting blood banks of inventory, and it was clear that reserves would be necessary.

School children sit next to one another, circa the 1950s. (Reg Speller, Fox Photos/Getty Images)

If the Soviet Union targeted areas of the United States for destruction, it would be vital to have a protocol for blood transfusions to treat radiation poisoning. Matches would need to be found quickly. (Transfusions depend on matching blood to avoid the adverse reactions that come from mixing incompatible types. When a person receives incompatible blood, the immune system produces antibodies that attack and destroy the donated red blood cells.)
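Those matching rules are mechanical enough to write down: a recipient's plasma attacks any A, B, or Rh antigen their own red cells lack, so a donor is safe only if the donor's antigens are a subset of the recipient's. A small sketch of that check, simplified to ignore the minor antigen systems that also matter in practice:

```python
def antigens(blood_type):
    """Antigens on red cells of a given type, e.g. 'AB+' -> {'A', 'B', 'Rh'}.

    'O' denotes the absence of A and B antigens; '+' means the Rh
    protein is present. Minor antigen systems are ignored here.
    """
    present = set(blood_type.rstrip('+-')) - {'O'}
    if blood_type.endswith('+'):
        present.add('Rh')
    return present

def red_cell_compatible(donor, recipient):
    """Donor red cells are safe only if every donor antigen is one
    the recipient's own cells already carry."""
    return antigens(donor) <= antigens(recipient)

assert red_cell_compatible('O-', 'AB+')      # O- is the universal donor
assert not red_cell_compatible('A+', 'O+')   # recipient lacks the A antigen
assert not red_cell_compatible('O+', 'O-')   # recipient lacks the Rh protein
```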

In 1950, the Department of Defense placed the American Red Cross in charge of blood donor banks for the armed forces. In 1952, the Red Cross was the coordinating agency for obtaining blood from civilians for the National Blood Program, which was meant to replenish donor supply during wartime. Those were both measures for soldiers. Meanwhile, local medical societies were left to determine how best to prepare their civilian communities for a nuclear event and its aftermath.

As part of the Chicago Medical Civil Defense Committee, Ivy promoted the use of the tattoos, declaring them as painless as a vaccination. Residents would get blood-typed by having their finger pricked and a tiny droplet smeared on a card. From there, they would be tattooed with their ABO blood group and Rhesus factor (or Rh factor), which denotes whether a person’s red blood cells carry a particular surface protein.

The Chicago Medical Society and the Board of Health endorsed the program and citizens voiced a measure of support for it. One letter to the editor of The Plainfield Courier-News in New Jersey speculated it might even be a good idea to tattoo Social Security numbers on people's bodies to make identification easier.

Despite such marked enthusiasm, the project never entered into a pilot testing stage in Chicago.

Officials with the Lake County Medical Society in nearby Lake County, Indiana were more receptive to the idea. In the spring of 1951, 5000 residents were blood-typed using the card method. But, officials cautioned, the cards could be lost in the chaos of war or even the relative quiet of everyday life. Tattoos and dog tags were encouraged instead. When 1000 people lined up for blood-typing at a county fair, two-thirds agreed to be tattooed as part of what the county had dubbed "Operation Tat-Type." By December 1951, 15,000 Lake County residents had been blood-typed. Roughly 60 percent opted for a permanent marking.

The program was so well-received that the Lake County Medical Society quickly moved toward making children into mobile blood bags. In January 1952, five elementary schools in Hobart, Indiana enrolled in the pilot testing stage. Children were sent home with permission slips explaining the effort. If parents consented, students would line up on appointed tattoo days to get their blood typed with a finger prick. From there, they’d file into a room—often the school library—set up with makeshift curtains behind which they could hear a curious buzzing noise.

When a child stepped inside, they were greeted by a school administrator armed with indelible ink and wielding a Burgess Vibrotool, a medical tattoo gun featuring 30 to 50 needles. The child would raise their left arm to expose their torso (since arms and legs might be blown off in an attack) and was told the process would only take seconds.

A child raises his hand in class, circa the 1950s. (Vecchio/Three Lions/Getty Images)

Some children were stoic. Some cried before, during, or after. One 11-year-old recounting her experience with the program said a classmate emerged from the session and promptly fainted. All were left with a tattoo less than an inch in diameter on their left side, intentionally pale so it would be as unobtrusive as possible.

At the same time that grade schoolers—and subsequently high school students—were being imprinted in Indiana, kids in Cache and Rich counties in Utah were also submitting to the program, despite potential religious obstacles for the region’s substantial Mormon population. In fact, Bruce McConkie, a representative of the Church of Jesus Christ of Latter-day Saints, declared that blood-type tattoos were exempt from the typical prohibitions on Mormons defacing their bodies, giving the program a boost among the devout. The experiment would not last much longer, though.


By 1955, 60,000 adults and children had gotten tattooed with their blood types in Lake County. In Milford, health officials persisted in promoting the program widely, offering the tattoos for free during routine vaccination appointments. But despite the cooperation exhibited by communities in Indiana and Utah, the programs never spread beyond their borders.

The Korean conflict had come to an end in 1953, reducing the strain put on blood supplies and, along with it, the need for citizens to double as walking blood banks. More importantly, outside of the program’s avid boosters, most physicians were reluctant to rely solely on a tattoo for blood-typing. They preferred to do their own testing to make certain a donor was a match with a patient.

There were other logistical challenges that made the program less than useful. In a post-nuclear landscape, bodies might be charred, burning off tattoos and rendering the entire operation largely pointless. With the Soviet Union’s growing nuclear arsenal—1600 warheads were ready to take to the skies by 1960—the idea of civil defense became outmoded. Ducking and covering under desks, which might have shielded some from the immediate effects of a nuclear blast, would be meaningless in the face of such mass destruction.

Programs like tat-typing eventually fell out of favor, yet tens of thousands of adults consented to participate even after the flaws in the program were publicized, and a portion allowed their young children to be marked, too. Their motivation? According to Carol Fischler, who spoke with the podcast 99% Invisible about being tattooed as a young girl in Indiana, the paranoia over the Cold War in the 1950s drowned out any thought of the practice being outrageous or harmful. Kids wanted to do their part. Many nervously bit their lip but still lined up with the attitude that the tattoo was part of being a proud American.

Perhaps equally important, children who complained of the tattoo leaving them particularly sore received another benefit: They got the rest of the afternoon off.
