iStock Collage

40 Words Turning 40 in 2017


If you're turning 40 this year, you have something in common with Sarah Michelle Gellar, Saturday Night Fever, and the Chia Pet. You also got to grow up with these words, dated by first citation to 1977 in the Oxford English Dictionary.


By 1977, girdles were on the way out—but we got shapewear to take their place.


There was an older, 19th century sense of nip and tuck that referred to a close “neck and neck” competition, but by 1977, the phrase was claimed for minor cosmetic surgery.


The first citation for party animal is from Bill Murray in an episode of Saturday Night Live.


Another Saturday Night Live contribution, again from Bill Murray, this time complaining to the Coneheads that they put brewskis in the kids' trick-or-treat bags.


In 1977, the Escanaba Daily Press had a contest to come up with a name for residents of the Upper Peninsula of Michigan, also known as the U.P. The finalists included U.P.ite, which didn’t stick, and Yooper, which did.


Once we had microwaves, we needed a term to describe the type of packaging that was suitable to put into the microwave. At the same time we got microwaveable, we also got ovenable, for packaging that could, by contrast, go into a more traditional oven—but that word didn’t last as long.


“Work-life balance” became an ideal to shoot for in the '70s, and as a result we got this adjective.


There was a heyday 40 years ago for generics, or non-branded products, at the supermarket. As Time pointed out at the time, “No name groceries have become hot items.”


We had microcomputer in the '50s. In the '70s, we started looking toward the even smaller nanocomputer.


We did see the word Murdochian as early as 1963, but then it referred to the philosophy of the writer Iris Murdoch. In 1977, it was first applied to the sensationalist tabloid style of publisher Rupert Murdoch.


Since 1965, the French had the word phallocratie for a male-dominated society (etymologically, “government run by penises”). In 1977, we made an English version.


In 1965, microchip pioneer Gordon E. Moore expressed the idea that the number of components that could fit on a chip would double every year. In 1977, the idea was called Moore’s Law and eventually came to stand for the idea that computers will keep getting better and faster while they also get smaller.


We’ve been talking about the A-list, the most popular, exclusive, and sought-after folks, since the 1930s, but 40 years ago, an article about the band Kiss first applied the term A-listers to the members of this list: “it is snubbed by A-listers, since it panders to 14-year-olds.”


The @ symbol itself has been around for hundreds of years, but we only have evidence for it being called the at sign since 1977. Before that, it was sometimes called the commercial at.


In Korean, this dish of mixed rice and vegetables is pronounced more like pibimbap, but 40 years ago, when American culture started getting to know it, it came into English as bibimbap.


The first citation for Britpop, in a 1977 issue of New Musical Express, refers to a band you might not expect: “At home The Sex Pistols are public enemies. In Sweden, they're an important visiting Britpop group.”


Punk had barely gotten started in 1977, but already there was a cited mention of a “post-punk disco” where a "new wave" band was to play.


A couple of years later, this term for "acceptability among, or popularity with, ordinary people, especially fashionable young urban people" was shortened to street cred. Which definitely has more street cred.

19. ‘BURB

Suburb is a very old word, going all the way back to the Middle Ages. Even suburbia goes back to the 19th century. But the 'burbs is now a young 40 years old.


Cats have been fighting for a long time, but the verb to catfight or “fight in a vicious, cat-like manner, esp. by scratching, pulling hair and biting” dates to 1977.


If what is praiseworthy is worthy of praise, then it makes sense that what is worthy of cringing at should be cringeworthy.


The pronunciation nekkid had long been a regional variant of naked, but 40 years ago it became its own word with a slightly different meaning: a purposely humorous, eyebrow wagging, sexually suggestive idea of nakedness.


The term fast track originally comes from horse racing. By 1977, it had become a verb for doing things on an accelerated schedule.

24. FRO-YO

Calling frozen yogurt fro-yo made it sound a little more fun, but still didn’t make it ice cream.


The noun guilt trip goes back to 1972, but by 1977 we had cut back the lengthy “lay a guilt trip on” to the simple verb, to guilt-trip.


In the 1940s and '50s, people started talking about the concept of "incentive pay" or bonuses to encourage workers to be more productive. By 1968, we had the verb incentivize, and 1977 brought us incentivization.


Karaoke (from a Japanese compound meaning “empty orchestra”) started in Japan in the 1970s. Though it didn’t really hit big in the English-speaking world until the '90s, we had already borrowed the word for it by 1977.


Plus-one, for a guest brought to a party by someone else who was invited, got its start with the backstage music scene.


If a cannon is not tied down on a storm-tossed ship, it’s liable to do a lot of damage. People had long used this image as a metaphor for dangerously unpredictable behavior, but loose cannon became a set phrase for that metaphor 40 years ago.


We got this word just in time for the dawn of mall culture.


The idea of getting customers to buy something more expensive than they intended was already old 40 years ago, but this abstract noun for the idea was new.


Pinkos, weirdos, and winos had already been around for a while by the time we came up with sicko.


The patent for the Steadicam, a body-mounted camera stabilizing system, was granted to its inventor, cameraman Garrett W. Brown, in 1977.


A 1977 article in the Washington Post referred to “step-parenting” problems.


Strappy is 40 in the sartorial sense of strappy sandals and strappy sundresses.


Supersize as an adjective goes back to 1876, but the verb, to supersize something, shows up in 1977. It was popularized in the fast food sense after 1994.


This phrase was introduced with the publication of “Standard for the Format of ARPA Network Text Messages,” an early network mail specification from the ARPA Network Working Group.


This proprietary name for an insulating synthetic fabric has been with us for 40 years.


In the '70s, audio recording had become easy and portable enough to be relied upon in many fields, creating demand for a new type of job: transcribing from audio. The first citation for transcriptionist is from a job ad for a medical transcriptionist.


The OED definition for this word is delightfully thorough: “An act of pulling the cloth of a person's underwear, trousers, etc., tightly between the buttocks, esp. as a practical joke; any positioning of a person's underwear, pants, etc., resembling the result of such a pulling.”

Live Smarter
Nervous About Asking for a Job Referral? LinkedIn Can Now Do It for You

For most people, asking for a job referral can be daunting. What if the person being approached shoots you down? What if you ask the "wrong" way? LinkedIn, which has been aggressively establishing itself as a catch-all hub for employment opportunities, has a solution, as Mashable reports.

The company recently launched "Ask for a Referral," an option that will appear to those browsing job listings. When you click on a job listed by a business that also employs one of your LinkedIn first-degree connections, you'll have the opportunity to solicit a referral from that individual.

The default message that LinkedIn creates is somewhat generic, but it hits the main topics—namely, prompting you to explain how you and your connection know one another and why you'd be a good fit for the position. If you're the one being asked for a referral, the site will direct you to the job posting and offer three prompts for a response, ranging from "Sure…" to "Sorry…".

LinkedIn says the referral option may not be available for all posts or all users, as the feature is still being rolled out. If you do see the option, it will likely pay to take advantage of it: LinkedIn reports that recruiters who receive both a referral and a job application from a prospective hire are four times more likely to contact that individual.

[h/t Mashable]

Dean Mouhtaropoulos/Getty Images
Essential Science
What Is a Scientific Theory?

In casual conversation, people often use the word theory to mean "hunch" or "guess": If you see the same man riding the northbound bus every morning, you might theorize that he has a job in the north end of the city; if you forget to put the bread in the breadbox and discover chunks have been taken out of it the next morning, you might theorize that you have mice in your kitchen.

In science, a theory is a stronger assertion. Typically, it's a claim about the relationship between various facts; a way of providing a concise explanation for what's been observed. The American Museum of Natural History puts it this way: "A theory is a well-substantiated explanation of an aspect of the natural world that can incorporate laws, hypotheses and facts."

For example, Newton's theory of gravity—also known as his law of universal gravitation—says that every object, anywhere in the universe, responds to the force of gravity in the same way. Observational data from the Moon's motion around the Earth, the motion of Jupiter's moons around Jupiter, and the downward fall of a dropped hammer are all consistent with Newton's theory. So Newton's theory provides a concise way of summarizing what we know about the motion of these objects—indeed, of any object responding to the force of gravity.

A scientific theory "organizes experience," James Robert Brown, a philosopher of science at the University of Toronto, tells Mental Floss. "It puts it into some kind of systematic form."


A theory's ability to account for already known facts lays a solid foundation for its acceptance. Let's take a closer look at Newton's theory of gravity as an example.

In the late 17th century, the planets were known to move in elliptical orbits around the Sun, but no one had a clear idea of why the orbits had to be shaped like ellipses. Similarly, the movement of falling objects had been well understood since the work of Galileo a half-century earlier; the Italian scientist had worked out a mathematical formula that describes how the speed of a falling object increases over time. Newton's great breakthrough was to tie all of this together. According to legend, his moment of insight came as he gazed upon a falling apple in his native Lincolnshire.

In Newton's theory, every object is attracted to every other object with a force that’s proportional to the masses of the objects, but inversely proportional to the square of the distance between them. This is known as an “inverse square” law. For example, if the distance between the Sun and the Earth were doubled, the gravitational attraction between the Earth and the Sun would be cut to one-quarter of its current strength. Newton, using his theories and a bit of calculus, was able to show that the gravitational force between the Sun and the planets as they move through space meant that orbits had to be elliptical.
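The arithmetic in that doubling example can be checked directly. Here's a minimal Python sketch of Newton's law of universal gravitation, using approximate Sun–Earth figures (the masses and distance are rounded textbook values, not part of the original article):

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2
G = 6.674e-11  # gravitational constant, in m^3 kg^-1 s^-2

def gravitational_force(m1, m2, r):
    """Attractive force in newtons between masses m1 and m2 (kg) a distance r (m) apart."""
    return G * m1 * m2 / r**2

# Approximate values for the Sun and the Earth
m_sun = 1.989e30     # kg
m_earth = 5.972e24   # kg
au = 1.496e11        # mean Sun-Earth distance, in meters

f_now = gravitational_force(m_sun, m_earth, au)
f_doubled = gravitational_force(m_sun, m_earth, 2 * au)

# Doubling the distance cuts the force to one-quarter of its current strength
print(f_doubled / f_now)  # prints 0.25
```

The inverse-square relationship is what does the work here: multiplying the distance by any factor k divides the force by k squared.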

Newton's theory is powerful because it explains so much: the falling apple, the motion of the Moon around the Earth, and the motion of all of the planets—and even comets—around the Sun. All of it now made sense.


A theory gains even more support if it predicts new, observable phenomena. The English astronomer Edmond Halley used Newton's theory of gravity to calculate the orbit of the comet that now bears his name. In 1705, taking into account the gravitational pull of the Sun, Jupiter, and Saturn, he predicted that the comet, which had last been seen in 1682, would return in 1758. Sure enough, it did, reappearing in December of that year. (Unfortunately, Halley didn't live to see it; he died in 1742.) The predicted return of Halley's Comet, Brown says, was "a spectacular triumph" of Newton's theory.

In the early 20th century, Newton's theory of gravity would itself be superseded—as physicists put it—by Einstein's, known as general relativity. (Where Newton envisioned gravity as a force acting between objects, Einstein described gravity as the result of a curving or warping of space itself.) General relativity was able to explain certain phenomena that Newton's theory couldn't account for, such as an anomaly in the orbit of Mercury, which slowly rotates—the technical term for this is "precession"—so that while each loop the planet takes around the Sun is an ellipse, over the years Mercury traces out a spiral path similar to one you may have made as a kid on a Spirograph.

Significantly, Einstein’s theory also made predictions that differed from Newton's. One was the idea that gravity can bend starlight, which was spectacularly confirmed during a solar eclipse in 1919 (and made Einstein an overnight celebrity). Nearly 100 years later, in 2016, the discovery of gravitational waves confirmed yet another prediction. In the century between, at least eight predictions of Einstein's theory have been confirmed.


And yet physicists believe that Einstein's theory will one day give way to a new, more complete theory. It already seems to conflict with quantum mechanics, the theory that provides our best description of the subatomic world. The way the two theories describe the world is very different. General relativity describes the universe as containing particles with definite positions and speeds, moving about in response to gravitational fields that permeate all of space. Quantum mechanics, in contrast, yields only the probability that each particle will be found in some particular location at some particular time.

What would a "unified theory of physics"—one that combines quantum mechanics and Einstein's theory of gravity—look like? Presumably it would combine the explanatory power of both theories, allowing scientists to make sense of both the very large and the very small in the universe.


Let's shift from physics to biology for a moment. It is precisely because of its vast explanatory power that biologists hold Darwin's theory of evolution—which allows scientists to make sense of data from genetics, physiology, biochemistry, paleontology, biogeography, and many other fields—in such high esteem. As the biologist Theodosius Dobzhansky put it in an influential essay in 1973, "Nothing in biology makes sense except in the light of evolution."

Interestingly, the word evolution can be used to refer to both a theory and a fact—something Darwin himself realized. "Darwin, when he was talking about evolution, distinguished between the fact of evolution and the theory of evolution," Brown says. "The fact of evolution was that species had, in fact, evolved [i.e. changed over time]—and he had all sorts of evidence for this. The theory of evolution is an attempt to explain this evolutionary process." The explanation that Darwin eventually came up with was the idea of natural selection—roughly, the idea that an organism's offspring will vary, and that those offspring with more favorable traits will be more likely to survive, thus passing those traits on to the next generation.


Many theories are rock-solid: Scientists have just as much confidence in the theories of relativity, quantum mechanics, evolution, plate tectonics, and thermodynamics as they do in the statement that the Earth revolves around the Sun.

Other theories, closer to the cutting-edge of current research, are more tentative, like string theory (the idea that everything in the universe is made up of tiny, vibrating strings or loops of pure energy) or the various multiverse theories (the idea that our entire universe is just one of many). String theory and multiverse theories remain controversial because of the lack of direct experimental evidence for them, and some critics claim that multiverse theories aren't even testable in principle. They argue that there's no conceivable experiment that one could perform that would reveal the existence of these other universes.

Sometimes more than one theory is put forward to explain observations of natural phenomena; these theories might be said to "compete," with scientists judging which one provides the best explanation for the observations.

"That's how it should ideally work," Brown says. "You put forward your theory, I put forward my theory; we accumulate a lot of evidence. Eventually, one of our theories might prove to obviously be better than the other, over some period of time. At that point, the losing theory sort of falls away. And the winning theory will probably fight battles in the future."

