If Our Brains Are So Active During Infancy, Why Don’t We Remember Anything From That Time?


If our brains are so active and developing during infancy, why don’t we remember anything from that time?

Fabian van den Berg:

Ah, infantile amnesia, as it’s better known. Weird, isn’t it? It’s a pretty universal phenomenon: people tend to have no memories from before the age of four-ish and very few memories from the ages of five to seven. What you say in the question is true, our brains are indeed developing very actively during that time, but they keep developing well after age five as well.

The specifics aren’t known just yet. It’s tricky because memory itself is very complicated and there are swaths of unknowns that make it difficult to say for certain why we forget these early memories. This will be mostly about consensus and what can be supported with experiments.

(Image based on data from Rubin & Schulkind, 1997 [1] )

I’ll skip the whole introduction-to-memory bit and say that we’re focusing on episodic/autobiographical memories only: events that happened to us in a certain place at a certain time. And we have two forgetting phases, an early one up to about four years old, and a later one from about five to seven years old, from which we have very few memories.

The first notion to go is that this is “just normal forgetting,” where it’s simply difficult to remember something from that long ago. This has been tested, and it was found that forgetting happens quite predictably, and that the early years yield fewer memories than they should if this were just regular old forgetting.

This leaves us with infantile amnesia, where there are probably two large camps of explanations: One says that children simply lack the ability to remember and that we don’t have these memories because the ability to make them doesn’t develop until later. This is the late emergence of autobiographical memory category.

The second big camp is the disappearance of early memory category, which says that the memories are still there, but cannot be accessed. This is also where the language aspect plays a part, where language changes the way memories are encoded, making the more visual memories incompatible with the adult system.

Both of them are sort of right and sort of wrong; the reality likely lies somewhere in between. Children do have memories, we know they do, so it’s not like they cannot form new memories. It’s also not likely that the memories are still there, just inaccessible.

Children do remember differently. When adults recall, there is a who, what, where, when, why, and how. Kids can remember all of these too, but not as well as adults can. Some memories might only contain a who and a when (M1), some might have a how, a where, and a when (M3), but very few, if any, memories have all the elements. These elements are also not as tightly connected and elaborated.

Kids need to learn this; they need to learn what is important and how to build a narrative. Try talking to a child about their day: it will be very scripted and filled with meaningless details. They tell you about waking up, eating breakfast, going to school, coming home from school, etc. Almost instinctively, an adult will start guiding the story, asking things like, "Who was there?" or "What did we do?"

It also helps quite a bit to be aware of your own self, something that doesn’t develop until about 18 months (give or take a few). Making an autobiographical memory is a bit easier if you can center it around yourself.

(Image from Bauer (2015) based on the Complementary Process Account [2] )

This method of forming memories makes for weak memories: random spots of memory that are barely linked and somewhat incomplete (lacking all the elements). Language acquisition can’t account for all of that. Ever met a three-year-old? They can talk your ears off, so they definitely have language. Children make weak memories, but that alone doesn’t explain why those memories disappear. I’ll get there.

The brain is still growing, very plastic, and things are going on that would amaze you. Large structures in the brain are still specializing and changing, and the memory systems are part of that change. There’s a lot of biology involved, and I’ll spare you all the science-y-sounding brain structures. The best way to see a memory is as a skeleton of elements, stored in a sort of web.

When you remember something, one of the elements is activated (by seeing something, smelling something, or any other kind of stimulus), and that activation travels through the web, activating all the other elements. Once they are all activated, the memory can be rebuilt, the blanks are filled in, and we "remember."

This is all well and good in adults, but as you can imagine this requires an intact web. The weak childhood memories barely hung together as they were, and time is not generous to them. Biological changes can break the weak memories apart, leaving only small isolated elements that can no longer form a memory. New neurons are formed in the hippocampus, squeezing in between existing memories, breaking the pattern. New strategies, new knowledge, new skills—they all interfere with what and how we remember things. And all of that is happening very fast in the first years of our lives.
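If you like seeing the idea in code, here is a rough sketch of that "web of elements" view, with activation spreading from a cue across whatever links are still intact. The element names, the toy memories, and the link-breaking step are all made up for illustration; they are not taken from any real model of memory.

```python
# A minimal sketch of "memory as a web of elements" (spreading activation).
# Element names and the example memories are illustrative only.
from collections import deque

def recall(web, cue):
    """Activate `cue` and let activation spread along intact links."""
    if cue not in web:
        return set()
    activated, frontier = {cue}, deque([cue])
    while frontier:
        element = frontier.popleft()
        for linked in web.get(element, ()):
            if linked not in activated:
                activated.add(linked)
                frontier.append(linked)
    return activated

# An adult-style memory: who/what/where/when/why/how, densely linked.
adult_memory = {
    "who":   {"what", "where", "when"},
    "what":  {"who", "why", "how"},
    "where": {"who", "when"},
    "when":  {"who", "where"},
    "why":   {"what"},
    "how":   {"what"},
}

# A childhood-style memory: fewer elements, weaker (sparser) links.
child_memory = {
    "who":   {"when"},
    "when":  {"who", "where"},
    "where": {"when"},
}

print(recall(adult_memory, "where"))   # all six elements -> memory rebuilt
print(recall(child_memory, "where"))   # only who/when/where -> partial memory

# Simulate later brain change: one weak link breaks...
child_memory["when"].discard("where")
child_memory["where"].discard("when")
print(recall(child_memory, "where"))   # just {'where'} -> an isolated fragment
```

The point of the sketch is simply that a densely linked memory can be rebuilt from a cue to any of its elements, while a sparsely linked one falls apart as soon as a single connection is lost.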

We forget because inefficient cognitive systems create inefficient memories, which are then stored by inefficient structures. Early memories are weak, but strong enough to survive for some time, which is why children can still remember. Ask a four-year-old about something important that happened last year and chances are they will have a memory of it. Eventually, though, the memories decay, much faster than with normal forgetting, resulting in infantile amnesia once the brain matures.

It’s not that children cannot make memories, and it’s not that the memories are inaccessible. It’s a little bit of both, where the brain grows and changes the way it stores and retrieves memories, and where old memories decay faster due to biological changes.

All that plasticity, all that development, is part of why you forget. Which makes you wonder what might happen if we reactivated neurogenesis and allowed the brain to be that plastic in adults, huh? It might heal brain damage, with permanent amnesia as a side effect ... who knows!


[1] Rubin, D. C., & Schulkind, M. D. (1997). Distribution of important and word-cued autobiographical memories in 20-, 35-, and 70-year-old adults. Psychology and Aging.

[2] Bauer, P. J. (2015). A complementary processes account of the development of childhood amnesia and a personal past. Psychological Review, 122(2), 204.

This post originally appeared on Quora.

Big Questions
Why Do Onions Make You Cry?

The onion has been traced back as far as the Bronze Age and was worshipped by the Ancient Egyptians (and eaten by the Israelites during their bondage in Egypt). Onions were rubbed over the muscles of Roman gladiators, used to pay rent in the Middle Ages, and eventually brought to the Americas, where today we fry, caramelize, pickle, grill, and generally enjoy them.

Many of us burst into tears when we cut into one, too. It's the price we pay for onion-y goodness. Here's a play-by-play breakdown of how we go from grabbing a knife to crying like a baby:

1. When you cut into an onion, its ruptured cells release all sorts of goodies, like alliinase enzymes and amino acid sulfoxides. The former break the latter down into sulfenic acids.

2. The sulfenic acids, unstable bunch that they are, spontaneously rearrange into thiosulfinates, which produce a pungent odor and at one time got the blame for our tears. The acids are also converted by the LF-synthase enzyme into a gas called syn-propanethial-S-oxide, also known as the lachrymatory factor (or the crying factor).

3. Syn-propanethial-S-oxide moves through the air and reaches our eyes. The first part of the eye it meets, the cornea, is densely packed with nerve fibers. When syn-propanethial-S-oxide is detected, the fibers in the cornea start firing and signal the lachrymal glands to wash the irritant away.

4. Our eyes automatically start blinking and producing tears, which flushes the irritant away. Of course, our reaction to burning eyes is often to rub them, which only makes things worse since our hands also have some syn-propanethial-S-oxide on them.

It only takes about 30 seconds to start crying after you make the first cut; that's the time needed for syn-propanethial-S-oxide formation to peak.


The onion's relatives, like green onions, shallots, leeks and garlic, also produce sulfenic acids when cut, but they generally have fewer (or no) LF-synthase enzymes and don't produce syn-propanethial-S-oxide.


Since I usually go through a good deal of onions while cooking at home, I've been road testing some of the different methods the internet suggests for reducing or avoiding the effects of the lachrymatory factor. Here's what I tried:

Method #1: Chill or slightly freeze the onions before cutting, the idea being that this will change the chemical reactions and reduce the gas that is released.
Result: The onion from the fridge has me crying just as quickly as a room-temperature one. The one that was in the freezer for 30 minutes leaves me dry-eyed for a bit, but by the time I'm done dicing, my eyes start to burn a little.

Method #2: Cut fast! Get the chopping over with before the gas reaches your eyes.
Result: Just hacking away at the onion, I get it into the frying pan without so much as a sting in my eyes. The onion looks awful, though. Doing a proper dice, I take a little too long and start tearing up. If you don't mind a mangled onion, this is the way to go.

Method #3: Put a slice of bread in your mouth, and cut the onion with most of the bread sticking out to "catch" the fumes.
Result: It seems the loaf of bread I have has gone stale. I stop the experiment and put bread on my shopping list.

Method #4: Chew gum while chopping. It keeps you breathing through your mouth, which keeps the fumes away from your eyes.
Result: This seems to work pretty well as long as you hold your head in the right position. Leaning toward the cutting board or looking right down at the onion puts your eyes right in the line of fire again.

Method #5: Cut the onions under running water. This prevents the gas from traveling up into the eyes.
Result: An onion in the sink is a hard onion to cut. I think Confucius said that. My leaky Brita filter is spraying me in the face and I'm terrified I'm going to cut myself, but I'm certainly not crying.

Method #6: Wear goggles.
Result: In an effort to maintain my dignity, I try my eyeglasses and sunglasses first. Neither do me any good. The ol' chemistry lab safety glasses make me look silly, but help a little more. I imagine swim goggles would really do the trick, but I don't have any.

Method #7: Change your onion. "Tear free" onions have been developed in the UK via special breeding and in New Zealand via "gene silencing" techniques.
Result: My nearest grocery store, Whole Foods, doesn't sell genetically modified produce or onions from England. Tonight, we eat leeks!


Big Questions
What Is Mercury in Retrograde, and Why Do We Blame Things On It?

Crashed computers, missed flights, tensions in your workplace—a person who subscribes to astrology would tell you to expect all this chaos and more when Mercury starts retrograding for the first time this year on Friday, March 23. But according to an astronomer, this common celestial phenomenon is no reason to stay cooped up at home for weeks at a time.

"We don't know of any physical mechanism that would cause things like power outages or personality changes in people," Dr. Mark Hammergren, an astronomer at Chicago's Adler Planetarium, tells Mental Floss. So if Mercury doesn’t throw business dealings and relationships out of whack when it appears to change direction in the sky, why are so many people convinced that it does?


Mercury retrograde—as it's technically called—was being written about in astrology circles as far back as the mid-18th century. The event was noted in British agricultural almanacs of the time, which farmers would read to sync their planting schedules to the patterns of the stars. During the spiritualism craze of the Victorian era, interest in astrology boomed, with many believing that the stars affected the Earth in a variety of (often inconvenient) ways. Late 19th-century publications like The Astrologer’s Magazine and The Science of the Stars connected Mercury retrograde with heavy rainfall. Characterizations of the happening as an "ill omen" also appeared in a handful of articles during that period, but its association with outright disaster wasn’t as prevalent then as it is today.

While other spiritualist hobbies like séances and crystal gazing gradually faded, astrology grew even more popular. By the 1970s, horoscopes were a newspaper mainstay and Mercury retrograde was a recurring player. Because the Roman god Mercury was said to govern travel, commerce, financial wealth, and communication, in astrological circles, Mercury the planet became linked to those matters as well.

"Don’t start anything when Mercury is retrograde," an April 1979 issue of The Baltimore Sun instructed its readers. "A large communications organization notes that magnetic storms, disrupting messages, are prolonged when Mercury appears to be going backwards. Mercury, of course, is the planet associated with communication." The power attributed to the event has become so overblown that today it's blamed for everything from digestive problems to broken washing machines.


Though hysteria around Mercury retrograde is stronger than ever, there's still zero evidence that it's something we should worry about. Even the flimsiest explanations, like the idea that the gravitational pull from Mercury influences the water in our bodies in the same way that the moon controls the tides, are easily deflated by science. "A car 20 feet away from you will exert a stronger pull of gravity than the planet Mercury does," Dr. Hammergren says.

To understand how little Mercury retrograde impacts life on Earth, it helps to learn the physical process behind the phenomenon. When the planet nearest to the Sun is retrograde, it appears to move "backwards" (east to west rather than west to east) across the sky. This apparent reversal in Mercury's orbit is actually just an illusion to the people viewing it from Earth. Picture Mercury and Earth circling the Sun like cars on a racetrack. A year on Mercury is shorter than a year on Earth (88 Earth days compared to 365), which means Mercury experiences roughly four years in the time it takes us to finish one solar loop.

When the planets are next to one another on the same side of the Sun, Mercury looks like it's moving east to those of us on Earth. But when Mercury overtakes Earth and continues its orbit, its straight trajectory seems to change course. According to Dr. Hammergren, it's just a trick of perspective. "Same thing if you were passing a car on a highway, maybe going a little bit faster than they are," he says. "They're not really going backwards, they just appear to be going backwards relative to your motion."
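For the geometrically inclined, here is a small sketch of that racetrack picture, assuming perfectly circular, coplanar orbits (the real orbits are elliptical and tilted, so the numbers are only approximate). It tracks the direction Mercury appears to move against the stars as seen from Earth and flags the days when that direction flips from eastward to westward.

```python
# A rough sketch of why Mercury appears to reverse direction, assuming
# circular, coplanar orbits. Dates and distances are approximations.
import math

MERCURY_YEAR = 88.0    # Earth days per Mercury orbit
EARTH_YEAR   = 365.25  # Earth days per Earth orbit
R_MERCURY    = 0.39    # Mercury's orbital radius in astronomical units
R_EARTH      = 1.0

def apparent_longitude(day):
    """Angle of Mercury as seen from Earth, measured against the stars."""
    a_m = 2 * math.pi * day / MERCURY_YEAR
    a_e = 2 * math.pi * day / EARTH_YEAR
    mx, my = R_MERCURY * math.cos(a_m), R_MERCURY * math.sin(a_m)
    ex, ey = R_EARTH * math.cos(a_e), R_EARTH * math.sin(a_e)
    return math.atan2(my - ey, mx - ex)

# Find the days when Mercury's apparent motion flips from eastward to westward.
reversal_days = []
prev = apparent_longitude(0)
moving_east_before = None
for day in range(1, 731):  # two Earth years
    cur = apparent_longitude(day)
    # Unwrap the angle difference into the range (-pi, pi].
    delta = math.atan2(math.sin(cur - prev), math.cos(cur - prev))
    moving_east = delta > 0
    if moving_east_before and not moving_east:
        reversal_days.append(day)
    moving_east_before = moving_east
    prev = cur

print("Apparent reversals began on days:", reversal_days)
print("Synodic period (days):", 1 / (1 / MERCURY_YEAR - 1 / EARTH_YEAR))
```

With those assumptions, the reversals repeat roughly every 116 days (the synodic period of Mercury as seen from Earth), which is why the topic comes back around every few months.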


Earth's orbit isn't identical to that of any other planet in the solar system, which means that all the planets appear to move backwards at varying points in time. Planets farther from the Sun than Earth have even more noticeable retrograde patterns because they're visible at night. But thanks to astrology, it's Mercury's retrograde motion that incites dread every few months.

Dr. Hammergren blames the superstition attached to Mercury, and astrology as a whole, on confirmation bias: "[Believers] will say, 'Aha! See, there's a shake-up in my workplace because Mercury's retrograde.'" He urges people to review the past year and see if the periods of their lives when Mercury was retrograde were especially catastrophic. They'll likely find that misinterpreted messages and technical problems are fairly common throughout the year. But as Dr. Hammergren says, when things go wrong and Mercury isn't retrograde, "we don't get that hashtag. It's called Monday."

This story originally ran in 2017.

