5 Alternative Teaching Methods

Maria Montessori, an Italian educationalist.
Topical Press Agency, Getty Images

Traditional schools, with their lectures, homework, and report cards, aren't for everyone. Here are five alternative approaches to education.

1. Montessori

Dr. Maria Montessori, the first woman in Italy to earn her physician's degree, developed the educational model that bears her name while teaching a class of 50 poor students on the outskirts of Rome in 1907. Dr. Montessori, who previously worked with special needs students, rejected the notion that children were born as "blank slates." Rather, she believed that children were born with absorbent minds and were fully capable of self-directed learning. Montessori developed the framework for a prepared educational environment in which children, empowered with the freedom to choose how they would spend their time in school, would seek out opportunities to learn on their own. Her pioneering work formed the basis for the Montessori classroom, which endures primarily in preschool and elementary school settings today.

Montessori believed that children enjoyed and needed periods of long concentration and that the traditional education model, with its structured lessons and teacher-driven curriculum, inhibited a child's natural development. Montessori students are free to spend large blocks of the day however they choose, while the teacher, or director, observes. Dr. Montessori was a major proponent of tactile learning. Classic materials, such as the Pink Tower, Brown Stairs, and the Alphabet Box (a set of wooden letters that children are encouraged to hold and feel before learning to write), remain staples of Montessori classrooms.

Montessori classes typically span three-year age groups. The lack of grades, tests, and other forms of formal assessment helps ensure that classes remain non-competitive. The first Montessori school in the United States was opened in Tarrytown, New York, in 1911. The New York Times described the school as follows: "Yet this is by no means a school for defective children or tubercular children or children who are anemic. The little pupils in the big sunny classroom at Tarrytown are normal, happy, healthy American children, little sons and daughters of well-to-do suburban residents." Today, the Montessori method is employed in roughly 5,000 schools in the U.S., including several hundred public schools. A 2006 study comparing outcomes of children at a public inner-city Montessori school with children who attended traditional schools provided evidence that Montessori education leads to children with better social and academic skills. Among the many celebrities who can attest to the value of a Montessori education are Google co-founders Sergey Brin and Lawrence Page.

2. Steiner/Waldorf

In addition to creating the field of anthroposophy, which is based on the belief that humans have the inherent wisdom to uncover the mysteries of the spiritual world, Austrian philosopher and scientist Rudolf Steiner developed an educational model that focused on the development of the "whole child": body, soul, and spirit. Influenced by the likes of Goethe and Jean Piaget, Steiner believed there were three 7-year periods of child development, and his educational approach reflected what he thought should and should not be taught during each of these stages.

Steiner founded his first Waldorf school (the term Waldorf is now used interchangeably with Steiner to describe schools with curriculums based on Steiner's teachings) in 1919 in Stuttgart, Germany, for children of workers at the Waldorf-Astoria cigarette factory. The original curriculum spanned 12 years and aimed to prepare students "for living," with an emphasis on creative expression and social and spiritual values. Within 10 years, Steiner's school in Stuttgart was the largest private school in Germany. When the Nazis closed the Waldorf schools during World War II, Waldorf teachers fled to other countries, contributing to the methodology's increased post-war popularity.

The curriculum that defines the Waldorf method has remained relatively unchanged in the last 90 years. Steiner believed the first 7 years of a child's life, a period marked by imitative and sensory-based learning, should be devoted to developing a child's noncognitive abilities. To that end, kindergartners in Waldorf schools are encouraged to play and interact with their environment instead of being taught academic content in a traditional setting. Steiner also believed that children should learn to write before they learned to read, and that no child should learn to read before the age of 7. From age 7-14, creativity and imagination are emphasized. During this stage, Waldorf school students may learn foreign languages, as well as eurythmy, an expressive dance developed by Steiner, and other performing arts. By age 14, students are ready for a more structured environment that stresses social responsibility.

Some critics of the Waldorf method argue that it borders on religion. According to the curriculum, students learn about Christian saints in second grade and Old Testament figures in third grade. Despite those concerns and the restrictive demands of standardized testing, there are more than 800 schools throughout the world that employ some variation of Steiner's teaching method. Rudolf Steiner College, which was founded in 1974 in Fair Oaks, California, serves as the center for anthroposophical studies and the training ground for future generations of Waldorf teachers.

3. Harkness

The Harkness method isn't based on a specific curriculum or a particular ideology, but rather on one important piece of furniture. Developed by oil magnate and philanthropist Edward Harkness, the method makes a large, oval table the centerpiece of any classroom that employs it. Students sit with their classmates and teacher around the table and discuss any and all subjects, from calculus to history, often in great detail. The Harkness method represents a significant departure from the traditional classroom setup of a teacher at a chalkboard lecturing to students seated in rows of desks. Individual opinions are formed, raised, rejected, and revised at the Harkness table, where the teacher's main responsibilities are to ensure that no one student dominates the discussion and to keep the students on point. No conversation is ever the same, which can help teachers avoid the burnout that might result from teaching the same lesson year after year.

In 1930, Harkness gave a multi-million-dollar donation to Phillips Exeter Academy, a private secondary school in New Hampshire, under the condition that the money be used to implement a new educational method that would involve all students in the learning process. Part of Harkness' endowment paid for the hiring of 26 new teachers, which enabled Exeter to shrink its average class size. This was imperative, as the Harkness method is most effective in classes of 15 students or fewer. "The classes are now small enough so that the shy or slow individual will not be submerged," Exeter principal Dr. Lewis Perry told the New York Times in the early years of the program. "The average boy, similarly, finds his needs cared for. In short, the Harkness plan is best defined as an attitude. It is a new approach to the problem of getting at the individual boy." The method was effective from the start; Exeter reported a decrease in failing grades of 6 percent during the first three years of the Harkness approach.

The intimate setting of the Harkness table forces students to take responsibility for their own learning and encourages them to share their opinions. In addition to learning about topics being discussed, students also learn valuable public speaking skills and to be respectful of their fellow students' ideas. Studies have supported the method's effectiveness in increasing students' retention and recall of material. It takes time to delve into subjects using the Harkness method, which is one reason, in addition to class size limitations, that it hasn't become more popular in public schools.

4. Reggio Emilia

Reggio Emilia is an educational approach used primarily for teaching children aged 3 to 6. The method is named after the city in northern Italy where teacher Loris Malaguzzi developed a new approach to early childhood education after World War II. Malaguzzi's philosophy was based on the belief that children are competent, curious, and confident individuals who can thrive in a self-guided learning environment where mutual respect between teacher and student is paramount. While the first Reggio Emilia preschool opened in 1945, the approach attracted a serious following in the United States in 1991 after Newsweek named the Diana preschool in Reggio Emilia among the best early childhood institutions in the world.

Reggio Emilia schools emphasize the importance of parents taking an active role in their child's early education. Classrooms are designed to look and feel like home, and the curriculum is flexible, as there are no set lesson plans. Reggio Emilia stresses growth on the students' terms. Art supplies are an important component of any Reggio Emilia classroom, and schools that follow the traditional approach have an atelierista, or art teacher, who works closely with the children on a variety of creative projects. Reggio Emilia teachers often keep extensive documentation of a child's development, including folders of artwork and notes about the stories behind each piece of art.

"It's about exploring the world together and supporting children's thinking rather than just giving them ready-made answers," said Louise Boyd Cadwell, who was an intern at two Reggio Emilia schools in Italy in the early '90s and then wrote a book about the teaching method. "Reggio Emilia is about full-blown human potential and how you support that in both intellectual and creative terms."

5. Sudbury

Sudbury schools take their name from the Sudbury Valley School, which was founded in 1968 in Framingham, Massachusetts. Sudbury schools operate under the basic tenets of individuality and democracy and take both principles to extremes that are unrivaled in the education arena. In Sudbury schools, students have complete control over what and how they learn, as well as how they are evaluated, if at all. At the weekly School Meeting, students vote on everything from school rules and how to spend the budget to whether staff members should be rehired. Every student and staff member has a vote and all votes count equally.

The Sudbury philosophy is that students are capable of assuming a certain level of responsibility and of making sound decisions; in the event that they make poor decisions, learning comes in the form of dealing with the consequences. While many public and private schools are constantly looking for new ways to motivate students to learn, Sudbury schools don't bother. According to the Sudbury approach, students are inherently motivated to learn. In support of this belief, one Sudbury educator points to the example of an infant who learns to walk even though lying in a crib is a viable, and easier, alternative.

Sudbury schools, which have some similarities with the "free schools" that gained popularity in the U.S. during the 1970s, do not divide students into different classes by age. Students regularly engage in collaborative learning, with the older students often mentoring the younger students. Annual tuition for the Sudbury Valley School, which welcomes students as young as 4 years old, is $6,450 for the first child in a family to attend the school.

Why the Filet-O-Fish Sandwich Has Been on the McDonald's Menu for Nearly 60 Years

McDonald's has introduced and quietly killed many dishes over the years (remember McDonald's pizza?), but there's a core group of items that have held their spot on the menu for decades. Listed alongside the Big Mac and McNuggets is the Filet-O-Fish—a McDonald's staple you may have forgotten about if you're not the type of person who orders seafood from fast food restaurants. But the classic sandwich, consisting of a fried fish filet, tartar sauce, and American cheese on a bun, didn't get on the menu by mistake—and thanks to its popularity around Lent, it's likely to stick around.

According to Taste of Home, the inception of the Filet-O-Fish can be traced back to a McDonald's franchise that opened near Cincinnati, Ohio in 1959. Back then the restaurant offered beef burgers as its only main dish, and for most of the year, diners couldn't get enough of them. Things changed during Lent: Many Catholics abstain from eating meat and poultry on Fridays during the holy season as a form of fasting, and in the early 1960s, Cincinnati was more than 85 percent Catholic. Friday is supposed to be one of the busiest days of the week for restaurants, but sales at the Ohio McDonald's took a nosedive every Friday leading up to Easter.

Franchise owner Lou Groen went to McDonald's founder Ray Kroc with the plan of adding a meat alternative to the menu to lure back Catholic customers. He proposed a fried halibut sandwich with tartar sauce (though meat is off-limits for Catholics on Fridays during Lent, seafood doesn't count as meat). Kroc didn't love the idea, citing his fears of stores smelling like fish, and suggested a "Hula Burger" made from a pineapple slice with cheese instead. To decide which item would earn a permanent place on the menu, they put the two sandwiches head to head at Groen's McDonald's one Friday during Lent.

The restaurant sold 350 Filet-O-Fish sandwiches that day, clearly beating the Hula Burger (though Kroc never said exactly how many pineapple burgers were sold). The basic recipe has received a few tweaks, switching from halibut to the cheaper cod and from cod to the more sustainable Alaskan pollock, but the Filet-O-Fish has remained part of the McDonald's lineup in some form ever since. Today 300 million of the sandwiches are sold annually, and about a quarter of those sales are made during Lent.

Other seafood products McDonald's has introduced haven't had the same staying power as the Filet-O-Fish. In 2013, the chain rolled out Fish McBites, a chickenless take on McNuggets, only to pull them from menus that same year.

[h/t Taste of Home]

The Disturbing Reason Schools Tattooed Their Students in the 1950s

Kurt Hutton, Hulton Archive/Getty Images

When Paul Bailey was born at Beaver County Hospital in Milford, Utah on May 9, 1955, it took less than two hours for the staff to give him a tattoo. Located on his torso under his left arm, the tiny marking was rendered in indelible ink with a needle gun and indicated Bailey’s blood type: O-Positive.

“It is believed to be the youngest baby ever to have his blood type tattooed on his chest,” reported the Beaver County News, coolly referring to the infant as an “it.” A hospital employee was quick to note parental consent had been obtained first.

The permanent tattooing of a child who was only hours old was not met with any hysteria. Just the opposite: In parts of Utah and Indiana, local health officials had long been hard at work instituting a program that would facilitate potentially life-saving blood transfusions in the event of a nuclear attack. By branding children and adults alike with their blood type, officials could immediately identify donors and use them as “walking blood banks” for the critically injured.

Taken out of context, it seems unimaginable. But in the 1950s, when the Cold War was at its apex and atomic warfare appeared not only possible but likely, children willingly lined up at schools to perform their civic duty. They raised their arm, gritted their teeth, and held still while the tattoo needle began piercing their flesh.

 

The practice of subjecting children to tattoos for blood-typing has appropriately morbid roots. Testifying at the Nuremberg Tribunal on War Crimes in the 1940s, American Medical Association physician Andrew Ivy observed that members of the Nazi Waffen-SS carried body markings indicating their blood type [PDF]. When he returned to his hometown of Chicago, Ivy carried with him a solution for quickly identifying blood donors—a growing concern due to the outbreak of the Korean War in 1950. The conflict was depleting blood banks of inventory, and it was clear that reserves would be necessary.

School children sit next to one another circa the 1950s
Reg Speller, Fox Photos/Getty Images

If the Soviet Union targeted areas of the United States for destruction, it would be vital to have a protocol for blood transfusions to treat radiation poisoning. Matches would need to be found quickly. (Transfusions depend on matching blood to avoid the adverse reactions that come from mixing different types. When a person receives blood different from their own, the body will create antibodies to destroy the red blood cells.)

In 1950, the Department of Defense placed the American Red Cross in charge of blood donor banks for the armed forces. In 1952, the Red Cross was the coordinating agency [PDF] for obtaining blood from civilians for the National Blood Program, which was meant to replenish donor supply during wartime. Those were both measures for soldiers. Meanwhile, local medical societies were left to determine how best to prepare their civilian communities for a nuclear event and its aftermath.

As part of the Chicago Medical Civil Defense Committee, Ivy promoted the use of the tattoos, declaring them as painless as a vaccination. Residents would get blood-typed by having their finger pricked and a tiny droplet smeared on a card. From there, they would be tattooed with the ABO blood group and Rhesus factor (or Rh factor), which denotes whether a person's red blood cells carry a particular protein.

The Chicago Medical Society and the Board of Health endorsed the program and citizens voiced a measure of support for it. One letter to the editor of The Plainfield Courier-News in New Jersey speculated it might even be a good idea to tattoo Social Security numbers on people's bodies to make identification easier.

Despite such marked enthusiasm, the project never entered into a pilot testing stage in Chicago.

Officials with the Lake County Medical Society in nearby Lake County, Indiana were more receptive to the idea. In the spring of 1951, 5,000 residents were blood-typed using the card method. But, officials cautioned, the cards could be lost in the chaos of war or even the relative quiet of everyday life. Tattoos and dog tags were encouraged instead. When 1,000 people lined up for blood-typing at a county fair, two-thirds agreed to be tattooed as part of what the county had dubbed "Operation Tat-Type." By December 1951, 15,000 Lake County residents had been blood-typed. Roughly 60 percent opted for a permanent marking.

The program was so well-received that the Lake County Medical Society quickly moved toward making children into mobile blood bags. In January 1952, five elementary schools in Hobart, Indiana enrolled in the pilot testing stage. Children were sent home with permission slips explaining the effort. If parents consented, students would line up on appointed tattoo days to get their blood typed with a finger prick. From there, they’d file into a room—often the school library—set up with makeshift curtains behind which they could hear a curious buzzing noise.

When a child stepped inside, they were greeted by a school administrator armed with indelible ink and wielding a Burgess Vibrotool, a medical tattoo gun featuring 30 to 50 needles. The child would raise their left arm to expose their torso (since arms and legs might be blown off in an attack) and would be told the process would take only seconds.

A child raises his hand in class circa the 1950s
Vecchio/Three Lions/Getty Images

Some children were stoic. Some cried before, during, or after. One 11-year-old recounting her experience with the program said a classmate emerged from the session and promptly fainted. All were left with a tattoo less than an inch in diameter on their left side, intentionally pale so it would be as unobtrusive as possible.

At the same time that grade schoolers—and subsequently high school students—were being imprinted in Indiana, kids in Cache and Rich counties in Utah were also submitting to the program, despite potential religious obstacles for the region's substantial Mormon population. In fact, Bruce McConkie, a representative of the Church of Jesus Christ of Latter-Day Saints, declared that blood-type tattoos were exempt from the typical prohibitions on Mormons defacing their bodies, giving the program a boost among the devout. The experiment would not last much longer, though.

 

By 1955, 60,000 adults and children had gotten tattooed with their blood types in Lake County. In Milford, health officials persisted in promoting the program widely, offering the tattoos for free during routine vaccination appointments. But despite the cooperation exhibited by communities in Indiana and Utah, the programs never spread beyond their borders.

The Korean conflict had come to an end in 1953, reducing the strain put on blood supplies and, along with it, the need for citizens to double as walking blood banks. More importantly, outside of the program's avid boosters, most physicians were extremely reluctant to rely solely on a tattoo for blood-typing. They preferred to do their own testing to make certain a donor was a match with a patient.

There were other logistical challenges that made the program less than useful. The climate of a post-nuclear landscape meant that bodies might be charred, burning off tattoos and rendering the entire operation largely pointless. With the Soviet Union’s growing nuclear arsenal (1,600 warheads were ready to take to the skies by 1960), the idea of civil defense became outmoded. Ducking and covering under desks, which might have shielded some from the immediate effects of a nuclear blast, would be meaningless in the face of such mass destruction.

Programs like tat-typing eventually fell out of favor, yet tens of thousands of adults consented to participate even after the flaws in the program were publicized, and a portion allowed their young children to be marked, too. Their motivation? According to Carol Fischler, who spoke with the podcast 99% Invisible about being tattooed as a young girl in Indiana, the paranoia over the Cold War in the 1950s drowned out any thought of the practice being outrageous or harmful. Kids wanted to do their part. Many nervously bit their lip but still lined up with the attitude that the tattoo was part of being a proud American.

Perhaps equally important, children who complained of the tattoo leaving them particularly sore received another benefit: They got the rest of the afternoon off.
