Are There Number 1 Pencils?


Almost every syllabus, teacher, and standardized test points to the ubiquitous No. 2 pencil, but are there other choices out there?

Of course! Pencil makers manufacture No. 1, 2, 2.5, 3, and 4 pencils—and sometimes other intermediate numbers. The higher the number, the harder the core and the lighter the markings. (No. 1 pencils produce darker markings, which are sometimes preferred by people working in publishing.)

The current style of production is modeled after pencils developed in 1794 by Nicolas-Jacques Conté. Before Conté, pencil hardness varied from location to location and maker to maker. The earliest pencils were made by filling a wood shaft with raw graphite, leading to the need for a trade-wide recognized method of production.

Conté’s method involved mixing powdered graphite with finely ground clay; that mixture was shaped into a long cylinder and then baked in an oven. The proportion of clay versus graphite added to a mixture determines the hardness of the lead. Although the method may be agreed upon, the way various companies categorize and label pencils isn't.

Today, many U.S. companies use a numbering system for general-purpose writing pencils that specifies how hard the lead is. For graphic and artist pencils, and for companies outside the U.S., things get a little more complicated: those pencils use a combination of numbers and letters known as the HB Graphite Scale.

"H" indicates hardness and "B" indicates blackness. Lowest on the scale is 9H, indicating a pencil with extremely hard lead that produces a light mark. On the opposite end of the scale, 9B represents a pencil with extremely soft lead that produces a dark mark. ("F" also indicates a pencil that sharpens to a fine point.) The middle of the scale shows the letters and numbers that correspond to everyday writing utensils: B = No. 1 pencils, HB = No. 2, F = No. 2½, H = No. 3, and 2H = No. 4 (although exact conversions depend on the brand).

So why are testing centers such sticklers about using only No. 2 pencils? The answer lies in early scanning technology: the machines read answers by detecting the electrical conductivity of the graphite marks. They couldn't reliably detect the lighter marks made by harder pencils, so No. 3 and No. 4 pencils often produced erroneous results. Softer pencils like No. 1s smudge, making them impractical for test-taking. And that's how No. 2 pencils became the industry standard.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at

What's the Difference Between Straw and Hay?

The words straw and hay are often used interchangeably, and it's easy to see why: They're both dry, grassy, and easy to find on farms in the fall. But the two terms actually describe different materials, and once you know what to look for, it's easy to tell the difference between them.

Hay refers to grasses and some legumes such as alfalfa that are grown for use as animal feed. The full plant is harvested—including the heads, leaves, and stems—dried, and typically stored in bales. Hay is what livestock like cattle eat when there isn't enough pasture to go around, or when the weather gets too cold for them to graze. The baled hay most non-farmers are familiar with is dry and yellow, but high-quality hay has more of a greenish hue.

The biggest difference between straw and hay is that straw is the byproduct of crops, not the crop itself. When a plant, such as wheat or barley, has been stripped of its seeds or grains, the stalk is sometimes saved and dried to make straw. This part of the plant is lacking in nutrients, which means it doesn't make great animal fodder. But farmers have found other uses for the material throughout history: It's what's used to weave baskets, thatch roofs, and stuff mattresses.

Today, straw is commonly used to decorate pumpkin-picking farms. It's easy to identify (if it's being used in a way that would be wasteful if it were food, chances are it's straw), but even the farms themselves can confuse the two terms. Every hayride you've ever taken, for example, was most likely a straw-ride.


How and Why Did Silent Letters Emerge in English?


Kory Stamper:

The easy answer is “because English can’t leave well enough alone.”

When we first started speaking English around 600 AD, it was totally phonetic: every letter had a sound, and we sounded every letter in a word. But English—and England itself—were influenced quite a bit by the French, who conquered the island in 1066 and held it for a long time. And then later by Dutch and Flemish printers, who were basically the main publishers in England for a solid two centuries, and then by further trading contact with just about every continent on the planet. And while we were shaking hands and stealing language from every single people-group we met, different parts of the language started changing at uneven rates.

By the 1400s, English started to lose its phonetic-ness: the way we articulated vowels in words like “loud” changed slowly but dramatically, and that had an effect on the rest of the word. (This is called “The Great Vowel Shift,” and it took place over a few hundred years.) Somewhere in the middle of the GVS, though, English spelling became fixed primarily because of the printing press and the easy distribution/availability of printed materials. In short: we have silent letters because the spelling of words stopped changing to match their pronunciations.

This post originally appeared on Quora.