Essential Facts About Pumpkins

A tiny fraction of the 1.5 billion pounds produced in the U.S. each year

OK, the title is a little misleading:  There are no essential facts about pumpkins.  You can go on having a reasonably happy life without knowing where those members of the gourd family thrive — unless you’re a pumpkin farmer, of course, and if you are, you already know far more about pumpkins than I do.

The dictionary defines essential as “absolutely necessary; indispensable”, so I’m pretty sure it’s not essential to know that China is the world leader in pumpkin production, in spite of the fact that the Chinese do not observe Halloween or Thanksgiving.

Those holidays are associated with pumpkins in the United States, which is fourth among the world’s pumpkin producers.  States like California, New York, Ohio, Pennsylvania and Michigan grow a lot of them, but take a moment to consider this fact, which, while not essential, is pretty darned impressive:

Ninety percent of the pumpkins grown in the United States are raised within a 90-mile radius of Peoria, Illinois.

That’s what a University of Illinois website says, anyway, and I don’t think they would make a claim like that just to boost Peoria tourism.  The University happens to have a trove of information about the orange-colored fruit that is related to the cucumber.  For instance, its website mentions that pumpkins were once recommended for removing freckles and curing snakebites.  Maybe someone at U of I should do a study to see if people living near Peoria have a lower incidence of freckles and snakebites.

You have to look elsewhere, though, to learn that the practice of carving jack-o'-lanterns originated in Ireland.  The Irish typically used turnips or potatoes for that purpose, but when immigrants arrived in America, they applied their fruit-and-vegetable carving skills to pumpkins, and an American holiday tradition was born.

The University of Illinois hasn’t yet had a chance to update its information about the world’s largest pumpkin.  That record was set a few weeks ago by Ron Wallace of Greene, Rhode Island (well, not by Ron himself — by a pumpkin he grew).  It weighed 2,009 pounds, breaking the one-ton barrier that had eluded pumpkin growers until now.  Presumably their new goal is to grow one that weighs more than a car; they’re only a few hundred pounds away.

The subject of colossal pumpkins leads us to Thanksgiving, the holiday traditionally associated with “topping off” a big meal with a huge chunk of pumpkin pie.

Let me pass along the pertinent facts to answer a question that has probably occurred to you by now.  The world’s largest pumpkin pie was made in New Bremen, Ohio, in 2010.  It contained over 1,200 pounds of canned pumpkin, 109 gallons of evaporated milk, 525 pounds of sugar, 233 dozen eggs and a pinch of cinnamon — 14.5 pounds.  I don’t have the details about the crust, but the whole thing weighed just under 3,700 pounds, and measured 20 feet in diameter.

Whipped cream?  Why yes, I don’t mind if I do.

Who Were the Founding Fathers?

John Trumbull, “Declaration of Independence” (painted 1817), U.S. Capitol

It was one of America’s least-admired presidents, Warren G. Harding, who coined the term “Founding Fathers”.  Speaking at the 1916 Republican convention, he used it in reference to the men who transformed America from a cluster of British colonies to an independent nation.

They are now spoken of with great respect, and deserve to be.  We tend to think of the Founding Fathers as demigods, though, or at least as men who always conducted themselves with the greatest decorum, as if they were posing for that picture on the back of the two-dollar bill (see above).

In fact, there were intense personal rivalries among the Founding Fathers.  According to historian David McCullough, animosity between Alexander Hamilton and Thomas Jefferson “reached the point where they could hardly bear to be in the same room.”

A Jefferson ally, James Callender, attacked John Adams in print, calling him “that strange compound of ignorance and ferocity, of deceit and weakness.”  Adams muttered about Benjamin Franklin, who had famously promoted the virtue of “a penny saved is a penny earned,” but in his personal life was a big spender.

Even George Washington was the target of insults.  Thomas Paine, a key figure in the American Revolution, slammed Washington as “treacherous in private friendship… and a hypocrite in public life.”

As author Ron Chernow noted in the Wall Street Journal, “After sharpening their verbal skills hurling polemics against the British Crown, the founding generation then directed those energies against each other.”

While some of the attacks were rooted in personality conflicts — this guy didn’t like that guy — much of the hostility had to do with the clash of competing concerns as they groped their way toward a new form of government.

Jefferson didn’t want a president to have much power, worried that he would become a de facto king.  Adams wanted to safeguard against the proposed Senate becoming an aristocracy.  Hamilton, John Jay and James Madison wrote persuasively for the new federal government, while Patrick Henry and others were wary of it, citing concerns about states’ rights.

So who were the Founding Fathers?  How did one qualify for inclusion in this particular pantheon?  Sometimes they are narrowly defined as those who attended the Constitutional Convention of 1787.  But that would leave out John Adams, Samuel Adams and Thomas Jefferson, among others.  Besides, what do you do about Elbridge Gerry, George Mason and Edmund Randolph, who attended the convention but refused to sign the document?  Incidentally, only six men, Franklin among them, signed both the Declaration of Independence and the Constitution.

It seems fair to say that the Founding Fathers should include those who played important roles in the fight for independence and in the drafting of the Constitution and Bill of Rights, and who went on to steer the fragile new government that resulted from their efforts.  Only a few have been named here.

There are many whose names you may not know, like Gouverneur Morris of New York, who had a wooden leg.  His official story was that he’d had a carriage accident, but some accounts attribute it to a leap from a window to escape a jealous husband.  Sadly, Gouverneur Morris never became governor, but he had a lot to do with the final wording of the Constitution.

They were a diverse group of men: stubborn and contentious at times, but cognizant of the need to negotiate and compromise, ultimately finding common ground.  They weren’t the titans we sometimes imagine them to be, but hotheads and skeptics and windbags and risk-takers — in short, they were a lot like people we know.  That’s what makes their achievement so remarkable.

Back to Standard Time

A finger in each hemisphere at the prime meridian, Greenwich

Attention, U.S. residents:  You’re about to regain the hour you lost last spring.  That’s because we’re no longer going to be saving daylight; we’re returning to standard time soon.

Resetting clocks every few months has its annoyances.  If you forget to change back to standard time in the fall, you might turn on your TV and discover that you’ve missed the first hour of that game you wanted to see.  When clocks “spring forward” in the spring, people who make a habit of arriving at church fashionably late are chagrined to find that they’re going to have to endure the entire service.  Those who live in Arizona or Hawaii have no excuse for being late or early, since they observe year-round standard time.

“What time is it?” has become an increasingly complicated question.  Back when people used sundials to determine local time, villages that were only a few miles apart had different opinions of when noon was.  It didn’t matter much, though, since one rarely visited that neighboring village.

With the growth of international trade and travel, there was a greater incentive to get on the same schedule.  Greenwich Mean Time was an attempt to get the world to synchronize its watches, so to speak.

The general idea is that there is a one-hour difference for every 15 degrees of longitude.  The so-called prime meridian — 0° longitude — runs through Greenwich, England, which is not very far down the Thames from London.  The location of the prime meridian was arbitrary, of course, but in 1884 an international conference decided to humor the Brits and let them think the world’s day started there.

It’s probably worth noting that the one hour = fifteen degrees concept gets ignored in a lot of places.  China, for example, has one time zone for the entire country, which spans about 60 degrees of longitude.  Like most Asian and African countries, China doesn’t bother with daylight saving time, either.
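For the arithmetically inclined, that 15-degrees-per-hour rule is simple enough to sketch in a few lines of Python.  This is strictly a back-of-the-envelope illustration (real time zones follow political boundaries, as China just demonstrated), and the coordinates in the comments are approximate:

    # Back-of-the-envelope sketch of the "15 degrees of longitude = 1 hour" rule.
    # Real time zones follow political boundaries, so treat this as illustration only.

    def naive_utc_offset(longitude_degrees: float) -> int:
        """Nominal whole-hour offset from Greenwich for a given longitude.

        Positive longitudes lie east of the prime meridian (ahead of Greenwich);
        negative longitudes lie west of it (behind Greenwich).
        """
        return round(longitude_degrees / 15)

    print(naive_utc_offset(0.0))     #  0 -- Greenwich itself
    print(naive_utc_offset(-89.6))   # -6 -- roughly Peoria, Illinois
    print(naive_utc_offset(116.4))   #  8 -- roughly Beijing; China keeps the whole country on UTC+8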

The idea of adding an hour of light at the end of the day during summer months wasn’t seriously considered until the late 19th century.  It was implemented during World War I as a way to conserve energy resources, and has been repealed and resumed many times since.

In 1966, the U.S. government tried to simplify things by establishing the Uniform Time Act.  One of its provisions was that clocks were set forward on the last Sunday in April and returned to standard time on the last Sunday in October.  States were allowed to opt out of daylight saving time, provided the entire state did so.  The Act has been amended several times since, including an experiment with year-round DST during the Arab oil embargo of the 1970s.

The most recent change to the law took effect in 2007, when the beginning of daylight saving time was moved to the second Sunday in March and the ending was pushed to the first Sunday in November.
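If you’d rather not squint at a calendar to find those Sundays, the current rule is easy to compute.  Here’s a small Python sketch of the date arithmetic (my own illustration, not anything official):

    # Sketch of the current U.S. rule: daylight saving time begins the second
    # Sunday in March and ends the first Sunday in November.
    import datetime

    def nth_sunday(year: int, month: int, n: int) -> datetime.date:
        """Date of the nth Sunday of the given month."""
        first = datetime.date(year, month, 1)
        # weekday(): Monday is 0 and Sunday is 6; step forward to the first Sunday.
        days_to_sunday = (6 - first.weekday()) % 7
        return first + datetime.timedelta(days=days_to_sunday + 7 * (n - 1))

    print("Spring forward:", nth_sunday(2012, 3, 2))    # 2012-03-11
    print("Fall back:     ", nth_sunday(2012, 11, 1))   # 2012-11-04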

One of the arguments for delaying the return to standard time until after Halloween was that kids could then do their trick-or-treating in daylight.  Seriously.  There may be a lot of studies that support that view, but based on anecdotal evidence — the traffic at my front door — kids still show up after dark.  That’s just an hour later than it used to be.

Terms of Avoidance

Yes, she is, uh… in the family way. (1952)

There was a lot of controversy when Lucille Ball got pregnant in 1952, because in those early days of television, actors weren’t allowed to say the word “pregnant”.  Now some channels carry programs that not only use the word pregnant, but have actors graphically depicting procedures that can lead to pregnancy.

The compromise that was reached with CBS executives back then was that Lucy could be referred to in dialogue as “expecting”.  Perhaps the reason that term was deemed acceptable was that it left to the viewers’ imaginations just what it was she was expecting.

That’s nonsense, of course — even in those ancient times, people knew the basics of human reproduction.  They just weren’t very comfortable talking about it in polite company, so euphemisms like “expecting” or “in the family way” were used.

Euphemisms are expressions that substitute milder words for harsh or socially unacceptable ones.  Slang, on the other hand, consists of synonyms for standard terms that are often intended to shock or offend.  For instance, let’s use pregnant as the standard term.  Something like “a bun in the oven” is a euphemism for it, while “knocked up” is slang.

As Richard A. Spears wrote in his book Slang and Euphemism, “Slang originally referred to the patter of criminals,” and was unwelcome in the company of ladies and gentlemen.  The book includes many frank examples: dozens of its entries are alternatives to pregnant, and hundreds are colorful synonyms for the body parts involved in the production of babies.

Similarly, there are abundant slang terms and euphemisms that convey the concept of drunkenness and its aftereffects.  I’m not sure why we need so many, but just to name a few that begin with the letter S, there’s sloshed, smashed, snockered, soused, stewed to the gills, swacked, and seeing double.

The origins of some of the terms for intoxication are fairly obvious (“irrigated”), but the meaning of others can be obscure.  Did you ever hear someone — not you, of course — described as being “three sheets to the wind”?

That expression is derived from nautical terminology.  If you haven’t spent time on a sailboat, you might logically assume that the sails are called sheets, since they look like bed coverings.  Sorry, that makes too much sense.  In Boatspeak, sheets are actually ropes that are attached to the sails, and are used to position the sails, thereby controlling the course.

On a sloop — a boat with one mast — there are typically three sheets:  two for the jib (foresail) and one for the main.  If the sheets are released “to the wind”, the sails are then flapping in the breeze and the boat is not able to hold a steady course — not unlike a drunk person.

A man in that condition might stagger off mumbling about needing to “kill a tree”.  You can probably guess what that means, but it’s not something we discuss in polite company.

Let’s Do Lunch

“Luncheon of the Boating Party”, Pierre-Auguste Renoir (1881), Phillips Collection, Washington, D.C.

If you had lived 500 years ago, you never would have heard of a meal called lunch.  Of course, you never would have heard of computers, either, but that’s sort of the point — things change.

For many centuries, the midday meal was called dinner, and it was the biggest meal of the day.  Supper was the evening meal, and often consisted of leftovers from dinner, since refrigeration had not yet caught on.  Supper was also a precursor of today’s Early Bird Specials:  it was eaten around 5 p.m. or so. That’s because in the absence of artificial light and cable TV, people went to bed when it got dark.

To review, then, if you had been hungry in 1512, you would have had breakfast soon after dawn, dinner around noon, and supper at sunset.  “But what about lunch?” I hear you mutter.  (Or maybe that’s your stomach growling.)

The word once conveyed the idea of “snack”, possibly derived from the Spanish word lonja, which means a slice of ham, or loncha, a slice of cheese.  As recently as 1755, lunch or luncheon meant a portion, a hunk of something.  In his dictionary of that year, Samuel Johnson defined it this way:  “as much food as one’s hand can hold.”  Fast-food chains still base their menus on that definition, apparently.

It was not until the mid-19th century that the rearrangement of our meal designations took hold, with dinner moving to the evening hours, and lunch taking over the noontime slot formerly held by dinner.

There were several reasons for that, one of which was artificial lighting.  Oh sure, candles had been around forever, but who wants to eat dinner by candlelight?  Well, yes dear, candlelight is very romantic.  I’m just saying it was the practicalities of meal preparation and cleanup that made having a meal after dark, you know, impractical.  With improved oil lamps and gas lamps, it became possible to eat dinner later.

Another reason for the change had to do with social conventions of the upper classes, who were obliged to call on friends and acquaintances during the early afternoon.  Let’s say someone had dropped in while you were away from home.  Etiquette required you to return the in-person call at their residence the following day.

As Bill Bryson notes in his book At Home, “What this meant in practice was that most people spent their afternoons dashing around in a similarly unproductive manner trying to catch up with them.”  That had the effect of pushing the dinner hour later.

Perhaps the most important factor was the transition from rural to urban life in the 1800s.  Back when you owned a farm — or worked on one that someone else owned — you were close to home at midday.  That made it possible to sit down to a big dinner at noon.

When farms were replaced by factories, though, and people went off to work in cities, the noon meal at home wasn’t possible, so the dinner hour was delayed.  Lunch filled the void.

A current exhibit at the New York Public Library makes the claim that “Of the three meals that mark the American day, lunch is the one that acquired its modern identity here on the streets of New York.”  Sounds plausible to me, but what do I know?  Some people tell me I seem “out to lunch”.