The calendar above my desk calls today the first day of summer. For meteorologists, summer started June 1 based on monthly average temperatures. In parts of northern Europe, the solstice counts as midsummer, halfway between planting and harvest, halfway through the weeks of longest daylight. Just to complicate matters, some celebrate Midsummer on June 24 to coincide with the Feast of Saint John the Baptist.
I tend to think of seasons as part of nature, untouched by humans except as we interfere. But it’s we humans who define them by dates and pretend nature gave us the definition. Why four seasons, instead of just two—hot and cold—or the six on the Hindu calendar? Why treat them as equal in length whether you’re in San Diego or Wisconsin?
Nature cycles gradually, by fits and starts, with markers like the first robins in March or ragweed pollen in August. Our calendar with its seasonal divisions is a mere approximation, a convenience that makes it easier to think and talk. It only gets us in trouble when we expect nature to comply.
Image: William Blake, c. 1786, Oberon, Titania and Puck with Fairies Dancing, inspired by A Midsummer Night’s Dream. Midsummer, set in June, was among the most popular and rowdiest festivals of Shakespeare’s England.
“Resolved: that the flag of the United States be made of thirteen stripes, alternate red and white; that the union be thirteen stars, white in a blue field, representing a new Constellation.”
- Journal of the Continental Congress, June 14, 1777
Every morning in my grade school we pledged allegiance to the flag, mumbled the Lord’s Prayer, and sang a patriotic song. Flag equaled country, sacrosanct, deserving of reverence. It wasn’t always that way.
The young republic in 1777 needed an identifying banner to raise on ships or carry into battle. More practical than emotive, the new flags didn’t fly in classrooms or outside private homes. Their mythic overtones came later, in response to current events.
Union loyalists flew the Stars and Stripes during the Civil War. Soon after, approaching the centennial, descendants claimed Betsy Ross sewed the first American flag. In the late 1800s, U.S. flags promoted the assimilation of immigrants. A Wisconsin schoolteacher is among many credited with starting Flag Day. A marketer created the Pledge of Allegiance to boost flag sales to schools.
Two world wars and the rise of godless Communism heightened the distinctively American “cult of the flag.” Presidents and Congress formalized Flag Day (1916, 1949), made “The Star-Spangled Banner” the national anthem (1931), adopted the Pledge of Allegiance (1942), and added “under God” to the pledge (1954). Flags marked political discord in the 1960s, unity after 9/11, and polarization last January—surely not what the founders had in mind.
The song from Rodgers and Hammerstein’s Carousel has been playing in my head all week. The garden is lush with blossoms, the air melodious with birdsong. Who could not sing, dance, and frolic? In the musical, the high spirits of “June Is Bustin’ Out All Over” lead into the death and redemption of an unemployed carnival barker who robs to support his wife and unborn child. Joy and woe are woven fine.
Carousel opened on Broadway in April 1945, as World War II was winding down. It was a season of hope. The troops would soon come home to set off a baby boom. Prosperity would replace wartime privation. It was also a season of mourning for the fallen who would never come home, and for their commander-in-chief. President Franklin Roosevelt had died exactly a week before the show opened.
The waning of the pandemic brings another season of joy and woe, hope and loss. Instead of blackout curtains, we begin to shed the face masks we wore to protect our communities. Vacation travel and nonessential shopping are making a comeback, as they did after WWII. At the same time, we are mourning the closure of favorite restaurants and bookstores. We grieve loved ones for whom the possibility of vaccination came too late.
It doesn’t negate the sorrows to celebrate the joys. I’m off to pull weeds from among the flowers, while “June Is Bustin’ Out All Over” plays over and over in my head.
One pandemic day looks much like another. I track the calendar by weekly routines: Tuesday trash pickup, Wednesday laundry, Thursday groceries. Chores will remain in post-pandemic life, of course, but people and places and activities will reintroduce variety.
Rituals, like routines, can look the same from day to day. The difference is that rituals are infused with meaning and intention. Some writers begin their pen-to-paper or hand-to-keyboard time with a ritual to focus the mind: light a candle, or meditate, or sip coffee and watch the sun rise. “Meaningless ritual” is an oxymoron. Repetition without presence or meaning is merely routine.
Any routine can be converted to ritual with mindfulness. Washing the dishes to wash the dishes. Feeling the warmth of clean towels as you fold. At the same time, I cherish the inclusion of mindless routines in the mix, activities so habitual that my thoughts can wander free. Being fully present is a wonderful thing. So, to me, is the unfocused state in which imagination runs rampant, untethered to the here and now.
For most of the past year, I figured coronavirus could do serious harm if I got infected, but avoiding people kept my exposure low. Last month, I traveled to a state where exposure was almost ubiquitous, trusting vaccination to minimize my risk.
All our lives we weigh risks as a basis for decisions, large and small. I’ve long thought of risk as having two elements: how likely is something to happen, and how bad might it be if it does? Lately I’ve noticed a third major element: personal risk tolerance. As my sociologist father taught me long ago, statistics predict populations, not individuals. How much certainty does your spirit require? What is your comfort with the unknown?
Data and logic go only so far. In the waning of the pandemic, some of my vaccinated friends remain outdoors in masks. Others dine in restaurants and hop on airplanes. There’s nothing wrong with letting emotions influence our choices. Without emotions, in fact, neuroscience suggests we couldn’t make decisions at all.
Rinderpest, or cattle plague, devastated southern and eastern Africa in the 1890s. With a fatality rate near 100%, the bovine epidemic caused mass starvation among humans. Europe and Russia managed to eliminate the pestilence by the early 1900s through quarantine, hygiene, and slaughter. After development of a vaccine, rinderpest in 2011 became the second disease ever officially declared eradicated.
I was privileged some years ago to attend a meeting on disease eradication at The Carter Center in Atlanta. With smallpox eradicated by 1980, what other human diseases might be targets? As best I recall, criteria included that a disease be infectious, caused by a virus that would die out without human hosts, and preventable by vaccine. Covid-19 wouldn’t make the list; even if every human were immunized, coronavirus could lurk in other animals to come back later.
Herd immunity is the way a mostly immune population forms a circle of protection around the susceptible few. It’s why unvaccinated babies and people with weak immune systems rarely get measles in the United States; the virus can’t spread for lack of hosts. The threshold for herd immunity depends on too many factors to calculate precisely. How effective and long-lasting is the vaccine? How infectious is the virus? How carefully do people behave? How much do unvaccinated individuals cluster together, letting a single case set off an outbreak?
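For readers who like to see the arithmetic behind the idea, the classic textbook approximation makes the point: the more infectious the virus, the larger the share of people who must be immune. This is only a back-of-the-envelope sketch that assumes a uniformly mixed population, and the R0 and effectiveness numbers below are illustrative assumptions, not measurements; as noted above, real thresholds shift with behavior and clustering.

```python
# A naive, textbook-style sketch of the herd immunity threshold.
# Assumes perfect mixing; real populations cluster, so real
# thresholds can differ substantially.

def herd_immunity_threshold(r0: float, vaccine_effectiveness: float = 1.0) -> float:
    """Fraction of a population that must be vaccinated, under the
    classic approximation: (1 - 1/R0) / effectiveness."""
    immune_fraction = 1.0 - 1.0 / r0               # share that must be immune
    return immune_fraction / vaccine_effectiveness  # share that must be vaccinated

# Illustrative inputs: measles is far more infectious (R0 roughly 12-18)
# than the early coronavirus strains (R0 roughly 2-3), which is why the
# measles threshold is so much higher.
print(round(herd_immunity_threshold(15, 0.97), 2))   # measles-like scenario
print(round(herd_immunity_threshold(2.5, 0.90), 2))  # early-Covid-like scenario
```

The formula also shows why a less effective or waning vaccine raises the bar: dividing by a smaller effectiveness pushes the required vaccination share upward, sometimes past 100 percent, at which point vaccination alone cannot reach the threshold.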
Those factors for Covid are all changing or not fully known. Perhaps the best we will manage is control. Coronavirus may continue to circulate at low levels, mutating along the way. We may return year after year for the latest vaccine formulation, as many of us do for flu. Not the best of all possible worlds, but one we could live with. We may have to.
If I thought of them at all, I thought of magazines like Ladies’ Home Journal and Good Housekeeping as light reading in a doctor’s waiting room, an occasional source of recipes. I never thought of them as Progressive Era forces for social reform, alongside the movements for prohibition and suffrage.
Middle-class women around 1900 organized to protect home and family. They battled corruption that threatened health, safety, and sanitation. Women’s magazines pioneered investigative journalism to inform and promote these efforts.
Unscrupulous vendors peddled quack remedies promising to cure every ailment. In response, in 1892 the Ladies’ Home Journal became the first magazine to refuse medical advertising. It compelled Mrs. Winslow’s Soothing Syrup to reveal its ingredients and eliminate the morphine. The editor published the ingredients of other patent medicines and hired a journalist-lawyer to investigate abuses. Other periodicals followed suit, building public pressure to regulate drugs.
No law required labeling the contents of packaged foods. Good Housekeeping published articles about hazardous food colorings and preservatives, such as formaldehyde in infant formula. It opened an experiment station in 1900 (later called the Good Housekeeping Institute) to test products and issue consumer alerts. The magazine campaigned for a national pure food law and advised readers how to add their voices.
Congress passed the Pure Food and Drug Act in 1906. Women’s magazines had been laying the groundwork for years.
Grateful to be fully vaccinated, some of us are relearning the art of unmasked, face-to-face conversation. I hope this shift won’t breathe new life into low-tech tools for muting one another.
For example, “You’re in denial” uses classic psychobabble to silence disagreement. If you answer, “No, I’m not,” you just proved the speaker’s point.
Another conversation-stopper is “Can’t you take a joke?” It shifts blame from one who says something offensive to the one who takes offense. Uneasy with threats to batter a spouse or kill an elected official? If you don’t want to be dismissed as humorless, better keep your mouth shut.
“I’m just saying” is a more recent addition to the toolkit. It supposedly takes the sting out of a remark by labeling it casual opinion or observation. Rather than an invitation to explore, it signals lack of interest in analysis or debate. You are free to respond, but your response will fall on deaf ears. You’re on mute.
“Vaccine” comes from vacca, Latin for “cow.” Edward Jenner proved in the 1790s that pus from relatively harmless cowpox blisters—common among dairymaids—protected humans against smallpox. Jenner’s vaccination built on a riskier traditional practice in Asia and Africa called inoculation (from the Latin inoculare, “to graft or implant”). An Englishwoman and an African man introduced inoculation into Europe and North America respectively, almost exactly 300 years ago.
Lady Mary Wortley Montagu lost a brother to smallpox, and the disease scarred her face. Living in Constantinople while her husband was British ambassador, she watched old Turkish women scratch smallpox pus into healthy arms or legs. The resulting infection was usually mild instead of deadly or disfiguring. Back in England during an epidemic, she had her daughter inoculated publicly in April 1721. The practice spread.
That same April, smallpox came by ship to Boston, Massachusetts. An enslaved West African named Onesimus told his master, Cotton Mather, that he knew how to prevent the disease. The operation he described “had given him something of the smallpox and would forever preserve him from it.” Bostonians resisted vehemently. More than half the city’s residents contracted smallpox in 1721-22; one in seven patients died. But of the 242 who were inoculated, all but six survived.
Jenner holds an important place in the development of vaccines. So should the traditional healers of Asia and Africa, and the white woman and the black man who brought their methods to the West.
One of my pandemic projects is to study introductory Spanish. Please don’t ask me to speak it. Videos and worksheets are limited tools for learning a language. But they help me understand the workings of an unfamiliar tongue, and of my mind.
Two distinct verbs translate as “to be.” (Spanish speakers, feel free to correct me.) Ser refers to inherent or fundamental characteristics. I am a woman, a writer, an American. Ser o no ser, “to be or not to be.” Estar is for a condition of the moment: I am tired, and my house is a mess.
Language shapes perception. How might I experience life’s ups and downs if I had to distinguish fleeting from lasting, each time I opened my mouth? Learning the difference when I first learned to talk might have given clearer meaning to my mother’s adage, “This too shall pass.”
I’m a historian who writes novels and literary nonfiction. My home base is Madison, Wisconsin.