We’re in the throes of another election season. Candidates and voters deny election results from 2020 or from recent primary defeats. Meanwhile, some responses to Covid-19 add to a long history of science denial, in the tradition of the Flat Earth Society or of nineteenth-century doctors who refused to wash their hands.
Perhaps some people can’t stand the tension of beliefs that conflict with their wishes. Perhaps some haven’t heard the factual evidence. Perhaps some spread lies deliberately for profit, ratings, or votes. But top factors in the appeal of denialism surely include belonging and distrust.
To be shut out from community because of differences of belief brings pain. We’re influenced by those around us not only because we think they know best but because we want and need their acceptance. If we deny evidence to stay in the group, what’s most likely to change our minds is exposure to supportive new friends with a different perspective.
To feel scorned or ignored by perceived elites breeds distrust of authority. Unlike skepticism, which demands evidence, denialism rejects legal and scientific authority regardless of evidence. If we deny expertise in order to claim our equal right to an opinion, the most effective antidote might be mutually respectful conversations free from superior sneers.
Richard S. Gilbert writes, “I rise in the morning torn between the desire to save the world or to savor it.” When my must-do list is completed for the day, does should-do have a claim on any time and energy left over? Or are the remaining hours mine to enjoy as uselessly as I please?
Poets offer both answers. “The people I love the best jump into work head first,” Marge Piercy writes. “The pitcher cries for water to carry and a person for work that is real.” Yet from Mary Oliver we read, “You do not have to be good. . . . You only have to let the soft animal of your body love what it loves.”
Who got it right? I suspect individuals differ in what will help them heal. Saint Augustine, after years of restless promiscuity, found peace through Christian asceticism. A thousand years later, Martin Luther—a religious over-achiever whose efforts to please God never felt like enough—found peace in the idea of salvation by faith. Telling young Augustine he did not have to be good would only have worsened his struggles, but it was exactly the message young Luther needed to hear.
“O wad some Pow'r the giftie gie us
To see oursels as ithers see us!”
- Robert Burns, “To a Louse”
“We are what we pretend to be, so we must be careful
about what we pretend to be.”
- Kurt Vonnegut, Mother Night
Zoom still confuses me. In a group call set to gallery view, if we all lean left with the right arm raised, the screen shows me leaning the opposite way from everyone else. Apparently Zoom shows me in mirror image and the rest as if they’re facing me in a photo. Which gives a truer picture, the mirror or the photograph?
It’s human to care how others view us. This can reinforce community norms. On the other hand, it’s unhealthy to care too much. Aware and conscious may be synonyms, but self-awareness becomes a liability when it sinks into self-consciousness.
Then there’s the fake persona, the image projected in order to disguise the true self. That’s great when Anna whistles a happy tune in The King and I to suppress her fear. Not so great when it’s you or me unable to establish honest ties.
Some people seem to value image so highly it obliterates any separate sense of self. Nothing fake about it. Photo ops are all that matters. A recent mass shooter was said to have nurtured an idealized image of mass shooters online and shaped himself in that image.
How do you want to be remembered? What do you want carved in your gravestone? Personally, I’d rather forget all that in favor of connecting with the individual in front of me. How others see me is their business.
Eating sweet, juicy blackberries straight off the vine was an unexpected pleasure of last week’s Ice Age Trail segment behind a golf course. Thorns were the downside. Even when I resisted reaching in for another berry, branches blocking the path scratched my bare skin. I could have stayed home to weed, but then I’d have missed the walk. It’s all trade-offs.
Whether a decision is weighty or trivial, personal or global, moral or strategic, every option has pros and cons. Without costs or silver linings, there’s no choice to make. What to do is obvious.
In these contentious times, it can be hard to admit there’s right and wrong on both sides of the issues we debate so hotly. To acknowledge trade-offs doesn’t betray our deep convictions. Instead, it opens us to the humanity of those with whom we disagree. We may find similar pros and cons on their scales and ours. Sometimes we just weigh them differently.
How did our Founders view abortion when they wrote the Constitution? I doubt they gave it much thought. Morning sickness and lack of monthly bleeding were female ailments, best left to wives and midwives. How carnal knowledge related to childbirth was a mystery. The first sign of growth in the womb came at four or five months with “quickening,” when the mother felt the baby move. Perhaps that’s when a piece of the mother’s body broke free or the male seed sprouted. Perhaps disrupted cycles cracked the shells in which babies had waited since Creation, nested inside each other like Russian nesting dolls.
As in British common law, ending a pregnancy or “restoring the menses” was legal in America before quickening but not afterward. It was often done with herbs or patent medicines, and was most common among married, middle- or upper-class white Protestant women. Disapproval was reserved for unmarried women and Catholics, guilty of sex outside wedlock or for purposes other than procreation.
The American Medical Association, founded in 1847, campaigned to prohibit ending a pregnancy even before quickening. Why? Some say it was to protect women from poisoning by unregulated abortifacients; or to wrest control of women’s health from midwives; or to counter the declining white Protestant birthrate in reaction to rising immigration. By 1880, all states had laws to restrict abortion.
The science of conception was still unfolding; DNA lay decades in the future. Laws against abortion, like those on contraception, were more about public morals than what was happening in the womb. Does it matter what the Founders believed? Would they have thought differently if they knew what science later revealed? Should a notion of long legal tradition give more weight to the laws of 1820 or 1880? These are questions of public policy and judicial interpretation, not my sphere today. I’ll only say, if the cultural and legal history of abortion has a role in the discussion, let’s at least get the history right.
The cultural and political divisions of our time look nearly as impassable as the snow-covered peaks of the Continental Divide, back before railroads and airplanes. Other great North American drainage divides offer more inviting models. The one I know best separates the watershed of the Great Lakes/St. Lawrence River from that of the Mississippi River. Waters on one side flow to the North Atlantic; on the other, to the Gulf of Mexico.
This divide looks far from dramatic. It is low, often marshy, and scarcely visible to the casual observer. Centuries ago, before canals, spring rains made some marshes wet enough to paddle across. Otherwise, travelers had to carry goods and canoes overland from one watershed to the other. French fur traders called such crossings portages, from the French for “carry.” Major portages connected Green Bay with the Wisconsin River, Chicago with the Illinois River, and Cleveland with the Ohio River. Crossing them was tiring but not prohibitive.
The Gulf of St. Lawrence and the Gulf of Mexico are two thousand miles apart. That’s a plausible metaphor for today’s societal distance between right and left, conservative and progressive, red state and blue state. We’ll never all think alike, and it wouldn’t be healthy if we did. But the barrier shouldn’t have to consist of impenetrable mountains. What if we envisioned it as low and possible to traverse? Might we aspire to connect despite differences, and some days even to paddle across?
Image: The Chicago Portage by Edgar Spier Cameron, 1862-1944.
Our sprawling perennial garden delights me with constant surprises. What’s in bloom today? Dedicated gardeners appear to pot, divide, or transplant with pleasure. Me, I’d rather pull dandelions.
I studied the Emancipation Proclamation as a schoolgirl but didn’t hear of Juneteenth till much later. On June 19, 1865, the Union army announced in Texas that all formerly enslaved people were now free. Annual celebrations spread from Black communities in Texas throughout the South and beyond. Urged on by activists such as Opal Lee, Congress made Juneteenth our newest federal holiday last year. Lee said, “It is not a Black thing, it’s not just a Texas thing, but it’s about freedom for everybody.”
As a white northerner, how can I join in with joy and respect? The lines between cultural diffusion, appropriation, and assimilation are blurry. Already some businesses have introduced Juneteenth-themed products and faced backlash. Opinions online vary. Many say to study, recommit, and give the holiday the solemnity of Veterans Day or Memorial Day. Unlike those days, though, it’s fine to wish people a happy Juneteenth. Is it supportive or intrusive for me to enter into Juneteenth traditions like street fairs, rodeos, and barbecue cookouts featuring red drinks and desserts?
We know how to enjoy a festive wedding without making it all about us. We attend by invitation only; we don’t try to dress like the bride or groom; we take our cues from the organizers; we listen more than we talk. We have a great time and remember what it’s all about. This might be a model for a white northerner at a local, public Juneteenth celebration. Thanks to Opal Lee and others, we’re all invited.
Image: 1920s Juneteenth celebration, from a documentary film by Solomon Sir Jones. Beinecke Rare Book & Manuscript Library, Yale.
I furled my umbrella and passed the registration desk without stopping. As directed by email, I took the elevator to the third floor, bypassed another registration desk, and followed a young woman through a maze of hallways. Her desk held a tape measure, a blood pressure cuff, and a rack of eight or ten vials with brightly colored caps. An hour later I opened the umbrella and stepped back into the rain, lighter by several tablespoons of bodily fluids. My doctor would never know unless I decided to tell her.
All of Us is a nationwide program to support medical research on interactions among environment, lifestyle, and genetics. Organizers aim to build a database of a million adult volunteers who give researchers anonymous access to their medical records. Now that I’ve donated specimens and measurements, my role is to answer occasional surveys online. I’m always free to opt out.
Part of life is finding ways to contribute that fit one’s interests, skills, and opportunities, whether organizing a petition drive or feeding a hungry kitten. In this time of widespread distrust of science, one small way I can help is to be one of a million whose data researchers can draw on for years to come.
What once-common jobs can you think of that are now obsolete or nearly so? Town crier. Lamp lighter. Phone company switchboard operator. Wet nurse, a woman who breastfed someone else’s baby.
A less desirable alternative was to feed the baby milk from goats, cows, mares, or donkeys. Compared to human milk, cow’s milk contains less easy-to-digest carbohydrate and more protein in the form of hard-to-digest casein. Milk also spoils quickly without refrigeration or pasteurization. Babies fed only cow’s milk were less likely to survive infancy.
In 1865, the German chemist Justus von Liebig introduced Liebig’s Soup for Infants. Formulated to make cow’s milk more like human milk, it added wheat flour, malt flour, and potassium bicarbonate. A powdered version to mix with cow’s milk and water resolved the problem of spoilage. Soon afterward, Nestlé in Switzerland introduced a cereal composed of cow’s milk, wheat flour, and sugar, for infants who could not be breastfed. Such products were expensive. Most caregivers preferred to mix their own at home.
Harvard professor Thomas Morgan Rotch taught pediatricians to direct infant nutrition according to a “percentage method.” Caregivers were to dilute cow’s milk with water to reduce the percentage of casein, then add sugar and cream to restore the concentrations of sugar and fat. The process was cumbersome to do at home but achieved a close match to the percentages of protein, sugar, and fat in human milk.
By the 1950s, many hospitals gave new mothers recipes for formula made of evaporated milk, water, and sugar or corn syrup, with a liquid vitamin supplement on the side. Perhaps our present emergency has caregivers reviving such recipes. (Consult your pediatrician.) On the borderline between food and pharmaceuticals, commercial formula is closely regulated to keep babies safe. That makes the industry difficult to enter. Like so much in life, it’s all trade-offs.
Image: The bureau of wet nurses in Paris - wet nurses waiting to be selected. Aquatint, 1822. Wellcome Collection.
I'm a historian who writes novels and literary nonfiction. My home base is Madison, Wisconsin.