The warrior ethos is back – or did it ever go away? According to the U.S. Army, it includes focus, courage, persistence, and loyalty. According to the recently named War Department, it also includes misogyny, maximum lethality, freedom from restraint, and treatment of protesters as the enemy. What some call manliness, others call toxic masculinity.
Don’t get me wrong. Since high school, I’ve hated the terms feminine and masculine, which implied that having interests more common among boys made me less of a girl. I’m a fan of people being themselves regardless of stereotypes. But what if a young man feels drawn toward a self-image of traditional manhood? Must he give up that part of himself to avoid turning toxic?

Our hypothetical young man may find dictionary opposites of toxic—wholesome, beneficial, healthful, harmless—decent but hardly inspiring. He might be more energized by the classic, positive image of a man as protector and provider. He can protect his country by serving in the military, his friends by seeing them safely home at night, or his children by getting them vaccinated. He can provide by making household repairs, volunteering in his community, or bringing home an income. He has the self-assurance to share these roles with a partner or others. He has the perspective to affirm his chosen roles without demeaning the choices of others: the artist painting in his attic, the explorer in the Amazon jungle.

Not to say that men must be providers and protectors, nor that women should not. But for those who choose it, traditional manhood can be anything but toxic.

Image: National Park Service photograph
The ghosts have gone back to their graveyards. The witches have flown off on their broomsticks. At this time a year ago, the presidential election was almost upon us. This year I have friends living in fear even after the ghosts and witches have left.
Why do some scares delight us, while others keep us awake at night? The same state of arousal that prepares our bodies for fight or flight—rapid heartbeat, heightened blood pressure—can also feel like excitement, depending on context. For those who like risky adventure challenges, fear and thrill go hand in hand.

My circle of young moms long ago discussed when preschoolers are old enough to enjoy Where the Wild Things Are by Maurice Sendak. Three-and-a-half? Five? I don’t recall. The point was that younger tots found Sendak’s wild things too scary, no matter how cheerfully we read to them. Adult brains, too, find some fears too intense to enjoy. I read murder mysteries, spiced by an element of suspense, but horror films I’ll leave to others.

Nature, nurture, and post-traumatic stress disorder leave some of us more fearful than others. Beyond that, some of us are in genuinely more vulnerable situations. To those living in fear right now, I can’t say you’re wrong—whether you choose to fight, flee, or enjoy the excitement.

Image: Photo by Neven Krcmarek on Unsplash.

It gives me wry amusement to recall my youthful response to the civil rights demonstrations of the early 1960s. “Nobody ever changed anything by walking around waving signs,” I said. The aftermath of the March on Washington in 1963 proved me wrong. Some 250,000 people from all over the nation came to the National Mall on August 28 to protest racial discrimination. The next year Congress passed the Civil Rights Act of 1964.
Anti-war protests in the streets of the United States a few years later shifted public opinion toward ending American involvement in the Vietnam War. Meanwhile, an estimated 20 million people nationwide participated in Earth Day on April 22, 1970; this largest single-day demonstration in U.S. history led to passage of the Clean Air and Clean Water Acts and the creation of the Environmental Protection Agency. Protests in the weeks after police killed George Floyd in Minneapolis in May 2020 drew between 15 and 26 million people across the nation, spurring some state and local police reforms such as bans on chokeholds and no-knock warrants.

Crowd sizes are often contentious. Numbers matter as a measure of public support, a means to expand public awareness, and a factor in planning for public safety. Memorable disputes arose after the 1995 Million Man March and the 2017 presidential inauguration. Early attempts to be objective, by multiplying density (people per unit of space) by the total space covered, didn’t account for variations in how tightly people cluster or for arrivals and departures over time. Drone photography, satellite images, and AI models allow more accurate headcounts if weather and visibility cooperate.

That said, nearly 7 million estimated participants nationwide made the No Kings protest this October one of the largest in American history. What changes as a result, if anything, is yet to be seen.

Image: No Kings protest in Dallas, Texas, June 14, 2025. Photograph by Brendan Rogers.

Habits can be hard to break, especially those you’ve kept up for ages. Other habits, formed not so long ago and then skipped for a year or two, can be hard to recover. After spending more days this summer in the garden than the woods, I put too much trust in my Ice Age Trail (IAT) routines and mental maps when I finally got back to the trail last week.
I used to toss the IAT guidebook and atlas in the car before leaving home. This time it didn’t enter my head. Road names like Frenchtown and Storytown rang familiar, but I forgot how they related to where I wanted to go. Even after a stop at Kwik Trip for bananas, gas, and directions, it took twice the predicted time to locate the trailhead.

Bit by bit I re-learned what was once familiar. How to use trekking poles to stay upright on a stony, uneven surface. How to pause at forks in the path to make sure I’ll know the way back. How to still my mind to make space for the sounds of squirrels, birdsong, and wind in the upper branches.

I’ve read suggestions to recall what delighted you as a child and rediscover or adapt it later in life. Exploring unfamiliar woodlands has always brought me joy. Today I carry poles and a cell phone and am more likely to stick to the path, but the joy is the same. And the habits that support it are gradually coming back.

On Thursday I stowed all my short-sleeved tee shirts in the guestroom closet and brought the long-sleeved jerseys to hang near my bed. The outdoor temperature was in the forties. We may not top the low seventies again till spring.
It’s often struck me how law and culture define sharp boundaries, of necessity, for changes that are really gradual. When are you really mature enough to drink or drive, or to vote wisely? When does one become a person or an adult? How fast is it really safe to drive on a country road? By what date is a landlord required to heat the apartment?

This week I was struck by the personal need to do the same. Although switching the wardrobe from summer to winter, or the thermostat from cool to heat, isn’t tied to a specific date on the calendar, these markers of the turning season happen on a single day. Meanwhile the world orbits the sun, leaf colors deepen week by week, and the level on the outdoor thermometer bobs up and down. By any measure, fall is well and truly here.

In high school, I was secretary of every club I belonged to. Those were the days when secretaries were girls almost by definition, and the nerdy clubs I joined consisted mostly of boys. I didn’t mind. Taking minutes let me record events as I saw them, truthfully but with editorial comment. I might describe a discussion as interminable, or mention smirks and rolled eyes when someone said something particularly stupid.
By my twenties I’d reached a more adult understanding of responsibility. Impersonal writing had no place for personal bias. And yet . . .

My college seminar professor Mr. Blodgett taught us that history without bias was impossible. However hard you strive for objectivity, you must still choose what to include and how to arrange it. Researchers must decide what to investigate. Journalists must decide who to interview. An editor must decide which story to headline. Nor is “fair and balanced” a solution. Do flat earth and round earth merit equal weight?

Between 1950 and 1981, Americans heard Walter Cronkite’s news reports with confidence and respect. He was called the most trusted man in America. I haven’t heard that said of anyone else in decades. Advances in technology let us choose our separate silos, who we’ll trust and who we’ll dismiss. We debate not just policies but facts. The skeptics among us distrust everybody.

There’s no going back, but that doesn’t mean “anything goes.” Repeat as fact only what’s true; base claims on evidence; reveal possible conflicts of interest. Follow the scientific method or the journalists’ code of ethics. Every honest step helps, even if we can’t avoid bias altogether. Even Walter Cronkite had to decide which news was important enough to report.

Image: Newsman Walter Cronkite, “the most trusted man in America.”

At the height of the Cold War in the 1950s, the U.S. created training materials for school children to protect themselves in case of a nuclear explosion. Many older adults recall crouching under their desks, or in interior corridors, and tucking their heads under their arms for practice.
My school didn’t hold “duck and cover” drills. What enemy would waste nuclear weapons on Morgantown, West Virginia? However, I remember lining up in the playground for military-style metal “dog tags” embossed with name, religion, and probably more. We were to wear them on a chain around the neck, so our bombed bodies could be identified and given an appropriate burial.

The Soviet bomb didn’t fall, the Cold War ended, and today school children learn self-protection through active shooter drills instead. Déjà vu all over again? Two differences stand out. First, school shootings aren’t merely possible but occur at an alarming rate. An individual child’s risk is low, but one dead child is one too many. Second, the source of potential danger is no longer overseas but in our home neighborhoods. In the words of Walt Kelly’s cartoon possum Pogo, “We have met the enemy and he is us.”

Active shooter drills are controversial. Some children say preparation boosts their confidence. Others experience anxiety to the point of PTSD, especially if a drill comes unannounced and mimics a real shooting. Best would be to end school shootings in the first place. Common-sense laws to bar access to firearms except for recreation, hunting, or self-defense? Stronger mental health systems for youth? Ample funding for nonpartisan scientific research into gun violence and the effectiveness of current state-level preventive strategies would be a good place to start.

Image: The U.S. produced the pamphlet Duck and Cover in 1951. A film version followed the next year.

Crunch! Grind! Surely something is wrong with the car.
Five majestic oaks line our long Wisconsin driveway. I love the trees and the wildlife they support. Each September, though, they carpet the drive with the loose organic gravel Winnie-the-Pooh called “haycorns.” I take out the leaf blower with the rechargeable battery to clear acorns from the asphalt. At first it is fun. The battery runs down and I take a break. After it charges, I blow some more. An acorn falls on my head. By the second recharge, the fun is wearing off. My charge has run down too.

Like most things in life, it’s a trade-off. We always have choices, even if we don’t have the choice we’d prefer. Reframing “have to” as “choose to” gives me a sense of agency, even if the outcome is the same. I could cut down the oaks or leave the drive too bumpy for comfort. I’m not required to blow the acorns aside. I enjoy the freedom to choose this task over the alternatives.

There isn’t always one clear best choice. Socrates drank hemlock rather than hide his opinions, Galileo recanted rather than face torture for heresy, and I don’t respect either man the less. Other cases are more mundane. I wasn’t forced to cancel the picnic because of rain; I decided to. I didn’t have to stay home sick; I chose that over the risk of infecting others. It boosts my resilience to treat myself as an actor in my own life instead of a mere victim of circumstance or fate.

Blaming the younger generation is nothing new. “They think they know everything, and are always quite sure about it,” Aristotle wrote in ancient Greece. But naming distinct generations is a novelty. Credited to Gertrude Stein, it began with more sympathy than blame. “All of you young people who served in the war . . . You are all a lost generation,” Stein told Ernest Hemingway to describe the cynicism and disillusion of the 1920s.
Shared historical experiences give any generation a distinct consciousness, Hungarian sociologist Karl Mannheim wrote in 1928. Each cohort that grew up during wars, depression, the baby boom, rapid technological change, or the Covid pandemic shared a different life experience. Tom Brokaw’s book The Greatest Generation (1998) honored the men and women who fought in the Second World War and labored on the home front. Next came the more traditional, conformist generation of the 1950s. After a childhood of depression and war, they craved quiet family life with economic security. An article in Time magazine in 1951 stated, “The most startling fact about the younger generation is its silence.” A high birth rate (the baby boom) in families of the silent generation gave rise to a large, rebellious cohort of young adults in the 1960s. They weren’t called boomers till long after the fact.

The slightly bizarre practice of naming and dating every generation is largely a 21st-century phenomenon. Douglas Coupland’s novel Generation X: Tales for an Accelerated Culture (1991) unintentionally popularized a title for his own cohort, born after the baby boom. The theory of archetypal cycles in William Strauss and Neil Howe’s Generations (1991) didn’t catch on, but their term millennials did, meaning those who would come of age around the year 2000.

Classing individuals by year of birth risks stereotyping. It also offers a shorthand for real differences, like the job security of the boomers vs. the social media of the millennials. It has become so entrenched that we now name a cohort before it takes form. Instead of “lost,” “great,” or “silent,” after millennials came the placeholder Generations Z and Alpha. Our only certainty about Gen Beta, present-day infants, is that they will think they know everything, and will always be quite sure about it.

Image: Photo by Bill Fairs on Unsplash.

Fresh-picked vegetables fill the grocery bins to overflowing. Ragweed pollen sets my nose running nonstop. Here we are again, welcoming meteorological autumn, while the astronomical calendar says we’re in the last gasp of summer. Diminishing daylight is predictable to the minute. Plants ripen on their own terms when they’re ready.
As a historian, I tend to think in terms of linear time. Things happen from first to second to third, from beginning to end. The past never recurs exactly; context always changes. On the other hand, it’s said that history doesn’t repeat itself, but it rhymes. While I grew irreversibly from child to teen to college student and beyond, my girlhood summers ended with the start of school, year after year without fail. All time is both linear and cyclical.

For me, current events force the issue. Linear time isn’t the same as progress. I’m grateful for medical advances that keep me alive and technologies that keep me in touch with family far away. But the state of democracy, climate, and the war-torn world can make me question whether the long arc of the universe truly bends toward justice. At these moments, shifting to a cyclical perspective eases my spirit. Day will follow night. Spring will follow winter. And one of these weeks, frost will shut down the ragweed until next year.

Image: Photo by Aaron Burden on Unsplash.