Pulitzer Prize–winning poet Natasha Trethewey began her two-year tenure as United States Poet Laureate in 2012, becoming the first African American and the first Southerner in decades to receive the honor.
In “Articulation,” a poem from our June 2016 issue, Trethewey envisions her recently deceased mother after viewing Miguel Cabrera’s 18th-century portrait of Saint Gertrude, now held by the Dallas Museum of Art:
How not to see, in the saint’s image,
my mother’s last portrait—the dark backdrop,
her dress black as a habit, the bright edge
of her afro ringing her face with light? And how
not to recall her many wounds: ring finger
shattered, her ex-husband’s bullet finding
her temple, lodging where her last thought lodged?
Read the full poem here, and read about how Trethewey came to write “Elegy,” a poem for her father, here.
This week marks 157 years since Walt Whitman’s poetry first appeared in The Atlantic.
Now celebrated as “America’s Bard” and one of the country’s most widely read poets, Whitman first reached out to Atlantic co-founder Ralph Waldo Emerson from creative obscurity. In 1855 he sent Emerson a copy of his recently self-published poetry collection, Leaves of Grass, in hopes of expanding his readership. Though Emerson responded with a note of praise—which Whitman, to Emerson’s dismay, circulated in the press and even published in an expanded edition of the collection—Leaves of Grass failed to garner widespread attention.
Whitman’s next contact with The Atlantic resulted in the publication of “Bardic Symbols” (later reprinted under the title “As I Ebb’d With the Ocean of Life”) in 1860—though then-editor James Russell Lowell omitted two lines that he considered overly graphic. In the poem, Whitman responded to his would-be readers’ indifference with melancholy self-reflection:
As I wend the shores I know not,
As I listen to the dirge, the voices of men and women wrecked,
As I inhale the impalpable breezes that set in upon me,
As the ocean so mysterious rolls toward me closer and closer,
At once I find, the least thing that belongs to me, or that I see or
touch, I know not;
I, too, but signify a little washed-up drift,—a few sands and dead
leaves to gather,
Gather, and merge myself as part of the leaves and drift.
Luckily for Whitman, this period of creative frustration did not last.
Newly inspired during the Civil War, Whitman published a second collection of poems, Drum-Taps, and won the recognition and critical acclaim he had sorely lacked a decade earlier. The resulting change in his outlook is evident in “Proud Music of the Sea Storm,” his second poem to appear in The Atlantic, which ends on a note of creative triumph:
… what thou hast heard, O Soul, was not the sound of winds,
Nor dream of stormy waves, nor sea-hawks flapping wings, nor harsh scream,
Nor vocalism of sun-bright Italy,
Nor German organ majestic—nor vast concourse of voices—nor layers of harmonies;
Nor strophes of husbands and wives—nor sound of marching soldiers,
Nor flutes, nor harps, nor the different bugle-calls of camps;
But, to a new rhythmus fitted for thee,
Poems, vaguely wafted in night air, uncaught, unwritten,
Which, let us go forth in the bold day, and write.
In April 1904, more than a decade after his death, The Atlantic published Whitman’s writing for the final time. In a lecture he had prepared but never had the opportunity to deliver, he celebrates language, and particularly the language of America, at one point musing:
In America an immense number of new words are needed to embody the new political facts, the compact of the Declaration of Independence, and of the Constitution—the union of the States—the new States—the Congress—the modes of election—the stump speech—the ways of electioneering—addressing the people—stating all that is to be said in modes that fit the life and experience of the Indianian, the Michiganian, the Vermonter, the men of Maine. Also words to answer the modern, rapidly spreading faith of the vital equality of women with men, and that they are to be placed on an exact plane, politically, socially, and in business, with men. Words are wanted to supply the copious trains of facts, and flanges of facts, arguments, and adjectival facts, growing out of all new knowledges.
Whitman’s poetry is often held up as an embodiment of the enduring spirit of America. Paging through his presence in the archives, I was struck by how these works from his period of struggle resonate with the current national moment. With all the uncertainty and disunity of America today, I found it both illuminating and a little heartening to look at the country, the difficulties of expression, and the seeming public indifference through his eyes—and to consider the language, new or old, that might carry us through to brighter times.
Henry David Thoreau is something of a poster child for solitude. In his essay “Walking,” published just after his death in our June 1862 issue, Thoreau made the case “for absolute freedom and wildness … to regard man as an inhabitant, or a part and parcel of Nature, rather than a member of society”:
We should go forth on the shortest walk, perchance, in the spirit of undying adventure, never to return, prepared to send back our embalmed hearts only as relics to our desolate kingdoms. If you are ready to leave father and mother, and brother and sister, and wife and child and friends, and never see them again—if you have paid your debts, and made your will, and settled all your affairs, and are a free man—then you are ready for a walk.
Thoreau himself was “a genuine American weirdo,” as Jedediah Purdy recently put it, and solitude suited him: His relentless individualism irritated his friends, including Atlantic co-founder Ralph Waldo Emerson, who described Thoreau’s habit of contradicting every point in pursuit of his own ideals as “a little chilling to the social affections.” Emerson may have had Thoreau in mind when, in our December 1857 issue, he mused that “many fine geniuses” felt the need to separate themselves from the world, to keep it from intruding on their thoughts. Yet he questioned whether such withdrawal was good for a person, not to mention for society as a whole:
Thoreau in his second and final photographic sitting, August 1861 (Wikimedia)
This banishment to the rocks and echoes no metaphysics can make right or tolerable. This result is so against nature, such a half-view, that it must be corrected by a common sense and experience. “A man is born by the side of his father, and there he remains.” A man must be clothed with society, or we shall feel a certain bareness and poverty, as of a displaced and unfurnished member. He is to be dressed in arts and institutions, as well as body-garments. Now and then a man exquisitely made can live alone, and must; but coop up most men, and you undo them. …
When a young barrister said to the late Mr. Mason, “I keep my chamber to read law,”—“Read law!” replied the veteran, “’tis in the courtroom you must read law.” Nor is the rule otherwise for literature. If you would learn to write, ’tis in the street you must learn it. Both for the vehicle and for the aims of fine arts, you must frequent the public square. … Society cannot do without cultivated men.
Emerson concluded that the key to effective, creative thought was to maintain a balance between solitary reflection and social interaction: “The conditions are met, if we keep our independence, yet do not lose our sympathy.”
Four decades later, in our November 1901 issue, Paul Elmer More identified a radical sympathy in the work of Nathaniel Hawthorne, which stemmed, he argued, from Hawthorne’s own “imperial loneliness of soul”:
Hester Prynne, the lonely protagonist of Hawthorne’s The Scarlet Letter (Wikimedia)
His words have at last expressed what has long slumbered in human consciousness. … Not with impunity had the human race for ages dwelt on the eternal welfare of the soul; for from such meditation the sense of personal importance had become exacerbated to an extraordinary degree. … And when the alluring faith attendant on this form of introspection paled, as it did during the so-called transcendental movement into which Hawthorne was born, there resulted necessarily a feeling of anguish and bereavement more tragic than any previous moral stage through which the world had passed. The loneliness of the individual, which had been vaguely felt and lamented by poets and philosophers of the past, took on a poignancy altogether unexampled. It needed but an artist with the vision of Hawthorne to represent this feeling as the one tragic calamity of mortal life, as the great primeval curse of sin … the universal protest of the human heart.
Fast-forward a century, and what More described as “the solitude that invests the modern world” had only deepened—while “the sense of personal importance” gained new narcissistic vehicles in the form of social-media tools that let us “connect” online while keeping our real, messy selves as private as we choose. Which is not a bad thing: In some ways, the internet looks like the perfect way to achieve Emerson’s ideal balance between independent thought and social engagement.
In our May 2012 issue, however, Stephen Marche wondered whether the rise of social media was making us lonely:
A considerable part of Facebook’s appeal stems from its miraculous fusion of distance with intimacy, or the illusion of distance with the illusion of intimacy. Our online communities become engines of self-image, and self-image becomes the engine of community. The real danger with Facebook is not that it allows us to isolate ourselves, but that by mixing our appetite for isolation with our vanity, it threatens to alter the very nature of solitude.
The new isolation is not of the kind that Americans once idealized, the lonesomeness of the proudly nonconformist, independent-minded, solitary stoic, or that of the astronaut who blasts into new worlds. Facebook’s isolation is a grind. What’s truly staggering about Facebook usage is not its volume—750 million photographs uploaded over a single weekend—but the constancy of the performance it demands. More than half its users—and one of every 13 people on Earth is a Facebook user—log on every day. Among 18-to-34-year-olds, nearly half check Facebook minutes after waking up, and 28 percent do so before getting out of bed. The relentlessness is what is so new, so potentially transformative. Facebook never takes a break. We never take a break.
The same year, Brian Patrick Eha also noted the changing nature of solitude—particularly the kind of solitude achieved by wearing headphones in public. “We are each of us cocooned in noise,” he wrote, “and can escape from one another’s only when immersed in our own.” For both Marche and Eha, the problem with technology is not its tendency to isolate people so much as the way it works to prevent us—through a sense of connection or simply through distraction—from fully experiencing that isolation and all it entails.
And as the author Dorthe Nors explained in 2014 for our By Heart series of writer interviews, a full experience of isolation has serious benefits:
The artistic process unfolds in the lonely hours. That’s when the work happens. You have to control the creative energy that you’ve got. You have to discipline yourself to fulfill it. And that work only happens alone.
Solitude, I think, heightens artistic receptivity in a way that can be challenging and painful. When you sit there, alone and working, you get thrown back on yourself. Your life and your emotions, what you think and what you feel, are constantly being thrown back on you. And then the “too much humanity” feeling is even stronger: you can’t run away from yourself. You can’t run away from your emotions and your memory and the material you’re working on. Artistic solitude is a decision to turn and face these feelings, to sit with them for long periods of time.
For Nors, as for Hawthorne, solitude not only enables personal reflection, but also grants access to some deeper, more universal strain of human feeling. That’s the same lesson that Nathaniel Rich, writing in our latest issue, took from the story of Christopher Knight, who spent 27 years living utterly alone in the woods of Maine:
Since his arrest in April 2013, Knight has agreed to be interviewed by a single journalist. Michael Finkel published an article about him in GQ in 2014 and has now written a book, The Stranger in the Woods, that combines an account of Knight’s story with an absorbing exploration of solitude and man’s eroding relationship with the natural world. Though the “stranger” in the title is Knight, one closes the book with the sense that Knight, like all seers, is the only sane person in a world gone insane—that modern civilization has made us strangers to ourselves.
Yet a total withdrawal from civilization can’t be the answer—nor, at a political moment when empathy and understanding seem ever more urgently needed, can walling yourself off from other people’s ideas be wise. In February, Emma Green offered this critique of a new book by Rod Dreher, a conservative Christian thinker who calls for like-minded members of his faith to withdraw from public life into communities of their own:
Dreher wrote The Benedict Option for people like him—those who share his faith, convictions, and feelings of cultural alienation. But even those who might wish to join Dreher’s radical critique of American culture, people who also feel pushed out and marginalized by the shallowness of modern life, may feel unable to do so. Many people, including some Christians, feel that knowing, befriending, playing with, and learning alongside people who are different from them adds to their faith, not that it threatens it. For all their power and appeal, Dreher’s monastery walls may be too high, and his mountain pass too narrow.
So, tell us about your experience: How do you incorporate solitary reflection into a 21st-century lifestyle? Can you see communitarian benefits in spending more time on your own—or, on the other hand, point to what society loses when more people spend more time alone? Please send your answers (and your questions) to hello@theatlantic.com.
Here’s how an Atlantic author reflected on childhood, and the haste to leave it, in September 1858:
Full of anticipations, full of simple, sweet delights, are these [childhood] years, the most valuable of [a] lifetime. Then wisdom and religion are intuitive. But the child hastens to leave its beautiful time and state, and watches its own growth with impatient eye. Soon he will seek to return. The expectation of the future has been disappointed. Manhood is not that free, powerful, and commanding state the imagination had delineated. And the world, too, disappoints his hope. He finds there things which none of his teachers ever hinted to him. He beholds a universal system of compromise and conformity, and in a fatal day he learns to compromise and conform.
But it wasn’t until the 20th century that scientists began to seriously study child development. In our July 1961 issue, Peter B. Neubauer heralded “The Century of the Child”:
Gone is the sentimental view that childhood is an era of innocence and the belief that an innate process of development continuously unfolds along more or less immutable lines. Freud suggested that, from birth on, the child’s development proceeds in a succession of well-defined stages, each with its own distinctive psychic organization, and that at each stage environmental factors can foster health and achievement or bring about lasting retardation and pathology. …
Freudian psychology does not, as some people apparently imagine, provide a set of ready-made prescriptions for the rearing of children. … The complexity of the interactions between mother and child cannot be reduced to rigid formulas. Love and understanding cannot be prescribed, and if they are not genuinely manifested, the most enlightened efforts to do what is best for the child may not be effective.
According to this view, children weren’t miniature adults, but they were preparing for adulthood. Growing up was a process that had to be managed by adults, which made the boundaries of childhood both more important and more nebulous.
A few years later, in our October 1968 issue, Richard Poirier described the backlash to a wave of campus protests as “The War Against the Young.” He implored older adults to take young people’s ideas seriously:
It is perhaps already irrelevant, for example, to discuss the so-called student revolt as if it were an expression of “youth.” The revolt might more properly be taken as a repudiation by the young of what adults call “youth.” It may be an attempt to cast aside the strangely exploitative and at once cloying, the protective and impotizing concept of “youth” which society foists on people who often want to consider themselves adults.
What’s more, Poirier argued, idealism shouldn’t just be the province of the young:
If young people are freeing themselves from a repressive myth of youth only to be absorbed into a repressive myth of adulthood, then youth in its best and truest form, of rebellion and hope, will have been lost to us, and we will have exhausted the best of our natural resources.
But how much redefinition could adulthood handle? In our February 1975 issue, Midge Decter addressed an anxious letter to that generation of student revolutionaries, who—though “no longer entitled to be called children”—had not yet fulfilled the necessary rites of passage for being “fully accredited adults”:
Why have you, the children, found it so hard to take your rightful place in the world? Just that. Why have your parents’ hopes for you come to seem so impossible of attainment?
Some of their expectations were, to be sure, exalted. … But … beneath these throbbing ambitions were all the ordinary—if you will, mundane—hopes that all parents harbor for their children: that you would grow up, come into your own, and with all due happiness and high spirit, carry forward the normal human business of mating, home-building, and reproducing—replacing us, in other words, in the eternal human cycle. And it is here that we find ourselves to be most uneasy, both for you and about you.
Decter blamed this state of affairs on overindulgent parenting: Adults, she argued, had failed their children by working too hard to protect them from unhappiness and by treating their “youthful rebellion” with too much deference.
The next decades’ developments in child psychology gave parents new advice. In our March 1987 issue, Bruno Bettelheim stressed the importance of letting kids guide their own play, without parents pushing them to obey rules they aren’t yet developmentally ready for. And in our February 1990 issue, Robert Karen outlined attachment theorists’ recommendations for how to “enable children to thrive emotionally and come to feel that the world of people is a positive place”—standards measured in part by a baby’s willingness to explore apart from its mother.
Were these parenting styles encouraging kids’ independence, or failing to push them hard enough? A generation after Decter, Lori Gottlieb also worried about parental indulgence, in her 2011 Atlantic piece “How to Land Your Kid in Therapy”:
The message we send kids with all the choices we give them is that they are entitled to a perfect life—that, as Dan Kindlon, the psychologist from Harvard, puts it, “if they ever feel a twinge of non-euphoria, there should be another option.” [Psychologist Wendy] Mogel puts it even more bluntly: what parents are creating with all this choice are anxious and entitled kids whom she describes as “handicapped royalty.” …
When I was my son’s age, I didn’t routinely get to choose my menu, or where to go on weekends—and the friends I asked say they didn’t, either. There was some negotiation, but not a lot, and we were content with that. We didn’t expect so much choice, so it didn’t bother us not to have it until we were older, when we were ready to handle the responsibility it requires. But today, [psychologist Jean] Twenge says, “we treat our kids like adults when they’re children, and we infantilize them when they’re 18 years old.”
In her April 2014 article “The Overprotected Kid,” Hanna Rosin lamented the loss of independence that once helped kids come of age:
One common concern of parents these days is that children grow up too fast. But sometimes it seems as if children don’t get the space to grow up at all; they just become adept at mimicking the habits of adulthood. As [geographer Roger] Hart’s research shows, children used to gradually take on responsibilities, year by year. They crossed the road, went to the store; eventually some of them got small neighborhood jobs. Their pride was wrapped up in competence and independence, which grew as they tried and mastered activities they hadn’t known how to do the previous year. But these days, middle-class children, at least, skip these milestones. They spend a lot of time in the company of adults, so they can talk and think like them, but they never build up the confidence to be truly independent and self-reliant.
Yet how exactly do you measure “true” independence and self-reliance? And what’s the final milestone that marks the transition to adulthood? Decter suggested it’s settling down with a stable career and a family. But in her 2016 Atlantic piece “When Are You Really an Adult?,” Julie Beck places that rite of passage in historical context:
The economic boom that came after World War II made Leave It to Beaver adulthood more attainable than it had ever been. Even for very young adults. There were enough jobs available for young men, [historian Steven] Mintz writes, that they sometimes didn’t need a high-school diploma to get a job that could support a family. And social mores of the time strongly favored marriage over unmarried cohabitation. Hence: job, spouse, house, kids. But this was a historical anomaly. …
Many young people, [psychologist Jeffrey] Jensen Arnett says, still want these things—to establish careers, to get married, to have kids. (Or some combination thereof.) They just don’t see them as the defining traits of adulthood. Unfortunately, not all of society has caught up, and older generations may not recognize the young as adults without these markers. A big part of being an adult is people treating you like one, and taking on these roles can help you convince others—and yourself—that you’re responsible.
So, adults: What convinced you? Many readers have discussed the topic already, and we’d like to reopen the call for your stories—this time with an eye to the gaps between what it takes to feel like an adult and what it takes to be seen as one. Did you feel you’d become an adult long before you got treated like one? Or have you passed the markers of adulthood without quite feeling you’ve fully grown up? If you’re a parent, when did you feel your kids had grown up, or what will it take to make you certain? Please send your answers—and questions—to hello@theatlantic.com.
Alexandra of Russia and her son Alexei, photographed between 1910 and 1913 (Library of Congress)
Today is International Women’s Day. It also happens to be the 100th anniversary of the start of the revolution that brought down the Russian empire. Given the coincidence, I was delighted to find in our archives an article from our January 1928 issue titled “The Fall of the Russian Empire: The Part Played by a Woman”—that is, until I read author Edmund Walsh’s assessment of exactly what that “part” was:
Russia was the last island fortress of absolutism in the rising tide of democracy, the outstanding anachronism of the twentieth century. … It defied the elements for three hundred years—until the deluge came. Whose hand unloosed the flood gates? In my opinion, a woman, all unconsciously, had more to do with the final debacle than any other single cause. … History probably will clear the memory of Alexandra Feodorovna [of treason, but] it can never clear her memory of tendencies, practices, and imprudences that contributed notably to Russia's ruin. The domination which this imperious, proud, aloof, and resolute woman exercised over her irresolute and impressionable husband became such a menace that more than one grand duke, duchess, and general cried out in warning against it. …
Revolutions are made by men and women determining events. Men are swayed by powerful human emotions. Women create them. And the master passion, particularly in neurotic females, can be as elegantly indifferent to the realities of life and war as ever Montesquieu was to the existence of God.
It’s a fascinating historical document, undeniably sexist in its overtones. The gist of Walsh’s argument is that the Tsarina Alexandra, driven by fear for the health of her hemophiliac son, gave the self-proclaimed holy man and healer Grigori Rasputin a level of influence that irrevocably weakened the Russian government. For evidence, Walsh delves into the embarrassing intimacies of Alexandra’s letters to her husband. And he criticizes the empress on two familiar, contradictory fronts: On the one hand, she’s weak and overly emotional, too much guided by motherly worries to see the bigger picture of Russian politics. On the other, she’s aggressive and overly domineering, stepping outside her proper sphere of childrearing to advise her husband on governance. She’s portrayed as a femme fatale, making a “subtle approach to political questions … through the gateway of the Tsar’s affections.” But she’s not granted agency either: Walsh argues she brought about the fall of the empire “all unconsciously.” She is, like female leaders still are, damned for the stereotypes of womanhood she does fulfill and damned for the ones she does not.
But none of this is to dispute the chain of events that Walsh describes. Alexandra and her husband did fail at governance: For any leader, male or female, it’s a heartbreaking reality that even the safety of one’s own family must come second to the national interest. And for all the sexism embedded in Walsh’s narrative, I agree with his central point that “revolutions are made by men and women determining events.” What struck me, reading this article today, was Alexandra’s simple human vulnerability, and my own reaction to it—my inclination to sweep this unflattering story under the rug. When we seek to recognize the women of history, what do we do with the history that reveals individual women as less than admirable? How do we celebrate women—our role models, ourselves—as powerful, vulnerable, fully complex humans, flaws intact?
If you have thoughts on that question, please let us know. It reminded me of a comment about female characters I’d seen recently from a reader during TAD’s book-club discussion of Margaret Atwood’s The Handmaid’s Tale. The dystopian government of that novel is built on the oppression of women; its central character, Offred, bears witness to that oppression but doesn’t fight back. The reader wrote:
When I started reading The Handmaid's Tale before Christmas, someone (a LADY before you get all "pffft men!") told me that the women of the novel, especially Offred, bothered her. It took me a long time to settle into why Offred might be kind of frustrating—and it's because I think, as women, we all want her to fight, to struggle, to know that what is happening to her is wrong. But she's a product of her conditioning, isn't she? And she's been conditioned so well. She passively allows things to happen to her. She just rides along, not speaking up or out or anything.
But as another reader pointed out, “Offred is like a lot of folks. When it comes down to it. It's easier to put your head down and survive.” And the first reader agreed, summing it up:
I think we expect women characters to be strong nowadays. We’re shocked when they aren’t.
We’ve come a long way since 1928, when portrayals of women like Walsh’s were more or less the norm. Over decades—centuries, for that matter—women have worked to prove that they are strong and brave and smart; that they are leaders and revolutionaries; that they are more than mothers only; that motherhood is no lesser thing. It’s been proven time and again, but women still must demand the freedom to be imperfect—which is no more and no less than what every human deserves.
Is there an upside to growing older? That’s the question reader John Harris has been asking himself lately. He’s not alone: In 1862, one of The Atlantic’s founders, Ralph Waldo Emerson, wondered the same thing. Acknowledging that “the creed of the street is, Old Age is not disgraceful, but immensely disadvantageous,” Emerson set out to explain the upsides of senescence. A common theme is the sense of serenity that comes with age and experience:
Youth suffers not only from ungratified desires, but from powers untried, and from a picture in his mind of a career which has, as yet, no outward reality. He is tormented with the want of correspondence between things and thoughts. … Every faculty new to each man thus goads him and drives him out into doleful deserts, until it finds proper vent. … One by one, day after day, he learns to coin his wishes into facts. He has his calling, homestead, social connection, and personal power, and thus, at the end of fifty years, his soul is appeased by seeing some sort of correspondence between his wish and his possession. This makes the value of age, the satisfaction it slowly offers to every craving. He is serene who does not feel himself pinched and wronged, but whose condition, in particular and in general, allows the utterance of his mind.
By 1928, advances in medicine had made it more possible to take a long lifespan for granted. In an Atlantic article titled “The Secret of Longevity” (unavailable online), Cary T. Grayson noted that “probably at no other time in the history of the human race has so much attention been paid to the problem of prolonging the span of life.” He offered a word of warning:
Any programme which has for its object the prolongation of life must also have, accompanying this increased span of life, the ability of the individual to engage actively and with some degree of effectiveness in the affairs of life. Merely to live offers little to the individual if he has lost the ability to think, to grieve, or to hope. There is perhaps no more depressing picture than that of the person who remains on the stage after his act is over.
On the other hand, as Cullen Murphy contended in our January 1993 issue, an eternity spent with no decrease in faculties wouldn’t necessarily be desirable either:
There are a lot of characters in literature who have been endowed with immortality and who do manage to keep their youth. Unfortunately, in many cases nobody else does. Spouses and friends grow old and die. Societies change utterly. The immortals, their only constant companion a pervading loneliness, go on and on. This is the pathetic core of legends like those of the Flying Dutchman and the Wandering Jew. In Natalie Babbitt’s Tuck Everlasting, a fine and haunting novel for children, the Tuck family has inadvertently achieved immortality by drinking the waters of a magic spring. As the years pass, they are burdened emotionally by an unbridgeable remoteness from a world they are in but not of.
Since antiquity, Murphy wrote, literature has had a fairly united stance on immortality: “Tamper with the rhythms of nature and something inevitably goes wrong.” After all, people die to make room for more people, and pushing lifespans beyond their ordinary limits risks straining resources as well as reshaping families.
Charles C. Mann examined some of those potential consequences in his May 2005 Atlantic piece “The Coming Death Shortage,” predicting a social order increasingly stratified between “the very old and very rich on top … a mass of the ordinary old … and the diminishingly influential young.” Presciently, a few years before the collapse of the real-estate bubble that wiped out millions of Americans’ retirement savings, Mann outlined the effects of an increased proportion of older people in the workforce:
When lifespans extend indefinitely, the effects are felt throughout the life cycle, but the biggest social impact may be on the young. According to Joshua Goldstein, a demographer at Princeton, adolescence will in the future evolve into a period of experimentation and education that will last from the teenage years into the mid-thirties. … In the past the transition from youth to adulthood usually followed an orderly sequence: education, entry into the labor force, marriage, and parenthood. For tomorrow’s thirtysomethings, suspended in what Goldstein calls “quasi-adulthood,” these steps may occur in any order.
In other words, Emerson’s period of “ungratified desires and powers untried” would be extended indefinitely. Talk about doleful deserts! On top of such Millennial malaise, Mann also predicted increased marital stress, declining birth rates, a depleted labor force, and a widespread economic slowdown as the world’s most powerful nations entered a “longevity crisis.”
But that’s just one vision. Another came from Gregg Easterbrook, who anticipated “a grayer, quieter, better future” in his October 2014 Atlantic article “What Happens When We All Live to 100?” His argument has some echoes of Emerson’s, but with modern science to back it up:
Neurological studies of healthy aging people show that the parts of the brain associated with reward-seeking light up less as time goes on. Whether it’s hot new fashions or hot-fudge sundaes, older people on the whole don’t desire acquisitions as much as the young and middle-aged do. Denounced for generations by writers and clergy, wretched excess has repelled all assaults. Longer life spans may at last be the counterweight to materialism.
Deeper changes may be in store as well. People in their late teens to late 20s are far more likely to commit crimes than people of other ages; as society grays, the decline of crime should continue. Violence in all guises should continue downward, too. … Research by John Mueller, a political scientist at Ohio State University, suggests that as people age, they become less enthusiastic about war. Perhaps this is because older people tend to be wiser than the young—and couldn’t the world use more wisdom?
It’s a good point. Couldn’t we all use more wisdom, more experience, more opportunities to learn? Wouldn’t we make better use of our lives if our lives went on forever? Not so fast, Olga Khazan wrote last month:
A common fear about life in our brave, new undying world is that it will just be really boring, says S. Matthew Liao, director of the Center for Bioethics at New York University. Life, Liao explained, is like a party—it has a start and end time. … “But imagine there’s a party that doesn’t end,” he continued. “It would be bad, because you’d think, ‘I could go there tomorrow, or a month from now.’ There’s no urgency to go to the party anymore.”
The Epicureans of ancient Greece thought about it similarly, [psychologist Sheldon] Solomon said. They saw life as a feast: “If you were at a meal, you’d be satiated, then stuffed, then repulsed,” he said. “Part of what makes each of us uniquely valuable is the great story. We have a plot, and ultimately it concludes.”
Even so, some futurists believe immortality is within reach.
So, what do you think: Is there a limit to how long people should live? Is it selfish to want eternity for yourself, or would having even a few immortals around make the world better for everyone? Here’s one reader’s take:
This reminds me a bit of the cylons in the “new” Battlestar Galactica.
With the ability to reincarnate infinitely, and be effectively immortal, they were callous towards humans, and killed humans with impunity. It was only when their ability to reincarnate was ended and they became effectively mortal (and thus subject to basically the same rules of death as humans) that they were driven to behave in a moral way.
But another reader argues:
I for one think the world would be a better place if we collectively took a longer view, and what better way to do that than to give everyone a stake in it?
What the internet does to the mind is something of an eternal question. Here at The Atlantic, in fact, we pondered that question before the internet even existed. Back in 1945, in his prophetic essay “As We May Think,” Vannevar Bush outlined how technology that mimics human logic and memory could transform “the ways in which man produces, stores, and consults the record of the race”:
Presumably man’s spirit should be elevated if he can better review his shady past and analyze more completely and objectively his present problems. He has built a civilization so complex that he needs to mechanize his records more fully if he is to push his experiment to its logical conclusion and not merely become bogged down part way there by overtaxing his limited memory. His excursions may be more enjoyable if he can reacquire the privilege of forgetting the manifold things he does not need to have immediately at hand, with some assurance that he can find them again if they prove important.
Bush didn’t think machines could ever replace human creativity, but he did hope they could make the process of having ideas more efficient. “Whenever logical processes of thought are employed,” he wrote, “there is opportunity for the machine.”
Fast-forward six decades, and search engines had claimed that opportunity, acting as a stand-in for memory and even for association. In his October 2006 piece “Artificial Intelligentsia,” James Fallows confronted the new reality:
If omnipresent retrieval of spot data means there’s less we have to remember, and if categorization systems do some of the first-stage thinking for us, what will happen to our brains?
I’ve chosen to draw an optimistic conclusion, from the analogy of eyeglasses. Before corrective lenses were invented, some 700 years ago, bad eyesight was a profound handicap. In effect it meant being disconnected from the wider world, since it was hard to take in knowledge. With eyeglasses, this aspect of human fitness no longer mattered in most of what people did. More people could compete, contribute, and be fulfilled. …
It could be the same with these new computerized aids to cognition. … Increasingly we all will be able to look up anything, at any time—and, with categorization, get a head start in thinking about connections.
But in his July 2008 piece “Is Google Making Us Stupid?,” Nicholas Carr was troubled by search engines’ treatment of information as “a utilitarian resource to be mined and processed with industrial efficiency.” And he questioned the idea that artificial intelligence would make people’s lives better:
It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.
Even as Carr appreciated the ease of online research, he felt the web was “chipping away [his] capacity for concentration and contemplation.” It was as if the rote tasks of research and recall, far from wasting innovators’ time, were actually the building blocks of more creative, complex thought.
On the other hand, “you should be skeptical of my skepticism,” as Carr put it. And from the beginning, one great benefit of the internet was that it brought people in contact not just with information, but with other people’s ideas. In April 2016, Adrienne LaFrance reflected on “How Early Computer Games Influenced Internet Culture”:
In the late 1970s and early 1980s, game makers—like anyone who found themselves tinkering with computers at the time—were inclined to share what they learned, and to build on one another’s designs. … That same culture, and the premium it placed on openness, would eventually carry over to the early web: a platform that anyone could build on, that no one person or company could own. That idea is at the heart of what proponents for net neutrality are trying to protect—that is, the belief that openness is a central value, perhaps even the foundational value, of what is arguably the most important technology of our time.
But as tech culture evolved and pervaded life outside the web, even its problem-solving methods began to seem reductive at times. Ian Bogost outlined that paradox in November 2016 when a new product called ketchup leather was billed as the “solution” to soggy burgers:
The technology critic Evgeny Morozov calls this sort of thinking “solutionism”—the belief that all problems can be solved by a single and simple technological solution. … Morozov is concerned about solutionism because it recasts social conditions that demand deeper philosophical and political consideration as simple hurdles for technology. …
But solutionism has another, subtler downside: It trains us to see everything as a problem in the first place. Not just urban transit or productivity, but even hamburgers. Even ketchup!
So, what’s your personal experience of how the internet affects creativity? Can you point to a digital distraction—Netflix, say, or Flappy Bird—that’s enriched your thinking in other areas of your life? On the flip side of the debate, can you point to a tool like email or Slack that’s sharpened your efficiency but narrowed the scope of your ideas? We’d like to hear your stories; please send us a note: hello@theatlantic.com.
The poet Thomas Lux died on February 5. It seems fitting to honor him and his decades of Atlantic contributions with a brief history, but also with his own words in his own voice.
Speaking about his craft in an Atlantic interview from 2004, Lux is both a magpie of unusual facts (“Without the dung beetle we’d all be up to our clavicles in cow pies. They deserve an ode!”) and a defender of poetry’s essential weirdness:
I love mystery, strangeness, nuttiness, wildness, leaps across chasms, irreverence, all the crazy stuff we love about poetry. We don’t usually love poems because they are well made, or smart, or deep. We love them for their crazy hearts.
In the nine poems Lux published in our pages, you’ll find wry humor—1984’s “Snake Lake” begins:
My friends, I hope you will not swim here:
this lake isn’t named for what it lacks.
And you’ll find startling echoes of the present in “Henry Clay’s Mouth” (1999):
He said: “Kissing is like the presidency,
it is not to be sought and not to be declined.”
…
It was written, if women had the vote,
he would have been President,
kissing everyone in sight,
dancing on tables (“a grand Terpsichorean
performance ...”), kissing everyone,
sometimes two at once, kissing everyone,
the almost-President
of our people.
Years ago, as part of a series for poetry month, we gathered a selection of old Atlantic audio recordings of poets reading their works. My part was to convert the files from an obsolete, unplayable format to mp3. Among them was Lux’s reading of “Virgule,” an ode to “/” that begins:
What I love about this little leaning mark
is how it divides
without divisiveness. The left
or bottom side prying that choice up or out,
the right or top side pressing down upon
its choice: either/or,
his/her.
Listen to him read the entire poem:
Far more qualified people can speak to his particular brilliance—I’m just someone who tried to rescue his voice, or a minute and 38 seconds of it, from the online abyss and deliver him to you.
I asked my colleague David Barber, the Atlantic’s poetry editor, for his memories of the magazine’s long history with Lux. He writes:
Tom Lux’s quirky, wily, incorrigibly uncanny poems left their mark far and wide from way back, but The Atlantic could be said to have a special claim on him.
For one thing, he was a local boy made good: Born and raised in Northampton, Mass., where his father ran a dairy farm, he was a fixture for many years in Boston and its environs, home base to the august bewhiskered poets who founded the magazine in 1857. His editor at Boston’s Houghton Mifflin for several of his celebrated collections was Peter Davison, the Atlantic’s late longtime poetry editor and literary lion of parts. His work appeared early and often in these pages over those years, immediately recognizable for its mordant wit, offbeat verve, and matchless knack for musing beguilingly on just about anything. The only predictable trait of a Lux poem was that it would be the one and only thing of its kind.
It’s the weariest of clichés to say that a certain poet sounds like none other. Lux was the real McCoy. It’s there in the deadpan delivery, the sure comic timing, the live-wire ear for oddball lingo and kooky hearsay, the slyboots way of spinning tall tales out of small talk. His bittersweet satirical bent belongs to no school or tribe; his smarts and chops were his and his alone. Is there another American poet since Stevens who conjured up so many humdinger titles? Could anyone else have composed an ode to the secret life-force of a punctuation mark? Was there ever a laconic elegy for long-gone summertimes quite as definitively disarming as “The Man Into Whose Yard You Should Not Hit Your Ball”?
At an event marking the start of Black History Month, President Trump gave a very Trumpian shoutout to Frederick Douglass, who, he said, “is an example of somebody who’s done an amazing job and is getting recognized more and more, I notice.” The vague nature of the praise has drawn scorn from some corners of the Internet, but let’s not be churlish: Frederick Douglass was a king among men. So let us continue with our now-annual tradition of reacquainting you with his brilliant and prescient mind. It’s a valuable exercise in part because Douglass’s preoccupations are still very much the topics of contemporary political debate.
Man is the only government-making animal in the world. His right to a participation in the production and operation of government is an inference from his nature, as direct and self-evident as is his right to acquire property or education. It is no less a crime against the manhood of a man, to declare that he shall not share in the making and directing of the government under which he lives, than to say that he shall not acquire property and education.
The arm of the Federal government is long, but it is far too short to protect the rights of individuals in the interior of distant States. They must have the power to protect themselves, or they will go unprotected, spite of all the laws the Federal Government can put upon the national statute-book.
In an 1853 letter to Harriet Beecher Stowe that exemplified his interest in vocational education, Douglass didn’t call for the creation of new colleges to serve African Americans. Instead, he sought schools that would teach "agriculture and the mechanic arts." Douglass argued that teaching vocational skills—especially industrial ones—to African Americans would help them climb the ladder from slave to integrated freeman. A prosperous, upwardly mobile African-American working class would, he thought, offer a profound refutation of many pro-slavery arguments, which held that African Americans were incapable of economic self-sufficiency.
Douglass’s interest in practical training is in harmony with the thinking behind contemporary efforts to expand economic opportunity—especially to students from disadvantaged backgrounds—by promoting vocational programs and targeted enterprises such as Project Lead the Way. But his memoirs demonstrate that practical training is hardly enough. In his 1845 autobiography, the Narrative of the Life of Frederick Douglass, an American Slave, he chronicles his efforts to fashion an identity as a free man, offering a bracing portrait not only of the physical hardships of slavery but also of its psychological torments. "You have seen how a man was made a slave; you shall see how a slave was made a man," Douglass wrote.
If intelligence is the only true and rational basis of government, it follows that that is the best government which draws its life and power from the largest sources of wisdom, energy, and goodness at its command. The force of this reasoning would be easily comprehended and readily assented to in any case involving the employment of physical strength. We should all see the folly and madness of attempting to accomplish with a part what could only be done with the united strength of the whole. Though his folly may be less apparent, it is just as real when one-half of the moral and intellectual power of the world is excluded from any voice or vote in civil government.
And here he is, as my colleague Ta-Nehisi Coates lays out, demonstrating radical empathy across a divide so enormous it seems almost impassable.
It is a real and praiseworthy accomplishment for Douglass’s name to keep spreading. But the frequent, and often valid, critique of Black History Month is that it encourages a tokenist approach to African American culture, leading everyone from national leaders to elementary-school teachers to recite a catechism of well-known figures, producing shallow engagement and privileging a passé Great Man (and Woman) theory of history.
There’s no use praising Frederick Douglass merely to consign him to the dustbin of history. He had plenty of things to say about the issues we’re still grappling with every day. And we’ll still be here, talking about voting rights and gender equality and cultural divides and the limits of federal power and, yes, Frederick Douglass, long after Black History Month is past.
Confession: I’ve never read one of Saul Bellow’s novels. (If you’ve got a strong case to make for which one I should start with, feel free to send me a note.) But I’ve been taught by enough people who love him to recognize his monumental place in American literature. Christopher Hitchens wrote about that influence in our November 2007 issue:
At Bellow’s memorial meeting ... the main speakers were Ian McEwan, Jeffrey Eugenides, Martin Amis, William Kennedy, and James Wood. ... Had it not been for an especially vapid speech by some forgettable rabbi, the platform would have been exclusively composed of non-Jews, many of them non-American. How had Bellow managed to exert such an effect on writers almost half his age, from another tradition and another continent? Putting this question to the speakers later on, I received two particularly memorable responses. Ian McEwan related his impression that Bellow, alone among American writers of his generation, had seemed to assimilate the whole European classical inheritance. And Martin Amis vividly remembered something Bellow had once said to him, which is that if you are born in the ghetto, the very conditions compel you to look skyward, and thus to hunger for the universal. ...
Bellow in his time was to translate Isaac Bashevis Singer into English (and “The Love Song of J. Alfred Prufrock” into Yiddish), but it mattered to him that the ghetto be transcended and that he, too, could sing America.
On that note, my colleague Emma has written a piece today about the complex history of Jewish identity in America, at a time when the self-described alt-right movement has given anti-Semitism an ugly new presence in public discourse. Read it here (and let us know if you have a related personal experience to share).
But speaking, as Hitchens does, of singing and transcendence and the translation of art into other forms and languages, I remember being thrilled to discover during a high-school AP English class that one of the Counting Crows songs I’d been listening to on repeat was titled after one of Bellow’s novels: Henderson the Rain King. The song’s narrator is scared, trapped, frustrated, and overlooked, and seems to invoke Henderson as a figure who represents many of those feelings:
Hey, I only want the same as anyone
Henderson is waiting for the sun
Oh, it seems night endlessly begins and ends ...
There’s a vision of freedom, though, in the wistful opening lines: “When I think of heaven ... I think of flying.”
Back to Hitchens on Bellow, Henderson, and flight:
Several of his heroes and protagonists—including the thick-necked Henderson, his only non-Jewish central character—rise above the sickly and the merely bookish. They tackle lions and, in the case of Augie March, a truly fearsome eagle. They mix it up with revolutionaries and bandits and hard-core criminals. Commenting on Socrates’ famous dictum about the worthlessness of the unexamined life, the late Kurt Vonnegut once inquired: “What if the examined life turns out to be a clunker as well?” Bellow would have seen, and indeed did see, the force of this question. Like Lambert Strether’s in The Ambassadors, his provisional answer seems to have been: “Live all you can; it’s a mistake not to.” And the tough-guy Henderson, so gross and physical and intrepid (and so inarticulate when he speaks, yet so full of reflective capacity when he thinks), cannot repress his wonder when flying: He keeps pointing out that his is the first generation to have seen the clouds from above as well as below:
“What a privilege! First people dreamed upward. Now they dream both upward and downward. This is bound to change something, somewhere.”
This song, They Might Be Giants’ “James K. Polk,” has been stuck in my head ever since I stumbled on a review of Polk’s diary in our August 1895 issue, in which James Schouler looked back on the legacy of the 11th U.S. president:
Whatever may be thought of Mr. Polk’s official course in despoiling Mexico for the aggrandizement of his own country, one cannot read this Diary carefully without an increased respect for his simple and sturdy traits of character, his inflexible honesty in financial concerns, and the pertinacious zeal and strong sagacity which characterized his whole presidential career. ... Both [George] Bancroft and [James] Buchanan, of his official advisers, have left on record, since his death, incidental tributes to his greatness as an administrator and unifier of executive action; both admitting in effect his superior force of will and comprehension of the best practical methods for attaining his far-reaching ends.
Indeed, Polk—who was born on this day in 1795—“met his every goal,” as TMBG puts it. Schouler also noted that John Quincy Adams had left a similar diary:
No two Presidents could have been more at the antipodes than were Polk and John Quincy Adams in political affiliations and designs. Yet each, after his peculiar fashion, was honest, inflexible in purpose, and pursuant of the country’s good; and both have revealed views singularly alike—the one as a scholar, the other as a sage and sensible observer—of the selfish, ignoble, and antagonistic influences which surge about the citadel of national patronage, and beset each supreme occupant of the White House.
Striking words for partisan times. Read a PDF of Schouler’s complete review here, and read more from my colleague Adam on those antagonistic influences here.
I love this song by the Simon Sisters, fashioned from a famous poem; Carly Simon’s older sister, Lucy, composed the melody. For a summer on the East Coast, dirt poor, they performed it in small clubs. The sisters caught on, and not long after, Carly went solo, finally daring to perform her own songs.
I like the idea of the Simon Sisters launching a nascent career on the sails of a children’s lullaby. After all, the lyrics, from Eugene Field’s poem “Wynken, Blynken, and Nod,” tell the story of a dream about three fishermen who “sailed off in a wooden shoe” to fish among the stars. (“Now cast your nets wherever you wish— / Never afeard are we!” the stars tell them.) The poem promises “beautiful things” and “wonderful sights that be”; it’s no wonder the sisters’ audiences were charmed by the wistful tune. Likewise, when The Atlantic reviewed Field’s work in our August 1896 issue, the editors were especially enchanted by his poems for children:
Here the most guarded critic can forget his qualms, and yield himself whole-heartedly to a new and naïve fascination. … One has to go to Schumann’s Kinderscenen for a parallel rendering of the silver-gray phantasmagoria, half dream, half waking gleams and splinterings of fancy, that Field has given us in The Fly-Away Horse, and Wynken, Blynken, and Nod. …
Strangely enough, too, in the handling of these sympathetic subjects, many of the technical limitations of the poet’s gift which we have noticed are refined quite away. Elsewhere his sense of style is dull or non-existent; here the diction springs new as a flower out of rich deposits of nursery tradition, and the tune, starting with the swing of a cradle or the to-and-fro of a grand dame’s rockerless chair, leaps and lingers and bickers and swirls like the spirit of water. …
It is no small thing to voice the joys and woes of one whole stage of the earthly journey, however short, especially when that stage is full of the most enormous little psychic adventures. This Field has done. He has written the Canterbury Pilgrimage of infancy.
And off the pilgrims sail.