In the May 1910 issue of The Atlantic, Charles M. Harvey described a thrilling scene from 173 years ago today:
A rifle-reveille from the sentinels at four o’clock in the morning on June 1, 1843, awoke the camps on the Kaw [River near Westport, Missouri], and the bustle of preparation for the march [to Oregon] began. Fires were lit, breakfast cooked and eaten, the cattle and horses at the outskirts collected, and the oxen yoked.
At seven o’clock the bugle sounded the advance, the various divisions filed into the positions which had been assigned to them, and the column, stretching itself to several miles in length, broke away from Westport and the Missouri, and headed for the sunset. ...
Men, women, and children were there, to the number of nearly a thousand, with two hundred wagons drawn by oxen. With them were several thousand horses and cattle, and also household furniture, ploughs, and seeds. It was the kind of army that never retreats. It was a nation in transit.
If you grow up in Oregon, as I did, you can’t escape the romance of the Oregon Trail. It’s an origin story as powerful as that of Paul Revere or Alexander Hamilton, the kind of deeply American dream of work and reward that promises if you’ve made it this far, you deserve to be here.
Museums and landmarks across the state feature recreated wagons covered in canvas, and small towns have tourist strips decked out in frontier fonts. At my elementary school in Portland, we went to Oregon Trail camp and made “pioneer crafts,” coloring quilt patterns and shaking cream into butter in baby-food jars. From first to fourth grade I checked out every pioneer story the school library had to offer, wrote book reports with the plot summary “endured many hardships on the prairie,” and learned early how to spell “perseverance.” At recess, where other kids might play house, we played covered wagons.
It’s not just Oregonians, either; a generation of American kids grew up on the Oregon Trail computer game, shooting bison and dying of dysentery all across the prairie. Today, the pixelated icon has become an app, and adults can participate in a live-action version, as Emily Grosvenor reported for us in 2014:
Emily Grosvenor
Teams of 2-4 people, many in pioneer garb, build a wagon out of paper and dowel rods before tackling 10 challenges inspired by the computer game—things like floating the wagon across a kiddie pool, shooting at game with nerf guns, competing in a three-legged dysentery race to an outhouse. Instead of finding shelter, we built a tarp tent while volunteers sprayed us with water. We survived being pummeled with pool noodles by roller derby girls at the Platte River station. …
On the trail, as in the game, if you killed a bison, you could only carry 200 pounds of meat with you. In the live-action game, participants face the task of pushing 200 pounds of meat up a hill—in this case, a 200-pound man in a wagon regaling the crowd with meat facts. …
The nostalgia is intense, the group bonding quotient is high, the survival rate is through the roof. And luckily, the chances of dying of dysentery? Next to none.
You can sign up here. But a look back at Harvey’s 1910 account reveals a much less innocent side of the great migration west. His language is military—the wagon train is an “army” that moves in columns and wakes to reveille—and that’s no accident, since the settlers’ presence is partly an act of war:
Thus, diversified by occasional rushes of vast herds of buffaloes across the trail, or by menaces of attack from Indians hovering near in large bands, the days, weeks, and months passed. … The Platte, Fort Laramie, Independence Rock, South Pass, Fort Bridger, Fort Hall, Fort Boise, and other halting-places were greeted and left behind. From their lookouts on ridges and in mountain-gorges the Sioux, Crows, and Blackfeet, seeing white women and children for the first time, read their own doom in this vast migration of a great people.
At the end of 1842 there were only five hundred American settlers west of the Rocky Mountains and north of California. When, in September, 1843, the column … filed across the Cascade Mountains, and down into the valley of the Willamette, a thousand were added to this population-roll, and the first corps of the American army of occupation arrived in Oregon.
Oregon’s pioneers weren’t only propelled by the simple motives I learned as a child—hope, bravery, wanderlust, the desire for something better. That’s part of the story, but as Harvey explains, “They were also the answer to appeals for colonists who would rescue the Northwest from England,” staking a claim for American empire by settling in contested territory. More grievously, their mission was to colonize what Harvey calls “an Indian-infested region,” and the “vast migration” that began with the wagon train in 1843 did indeed spell doom for Native American tribes. As Weston put it recently,
This was Manifest Destiny, and there’d never be enough room for Native Americans and white settlers. In treaty after reneged treaty, the land granted to the tribes of the Great Plains shrunk. The U.S. wanted them docile, to take up farming on the reservations and stay put. … The Army planned to slaughter all buffalo and starve the tribes into submission.
Which leaves the Oregon Trail story—where? It’s a marvel, and a tragedy, the story of a nation’s progress and of other nations’ loss. It’s a story of people who set out to build new lives for themselves, of people whose lives were destroyed in the process, and of something more human than Manifest Destiny, a story of change and of harm and of hope.
This Memorial Day marked 585 years since Joan of Arc was burned at the stake on charges of heresy. The Catholic saint, who led the French army to victory at the siege of Orleans before she was captured and tried by an enemy court, was only 19 when she died, and only 13 when she saw the visions of saints that she said called her to fight for France. In the January 1884 issue of The Atlantic, poet Helen Gray Cone captured that moment of revelation, which ended the carefree days of Joan’s childhood:
Such days are gone, and strange things come instead;
For she has looked on other faces white [...]
Has stooped, ah Heaven! in some low sheltering shed
To tend dark wounds, the leaping arrow’s bite,
While the cold death that hovered seemed her own.
And in her hurt heart, o’er some grizzled head,
The mother that shall never be has yearned;
And love’s fine voice, she else shall never hear,
Came to her as the call of saints long dead;
And straightway all the passion in her burned,
One altar-flame, that hourly waxes clear.
Hence goes she ever in a glimmering dream.
And very oft will sudden stand at gaze,
With blue, dim eyes that still not seem to see:
For now the well-known ways with visions teem.
That poem, “Lepage’s Joan of Arc,” is based on an 1879 painting, Joan of Arc by Jules Bastien-Lepage, that happens to hold a special place in my family mythology.
The painting is a favorite of both my parents, two artists who met at New York City’s Cooper Union in the 1980s. When I emailed my dad to tell him I’d found the painting in the Atlantic archives, he reminded me that the two of them used to visit it often at the Metropolitan Museum of Art. “Your mom and I would always make a special trip to see it and pay homage,” he wrote. “I like to think that we each discovered the painting independently and brought our mutual love for it into the relationship, further evidence of some kind of fated convergence for the two of us.”
I asked my parents to tell me how they came to love Joan of Arc. First, my dad’s reply:
Among the formal aspects that I most admire about the painting are its tapestry-like surface, a confluence of backyard gardens, weedy turf and overhanging tree branches that reveal brief portals in some places, into a more distant space. The wonderful part of it is that all this riotous activity brings all the visual drama of the painting right up to the surface, as though it’s all happening on one plane.
Your mom also noted the painting’s scale—over nine feet in each dimension, as a totally encompassing experience—especially when you get up close. It completely fills your field of vision. Within that scale, the actual Joan of Arc figure is virtually life-size, kind of making it a mirror for your own body.
I would say this work has been a big influence on both of us. Lepage was a late-comer to the impressionist, naturalistic style and he brought with him all the ambitions of the great themes one would paint if they were an academician. He never quite fit into either camp. He wanted to make work about the things he knew best and although he appreciated academic scope of ambition, he didn’t like its stodgy rules.
You could say that your mom and I are both trying to make extraordinary things through using a vocabulary rooted in the day-to-day vernacular. That is where Lepage is influential. That and in the surprise of the Joan of Arc painting, where you encounter so many extra things that you didn’t expect to find there.
And here’s my mom’s response:
One of the cool things about the experience of Joan of Arc is that it was located somewhere in the European painting section upstairs, but I was never exactly sure where it was. That area was very big (it was differently arranged back then), and the rooms were always so similar looking and you could just go around and wander and get lost in the images. So when I went, it was like taking a walk in the woods, just an enjoyable stroll, and then it would appear, as if I discovered it anew all over again.
I love that Joan’s revelation is depicted as taking place in a garden. For me there is always a sense of communion in my garden, a feeling of so much life energy and spirit, like I am not alone out there, and like so much is possible.
I think that sense of possibility is something that I see captured in Helen Gray Cone’s poem. She talks of this moment that Joan is in, where once life was soft, gray skies with fluffy shapes of flowers etc. and she is a girl, and then those days are gone. Her experiences with others’ fears and pain move her to imagine change, to dream of something better. It makes sense that it was a woman who wrote the poem, that she could put herself in Joan of Arc’s position and emotions wrapped up in this decision—the hurt heart, the yearning of the mother that shall never be, love’s fine voice she’ll never hear—these are women’s choices, ones we make in our everyday battles.
I’d been thinking of Joan’s vision, in which “the cold death that hovered seemed her own,” as a kind of death—not only the loss of her childhood, but the loss of choice and chance, the revelation of a destiny that could no longer be avoided. And so it meant a lot to me to see my parents bring up possibility and discovery in their reading of the painting. I’m still learning, after all, what it means to be an adult, to have a purpose and a future, to see “the well-known ways” with different eyes. Maybe Joan’s vision is less a loss than a coming of age—not death, but the discovery of life to come.
See ya later, ’60s. We’re on the next stop of our tour of nonfiction pieces by female authors in our archives: the swinging ’70s. This decade saw the Watergate scandal, the end of the Vietnam War, and Star Wars. And the ladies of The Atlantic were there, reporting on politics, culture, and more.
Here’s a list of ten nonfiction works, one per year (some of them newly digitized):
Alice S. Rossi’s “Job Discrimination and What Women Can Do About It” A sociologist addresses workplace discrimination. Rossi’s piece is part of the “Women’s Place” issue illustrated above, and we’re planning to do a follow-up note that goes into great detail on that ten-part series. (March 1970)
Sara Davidson’s “Mick Jagger Shoots Birds” A profile of the singer during the Rolling Stones’ 1970 tour of Europe. (May 1971)
Claire Sterling’s “The Making of the Sub-Saharan Wasteland” This newly digitized report on a drought in the Sahel region was a finalist for the 1975 National Magazine Award for Reporting Excellence. (May 1974)
Helen Vendler’s “The Difficult Grandeur of Robert Lowell” A profile of the poet and a close look at his work. “Why should such grim books give such pleasure?” Vendler asks. (Jan 1975)
Susan Braudy’s “Francis Ford Coppola: A Profile” A profile of the director while he was experiencing “financial and production problems” with the film ‘Apocalypse Now.’ (Aug 1976)
Doris Kearns (Goodwin) leading the May ’76 issue
Elizabeth Vorenberg’s “The Biggest Pimp of All” Comparing prostitution in different cities in the U.S. and abroad. The byline was shared with her husband. (Jan 1977)
Gail Godwin’s “A Writing Woman” She asks, “At what point does regurgitated autobiography graduate into memory shaped by art?” A version of this piece appeared in “The Writer on Her Work,” a collection of personal essays on writing while female. (Oct 1979)
Again, a shout-out to Sage for her assistance on this project. Next up, Nshira is bringing you the rollicking ‘80s ...
What happens when Iceland, an island nation with 330,000 residents, starts welcoming 1.2 million tourists a year? Feargus O’Sullivan, of our sister site CityLab, explains:
This is raw-boned, hardscrabble country, both thinly populated and thinly served by public amenities. That’s much of its attraction, of course—the idea of having ancient lava fields, raging waterfalls, and mossy ravines more or less to yourself.
You’re far less likely to be alone nowadays, though, and many of the easier-to-access areas are groaning under the pressure of not being as unfrequented as they once were. Land at some beautiful spots is being trampled by too many feet, while basic facilities such as parking and toilets are limited. This has led to unfortunate incidents that include desperate tourists turning the graves of Iceland’s greatest poets into an impromptu bathroom. Less gross but also less forgivable are tourists who drive off-road, damaging fragile landscapes and thus partly ruining the wildernesses that they have traveled so far to witness.
An Atlantic reader feels the irony:
When I first went to Iceland in the ’60s it was not unusual to find attractions like Gullfoss to be virtually free of visitors. In contrast, on my most recent visit, lines of people shuffled past key spots with just enough time to get their selfie. Now I am sorry that I kept telling everyone just how great Iceland is.
If you have a related anecdote from your own visit to Iceland you’d like to share, drop us an email. We published one in our February 1893 issue.
William Edward Mead, writing about Iceland’s literary culture, was shocked at the barrenness of the landscape, finding it distinctly “unfavorable … to literary fertility” and other scholarly pursuits. “The country is little better than a desert,” he wrote. “People with so little to make life attractive might be pardoned if they were to sink into a stolid indifference to everything but the struggle to keep alive.” Yet the beauty of such harsh, isolated country is also evident in his description:
The only inhabitable portion is a narrow strip of pasture land extending like a green girdle round the coast and up the deep, narrow fiords. The interior of the country is a howling waste of sand and ice, traversed by darting glacier rivers, and utterly incapable of supporting more than a few scattered inhabitants. […]
The farmhouse where I spent more than a fortnight [is] distant a day’s ride on horseback from Reykjavik. Behind the house rises a naked, precipitous ridge of basalt, a quarter of a mile high, sweeping in a magnificent unbroken curve from the bold headland that juts into the sea to the upper waters of the Laxá. Before the house stretches the long, narrow fiord, swarming with sea-birds that circle endlessly about the double cascade foaming down from the river into the sea.
It’s a place that takes its romance from its solitude—and Anglo-American poet W. H. Auden, who visited Iceland in 1936, captured that lonely beauty in his poem “Journey to Iceland.” Iceland, to Auden, with its “sterile immature mountains” and “abnormal day,” is a place for travelers who want to reject the world—a kind of alternate reality, whose purity rubs off on people.
For Europe is absent: this is an island and therefore
Unreal. And the steadfast affections of its dead may be bought
By those whose dreams accuse them of being
Spitefully alive, and the pale
From too much passion of kissing feel pure in its deserts.
Auden later revised that stanza to read “this is an island and therefore / A refuge”—a small change, but a telling one. After all, the only truly solitary journeys are imaginary ones; the mind is the most isolated country of all; and the best place to get away from people, tourists or otherwise, may be the refuge of your own thoughts. As Auden closed his poem:
Again the writer
Runs howling to his art.
You can read the full text of “Journey to Iceland” (or listen to Auden reading it) here, and you can see some of its sights for yourself:
Our video team recently posted a short documentary featuring the story of Marisol Conde-Hernandez, an undocumented immigrant currently studying at Rutgers Law School:
“I think that I’m the first undocumented person to attend law school in the state of New Jersey,” Conde-Hernandez says in the film. “It’s still in the back of my mind because I’m undocumented. What if I can’t practice as an attorney?”
In the comments section for the video, Atlantic readers discussed immigration policy, which has become the signature issue for the presumptive GOP nominee for U.S. president. One reader wants to know more about a landmark piece of legislation passed under President Ronald Reagan:
Has there been any deep longitudinal or follow-up study of Reagan’s 1986 amnesty recipients? There were about three million of them, if I recall correctly. I’d be interested in how they fared economically and, more so, how their kids fared.
First, a bit of background: The 1986 Immigration Reform and Control Act (IRCA) passed Congress 30 years ago this November. (Here’s the New York Times report of Reagan signing the bill.) Eric Schlosser, in his award-winning 1995 investigative piece for The Atlantic, “In the Strawberry Fields,” described how the IRCA was so long in the making:
In 1951 the President’s Commission on Migratory Labor condemned the abysmal living conditions of illegal immigrants employed as migrant farm workers in the United States. At the time, workers were found living in orchards and irrigation ditches. They lived in constant fear of apprehension, like fugitives, and were routinely exploited by their employers, who could maintain unsafe working conditions, cut wages, or abruptly dismiss them with little fear of reprisal. In many cases the life of these migrants was, according to the commission, “virtually peonage.”
The commission estimated that 40 percent of the migrants in the United States—at least 400,000 people—were illegal immigrants. Their presence in such large numbers depressed wages for all farm workers; that was “unquestionable.” [...]
The commission argued that the only way to stop the flow of illegals was to impose harsh punishments on those who employed and exploited them. It suggested fines, imprisonment, and a strict prohibition of interstate commerce in any goods produced or harvested by illegal immigrants. [...] Congress ignored the commission’s recommendations, and for the next two decades it was a crime to be an illegal immigrant in the United States but not a crime to employ one.
Then came the IRCA, “accepted as a once-only great compromise,” wrote the Pulitzer Prize-winning scholar Jack Miles for the June 1994 issue of The Atlantic:
The mass legalization of then-illegal immigrants was traded for the promise that a new program of employer sanctions would destroy the incentive for further mass immigration. That hope proved vain; but if it had never been entertained, IRCA would never have passed.
That hope proved vain because, as Schlosser put it, “these sanctions have rarely been applied”:
There are approximately 873,400 private employers in California—and only about 200 federal inspectors to investigate workplace violations of the immigration code. Moreover, the federal penalties for employing an illegal immigrant are mild. A first offense may result in a fine of $250, a third offense in a fine of $3,000.
Instead of stemming illegal immigration, IRCA has actually encouraged it. In response to growers’ fears that the new sanctions on employers would create a shortage of farm workers, Congress included in the bill a special amnesty for illegal immigrants who could prove that they had done farm work in the United States during the previous year. It did not demand much proof. [The program] was expected to grant legal status to 350,000 illegal immigrants. Instead more than 1.3 million illegal immigrants—a number roughly equivalent at the time to a sixth of the adult male population of rural Mexico—applied for this amnesty, most of them using phony documents in what has been called one of the greatest immigration frauds in American history.
Our reader is roughly correct about the eventual numbers: Around 2.7 million people received legal status under the IRCA, according to The Washington Post’s Emily Badger. How exactly did those granted amnesty fare? It’s not very clear. Here’s Badger:
So what do we know about what happened to that earlier wave of immigrants? Only a little bit — and hardly enough to measure the impact of a massive government policy change.
The Department of Labor sponsored two survey studies following up on several thousand of the IRCA immigrants — in 1989 and then again in 1992, five years after the law went into effect. Those studies suggested that immigrants made significant wage gains in the years after legalization, many of them by obtaining better jobs. Government records also revealed over time how many of them became naturalized citizens. In 1996, the year the entire IRCA cohort was eligible, a quarter of a million were naturalized. By 2001, one-third of the entire group had been.
Regarding our reader’s inquiry about the kids:
[Some] research suggests that children of undocumented immigrants are more likely to be poor and in poor health than children of legal parents. And so we might reasonably expect the children of immigrants to benefit from their legal status, too — even if they’re not born until well after any amnesty is granted.
Read the rest of Badger’s report here. If you’re interested in digging deeper into the IRCA or the immigration issue more generally, please let us know. Priscilla recently examined the present-day debate for our A&Q series, if you’re curious about some of the basics. Update from a reader, G.A., who raises some interesting questions:
I’m curious, as this subject arises, how advocates for illegal immigrants view the Reagan amnesty. Did it work? Did it fail?
Perhaps more important than the issue of fixing the “broken immigration system,” which is the narrative commonly heard, is the question of what happens AFTER such a reform is passed. Sort of like how “winning” the war in Iraq (defeating Saddam’s forces) was far easier than winning the peace, as it were. How do advocates for illegal immigrants convince American citizens that a reform would address their concerns?
I just wonder if 30 years from now, analysts will write about the failed Clinton amnesty. And to really play devil’s advocate here, is the immigration system really broken? Or are people simply ignoring the system and its penalties and later claiming that it is unfair?
Nathaniel Hawthorne, one of the great American writers of the 19th century, died on this day 152 years ago. In the July 1864 issue of our magazine, Atlantic co-founder Oliver Wendell Holmes Sr. described the funeral of his friend Nathaniel:
Hawthorne circa 1860-1864 (Wikimedia)
In a patch of sunlight, flecked by the shade of tall, murmuring pines, at the summit of a gently swelling mound where the wild-flowers had climbed to find the light and the stirring of fresh breezes, the tired poet was laid beneath the green turf.
Poet let us call him, though his chants were not modulated in the rhythm of verse. The element of poetry is air: we know the poet by his atmospheric effects, by the blue of his distances, by the softening of every hard outline he touches, by the silvery mist in which he veils deformity and clothes what is common so that it changes to awe-inspiring mystery, by the cloud of gold and purple which are the drapery of his dreams.
In the months before his death, Hawthorne “evidently had no hope of recovering his health. He spoke as if his work were done, and he should write no more.” The fact that death was on his mind is evident in Hawthorne’s last, unfinished novel, The Dolliver Romance, an excerpt of which was published in The Atlantic two months after his death. It’s the story of Dr. Dolliver, “a worthy personage of extreme antiquity,” who is troubled by the persistent symptoms of old age—arthritis and fatigue, coughs and chills—and whose memory is haunted by “a throng of ghosts.” And yet:
This weight of years had a perennial novelty for the poor sufferer. He never grew accustomed to it, but, long as he had now borne the fretful torpor of his waning life, and patient as he seemed, he still retained an inward consciousness that these stiffened shoulders, these quailing knees, this cloudiness of sight and brain, this confused forgetfulness of men and affairs, were troublesome accidents that did not really belong to him. He possibly cherished a half-recognized idea that they might pass away.
1841 portrait of Hawthorne by Charles Osgood
Youth, however eclipsed for a season, is undoubtedly the proper, permanent, and genuine condition of man; and if we look closely into this dreary delusion of growing old, we shall find that it never absolutely succeeds in laying hold of our innermost convictions. A sombre garment, woven of life’s unrealities, has muffled us from our true self, but within it smiles the young man whom we knew; the ashes of many perishable things have fallen upon our youthful fire, but beneath them lurk the seeds of inextinguishable flame.
Back to Holmes:
There the bed is made in which he whose dreams had peopled our common life with shapes and thoughts of beauty and wonder is to take his rest. This is the end of the first chapter we have been reading, and of that other first chapter in the life of an Immortal, whose folded pages will be opened, we trust, in the light of a brighter day.
If you’d like to page through more of Hawthorne’s work in The Atlantic, check out his sketches from the Boston custom house and, more famously, his controversial Civil War reporting in our July 1862 issue. Sean Weiner provides some great context for that reporting and how Hawthorne was written about in The Atlantic in the decades following his death. Julian Hawthorne reviewed his father’s greatest work, The Scarlet Letter, for our April 1886 issue. James Russell Lowell, our founding editor, reviewed another Hawthorne novel, The Marble Faun, for our April 1860 issue. Other examples of his inextinguishable influence on The Atlantic include Paul Elmer More’s “Solitude of Nathaniel Hawthorne” (November 1901) and Alfred Kazin’s “Hawthorne: The Artist of New England” (December 1966).
“The party of Lincoln” wasn’t always so. In 1856, at the first-ever Republican National Convention, party leaders passed up Abraham Lincoln for vice president. But second time’s a charm, right? On May 18, 1860, the 51-year-old former congressman from Illinois, having raised his national stature during the Lincoln-Douglas debates two years earlier, secured his party’s nomination for president. The Chicago Tribune’s Kenan Heise described the scene at the convention for the book Chicago Days: 150 Defining Moments in the Life of a Great City:
The eloquent, self-assured [William] Seward, a U.S. senator from New York, was widely thought to have the nomination wrapped up; many deals had been cut, one of which put Chicago Mayor “Long John” Wentworth in the Seward camp. … Fortunately for [Lincoln], Chicago, which was hosting its first national political convention, was the heart of Lincoln country.
To make sure a friendly crowd was on hand to out-shout the competition, batches of admission tickets were printed at the last moment and handed out to Lincoln supporters, who were told to show up early at the Wigwam, a rickety hall that held 10,000 people. And, for good measure, Illinois delegation chairman Norman Judd and Joseph Medill of the Chicago Daily Press and Tribune placed the New York delegates off to one side, far from key swing states such as Pennsylvania.
Drawing of the Wigwam, a building specially constructed for the convention (Wikimedia)
No candidate had a majority after two ballots. During the third ballot, with Lincoln tantalizingly close to winning the nomination, Medill sat close to the chairman of the Ohio delegation, which had backed its favorite son, Salmon P. Chase. Swing your votes to Lincoln, Medill whispered, and your boy can have anything he wants. The Ohio chairman shot out of his chair and changed the state’s votes.
After a moment of stunned silence, the flimsy Wigwam began to shake with the stomping of feet and the shouting of the Lincoln backers who packed the hall and blocked the streets outside. A cannon on the roof fired off a round, and boats on the Chicago River tooted in reply. … The Republicans had a candidate.
The Atlantic, founded as an abolitionist magazine just three years earlier, threw its weight behind Lincoln but expressed some initial disappointment over Seward’s loss (the New York senator was a more forceful opponent of slavery than the moderate Lincoln). Here’s our founding editor, James Russell Lowell, on “The Election in November”:
We are of those who at first regretted that another candidate was not nominated at Chicago; but we confess that we have ceased to regret it, for the magnanimity of Mr. Seward since the result of the Convention was known has been a greater ornament to him and a greater honor to his party than his election to the Presidency would have been. … [Seward], more than any other man, combined in himself the moralist’s oppugnancy to Slavery as a fact, the thinker’s resentment of it as a theory, and the statist’s distrust of it as a policy,—thus summing up the three efficient causes that have chiefly aroused and concentrated the antagonism of the Free States.
After sizing up the national schisms over slavery, Lowell turns to Lincoln:
The first portrait of Lincoln as nominee (May 20, 1860)
We are persuaded that the election of Mr. Lincoln will do more than anything else to appease the excitement of the country. He has proved both his ability and his integrity; he has had experience enough in public affairs to make him a statesman, and not enough to make him a politician. That he has not had more will be no objection to him in the eyes of those who have seen the administration of the experienced public functionary whose term of office is just drawing to a close. He represents a party who know that true policy is gradual in its advances, that it is conditional and not absolute, that it must deal with fact and not with sentiments, but who know also that it is wiser to stamp out evil in the spark than to wait till there is no help but in fighting fire with fire.
The 1860 general election brought one of the highest voter turnout rates in presidential history. And, of course, “Honest Abe” walked away the winner and went on to issue the Emancipation Proclamation and then orchestrate the passage of the 13th Amendment, abolishing slavery for good. As Lowell wrote with great prescience, “We believe that this election is a turning-point in our history; for, although there are four candidates, there are really, as everybody knows, but two parties, and a single question that divides them.”
Yesterday marked the 62nd anniversary of the Supreme Court’s ruling in Brown v. Board of Education, which undid Plessy v. Ferguson’s “separate but equal” doctrine established 120 years ago today. (Sage has compiled archival Atlantic readings on the Brown decision.) But the fight to desegregate schools continues. Just last week, a Mississippi judge ordered the state’s Cleveland School District to desegregate. CityLab’s Brentin Mock has details:
One city that just never succeeded at school integration is Cleveland, Mississippi, where the school district was sued by a group of parents way back in 1965 for its failure to comply with Brown. Black families were concentrated (both then and today) in neighborhoods to the east of a railroad track that split Cleveland in half, both physically and racially. Black children were forbidden from attending schools located to the west of the tracks, where white families lived almost exclusively, due to Jim Crow policies.
This sequestering of black students persisted for decades after that lawsuit was filed, despite numerous consent decrees and court orders for the Cleveland school district to desegregate. The district was never able to come up with a plan that could convince white parents to send their kids to schools on the black side of town. Now, the federal government wants Cleveland to squash its schools’ race-based reputations by folding the east-of-the-tracks black middle and high schools into the historically white schools to create single, blended schools for each age group.
“The wheels of justice have been said to turn slowly,” wrote Shannon Lerner in a piece for us last year covering the Cleveland School District on the ground. She continues:
And few things move quickly here in Cleveland, Mississippi, a town of 12,000 people with no movie theater and a quaint commercial district that’s shuttered on Sunday. But when a deadline on a school desegregation suit—originally filed in 1965—came and went last month with opposing sides still unable to agree on a resolution, some locals admitted frustration.
“If you fight for something for 50-some-odd years and it don’t work out? Good gravy, that’s a long time,” said Leroy Byars, 67, who is known around town simply as “Coach.”
Cleveland isn’t alone. Last month, Alana reported from Little Rock, Arkansas, on the persistence of school segregation there. She tells the story of LaVerne Bell-Tolliver, whose parents “volunteered her to integrate Forest Heights Junior High in Little Rock in 1961”:
Bell-Tolliver looks at Little Rock schools now, though, and wonders if her years of hell were all in vain. In the decades since the schools were first integrated, Little Rock has become a more residentially segregated city, with white residents in the northwest part of town and blacks in the southwest and south. Because the vast majority of children attend schools in their neighborhood, the schools have become re-segregated too.
And those separate schools are not at all equal. For example, 58 percent of the students at Roberts Elementary, located in northwest Little Rock, are white, though the district as a whole is just 18 percent white. Roberts was completed in 2010 and has a climbing wall, a state-of-the art computer lab, a chemistry lab, telescopes, high ceilings, natural light, and a cafeteria with a stage and TV screens. Wilson Elementary, 72 percent black, is located in a majority-black neighborhood and, according to a lawsuit filed this year, has failing air conditioning, squirrels that died in the air ducts, and a cafeteria that was closed by the public-health department.
One reader has a cynical reaction:
It’s wrong to deny resources to majority-minority schools. But no matter how liberal, every parent wants their children to attend a high-performing school. And no doubt about it, those schools are white.
On that note, Alia posed the following question to New York Times Magazine investigative reporter Nikole Hannah-Jones during our Education Summit yesterday: “What do you say to parents who really believe in [integration], but they don’t want to sacrifice their kid’s education?” She added, “They don’t want to send their kid to a school with bad test scores. What do you say to them?”
Hannah-Jones’s response:
Well, one, I would say test scores are often a reflection of the socioeconomic status of the kids in the school, and what the data shows is that middle-class parents who go into these schools, their kids do just fine. My daughter is doing just fine. She’s reading above her grade level; she’s thriving. Because anything that the school would lack, I can provide for her. But also she has come into the school with a certain level of knowledge and privilege. So I think that that’s a fear that is often unfounded, but it’s a fear all the same.
I think the other thing is the notion that we’re going to get equality without having to give up any of our privilege is just a false notion. It doesn’t work that way. You can’t say, “I want equality, but at the same time I want my child to have every advantage.” That’s not equality.
You can watch the full exchange below (Alia’s question starts around the 9:25 mark):
For more from Hannah-Jones, check out “The Problem We All Live With,” an episode of This American Life looking at desegregation and the achievement gap.
Ninety-eight years ago today, Congress passed the Sedition Act of 1918, which made it an imprisonable offense to criticize the federal government or U.S. military involvement in World War I. The legislation, which expanded the Espionage Act of 1917, came at the height of wartime fear and anger:
W. A. Rogers / Library of Congress
Violence on the part of local groups of citizens, sometimes mobs or vigilantes, persuaded some lawmakers that the [original] law was inadequate. In their view the country was witnessing instances of public disorder that represented the public’s own attempt to punish unpopular speech in light of the government’s inability to do so. Amendments to enhance the government’s authority under the Espionage Act would prevent mobs from doing what the government could not.
It was in this political climate that James Harvey Robinson, in the December 1917 issue of The Atlantic, addressed “The Threatened Eclipse of Free Speech”—a foreshadowing of the Sedition Act. Robinson argued that in times of national hardship, dissent is not only natural but necessary:
When we see khaki uniforms all about us … when coal runs low in the cellar and sugar in the kitchen; when we … are consciously grateful for a boiled potato; when we note the lowering of the exemption limit of the income tax, and are suspected of being a scoundrel if we do not invest in government bonds, the mind is quickened as never before. We would seem to have a right to suspect that many things must have been fundamentally wrong in the old and revered notions of the State, of national honor, even of patriotism, since they seem at least partially responsible for bringing the world to the pass in which it now finds itself.
Robinson (who took care to assure his readers that he, too, supported the war effort) sought to calm both sides of the free-speech debate: those worried about the dangers posed by dissenters, and those worried about the dangers of suppressing speech.
But some parts of his argument are more unsettling. In this passage, he considers why free expression can be so incendiary and concludes it’s because the beliefs we express—and those we react to—are not rational:
Strangely enough most of us most of the time are really quite indifferent to truth, and are using language in the old, primitive way as a signal of agreement or disagreement. We become partisans before we realize it. We get pledged to beliefs we know not how, and they become dear to us by reason of their familiarity and associations. When they are questioned, we are outraged, and rush to their defense in the name of truth. Our hypocrisy is too deep and impulsive for us to detect.
It’s a frightening idea—but fortunately, this problem of free speech contains its own solution. In 1919, two years after Robinson’s article, the U.S. Supreme Court upheld the convictions of four men who had been prosecuted under the Sedition Act for publishing pamphlets critical of the war effort. Justice Oliver Wendell Holmes broke from the 7-2 majority and issued “the most powerful dissent in American history,” in the words of The Atlantic’s Andrew Cohen. Here’s Holmes:
When men have realized that time has upset many fighting faiths, they may come to believe even more than they believe the very foundations of their own conduct that the ultimate good desired is better reached by free trade in ideas—that the best test of truth is the power of the thought to get itself accepted in the competition of the market, and that truth is the only ground upon which their wishes safely can be carried out.
That, at any rate, is the theory of our Constitution. It is an experiment, as all life is an experiment. Every year, if not every day, we have to wager our salvation upon some prophecy based upon imperfect knowledge. While that experiment is part of our system, I think that we should be eternally vigilant against attempts to check the expression of opinions that we loathe and believe to be fraught with death.
All life is an experiment. Tentative though Holmes’s argument sounds, it’s a powerful case for American democracy—and a powerful case for the value of dissent itself. In acknowledging the fragility and shortcomings of our beliefs, ourselves, and our nation, we can prove and preserve their strength.
Our August 1968 cover features a nonfiction excerpt from Joan Baez’s memoir, Daybreak.
We’ve made it our project over the next several weeks to uncover nonfiction Atlantic pieces written by women. It’s been a difficult process, since many of the early pieces are not online. But after (physically) digging through our print archives, we’re able to present the following crop of lady-journos from the ’60s—and it’s quite an impressive group.
During that decade, women in The Atlantic tackled everything from Castro’s Cuba to illegal abortion to Marlon Brando. Among the authors listed here are a Pulitzer Prize-winning historian, an anonymous part-time secretary, a Harvard professor, a First Lady, and a famous film critic. (Like our first list, the authors here are fairly monochromatic—the majority are white and American.)
Eliza Paschall’s “A Southern Point of View” The writer and activist criticizes the Georgia legislature’s willingness to close down schools rather than integrate them. (May 1960)
Eleanor Roosevelt’s “What Has Happened to The American Dream?” The former First Lady demands a re-dedication to the Dream in the face of Soviet influence, “the greatest challenge our way of life has ever had to meet.” (April 1961)
Martha Gellhorn’s “Eichmann and the Private Conscience” The famed war correspondent reports on the trial of Nazi official Adolf Eichmann in Jerusalem and sketches out “some of the lessons to be learned.” (February 1962)
Jessica Mitford’s “The Undertaker’s Racket” An investigation of the funeral industry in the United States. (June 1963)
Mrs. X’s “One Woman’s Abortion” An anonymous suburban mother of three talks about her search for an illegal abortion. (August 1965)
Barbara W. Tuchman’s “The Case of Woodrow Wilson” A historian and best-selling author “agrees with Sigmund Freud that President Wilson was a tragic figure whose neuroses got in his way.” (February 1967)
Elizabeth Drew’s “Report: Washington” One of her many dispatches during her run as Washington correspondent for the magazine. (April 1968)
Emma Rothschild’s “Reports and Comment: Cuba” A look at Fidel Castro “committing Cuba to an agricultural future.” (March 1969)
A huge shoutout to contributing editor and Atlantic archives legend Sage Stossel for helping us with this list.
But what about 1964? One work we were unable to digitize was “Four and a Half Days in Atlanta’s Jails” by Gloria Wade Bishop (now Gloria Wade Gayles), a prolific black essayist and literary critic. In that July 1964 piece, she gives a gripping account of her time behind bars after her arrest during a peaceful protest.
Harper Lee, who died earlier this year, would have been 90 years old today. She’s best remembered as the author of a single novel, To Kill a Mockingbird, whose standing in American culture is so great that it has become, over the years, much more than a book. As Megan put it:
The elements of To Kill a Mockingbird—“our national novel,” Oprah Winfrey called it—have been varnished by time. And polished, by the equal forces of memory and forgetfulness, into symbols of some of the things the current culture holds most dear, or tries to: justice, wisdom, decency, bravery, empathy. You never really understand a person until you climb into his skin and walk around in it. The names Scout and Atticus—and, perhaps above all, the name Harper—reflect a respect not just for the arc of history, but for the hope that it does indeed bend toward justice.
In the August 1960 issue of The Atlantic, Phoebe Adams described the novel as “respectable hammock-reading”—i.e., the kind of thing you can read on the subway without embarrassment, rave about over cocktails with impunity, but maybe not mention in a job interview or list on OkCupid. She continues:
Harper Lee’s To Kill A Mockingbird is sugar-water served with humor. … It is frankly and completely impossible, being told in the first person by a six-year-old girl with the prose style of a well-educated adult. … A variety of adults, mostly eccentric in Scout’s judgment, and a continual bubble of incident make To Kill A Mockingbird pleasant, undemanding reading.
I love To Kill a Mockingbird. But that description strikes a chord with me.
I read it when I was 12 and promptly designated it my favorite book. (I also developed a serious crush on Jem.) It remained my favorite throughout high school, through rereadings and group projects, and into the first-year college lit class where two other people, during icebreakers, announced it was also their favorite. It was still my favorite when I went to class with Ivy League English majors who loved Anne Carson and David Foster Wallace, and when I interned at a literary magazine beside people newly obsessed with Karl Ove Knausgaard and Tao Lin.
It was my favorite book, that is, long after I became embarrassed to admit it was—long after I began to wonder if I should love something more challenging, more obscure, and less widely beloved.
It was still my official favorite when I confessed to a friend that my slightly cheesy life goal was to write someone else’s favorite book. I wanted to inspire that kind of mind-shaping love in someone. I wanted my words to get stuck in people’s heads. That’s sweet, he said, but it was proof that I didn’t really get art. Didn’t I think I was being a little too populist? Shouldn’t I be aiming a little higher?
I may or may not owe my friend an apology for shouting at him so inarticulately. But we can’t all think on our feet.
Today I’d argue that To Kill a Mockingbird shows the value of popular literature; that some of the power of that pleasant, undemanding, multifaceted book of humor and tragedy lies in its very undemanding nature; that its enormous footprint in American culture comes in part from how easy it is for people to love it, and identify with it, and strive to honor its good parts. Here’s how Harper Lee’s literary agent described her strengths to another author:
She has the same ability you have to create living characters, from kids to old folks, so real that people from totally different environments immediately believe in them … the same gentle underlying humor which adds charm to the telling. I remember telling her, when trying to persuade her to go on and keep working at the Mockingbird, how well you had succeeded with similar material. I remember showing her that section of SO LONG AT THE FAIR where the boy is listening to the sounds of the gin whistles, near and distant, and how much she loved it.
That’s the power of sugar water served with humor: to charm, to win people over, and in that gentle winning-over to inspire a deeper belief.
And after all, sweetness can conceal a subtle strength. One of the most striking scenes in To Kill a Mockingbird is the missionary society tea party where Scout learns that Tom Robinson, the black man whom Atticus defended in court, has been shot. Atticus pulls aside Scout, her aunt, and their housekeeper, Calpurnia, to tell them the news.
The scene is a masterpiece of dissonance: characters professing altruism and racism in the same breath; cups clinking and small talk gently humming over the news of a man’s brutal killing. Scout, who absorbs the news only slowly, “found myself shaking and couldn’t stop.” She’s a child, frightened not only by the tragedy but also by seeing the adults in her life at such a loss. Then, her neighbor, Miss Maudie, quiets her. The women go back to the party:
And so they went, down the row of laughing women, around the dining room, refilling coffee cups, dishing out goodies as though their only regret was the temporary domestic disaster of losing Calpurnia. … Aunt Alexandra looked across the room at me and smiled. She looked at a tray of cookies on the table and nodded to them. I carefully picked up the tray and watched myself walk to Mrs. Merriweather. With my very best company manners, I asked her if she would have some. After all, if Aunty could be a lady at a time like this, so could I.
It’s not callousness that drives them to return, but something more human and complicated: the fact that life and all its banalities go on in spite of tragedy, that deep feelings and grave events are bound to have humor and hypocrisy, sweetness and absurdity mixed in. That’s the knowledge that brings Scout one step closer to adulthood—and it’s the quality that makes Lee’s writing seem real.
I’ll admit it: To Kill a Mockingbird is still my favorite book.
A wolf in a wild wood near Ukraine’s Chernobyl in April of 2012 (Sergiy Gaschak / AP)
Today marks 30 years since the Chernobyl nuclear disaster. In our January 1987 issue, Mary Jo Salter, an American living in Rome at the time of the accident, described the fear and uncertainty of living with the fallout, as public information and government assessments of the danger kept changing. “Although we were living in an increasingly nuclear-powered world,” Salter wrote, “we had also been living in ignorance of the nature of radiation.” She continued:
The newspapers provided some of the information that, I suddenly felt, I should have known already: that iodine comes in a radioactive form, iodine 131, which is often the principal component of nuclear-reactor leaks and which has the relatively brief half-life of eight days. I learned that iodine 131 causes thyroid cancer, that it is readily absorbed by green plants, and, therefore, that it is found in the milk of grass-eating animals. I learned that cesium 137, with a half-life of thirty years, settles especially in muscle tissue and organs, and that strontium 90, with a half-life of twenty-eight years, settles in bones and so can cause bone-marrow cancer.
Almost everyone I knew in Rome had learned at least some of these facts within a few days—a few days not after we learned of the Chernobyl disaster but after we learned that la nube [the radioactive cloud from the explosion] had passed over us.
Salter had a two-year-old daughter at the time, and she and her husband faced agonizing worries over how to keep her safe. They closed the windows for fear of the air. Milk had become dangerous; eggs and vegetables, too, absorbed the poison. Life-sustaining food and water were suddenly vectors of death.
With soil in many parts of Europe still contaminated from the Chernobyl blast, the danger posed by radioactive flora and fauna lives on. Today on our site, Ron Broglio writes about the radioactive boars invading towns in southern Germany, several hundred miles from where the reactor exploded in Ukraine:
They become irradiated by eating plants downwind from the meltdown that contain residual traces of radioactivity—including truffles, tubers, and mushrooms that absorb high degrees of radioactive waste from the soil. Apart from anniversaries like this one, Chernobyl has faded from memory. But for the radioactive elements the disaster expelled, life has just begun. The disaster lives on, but invisibly.
Invisibility is probably the most terrifying part about the aftermath of Chernobyl. How can a threat so insidious as radiation be visualized or depicted, let alone faced? In a piece for us today on Chernobyl’s literary legacy, Michael Lapointe writes:
Through three decades of literary response, Chernobyl has undermined the sort of authoritative depiction that might bring closure. But something closed can be forgotten. The finest works express profound doubts about the power of language to absorb a disaster of this magnitude, and so continually reopen it to new ways of being remembered.
Alan recently curated a photo essay of the post-Chernobyl cleanup, attempting to render the invisible meanings of this disaster visible. The images of Pripyat, the now-abandoned town where the nuclear plant was located, show structures overgrown and fallen to pieces, grass poking between paving stones, branches twined around beams, and the encroaching vines and trees merging with the architecture:
A playground in the deserted town of Pripyat on November 27, 2012. (Efrem Lukatsky / AP)
And Broglio describes the animal life taking over the Exclusion Zone:
Rare species not seen in the region for hundreds of years have returned, including the Przewalski’s horse, the European bison, the lynx, and the Eurasian brown bear. Without fear of being hunted, the animals roam the forest and the ruins of cities in what has become an eerily post-human wildlife sanctuary.
Nature is taking back Chernobyl, which is almost reassuring until you remember that radioactive elements are still in the soil. Thirty years later, the legacy of nuclear disaster—which, Broglio notes, could ultimately lead to 4,000 deaths—perpetuates a paradox: Gradually, life returns to the dead zone; and gradually, death grows.