Reporter's Notebook


Earlier this month, Anjelica Jade Bastién pointed to Jared Leto’s performance in Suicide Squad—the preparations for which involved watching footage of brutal crimes and sending his fellow cast members used condoms—as evidence that “Hollywood Has Ruined Method Acting.” A reader who’s been a member of the Actors Studio for 15 years is outraged by Leto’s antics and the commentary surrounding them:

The pop understanding of Method Acting is that an actor walks around the set in character, goes home in character, does weird stuff, and ruins the environment for everybody else on set. And that is just NOT what Method Acting is. Method Acting is using your own experiences to make a role deeply personal, particularly when your character has moments that diverge from your experiences.

If you are playing Catherine the Great and want me to call you “Your Highness” on set, that’s perfectly fine. That’s your process. But if you treat the PAs like serfs, you’re just a jerk.

Or, as another reader puts it:

In order to play a killer, one does not need to go out and kill people. That is not acting; that’s insanity.

In a previous note, my colleague Chris had much more on method acting from other readers and actors. What’s striking to me is how these comments—with the distinction they draw between literally experiencing something and using an actor’s art to imitate it—echo a debate from more than a century ago, when acting in movies was something entirely new. Back then, as Annie Nathan Meyer wrote in the January 1914 issue of The Atlantic,

Many good souls to-day—after attending their first performance of the modern highly perfected moving pictures—pronounce the death of the art of acting. … Now I am frankly of the opinion that it is not the art of acting that is in any danger, but that it is rather that a certain tradition of acting is indeed passing away. … The stage in many ways has held curiously aloof from the spirit of its age. … The truth is that the freedom of the theater, its right to mirror life untrammeled and unquestioned, has not been won in the sense that such freedom has been won in the other arts.

She goes on to argue that the theater of her time—when it isn’t prioritizing historical or allegorical subjects over realistic, relatable narratives—is caught up in attempts to make settings and costumes seem real, while representations of character remain grandiose and stilted. In this way, the advent of film provides an opportunity for theater:

Will the drama cease to concern itself with an eye-deep realism and concern itself with the soul-drama in which the cinematograph will scarcely attempt to rival it? … This is my hope. … Hopelessly outclassed in realism, in the apotheosis of the commonplace, by the modern photographic invasion, the drama will—even as painting did before it at the oncoming of the photograph three quarters of a century ago—escape into the realms of a heightened personality and an enriched imagination. … The modern actor must … give us an art so personal, so elusive, that the camera cannot follow him into the new realm at all.

More than two decades later, in his 1936 essay “The Work of Art in the Age of Mechanical Reproduction,” the German critic Walter Benjamin was also skeptical that film could ever capture an actor’s art:

The stage actor identifies himself with the very character of his role. The film actor very often is denied this opportunity. His creation is by no means all of a piece; it is composed of many separate performances. … Let us assume that an actor is supposed to be startled by a knock at the door. If his reaction is not satisfactory, the director can resort to an expedient: when the actor happens to be at the studio again he has a shot fired behind him without his being forewarned of it. The frightened reaction can be shot now and be cut into the screen version. Nothing more strikingly shows that art has left the realm of the “beautiful semblance” which, so far, had been taken to be the only sphere where art could thrive.

Enter method acting, the modern acting technique that promised all the “heightened personality and … enriched imagination” that Meyer had called for in 1914.

Developed by Lee Strasberg, the artistic director of New York City’s Actors Studio from 1951 until his death in 1982, it reached its height in the 1950s and ’60s, when Strasberg trained actors like James Dean, Jane Fonda, Paul Newman, and Marilyn Monroe in his signature technique. As if in response to the discontinuity of filming that Benjamin criticized above, method acting required actors to consider their characters as complete people, with identities and pasts extending beyond the boundaries of a play or a film. It also made acting personal, requiring intense introspection from actors to help them identify with a role. Here’s how the first reader quoted above describes it:

If you’ve never had a child, and your character’s child goes missing, THAT'S where the Method can help. Have you ever felt desperation and loss like that? So you explore, you do affective memory, maybe sub-personality work, maybe create a place where you felt loss, until you hit on the one that works. Then you own it, keep it to yourself, and let it happen on camera.

You can watch Strasberg himself talk about the method in the clip below (the interview starts about 30 seconds in):

For an art that had transformed from public performance into something experienced alone on either side of a camera’s lens—forcing actors to act, in Benjamin’s words, “not for an audience but for a mechanical contrivance”—these private, deeply personal techniques made sense. Method acting, with all the emphasis it placed on emotional depth and subtlety, was perfect for a medium that now allowed long close-ups of a character’s changing face, flashbacks to gradually reveal her past, soundtracks to complement a mood, and more variation in scene and setting than had ever been possible onstage. For all Meyer’s hope that stage actors would surpass the cinema in “soul-drama,” film offered new opportunities for that drama to be fulfilled—and method acting provided a transformative way to do it.

Today, though, as Bastién notes, “going to great lengths to inhabit a character is now as much a marketing tool as it is an actual technique”—and the private transformations that once constituted method acting have been replaced, at least in the popular understanding of the term, with more spectacular stunts. Leto sent his coworkers a dead pig and a live rat in pursuit of the Joker’s mindset; Leonardo DiCaprio slept in an animal carcass for his role in The Revenant. Reading Benjamin’s essay with this in mind, another passage jumped out at me:

Never for a moment does the screen actor cease to be conscious of this fact[:] While facing the camera he knows that ultimately he will face the public, the consumers who constitute the market. This market, where he offers not only his labor but also his whole self, his heart and soul, is beyond his reach. During the shooting he has as little contact with it as any article made in a factory. … The cult of the movie star, fostered by the money of the film industry, preserves not the unique aura of the person but the “spell of the personality,” the phony spell of a commodity.

This era in Hollywood, as my colleague David has written, is more factory-like than ever—reliant on blockbusters, repackaging sequel after remake after reboot, and selling stunts like Leto’s and DiCaprio’s as much as it tries to sell films. It’s an industry built on bombast and spectacle, made for a massive public, and it lends itself to phony commodifications of an actor’s work.

Yet meanwhile, we audiences consume acting more privately than ever—on personal laptops and tablets and smartphones, via streaming accounts whose algorithms guess what we like. We chill with Netflix, oh-so-intimately, and spend our weekends binging on serials, privately treating ourselves to hours of indulgence and pleasure and shame. Watching alone, we invest ourselves deeply in the lives of characters, performing the same kinds of relational, introspective work that true method actors do.

So maybe method acting isn’t dead. Maybe it’s the perfect time for actors to reclaim it, particularly if Hollywood wants to reach audiences where they are. Bring back introspection and personal reflection for an era of small reflective screens, to stage a private confrontation: just you and the actor, alone in the dark.

Last week, Paul Barnwell worried that academics are crowding out character education in schools. Some readers pointed out how lessons in ethics and morality could be integrated with academics. But meanwhile, according to Barnwell, the pressure seems to have given students a worrying obsession with getting ahead:

The 2012 Josephson Report Card on the Ethics of American Youth reveals a pressing need to integrate elements of character education into the country’s public-school curriculums. According to the study, 57 percent of teens stated that successful people do what they have to do to win, even if it involves cheating. Twenty-four percent believe it is okay to threaten or hit someone when angry. Thirty-one percent believe physical violence is a big problem in their schools. Fifty-two percent reported cheating at least once on an exam. Forty-nine percent of students reported being bullied or harassed in a manner that seriously upset them.

And it’s not just students who are part of the problem, says Bob, a reader from Dallas: “In our good public high school, when our daughter and her classmates complained of others cheating, the teacher replied, ‘You could cheat too,’ as if that were some kind of solution to the competitive imbalance.”

Marvin, a teacher in California, would never say something like that:

Just talking or teaching about character and ethics won’t accomplish much, if anything. It has to be ingrained and enforced. I graduated from the U.S. Naval Academy, where there is zero tolerance for cheating. Anyone caught was immediately expelled, including half the football team one year. I never saw any cheating.

At first, I warned my MBA students that cheating was unfair to their classmates and would not be tolerated. When I caught a student cheating, I flunked him from my course. It would have delayed his program completion date. The administration forgave him. It gave everyone there who knew about it, including me, a different attitude and expectation about student cheating.

Back in 1883, Atlantic contributor Oliver Johnson warned that a failure to address problems like student cheating—or, more to the point, a failure to train students not to cheat in the first place—could have dire consequences:

If, as people of every variety of belief in respect to religion confess, a sound moral character is indispensable to good citizenship, it behooves the state, if possible, to find a way of so training the youth of the country that they will be reasonably certain to form such a character. It must not content itself with imparting secular and scientific instruction alone. The consciences and the affections, or, as [philosopher Herbert] Spencer says, the moral sentiments, of children must be cultivated, or the quality of citizenship will so deteriorate as to endanger the republic.

Reading Johnson’s words on citizenship in the midst of a deeply divisive election, I can’t help but worry that the republic is already in danger—not only from a lack of good citizenship, but also because people on both ends of the political spectrum feel they can’t take a fair system for granted. The Democratic nominee is embroiled in a scandal that suggests either she or her party’s officials cheated in the primaries. The Republican nominee is regularly called a con man and a pathological liar, and no one who’s seen him debate can doubt he’s a bully. As one reader puts it: “Well, when you have a presidential candidate who defrauds investors and clients, lies every time he opens his mouth, then calls everyone he doesn’t like names, how do you expect kids to act?”

Reader Linda, an educator who has facilitated a character education program, has one suggestion for how to discourage bullying:

I believe resiliency training is the key to a school atmosphere that controls bullies at the source—within each child of the school. A child possessed of a strong, resilient attitude is less likely to be bullied. Bullies readily identify the vulnerable in any school group. A bully requires an audience and emotionally weak children to intimidate. A child that can simply walk away, and bystanders who stand up to those bullies, reduce the incidence of bullying. Children possessed of resilient character attributes are capable of such self-determination.

Programming has to be grade-specific and consistent, delivered on a weekly basis. A complete buy-in by the staff throughout the school is essential.

Have you got ideas or experience with character education? Please share them. Update from a reader who disagrees with Linda’s approach:

I have a problem with her advocating resiliency training. There’s nothing inherently wrong with such training, but it won’t stop certain forms of bullying. If a bully hits another kid, it hurts. If a bully takes another kid’s food or money, those things are gone. If a bully dumps your books in a puddle, your books are messed up. Linda apparently doesn’t realize how bullying can be violent—and escalate. Telling the authorities often involves further abuse. She’s blaming victims here.

Another reader weighs in:

Linda’s wording implies that if only children were resilient, they wouldn't be bullied. I learned to be assertive as a kid by standing up to bullies, and since then I have had many opportunities to be glad of the lesson. Sometimes people will indeed stop bothering you if you ask them once, firmly, to stop, and then refuse to be affected by it if they try to persist. But sometimes all that resiliency will get you is many years of sexual harassment that your teachers conveniently never have to deal with on account of you being so resilient.

Parents, educators, students: Have you found ways to successfully discourage bullying at your school? Let us know.

Paul Barnwell, who teaches high school English in Kentucky, wrote a story for us last week about students’ broken moral compasses. As he argues, pressure to ensure their students meet high academic standards has led schools to skip over important discussions about ethics and character, narrowly tailoring their curricula to standardized tests. Here’s Barnwell:

As my students seemed to crave more meaningful discussions and instruction relating to character, morality, and ethics, it struck me how invisible these issues have become in many schools. By omission, are U.S. schools teaching their students that character, morality, and ethics aren’t important in becoming productive, successful citizens?

But as readers point out, meeting academic goals doesn’t have to conflict with moral education. One reader thinks a high-school economics requirement might be the solution:

I know some schools have it, but my sense is that most don’t (and that it’s an elective even where it does exist). Even if one objects to neoclassical economics, I think that a class (or three) could do a strong job of teaching students the normative (value-based) side of economics as well as the positive (fact-based) side:

  • Here's how GDP is calculated. (What important moral considerations does GDP leave out?)
  • Here are the problems one finds with polluting industries. (What is the best method for pollution abatement? And how does one avoid regulatory capture?)
  • Here's trade theory. (What are the potential benefits of free trade? What are the potential losses?)

Of course, this only addresses the discussion side of things. But it could be extended to actions. My old junior high school, for example, had a program where a fund was set aside for movies shown at lunch time. But for every act of vandalism, the cost of fixing it was deducted from the fund. And such acts and their costs were announced in home room the day after they happened. That incentive structure made vandalism much more an attack on the student body, rather than merely the creation of a mess that “somebody else” would have to clean up.

Another reader suggests approaching ethics through the humanities:

I can’t imagine teaching literature without engaging in discussions about choices, right and wrong, and differing values across cultures.

For me, one of the main reasons we read literature is to explore moral issues and to consider how characters respond to a variety of situations. My classes have always read works from many different cultures and time periods, so that we can discuss what ideas vary by culture and which seem to be fairly universal. Reading is the one place where we can truly inhabit another person's mind and viewpoint, which means it is a powerful way of developing empathy and expanding one's point of view.

Students are able to passionately argue their ideas about these issues when given the opportunity. And any good story or poem will have a moral dilemma or question lurking in its conflict or subject matter, from Shakespeare to Winnie the Pooh to the latest Young Adult series. The teacher needs to ask open-ended questions, hear out the students’ reasoning, and ask follow-ups that dig deeper and help kids see the implications of what they suggest. A writing assignment growing out of such a discussion is also a fruitful way to teach the use of evidence, support, and clear argumentation.

The Atlantic has tackled this topic before. In 1883, in an essay titled “Morality in the Public Schools,” Oliver Johnson suggested a more organic approach to cultivating kids’ consciences:

Philosophical disquisitions upon the foundations of morality have no legitimate place in the school-room, as every well-instructed teacher will admit. … Moral instruction, to be effective, must be spontaneous and free, and skillfully adapted to cases as they arise. The best teachers, as a general rule, will have the shortest code of laws, if indeed they have any code at all.

For the most part, I’d agree with him there. The K-8 school where I attended 6th-8th grades required a less-than-spontaneous morning pledge “to practice active listening, to use no put-downs, and to do my personal best” that unfortunately got mocked much more often than taken to heart. But for older students, I’d say philosophical disquisitions aren’t so bad either. My university’s core curriculum included a mandatory philosophy class, first established in 1917 to teach students the “skills of peace.” The questions my professors and classmates raised in that course have stuck with me, and I’ll be thinking about them for a long time yet.

Are you an educator with experience bringing ethics into the classroom? Do you remember lessons from school that shaped your ideas of right and wrong? Tell us about it at [email protected].

NPS / D.S. Stanko

Lately I have been saying goodbye to New York City—to bagels and bridges and underground tunnels and my sleepless college years. I have eaten my last halal dinner in Riverside Park, and I’ve stopped at the Met to give my last regards to Joan of Arc. Some night before I move this month to Washington, DC, I will take the Staten Island Ferry for the last time in a circle, lean over the rail and watch the Statue of Liberty rise, come close, and then recede.

I am saying goodbye to the places I know. But I had never yet been inside Grant’s Tomb, a national monument to the Civil War general and U.S. president located by the Hudson River at 122nd Street. This in spite of the fact that I graduated from Grant High School in Portland, Oregon, and then from Columbia, where I lived for four years just a few blocks west of the general’s resting place. You could say that I owe Ulysses S. Grant—who died 131 years ago today—my education, among other things. It felt wrong to leave New York without paying my respects.

Who’s buried in Grant’s Tomb? No one, as the punchline goes, since Grant and his wife are entombed—not buried—in sarcophagi, raised on a dais, and watched over by the busts of Civil War generals. A/C clatters in the shadows. Sun haloes the neoclassical dome. The murals high on the walls show victories: Vicksburg, Chattanooga, Appomattox. The inscription on the marble facade reads, “Let us have peace.” This, in 1868, was Grant’s presidential campaign slogan. As the park ranger explained to me: Having won the Civil War as the “Unconditional Surrender” general, Grant ran on a platform of black civil rights and reconstruction, promising, essentially, to make America whole again.

Richard Rubin, writing about Grant’s Tomb in the July 1996 issue of The Atlantic, confessed that his favorite exhibit at the monument was the guest register:

Of course, it's not the kind of thing you tend to notice immediately in a classical mausoleum with two eight-and-a-half-ton sarcophagi of Wisconsin red granite, five scowling bronze busts, and a pair of seventeen-foot-high wood-and-bronze doors. [But] I’d hate to think what would happen if the Parks Department ever gave up on the guest register. At the very least, some local denizens would lose a place to record thoughts, ideas, or merely the fact that they are still alive.

Two decades later, the guest book is still there by the door, with the same brief comments: “Great general of the Civil War.” “Beautiful and humbling.” “WOW.” The day I went, July 15, a couple from Nice, France, had signed the register. Five thousand miles from the tragedy unfolding in their city, they stood in the tomb among the draped flags and murals and wrote, in French, “Very beautiful place to remember.”

“I am deeply sad,” wrote Ta-Nehisi Coates in 2010, after reading the memoirs that Grant finished in his last days, before he died of throat cancer on July 23, 1885:

Library of Congress

Toward the very end (when this picture was taken) he could no longer talk and was in constant pain. Knowing that, death is always in the background for the reader. But having Grant acknowledge death is breath-taking. There is so much there—a twice elected leader of the most advanced nation in history. A tanner’s son, failing at so much, turned savior of his country. A slave-holder turned mass emancipator. The warrior transformed into a warrior-poet, and to the last embracing the hare-brained scheme of black emigration.

It’s all just too much. I am a black man, and God only knows what Grant would have made of me in that time, or in this one. I asked myself that question so many times while reading that I made myself ill. I don’t care to ever hear it again. Grant is splendid to me, and I am sick of keeping score.

Who’s buried in Grant’s tomb? No one. There’s a deep sense of loss in this joke—as if the monument this country built to the hope of its healing were empty; as if we break, over and over, even after fighting so hard to stay whole.

But then again, that’s what America does. We’re a nation of contradictions. As one of Ta-Nehisi’s readers wrote six years ago:

I’m an Army officer in Afghanistan, and the best of us here learn what made Grant such a good soldier, which is that we have to be hard and kind and stubborn and conciliatory and embody all sorts of contradictions in order to get from here to there. Grant was the best officer and citizen the Army has ever produced, in my view, in large part because he embodied all the contradictions that come with the United States. Grant neither had nor claimed any big answers, but he was a thoughtful, observant American who did his best as he understood it literally till the day he died.

“Let us have peace,” says Grant’s Tomb. Who’s buried there? No one. That kind of hope doesn’t die.

Joe Cavaretta / AP

The winning entry of our reader contest for the best walk-on song for Trump, “You’re So Vain,” reminded me of a literary reference to vanity dropped by the conservative writer David Brooks in our March 2002 issue:

Pretty soon the hedonist will be sitting at the baccarat table in a low-cut pec-neck sweater and alligator loafers, failing to observe the distinction between witty banter with the cocktail waitress and sexual harassment. His skin will have that effervescent glow that Donald Trump’s takes on in the presence of gilded metal and ceiling mirrors.

In a similar vein, Trump biographer Gwenda Blair—via a book review by Jack Beatty for our October 2000 issue—had a pretty damning label for Trump:

“The Donald is fantastic in the golf and very good in the tennis,” Ivana Trump once observed, imperishably, of that “national symbol of luxury and sybaritic [self-indulgent] excess” Donald Trump, whom Gwenda Blair depicts as a Gatsby of self-infatuation transfixed by the green light at the end of his own dock.

William Powers similarly called out Trump in his November 2005 essay on the narcissism of aging Baby Boomers.

But if there’s one theme that most characterized Trump in our culture prior to his presidential run, it’s flashy wealth. Of the 25 print pieces in The Atlantic that referenced Trump between 1992 (our earliest mention of him) and early 2011 (when Trump burst on the scene of presidential politics with birtherism, notwithstanding his flirtation with a Reform Party run in 2000), most are offhand references to luxury.

Compiled here are many such examples, from writers across the political spectrum. From our September 2002 issue, libertarian P.J. O’Rourke:

Peering into bright living rooms, I could see another emblematic Cairo item—the astonishingly ugly sofa. An ideal Egyptian davenport has two Fontainebleaus’ (the one in France and the one in Miami) worth of carving and gilt and is upholstered in plush, petit point, plaid, and paisley, as if Donald Trump and Madame de Pompadour and Queen Victoria and The Doors had gotten together to start a decorating firm.

From our April 2004 issue, Joshua Green profiled Ralph Reed, “born again as a political strategist”:

[Reed’s] position as a political consultant to [George W.] Bush is a subordinate one, however, and demands that he never outshine his client. Here Reed struggles a bit. His double-breasted navy suit, impeccably knotted silk tie, and matching gold cufflinks and wristwatch are more Donald Trump than Organization Man.

In stark contrast to Trump and Reed is the Midwestern magnate Warren Buffett, whom Walter Kirn profiled for our November 2004 issue:

Buffett’s attitudes and mannerisms now stand for American capitalism itself—or at least for its more positive aspects. He is what’s good about the free market, in human form—akin to what Joe DiMaggio was to baseball. Bill Gates may be richer, and Donald Trump (the anti-Buffett) flashier, but compared with Buffett they’re mere character actors.

Kirn again underscores Trump’s ostentatiousness:

While the Trumps and Iacoccas of the world prefer to present themselves in garish books with jackets featuring large color photos of their own faces, Buffett, the legendary midwestern cheapskate with a knack for discovering hidden value in cookware clubs (The Pampered Chef) and encyclopedia publishers (World Book), has reclaimed a form of junk mail for his collected works. Buffett’s penny-pinching persona doesn’t allow for lavish photos or graphics; the reports are all text, and they’re printed in black-and-white.

William Powers went looking for penny-pinchers in our July 2006 issue:

Are there no Greens or Gettys in America today? I follow the news pretty closely, and I can’t think of a single infamous tightwad. We celebrate the filthy rich of our culture, turn the Donald Trumps and Paris Hiltons into idols. To read the mainstream press, not to mention the celebrity rags, being rich is a heroic act all by itself.

Lastly, a bit of historical irony from Joshua Green in our January 2007 issue:

If John McCain loses the Republican nomination, he’ll be too old to try again in four or eight years and would loathe waiting around—why not take a final shot at the White House? If Barack Obama concludes that his time is now and yet can’t stop the Hillary juggernaut, might he cash in his chips before his popularity wanes? And isn’t [business tycoon] Jack Welch looking for something to do? Or—heaven forbid—Donald Trump?

Trump and Warhol in 1983. That horse captures my mood this week.  (Mario Suriani / AP)

In the previous installment of our archive series, Fallows featured a popularity poll of Donald Trump in 1990 that one of his wealthy supporters fixed to give the false impression that 81 percent of those surveyed believed that Trump “symbolized what was right with the United States.” (Let’s hope that kind of ballot-box stuffing doesn’t happen in November.) That incident makes me think of another Atlantic piece we came across in our archives: “Vote of the Century,” written by Barbara Wallraff for our November 1992 issue (unavailable online). It’s a dispatch from Lake Buena Vista, Florida, home of Walt Disney World, and Wallraff reports on what then-CEO Michael Eisner in the following video calls “one of the most significant projects in which the Walt Disney Company has ever embarked”—a decade-long survey to determine the “Person of the Century”:

Here’s Wallraff with more on the ambitious project:

The Electronic Forum is a couple of rooms full of ATM-like kiosks in Epcot Center’s Communicore East building. I walked in out of the sun to find a computer screen inviting me to take a stand, or stands, on the Person of the Century question by choosing up to three contenders from a list of eighty-nine names and a write-in blank. Only thirty names, or twenty-nine and the blank, appeared on the screen at once, but the list is in alphabetical order, and so it was pretty clear from the outset that I could call up more names at the touch of a button. Bill Cosby, Marie Curie...Mao Zedong, George C. Marshall, Maria Montessori...Jim Thorpe, Harry Truman, Donald Trump, Andy Warhol...I spoiled my first ballot trying to flip through the whole list again and again, and had to start over.

She adds, “Obviously, this poll is not scientific. For one thing, anyone can hang around the Electronic Forum all day voting for his or her favorite candidate.” Which is exactly what happened—though not for Trump this time. Here’s the story from Jim Hill at The Orlando Weekly:

If you typed in anyone’s name—and I mean anyone’s—the computer registered that entry as a legitimate vote. So, as a gag, Epcot cast members began dropping by on their lunch breaks and typing in the name of a particular employee. At first, it was just a few people doing this. But the gag snowballed. Which is how this cast member ended up on the tote board as one of the top 10 candidates for “Person of the Century.”

Management was furious. But there was no way they could delete the employee’s name without corrupting the results of the whole poll. Plus, there seemed to be no way to stop Epcot employees from typing this guy’s name in.

When Communicore closed in July 1994 to make way for Innoventions, Disney quietly pulled the plug on its poll and pretended the whole thing never happened.

If only we could do the same with the Trump candidacy.

Here’s an extraordinary sight for today:

Back on Earth, on July 20, 1969, people all over this world gathered to watch humankind’s first steps on another world. But in South Africa, where TV was banned under apartheid, the only way to tune in to the moon landing was via radio. In the July 1999 issue of The Atlantic, Rob Nixon, who was a teenager in South Africa during the landing, remembered:

Faced with the prospect of missing the moonwalk, even conservative white South Africans began to grumble. … Some months after NASA’s triumph, the government sought to quell local discontent by arranging limited viewings of the taped landing. We had to line up at a planetarium: Mondays, Wednesdays, and Fridays for whites; Tuesdays and Thursdays for blacks.

The turnout was immense. Policemen with German shepherds and Dobermans straining at the leash patrolled the line. After hours of waiting I entered a barricaded enclosure and joined twenty other people seated on collapsible metal chairs. A sullen moustached man tugged a sash, a purple velvet curtain slid back, and a television was revealed. For fifteen minutes I witnessed a lunar landing that seemed no stranger than the unearthly presence of that black box in the room. Then the curtain was closed again and we filed out, abandoning our seats to the next twenty people in line.

The moon landing, according to Nixon, “marked the beginning of the end of the apartheid government’s conviction that South Africa could remain a fortress against television.”  

It was the kind of event that couldn’t fail to unite people in fear and wonder and triumph—a historical moment made for precisely the unifying and equalizing effects of television that the oppressive regime feared. After Apollo 11, white South Africans campaigned for TV, arguing—rather ironically, at the height of the civil rights movement—that this technological deficiency made the country look backward, reactionary. Like the velvet curtain at the viewing Nixon attended, a wall separating South Africa from the rest of the world had slid back for a moment and could never be closed quite as tightly again.

After all, a glimpse of outer space can provide profound perspective. In our September 1874 issue, N.S. Shaler reflected on what was then the recent scientific conclusion that the moon could not support human life. He concluded:

The picture which modern science paints of the moon is cold and hard, and at first sight saddening. It is no more the land of our dreams, a refuge for those who find our blooming earth too hard. … With the telescope we seem to go with the quickness of sight away from the present, to stand in the face of primeval chaos. … Standing in the presence of a worse than ruined world, we feel our confidence in the universe to be weakly founded. Beneficence, creative power, omniscience—all the great words we coin for use on earth seem to have no place here. …

But with time … the persistent student of the moon will find its silence and peace wonderfully attractive. … He will find it a physical Nirvana where matter has lost its eagerness and endless longings to rest in peace. When he comes back to the earth again … we are sure that he will be the more content with the world and all its ways.

Standing in the presence of a worse than ruined world, we feel our confidence in the universe to be weakly founded. It’s a sentiment that resonates painfully today, nearly 150 years later, in a month of violence and uncertainty around the world. Which makes it a good time, perhaps, to look up at the moon and back at the moon landing 47 years ago. It’s a sight that brings peace and contentment, yes. But it also brings a reminder of progress.

Earth rises above the moon’s horizon during the Apollo 11 lunar mission in July 1969. (NASA / Reuters)

(See all Orbital Views here)

AP

Donald Trump’s successful campaign is genuinely something new. But Trump himself, along with many of the distinctive Trump moves with which people worldwide are now so familiar, comes with a surprisingly long record of marks across our public mind. That’s the purpose of the items in this thread: to follow the spoor of this extraordinary figure’s emergence in modern America’s public consciousness.

The earliest known appearance of Donald Trump in The Atlantic’s pages was nearly a quarter of a century ago, in an article by Amitai Etzioni. The October 1992 piece is called “Teledemocracy,” about the ways then-dawning digital technologies might improve democratic processes. Etzioni made the case for “electronic town meetings” that prefigured some of today’s real-time mass-participation events. In exploring the possibilities, he said:

Once we put our minds to it, other shortcomings of the electronic town meeting could be fixed. Take, for example, ballot-box stuffing. Even when much less than national policy is at stake, call-in polls have been grossly manipulated.

Richard Morin, the polling director for The Washington Post, reports two such incidents. In one, USA Today asked its readers in June of 1990 if Donald Trump symbolized what was right or wrong with the United States. Eighty-one percent of the 6,406 people who called in said that he was great, 19 percent a skunk. It turned out that 72 percent of the calls came from two phone numbers.

(Why am I not providing a link to this article? Because it’s from that weird between-two-eras moment in digital-journalistic history, in which the rights for electronic publication had not been fully worked out. A number of our articles from that era, including some of my reports from China and Japan, are not yet online.)

To be clear about this story: Etzioni was discussing an episode in the early 1990s in which, as all evidence suggests, Donald Trump or his allies flooded a phone poll to create a favorable result for himself. Trump was in his early 40s at that time — and it was in exactly this same era that he was calling journalists, posing as his own publicist “John Miller,” to say how kind and generous Mr. Trump was, and how sexually attractive famous actresses and models found him. There are more delicious details about that 1990 rigged poll in a WaPo piece by Philip Bump, after the jump.

***

People have studied Abraham Lincoln’s self-education for clues about the man he became. The different struggles of Teddy Roosevelt and Franklin Roosevelt with disease and physical challenges. How Margaret Thatcher developed the spine to become the U.K.’s Iron Lady. Having lived so much of his life in public, Donald Trump has also given us clues of how he became the kind of person we’ll see accept the nomination tomorrow night.

***

Here are a few highlights from P. Bump’s piece, toward the end of encouraging you to read it in its entirety:

During the period in 1990 when Trump and his first wife Ivana were very publicly ending their marriage, USA Today asked its readers to call in to offer their opinions on the real estate magnate. People could call to agree with one of two statements:

  • “Donald Trump symbolizes what makes the USA a great country,” or
  • “Donald Trump symbolizes the things that are wrong with this country”

It's not entirely clear why USA Today decided to do the poll, other than that they wanted to leverage a good water-cooler topic into a little extra spending money. The calls were 1-900 calls costing $0.50 a piece—and nearly 8,000 people weighed in….

But there was a problem.

“The calls had been running a steady 2-to-1 in Trump’s favor Friday and Saturday,” USA Today's Gary Strauss wrote. “However, a surge of more than 1,000 calls in the hours before the hot line ended at 6 p.m. EDT Sunday ran 93% positive.” Strauss quoted a guy from the company that conducted the survey: “That's definitely odd, out of character with how these things go.”

And the story goes on to explain the origins of the oddity. Behold our nominee!  

Donald with his father Fred and boxing promoter Don King at a press conference in December 1987 in Atlantic City (AP)

Greetings from Cleveland! Where to start?

Well, here’s one possible starting point. Everything about Donald Trump’s rise suggests a Year Zero, history-begins-this-instant approach to norms, traditions, constraints, you name it.

So in an effort to show the history behind the tabula-rasa of this anti-history, we’ll be highlighting some items from The Atlantic’s archives concerning the way Donald Trump has registered in the national consciousness before he became a supernova over this past year and even before his birtherism burst on the scene in 2011.

Let’s start with a review from our January 1999 issue, written by Nicholas Lemann about Neal Gabler’s Life the Movie: How Entertainment Conquered Reality. (Nick Lemann was then an Atlantic colleague; he subsequently joined The New Yorker and became dean of the Columbia Journalism School. Neal Gabler is the author of our recent cover piece on the Secret Shame of the Middle Class.)

Here is how Lemann referred to Trump’s role as avatar and exemplar of a trend that has only become more pronounced:

Gabler rolls out dozens of examples of the transmogrification of life into stock drama, as entertainment techniques have relentlessly leached into non-entertainment venues. In politics the quadrennial political conventions have changed from real dramas to pageants staged for the purpose of winning the votes of television viewers. Ronald Reagan turned the presidency itself into a procession of scripts and images. The docudrama and the novelistic lead are ubiquitous in journalism. The self-dramatizing memoir has taken over book publishing.

Donald Trump became a tycoon by making himself a celebrity first. Ordinary people have turned from religion to the worship of celebrities (Gabler points out that the Air Jordan logo resembles a crucifix), and have also become the dramaturges of their own lives with the aid of home video cameras, Internet chat rooms, and health clubs joined in the hope of getting to look like a star. Busted farmers stock their land with exotic animals and go into “agritainment.” Even the Pope, Gabler implies, is stealing his moves from James Brown.

More to come from the “Trump in American memory” files. Thanks to The Atlantic’s Chris Bodenner, Caroline Mimbs Nyce, and Graham Starr for spelunking through our archives.

***

What is this Year Zero of which we old timers speak? It seems a useful touchstone given the tone of politics these days.

Obviously it’s not fair to say that what we’re hearing from Donald Trump has no historical precedent. As discussed often in this space, from the days of the Know-Nothing Party onward the United States has swum in political currents like those swirling now.

But this humble Wikipedia definition of “Year Zero (political notion)” has some relevance:

The idea behind Year Zero is that all culture and traditions within a society must be completely destroyed or discarded and a new revolutionary culture must replace it, starting from scratch. All history of a nation or people before Year Zero is deemed largely irrelevant, as it will ideally be purged and replaced from the ground up.

Trumpism claims to “give us our country back,” but only after razing what the country has actually become.

It has come to the attention of our editorial board—a group of august, Harvard-educated, middle-aged Boston Brahmins in tweedy suits sitting at heavy wooden desks and smoking fine pipe tobacco * —that there’s a controversy afoot involving “The Battle Hymn of the Republic.” To wit, former President George W. Bush is being criticized for swaying just a little too zestily during a rendition at Tuesday’s memorial service in Dallas for five police officers killed by a gunman:

Even the Associated Press has weighed in, reporting with studious vagueness that “Some criticize Bush’s behavior as inappropriate given the solemn occasion. Others are using the moment to post videos of Bush dancing awkwardly in the past.”

Let us (we tweedy band of editors) stipulate that this is hardly the most important or momentous news of the day. Let us stipulate further, however, that as the periodical that first published Julia Ward Howe’s abolitionist poem, The Atlantic feels a special obligation to weigh in on the matter.

So here it is: Eh, let the guy be.

Look, any criticism delivered can only pale in comparison to the greater penalty Bush faces in this case, which is for anyone to watch this video, in which he looks like—to use the scientific term—a doofus. The true star of this clip is First Lady Michelle Obama, who regards Bush with what looks like affectionate shade and helpless embarrassment as he rocks out, even as the rest of the dais stands somberly. But when the choir hits the chorus (“Glory, glory hallelujah!”) both Obamas seem to get into the act, swaying along with Bush.

Two points here: First, it’s not the case that getting in the spirit and even laughing are incompatible with memorializing the dead, a point made eloquently by Obama’s own rendition of “Amazing Grace” at a memorial in Charleston for those slain at Emanuel AME Church. Second, it’s the “Battle Hymn of the Republic,” not the “Battle Dirge of the Republic.” The tune was borrowed from a religious camp meeting song, and even before Howe wrote her lyrics, Union soldiers had adopted it as a marching song, under the name “John Brown’s Body.” These days the song is often employed at sporting events, such as this lively performance of Florida State University’s marching band, complete with a flag-waving color guard and dancing cheerleaders—for a September 11 commemoration, no less:

In short, it’s a song made for movement, not stiffness.

In conclusion, leave Dubya alone.


* Not really.

Bain News Service / Library of Congress

Looking through the Atlantic archives on William Butler Yeats’s birthday—the legendary Irish poet was born on June 13, 1865—I stumbled upon a funny anecdote from our May 1919 issue. In a remembrance written shortly after the death of his friend Theodore Roosevelt, Maurice Francis Egan recounts how the president, an avid reader of Irish literature, arranged a meeting with Yeats:

The Celtic poet seemed very happy, but he was silent. President Roosevelt beamed through his glasses, and tried to draw him out.  Suddenly Yeats said, “It’s the Little People we must consider.”

“Oh, yes,” the President rejoined, rather surprised, “I believe with all my heart in the preservation of the little nations.”

Yeats looked astonished, and I said, “By the ‘Little People’ he means the Irish fairies.”

It was President Roosevelt’s turn to look astonished. “Mr. Yeats, have you ever seen an Irish fairy?” he asked, with a glint in his eye.

“Many times,” Yeats said solemnly. “Sure, not only I, but every Irishman, especially the old ones that mow the hay in the twilight, have seen the Little People many and many a time; but they are not small insignificant creatures, like the English fairies; they are giants, the old gods come back again.”

The president was bowled out, but [Roosevelt’s] children found themselves on congenial ground.

What a delightfully awkward encounter between two historical giants. But that’s Yeats, apparently—someone who could be introduced to the president of a foreign nation and feel quite comfortable chatting about fairies. According to a profile of the poet by Louise Bogan for our May 1938 issue:

William Butler Yeats first appears, in the memories of his contemporaries, as a rarefied human being: a tall, dark-visaged young man who walked the streets of Dublin and London in a poetic hat, cloak, and flowing tie, intoning verses. The young man's more solid qualities were not then apparent to the casual observer. But it was during these early years that Yeats was building himself, step by step, into a person who could not only cope with reality but bend it to his will.

Surely that’s something Roosevelt, arguably the most imperial U.S. president, could get behind; as he famously said himself, bending the world to one’s will requires soft words as well as a big stick. Yet Yeats’s most famous fairy poem, “The Stolen Child,” seems less about transforming reality than escaping it:

Where the wave of moonlight glosses
The dim gray sands with light …
To and fro we leap
And chase the frothy bubbles,
While the world is full of troubles
And anxious in its sleep.
Come away, O human child!
To the waters and the wild
With a faery, hand in hand,
For the world’s more full of weeping than you can understand.

After a weekend full of weeping, it’s worth thinking about the kinds of questions Yeats and Roosevelt’s awkward exchange brings to mind: about the place of art, and imagination, and innocence, in governance. About the need to protect the little people, including those made to feel small by forces like homophobia, Islamophobia, xenophobia, or any kind of hate. It’s a time to remember that no one is insignificant. To remember, too, that humans are capable of beauty as well as of harm. As David Sims wrote about last night’s Tony Awards:

Even at the best of times, there’s very little more frivolous than an awards show, but rather than feeling tonally jarring, the revelry ended up being a perfect answer to the misery of the day, its mere existence offering a counterpoint to an act of hatred. It was the kind of night where the Hamilton creator Lin-Manuel Miranda could accept a trophy with a sonnet in praise of love and have it not feel corny, but necessary.

Necessary, because it’s a reminder of what art can do: the solace it can provide, and the love it can promote. In art, we bend reality to our will. In reality, we will ourselves to something better.

Samuel Hollyer / Library of Congress

The beloved author of Oliver Twist, A Christmas Carol, and many other classic works of English literature died 146 years ago today. James T. Fields, The Atlantic’s second editor-in-chief, was a good friend of Dickens, and he published a tribute to the great novelist in our August 1870 issue:

In his presence there was perpetual sunshine, and gloom was banished as having no sort of relationship with him. No man suffered more keenly or sympathized more fully than he did with want and misery; but his motto was, “Don’t stand and cry; press forward and help remove the difficulty.” … He found all the fair humanities blooming in the lowliest hovel. He never put on the good Samaritan: that character was native to him. …

His life will no doubt be written out in full by some competent hand in England; but however numerous the volumes of his biography, the half can hardly be told of the good deeds he has accomplished for his fellow-men.

But in a review of one of those biographical volumes for our May 2010 issue, Christopher Hitchens revealed a darker side of Dickens:

This is the man who had a poor woman arrested for using filthy language in the street; who essentially recast his friend Thomas Carlyle’s pessimistic version of the French Revolution in fictional form in A Tale of Two Cities ... who dreaded the mob more than he disliked the Gradgrinds. … His exiguous chapter on slavery in American Notes was lazily annexed word-for-word from a famous abolitionist pamphlet of the day, and employed chiefly to discredit the whole American idea. But when it came to a fight on the question, he was on balance sympathetic to the Confederate states, which he had never visited, and made remarks about Negroes that might have shocked even the pathologically racist Carlyle. …

What is necessary, therefore, is a portrait that supplies for us what Dickens so generously served up to his hungry readers: some real villainy and cruelty to set against the angelic and the innocent.

How can two such disparate accounts of a man be reconciled? Reading them, I was reminded of Dickens’s A Tale of Two Cities, in which a flawed man redeems himself by dying disguised as a good man.

Charles Darnay and Sydney Carton—the novel’s hero and antihero, respectively—are lookalikes in love with the same woman, Lucie—Darnay’s wife. When Darnay, a French nobleman who gave up his title out of sympathy to the poor, is captured and sentenced to death, the alcoholic ne’er-do-well Carton assumes his identity, dying in his place so that Lucie and her family can escape. On the scaffold of the guillotine, Carton imagines the Darnays will name a child after him and love his memory as much as they do each other. He imagines the child growing up “winning his way up in that path of life … so well, that my name is made illustrious by the light of his … the blots I threw upon it faded away.”

It’s a romantic form of transference, but a strange one: Rather than redeem himself with a heroic act under his own name, Carton all but erases his own history at the moment of his death. In the eyes of the world, in the narrative Carton creates, it’s Darnay who dies, and Darnay who lives; Sydney Carton, dying under Darnay’s name, claims his innocent past and his noble future.

Meanwhile, Dickens, as Hitchens noted, lifted his condemnation of the French Revolution from another famous writer, Carlyle. Yet in fictionalizing his story, Dickens placed himself into his characters—his initials, his demons, his childhood sweetheart in Lucie—just as Carton steps into Darnay’s body. Carton’s famous last words could have been spoken by Dickens himself: “It is a far, far better thing that I do than I have ever done; it is a far, far better rest that I go to than I have ever known.”

Hitchens:

Always saying that he sought rest, and always exhausting himself, [Dickens] may have been half in love with easeful death. The next biography should take this stark chiaroscuro as its starting point.
