READING SUBTLY

This was the domain of my Blogger site from 2009 to 2018, when I moved to this domain and started The Storytelling Ape. The search option should help you find any of the old posts you’re looking for.
 

Dennis Junk

In the Crowded Moment atop the Fire Wave

A reflection on time and friendship, the product of reading a bunch of Cormac McCarthy, going through a breakup, and going out west to visit some national parks with old friends.

Emily had continued ahead of them all along the trail until she was out of sight. Steven had fallen behind, likewise now occluded by the towering red wall of rock. He was sure for a while after he began following Stacy that Steven would be coming along. But when he stepped up onto an escarpment and climbed a ways to a higher vantage he looked and saw no one on the trail behind him. The way Emily plunged forth with no concern for keeping pace with the rest of the group and the fact that now Steven was showing a similar disregard gave him the sense that some deadly tension was building between them. But, then, hadn’t they been like this for as long as they’d been together? He turned and continued along a route above the sandy track on the rock surface, thinking of the rough-grained unyielding folds as ripples caught in some time-halting spell as they oozed along, the lower tiers melting out from beneath those stacked above, the solid formation melting from the bottom up.

            Marching at a pace to overtake Stacy, he watched the dull blood-stained bands of the undulating sandstone pass through the space immediately before his feet and thought of Darwin striding across the volcanic rock surface of an island in the Galapagos, Charles Lyell’s Principles of Geology still resonating in the capacious chambers of his dogged mind, the sense taking hold more solidly that the solid earth beneath his feet had once been, and may again at any moment be, a flowing emblem of impermanence. Later, when the Beagle docked in Chile, the coast only hours before blasted by an earthquake and subsequent tsunami, Darwin, stepping onto the rocky shore, smelled putrefying fish. The quake had lifted what had moments before been seafloor up into the open air, and, seeing the remains of all the stranded marine creatures, he suddenly understood the provenance of the fossil shells and crustaceans he’d found high in the mountains, thousands of feet above sea level.

            A hundred and fifty years later, it seems like such a simple deduction. To advance our understanding of geology or evolution today, he thought, you would have to be much more subtle than that. And yet, he wondered, how many people think of moments in history, the earthshaking conception of earthshattering theories, as they traverse rock formations like these? What do most people think of when they’re walking along this trail? He looked up and saw how the distant mountains bursting up into the sky gave scope to the vast distances, making of the horizon a symbol of the tininess of these individual human bodies tossing imperceptibly about on the tide of eternity, precipitating a vertiginous dropping away of identity, obliteration before the shock wave of detonated timescales.

             Stacy and Emily were still far enough ahead, and Steven far enough behind, that he had the trail to himself. Darwin just isn’t relevant to anyone, he thought. The concentrated heat of the toppling afternoon sun was momentarily chased over the corrugated sandstone by a chill. You can live your whole life, he thought, be moderately happy, and never have to think of evolution or geological timescales even once—hell, you may even be happier that way. And you, he thought to himself, you may have a type of passion teachers think is just delightful—until you let it loose to savage their own lessons. But most people, most of the time, would rather not have to take anything that seriously. Examine the underpinnings to a point. Allow for doubts and objections to a point. The point where having to think about what it could mean isn’t as exciting as it is scary. The way you do it makes people uncomfortable. It made your ex want to kill you.

            He had the thought—before deciding that he wouldn’t be thinking about her anymore this trip—that his ex couldn’t make that final compromise. Whatever disagreement they’d had was years past, but, even though her ability to articulate what this insurmountable barrier in her heart consisted of went no further than “I can’t forgive you,” even though she’d meet the idea with her own detonations, he knew that last unbudgable bit of incompatibility, the ultimate deal-breaker, had everything to do with her gathering recognition that their two worldviews, once so distant, their separate territories so fiercely guarded, had moved steadily closer over the years—and it wasn’t his that was undergoing the displacement. She needed that one refuge of holding out. So I got to be right, he thought. But what’s it worth if I’m alone? And anyway, much as everyone assumes the contrary, I could give a fuck about being right.

            Then there’s Stacy’s easy social grace—masterful really. She has a fine sense of everyone’s perspective, an impressive memory for everyone’s preferences, and, when in doubt, she simply fills the void with the surging energy of her character. Her charms are even such as can accommodate the intensity of his skepticism and passion for science’s refining crucible. What would she be thinking right now? Where she’s going to live in the coming weeks? Whether she’ll be able to find a job in Charlotte? How much she’s going to miss the poor boys she drives so crazy? Or maybe she too is wondering how the rock came to have such clearly demarcated bands, what accounts for the red hues—iron?—and what it means that our human lifespans scarcely even register on the timescale of geology. As vivacious and loquacious as she is, she’s always had an impressively developed inner life. Still, he remembers her nudges under the table that first night he was in L.A., debating with her friends during their apartment gathering about the virtues, or lack thereof, of SSRIs, those nudges which effectively said, “Don’t do that now—don’t be you,” though that last part was more in keeping with what his more recent ex might’ve said.

            As he covered more distance yet failed to overtake the women ahead of him on the trail, he felt increasingly and pleasantly placeless, but the discomfort at leaving Steven behind for all this time began to disrupt the flow of his thoughts. Still, he assured himself, it isn’t like any of us will have the chance to pop over again some other time. Steven could’ve come along; it isn’t my responsibility to make sure no one gets stuck waiting for the others—a task that between them Steven and Emily seemed to be going out of their way to make impossible. His mind went back to the party, to Corina, the pretty blonde, talking about how people give her directions out of Compton whenever she drives through to meet with the troubled teens she tries to help. He scanned his memory for evidence that she was offended or unsettled by anything he said. No, she was incredulous—how could someone say antidepressants don’t work? It was such a foreign idea. But she seemed to enjoy grappling with it, batting it back. She seemed exhilarated to be in the presence of someone so confidently misguided. And fine, he would have said, let’s see how far down the rabbit hole you can go before you start to panic. No, he decides, it really is bullshit, all that about me hurting people. The worst that can be said is that I ruined the mood—and even that isn’t true. If anything, I brought some unexpected excitement.

            When he first stepped into the apartment and was introduced to Corina, he put some added effort into answering the question about what it is he does in Fort Wayne, setting his current job within the context of his aspirations, perhaps making it seem more exciting than it really is, as if it were just a way station along the path to his career as a novelist. Speaking of your occupation as a vocation, telling a story about how you came to do what you’re doing and how it will lead to you doing something even more extraordinary—it was something he’d ruminated on as he made the trip westward. All the strangers making small talk on the planes and in the airports, and that question, “What do you do?”, so routine. One or two words couldn’t suffice as an answer. The two words may as well be, “Dismiss me.” Telling a story, though, well, everyone appreciates a good story. You may forget a mere accountant, or programmer, or copywriter, but everyone loves a protagonist.

Maybe, he thinks now, you can do something similar when it comes to your beliefs and your way of thinking and debating and refusing to shy away from disagreements. Present it in the context of a story—how you came to think the way you do—with a beginning, middle, and end.

He imagines himself on a first date saying, I always talk about the importance of science, and I feel it’s often necessary to challenge people on beliefs that they’ve invested a lot of emotion in. So a lot of people assume I’m heartless or domineering—that I get off on proving how smart I am. The truth is I’m so sensitive and so sentimental that half the time I’m disgusted with myself for being so pathetic. If I’m calculating, it would be more like the calculations of someone with second degree burns lowering himself into a tub of ice water. That sensitivity, though, that receptiveness, it’s what makes me so attentive and engaged. I go into these trances when I read or even when I’m watching movies or shows. People are impressed with how well I remember plots and lines from stories. People in school used to ask how I did it. There wasn’t any trick. I sure as hell didn’t apply any formula. I just took the stories seriously—I couldn’t help but take them seriously. The characters came across as real people, and I cared about them. The plots—I knew they weren’t real of course—but in those moments when you’re really into it, they’re real in their own way. It’s like it doesn’t matter if they’re real or not. And that connection I have to novels and shows, you know, it’s like in school they try to tell you that’s not how you should read and they tell you all this bullshit about how you’re supposed to analyze them or deconstruct them. All I can say is when I realized how utterly fucking stupid all that literary theory crap is—it was like this huge epiphany. I felt so liberated. And today a lot of the arguments I get into are with people who want to take this or that writer to task for some supposed sexism or racism, or for not toeing the line of some brain dead theory.

            But the other part of it is that I still remember being sixteen and realizing that the Catholicism I was brought up with was the purest nonsense, nothing but a set of traditions clung to out of existential desperation and unthinking habit. What made that so horrible for me, again, was that up till then I had taken it so seriously. I wasn’t a bible thumper or anything. But I prayed every night. I really believed. So when it came crashing down—well, I can’t describe how betrayed I felt. I remember wondering why no one tried harder to get at the truth before they all conspired to foist this idiocy on so many children. The next big disillusionment came when my friends and I started watching the Ultimate Fighting Championships. At the time, I’d been taking tae kwon do and karate for over four years in a couple of those strip mall dojos MMA guys talk about with such disdain these days. Watching those fights, guys actually going in there and trying to beat the shit out of each other, I saw—even though I admit it took me a while to accept it—that most of the stuff I’d been learning so assiduously all those years was next to worthless. It was like, oh well, at least you learned some discipline and stayed in shape. Yeah, but I could have been learning muay thai or jujitsu. The thing is, when you go in for this type of nonsense, it’s not just your thing. It affects other people. Religious people teach religion to their kids. They proselytize to anyone who’ll listen. Those charlatan karate teachers, they take people’s money. They give them a false sense of control—not to mention wasting their fucking time.

            Then there were the two years in college when I had my heart set on being a clinical psychologist. At the time, everyone just knew childhood trauma was at the root of almost all mental illness. I listened to that Loveline show on the radio where Dr. Drew and Adam Carolla interrogated the women who called in until they broke down and admitted they’d been abused as children. Then, the summer before my senior year, I start digging into the actual science. Turns out the sexual abuse everyone is so sure fucks kids up for life—its effects can’t even be distinguished from those of physical abuse or neglect. There’s almost no evidence that those childhood traumas lead to psychological issues later in life. Repressed memories? Total bullshit. If you think you underwent some process of recovering long-forgotten memories of sexual abuse you suffered as a child, what you were really doing was going through a ritual induction into a bizarre, man-hating, life-ruining cult. And I was graduating from college at just about the time in the late 90s when all the false accusations and wrongful imprisonments were coming to light. Oh, and did I mention that the first girl I fell in love with—the woman up ahead of me on the trail right now—I couldn’t touch her for years because I was so worried about the harm it might cause, that lost time of my late teens and into my twenties. My mind so full to capacity with all that ridiculous feminist pseudo-psychology. Instead of coming on to her, I waited for her to initiate, and she wondered what the hell was wrong with me since all the while I kept insisting I didn’t want to be just friends. Oh, the awakenings I had in store, rude and otherwise, when it came to women and desire.

            Then there was a brief flirtation with new age ideas after I took a course on Religion and Culture where the teacher assigned the fucking Celestine Prophecy as a textbook—and she treated all the claims as if they were real. Luckily, Carl Sagan’s Demon-Haunted World was recommended by one of my anthropology teachers, so I read it soon afterward. I wanted to leave a copy in the Religion and Culture teacher’s mailbox. Sagan’s book was what showed me, not what ideas I should believe in, but how I could go about finding out which ideas were most likely to be true. That book changed my perspective on society and conventional wisdom in general. I think most people assume if a bunch of teachers tell you something and if enough people believe it there must be something to it. But some ideas, most ideas, sometimes I think nearly all ideas but a precious few are just plain wrong no matter who or how many people believe them. What I couldn’t have known then is how much that kind of thinking would alienate me.

            He stopped, having arrived at the top of a large rise which dropped off precipitously before him. He chuckled, thinking, yeah, maybe you better not say all that on a first date. Looking across to another, somewhat lower rise, he saw Stacy leveling her iPhone for a picture. Two tall and slender women, attired in form-fitting apparel like Stacy’s, were likewise taking turns getting pictures of each other on the various peaks and mounds. He watched the girls, turned back to the jagged mountains on the distant horizon, like the spine of some scarcely corporeal monster cresting the surface of a sand-crusted sea, the air separating the countless miles as perceptibly invisible as the freshly polished crystal of a priceless timepiece, and he thought about how hospitable all the world has become to us humans. Without the cars and the roads and the ready stores of water, this desert would appear so different to them. In all likelihood, they would even lose something of their humanity, becoming vicious to adapt to the precariousness and harsh brutality of a less trustworthy denizenry.

            He imagined roving bands of Native Americans, then government-sponsored cavalries, cowboys, thieves, marauders, so many varieties of deadly men, barely human. We’ve had to build up, on such a flimsy foundation, a space for men to be more civilized, more peaceful, less desperate. And somehow what we—or our forebears, also mostly men—created has succeeded to such a stupid degree, and we take it so thoroughly for granted, that whole schools of thought have grown up to lament the evils of civilization. The truth is, before civilization came, as civilization was still busy coming, there was probably enough bloodshed in this region that the pink of iron glowing in these seasonally layered bands under our feet may have leached into the rock after leaking from the endless variety of wounds sustained by human flesh.

            Looking over the rim of the bulging rock, he saw it rolling away severely to reveal a drop of several hundred feet. And he was surprised to realize, for the third time in two days, the instinctual anchor preventing him from taking a step, and another, toward that abrupt curving back and away of the rough surface, it had either vanished or simply never existed. He felt his weight pulling against the spongy grip of the soles of his shoes while his breaths continued slow and his heart beat softly on. It wasn’t until turning back to check once again if he could see Steven from this elevated vantage that he felt an impulse to back away from the downward curve. If he was to let himself fall, he’d only do so eyes forward.

            Self-conscious now, he glanced about for Stacy and the handful of other people milling about the outcroppings, nestled pockets, and rolling protuberances of ancient rock. What are we all looking for in these travels and treks to otherworldly places? We’re looking for inspiration, calling forth moments, invoking the powers of transcendence, pushing ourselves forward into what we hope are those periods of our lives when it seems like we’re finally becoming who in our dreamlife hearts we always believed we would be. Set the world alight with enchantment and endless possibility. And we’ve all had those experiences where we’ve met someone, or undertaken some project, or set off to some faraway destination—and all our lives seemed in flux and we were moving at last toward that state of being when we could relax, be ourselves, but work and strive meaningfully at the same time. Usually, though, we miss them. The quake’s upheaval threatens more than it promises. We can’t appreciate these periods in our lives while we’re living them because we’re still caught up in the time and the transformation that occurred previous to this one. Attachments are like habits that way. You could wait till the end of time and they’d never extinguish on their own. Your only hope is to replace the old ones with new ones. But since no love you have can ever match the poignancy of the loves you’ve lost, no civilization lives up to the golden accomplishments of the one that’s vanished, you live looking back, boats against the current and all that. Or, knowing all this, you wait. And you look out. And you wonder all the while if there’s something more you should be doing to bring about that next period of becoming who you are, worrying that you may have already used up all the ones you had coming.

            He walked down the rounded surface, on the side he’d come up, feeling purged, emptied of some burden of long-accustomed ache, as if it had drained from his blood into the banded stone beneath his soft-soled shoes. Drops in time, echoes like living breathing beings, the absent people in our minds. Exes, old friends, Darwin, roving bands of savage men—they have life, existence independent now from the bodies housing their own autonomous searchings and wanderings. Their echoes forever pull and impact us, scour our flesh and turn us inside out—flaring with the red heat of rage and longing and protectiveness and abandonment and loss. These emotions they call forth with their spectral gestures, their faces, their words, they never cease, even with physical absence. Each prod, each tug, each blow gets recorded and replayed forever, the dynamic of our interactions carrying on even when we’re alone, drowning out all the other beckonings at the doorstep of our hearts.

            “Where the hell is Steven?”

            He looks up from where he’d been searching for footholds to see Stacy startlingly close to where he’d finally landed two-footedly in the sand. “I don’t think he’s coming.” He felt snot wetting his mustache, the unaccountable allergic outflow that had been plaguing them all for the past two days. “He’s really missing out.”

Also read: 

THE TREE CLIMBER: A STORY INSPIRED BY W.S. MERWIN

And 

DERAILED: A REFLECTION ON DATING, DOGS, AND FAMILY


Gone Girl and the Relationship Game: The Charms of Gillian Flynn's Amazingly Seductive Anti-Heroine

Gillian Flynn’s “Gone Girl” is a brilliantly witty and trenchant exploration of how identities shift and transform, both within relationships and as part of an ongoing struggle for each of us to master the plot of our own narratives.

The simple phrase “that guy,” as in the delightfully manipulative call for a man to check himself, “You don’t want to be that guy,” underscores something remarkable about what’s billed as our modern age of self-invention. No matter how hard we try, we’re nearly always on the verge of falling into some recognizable category of people—always in peril of becoming a cliché. Even in a mature society characterized by competitive originality, the corralling of what could be biographical chaos into a finite assemblage of themes, if not entire stories, seems as inescapable as ever. Which isn’t to say that true originality is nowhere to be found—outside of fiction anyway—but that resisting the pull of convention, or even (God forbid) tradition, demands sustained effort. And like any other endeavor requiring disciplined exertion, you need a ready store of motivation to draw on if you’re determined not to be that guy, or that couple, or one of those women.

            When Nick Dunne, a former writer of pop culture reviews for a men’s magazine in Gillian Flynn’s slickly conceived, subtly character-driven, and cleverly satirical novel Gone Girl, looks over the bar he’s been reduced to tending at the young beauty who’s taken a shine to him in his capacity as part-time journalism professor, the familiarity of his dilemma—the familiarity even of the jokes about his dilemma—makes for a type of bitingly bittersweet comedy that runs throughout the first half of the story. “Now surprise me with a drink,” Andie, his enamored student, enjoins him.

She leaned forward so her cleavage was leveraged against the bar, her breasts pushed upward. She wore a pendant on a thin gold chain; the pendant slid between her breasts down under her sweater. Don’t be that guy, I thought. The guy who pants over where the pendant ends. (261)

Nick, every reader knows, is not thinking about Andie’s heart. For many a man in this situation, all the condemnation infused into those two words abruptly transforms into an awareness of his own cheap sanctimony as he realizes about the only thing separating any given guy from becoming that guy is the wrong set of circumstances. As Nick recounts,

You ask yourself, Why? I’d been faithful to Amy always. I was the guy who left the bar early if a woman was getting too flirty, if her touch was feeling too nice. I was not a cheater. I don’t (didn’t?) like cheaters: dishonest, disrespectful, petty, spoiled. I had never succumbed. But that was back when I was happy. I hate to think the answer is that easy, but I had been happy all my life, and now I was not, and Andie was there, lingering after class, asking me questions about myself that Amy never had, not lately. Making me feel like a worthwhile man, not the idiot who lost his job, the dope who forgot to put the toilet seat down, the blunderer who just could never quite get it right, whatever it was. (257)

More than most of us are comfortable acknowledging, the roles we play, or rather the ones we fall into, in our relationships are decided on collaboratively—if there’s ever any real deciding involved at all. Nick turns into that guy because the role, the identity, imposed on him by his wife Amy makes him miserable, while the one he gets to assume with Andie is a more complementary fit with what he feels is his true nature. As he muses while relishing the way Andie makes him feel, “Love makes you want to be a better man—right, right. But maybe love, real love, also gives you permission to just be the man you are” (264).

            Underlying and eventually overtaking the mystery plot set in motion by Amy’s disappearance in the opening pages of Gone Girl is an entertainingly overblown dramatization of the struggle nearly all modern couples go through as they negotiate the contours of their respective roles in the relationship. The questions of what we’ll allow and what we’ll demand of our partners seem ever-so-easy to answer while we’re still unattached; what we altogether fail to anticipate are the questions about who our partners will become and how much of ourselves they’ll allow us the space to actualize. Nick fell in love with Amy because of how much he enjoyed being around her, but, in keeping with so many clichés about married life, that would soon change. Early in the novel, it’s difficult to discern just how hyperbolic Nick is being when he describes the metamorphosis. “The Amy of today,” he tells us, “was abrasive enough to want to hurt, sometimes.” He goes on to explain,

I speak specifically of the Amy of today, who was only remotely like the woman I fell in love with. It had been an awful fairytale reverse transformation. Over just a few years, the old Amy, the girl of the big laugh and the easy ways, literally shed herself, a pile of skin and soul on the floor, and out stepped this brittle, bitter Amy. My wife was no longer my wife but a razor-wire knot daring me to unloop her, and I was not up to the job with my thick, numb, nervous fingers. Country fingers. Flyover fingers untrained in the intricate, dangerous work of solving Amy. When I’d hold up the bloody stumps, she’d sigh and turn to her secret mental notebook on which she tallied all my deficiencies, forever noting disappointments, frailties, shortcomings. My old Amy, damn, she was fun. She was funny. She made me laugh. I’d forgotten that. And she laughed. From the bottom of her throat, from right behind that small finger-shaped hollow, which is the best place to laugh from. She released her grievances like handfuls of birdseed: They are there, and they are gone. (89)

In this telling, Amy has gone from being lighthearted and fun to suffocatingly critical and humorless in the span of a few years. This sounds suspiciously like something that guy, the one who cheats on his wife, would say, rationalizing his betrayal by implying she was the one who unilaterally revised the terms of their arrangement. Still, many married men, whether they’ve personally strayed or not, probably find Nick’s description of his plight uncomfortably familiar.

            Amy’s own first-person account of their time together serves as a counterpoint to Nick’s narrative about her disappearance throughout the first half of the novel, and even as you’re on the lookout for clues as to whose descriptions of the other are the more reliable, you get the sense that, alongside whatever explicit lies or omissions are indicated by the contradictions and gaps in their respective versions, a pitched battle is taking place between them for the privilege of defining not just each other but themselves as well. At one point, Nick admits that Amy hadn’t always been as easy to be with as he suggested earlier; in fact, he’d fallen in love with her precisely because in trying to keep up with her exacting and boundlessly energetic mind he felt that he became “the ultimate Nick.” He says,

Amy made me believe I was exceptional, that I was up to her level of play. That was both our making and our undoing. Because I couldn’t handle the demands of greatness. I began craving ease and averageness, and I hated myself for it. I turned her into the brittle, prickly thing she became. I had pretended to be one kind of man and revealed myself to be quite another. Worse, I convinced myself our tragedy was entirely her making. I spent years working myself into the very thing I swore she was: a righteous ball of hate. (371-2)

At this point in the novel, Amy’s version of their story seems the more plausible by far. Nick is confessing to mischaracterizing her change, taking some responsibility for it. Now, he seems to accept that he was the one who didn’t live up to the terms of their original agreement.

            But Nick’s understanding of the narrative at this point isn’t as settled as his mea culpa implies. What becomes clear from all the mulling and vacillating is that arriving at a definitive account of who provoked whom, who changed first, or who is ultimately to blame is all but impossible. Was Amy increasingly difficult to please? Or did Nick run out of steam? Just a few pages prior to admitting he’d been the first to undergo a deal-breaking transformation, Nick was expressing his disgust at what his futile efforts to make Amy happy had reduced him to:

For two years I tried as my old wife slipped away, and I tried so hard—no anger, no arguments, the constant kowtowing, the capitulation, the sitcom-husband version of me: Yes, dear. Of course, sweetheart. The fucking energy leached from my body as my frantic-rabbit thoughts tried to figure out how to make her happy, and each action, each attempt, was met with a rolled eye or a sad little sigh. A you just don’t get it sigh.

            By the time we left for Missouri, I was just pissed. I was ashamed of the memory of me—the scuttling, scraping, hunchbacked toadie of a man I’d turned into. So I wasn’t romantic; I wasn’t even nice. (366)

The couple’s move from New York to Nick’s hometown in Missouri to help his sister care for their ailing mother is another point of contention between them. Nick had been laid off from the magazine he’d worked for. Amy had lost her job writing personality quizzes for a women’s magazine, and she’d also lost most of the trust fund her parents had set up for her when those same parents came asking for a loan to help bail them out after some of their investments went bad. So the conditions of the job market and the economy toward the end of the aughts were cranking up the pressure on both Nick and Amy as they strived to maintain some sense of themselves as exceptional human beings. But this added burden fell on each of them equally, so it doesn’t give us much help figuring out who was the first to succumb to bitterness.

Nonetheless, throughout the first half of Gone Girl Flynn works to gradually intensify our suspicion of Nick, making us wonder if he could possibly have flown into a blind rage and killed his wife before somehow dissociating himself from the memory of the crime. He gives the impression, for instance, that he’s trying to keep his affair with Andie secret from us, his readers and confessors, until he’s forced to come clean. Amy at one point reveals she was frightened enough of him to attempt to buy a gun. Nick also appears to have made a bunch of high-cost purchases with credit cards he denies ever having signed up for. Amy even bought extra life insurance a month or so before her disappearance—perhaps in response to some mysterious prompting from her husband. And there’s something weird about Nick’s relationship with his misogynist father, whose death from Alzheimer’s he’s eagerly awaiting. The second half of Gone Girl could have gone on to explore Nick’s psychosis and chronicle his efforts at escaping detection and capture. In that case, Flynn herself would have been surrendering to the pull of convention. But where she ends up going with her novel makes for a story that’s much more original—and, perhaps paradoxically, much more reflective of our modern societal values.

In one sense, the verdict about which character is truly to blame for the breakdown of the marriage is arbitrary. Marriages come apart at the seams because one or both partners no longer feel like they can be themselves all the time—affixing blame is little more than a postmortem exercise in recriminatory self-exculpation. If Gone Girl had been written as a literary instead of a genre work, it probably would have focused on the difficulty and ultimate pointlessness of figuring out whose subjective experiences were closest to reality, since our subjectivity is all we have to go on and our messy lives simply don’t lend themselves to clean narratives. But Flynn instead furnishes her thriller with an unmistakable and fantastically impressive villain, one whose aims and impulses so perfectly caricature the nastiest of our own that we can’t help elevating her to the ranks of our most beloved anti-heroes like Walter White and Frank Underwood (making me a little disappointed that David Fincher is making a movie out of the book instead of a cable or Netflix series).

Amy, we learn early in the novel, is not simply Amy but rather Amazing Amy, the inspiration for a series of wildly popular children’s books written by her parents, both of whom are child psychologists. The books are in fact the source of the money in the trust fund they had set up for her, and it is the lackluster sales of recent editions in the series, along with the spendthrift lifestyle they’d grown accustomed to, that drives them to ask for most of that money back. In the early chapters, Amy writes about how Amazing Amy often serves as a subtle rebuke from her parents, making good decisions in place of her bad ones, doing perfectly what she does ineptly. But later on we find out that the real Amy has nothing but contempt for her parents’ notions of what might constitute the ideal daughter. Far from worrying that she may not be living up to the standard set by her fictional counterpart, Amy feels the title of Amazing is part of her natural due. Indeed, when she first discovers Nick is cheating with Andie—a discovery she makes even before they sleep together the first time—what makes her the angriest about it is how mediocre it makes her seem. “I had a new persona,” she says, “not of my choosing. I was Average Dumb Woman Married to Average Shitty Man. He had single-handedly de-amazed Amazing Amy” (401).  

Normally, Amy does choose which persona she wants to take on; that is in fact the crucial power that makes her worthy of her superhero sobriquet. Most of us no longer want to read stories about women who become the tragic victims of their complicatedly sympathetic but monstrously damaged husbands. With Amy, Flynn turns that convention inside-out. While real life seldom offers even remotely satisfying resolutions to the rival PR campaigns at the heart of so many dissolving marriages, Amy confesses—or rather boasts—to us, her readers and fans, that she was the one who had been acting like someone else at the beginning of the relationship.

Nick loved me. A six-O kind of love: He looooooved me. But he didn’t love me, me. Nick loved a girl who doesn’t exist. I was pretending, the way I often did, pretending to have a personality. I can’t help it, it’s what I’ve always done: The way some women change fashion regularly, I change personalities. What persona feels good, what’s coveted, what’s au courant? I think most people do this, they just don’t admit it, or else they settle on one persona because they’re too lazy or stupid to pull off a switch. (382)

This ability of Amy’s to take on whatever role she deems suitable for her at the moment is complemented by her adeptness at using people’s—her victims’—stereotype-based expectations of women against them. Taken together with her capacity for harboring long-term grudges in response to even the most seemingly insignificant of slights, these powers make Amazing Amy a heroic paragon of postmodern feminism. The catch is that to pull off her grand deception of Nick, the police, and the nattering public, she has to be a complete psychopath.

Amy assumes the first of the personas we encounter, Diary Amy, through writing, and the story she tells, equal parts fiction and lying, takes us in so effectively because it reprises some of our culture’s most common themes. Beyond the diary, the overarching story of Gone Girl so perfectly subverts the conventional abuse narrative that it’s hard not to admire Amy for refusing to be a character in it. Even after she’s confessed to framing Nick for her murder, the culmination of a plan so deviously vindictive her insanity is beyond question, it’s hard not to root for her when she pits herself against Desi, a former classmate she dated while attending a posh prep school called Wickshire Academy. Desi serves as the ideal anti-villain for our amazing anti-heroine. Amy writes,

It’s good to have at least one man you can use for anything. Desi is a white-knight type. He loves troubled women. Over the years, after Wickshire, when we’d talk, I’d ask after his latest girlfriend, and no matter the girl, he would always say: “Oh, she’s not doing very well, unfortunately.” But I know it is fortunate for Desi—the eating disorders, the painkiller addictions, the crippling depressions. He’s never happier than when he’s at a bedside. Not in bed, just perched nearby with broth and juice and a gently starched voice. Poor darling. (551-2)

Though an eagerness to save troubled women may not seem so damning at first blush, we soon learn that Desi’s ministrations are motivated more by the expectation of gratitude and the relinquishing of control than by any genuine proclivity toward altruism. After Amy shows up claiming she’s hiding from Nick, she quickly becomes a prisoner in Desi’s house. And the way he treats her, so solicitous, so patronizing, so creepy—you almost can’t wait for Amy to decide he’s outlived his usefulness.

The struggle between Nick and Amy takes place against the backdrop of a society obsessed with celebrity and scandal. One of the things Nick is forced to learn, not to compete with Amy—which he’d never be able to do—but to merely survive, is to make himself sympathetic to people whose only contact with him is through the media. What Flynn conveys in depicting his efforts is the absurdity of trial by media. A daytime talk show host named Ellen Abbott—an obvious sendup of Nancy Grace—stirs her audience into a rage against Nick because he makes the mistake of forcing a smile in front of a camera. One of the effects of our media obsession is that we ceaselessly compare ourselves with characters in movies and TV. Nick finds himself at multiple points enacting scenes from cop shows, trying to act the familiar role of the innocent suspect. But he knows all the while that the story everyone is most familiar with, whether from fictional shows or those billed as nonfiction, is of the husband who tries to get away with murdering his wife.

  Being awash in stories featuring celebrities and, increasingly, real people who are supposed to be just like us wouldn’t have such a virulent impact if we weren’t driven to compete with all of them. But, whenever someone says you don’t want to be that guy, what they’re really saying is that you should be better than that guy. Most of the toggling between identities Amy does is for the purpose of outshining any would-be rivals. In one oddly revealing passage, she even claims that focusing on outcompeting other couples made her happier than trying to win her individual struggle against Nick:

I thought we would be the most perfect union: the happiest couple around. Not that love is a competition. But I don’t understand the point of being together if you’re not the happiest.

            I was probably happier for those few years—pretending to be someone else—than I ever have been before or after. I can’t decide what that means. (386-7)

The problem was that neither could maintain this focus on being the happiest couple because each found themselves competing against the other. It’s all well and good to look at how well your relationship works—how happy you both are in it—until some women you know start suggesting their husbands are more obedient than yours.

            The challenge that was unique to Amy—that wasn’t really at all unique to Amy—entailed transitioning from the persona she donned to attract Nick and persuade him to marry her to a persona that would allow her to be comfortably dominant in the marriage. The persona women use to first land their man Amy refers to as the Cool Girl.

Being the Cool Girl means I am a hot, brilliant, funny woman who adores football, poker, dirty jokes, and burping, who plays video games, drinks cheap beer, loves threesomes and anal sex, and jams hot dogs and hamburgers into her mouth like she’s hosting the world’s biggest culinary gang bang while somehow still maintaining a size 2, because Cool Girls are above all hot. Hot and understanding. Cool Girls never get angry; they only smile in a chagrined, loving manner and let their men do whatever they want. Go ahead, shit on me, I don’t mind, I’m the Cool Girl. (383)

Amy is convinced Andie too is only pretending to be Cool Girl—because the Cool Girl can’t possibly exist. And the ire she directs at Nick arises out of her disappointment that he could believe the role she’d taken on was genuine.

I hated Nick for being surprised when I became me. I hated him for not knowing it had to end, for truly believing he had married this creature, this figment of the imagination of a million masturbatory men, semen-fingered and self-satisfied. (387)

Her reason for being contemptuous of women who keep trying to be the Cool Girl, the same reason she can’t bring herself to continue playing the role, is even more revealing. “If you let a man cancel plans or decline to do things for you,” she insists, “you lose.” She goes on to explain,

You don’t get what you want. It’s pretty clear. Sure, he may be happy, he may say you’re the coolest girl ever, but he’s saying it because he got his way. He’s calling you a Cool Girl to fool you! That’s what men do: They try to make it sound like you are the Cool Girl so you will bow to their wishes. Like a car salesman saying, How much do you want to pay for this beauty? when you didn’t agree to buy it yet. That awful phrase men use: “I mean, I know you wouldn’t mind if I…” Yes, I do mind. Just say it. Don’t lose, you dumb little twat. (387-8)

So Amy doesn’t want to be the one who loses any more than Nick wants to be that “sitcom-husband version” of himself, the toadie, the blunderer. And all either of them manages to accomplish by resisting is to make the other miserable.

            Gen-Xers like Nick and Amy were taught to dream big, to grow up and change the world, to put their minds to it and become whatever they most want to be. Then they grew up and realized they had to find a way to live with each other and make some sort of living—even though whatever spouse, whatever job, whatever life they settled for inevitably became a symbol of the great cosmic joke that had been played on them. From thinking you’d be a hero and change the world to being cheated on by the husband who should have known he wasn’t good enough for you in the first place—it’s quite a distance to fall. And it’s easy to imagine how much more you could accomplish without the dead weight of a broken heart or the burden of a guilty conscience. All you have driving you on is your rage, even the worst of which flares up only for a few days or weeks at most before exhausting itself. Then you return to being the sad, wounded, abandoned, betrayed little critter you are. Not Amy, though. Amy was the butt of the same joke as the rest of us, though in her case it was even more sadistic. She was made to settle for a lesser life than she’d been encouraged to dream of just like the rest of us. She was betrayed just like the rest of us. But she suffers no pangs of guilt, no aching of a broken heart. And Amy’s rage is inexhaustible. She really can be whoever she wants—and she’s already busy becoming more than an amazing idea.

Also read:
HOW VIOLENT FICTION WORKS: ROHAN WILSON’S “THE ROVING PARTY” AND JAMES WOOD’S SANGUINARY SUBLIME FROM CONRAD TO MCCARTHY

And:

WHAT MAKES "WOLF HALL" SO GREAT?

Read More
Dennis Junk Dennis Junk

Science’s Difference Problem: Nicholas Wade’s Troublesome Inheritance and the Missing Moral Framework for Discussing the Biology of Behavior

Nicholas Wade went there. In his book “A Troublesome Inheritance,” he argues that not only is race a real, biological phenomenon, but one that has potentially important implications for our understanding of the fates of different peoples. Is it possible to even discuss such things without being justifiably labeled a racist? More importantly, if biological differences do show up in the research, how can we discuss them without being grossly immoral?

            No sooner had Nicholas Wade’s new book become available for free two-day shipping than a contest began to see who could pen the most devastating critical review of it, the one that best satisfies our desperate urge to dismiss Wade’s arguments and reinforce our faith in the futility of studying biological differences between human races, a faith backed up by a cherished official consensus ever so conveniently in line with our moral convictions. That Charles Murray, one of the authors of the evil tome The Bell Curve, wrote an early highly favorable review for the Wall Street Journal only upped the stakes for all would-be champions of liberal science. Even as the victor awaits crowning, many scholars are posting links to their favorite contender’s critiques all over social media to advertise their principled rejection of this book they either haven’t read yet or have no intention of ever reading.

You don’t have to go beyond the title, A Troublesome Inheritance: Genes, Race and Human History, to understand what all these conscientious intellectuals are so eager to distance themselves from—and so eager to condemn. History has undeniably treated some races much more poorly than others, so if their fates are in any small way influenced by genes the implication of inferiority is unavoidable. Regardless of what he actually says in the book, Wade’s very program strikes many as racist from its inception.

            The going theories for the dawn of the European Enlightenment and the rise of Western culture—and western people—to global ascendency attribute the phenomenon to a combination of geographic advantages and historical happenstance. Wade, along with many other scholars, finds such explanations unsatisfying. Geography can explain why some societies never reached sufficient population densities to make the transition into states. “Much harder to understand,” Wade writes, “is how Europe and Asia, lying on much the same lines of latitude, were driven in the different directions that led to the West’s dominance” (223). Wade’s theory incorporates elements of geography—like the relatively uniform expanse of undivided territory between the Yangtze and Yellow rivers that facilitated the establishment of autocratic rule, and the diversity of fragmented regions in Europe preventing such consolidation—but he goes on to suggest that these different environments would have led to the development of different types of institutions. Individuals more disposed toward behaviors favored by these institutions, Wade speculates, would be rewarded with greater wealth, which would in turn allow them to have more children with behavioral dispositions similar to their own.

            After hundreds of years and multiple generations, Wade argues, the populations of diverse regions would respond to these diverse institutions by evolving subtly different temperaments. In China, for instance, favorable, and hence selected-for, traits may have included intelligence, conformity, and obedience. These behavioral propensities would subsequently play a role in determining the future direction of the institutions that fostered their evolution. Average differences in personality would, according to Wade, also make it more or less likely that certain new types of institution would arise within a given society, or that they could be successfully transplanted into it. And it’s a society’s institutions that ultimately determine its fate relative to other societies. To the objection that geography can, at least in principle, explain the vastly different historical outcomes among peoples of specific regions, Wade responds, “Geographic determinism, however, is as absurd a position as genetic determinism, given that evolution is about the interaction between the two” (222).

            East Asians score higher on average on IQ tests than people with European ancestry, but there’s no evidence that any advantage they enjoy in intelligence, or any proclivity they may display toward obedience and conformity—traits supposedly manifest in their long history of autocratic governance—is attributable to genetic differences as opposed to traditional attitudes toward schoolwork, authority, and group membership inculcated through common socialization practices. So we can rest assured that Wade’s just-so story about evolved differences between the races in social behavior is eminently dismissible. Wade himself at several points throughout A Troublesome Inheritance admits that his case is wholly speculative. So why, given the abominable history of racist abuses of evolutionary science, would Wade publish such a book?

It’s not because he’s unaware of the past abuses. Indeed, in his second chapter, titled “Perversions of Science,” which none of the critical reviewers deigns to mention, Wade chronicles the rise of eugenics and its culmination in the Holocaust. He concludes,

After the Second World War, scientists resolved for the best of reasons that genetics research would never again be allowed to fuel the racial fantasies of murderous despots. Now that new information about human races has been developed, the lessons of the past should not be forgotten and indeed are all the more relevant. (38)

The convention among Wade’s critics is to divide his book into two parts, acknowledge that the first is accurate and compelling enough, and then unload the full academic arsenal of both scientific and moral objections to the second. This approach necessarily scants a few important links in his chain of reasoning in an effort to reduce his overall point to its most objectionable elements. And for all their moralizing, the critics, almost to a one, fail to consider Wade’s expressed motivation for taking on such a fraught issue.

            Even acknowledging Wade’s case is weak for the role of biological evolution in historical developments like the Industrial Revolution, we may still examine his reasoning up to that point in the book, which may strike many as more firmly grounded. You can also start to get a sense of what was motivating Wade when you realize that the first half of A Troublesome Inheritance recapitulates his two previous books on human evolution. The first, Before the Dawn, chronicled the evolution and history of our ancestors from a species that resembled a chimpanzee through millennia as tribal hunter-gatherers to the first permanent settlements and the emergence of agriculture. Thus, we see that all along his scholarly interest has been focused on major transitions in human prehistory.

While critics of Wade’s latest book focus almost exclusively on his attempts at connecting genomics to geopolitical history, he begins his exploration of differences between human populations by emphasizing the critical differences between humans and chimpanzees, which we can all agree came about through biological evolution. Citing a number of studies comparing human infants to chimps, Wade writes in A Troublesome Inheritance,

Besides shared intentions, another striking social behavior is that of following norms, or rules generally agreed on within the “we” group. Allied with the rule following are two other basic principles of human social behavior. One is a tendency to criticize, and if necessary punish, those who do not follow the agreed-upon norms. Another is to bolster one’s own reputation, presenting oneself as an unselfish and valuable follower of the group’s norms, an exercise that may involve finding fault with others. (49)

What separates us from chimpanzees and other apes—including our ancestors—is our much greater sociality and our much greater capacity for cooperation. (Though primatologist Frans de Waal would object to leaving the much more altruistic bonobos out of the story.) The basis for these changes was the evolution of a suite of social emotions—emotions that predispose us toward certain types of social behaviors, like punishing those who fail to adhere to group norms (keeping mum about genes and race for instance). If there’s any doubt that the human readiness to punish wrongdoers and rule violators is instinctual, ongoing studies demonstrating this trait in children too young to speak make the claim that the behavior must be taught ever more untenable. The conclusion most psychologists derive from such studies is that, for all their myriad manifestations in various contexts and diverse cultures, the social emotions of humans emerge from a biological substrate common to us all.  

            After Before the Dawn, Wade came out with The Faith Instinct, which explores theories developed by biologist David Sloan Wilson and evolutionary psychologist Jesse Bering about the adaptive role of religion in human societies. In light of cooperation’s status as one of the most essential behavioral differences between humans and chimps, other behaviors that facilitate or regulate coordinated activity suggest themselves as candidates for having pushed our ancestors along the path toward several key transitions. Language for instance must have been an important development. Religion may have been another. As Wade argues in A Troublesome Inheritance

The fact that every known society has a religion suggests that each inherited a propensity for religion from the ancestral human population. The alternative explanation, that each society independently invented and maintained this distinctive human behavior, seems less likely. The propensity for religion seems instinctual, rather than purely cultural, because it is so deeply set in the human mind, touching emotional centers and appearing with such spontaneity. There is a strong evolutionary reason, moreover, that explains why religion may have become wired in the neural circuitry. A major function of religion is to provide social cohesion, a matter of particular importance among early societies. If the more cohesive societies regularly prevailed over the less cohesive, as would be likely in any military dispute, an instinct for religious behavior would have been strongly favored by natural selection. This would explain why the religious instinct is universal. But the particular form that religion takes in each society depends on culture, just as with language. (125-6)

As is evident in this passage, Wade never suggests any one-to-one correspondence between genes and behaviors. Genes function in the context of other genes in the context of individual bodies in the context of several other individual bodies. But natural selection is only about outcomes with regard to survival and reproduction. The evolution of social behavior must thus be understood as taking place through the competition, not just of individuals, but also of institutions we normally think of as purely cultural.

            The evolutionary sequence Wade envisions begins with increasing sociability enforced by a tendency to punish individuals who fail to cooperate, and moves on to tribal religions which involve synchronized behaviors, unifying beliefs, and omnipresent but invisible witnesses who discourage would-be rule violators. Once humans began living in more cohesive groups, behaviors that influenced the overall functioning of those groups became the targets of selection. Religion may have been among the first institutions that emerged to foster cohesion, but others relying on the same substrate of instincts and emotions would follow. Tracing the trajectory of our prehistory from the origin of our species in Africa, to the peopling of the world’s continents, to the first permanent settlements and the adoption of agriculture, Wade writes,

The common theme of all these developments is that when circumstances change, when a new resource can be exploited or a new enemy appears on the border, a society will change its institutions in response. Thus it’s easy to see the dynamics of how human social change takes place and why such a variety of human social structures exists. As soon as the mode of subsistence changes, a society will develop new institutions to exploit its environment more effectively. The individuals whose social behavior is better attuned to such institutions will prosper and leave more children, and the genetic variations that underlie such a behavior will become more common. (63-4)

First a society responds to shifting pressures culturally, but a new culture amounts to a new environment for individuals to adapt to. Wade understands that much of this adaptation occurs through learning. Some of the challenges posed by an evolving culture will, however, be easier for some individuals to address than others. Evolutionary anthropologists tend to think of culture as a buffer between environments and genes. Many consider it more of a wall. To Wade, though, culture is merely another aspect of the environment individuals and their genes compete to thrive in.

If you’re a cultural anthropologist and you want to study how cultures change over time, the most convenient assumption you can make is that any behavioral differences you observe between societies or over periods of time are owing solely to the forces you’re hoping to isolate. Biological changes would complicate your analysis. If, on the other hand, you’re interested in studying the biological evolution of social behaviors, you will likely be inclined to assume that differences between cultures, if not based completely on genetic variance, at least rest on a substrate of inherited traits. Wade has quite obviously been interested in social evolution since his first book on anthropology, so it’s understandable that he would be excited about genome studies suggesting that human evolution has been operating recently enough to affect humans in distantly separated regions of the globe. And it’s understandable that he’d be frustrated by sanctions against investigating possible behavioral differences tied to these regional genetic differences. But this doesn’t stop his critics from insinuating that his true agenda is something other than solely scientific.

            On the technology and pop culture website io9, blogger and former policy analyst Annalee Newitz calls Wade’s book an “argument for white supremacy,” which goes a half-step farther than the critical review by Eric Johnson the post links to, titled "On the Origin of White Power." Johnson sarcastically states that Wade isn’t a racist and acknowledges that the author is correct in pointing out that considering race as a possible explanatory factor isn’t necessarily racist. But, according to Johnson’s characterization,

He then explains why white people are better because of their genes. In fairness, Wade does not say Caucasians are better per se, merely better adapted (because of their genes) to the modern economic institutions that Western society has created, and which now dominate the world’s economy and culture.

The clear implication here is that Wade’s mission is to prove that the white race is superior but that he also wanted to cloak this agenda in the garb of honest scientific inquiry. Why else would Wade publish his problematic musings? Johnson believes that scientists and journalists should self-censor speculations or as-yet unproven theories that could exacerbate societal injustices. He writes, “False scientific conclusions, often those that justify certain well-entrenched beliefs, can impact peoples’ lives for decades to come, especially when policy decisions are based on their findings.” The question this position raises is how certain we can be that any scientific “conclusion”—Wade would likely characterize it as an exploration—is indeed false before it’s been made public and become the topic of further discussion and research?

Johnson’s is the leading contender for the title of most devastating critique of A Troublesome Inheritance, and he makes several excellent points that severely undermine parts of Wade’s case for natural selection playing a critical role in recent historical developments. But, like H. Allen Orr’s critique in The New York Review, the first runner-up in the contest, Johnson’s essay is oozing with condescension and startlingly unselfconscious sanctimony. These reviewers profess to be standing up for science even as they ply their readers with egregious ad hominem rhetoric (Wade is just a science writer, not a scientist) and arguments from adverse consequences (racist groups are citing Wade’s book in support of their agendas), thereby underscoring another of Wade’s arguments—that the case against racial differences in social behavior is at least as ideological as it is scientific. Might the principle that researchers should go public with politically sensitive ideas or findings only after they’ve reached some threshold of wider acceptance end up stifling free inquiry? And, if Wade’s theories really are as unlikely to bear empirical or conceptual fruit as his critics insist, shouldn’t the scientific case against them be enough? Isn’t all the innuendo and moral condemnation superfluous—maybe even a little suspicious?

            White supremacists may get some comfort from parts of Wade’s book, but if they read from cover to cover they’ll come across plenty of passages to get upset about. In addition to the suggestion that Asians are more intelligent than Caucasians, there’s the matter of the entire eighth chapter, which describes a scenario for how Ashkenazi Jews became even more intelligent than Asians and even more creative and better suited to urban institutions than Caucasians of Northern European ancestry. Wade also points out more than once that the genetic differences between the races are based, not on the presence or absence of single genes, but on clusters of alleles occurring with varying frequencies. He insists that

the significant differences are those between societies, not their individual members. But minor variations in social behavior, though barely perceptible, if at all, in an individual, combine to create societies of very different character. (244)

In other words, none of Wade’s speculations, nor any of the findings he reports, justifies discriminating against any individual because of his or her race. At best, there would only ever be a slightly larger probability that an individual will manifest any trait associated with people of the same ancestry. You’re still much better off reading the details of the résumé. Critics may dismiss as mere lip service Wade’s disclaimers about how “Racism and discrimination are wrong as a matter of principle, not of science” (7), and how the possibility of genetic advantages in certain traits “does not of course mean that Europeans are superior to others—a meaningless term in any case from an evolutionary perspective” (238).  But if Wade is secretly taking delight in the success of one race over another, it’s odd how casually he observes that “the forces of differentiation seem now to have reversed course due to increased migration, travel and intermarriage” (71).

            Wade does of course have to cite some evidence, indirect though it may be, in support of his speculations. First, he covers several genomic studies showing that, contrary to much earlier scholarship, populations of various regions of the globe are genetically distinguishable. Race, in other words, is not merely a social construct, as many have insisted. He then moves on to research suggesting that a significant portion of the human genome reveals evidence of positive selection recently enough to have affected regional populations differently. Joshua Akey’s 2009 review of multiple studies on markers of recent evolution is central to his argument. Wade interprets Akey’s report as suggesting that as much as 14 percent of the human genome shows signs of recent selection. In his review, Orr insists this is a mistake, putting the number at 8 percent.

Steven Pinker, who discusses Akey’s paper in his 2011 book The Better Angels of Our Nature, likewise takes the number to be 8 percent, not 14. But even that lower proportion is significant. Pinker, an evolutionary psychologist, stresses just how revolutionary this finding might be.

Some journalists have uncomprehendingly lauded these results as a refutation of evolutionary psychology and what they see as its politically dangerous implication of a human nature shaped by adaptation to a hunter-gatherer lifestyle. In fact the evidence for recent selection, if it applies to genes with effects on cognition and emotion, would license a far more radical form of evolutionary psychology, one in which minds have been biologically shaped by recent environments in addition to ancient ones. And it could have the incendiary implication that aboriginal and immigrant populations are less biologically adapted to the demands of modern life than populations that have lived in literate societies for millennia. (614)

Contra critics who paint him as a crypto-supremacist, it’s quite clearly that “far more radical form of evolutionary psychology” Wade is excited about. That’s why he’s exasperated by what he sees as Pinker’s refusal to admit that the case for that form is strong enough to warrant pursuing it further owing to fear of its political ramifications. Pinker does consider much of the same evidence as Wade, but where Wade sees only clear support Pinker sees several intractable complications. Indeed, the section of Better Angels where Pinker discusses recent evolution is an important addendum to Wade’s book, and it must be noted Pinker doesn’t rule out the possibility of regional selection for social behaviors. He simply says that “for the time being, we have no need for that hypothesis” (622).

            Wade is also able to point to one already-identified gene whose alleles correspond to varying frequencies of violent behavior. The MAO-A gene comes in high- and low-activity varieties, and the low-activity version is more common among certain ethnic groups, like sub-Saharan Africans and Maoris. But, as Pinker points out, a majority of Chinese men also have the low-activity version of the gene, and they aren’t known for being particularly prone to violence. So the picture isn’t straightforward. Aside from the Ashkenazim, Wade cites another well-documented case in which selection for behavioral traits could have played an important role. In his book A Farewell to Alms, Gregory Clark presents an impressive collection of historical data suggesting that in the lead-up to the Industrial Revolution in England, people with personality traits that would likely have contributed to the rapid change were rewarded with more money, and people with more money had more children. The children of the wealthy quickly overpopulated the ranks of the upper classes, so large numbers of them inevitably descended into lower ranks. After multiple generations, the effect of this “ratchet of wealth” (180), as Wade calls it, would be genes for behaviors like impulse control, patience, and thrift cascading throughout the population, priming it for the emergence of historically unprecedented institutions.

            Wade acknowledges that Clark’s theory awaits direct confirmation through the discovery of actual alleles associated with the behavioral traits he describes. But he points to experiments with artificial selection that suggest the time-scale Clark considers, about 24 generations, would have been sufficient to effect measurable changes. In his critical review, though, Johnson counters that natural selection is much slower than artificial selection, and he shows that Clark’s own numbers demonstrate a rapid attenuation of the effects of selection. Pinker points to other shortcomings in the argument, like the number of cases in which institutions changed and populations exploded in periods too short to have seen any significant change in allele frequencies. Wade isn’t swayed by any of these objections, which he takes on one-by-one, contrary to Orr’s characterization of the disagreement. As of now, the debate is ongoing. It may not be settled conclusively until scientists have a much better understanding of how genes work to influence behavior, which Wade estimates could take decades.

            Pinker is not known for being politically correct, but Wade may have a point when he accuses him of not following the evidence to the most likely conclusions. “The fact that a hypothesis is politically uncomfortable,” Pinker writes, “does not mean that it is false, but it does mean that we should consider the evidence very carefully before concluding that it is true” (614). This sentiment echoes the position taken by Johnson: Hold off going public with sensitive ideas until you’re sure they’re right. But how can we ever be sure whether an idea has any validity if we’re not willing to investigate it? Wade’s case for natural selection operating through changing institutions during recorded history isn’t entirely convincing, but neither is it completely implausible. The evidence that would settle the issue simply hasn’t been discovered yet. But neither is there any evidence in Wade’s book to support the conclusion that his interest in the topic is political as opposed to purely scientific. “Each gene under selection,” he writes, “will eventually tell a fascinating story about some historical stress to which the population was exposed and then adapted” (105). Fascinating indeed, however troubling those stories may be.

            Is the best way to handle troublesome issues like the possible role of genes in behavioral variations between races to declare them off-limits to scientists until the evidence is incontrovertible? Might this policy come with the risk that avoiding the topic now will make it all too easy to deny any evidence that does emerge later? If genes really do play a role in violence and impulse-control, then we may need to take that into account when we’re devising solutions to societal inequities.

Genes are not gods whose desires must be bowed to. But neither are they imaginary forces that will go away if we just ignore them. The challenge of dealing with possible biological differences also arises in the context of gender. Because women continue to earn smaller incomes on average than men and are underrepresented in science and technology fields, and because the discrepancy is thought to be the product of discrimination and sexism, many scholars argue that any research into biological factors that may explain these outcomes is merely an effort at rationalizing injustice. The problem is that the evidence for biological differences in behavior between the genders is much stronger than it is for those between populations from various regions. We can ignore these findings—and perhaps even condemn the scientists who conduct the studies—because they don’t jibe with our preferred explanations. But solutions based on willful ignorance have little chance of being effective.

            The sad fact is that scientists and academics have nothing even resembling a viable moral framework for discussing biological behavioral differences. Their only recourse is to deny and inveigh. The quite reasonable fear is that warnings like Wade’s about how the variations are subtle and may not exist at all in any given individual will go unheeded as the news of the findings is disseminated, and dumbed-down versions of the theories will be coopted in the service of reactionary agendas. A study reveals that women respond more readily to a baby’s vocalizations, and the headlines read “Genes Make Women Better Parents.” An allele associated with violent behavior is found to be more common in African Americans, and some politician cites it as evidence that the astronomical incarceration rate for black males is justifiable. But is censorship the answer? Average differences between genders in career preferences are directly relevant to any discussion of uneven representation in various fields. And it’s possible that people with a certain allele will respond differently to different types of behavioral intervention. As Carl Sagan explained, in a much different context, in his book Demon-Haunted World, “we cannot have science in bits and pieces, applying it where we feel safe and ignoring it where we feel threatened—again, because we are not wise enough to do so” (297).

            Part of the reason the public has trouble understanding what differences between varying types of people may mean is that scientists are at odds with each other about how to talk about them. And with all the righteous declamations they can start to sound a lot like the talking heads on cable news shows. Conscientious and well-intentioned scholars have so thoroughly poisoned the well when it comes to biological behavioral differences that their possible existence is treated as a moral catastrophe. How should we discuss the topic? Working to convey the importance of the distinction between average and absolute differences may be a good start. Efforts to encourage people to celebrate diversity and to challenge the equating of genes with destiny are already popularly embraced. In the realm of policy, we might shift our focus from equality of outcome to equality of opportunity. It’s all too easy to find clear examples of racial disadvantages—in housing, in schooling, in the job market—that go well beyond simple head counting at top schools and in executive boardrooms. Slight differences in behavioral propensities can’t justify such blatant instances of unfairness. Granted, that type of unfairness is much more difficult to find when it comes to gender disparities, but the lesson there is that policies and agendas based on old assumptions might need to give way to a new understanding, not that we should pretend the evidence doesn’t exist or has no meaning.

            Wade believes it was safe for him to write about race because “opposition to racism is now well entrenched” in the Western world (7). In one sense, he’s right about that. Very few people openly profess a belief in racial hierarchies. In another sense, though, it’s just as accurate to say that racism is itself well entrenched in our society. Will A Troublesome Inheritance put the brakes on efforts to bring about greater social justice? This seems unlikely if only because the publication of every Bell Curve occasions the writing of another Mismeasure of Man.

  The unfortunate result is that where you stand on the issue will become yet another badge of political identity as we form ranks on either side. Most academics will continue to consider speculation irresponsible, apply a far higher degree of scrutiny to the research, and direct the purest moral outrage they can muster, while still appearing rational and sane, at anyone who dares violate the taboo. This represents the triumph of politics over science. And it ensures the further entrenchment of views on either side of the divide.

Despite the few superficial similarities between Wade’s arguments and those of racists and eugenicists of centuries past, we have to realize that our moral condemnation of what we suppose are his invidious extra-scientific intentions is itself born of extra-scientific ideology. Whether race plays a role in behavior is a scientific question. Our attitude toward that question and the parts of the answer that trickle in despite our best efforts at maintaining its status as taboo just may emerge out of assumptions that no longer apply. So we must recognize that succumbing to the temptation to moralize when faced with scientific disagreement automatically makes hypocrites of us all. And we should bear in mind as well that insofar as racial and gender differences really do exist it will only be through coming to a better understanding of them that we can hope to usher in a more just society for children of any and all genders and races. 

Also read: 

THE SELF-RIGHTEOUSNESS INSTINCT: STEVEN PINKER ON THE BETTER ANGELS OF MODERNITY AND THE EVILS OF MORALITY

And: 

NAPOLEON CHAGNON'S CRUCIBLE AND THE ONGOING EPIDEMIC OF MORALIZING HYSTERIA IN ACADEMIA

And:

FROM DARWIN TO DR. SEUSS: DOUBLING DOWN ON THE DUMBEST APPROACH TO COMBATTING RACISM

Dennis Junk

Are 1 in 5 Women Really Sexually Assaulted on College Campuses?

If you wanted to know how many young women are sexually assaulted on college campuses, you could easily devise a survey to ask a sample of them directly. But that’s not what advocates of stricter measures to prevent assault tend to do. Instead, they ask ambiguous questions they go on to interpret as suggesting an assault occurred. This almost guarantees wildly inflated numbers.

            If you were a university administrator and you wanted to know how prevalent a particular experience was for students on campus, you would probably conduct a survey that asked a few direct questions about that experience—foremost among them the question of whether the student had at some point had the experience you’re interested in. Obvious, right? Recently, we’ve been hearing from many news media sources, and even from President Obama himself, that one in five college women experience sexual assault at some time during their tenure as students. It would be reasonable to assume that the surveys used to arrive at this ratio actually asked the participants directly whether or not they had been assaulted. 

            But it turns out the web survey that produced the one-in-five figure did no such thing. Instead, it asked students whether they had had any of several categories of experience the study authors later classified as sexual assault, or attempted sexual assault, in their analysis. This raises the important question of how we should define sexual assault when we’re discussing the issue—along with the related question of why we’re not talking about a crime that’s more clearly defined, like rape. 

Of course, whatever you call it, sexual violence is such a horrible crime that most of us are willing to forgive anyone who exaggerates the numbers or paints an overly frightening picture of reality in an attempt to prevent future cases. (The issue is so serious that PolitiFact refrained from applying their trademark Truth-O-Meter to the one-in-five figure.) 

            But there are four problems with this attitude. The first is that for every supposed assault there is an alleged perpetrator. Dramatically overestimating the prevalence of the crime comes with the attendant risk of turning public perception against the accused, making it more difficult for the innocent to convince anyone of their innocence. 

            The second problem is that by exaggerating the danger in an effort to protect college students we’re sabotaging any opportunity these young adults may have to make informed decisions about the risks they take on. No one wants students to die in car accidents either, but we don’t manipulate the statistics to persuade them one in five drivers will die in a crash before they graduate from college. 

            The third problem is that going to college and experimenting with sex are for many people a wonderful set of experiences they remember fondly for the rest of their lives. Do we really want young women to barricade themselves in their dorms? Do we want young men to feel like they have to get signed and notarized documentation of consent before they try to kiss anyone? The fourth problem I’ll get to in a bit.

            We need to strike some appropriate balance in our efforts to raise awareness without causing paranoia or inspiring unwarranted suspicion. And that balance should be represented by the results of our best good-faith effort to arrive at as precise an understanding of the risk as our most reliable methods allow. For this purpose, The Department of Justice’s Campus Sexual Assault Study, the source of the oft-cited statistic, is all but completely worthless. It has limitations, to begin with, when it comes to representativeness, since it surveyed students on just two university campuses. And, while the overall sample was chosen randomly, the 42% response rate implies a great deal of self-selection on the part of the participants. The researchers did compare late responders to early ones to see if there was a systematic difference in their responses. But this doesn’t by any means rule out the possibility that many students chose categorically not to respond because they had nothing to say, and therefore had no interest in the study. (Some may have even found it offensive.) These are difficulties common to this sort of simple web-based survey, and they make interpreting the results problematic enough to recommend against their use in informing policy decisions.
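The mechanism behind that self-selection worry can be made concrete with a toy simulation. All of the numbers below are hypothetical, chosen only to illustrate the principle: if students who have had a relevant experience are even somewhat more likely to complete a voluntary survey than students who haven’t, the rate observed among respondents will overstate the true rate in the population.

```python
import random

random.seed(0)

# Toy model of nonresponse bias (all numbers hypothetical).
true_rate = 0.05  # true prevalence of the experience in the population

# Assumed response probabilities: students with the experience are
# more motivated to complete the survey than those without it.
p_respond_if_yes = 0.70
p_respond_if_no = 0.35

population = 100_000
respondents = []
for _ in range(population):
    has_experience = random.random() < true_rate
    p = p_respond_if_yes if has_experience else p_respond_if_no
    if random.random() < p:
        respondents.append(has_experience)

observed = sum(respondents) / len(respondents)
print(f"True rate: {true_rate:.1%}, observed among respondents: {observed:.1%}")
```

Under these made-up assumptions, the observed rate among respondents comes out to roughly twice the true prevalence. The point isn’t that the CSA study’s numbers were inflated by exactly this much—only that a 42% response rate leaves plenty of room for this kind of distortion.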

            The biggest problems with the study, however, are not with the sample but with the methods. The survey questions appear to have been deliberately designed to generate inflated incidence rates. The basic strategy of avoiding direct questions about whether the students had been the victims of sexual assault is often justified with the assumption that many young people can’t be counted on to know what actions constitute rape and assault. But attempting to describe scenarios in survey items to get around this challenge opens the way for multiple interpretations and discounts the role of countless contextual factors. The CSA researchers write, “A surprisingly large number of respondents reported that they were at a party when the incident happened.” Cathy Young, a contributing editor at Reason magazine who analyzed the study all the way back in 2011, wrote that

the vast majority of the incidents it uncovered involved what the study termed “incapacitation” by alcohol (or, rarely, drugs): 14 percent of female respondents reported such an experience while in college, compared to six percent who reported sexual assault by physical force. Yet the question measuring incapacitation was framed ambiguously enough that it could have netted many “gray area” cases: “Has someone had sexual contact with you when you were unable to provide consent or stop what was happening because you were passed out, drugged, drunk, incapacitated, or asleep?” Does “unable to provide consent or stop” refer to actual incapacitation – given as only one option in the question – or impaired judgment?  An alleged assailant would be unlikely to get a break by claiming he was unable to stop because he was drunk.

This type of confusion is why it’s important to design survey questions carefully. That the items in the CSA study failed to make the kind of fine distinctions that would allow for more conclusive interpretations suggests the researchers had other goals in mind.

            The researchers’ use of the blanket term “sexual assault,” and their grouping of attempted with completed assaults, is equally suspicious. Any survey designer cognizant of all the difficulties of web surveys would likely try to narrow the focus of the study as much as possible, and they would also try to eliminate as many sources of confusion with regard to definitions or descriptions as possible. But, as Young points out,

The CSA Study’s estimate of sexual assault by physical force is somewhat problematic as well – particularly for attempted sexual assaults, which account for nearly two-thirds of the total. Women were asked if anyone had ever had or attempted to have sexual contact with them by using force or threat, defined as “someone holding you down with his or her body weight, pinning your arms, hitting or kicking you, or using or threatening to use a weapon.” Suppose that, during a make-out session, the man tries to initiate sex by rolling on top of the woman, with his weight keeping her from moving away – but once she tells him to stop, he complies. Would this count as attempted sexual assault?

The simplest way to get around many of these difficulties would have been to ask the survey participants directly whether they had experienced the category of crime the researchers were interested in. If the researchers were concerned that the students might not understand that being raped while drunk still counts as rape, why didn’t they just ask the participants a question to that effect? It’s a simple enough question to devise.

            The study did pose a follow-up question to participants it classified as victims of forcible assault, the responses to which hint at the students’ actual thoughts about the incidents. It turns out 37 percent of so-called forcible assault victims explained that they hadn’t contacted law enforcement because they didn’t think the incident constituted a crime. That bears repeating: over a third of the students the study says were forcibly assaulted didn’t think any crime had occurred. With regard to another category of victims, those of incapacitated assault, Young writes, “Not surprisingly, three-quarters of the female students in this category did not label their experience as rape.” Of those the study classified as actually having been raped while intoxicated, only 37 percent believed they had in fact been raped. Nearly two-thirds of the women the study labels as incapacitated rape victims didn’t believe they had been raped. Why so much disagreement on such a serious issue? Of the entire incapacitated sexual assault victim category, Young writes,

Two-thirds said they did not report the incident to the authorities because they didn’t think it was serious enough. Interestingly, only two percent reported having suffered emotional or psychological injury – a figure so low that the authors felt compelled to include a footnote asserting that the actual incidence of such trauma was undoubtedly far higher.

So the largest category making up the total one-in-five statistic is predominantly composed of individuals who didn’t think what happened to them was serious enough to report. And nearly all of them came away unscathed, both physically and psychologically.

            The impetus behind the CSA study was a common narrative about a so-called “rape culture” in which sexual violence is accepted as normal and young women fail to report incidents because they’re convinced you’re just supposed to tolerate it. That was the researchers’ rationale for using their own classification scheme for the survey participants’ experiences even when it was at odds with the students’ beliefs. But researchers have been doing this same dance for thirty years. As Young writes,

When the first campus rape studies in the 1980s found that many women labeled as victims by researchers did not believe they had been raped, the standard explanation was that cultural attitudes prevent women from recognizing forced sex as rape if the perpetrator is a close acquaintance. This may have been true twenty-five years ago, but it seems far less likely in our era of mandatory date rape and sexual assault workshops and prevention programs on college campuses.

The CSA also surveyed a large number of men, almost none of whom admitted to assaulting women. The researchers hypothesize that the men may have feared the survey wasn’t really anonymous, but that would mean they knew the behaviors in question were wrong. Again, if the researchers are really worried about mistaken beliefs regarding the definition of rape, they could investigate the issue with a few added survey items.

            The huge discrepancies between incidences of sexual violence as measured by researchers and as reported by survey participants become even more suspicious in light of the history of similar studies. Those campus rape studies Young refers to from the 1980s produced a ratio of one in four. Their credibility was likewise undermined by later surveys that found that most of the supposed victims didn’t believe they’d been raped, and around forty percent of them went on to have sex with their alleged assailants again. A more recent study by the CDC used similar methods—a phone survey with a low response rate—and concluded that one in five women has been raped at some time in her life. Looking closer at this study, feminist critic and critic of feminism Christina Hoff Sommers attributes this finding as well to “a non-representative sample and vaguely worded questions.” It turns out activists have been conducting different versions of this same survey, and getting similarly inflated results, for decades.

            Sommers challenges the CDC findings in a video everyone concerned with the issue of sexual violence should watch. We all need to understand that well-intentioned and intelligent people can, and often do, get carried away with activism that seems to have laudable goals but ends up doing more harm than good. Some people even build entire careers on this type of crusading. And PR has become so sophisticated that we never need to let a shortage, or utter lack, of evidence keep us from advocating for our favorite causes. But there’s still a fourth problem with crazily exaggerated risk assessments—they obfuscate issues of real importance, making it more difficult to come up with real solutions. As Sommers explains,

To prevent rape and sexual assault we need state-of-the-art research. We need sober estimates. False and sensationalist statistics are going to get in the way of effective policies. And unfortunately, when it comes to research on sexual violence, exaggeration and sensation are not the exception; they are the rule. If you hear about a study that shows epidemic levels of sexual violence against American women, or college students, or women in the military, I can almost guarantee the researchers used some version of the defective CDC methodology. Now by this method, known as advocacy research, you can easily manufacture a women’s crisis. But here’s the bottom line: this is madness. First of all it trivializes the horrific pain and suffering of survivors. And it sends scarce resources in the wrong direction. Sexual violence is too serious a matter for antics, for politically motivated posturing. And right now the media, politicians, rape culture activists—they are deeply invested in these exaggerated numbers.

So while more and more normal, healthy, and consensual sexual practices are considered crimes, actual acts of exploitation and violence are becoming all the more easily overlooked in the atmosphere of paranoia. And college students face the dilemma of either risking assault or accusation by going out to enjoy themselves or succumbing to the hysteria and staying home, missing out on some of the richest experiences college life has to offer.

            One in five is a truly horrifying ratio. As conservative crime researcher Heather Mac Donald points out, “Such an assault rate would represent a crime wave unprecedented in civilized history. By comparison, the 2012 rape rate in New Orleans and its immediately surrounding parishes was .0234 percent; the rate for all violent crimes in New Orleans in 2012 was .48 percent.” I don’t know how a woman can pass a man on a sidewalk after hearing such numbers and not look at him with suspicion. Most of the reforms rape culture activists are pushing for now chip away at due process and strip away the rights of the accused. No one wants to make coming forward any more difficult for actual victims, but our first response to anyone making such a grave accusation—making any accusation—should be skepticism. Victims suffer severe psychological trauma, but then so do the falsely accused. The strongest evidence of an honest accusation is often the fact that the accuser must incur some cost in making it. That’s why we say victims who come forward are heroic. That’s the difference between a victim and a survivor.
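A back-of-the-envelope calculation shows just how stark that comparison is. Annualizing the one-in-five figure requires one assumption not stated in the original claim—a four-year college career—while the New Orleans rates are the per-year figures quoted above:

```python
# Back-of-the-envelope comparison of the one-in-five campus figure
# with the 2012 New Orleans crime rates quoted above.
# Assumes a four-year college career; ignores compounding.
campus_share = 1 / 5                  # one in five over a college career
years = 4                             # assumed career length
campus_annual = campus_share / years  # implied per-year rate

nola_rape_annual = 0.000234    # New Orleans 2012 rape rate: .0234%
nola_violent_annual = 0.0048   # all violent crime, New Orleans 2012: .48%

print(f"Implied annual campus assault rate: {campus_annual:.1%}")
print(f"vs. New Orleans rape rate: {campus_annual / nola_rape_annual:.0f}x")
print(f"vs. all New Orleans violent crime: {campus_annual / nola_violent_annual:.0f}x")
```

On those assumptions, the claimed campus rate works out to 5 percent per year—over two hundred times the quoted New Orleans rape rate, and roughly ten times the rate of all violent crime in that city combined.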

            Trumpeting crazy numbers creates the illusion that a large percentage of men are monsters, and this fosters an us-versus-them mentality that obliterates any appreciation for the difficulty of establishing guilt. That would be a truly scary world to live in. Fortunately, we in the US don’t really live in such a world. Sex doesn’t have to be that scary. It’s usually pretty damn fun. And the vast majority of men you meet—the vast majority of women as well—are good people. In fact, I’d wager most men would step in if they were around when some psychopath was trying to rape someone.

Also read:  

THE SELF-RIGHTEOUSNESS INSTINCT: STEVEN PINKER ON THE BETTER ANGELS OF MODERNITY AND THE EVILS OF MORALITY

And:

FROM DARWIN TO DR. SEUSS: DOUBLING DOWN ON THE DUMBEST APPROACH TO COMBATTING RACISM

And: 

VIOLENCE IN HUMAN EVOLUTION AND POSTMODERNISM'S CAPTURE OF ANTHROPOLOGY

Dennis Junk

Rebecca Mead’s Middlemarch Pilgrimage and the 3 Wrong Ways to Read a Novel

Rebecca Mead’s “My Life in Middlemarch,” about her lifelong love of George Eliot’s masterpiece, discusses various ways to think of the relationship between readers and authors, as well as the relationship between readers and characters. To get at the meat of these questions, though, you have to begin with an understanding of what literature is and how a piece of fiction works.

  All artists are possessed of the urge to render through some artificial medium the experience of something real. Since artists are also possessed of a desire to share their work and have it appreciated, they face the conundrum of having to wrest the attention of their audience away from the very reality they hope to relay some piece of—a feat which can only be accomplished with the assurance of something extraordinary. Stories that are too real are seldom interesting, while the stories that are the most riveting usually feature situations audiences are unlikely to encounter in their real lives. This was the challenge George Eliot faced when she set out to convey something of the reality of a provincial English town in the early nineteenth century; she had to come up with a way to write a remarkable novel about unremarkable characters in an unremarkable setting. And she somehow managed to do just that. By almost any measure, Eliot’s efforts to chronicle the fates of everyday people in a massive work many everyday people would enjoy reading were wildly, albeit complicatedly, successful.

  Before Middlemarch, the convention for English novels was to blandish readers with the promise of stories replete with romance or adventure, culminating more often than not with a wedding. Eliot turned the marriage plot on its head, beginning her novel with a marriage whose romantic underpinnings were as one-sided as they were transparently delusional. So what, if not luridness or the whiff of wish-fulfillment, does Eliot use to lure us into the not-so-fantastic fictional world of Middlemarch? Part of the answer is that, like many other nineteenth century novelists, she interspersed her own observations and interpretations with the descriptions and events that make up the story. “But Eliot was doing something in addition with those moments of authorial interjection,” Rebecca Mead writes in her book My Life in Middlemarch before going on to explain,

She insists that the reader look at the characters in the book from her own elevated viewpoint. We are granted a wider perspective, and a greater insight, than is available to their neighbors down in the world of Middlemarch. By showing us the way each character is bound within his or her own narrow viewpoint, while providing us with a broader view, she nurtures what Virginia Woolf described as “the melancholy virtue of tolerance.” “If Art does not enlarge men’s sympathies, it does nothing morally,” Eliot once wrote. “The only effect I ardently long to produce by my writings, is that those who read them should be better able to imagine and to feel the pains and the joys of those who differ from themselves in everything but the broad fact of being struggling erring human creatures.” (55-6)

Eliot’s story about ordinary people, in other words, is made extraordinary by the insightful presence of its worldly-wise author throughout its pages.

  But this solution to the central dilemma of art—how to represent reality so as to be worthy of distraction from reality—runs into another seeming contradiction. Near the conclusion of Middlemarch, Dorothea, one of the many protagonists, assures her sister Celia that her second marriage will not be the disaster her first was, explaining that the experience of falling in love this time around was much more auspicious than it had been before. Celia, wanting to understand what was so different, presses her, asking “Can’t you tell me?” Dorothea responds, “No, dear, you would have to feel with me, else you would never know” (783). The question that arises is whether a reader can be made to “imagine and feel the pains and joys” of a character while at the same time being granted access to the author’s elevated perspective. Can we in the audience simultaneously occupy spaces both on the ground alongside the character and hovering above her with a vantage on her place in the scheme of history and the wider world?

  One of the main sources of tension for any storyteller is the conflict between the need to convey information and provide context on the one hand and the goal of representing, or even simulating, experiences on the other. A weakness Eliot shares with many other novelists who lived before the turn of the last century is her unchecked impulse to philosophize when she could be making a profounder impact with narrative description or moment-by-moment approximations of a character’s thoughts and feelings. For instance, Dorothea, or Miss Brooke as she’s called in the first pages of the novel, yearns to play some important role in the betterment of humankind. Mead’s feelings about Dorothea and what she represents are likely shared by most who love the book. She writes,

As Miss Brooke, Dorothea remains for me the embodiment of that unnameable, agonizing ache of adolescence, in which burgeoning hopes and ambitions and terrors and longings are all roiled together. When I spend time in her company, I remember what it was like to be eighteen, and at the beginning of things. (43)

Dorothea becomes convinced that by marrying a much older scholar named Casaubon and helping him to bring his life’s work to fruition she’ll be fulfilling her lofty aspirations. So imagine her crushing disappointment upon discovering that Casaubon is little more than an uninspired and bloodless drudge who finds her eagerness to aid in his research more of an annoying distraction than an earnest effort at being supportive. The aspects of this disappointment that are unique to Dorothea, however, are described only glancingly. After locating her character in a drawing room, overwhelmed by all she’s seen during her honeymoon in Rome—and her new husband’s cold indifference to it all, an indifference which encompasses her own physical presence—Eliot retreats into generality:

Not that this inward amazement of Dorothea’s was anything very exceptional: many souls in their young nudity are tumbled out among incongruities and left to “find their feet” among them, while their elders go about their business. Nor can I suppose that when Mrs. Casaubon is discovered in a fit of weeping six weeks after her wedding, the situation will be regarded as tragic. Some discouragement, some faintness of heart at the new real future which replaces the imaginary, is not unusual, and we do not expect people to be deeply moved by what is not unusual. That element of tragedy which lies in the very fact of frequency, has not yet wrought itself into the coarse emotion of mankind; and perhaps our frames could hardly bear much of it. If we had a keen vision and feeling of all ordinary human life, it would be like hearing the grass grow and the squirrel’s heart beat, and we should die of that roar which lies on the other side of silence. As it is, the quickest of us walk about well wadded with stupidity. (185)

The obvious truth that humans must be selective in their sympathies is an odd point for an author to focus on at such a critical juncture in her heroine’s life—especially an author whose driving imperative is to widen the scope of her readers’ sympathies.

  My Life in Middlemarch, a memoir devoted to Mead’s evolving relationship with the novel and its author, takes for granted that Eliot’s masterpiece stands at the pinnacle of English literature, the greatest accomplishment of one of the greatest novelists in history. Virginia Woolf famously described it as “the magnificent book which with all its imperfections is one of the few English novels written for grown-up people.” Martin Amis called it “the central English novel” and said that it was “without weaknesses,” except perhaps for Dorothea’s overly idealized second love Will Ladislaw. Critics from F.R. Leavis to Harold Bloom have celebrated Eliot as one of the greatest novelists of all time. But Middlemarch did go through a period when it wasn’t as appreciated as it had been originally and has been again since the middle of the twentieth century. Mead quotes a couple of appraisals from the generation succeeding Eliot’s:

“It is doubtful whether they are novels disguised as treatises, or treatises disguised as novels,” one critic wrote of her works. Another delivered the verdict that her books “seem to have been dictated to a plain woman of genius by the ghost of David Hume.” (217)

And of course Woolf would have taken issue with Amis’s claim that Eliot’s novel has no weaknesses, since her oft-quoted line about Middlemarch being for grownups contains the phrase “with all its imperfections.” In the same essay, Woolf says of Eliot,

The more one examines the great emotional scenes the more nervously one anticipates the brewing and gathering and thickening of the cloud which will burst upon our heads at the moment of crisis in a shower of disillusionment and verbosity. It is partly that her hold upon dialogue, when it is not dialect, is slack; and partly that she seems to shrink with an elderly dread of fatigue from the effort of emotional concentration. She allows her heroines to talk too much.

She has little verbal felicity. She lacks the unerring taste which chooses one sentence and compresses the heart of the scene within that. ‘Whom are you going to dance with?’ asked Mr Knightley, at the Westons’ ball. ‘With you, if you will ask me,’ said Emma; and she has said enough. Mrs Casaubon would have talked for an hour and we should have looked out of the window.

Mead’s own description of the scene of Dorothea’s heartbreak in Rome is emblematic of both the best and the worst of Eliot’s handling of her characters. She writes,

For several pages, Eliot examines Dorothea’s emotions under a microscope, as if she were dissecting her heroine’s brain, the better to understand the course of its electrical flickers. But then she moves quickly and just as deeply into the inward movement of Casaubon’s emotions and sensations. (157)

Literary scholars like to justify the application of various ideologies to their analyses of novels by comparing the practice to looking through different lenses in search of new insights. But how much fellow feeling can we have for “electrical flickers” glimpsed through an eyepiece? And how are we to assess the degree to which critical lenses either clarify or distort what they’re held up to? What these metaphors of lenses and microscopes overlook is the near impossibility of sympathizing with specimens under a glass.

            The countless writers and critics who celebrate Middlemarch actually seem to appreciate all the moments when Eliot allows her storytelling to be submerged by her wry asides, sermons, and disquisitions. Indeed, Middlemarch may be best thought of as a kind of hybrid, and, partly because of this multifariousness, partly too because of the diversity of ways admirers like Mead have approached and appreciated it over the generations, the novel makes an ideal test case for various modes of literary reading. Will Ladislaw, serendipitously in Rome at the same time as the Casaubons, converses with Dorothea about all the art she has seen in the city. “It is painful,” she says, “to be told that anything is very fine and not be able to feel that it is fine—something like being blind, while people talk of the sky.” Ladislaw assures her, “Oh, there is a great deal in the feeling for art which must be acquired,” and then goes on to explain,

Art is an old language with a great many artificial affected styles, and sometimes the chief pleasure one gets out of knowing them is the mere sense of knowing. I enjoy the art of all sorts here immensely; but I suppose if I could pick my enjoyment to pieces I should find it made up of many different threads. There is something in daubing a little one’s self, and having an idea of the process. (196)

Applied to literature, Ladislaw’s observation—or confession—suggests that simply being in the know with regard to a work of great renown offers a pleasure of its own, apart from the direct experience of reading. (Imagine how many classics would be abandoned midway were they not known as such.) Mead admits to succumbing to this sort of glamor when she was first beginning to love Eliot and her work:

I knew that some important critics considered Middlemarch to be the greatest novel in the English language, and I wanted to be among those who understood why. I loved Middlemarch, and loved being the kind of person who loved it. It gratified my aspirations to maturity and learnedness. To have read it, and to have appreciated it, seemed a step on the road to being one of the grown-ups for whom it was written. (6-7).

What Mead is describing here is an important, albeit seldom mentioned, element in our response to any book. And there’s no better word for it than branding. Mead writes,

Books gave us a way to shape ourselves—to form our thoughts and to signal to each other who we were and who we wanted to be. They were part of our self-fashioning, no less than our clothes. (6)

The time in her life she’s recalling here is her late adolescence, the time when finding and forging all the pieces of our identities is of such pressing importance to us—and, not coincidentally, a time when we are busy laying the foundations of what will be our lifelong tastes in music and books.

It should not be lost on anyone that Mead was close in age to Dorothea in the early chapters when she first began reading Middlemarch, both of them “at the beginning of things.” Mead, perhaps out of nostalgia, tries to honor that early connection she felt even as she is compelled to challenge it as a legitimate approach to reading. Early in My Life in Middlemarch she writes,

Reading is sometimes thought of as a form of escapism, and it’s a common turn of phrase to speak of getting lost in a book. But a book can also be where one finds oneself; and when a reader is grasped and held by a book, reading does not feel like an escape from life so much as it feels like an urgent, crucial dimension of life itself. There are books that grow with the reader as the reader grows, like a graft to a tree. (16)

Much later in the book, though, after writing about how Eliot received stacks of letters from women insisting that they were the real-life version of Dorothea—and surmising that Eliot would have responded to such claims with contempt—Mead makes a gesture of deference toward critical theory, with its microscopes and illusory magnifications. She writes,

Such an approach to fiction—where do I see myself in there?—is not how a scholar reads, and it can be limiting in its solipsism. It’s hardly an enlarging experience to read a novel as if it were mirror of oneself. One of the useful functions of literary criticism and scholarship is to suggest alternative lenses through which a book might be read. (172)

As Eliot’s multimodal novel compels us to wonder how we should go about reading it, Mead’s memoir brilliantly examines one stance after another we might take in relationship to a novel. What becomes clear in the process is that literary scholars leave wide open the question of the proper way to read in part because they lack an understanding of what literary narratives really are. If a novel is a disguised treatise, then lenses that could peer through the disguise are called for. If a novel is a type of wish-fulfillment, then to appreciate it we really should imagine ourselves as the protagonist.

             One of the surprises of Middlemarch is that, for all the minute inspection the narrator subjects the handful of protagonists to, none of them is all that vividly rendered. While in the most immersive of novels the characters come to life in a way that makes it easy to imagine them stepping out of the pages into real life, Eliot’s characters invite the reader to step from real life into the pages of the book. We see this in the way Eliot moves away from Dorothea at her moment of crisis. We see it too in the character of Tertius Lydgate, a doctor with a passion for progress to match Dorothea’s. Eliot describes the birth of this passion with a kind of distant and universal perspective that reaches out to envelop readers in their own reminiscences:

Most of us who turn to any subject we love remember some morning or evening hour when we got on a high stool to reach down an untried volume, or sat with parted lips listening to a new talker, or for very lack of books began to listen to the voices within, as the first traceable beginning of our love. Something of that sort happened to Lydgate. He was a quick fellow, and when hot from play, would toss himself in a corner, and in five minutes be deep in any sort of book that he could lay his hands on. (135)

Who among the likely readers of Middlemarch would object to the sentiment expressed in the line about Lydgate’s first encounter with medical texts, after “it had already occurred to him that books were stuff, and that life was stupid”?

Mead herself seems to have been drawn into Middlemarch first by its brand and then by her ready ease in identifying with Dorothea. But, while her book is admirably free of ideological musings, she does get quite some distance beyond her original treatment of the characters as avatars. And she even suggests this is perhaps a natural progression in a serious reader’s relationship to a classic work.

Identification with character is one way in which most ordinary readers do engage with a book, even if it is not where a reader’s engagement ends. It is where part of the pleasure, and the urgency, of reading lies. It is one of the ways that a novel speaks to a reader, and becomes integrated into the reader’s own imaginative life. Even the most sophisticated readers read novels in the light of their own experience, and in such recognition, sympathy may begin. (172-3)

What, then, do those sophisticated readers graduate into once they’ve moved beyond naïve readings? The second mode of reading, the one that academics prefer, involves the holding up of those ideological lenses. Mead describes her experience of the bait-and-switch perpetrated against innumerable young literature aficionados throughout their university educations:

I was studying English literature because I loved books, a common enough motivation among students of literature, but I soon discovered that love didn’t have much purchase when it came to our studies. It was the mid-eighties, the era of critical theory—an approach to literature that had been developed at Yale, among other distant and exotic locales. I’d never heard of critical theory before I got to Oxford, but I soon discovered it was what the most sophisticated-seeming undergraduates were engaged by. Scholars applied the tools of psychoanalysis or feminism to reveal the ways in which the author was blind to his or her own desire or prejudice, or they used the discipline of deconstruction to dispense with the author altogether. (Thus, J. Hillis Miller on George Eliot: “This incoherent, heterogeneous, ‘unreadable,’ or nonsynthesizable quality of the text of Middlemarch jeopardizes the narrator’s effort of totalization.”) Books—or texts, as they were called by those versed in theory—weren’t supposed merely to be read, but to be interrogated, as if they had committed some criminal malfeasance. (145)

What identifying with the characters has to recommend it is that it makes of the plot a series of real-time experiences. The critical approaches used by academics take for granted that participating in the story like this puts you at risk of contracting the author’s neuroses or acquiring her prejudices. To critical theorists, fiction is a trick to make us all as repressed and as given to oppressing women and minorities as both the author and the culture to which she belongs. By applying their prophylactic theories, then, academic critics flatter themselves by implying that they are engaging in a type of political activism.

In her descriptions of what this second mode of reading tends to look like up close, Mead hints at a third mode that she never fully considers. The second mode works on the assumption that all fiction is allegory, representing either some unconscious psychological drama or some type of coded propaganda. But, as Mead recounts, the process of decoding texts, which in the particular case she witnesses consists of an “application of Marxist theory to literature,” has many of the hallmarks of a sacred ritual:

I don’t recall which author was the object of that particular inquisition, but I do remember the way the room was crowded with the don’s acolytes. Monkish-looking young men with close-shaven heads wearing black turtlenecks huddled with their notebooks around the master, while others lounged on the rug at his feet. It felt very exclusive—and, with its clotted jargon, willfully difficult. (145)

The religious approach to reading favored by academics is decidedly Old Testament; it sees literature and the world as full of sin and evil, and makes of reading a kind of expiation or ritual purging. A more New Testament approach would entail reading a fictional story as a parable while taking the author as a sort of messianic figure—a guide and a savior. Here’s Mead describing a revelation she reached when she learned about Eliot’s relationship with her stepsons. It came to her after a period of lamenting how Middlemarch had nothing to say about her own similarly challenging family situation:

A book may not tell us exactly how to live our own lives, but our own lives can teach us how to read a book. Now when I read the novel in the light of Eliot’s life, and in the light of my own, I see her experience of unexpected family woven deep into the fabric of the novel—not as part of the book’s obvious pattern, but as part of its tensile strength. Middlemarch seems charged with the question of being a stepmother: of how one might do well by one’s stepchildren, or unwittingly fail them, and of all that might be gained from opening one’s heart wider. (110)

The obvious drawback to this approach—tensile strength?—is that it makes of the novel something of a Rorschach, an ambiguous message we read into whatever meaning we most desire at a given moment. But, as Mead points out, this is to some degree inevitable. She writes that

…all readers make books over in their own image, and according to their own experience. My Middlemarch is not the same as anyone else’s Middlemarch; it is not even the same as my Middlemarch of twenty-five years ago. Sometimes, we find that a book we love has moved another person in the same ways as it has moved ourselves, and one definition of compatibility might be when two people have highlighted the same passages in their editions of a favorite novel. But we each have our own internal version of the book, with lines remembered and resonances felt. (172)

How many other readers, we may wonder, see in Middlemarch an allegory of stepparenthood? And we may also wonder how far this rather obvious point should be taken. If each reader’s experience of a novel is completely different from every other reader’s, then we’re stuck once more with solipsism. But if this were true we wouldn’t be able to talk about a novel to other people in the first place, let alone discover whether it’s moved others in the same way as us.

           Mead doesn’t advocate any of the modes of reading she explores, and she seems to have taken on each of them at the point in her life she did without any conscious deliberation. But it’s the religious reading of Middlemarch, the one that treats the story as an extended parable, that she has ultimately alighted on—at least as of the time when she was writing My Life in Middlemarch. This is largely owing to how easily the novel lends itself to this type of reading. From the early chapters in which we’re invited to step into the shoes of these ardent, bookish, big-spirited characters to the pages detailing their inevitable frustrations and disappointments, Eliot really does usher her readers—who must be readerly and ambitious even to take up such a book—toward a more mature mindset. Some of the most moving scenes feature exchanges between Dorothea and Dr. Lydgate, and in their subtle but unmistakable sympathy toward one another they seem to be reaching out to us with the same sympathy. Here is Dorothea commiserating with Lydgate in the wake of a scandal which has tarnished his name and thwarted his efforts at reform:

And that all this should have come to you who had meant to lead a higher life than the common, and to find out better ways—I cannot bear to rest in this as unchangeable. I know you meant that. I remember what you said to me when you first spoke to me about the hospital. There is no sorrow I have thought more about than that—to love what is great, and try to reach it, and yet to fail. (727)

Dorothea helps Lydgate escape Middlemarch and establish himself in London, where he becomes a successful physician but never pushes through any reforms to the profession. And this is one of Dorothea’s many small accomplishments. The lesson of the parable is clear in the famous final lines of the novel:

Her finely-touched spirit had still its fine issues, though they were not widely visible. Her full nature, like that river of which Cyrus broke the strength, spent itself in channels which had no great name on the earth. But the effect of her being on those around her was incalculably diffusive: for the growing good of the world is partly dependent on unhistoric acts; and that things are not so ill with you and me as they might have been, is partly owing to the number who lived faithfully a hidden life, and rest in unvisited tombs. (799)

Entire generations of writers, scholars, and book-lovers have taken this message to heart and found solace in its wisdom. But its impact depends not so much on the reader’s readiness to sympathize with the characters as on their eagerness to identify with them. In other words, Eliot is helping her readers to sympathize with themselves, not “to imagine and to feel the pains and the joys of those who differ from themselves in everything but the broad fact of being struggling erring human creatures.”

            For all the maturity of its message, Middlemarch invites a rather naïve reading. But what’s wrong with reading this way? And how else might we read if we want to focus more on learning to sympathize with those who differ from us? My Life in Middlemarch chronicles the journeys Mead takes to several places of biographical significance hoping to make some connection or attain some greater understanding of Eliot and her writing. Mead also visits libraries in England and the US so she can get her hands on some of the sacred texts still bearing ink scribbled onto paper by the author’s own hand. In an early chapter of Middlemarch, Eliot shows readers Casaubon’s letter proposing marriage to Dorothea, and it’s simultaneously comic and painful for being both pedantic and devoid of feeling. As I read the letter, I had the discomfiting thought that it was only a slight exaggeration of Eliot’s usual prose. Mead likewise characterizes one of Eliot’s contemporary fans, Alexander Main, in a way uncomfortably close to the way she comes across herself. Though she writes at one point, “I recognize in his enthusiasm for her works enough of my own admiration for her to feel an awkward fellowship with him,” she doesn’t seem to appreciate the extent to which Main’s relationship to Eliot and her work resembles her own. But she also hints at something else in her descriptions of Main, something that may nod to an alternative mode of reading beyond the ones she’s already explored. She writes,

In his excessive, grandiose, desperately lonely letters, Main does something that most of us who love books do, to some extent or another. He talks about the characters as if they were real people—as vivid, or more so, than people in his own life. He makes demands and asks questions of an author that for most of us remain imaginary but which he transformed, by force of will and need, into an intense epistolary relationship. He turned his worship and admiration of George Eliot into a one-sided love affair of sorts, by which he seems to have felt sustained even as he felt still hungrier for engagement. (241-2)

Main, and to some extent Mead as well, makes of Eliot a godly figure fit for worship, and who but God could bring characters into life—into our lives—who are as real and vivid as other actual living breathing human beings we love or hate or like or tolerate?

As Mead’s reading into Middlemarch an allegory of stepmotherhood illustrates, worshipping authors and treating their works as parables comes with the risk of overburdening what’s actually on the page with whatever meanings the faithful yearn to find there, allowing existential need to override honest appreciation. But the other modes are just as problematic. Naïve identification with the heroine, as Mead points out, limits the scope of our sympathy and makes it hard to get into a novel whose characters are strange or otherwise contrary to our individual tastes. It also makes a liability of characters with great weaknesses or flaws or any other traits that would make being in their shoes distasteful or unpleasant. Treating a story as an allegory, on the other hand, can potentially lead to an infinite assortment of interpretations, all of questionable validity. This mode of reading also implies that writers of fiction have the same goals to argue or persuade as writers of tracts and treatises. The further implication is that all the elements of storytelling are really little more than planks making up the Trojan horse conveying the true message of the story—and if this were the case why would anyone read fictional stories in the first place? It’s quite possible that this ideological approach to teaching literature has played some role in the declining number of people who read fiction.

What’s really amazing about the people who love any book is that, like Alexander Main, they all tend to love certain characters and despise certain others as if they were real people. There may be a certain level of simple identification with the protagonist, but we also tend to identify with real people who are similar to us—that’s the basis of many friendships and relationships. It’s hard to separate identification from fellow-feeling and sympathy even in what seem to be the most wish-fulfilling stories. Does anybody really want to be Jane Eyre or Elizabeth Bennet? Does anyone really want to be James Bond or Katniss Everdeen? Or do we just admire and try to emulate some of their qualities? Characters in fiction are designed to be more vivid than people in real life because, not even being real, they have to be extraordinary in some way to get anyone to pay attention to them. Their contours are more clearly delineated, their traits exaggerated, and their passions intensified. This doesn’t mean, however, that we’re falling for some kind of trick when we forget for a moment that they’re not like anyone we’ll ever meet.

Characters, to be worthy of attention, have to be caricatures—like real people but with a few identifying characteristics blown all out of realistic proportion. Dorothea is a caricature. Casaubon is for sure a caricature. But we understand them using the same emotional and cognitive processes we use to understand real people. And it is in exercising these very perspective-taking and empathizing abilities that our best hope for expanding our sympathies lies. What’s the best way to read a work of literature? First, realize that the author is not a god. In fact, forget the author as best you can. Eliot makes it difficult for us to overlook her presence in any scene, and for that reason it may be time to buck the convention and admit that Middlemarch, as brilliantly conceived as it was, as pioneering and revolutionary as it was, is not by any means the greatest novel ever written. What’s more important to concentrate on than the author is the narrator, who may either be herself an important character in the story, or who may stand as far out of the picture as she can so as not to occlude our view. What’s most important though is to listen to the story the narrator tells, to imagine it’s really happening, right before our eyes, right in the instant we experience it. And, at least for that moment, forget what anyone else has to say about what we're witnessing.

Also read: 

MUDDLING THROUGH "LIFE AFTER LIFE": A REFLECTION ON PLOT AND CHARACTER IN KATE ATKINSON’S NEW NOVEL

And: 

WHY SHAKESPEARE NAUSEATED DARWIN: A REVIEW OF KEITH OATLEY'S "SUCH STUFF AS DREAMS"

And: 

GONE GIRL AND THE RELATIONSHIP GAME: THE CHARMS OF GILLIAN FLYNN'S AMAZINGLY SEDUCTIVE ANTI-HEROINE


The Spider-Man Stars' Dust-up over Pseudo-Sexism

You may have noticed that the term sexism has come to refer to any suggestion that there may be meaningful differences between women and men, or between what’s considered feminine and what’s considered masculine. But is the denial of sex differences really what’s best for women? Is it what’s best for anyone?

A new definition of the word sexism has taken hold in the English-speaking world, even to the point where it’s showing up in official definitions. No longer used merely to describe the belief that women are somehow inferior to men, sexism can now refer to any belief in gender differences. Case in point: when Spider-Man star Andrew Garfield fielded a question from a young boy about how the superhero came by his iconic costume by explaining that he sewed it himself, even though sewing is “kind of a feminine thing to do,” Emma Gray and The Huffington Post couldn’t resist griping about Garfield’s “Casual Sexism” and celebrating his girlfriend Emma Stone’s “Most Perfect Way” of calling it out. Gray writes,

Instead of letting the comment—which assumes that there is something fundamentally female about sewing, and that doing such a “girly” thing must be qualified with a “masculine” outcome—slide, Stone turned the Q&A panel into an important teachable moment. She stopped her boyfriend and asked: “It's feminine, how?”

Those three words are underwhelming enough to warrant suspicion that Gray is really just cheerleading for someone she sees as playing for the right team.  

            A few decades ago, people would express beliefs about the proper roles and places for women quite openly in public. Outside of a few bastions of radical conservatism, you’re unlikely to hear anyone say today that women shouldn’t be allowed to run businesses or serve in high office. But rather than being leveled with decreasing frequency, the charge of sexism is now applied to a wider and more questionable assortment of ideas and statements. Surprised at having fallen afoul of this broadening definition of sexism, Garfield responded to Stone’s challenge by saying,

It’s amazing how you took that as an insult. It’s feminine because I would say femininity is about more delicacy and precision and detail work and craftsmanship. Like my mother, she’s an amazing craftsman. She in fact made my first Spider-Man costume when I was three. So I use it as a compliment, to compliment the feminine in women but in men as well. We all have feminine in us, young men.

Gray sees that last statement as a result of how Stone “pressed Garfield to explain himself.” Watch the video, though, and you’ll see she did little pressing. He seemed happy to explain what he meant. And that last line was actually a reiteration of the point he’d made originally by saying, “It’s kind of a feminine thing to do, but he really made a very masculine costume”—the line that Stone pounced on. 

            Garfield’s handling of both the young boy’s question and Stone’s captious interruption is far more impressive than Stone’s supposedly perfect way of calling him out. Indeed, Stone’s response was crudely ideological, implying quite simply that her boyfriend had revealed something embarrassing about himself—gotcha!—and encouraging him to expound further on his unacceptable ideas so she and the audience could chastise him. She had, like Gray, assumed that any reference to gender roles was sexist by definition. But did Garfield’s original answer to the boy’s question really reveal that he “assumes that there is something fundamentally female about sewing, and that doing such a ‘girly’ thing must be qualified with a ‘masculine’ outcome,” as Gray claims? (Note her deceptively inconsistent use of scare quotes and actual quotes.)

Garfield’s thinking throughout the exchange was quite sophisticated. First, he tried to play up Spider-Man’s initiative and self-sufficiency because he knew the young fan would appreciate these qualities in his hero. Then he seems to have realized that the young boy might be put off by the image of his favorite superhero engaging in an activity that’s predominantly taken up by women. Finally, he realized he could use this potential uneasiness as an opportunity for making the point that just because a male does something generally considered feminine doesn’t mean he’s any less masculine. This is the opposite of sexism. So why did Stone and Gray cry foul?

One of the tenets of modern feminism is that gender roles are either entirely chimerical or, to the extent that they exist, socially constructed. In other words, they’re nothing but collective delusions. Accepting, acknowledging, or referring to gender roles then, especially in the presence of a young child, abets in the perpetuation of these separate roles. Another tenet of modern feminism that comes into play here is that gender roles are inextricably linked to gender oppression. The only way for us as a society to move toward greater equality, according to this ideology, is for us to do away with gender roles altogether. Thus, when Garfield or anyone else refers to them as if they were real or in any way significant, he must be challenged.

One of the problems with Stone’s and Gray’s charge of sexism is that there happens to be a great deal of truth in every aspect of Garfield’s answer to the boy’s question. Developmental psychologists consistently find that young children really are preoccupied with categorizing behaviors by gender and that the salience of gender to children arises so reliably and at so young an age that it’s unlikely to stem from socialization.

Studies have also consistently found that women tend to excel in tasks requiring fine motor skill, while men excel in most other dimensions of motor ability. And what percentage of men ever go beyond sewing buttons on their shirts—if they do even that? Why but for the sake of political correctness would anyone deny this difference? Garfield’s response to Stone’s challenge was also remarkably subtle. He didn’t act as though he’d been caught in a faux pas but instead turned the challenge around, calling Stone out for assuming he somehow intended to disparage women. He then proudly expounded on his original point. If anything, it looked a little embarrassing for Stone.

Modern feminism has grown over the past decade to include the push for LGBT rights. Historically, gender roles were officially sanctioned and strictly enforced, so it was understandable that anyone advocating for women’s rights would be inclined to question those roles. Today, countless people who don’t fit neatly into conventional gender categories are in a struggle with constituencies who insist their lifestyles and sexual preferences are unnatural. But even those of us who support equal rights for LGBT people have to ask ourselves if the best strategy for combating bigotry is an aggressive and wholesale denial of gender. Isn’t it possible to recognize gender differences, and even celebrate them, without trying to enforce them prescriptively? Can’t we accept the possibility that some average differences are innate without imposing definitions on individuals or punishing them for all the ways they upset expectations? And can’t we challenge religious conservatives for the asinine belief that nature sets up rigid categories and the idiotic assumption that biology is about order as opposed to diversity instead of ignoring (or attacking) psychologists who study gender differences?

I think most people realize there’s something not just unbecoming but unfair about modern feminism’s anti-gender attitude. And most people probably don’t appreciate all the cheap gotchas liberal publications like The Huffington Post and The Guardian and Slate are so fond of touting. Every time feminists accuse someone of sexism for simply referring to obvious gender differences, they belie their own case that feminism is no more and no less than a belief in the equality of women. Only twenty percent of Americans identify themselves as feminists, while over eighty percent believe in equality for women. Feminism, like sexism, has clearly come to mean something other than what it used to. It may be the case that just as the gender roles of the past century gradually came to be seen as too rigid, so too are that century’s ideologies increasingly seen as too lacking in nuance and their proponents too quick to condemn. It may even be that we Americans and Brits no longer need churchy ideologies to tell us all people deserve to be treated equally.

Also read: 

FROM DARWIN TO DR. SEUSS: DOUBLING DOWN ON THE DUMBEST APPROACH TO COMBATTING RACISM

And: 

SCIENCE’S DIFFERENCE PROBLEM: NICHOLAS WADE’S TROUBLESOME INHERITANCE AND THE MISSING MORAL FRAMEWORK FOR DISCUSSING THE BIOLOGY OF BEHAVIOR

And:

WHY I WON'T BE ATTENDING THE GENDER-FLIPPED SHAKESPEARE PLAY

Dennis Junk

Why I Won't Be Attending the Gender-Flipped Shakespeare Play

Gender-flipped performances are often billed as experiments to help audiences reconsider and look more deeply into what we consider the essential characteristics of males and females. But not too far beneath the surface, you find that they tend to be more like ideological cudgels for those who would deny biological differences.

The Guardian’s “Women’s Blog” reports that “Gender-flips used to challenge sexist stereotypes are having a moment,” and this is largely owing, author Kira Cochrane suggests, to the fact that “Sometimes the best way to make a point about sexism is also the simplest.” This simple approach to making a point consists of taking a work of art or piece of advertising and swapping the genders featured in them. Cochrane goes on to point out that “the gender-flip certainly isn’t a new way to make a political point,” and notes that “it’s with the recent rise of feminist campaigning and online debate that this approach has gone mainstream.”

What is the political point gender-flips are making? As a dancer in a Jennifer Lopez video that reverses the conventional gender roles asks, “Why do men always objectify the women in every single video?” Australian comedian Christiaan Van Vuuren explains that he posed for a reproduction of a GQ cover originally featuring a sexy woman to call attention to the “over-sexualization of the female body in the high-fashion world.” The original cover photo of Miranda Kerr is undeniably beautiful. The gender-flipped version is funny. The obvious takeaway is that we look at women and men differently (gasp!). When women strike an alluring pose, or don revealing clothes, it’s sexy. When men try to do the same thing, it’s ridiculous. Feminists insist that this objectification or over-sexualization of women is a means of oppression. But is it? And are gender-flips simple ways of making a point, or just cheap gimmicks? 

Tonight, my alma mater IPFW is hosting a production called “Juliet and Romeo,” a gender-flipped version of Shakespeare’s most recognizable play. The lead on the Facebook page for the event asks us to imagine that “Juliet is instead a bold Montague who courts a young, sheltered Capulet by the name of Romeo.” Lest you fear the production is just a stunt to make a political point about gender, the hosts have planned a “panel discussion focusing on Shakespeare, gender, and language.” Many former classmates and teachers, most of whom I consider friends, a couple I consider good friends, are either attending or participating in the event. But I won’t be going.

I don’t believe the production is being put on in the spirit of open-minded experimentation. Like the other gender-flip examples, the purpose of staging “Juliet and Romeo” is to make a point about stereotypes. And I believe this proclivity toward using literature as fodder for ideological agendas is precisely what’s most wrong with English lit programs in today’s universities. There have to be better ways to foster interest in great works than by letting activists posing as educators use them as anvils on which to hammer agendas into students’ heads.

You may take the position that my objections would carry more weight were I to attend the event before rendering judgment on it. But I believe the way to approach literature is as an experience, not as a static set of principles or stand-alone abstractions. And I don’t want thoughts about gender politics to intrude on my experience of Shakespeare—especially when those thoughts are of such dubious merit. I want to avoid the experience of a gender-flipped production of Shakespeare because I believe scholarship should push us farther into literature—enhance our experience of it, make it more immediate and real—not cast us out of it by importing elements of political agendas and making us cogitate about some supposed implications for society of what’s going on before our eyes.

Regarding that political point, I see no contradiction in accepting, even celebrating, our culture’s gender roles while at the same time supporting equal rights for both genders. Sexism is a belief that one gender is inferior to the other. Demonstrating that people of different genders tend to play different roles in no way proves that either is being treated as inferior. As for objectification and over-sexualization, a moment’s reflection ought to make clear that the feminists are getting this issue perfectly backward. Physical attractiveness is one of the avenues through which women exercise power over men. Miranda Kerr got paid handsomely for that GQ cover. And what could be more arrantly hypocritical than Jennifer Lopez complaining about objectification in music videos? She owes her celebrity in large part to her willingness to allow herself to be objectified. The very concept of objectification is only something we accept from long familiarity: people are sexually aroused by other people, not objects.

I’m not opposed to having a discussion about gender roles and power relations, but if you have something to say, then say it. I’m not even completely opposed to discussing gender in the context of Shakespeare’s plays. What I am opposed to is people hijacking our experience of Shakespeare to get some message across, people toeing the line by teaching that literature is properly understood by “looking at it through the lens” of one or another well-intentioned but completely unsupported ideology, and people misguidedly making sex fraught and uncomfortable for everyone. I doubt I’m alone in turning to literature, at least in part, to get away from the sort of puritanism you’d expect to find in church. Guilt-tripping guys and encouraging women to walk around with a chip on their shoulders must be one of the least effective ways we’ve ever come up with to get people to respect each other.

But, when you guys do a performance of the original Shakespeare, you can count on me being there to experience it. 

Update:

The link to this post on Facebook generated some heated commentary. Some comments were denials of ideological intentions on the part of those putting on the event. Some were mischaracterizations based on presumed “traditionalist” associations with my position. Some made the point that Shakespeare himself played around with gender, so it should be okay for others to do the same with his work. In the end, I did feel compelled to attend the event because I had taken such a strong position.

Having flip-flopped and attended the event, I have to admit I enjoyed it. All the people involved were witty, charming, intellectually stimulating, and pretty much all-around delightful.

But, just as I originally complained, it was quite clear—and at two points explicitly stated—that the "experiment" entailed using the play as a springboard for a discussion of current issues like marriage rights. Everyone, from the cast to audience members, was quick to insist after the play that they felt it was completely natural and convincing. But gradually more examples of "awkward," "uncomfortable," or "weird" lines or scenes came up. Shannon Bischoff, a linguist one commenter characterized as the least politically correct guy I’d ever meet, did in fact bring up a couple aspects of the adaptation that he found troubling. But even he paused after saying something felt weird, as if to say, "Is that alright?" (Being weirded out by a fifteen-year-old Romeo being pursued by a Juliet in her late teens was okay because it was about age, not gender.)

The adapter himself, Jack Cant, said at one point that though he was tempted to rewrite some of the parts that seemed really strange, he decided to leave them in because he wanted to let people be uncomfortable. The underlying assumption of the entire discussion was that gender is a "social construct" and that our expectations are owing solely to "stereotypes." And the purpose of the exercise was for everyone to be brought face-to-face with their assumptions about gender so that they could expiate them. I don't think any fair-minded attendee could deny the agreed-upon message was that this is a way to help us do away with gender roles—and that doing so would be a good thing. (If there was any doubt, Jack’s wife eliminated it when she stood up from her seat in the audience to say she wondered if Jack had learned enough from the exercise to avoid applying gender stereotypes to his nieces.) And this is exactly what I mean by ideology. Sure, Shakespeare played around with gender in As You Like It and Twelfth Night. But he did it for dramatic or comedic effect primarily, and to send a message secondarily—or more likely not at all.

For the record, I think biology plays a large (but of course not exclusive) part in gender roles, I enjoy and celebrate gender roles (love being a man; love women who love being women), but I also support marriage rights for homosexuals and try to be as accepting as I can of people who don't fit the conventional roles.

To make one further clarification: whether you support an anti-gender agenda and whether you think Shakespeare should be used as a tool for this or any other ideological agenda are two separate issues. I happen not to support anti-genderism. My main point in this post, however, is that ideology—good, bad, valid, invalid—should not play a part in literature education. Because, for instance, while students are being made to feel uncomfortable about their unexamined gender assumptions, they're not feeling uncomfortable about, say, whether Romeo might be rushing into marriage too hastily, or whether Juliet will wake up in time to keep him from drinking the poison—you know, the actual play.

Whether Shakespeare was sending a message or not, I'm sure he wanted first and foremost for his audiences to respond to the characters he actually created. And we shouldn't be using "lenses" to look at plays; we should be experiencing them. They're not treatises. They're not coded allegories. And, as old as they may be to us, every generation of students gets to discover them anew.

We can discuss politics and gender or whatever you want. There's a time and a place for that and it's not in a lit classroom. Sure, let's encourage students to have open minds about gender and other issues, and let's help them to explore their culture and their own habits of thought. There are good ways to do that—ideologically adulterated Shakespeare is not one of them.

Also read:

FROM DARWIN TO DR. SEUSS: DOUBLING DOWN ON THE DUMBEST APPROACH TO COMBATTING RACISM

And: 

HOW TO GET KIDS TO READ LITERATURE WITHOUT MAKING THEM HATE IT

And: 

THE ISSUE WITH JURASSIC WORLD NO ONE HAS THE BALLS TO TALK ABOUT

Dennis Junk

The Better-than-Biblical History of Humanity Hidden in Tiny Cells and a Great Story of Science Hidden in Plain Sight

With “Neanderthal Man,” paleogeneticist Svante Pääbo has penned a deeply personal and sparely stylish paean to the field of paleogenetics and all the colleagues and supporters who helped him create it. The book offers an invaluable look behind the scenes of some of the most fascinating research in recent decades.

            Anthropology enthusiasts became acquainted with the name Svante Pääbo in books or articles published throughout the latter half of this century’s first decade about how our anatomically modern ancestors might have responded to the presence of other species of humans as they spread over new continents tens of thousands of years ago. The first bit of news associated with this unplaceable name was that humans probably never interbred with Neanderthals, a finding that ran counter to the multiregionalist theory of human evolution and lent support to the theory of a single origin in Africa. The significance of the Pääbo team’s findings in the context of this longstanding debate was a natural enough angle for science writers to focus on. But what’s shocking in hindsight is that so little of what was written during those few years conveyed any sense of wonder at the discovery that DNA from Neanderthals, a species that went extinct 30,000 years ago, was still retrievable—that snatches of it had in fact already been sequenced.

Then, in 2010, the verdict suddenly changed; humans really had interbred with Neanderthals, and all people alive today who trace their ancestry to regions outside of Africa carry vestiges of those couplings in their genomes. The discrepancy between the two findings, we learned, was owing to the first being based on mitochondrial DNA and the second on nuclear DNA. Even those anthropology students whose knowledge of human evolution derived mostly from what can be gleaned from the shapes and ages of fossil bones probably understood that since hundreds of copies of mitochondrial DNA reside in every cell of a creature’s body, while each cell houses only two copies of the nuclear genome, this latest feat of gene sequencing must have been an even greater challenge. Yet, at least among anthropologists, the accomplishment got swallowed up in the competition between rival scenarios for how our species came to supplant all the other types of humans. Though, to be fair, there was a bit of marveling among paleoanthropologists at the implications of being some percentage Neanderthal.

            Fortunately for us enthusiasts, in his new book Neanderthal Man: In Search of Lost Genomes, Pääbo, a Swedish molecular biologist now working at the Max Planck Institute in Leipzig, goes some distance toward making it possible for everyone to appreciate the wonder and magnificence of his team’s monumental achievements. It would have been a great service to historians for him to simply recount the series of seemingly insurmountable obstacles the researchers faced at various stages, along with the technological advances and bursts of inspiration that saw them through. But what he’s done instead is pen a deeply personal and sparely stylish paean to the field of paleogenetics and all the colleagues and supporters who helped him create it.

It’s been over sixty years since Watson and Crick, with some help from Rosalind Franklin, revealed the double-helix structure of DNA. But the Human Genome Project, the massive effort to sequence all three billion base pairs that form the blueprint for a human, was completed just over ten years ago. As inexorable as the march of technological progress often seems, the jump from methods for sequencing the genes of living creatures to those of long-extinct species only strikes us as foregone in hindsight. At the time when Pääbo was originally dreaming of ancient DNA, which he first hoped to retrieve from Egyptian mummies, there were plenty of reasons to doubt it was possible. He writes,

When we die, we stop breathing; the cells in our body then run out of oxygen, and as a consequence their energy runs out. This stops the repair of DNA, and various sorts of damage rapidly accumulate. In addition to the spontaneous chemical damage that continually occurs in living cells, there are forms of damage that occur after death, once the cells start to decompose. One of the crucial functions of living cells is to maintain compartments where enzymes and other substances are kept separate from one another. Some of these compartments contain enzymes that break down DNA from various microorganisms that the cell may encounter and engulf. Once an organism dies and runs out of energy, the compartment membranes deteriorate, and these enzymes leak out and begin degrading DNA in an uncontrolled way. Within hours and sometimes days after death, the DNA strands in our body are cut into smaller and smaller pieces, while various other forms of damage accumulate. At the same time, bacteria that live in our intestines and lungs start growing uncontrollably when our body fails to maintain the barriers that normally contain them. Together these processes will eventually dissolve the genetic information stored in our DNA—the information that once allowed our body to form, be maintained, and function. When that process is complete, the last trace of our biological uniqueness is gone. In a sense, our physical death is then complete. (6)

The hope was that amid this nucleic carnage enough pieces would survive to restore a single strand of the entire genome. That meant Pääbo needed lots of organic remains and some really powerful extraction tools. It also meant that he’d need some well-tested and highly reliable methods for fitting the pieces of the puzzle together.

            Along with the sense of inevitability that follows fast on the heels of any scientific advance, the impact of the Neanderthal Genome Project’s success in the wider culture was also dampened by a troubling inability on the part of the masses to appreciate that not all ideas are created equal—that any particular theory is only as good as the path researchers followed to arrive at it and the methods they used to validate it. Sadly, it’s in all probability the very people who would have been the most thoroughly gobsmacked by the findings coming out of the Max Planck Institute whose amazement switches are most susceptible to hijacking at the hands of the charlatans and ratings whores behind shows like Ancient Aliens. More serious than the cheap fictions masquerading as science that abound in pop culture, though, is a school of thought in academia that not only fails to grasp, but outright denies, the value of methodological rigor, charging that the methods themselves are mere vessels for the dissemination of encrypted social and political prejudices.

Such thinking can’t survive even the most casual encounter with the realities of how science is conducted. Pääbo, for instance, describes his team’s frustration whenever rival researchers published findings based on protocols that failed to meet the standards they’d developed to rule out contamination from other sources of genetic material. He explains the common “dilemma in science” whereby

doing all the analyses and experiments necessary to tell the complete story leaves you vulnerable to being beaten to the press by those willing to publish a less complete story that nevertheless makes the major point you wanted to make. Even when you publish a better paper, you are seen as mopping up the details after someone who made the real breakthrough. (115)

The more serious challenge for Pääbo, however, was dialing back extravagant expectations on the part of prospective funders against the backdrop of popular notions propagated by the Jurassic Park movie franchise and extraordinary claims from scientists who should’ve known better. He writes,

As we were painstakingly developing methods to detect and eliminate contamination, we were frustrated by flashy publications in Nature and Science whose authors, on the surface of things, were much more successful than we were and whose accomplishments dwarfed the scant products of our cumbersome efforts to retrieve DNA sequences “only” a few tens of thousands of years old. The trend had begun in 1990, when I was still at Berkeley. Scientists at UC Irvine published a DNA sequence from leaves of Magnolia latahensis that had been found in a Miocene deposit in Clarkia, Idaho, and were 17 million years old. This was a breathtaking achievement, seeming to suggest that one could study DNA evolution on a time scale of millions of years, perhaps even going back to the dinosaurs! (56)

            In the tradition of the best scientists, Pääbo didn’t simply retreat to his own projects to await the inevitable retractions and failed replications but instead set out to apply his own more meticulous extraction methods to the fossilized plant material. He writes,

I collected many of these leaves and brought them back with me to Munich. In my new lab, I tried extracting DNA from the leaves and found they contained many long DNA fragments. But I could amplify no plant DNA by PCR. Suspecting that the long DNA was from bacteria, I tried primers for bacterial DNA instead, and was immediately successful. Obviously, bacteria had been growing in the clay. The only reasonable explanation was that the Irvine group, who worked on plant genes and did not use a separate “clean lab” for their ancient work, had amplified some contaminating DNA and thought it came from the fossil leaves. (57)

With the right equipment, it turns out, you can extract and sequence genetic material from pretty much any kind of organic remains, no matter how old. The problem is that sources of contamination are myriad, and whatever DNA you manage to read is almost sure to be from something other than the ancient creature you’re interested in.

            At the time when Pääbo was busy honing his techniques, many scientists thought genetic material from ancient plants and insects might be preserved in the fossilized tree resin known as amber. Sure enough, in the late 80s and early 90s, George Poinar and Raul Cano published a series of articles in which they claimed to have successfully extracted DNA through tiny holes drilled into chunks of amber to reach embedded bugs and leaves. These articles were in fact the inspiration behind Michael Crichton’s description of how the dinosaurs in Jurassic Park were cloned. But Pääbo had doubts about whether these researchers were taking proper precautions to rule out contamination, and no sooner had he heard about their findings than he started trying to find a way to get his hands on some amber specimens. He writes,

The opportunity to find out came in 1994, when Hendrik Poinar joined our lab. Hendrik was a jovial Californian and the son of George Poinar, then a professor at Berkeley and a well-respected expert on amber and the creatures found in it. Hendrik had published some of the amber DNA sequences with Raul Cano, and his father had access to the best amber in the world. Hendrik came to Munich and went to work in our new clean room. But he could not repeat what had been done in San Luis Obispo. In fact, as long as his blank extracts were clean, he got no DNA sequences at all out of the amber—regardless of whether he tried insects or plants. I grew more and more skeptical, and I was in good company. (58)

Those blank extracts were important not just to test for bacteria in the samples but to check for human cells as well. Indeed, one of the special challenges of isolating Neanderthal DNA is that it looks so much like the DNA of the anatomically modern humans handling the samples and the sequencing machines.

A high percentage of the dust that accumulates in houses is made up of our sloughed-off skin cells. And the polymerase chain reaction (PCR), the technique Pääbo’s team was using to increase the amount of target DNA, relies on a powerful amplification process that uses rapid heating and cooling to split the double helix up the middle before fitting free nucleotides along each exposed strand like the teeth of a zipper, resulting in exponential replication. The result is that each fragment of a genome gets blown up, and it becomes impossible to tell what percentage of the specimen’s DNA it originally represented. Researchers then try to fit the fragments end-to-end based on repeating overlaps until they have an entire strand. If there’s a great deal of similarity between the individual you’re trying to sequence and the individual whose cells have contaminated the sample, you simply have no way to know which individual’s genome any given fragment came from. Much of the early work Pääbo did was with extinct mammals like giant ground sloths, which were easier to disentangle from humans. These early studies were what led to the development of practices like running blank extracts, which would later help his team ensure that their supposed Neanderthal DNA wasn’t really from modern human dust.

Despite all the claims of million-year-old DNA being publicized, Pääbo and his team eventually had to rein in their frustration and stop “playing the PCR police” (61) if they ever wanted to show their techniques could be applied to an ancient species of human. One of the major events in Pääbo’s life that would make this huge accomplishment a reality was the founding of the Max Planck Institute for Evolutionary Anthropology in 1997. As celebrated as the Max Planck Society is today, though, the idea of an institute devoted to scientific anthropology in Germany at the time had to overcome some resistance arising out of fears that history might repeat itself. Pääbo explains,

As do many contemporary German institutions, the MPS had a predecessor before the war. Its name was the Kaiser Wilhelm Society, and it was founded in 1911. The Kaiser Wilhelm Society had built up and supported institutes around eminent scientists such as Otto Hahn, Albert Einstein, Max Planck, and Werner Heisenberg, scientific giants active at a time when Germany was a scientifically dominant nation. That era came to an abrupt end when Hitler rose to power and the Nazis ousted many of the best scientists because they were Jewish. Although formally independent of the government, the Kaiser Wilhelm Society became part of the German war machine—doing, for example, weapons research. This was not surprising. Even worse was that through its Institute for Anthropology, Human Heredity, and Eugenics the Kaiser Wilhelm Society was actively involved in racial science and the crimes that grew out of that. In that institute, based in Berlin, people like Josef Mengele were scientific assistants while performing experiments on inmates at Auschwitz death camp, many of them children. (81-2)

Even without such direct historical connections, many scholars still automatically leap from any mention of anthropology or genetics to dubious efforts to give the imprimatur of science to racial hierarchies and to clear the way for atrocities like eugenic culling or sterilization, even though no scientist in any field would have any truck with such ideas and policies after the lessons of the past century.

            Pääbo not only believed that anthropological science could be conducted without repeating the atrocities of the past; he insisted that allowing history to rule real science out of bounds would effectively defeat the purpose of the entire endeavor of establishing an organization for the study of human origins. Called on as a consultant to help steer a course for the institute he was simultaneously being recruited to work for, Pääbo recalls responding to the administrators’ historical concerns,

Perhaps it was easier for me as a non-German born well after the war to have a relaxed attitude toward this. I felt that more than fifty years after the war, Germany could not allow itself to be inhibited in its scientific endeavors by its past crimes. We should neither forget history nor fail to learn from it, but we should also not be afraid to go forward. I think I even said that fifty years after his death, Hitler should not be allowed to dictate what we could or could not do. I stressed that in my opinion any new institute devoted to anthropology should not be a place where one philosophized about human history. It should do empirical science. Scientists who were to work there should collect real hard facts about human history and test their ideas against them. (82-3)

As it turned out, Pääbo wasn’t alone in his convictions, and his vision of what the institute should be and how it should operate came to fruition with the construction of the research facility in Leipzig.

            Faced with Pääbo’s passionate enthusiasm, some may worry that he’s one of those mad scientists we know from movies and books, willing to push ahead with his obsessions regardless of the moral implications or the societal impacts. But in fact Pääbo goes a long way toward showing that the popular conception of the socially oblivious scientist, who calculates but can’t think and who solves puzzles but is baffled by human emotions, is not just a caricature but a malicious fiction. For instance, even amid the excitement of his team’s discovery that humans reproduced with Neanderthals, Pääbo was keenly aware that his results revealed stark genetic differences between Africans, who have no Neanderthal DNA, and non-Africans, most of whose genomes are between one and four percent Neanderthal. He writes,

When we had come this far in our analyses, I began to worry about what the social implications of our findings might be. Of course, scientists need to communicate the truth to the public, but I feel that they should do so in ways that minimize the chance for it to be misused. This is especially the case when it comes to human history and human genetic variation, when we need to ask ourselves: Do our findings feed into prejudices that exist in society? Can our findings be misrepresented to serve racists’ purposes? Can they be deliberately or unintentionally misused in some other way? (199-200)

In light of the Neanderthals’ own caricature—hunched, brutish, dimwitted—their contribution to non-Africans’ genetic makeup may actually seem like more of a drawback than a basis for any claims of superiority. The trouble would come, however, if some of these genes turned out to confer adaptive advantages that made their persistence in our lineage more likely. There are already some indications, for instance, that Neanderthal-human hybrids had more robust immune responses to certain diseases. And the potential for further discoveries along these lines is vast. Neanderthal Man explores the personal and political dimensions of a major scientific undertaking, but it’s Pääbo’s remembrances of what it was like to work with the other members of his team that bring us closest to the essence of what science is—or at least what it can be. At several points along the team’s journey, setbacks and technical challenges threatened to sink the entire endeavor. Pääbo describes one especially dire juncture when everyone put their heads together in weekly meetings to try to come up with solutions and assign tasks:

To me, these meetings were absorbing social and intellectual experiences: graduate students and postdocs know that their careers depend on the results they achieve and the papers they publish, so there is always a certain amount of jockeying for opportunity to do the key experiments and to avoid doing those that may serve the group’s aim but will probably not result in prominent authorship on an important publication. I had become used to the idea that budding scientists were largely driven by self-interest, and I recognized that my function was to strike a balance between what was good for someone’s career and what was necessary for a project, weighing individual abilities in this regard. As the Neanderthal crisis loomed over the group, however, I was amazed to see how readily the self-centered dynamic gave way to a more group-centered one. The group was functioning as a unit, with everyone eagerly volunteering for thankless and laborious chores that would advance the project regardless of whether such chores would bring any personal glory. There was a strong sense of common purpose in what all felt was a historic endeavor. I felt we had the perfect team. In my more sentimental moments, I felt love for each and every person around the table. This made the feeling that we’d achieved no progress all the more bitter. (146-7)

Those “more sentimental moments” of Pääbo’s occur quite frequently, and he just as frequently describes his colleagues, and even his rivals, in a way that reveals his fondness and admiration for them. Unlike James Watson, who in The Double Helix, his memoir of how he and Francis Crick discovered the underlying structure of DNA, often comes across as nasty and condescending, Pääbo reveals himself to be bighearted, almost to a fault.

            Alongside the passion and the drive, we see Pääbo again and again pausing to reflect with childlike wonder at the dizzying advancement of technology and the incredible privilege of being able to carry on such a transformative tradition of discovery and human progress. He shows at once the humility of recognizing his own limitations and the restless curiosity that propels him onward in spite of them. He writes,

My twenty-five years in molecular biology had essentially been a continuous technical revolution. I had seen DNA sequencing machines come on the market that rendered into an overnight task the toils that took me days and weeks as a graduate student. I had seen cumbersome cloning of DNA in bacteria be replaced by the PCR, which in hours achieved what had earlier taken weeks or months to do. Perhaps that was what had led me to think that within a year or two we would be able to sequence three thousand times more DNA than what we had presented in the proof-of-principle paper in Nature. Then again, why wouldn’t the technological revolution continue? I had learned over the years that unless a person was very, very smart, breakthroughs were best sought when coupled to big improvements in technologies. But that didn’t mean we were simply prisoners awaiting rescue by the next technical revolution. (143)

Like the other members of his team, and like so many other giants in the history of science, Pääbo demonstrates an important and rare mix of seemingly contradictory traits: a capacity for dogged, often mind-numbing meticulousness and a proclivity toward boundless flights of imagination.

What has been the impact of Pääbo and his team’s accomplishments so far? Their methods have already been applied to the remains of a 400,000-year-old human ancestor, led to the discovery of a completely new species of hominin, the Denisovans (based on a tiny finger bone), and are helping settle a longstanding debate about the peopling of the Americas. The out-of-Africa hypothesis is, for now, the clear victor over the multiregionalist hypothesis, but of course the single-origin theory has become more complicated. Many paleoanthropologists are now talking about what Pääbo calls the “leaky replacement” model (248). Aside from filling in some of the many gaps in the chronology of humankind’s origins and migrations—or rather fitting together more pieces in the vast mosaic of our species’ history—every new genome helps us to triangulate possible functions for specific genes. As Pääbo explains, “The dirty little secret of genomics is that we still know next to nothing about how a genome translates into the particularities of a living and breathing individual” (208). But knowing the particulars of how human genomes differ from chimp genomes, and how both differ from the genomes of Neanderthals, or Denisovans, or any number of living or extinct species of primates, gives us clues about how those differences contribute to making each of us who and what we are. The Neanderthal genome is not an end-point but rather a link in a chain of discoveries. Nonetheless, we owe Svante Pääbo a debt of gratitude for helping us to appreciate what all went into the forging of this particular, particularly extraordinary link.

Also read: 

“THE WORLD UNTIL YESTERDAY” AND THE GREAT ANTHROPOLOGY DIVIDE: WADE DAVIS’S AND JAMES C. SCOTT’S BIZARRE AND DISHONEST REVIEWS OF JARED DIAMOND’S WORK

And: 

THE FEMINIST SOCIOBIOLOGIST: AN APPRECIATION OF SARAH BLAFFER HRDY DISGUISED AS A REVIEW OF “MOTHERS AND OTHERS: THE EVOLUTIONARY ORIGINS OF MUTUAL UNDERSTANDING”

And: 

OLIVER SACKS’S GRAPHOPHILIA AND OTHER COMPENSATIONS FOR A LIFE LIVED “ON THE MOVE”

Read More
Dennis Junk Dennis Junk

The Time for Tales and A Tale for the Time Being

The pages of Ruth Ozeki’s novel A Tale for the Time Being are brimful with the joys and heartache, not just of fiction in general, but of literary fiction in particular, with the bite of reality that sinks deeper than that of the commercial variety, which sacrifices lasting impact for immediate thrills—this despite the Gothic and science fiction elements Ozeki judiciously sprinkles throughout her chapters.

Storytelling in the twenty-first century is a tricky business. People are faced with too many real-world concerns to be genuinely open to the possibility of caring what happens to imaginary characters in a made-up universe. Some of us even feel a certain level of dread as we read reviews or lists of the best books of any given year, loath to add yet another modern classic to the towering mental stack we resolve to read. Yet we’re hardly ever as grateful to anyone as we are to an author or filmmaker who seduces us with a really good story. No sooner have we closed a novel whose characters captured our heart and whose plights succeeded in making us forget our own daily anxieties, however briefly, than we feel compelled to proselytize on behalf of that story to anyone whose politeness we can exploit long enough to deliver the message. The enchantment catches us so unawares, and strikes us as so difficult to account for according to any reckoning of daily utility, that it feels perfectly natural to relay our literary experiences in religious language. But, as long as we devote sufficient attention to an expertly wrought, authentically heartfelt narrative, even those of us with the most robust commitment to the secular will find ourselves succumbing to the magic.

The pages of Ruth Ozeki’s novel A Tale for the Time Being are brimful with the joys and heartache, not just of fiction in general, but of literary fiction in particular, with the bite of reality that sinks deeper than that of the commercial variety, which sacrifices lasting impact for immediate thrills—this despite the Gothic and science fiction elements Ozeki judiciously sprinkles throughout her chapters. The story is about a woman who becomes obsessed with a story despite herself because she can’t help worrying about the fate of the teenage girl who wrote it. This woman, a novelist who curiously bears the same first name as the novel’s author, only reluctantly brings a barnacle-covered freezer bag concealing a Hello Kitty lunchbox to her home after discovering it washed up on a beach off the coast of British Columbia. And it’s her husband Oliver—named after Ozeki’s real-life husband—who brings the bag into the kitchen so he can examine the contents, over her protests. Inside the lunchbox, Oliver finds a stack of letters written in Japanese and dating to the Second World War. With them, there is a diary written in French, and what at first appears to be a copy of Proust’s In Search of Lost Time but turns out to be another diary, with pages in the handwritten English of a teenage Japanese girl named Nao, pronounced much like the word “now.”

Ruth begins reading the diary, sometimes by herself, sometimes to Oliver before they go to bed. Nao addresses her reader very personally, and delights in the idea that she may be sharing her story with a single individual. As she poses one question after another to this lone reader—who we know has turned out to be Ruth—Nao’s loneliness, her desperate need to unburden herself, wraps itself around Ruth and readers of Ozeki’s novel alike. In the early passages, we learn that Nao’s original plan for the pages under Proust’s cover was to write a biography of her 104-year-old great grandmother Jiko, a Buddhist nun living in a temple in the Miyagi Prefecture, just north of the Fukushima nuclear power plant, and then toss it into the waves of the Pacific for a single beachcomber to find. Contemplating the significance of her personal gesture toward some anonymous future reader, she writes,

If you ask me, it’s fantastically cool and beautiful. It’s like a message in a bottle, cast out onto the ocean of time and space. Totally personal, and real, too, right out of old Jiko’s and Marcel’s prewired world. It’s the opposite of a blog. It’s an antiblog, because it’s meant for only one special person, and that person is you. And if you’ve read this far, you probably understand what I mean. Do you understand? Do you feel special yet? (26)

Nao does manage to write a lot about her own experiences with her great grandmother, and we learn a bit about Jiko’s life before Nao was born. But for the most part Nao’s present troubles override her intentions to write about someone else’s life. Meanwhile, the character Ruth is trying to write a memoir about nursing her mother, who has recently died after a long struggle with Alzheimer’s, but she’s also getting distracted by Nao’s tribulations, even to the point of conducting a search for the girl based on whatever telling details she comes across in the diary.

            All great stories pulse with resonances from great stories past, and A Tale for the Time Being, whether as a result of human commonalities, direct influence, or cultural osmosis, reverberates with the tones and themes of Catcher in the Rye, Donnie Darko, The Karate Kid, My Girl, and the first third of Roberto Bolaño’s 2666. The characters in Ozeki’s multiple embedded and individually shared narratives fascinate themselves with, even as they reluctantly succumb to, the mystery of communion between storyteller and audience. But, since the religion—or perhaps rather the philosophy—that suffuses the characters’ lives is not Christian but Buddhist, it’s only fitting that the focus is on how narrative time flows according to a type of eternal present, always waking us up to the present moment. The surprise in store for readers of the novel, though, is that narrative communion—the bond of compassion between narrators and audiences—is intricately tied to our experience of time. This connection comes through in the double meaning of the phrase “for the time being,” which conveys that whatever you’re discussing, though not ideal, is sufficient for now, until something more suitable comes along. But it can also refer to a being existing in time—hence to all beings. As Nao explains in the early pages of her diary,

A time being is someone who lives in time, and that means you, and me, and every one of us who is, or was, or ever will be. As for me, right now I am sitting in a French maid café in Akiba Electricity Town, listening to a sad chanson that is playing sometime in your past, which is also my present, writing this and wondering about you, somewhere in my future. And if you’re reading this, then maybe by now you’re wondering about me, too. (3)

Time is what connects every being—and, as Jiko insists, every object—that moves through it, moment by moment. Further, as Nao demonstrates, the act of narrating, in collusion with a separate act of attending, renders any separation in time somewhat moot.

            Revealing that the contrapuntal narration of A Tale for the Time Being represents the flowering of a relationship between two women who never even meet, whose destinies, it is tantalizingly suggested, may not even be unfolding in the same universe, will likely do nothing to blunt the impact of the story, for the same reasons the characters continually find themselves contemplating. Ruth, once she’s fully engaged in Nao’s story, becomes frustrated at how hard it is “to get a sense from the diary of the texture of time passing.” She wants so much to understand what Nao was going through—what she is going through in the story—but, as she explains, “No writer, even the most proficient, could re-enact in words the flow of a life lived, and Nao was hardly that skillful” (64). We may chuckle at how Ozeki is covering her tracks with this line about her prodigiously articulate sixteen-year-old protagonist. But she’s also showing just how powerful Ruth’s urge is to keep up her end of the connection. This paradox of urgent inconsequence is introduced by Nao on the first page of her diary. After posing a series of questions about the reader, she writes,  

Actually, it doesn’t matter very much, because by the time you read this, everything will be different, and you will be nowhere in particular, flipping idly through the pages of this book, which happens to be the diary of my last days on earth, wondering if you should keep on reading. (3)

In an attempt to mimic the flow of time as Nao experienced it, Ruth paces her reading of the diary to correspond with the spans between entries. All the while, though, she’s desperate to find out what happened—what happens—because she’s worried that Nao may have killed herself, or may still be on the verge of doing so.

            Nao wrote her diary in English because she spent most of her life in Sunnyvale, California, where her father Haruki worked for a tech firm. After he lost his job for what Nao assumes are economic reasons, he moved with her and her mother back to Tokyo. Nao starts having difficulties at school right away, but she’s reluctant to transfer to what she calls a stupid kids’ school because, as she explains,

It’s probably been a while since you were in junior high school, but if you can remember the poor loser foreign kid who entered your eighth-grade class halfway through the year, then maybe you will feel some sympathy for me. I was totally clueless about how you’re supposed to act in a Japanese classroom, and my Japanese sucked, and at the time I was almost fifteen and older than the other kids and big for my age, too, from eating so much American food. Also, we were broke so I didn’t have an allowance or any nice stuff, so basically I got tortured. In Japan they call it ijime, but that word doesn’t begin to describe what the kids used to do to me. I would probably already be dead if Jiko hadn’t taught me how to develop my superpower. Ijime is why it’s not an option for me to go to a stupid kids’ school, because in my experience, stupid kids can be even meaner than smart kids because they don’t have as much to lose. School just isn’t safe. (44)

The types of cruelty Nao is subjected to are insanely creative, and one of her teachers even participates so he can curry favor with the popular students. They pretend she’s invisible and only detectable by a terrible odor she exudes—and she perversely begins to believe she really does give off a tell-tale scent of poverty. They pinch her and pull out strands of her hair. At one point, they even stage an elaborate funeral for her and post a video of it online. Curiously, though, when Ruth searches for the title, nothing comes up.

            Unfortunately, the extent of Nao’s troubles stretches beyond the walls of the school. Her father, whom she and her mother believed to have been hired for a new job, tries to kill himself by lying down in front of a train. But the engineer sees him in time to stop, and all he manages to do is incur a fine. Once Nao’s mother brings him home, he reveals the new job was a lie, that he’s been earning money gambling, and that now he’s lost nearly every penny. One of the few things Ruth finds in her internet searches is an essay by Nao’s father about why he and other Japanese men are attracted to the idea of suicide. She decides to email the professor who posted this essay, claiming that it’s “urgent” for her to find the author and his daughter. When Oliver reminds Ruth that all of this occurred some distance in the past, she’s dumbfounded. As the third-person narrator of the Ruth chapters explains,

It wasn't that she'd forgotten, exactly. The problem was more a kind of slippage. When she was writing a novel, living deep inside a fictional world, the days got jumbled together, and entire weeks or months or even years would yield to the ebb and flow of the dream. Bills went unpaid, emails unanswered, calls unreturned. Fiction had its own time and logic. That was its power. But the email she’d just written to the professor was not fiction. It was real, as real as the diary. (313-4)

Before ending up in that French maid café (the precise character of which is a bone of contention between Ruth and Oliver) trying to write Jiko’s life story, Nao had gone, at her parents’ prompting, to spend a summer with her great grandmother at the temple. And it’s here she learns to develop what Jiko calls her superpower. Of course, as she describes the lessons in her diary, Nao is also teaching to Ruth what Jiko taught to her. So Ozeki’s readers get the curious sense that not only is Ruth trying to care for Nao but Nao is caring for Ruth in return.

During her stay at the temple, Nao learns that Jiko became a nun after her son, Nao’s great uncle and her father’s namesake, died in World War II. It is the letters he wrote home as he underwent pilot training that Oliver and Ruth find with Nao’s diary, and they eventually learn that the other diary, the one written in French, is his as well. Even though Nao only learns about the man she comes to call Haruki #1 from her great grandmother’s stories, along with the letters and a few ghostly visitations, she develops a type of bond with him, identifying with his struggles, admiring his courage. At the same time, she comes to sympathize with and admire Jiko all the more for how she overcame the grief of losing her son. “By the end of the summer, with Jiko’s help,” Nao writes,

I was getting stronger. Not just strong in my body, but strong in my mind. In my mind, I was becoming a superhero, like Jubei-chan, the Samurai Girl, only I was Nattchan, the Super Nun, with abilities bestowed upon me by Lord Buddha that included battling the waves, even if I always lost, and being able to withstand astonishing amounts of pain and hardship. Jiko was helping me cultivate my supapawa! by encouraging me to sit zazen for many hours without moving, and showing me how not to kill anything, not even the mosquitoes that buzzed around my face when I was sitting in the hondo at dusk or lying in bed at night. I learned not to swat them even when they bit me and also not to scratch the itch that followed. At first, when I woke up, my face and arms were swollen from the bites, but little by little, my blood and skin grew tough and immune to their poison and I didn’t break out in bumps no matter how much I’d been bitten. And soon there was no difference between me and the mosquitoes. My skin was no longer a wall that separated us, and my blood was their blood. I was pretty proud of myself, so I went and found Jiko and I told her. She smiled. (204)

Nao is learning to break through the walls that separate her from others by focusing her mind on simply being, but the theme Ozeki keeps bringing us back to in her novel is that it’s not only through meditation that we open our minds compassionately to others.

            Despite her new powers, though, Nao’s problems at home and at school only get worse when she returns from her sojourn at the temple. And, as the urgency of her story increases, a long-simmering tension between Ruth and Oliver boils over into spats and insults. The added strain stems not so much from their different responses to Nao’s story as it does from the way the narration resonates with many of Ruth’s own feelings and experiences, making her more aware of all that’s going on in her own mind. When Nao explains how she came to be writing in a blanked out copy of In Search of Lost Time, for instance, she describes how the woman who makes the diaries “does it so authentically you don’t even notice the hack, and you almost think that the letters just slipped off the pages and fell to the floor like a pile of dead ants” (20). Later, Ruth reflects on how,

When she was little, she was always surprised to pick up a book in the morning, and open it, and find the letters aligned neatly in their places. Somehow she expected them to be all jumbled up, having fallen to the bottom when the covers were shut. Nao had described something similar, seeing the blank pages of Proust and wondering if the letters had fallen off like dead ants. When Ruth had read this, she’d felt a jolt of recognition. (63)

The real source of the trouble, though, is Nao’s description of her father as a hikikomori, which Ruth defines in one of the footnotes that constantly remind readers of her presence in Nao’s story as a “recluse, a person who refuses to leave the house” (70). Having agreed to retreat to the sparsely populated Whaletown, on the tiny island of Cortes, to care for her mother and support Oliver as he undertook a series of eminently eccentric botanical projects, part art and part ecological experimentation—the latest of which he calls the “Neo-Eocene”—Ruth is now beginning to long for the bustling crowds and hectic conveniences of urban civilization: “Engulfed by the thorny roses and massing bamboo, she stared out the window and felt like she’d stepped into a malevolent fairy tale” (61).

A Tale for the Time Being is so achingly alive that the characters’ thoughts and words lift up off the pages as if borne aloft on clouds of their own breath. Ironically, though, if there’s one character who suffers from overscripting it’s the fictional counterpart to Ozeki’s real-life husband. Oliver’s role is to see what Ruth cannot, moving the plot forward with strained scientific explanations of topics ranging from Pacific gyres, to Linnaean nomenclature, to quantum computing, raising the suspicion that he’s serving as the personification of the author’s own meticulous research habits. Under the guise of the masculine tendency to launch into spontaneous lectures—which, to be fair, Ozeki has at least one female character manifest as well—Oliver at times transforms from a living character into a scholarly mouthpiece, performing a parody of professorial pedantry. For the most part, though, the philosophical ruminations in the novel emanate so naturally from the characters and are woven so seamlessly into the narrative that you can’t help following along with the characters' thoughts, as if taking part in a conversation. When Nao writes about discovering through a Google search that “À la recherche du temps perdu” means “In search of lost time,” she encourages her reader to contemplate what that means along with her:

Weird, right? I mean, there I was, sitting in a French maid café in Akiba, thinking about lost time, and old Marcel Proust was sitting in France a hundred years ago, writing a whole book about the exact same subject. So maybe his ghost was lingering between the covers and hacking into my mind, or maybe it was just a crazy coincidence, but either way, how cool is that? I think coincidences are cool, even if they don’t mean anything, and who knows? Maybe they do! I’m not saying everything happens for a reason. It was more just that it felt as if me and old Marcel were on the same wavelength. (23)

Waves and ghosts and crazy coincidences make up some of the central themes of the novel, but underlying them all is the spooky connectedness we humans so readily feel with one another, even against the backdrop of our capacity for the most savage cruelty.

A Tale for the Time Being balances its experimental devices with time-honored storytelling techniques. Its central conceits are emblematic of today’s reigning fictional aesthetic, which embodies a playful exploration of the infinite possibilities of all things meta—stories about stories, narratives pondering the nature of narrative, authors becoming characters, characters becoming authors, fiction that’s indistinguishable from nonfiction and vice versa. We might call this tradition jootsism, after cognitive scientist Douglas Hofstadter’s term jootsing, or jumping out of the system, which he theorizes is critical to the emergence of consciousness from physical substrates, whether biochemical or digital. Some works in this tradition fatally undermine the systems they jump out of, revealing that the story readers are emotionally invested in is a hoax. Indeed, one strain of literary theory that’s been highly influential over the past four decades equates taking pleasure from stories with indulging in the reaffirmation of prejudices against minorities. Authors in this school therefore endeavor to cast readers out of their own narratives as punishment for their complicity in societal oppression. Fortunately, the types of gimmickry common to this brand of postmodernism—deliberately obnoxious and opaquely nonsensical neon-lit syntactical acrobatics—appear to have drastically declined in popularity, though the urge toward guilt-tripping readers and the obsession with the most harebrained and pusillanimous forms of identity politics persist. There are hidden messages in all media, we’re taught to believe, and those messages are the lifeblood of all that’s evil in our civilization.

But even those of us who don’t subscribe to the ideology that sees all narratives as sinister political allegories are still looking to be challenged, and perhaps even enlightened, every time we pick up a new book. If there is a set of conventions realist literary fiction is trying to break itself free of, it’s got to be the standard lineup of what masquerade as epiphanies but are really little more than varieties of passive acceptance or world-weary acquiescence in the face of life’s inexorables: disappointment, death, the delimiting of opportunity, the dimming of beauty, the diminishing of faculties, the desperate need for love, the absence of any ideal candidate for love. The postmodern wing of the avant-garde offers nothing in place of these old standbys but meaningless antics and sadomasochistic withholdings of the natural pleasure humans derive from sharing stories. The theme that emerges from all the jootsing in A Tale for the Time Being, by contrast, a theme that paradoxically produces its own proof, is that the pleasure we get from stories comes from a kind of magic—the superpower of the storyteller, with the generous complicity of the reader—and that this is the same magic that forms the bonds between us and our loved ones. In a world where teenagers subject each other to unspeakable cruelty, where battling nations grind bodies by the boatload into lifeless, nameless mush, there is still solace to be had, and hope, in the daily miracle of how easily we’re made to feel the profoundest sympathy for people we never even meet, simply by experiencing their stories.

Also read:

WHAT MAKES "WOLF HALL" SO GREAT?

And:

MUDDLING THROUGH "LIFE AFTER LIFE": A REFLECTION ON PLOT AND CHARACTER IN KATE ATKINSON’S NEW NOVEL

And:

REBECCA MEAD’S MIDDLEMARCH PILGRIMAGE AND THE 3 WRONG WAYS TO READ A NOVEL

Read More
Dennis Junk Dennis Junk

Lab Flies: Joshua Greene’s Moral Tribes and the Contamination of Walter White

Joshua Greene’s book “Moral Tribes” posits a dual-system theory of morality, where a quick, intuitive system 1 makes judgments based on deontological considerations—“it’s just wrong”—whereas the slower, more deliberative system 2 takes time to calculate the consequences of any given choice. Audiences can see these two systems on display in the series “Breaking Bad,” as well as in critics’ and audiences’ responses.

Walter White’s Moral Math

In an episode near the end of Breaking Bad’s fourth season, the drug kingpin Gus Fring gives his meth cook Walter White an ultimatum. Walt’s brother-in-law Hank is a DEA agent who has been getting close to discovering the high-tech lab Gus has created for Walt and his partner Jesse, and Walt, despite his best efforts, hasn’t managed to put him off the trail. Gus decides that Walt himself has likewise become too big a liability, and he has found that Jesse can cook almost as well as his mentor. The only problem for Gus is that Jesse, even though he too is fed up with Walt, will refuse to cook if anything happens to his old partner. So Gus has Walt taken at gunpoint to the desert where he tells him to stay away from both the lab and Jesse. Walt, infuriated, goads Gus with the fact that he’s failed to turn Jesse against him completely, to which Gus responds, “For now,” before going on to say,

In the meantime, there’s the matter of your brother-in-law. He is a problem you promised to resolve. You have failed. Now it’s left to me to deal with him. If you try to interfere, this becomes a much simpler matter. I will kill your wife. I will kill your son. I will kill your infant daughter.

In other words, Gus tells Walt to stand by and let Hank be killed or else he will kill his wife and kids. Once he’s released, Walt immediately has his lawyer Saul Goodman place an anonymous call to the DEA to warn them that Hank is in danger. Afterward, Walt plans to pay a man to help his family escape to a new location with new, untraceable identities—but he soon discovers the money he was going to use to pay the man has already been spent (by his wife Skyler). Now it seems all five of them are doomed. This is when things get really interesting.

      Walt devises an elaborate plan to convince Jesse to help him kill Gus. Jesse knows that Gus would prefer for Walt to be dead, and both Walt and Gus know that Jesse would go berserk if anyone ever tried to hurt his girlfriend’s nine-year-old son Brock. Walt’s plan is to make it look like Gus is trying to frame him for poisoning Brock with ricin. The idea is that Jesse would suspect Walt of trying to kill Brock as punishment for Jesse betraying him and going to work with Gus. But Walt will convince Jesse that this is really just Gus’s ploy to trick Jesse into doing what Jesse has forbidden Gus to do up till now—kill Walt himself. Once Jesse concludes that it was Gus who poisoned Brock, he will understand that his new boss has to go, and he will accept Walt’s offer to help him perform the deed. Walt will then be able to get Jesse to give him the crucial information he needs about Gus to figure out a way to kill him.

It’s a brilliant plan. The one problem is that it involves poisoning a nine-year-old child. Walt comes up with an ingenious trick which allows him to use a less deadly poison while still making it look like Brock has ingested the ricin, but for the plan to work the boy has to be made deathly ill. So Walt is faced with a dilemma: if he goes through with his plan, he can save Hank, his wife, and his two kids, but to do so he has to deceive his old partner Jesse in just about the most underhanded way imaginable—and he has to make a young boy very sick by poisoning him, with the attendant risk that something will go wrong and the boy, or someone else, or everyone else, will die anyway. The math seems easy: either four people die, or one person gets sick. The option recommended by the math is greatly complicated, however, by the fact that it involves an act of violence against an innocent child.

In the end, Walt chooses to go through with his plan, and it works perfectly. In another ingenious move, though, this time on the part of the show’s writers, Walt’s deception isn’t revealed until after his plan has been successfully implemented, which makes for an unforgettable shock at the end of the season. Unfortunately, this revelation after the fact, at a time when Walt and his family are finally safe, makes it all too easy to forget what made the dilemma so difficult in the first place—and thus makes it all too easy to condemn Walt for resorting to such monstrous means to see his way through.

            Fans of Breaking Bad who read about the famous thought-experiment called the footbridge dilemma in Harvard psychologist Joshua Greene’s multidisciplinary and momentously important book Moral Tribes: Emotion, Reason, and the Gap between Us and Them will immediately recognize the conflicting feelings underlying our responses to questions about serving some greater good by committing an act of violence. Here is how Greene describes the dilemma:

A runaway trolley is headed for five railway workmen who will be killed if it proceeds on its present course. You are standing on a footbridge spanning the tracks, in between the oncoming trolley and the five people. Next to you is a railway workman wearing a large backpack. The only way to save the five people is to push this man off the footbridge and onto the tracks below. The man will die as a result, but his body and backpack will stop the trolley from reaching the others. (You can’t jump yourself because you, without a backpack, are not big enough to stop the trolley, and there’s no time to put one on.) Is it morally acceptable to save the five people by pushing this stranger to his death? (113-4)

As was the case for Walter White when he faced his child-poisoning dilemma, the math is easy: you can save five people—strangers in this case—through a single act of violence. One of the fascinating things about common responses to the footbridge dilemma, though, is that the math is all but irrelevant to most of us; no matter how many people we might save, it’s hard for us to see past the murderous deed of pushing the man off the bridge. The answer for a large majority of people faced with this dilemma, even in the case of variations which put the number of people who would be saved much higher than five, is no, pushing the stranger to his death is not morally acceptable.

Another fascinating aspect of our responses is that they change drastically with the modification of a single detail in the hypothetical scenario. In the switch dilemma, a trolley is heading for five people again, but this time you can hit a switch to shift it onto another track where there happens to be a single person who would be killed. Though the math and the underlying logic are the same—you save five people by killing one—something about pushing a person off a bridge strikes us as far worse than pulling a switch. A large majority of people say killing the one person in the switch dilemma is acceptable. To figure out which specific factors account for the different responses, Greene and his colleagues tweak various minor details of the trolley scenario before posing the dilemma to test participants. By now, so many experiments have relied on these scenarios that Greene calls trolley dilemmas the fruit flies of the emerging field known as moral psychology.

The Automatic and Manual Modes of Moral Thinking

One hypothesis for why the footbridge case strikes us as unacceptable is that it involves using a human being as an instrument, a means to an end. So Greene and his fellow trolleyologists devised a variation called the loop dilemma, which still has participants pulling a hypothetical switch, but this time the lone victim on the alternate track must stop the trolley from looping back around onto the original track. In other words, you’re still hitting the switch to save the five people, but you’re also using a human being as a trolley stop. People nonetheless tend to respond to the loop dilemma in much the same way they do the switch dilemma. So there must be some factor other than the prospect of using a person as an instrument that makes the footbridge version so objectionable to us.

Greene’s own theory for why our intuitive responses to these dilemmas are so different begins with what Daniel Kahneman, one of the founders of behavioral economics, labeled the two-system model of the mind. The first system, a sort of autopilot, is the one we operate in most of the time. We only use the second system when doing things that require conscious effort, like multiplying 26 by 47. While system one is quick and intuitive, system two is slow and demanding. Greene proposes as an analogy the automatic and manual settings on a camera. System one is point-and-click; system two, though more flexible, requires several calibrations and adjustments. We usually only engage our manual systems when faced with circumstances that are either completely new or particularly challenging.

According to Greene’s model, our automatic settings have functions that go beyond the rapid processing of information to encompass our basic set of moral emotions, from indignation to gratitude, from guilt to outrage, which motivate us to behave in ways that over evolutionary history have helped our ancestors transcend their selfish impulses to live in cooperative societies. Greene writes,

According to the dual-process theory, there is an automatic setting that sounds the emotional alarm in response to certain kinds of harmful actions, such as the action in the footbridge case. Then there’s manual mode, which by its nature tends to think in terms of costs and benefits. Manual mode looks at all of these cases—switch, footbridge, and loop—and says “Five for one? Sounds like a good deal.” Because manual mode always reaches the same conclusion in these five-for-one cases (“Good deal!”), the trend in judgment for each of these cases is ultimately determined by the automatic setting—that is, by whether the myopic module sounds the alarm. (233)

What makes the dilemmas difficult then is that we experience them in two conflicting ways. Most of us, most of the time, follow the dictates of the automatic setting, which Greene describes as myopic because its speed and efficiency come at the cost of inflexibility and limited scope for extenuating considerations.

The reason our intuitive settings sound an alarm at the thought of pushing a man off a bridge but remain silent about hitting a switch, Greene suggests, is that our ancestors evolved to live in cooperative groups where some means of preventing violence between members had to be in place to avoid dissolution—or outright implosion. One of the dangers of living with a bunch of upright-walking apes who possess the gift of foresight is that any one of them could at any time be plotting revenge for some seemingly minor slight, or conspiring to get you killed so he can move in on your spouse or inherit your belongings. For a group composed of individuals with the capacity to hold grudges and calculate future gains to function cohesively, the members must have in place some mechanism that affords a modicum of assurance that no one will murder them in their sleep. Greene writes,

To keep one’s violent behavior in check, it would help to have some kind of internal monitor, an alarm system that says “Don’t do that!” when one is contemplating an act of violence. Such an action-plan inspector would not necessarily object to all forms of violence. It might shut down, for example, when it’s time to defend oneself or attack one’s enemies. But it would, in general, make individuals very reluctant to physically harm one another, thus protecting individuals from retaliation and, perhaps, supporting cooperation at the group level. My hypothesis is that the myopic module is precisely this action-plan inspecting system, a device for keeping us from being casually violent. (226)

Hitting a switch to transfer a train from one track to another seems acceptable, even though a person ends up being killed, because nothing our ancestors would have recognized as violence is involved.

Many philosophers cite our different responses to the various trolley dilemmas as support for deontological systems of morality—those based on the inherent rightness or wrongness of certain actions—since we intuitively know the choices suggested by a consequentialist approach are immoral. But Greene points out that this argument begs the question of how reliable our intuitions really are. He writes,

I’ve called the footbridge dilemma a moral fruit fly, and that analogy is doubly appropriate because, if I’m right, this dilemma is also a moral pest. It’s a highly contrived situation in which a prototypically violent action is guaranteed (by stipulation) to promote the greater good. The lesson that philosophers have, for the most part, drawn from this dilemma is that it’s sometimes deeply wrong to promote the greater good. However, our understanding of the dual-process moral brain suggests a different lesson: Our moral intuitions are generally sensible, but not infallible. As a result, it’s all but guaranteed that we can dream up examples that exploit the cognitive inflexibility of our moral intuitions. It’s all but guaranteed that we can dream up a hypothetical action that is actually good but that seems terribly, horribly wrong because it pushes our moral buttons. I know of no better candidate for this honor than the footbridge dilemma. (251)

The obverse is that many of the things that seem morally acceptable to us actually do cause harm to people. Greene cites the example of a man who lets a child drown because he doesn’t want to ruin his expensive shoes, which most people agree is monstrous, even though we think nothing of spending money on things we don’t really need when we could be sending that money to save sick or starving children in some distant country. Then there are crimes against the environment, which always seem to rank low on our list of priorities even though their future impact on real human lives could be devastating. We have our justifications for such actions or omissions, to be sure, but how valid are they really? Is distance really a morally relevant factor when we let children die? Does the diffusion of responsibility among so many millions change the fact that we personally could have a measurable impact?

These black marks notwithstanding, cooperation, and even a certain degree of altruism, come naturally to us. To demonstrate this, Greene and his colleagues have devised some clever methods for separating test subjects’ intuitive responses from their more deliberate and effortful decisions. The experiments begin with a multi-participant exchange scenario developed by economic game theorists called the Public Goods Game, which has a number of anonymous players contribute to a common bank whose sum is then doubled and distributed evenly among them. Like the more famous game theory exchange known as the Prisoner’s Dilemma, the outcomes of the Public Goods Game reward cooperation, but only when a threshold number of fellow cooperators is reached. The flip side, however, is that any individual who decides to be stingy can get a free ride from everyone else’s contributions and make an even greater profit. What tends to happen is, over multiple rounds, the number of players opting for stinginess increases until the game is ruined for everyone, a process analogous to a phenomenon in economics known as the Tragedy of the Commons. Everyone wants to graze a few more sheep on the commons than can be sustained fairly, so eventually the grounds are left barren.
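The payoff structure that drives this dynamic is simple enough to sketch in a few lines of code. The specific numbers here (a four-player group, an endowment of 20, a doubling multiplier, and an optional costly fine) are illustrative assumptions of mine, not parameters from the actual experiments:

```python
def round_payoffs(contribs, endowment=20, multiplier=2.0,
                  punish=False, fine=12, cost=4):
    """Payoffs for one round of a Public Goods Game.

    Each player keeps (endowment - contribution) and receives an
    equal share of the multiplied common pot. With punish=True,
    every full contributor pays `cost` per free rider in order to
    levy a `fine` on each of them (a costly-punishment option).
    """
    pot = sum(contribs) * multiplier
    share = pot / len(contribs)
    payoffs = [endowment - c + share for c in contribs]
    if punish:
        cooperators = [i for i, c in enumerate(contribs) if c == endowment]
        free_riders = [i for i, c in enumerate(contribs) if c < endowment]
        for i in cooperators:
            payoffs[i] -= cost * len(free_riders)  # punishing is costly...
        for j in free_riders:
            payoffs[j] -= fine * len(cooperators)  # ...but the fines add up
    return payoffs

# Three full contributors and one free rider.
group = [20, 20, 20, 0]
print(round_payoffs(group))               # [30.0, 30.0, 30.0, 50.0]
print(round_payoffs(group, punish=True))  # [26.0, 26.0, 26.0, 14.0]
```

Without punishment the free rider out-earns everyone else (50 versus 30), so players imitating the most profitable strategy grow stingier round by round until all four end up worse off than if everyone had contributed (40 each). Once cooperators can fine free riders, cooperation becomes the best-paying strategy, which is the stabilizing effect of punishment that Greene describes.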

The Biological and Cultural Evolution of Morality

            Greene believes that humans evolved emotional mechanisms to prevent the various analogs of the Tragedy of the Commons from occurring so that we can live together harmoniously in tight-knit groups. The outcomes of multiple rounds of the Public Goods Game, for instance, tend to be far less dismal when players are given the opportunity to devote a portion of their own profits to punishing free riders. Most humans, it turns out, will be motivated by the emotion of anger to expend their own resources for the sake of enforcing fairness. Over several rounds, cooperation becomes the norm. Such an outcome has been replicated again and again, but researchers are always interested in factors that influence players’ strategies in the early rounds. Greene describes a series of experiments he conducted with David Rand and Martin Nowak, which were reported in an article in Nature in 2012. He writes,

…we conducted our own Public Goods Games, in which we forced some people to decide quickly (less than ten seconds) and forced others to decide slowly (more than ten seconds). As predicted, forcing people to decide faster made them more cooperative and forcing people to slow down made them less cooperative (more likely to free ride). In other experiments, we asked people, before playing the Public Goods Game, to write about a time in which their intuitions served them well, or about a time in which careful reasoning led them astray. Reflecting on the advantages of intuitive thinking (or the disadvantages of careful reflection) made people more cooperative. Likewise, reflecting on the advantages of careful reasoning (or the disadvantages of intuitive thinking) made people less cooperative. (62)

These results offer strong support for Greene’s dual-process theory of morality, and they even hint at the possibility that the manual mode is fundamentally selfish or amoral—in other words, that the philosophers have been right all along in deferring to human intuitions about right and wrong.

            As good as our intuitive moral sense is for preventing the Tragedy of the Commons, however, when given free rein in a society composed of large groups of people who are strangers to one another, each with its own culture and priorities, our natural moral settings bring about an altogether different tragedy. Greene labels it the Tragedy of Commonsense Morality. He explains,

Morality evolved to enable cooperation, but this conclusion comes with an important caveat. Biologically speaking, humans were designed for cooperation, but only with some people. Our moral brains evolved for cooperation within groups, and perhaps only within the context of personal relationships. Our moral brains did not evolve for cooperation between groups (at least not all groups). (23)

Expanding on the story behind the Tragedy of the Commons, Greene describes what would happen if several groups, each having developed its own unique solution for making sure the commons were protected from overgrazing, were suddenly to come into contact with one another on a transformed landscape called the New Pastures. Each group would likely harbor suspicions against the others, and when it came time to negotiate a new set of rules to govern everybody the groups would all show a significant, though largely unconscious, bias in favor of their own members and their own ways.

The origins of moral psychology as a field can be traced to both developmental and evolutionary psychology. Seminal research conducted at Yale’s Infant Cognition Center, led by Karen Wynn, Kiley Hamlin, and Paul Bloom (and which Bloom describes in a charming and highly accessible book called Just Babies), has demonstrated that children as young as six months possess what we can easily recognize as a rudimentary moral sense. These findings suggest that much of the behavior we might have previously ascribed to lessons learned from adults is actually innate. Experiments based on game theory scenarios and thought-experiments like the trolley dilemmas are likewise thought to tap into evolved patterns of human behavior. Yet when University of British Columbia psychologist Joseph Henrich teamed up with several anthropologists to see how people living in various small-scale societies responded to game theory scenarios like the Prisoner’s Dilemma and the Public Goods Game, they discovered a great deal of variation. On the one hand, then, human moral intuitions seem to be rooted in emotional responses present at, or at least close to, birth, but on the other hand cultures vary widely in their conventional responses to classic dilemmas. These differences between cultural conceptions of right and wrong are in large part responsible for the conflict Greene envisions in his Parable of the New Pastures.

But how can a moral sense be both innate and culturally variable?  “As you might expect,” Greene explains, “the way people play these games reflects the way they live.” People in some cultures rely much more heavily on cooperation to procure their sustenance, as is the case with the Lamelara of Indonesia, who live off the meat of whales they hunt in groups. Cultures also vary in how much they rely on market economies as opposed to less abstract and less formal modes of exchange. Just as people adjust the way they play economic games in response to other players’ moves, people acquire habits of cooperation based on the circumstances they face in their particular societies. Regarding the differences between small-scale societies in common game theory strategies, Greene writes,

Henrich and colleagues found that payoffs to cooperation and market integration explain more than two thirds of the variation across these cultures. A more recent study shows that, across societies, market integration is an excellent predictor of altruism in the Dictator Game. At the same time, many factors that you might expect to be important predictors of cooperative behavior—things like an individual’s sex, age, and relative wealth, or the amount of money at stake—have little predictive power. (72)

In much the same way humans are programmed to learn a language and acquire a set of religious beliefs, they also come into the world with a suite of emotional mechanisms that make up the raw material for what will become a culturally calibrated set of moral intuitions. The specific language and religion we end up with is of course dependent on the social context of our upbringing, just as our specific moral settings will reflect those of other people in the societies we grow up in.

Jonathan Haidt and Tribal Righteousness

In our modern industrial society, we actually have some degree of choice when it comes to our cultural affiliations, and this freedom opens the way for heritable differences between individuals to play a larger role in our moral development. Such differences are nowhere as apparent as in the realm of politics, where nearly all citizens occupy some point on a continuum between conservative and liberal. According to Greene’s fellow moral psychologist Jonathan Haidt, we have precious little control over our moral responses because, in his view, reason only comes into play to justify actions and judgments we’ve already made. In his fascinating 2012 book The Righteous Mind, Haidt insists,

Moral reasoning is part of our lifelong struggle to win friends and influence people. That’s why I say that “intuitions come first, strategic reasoning second.” You’ll misunderstand moral reasoning if you think about it as something people do by themselves in order to figure out the truth. (50)

To explain the moral divide between right and left, Haidt points to the findings of his own research on what he calls Moral Foundations, six dimensions underlying our intuitions about moral and immoral actions. Conservatives tend to endorse judgments based on all six of the foundations, valuing loyalty, authority, and sanctity much more than liberals, who focus more exclusively on care for the disadvantaged, fairness, and freedom from oppression. Since our politics emerge from our moral intuitions and reason merely serves as a sort of PR agent to rationalize judgments after the fact, Haidt enjoins us to be more accepting of rival political groups—after all, you can’t reason with them.

            Greene objects both to Haidt’s Moral Foundations theory and to his prescription for a politics of complementarity. The responses to questions representing all the moral dimensions in Haidt’s studies form two clusters on a graph, Greene points out, not six, suggesting that the differences between conservatives and liberals are attributable to some single overarching variable as opposed to several individual tendencies. Furthermore, the specific content of the questions Haidt uses to flesh out the values of his respondents has a critical limitation. Greene writes,

According to Haidt, American social conservatives place greater value on respect for authority, and that’s true in a sense. Social conservatives feel less comfortable slapping their fathers, even as a joke, and so on. But social conservatives do not respect authority in a general way. Rather, they have great respect for authorities recognized by their tribe (from the Christian God to various religious and political leaders to parents). American social conservatives are not especially respectful of Barack Hussein Obama, whose status as a native-born American, and thus a legitimate president, they have persistently challenged. (339)

The same limitation applies to the loyalty and sanctity foundations. Conservatives feel little loyalty toward the atheists and Muslims among their fellow Americans. Nor do they recognize the sanctity of mosques or Hindu holy texts. Greene goes on,

American social conservatives are not best described as people who place special value on authority, sanctity, and loyalty, but rather as tribal loyalists—loyal to their own authorities, their own religion, and themselves. This doesn’t make them evil, but it does make them parochial, tribal. In this they’re akin to the world’s other socially conservative tribes, from the Taliban in Afghanistan to European nationalists. According to Haidt, liberals should be more open to compromise with social conservatives. I disagree. In the short term, compromise may be necessary, but in the long term, our strategy should not be to compromise with tribal moralists, but rather to persuade them to be less tribalistic. (340)

Greene believes such persuasion is possible, even with regard to emotionally and morally charged controversies, because he sees our manual-mode thinking as playing a potentially much greater role than Haidt sees it playing.

Metamorality on the New Pastures

            Throughout The Righteous Mind, Haidt argues that the moral philosophers who laid the foundations of modern liberal and academic conceptions of right and wrong gave short shrift to emotions and intuitions—that they gave far too much credit to our capacity for reason. To be fair, Haidt does honor the distinction between descriptive and prescriptive theories of morality, but he nonetheless gives the impression that he considers liberal morality to be somewhat impoverished. Greene sees this attitude as thoroughly wrongheaded. Responding to Haidt’s metaphor comparing his Moral Foundations to taste buds—with the implication that the liberal palate is more limited in the range of flavors it can appreciate—Greene writes,

The great philosophers of the Enlightenment wrote at a time when the world was rapidly shrinking, forcing them to wonder whether their own laws, their own traditions, and their own God(s) were any better than anyone else’s. They wrote at a time when technology (e.g., ships) and consequent economic productivity (e.g., global trade) put wealth and power into the hands of a rising educated class, with incentives to question the traditional authorities of king and church. Finally, at this time, natural science was making the world comprehensible in secular terms, revealing universal natural laws and overturning ancient religious doctrines. Philosophers wondered whether there might also be universal moral laws, ones that, like Newton’s law of gravitation, applied to members of all tribes, whether or not they knew it. Thus, the Enlightenment philosophers were not arbitrarily shedding moral taste buds. They were looking for deeper, universal moral truths, and for good reason. They were looking for moral truths beyond the teachings of any particular religion and beyond the will of any earthly king. They were looking for what I’ve called a metamorality: a pan-tribal, or post-tribal, philosophy to govern life on the new pastures. (338-9)

While Haidt insists we must recognize the centrality of intuitions even in this civilization nominally ruled by reason, Greene points out that it was skepticism of old, seemingly unassailable and intuitive truths that opened up the world and made modern industrial civilization possible in the first place.  

            As Haidt explains, though, conservative morality serves people well in certain regards. Christian churches, for instance, encourage charity and foster a sense of community few secular institutions can match. But these advantages at the level of parochial groups have to be weighed against the problems tribalism inevitably leads to at higher levels. This is, in fact, precisely the point Greene created his Parable of the New Pastures to make. He writes,

The Tragedy of the Commons is averted by a suite of automatic settings—moral emotions that motivate and stabilize cooperation within limited groups. But the Tragedy of Commonsense Morality arises because of automatic settings, because different tribes have different automatic settings, causing them to see the world through different moral lenses. The Tragedy of the Commons is a tragedy of selfishness, but the Tragedy of Commonsense Morality is a tragedy of moral inflexibility. There is strife on the new pastures not because herders are hopelessly selfish, immoral, or amoral, but because they cannot step outside their respective moral perspectives. How should they think? The answer is now obvious: They should shift into manual mode. (172)

Greene argues that whenever we, as a society, are faced with a moral controversy—as with issues like abortion, capital punishment, and tax policy—our intuitions will not suffice because our intuitions are the very basis of our disagreement.

            Watching the conclusion to season four of Breaking Bad, most viewers probably responded to finding out that Walt had poisoned Brock by thinking that he’d become a monster—at least at first. Indeed, the currently dominant academic approach to art criticism involves taking a stance, both moral and political, with regard to a work’s models and messages. Writing for The New Yorker, Emily Nussbaum, for instance, disparages viewers of Breaking Bad for failing to condemn Walt, writing,

When Brock was near death in the I.C.U., I spent hours arguing with friends about who was responsible. To my surprise, some of the most hard-nosed cynics thought it inconceivable that it could be Walt—that might make the show impossible to take, they said. But, of course, it did nothing of the sort. Once the truth came out, and Brock recovered, I read posts insisting that Walt was so discerning, so careful with the dosage, that Brock could never have died. The audience has been trained by cable television to react this way: to hate the nagging wives, the dumb civilians, who might sour the fun of masculine adventure. “Breaking Bad” increases that cognitive dissonance, turning some viewers into not merely fans but enablers. (83)

To arrive at such an assessment, Nussbaum must reduce the show to the impact she assumes it will have on less sophisticated fans’ attitudes and judgments. But the really troubling aspect of this type of criticism is that it encourages scholars and critics to indulge their impulse toward self-righteousness when faced with challenging moral dilemmas; in other words, it encourages them to give voice to their automatic modes precisely when they should be shifting to manual mode. Thus, Nussbaum neglects outright the very details that make Walt’s scenario compelling, completely forgetting that by making Brock sick—and, yes, risking his life—he was able to save Hank, Skyler, and his own two children.

But how should we go about arriving at a resolution to moral dilemmas and political controversies if we agree we can’t always trust our intuitions? Greene believes that, while our automatic modes recognize certain acts as wrong and certain others as a matter of duty to perform, in keeping with deontological ethics, whenever we switch to manual mode, the focus shifts to weighing the relative desirability of each option’s outcomes. In other words, manual mode thinking is consequentialist. And, since we tend to assess outcomes according to their impact on other people, favoring those that improve the quality of their experiences the most, or detract from it the least, Greene argues that whenever we slow down and think through moral dilemmas deliberately we become utilitarians. He writes,

If I’m right, this convergence between what seems like the right moral philosophy (from a certain perspective) and what seems like the right moral psychology (from a certain perspective) is no accident. If I’m right, Bentham and Mill did something fundamentally different from all of their predecessors, both philosophically and psychologically. They transcended the limitations of commonsense morality by turning the problem of morality (almost) entirely over to manual mode. They put aside their inflexible automatic settings and instead asked two very abstract questions. First: What really matters? Second: What is the essence of morality? They concluded that experience is what ultimately matters, and that impartiality is the essence of morality. Combining these two ideas, we get utilitarianism: We should maximize the quality of our experience, giving equal weight to the experience of each person. (173)

If you cite an authority recognized only by your own tribe—say, the Bible—in support of a moral argument, then members of other tribes will either simply discount your position or counter it with pronouncements by their own authorities. If, on the other hand, you argue for a law or a policy by citing evidence that implementing it would mean longer, healthier, happier lives for the citizens it affects, then only those seeking to establish the dominion of their own tribe can discount your position (which of course isn’t to say they can’t offer rival interpretations of your evidence).

            If we turn commonsense morality on its head and evaluate the consequences of giving our intuitions priority over utilitarian accounting, we can find countless circumstances in which being overly moral is to everyone’s detriment. Ideas of justice and fairness allow far too much space for selfish and tribal biases, whereas the math behind mutually optimal outcomes based on compromise tends to be harder to fudge. Greene reports, for instance, the findings of a series of experiments conducted by Fieke Harinick and colleagues at the University of Amsterdam in 2000. In the negotiations, lawyers representing either the prosecution or the defense were told to focus either on serving justice or on getting the best outcome for their clients. The negotiations in the first condition almost always ended at loggerheads. Greene explains,

Thus, two selfish and rational negotiators who see that their positions are symmetrical will be willing to enlarge the pie, and then split the pie evenly. However, if negotiators are seeking justice, rather than merely looking out for their bottom lines, then other, more ambiguous, considerations come into play, and with them the opportunity for biased fairness. Maybe your clients really deserve lighter penalties. Or maybe the defendants you’re prosecuting really deserve stiffer penalties. There is a range of plausible views about what’s truly fair in these cases, and you can choose among them to suit your interests. By contrast, if it’s just a matter of getting the best deal you can from someone who’s just trying to get the best deal for himself, there’s a lot less wiggle room, and a lot less opportunity for biased fairness to create an impasse. (88)

Framing an argument as an effort to establish who was right and who was wrong is like drawing a line in the sand—it activates tribal attitudes pitting us against them, while treating negotiations more like an economic exchange circumvents these tribal biases.

Challenges to Utilitarianism

But do we really want to suppress our automatic moral reactions in favor of deliberative accountings of the greatest good for the greatest number? Deontologists have posed some effective challenges to utilitarianism in the form of thought-experiments that seem to show efforts to improve the quality of experiences would lead to atrocities. For instance, Greene recounts how, in a high school debate, he was confronted with the hypothetical case of a surgeon who could save five sick patients by killing one healthy person. Then there’s the so-called Utility Monster, who experiences such happiness when eating humans that it quantitatively outweighs the suffering of those being eaten. More down-to-earth examples feature a scapegoat convicted of a crime to prevent rioting by people who are angry about police ineptitude, and the use of torture to extract information from a prisoner that could prevent a terrorist attack. The most influential challenge to utilitarianism, however, was leveled by the political philosopher John Rawls when he pointed out that it could be used to justify the enslavement of a minority by a majority.

Greene’s responses to these criticisms make up one of the most surprising, important, and fascinating parts of Moral Tribes. First, highly contrived thought-experiments about Utility Monsters and circumstances in which pushing a guy off a bridge is guaranteed to stop a trolley may indeed prove that utilitarianism is not true in any absolute sense. But whether or not such moral absolutes even exist is a contentious issue in its own right. Greene explains,

I am not claiming that utilitarianism is the absolute moral truth. Instead I’m claiming that it’s a good metamorality, a good standard for resolving moral disagreements in the real world. As long as utilitarianism doesn’t endorse things like slavery in the real world, that’s good enough. (275-6)

One source of confusion regarding the slavery issue is the equation of happiness with money; slave owners probably could make profits in excess of the losses sustained by the slaves. But money is often a poor index of happiness. Greene underscores this point by asking us to consider how much someone would have to pay us to sell ourselves into slavery. “In the real world,” he writes, “oppression offers only modest gains in happiness to the oppressors while heaping misery upon the oppressed” (284-5).

            Another failing of the thought-experiments meant to undermine utilitarianism is the shortsightedness of the supposedly obvious responses. The crimes of the murderous doctor and the scapegoating law officers may indeed produce short-term increases in happiness, but if the secret gets out healthy and innocent people will live in fear, knowing they can’t trust doctors and law officers. The same logic applies to the objection that utilitarianism would force us to become infinitely charitable, since we can almost always afford to be more generous than we currently are. But how long could we serve as so-called happiness pumps before burning out, becoming miserable, and thus losing the capacity for making anyone else happier? Greene writes,

If what utilitarianism asks of you seems absurd, then it’s not what utilitarianism actually asks of you. Utilitarianism is, once again, an inherently practical philosophy, and there’s nothing more impractical than commanding free people to do things that strike them as absurd and that run counter to their most basic motivations. Thus, in the real world, utilitarianism is demanding, but not overly demanding. It can accommodate our basic human needs and motivations, but it nonetheless calls for substantial reform of our selfish habits. (258)

Greene seems to be endorsing what philosophers call “rule utilitarianism.” We can approach every choice by calculating the likely outcomes, but as a society we would be better served by deciding on some rules for everyone to adhere to. It just may be possible for a doctor to increase happiness through murder in a particular set of circumstances—but most people would vociferously object to a rule legitimizing the practice.

The concept of human rights may present another challenge to Greene in his championing of consequentialism over deontology. It is our duty, after all, to recognize the rights of every human, and we ourselves have no right to disregard someone else’s rights no matter what benefit we believe might result from doing so. In his book The Better Angels of our Nature, Steven Pinker, Greene’s colleague at Harvard, attributes much of the steep decline in rates of violent death over the past three centuries to a series of what he calls Rights Revolutions, the first of which began during the Enlightenment. But the problem with arguments that refer to rights, Greene explains, is that controversies arise for the very reason that people don’t agree which rights we should recognize. He writes,

Thus, appeals to “rights” function as an intellectual free pass, a trump card that renders evidence irrelevant. Whatever you and your fellow tribespeople feel, you can always posit the existence of a right that corresponds to your feelings. If you feel that abortion is wrong, you can talk about a “right to life.” If you feel that outlawing abortion is wrong, you can talk about a “right to choose.” If you’re Iran, you can talk about your “nuclear rights,” and if you’re Israel you can talk about your “right to self-defense.” “Rights” are nothing short of brilliant. They allow us to rationalize our gut feelings without doing any additional work. (302)

The only way to resolve controversies over which rights we should actually recognize and which rights we should prioritize over others, Greene argues, is to apply utilitarian reasoning.

Ideology and Epistemology

            In his discussion of the proper use of the language of rights, Greene comes closer than in any other section of Moral Tribes to explicitly articulating what strikes me as the most revolutionary idea that he and his fellow moral psychologists are suggesting—albeit as yet only implicitly. In his advocacy for what he calls “deep pragmatism,” Greene isn’t merely applying evolutionary theories to an old philosophical debate; he’s actually posing a subtly different question. The numerous thought-experiments philosophers use to poke holes in utilitarianism may not have much relevance in the real world—but they do undermine any claim utilitarianism may have on absolute moral truth. Greene’s approach is therefore to eschew any effort to arrive at absolute truths, including truths pertaining to rights. Instead, in much the same way scientists accept that our knowledge of the natural world is mediated by theories, which only approach the truth asymptotically, never capturing it with any finality, Greene intimates that the important task in the realm of moral philosophy isn’t to arrive at a full accounting of moral truths but rather to establish a process for resolving moral and political dilemmas.

What’s needed, in other words, isn’t a rock-solid moral ideology but a workable moral epistemology. And, just as empiricism serves as the foundation of the epistemology of science, Greene makes a convincing case that we could use utilitarianism as the basis of an epistemology of morality. Pursuing the analogy between scientific and moral epistemologies even further, we can compare theories, which stand or fall according to their empirical support, to individual human rights, which we afford and affirm according to their impact on the collective happiness of every individual in the society. Greene writes,

If we are truly interested in persuading our opponents with reason, then we should eschew the language of rights. This is, once again, because we have no non-question-begging (and utilitarian) way of figuring out which rights really exist and which rights take precedence over others. But when it’s not worth arguing—either because the question has been settled or because our opponents can’t be reasoned with—then it’s time to start rallying the troops. It’s time to affirm our moral commitments, not with wonky estimates of probabilities but with words that stir our souls. (308-9)

Rights may be the closest thing we have to moral truths, just as theories serve as our stand-ins for truths about the natural world, but even more important than rights or theories are the processes we rely on to establish and revise them.

A New Theory of Narrative

            As if a philosophical revolution weren’t enough, moral psychology is also putting in place what could be the foundation of a new understanding of the role of narratives in human lives. At the heart of every story is a conflict between competing moral ideals. In commercial fiction, there tends to be a character representing each side of the conflict, and audiences can be counted on to favor one side over the other—the good, altruistic guys over the bad, selfish guys. In more literary fiction, on the other hand, individual characters are faced with dilemmas pitting various modes of moral thinking against each other. In season one of Breaking Bad, for instance, Walter White famously writes a list on a notepad of the pros and cons of murdering the drug dealer restrained in Jesse’s basement. Everyone, including Walt, feels that killing the man is wrong, but if they let him go Walt and his family will be at risk of retaliation. This dilemma is in fact quite similar to ones he faces in each of the following seasons, right up until he has to decide whether or not to poison Brock. Trying to work out what Walt should do, and anxiously anticipating what he will do, are mental exercises few can help engaging in as they watch the show.

            The current reigning conception of narrative in academia explains the appeal of stories by suggesting it derives from their tendency to fulfill conscious and unconscious desires, most troublesomely our desires to have our prejudices reinforced. We like to see men behaving in ways stereotypically male, women stereotypically female, minorities stereotypically black or Hispanic, and so on. Cultural products like works of art, and even scientific findings, are embraced, the reasoning goes, because they cement the hegemony of various dominant categories of people within the society. This tradition in arts scholarship and criticism can in large part be traced back to psychoanalysis, but it has developed over the last century to incorporate both the predominant view of language in the humanities and the cultural determinism espoused by many scholars in the service of various forms of identity politics. 

            The postmodern ideology that emerged from the convergence of these schools is marked by a suspicion that science is often little more than a veiled effort at buttressing the political status quo, and its preeminent thinkers deliberately set themselves apart from the humanist and Enlightenment traditions that held sway in academia until the middle of the last century by writing in byzantine, incoherent prose. Even though there could be no rational way either to support or challenge postmodern ideas, scholars still take them as cause for leveling accusations against both scientists and storytellers of using their work to further reactionary agendas.

For anyone who recognizes the unparalleled power of science both to advance our understanding of the natural world and to improve the conditions of human lives, postmodernism stands out as a catastrophic wrong turn, not just in academic history but in the moral evolution of our civilization. The equation of narrative with fantasy is a bizarre fantasy in its own right. Attempting to explain the appeal of a show like Breaking Bad by suggesting that viewers have an unconscious urge to be diagnosed with cancer and to subsequently become drug manufacturers is symptomatic of intractable institutional delusion. And, as Pinker recounts in Better Angels, literature, and novels in particular, were likely instrumental in bringing about the shift in consciousness toward greater compassion for greater numbers of people that resulted in the unprecedented decline in violence beginning in the second half of the nineteenth century.

Yet, when it comes to arts scholarship, postmodernism is just about the only game in town. Granted, the writing in this tradition has progressed a great deal toward greater clarity, but the focus on identity politics has intensified to the point of hysteria: you’d be hard-pressed to find a major literary figure who hasn’t been accused of misogyny at one point or another, and any scientist who dares study something like gender differences can count on having her motives questioned and her work lampooned by well-intentioned, well-indoctrinated squads of well-poisoning liberal wags.

            When Emily Nussbaum complains about viewers of Breaking Bad being lulled by the “masculine adventure” and the digs against “nagging wives” into becoming enablers of Walt’s bad behavior, she’s doing exactly what so many of us were taught to do in academic courses on literary and film criticism, applying a postmodern feminist ideology to the show—and completely missing the point. As the series opens, Walt is deliberately portrayed as downtrodden and frustrated, and Skyler’s bullying is an important part of that dynamic. But the pleasure of watching the show doesn’t come so much from seeing Walt get out from under Skyler’s thumb—he never really does—as it does from anticipating and fretting over how far Walt will go in his criminality, goaded on by all that pent-up frustration. Walt shows a similar concern for himself, worrying over what effect his exploits will have on who he is and how he will be remembered. We see this in season three when he becomes obsessed by the “contamination” of his lab—which turns out to be no more than a house fly—and at several other points as well. Viewers are not concerned with Walt because he serves as an avatar acting out their fantasies (or else the show would have a lot more nude scenes with the principal of the school he teaches in). They’re concerned because, at least at the beginning of the show, he seems to be a good person and they can sympathize with his tribulations.

The much more solidly grounded view of narrative inspired by moral psychology suggests that common themes in fiction are not reflections or reinforcements of some dominant culture, but rather emerge from aspects of our universal human psychology. Our feelings about characters, according to this view, aren’t determined by how well they coincide with our abstract prejudices; on the contrary, we favor the types of people in fiction we would favor in real life. Indeed, if the story is any good, we will have to remind ourselves that the people whose lives we’re tracking really are fictional. Greene doesn’t explore the intersection between moral psychology and narrative in Moral Tribes, but he does give a nod to what we can only hope will be a burgeoning field when he writes, 

Nowhere is our concern for how others treat others more apparent than in our intense engagement with fiction. Were we purely selfish, we wouldn’t pay good money to hear a made-up story about a ragtag group of orphans who use their street smarts and quirky talents to outfox a criminal gang. We find stories about imaginary heroes and villains engrossing because they engage our social emotions, the ones that guide our reactions to real-life cooperators and rogues. We are not disinterested parties. (59)

Many people ask why we care so much about people who aren’t even real, but we only ever reflect on the fact that what we’re reading or viewing is a simulation when we’re not sufficiently engrossed by it.

            Was Nussbaum completely wrong to insist that Walt went beyond the pale when he poisoned Brock? She certainly showed the type of myopia Greene attributes to the automatic mode by forgetting Walt saved at least four lives by endangering the one. But most viewers probably had a similar reaction. The trouble wasn’t that she was appalled; it was that her postmodernism encouraged her to unquestioningly embrace and give voice to her initial feelings. Greene writes, 

It’s good that we’re alarmed by acts of violence. But the automatic emotional gizmos in our brains are not infinitely wise. Our moral alarm systems think that the difference between pushing and hitting a switch is of great moral significance. More important, our moral alarm systems see no difference between self-serving murder and saving a million lives at the cost of one. It’s a mistake to grant these gizmos veto power in our search for a universal moral philosophy. (253)

Greene doesn’t reveal whether he’s a Breaking Bad fan or not, but his discussion of the footbridge dilemma gives readers a good indication of how he’d respond to Walt’s actions.

If you don’t feel that it’s wrong to push the man off the footbridge, there’s something wrong with you. I, too, feel that it’s wrong, and I doubt that I could actually bring myself to push, and I’m glad that I’m like this. What’s more, in the real world, not pushing would almost certainly be the right decision. But if someone with the best of intentions were to muster the will to push the man off the footbridge, knowing for sure that it would save five lives, and knowing for sure that there was no better alternative, I would approve of this action, although I might be suspicious of the person who chose to perform it. (251)

The overarching theme of Breaking Bad, which Nussbaum fails utterly to comprehend, is the transformation of a good man into a bad one. When Walt poisons Brock, we’re glad he succeeded in saving his family, and some of us are even okay with his methods, but we’re worried—suspicious even—about what his ability to go through with it says about him. Over the course of the series, we’ve found ourselves rooting for Walt, and we’ve come to really like him. We don’t want to see him break too bad. And however bad he does break we can’t help hoping for his redemption. Since he’s a valuable member of our tribe, we’re loath even to consider that it might be time for him to go.

Also read:

The Criminal Sublime: Walter White's Brutally Plausible Journey to the Heart of Darkness in Breaking Bad

And

Let’s Play Kill Your Brother: Fiction as a Moral Dilemma Game

And

How Violent Fiction Works: Rohan Wilson’s “The Roving Party” and James Wood’s Sanguinary Sublime from Conrad to McCarthy


“The World until Yesterday” and the Great Anthropology Divide: Wade Davis’s and James C. Scott’s Bizarre and Dishonest Reviews of Jared Diamond’s Work

The field of anthropology is divided into two rival factions, the postmodernists and the scientists—though the postmodernists like to insist they’re being scientific as well. The divide can be seen in critiques of Jared Diamond’s “The World until Yesterday.”

Cultural anthropology has for some time been divided into two groups. The first attempts to understand cultural variation empirically by incorporating it into theories of human evolution and ecological adaptation. The second merely celebrates cultural diversity, and its members are quick to attack any findings or arguments by those in the first group that can in any way be construed as unflattering to the cultures being studied. (This dichotomy is intended to serve as a useful, and only slight, oversimplification.)

Jared Diamond’s scholarship in anthropology places him squarely in the first group. Yet he manages to thwart many of the assumptions held by those in the second group because he studiously avoids the sins of racism and biological determinism they insist every last member of the first group is guilty of. Rather than seeing his work as an exemplar or as evidence that the field is amenable to scientific investigation, however, members of the second group invent crimes and victims so they can continue insisting there’s something immoral about scientific anthropology (though the second group, oddly enough, claims that designation as well).

            Diamond is not an anthropologist by training, but his Pulitzer Prize-winning book Guns, Germs, and Steel, in which he sets out to explain why some societies became technologically advanced conquerors over the past 10,000 years while others maintained their hunter-gatherer lifestyles, became a classic in the field almost as soon as it was published in 1997. His interest in cultural variation arose in large part out of his experiences traveling through New Guinea, the most culturally diverse region of the planet, to conduct ornithological research. By the time he published his first book about human evolution, The Third Chimpanzee, at age 54, he’d spent more time among people from a more diverse set of cultures than many anthropologists do over their entire careers.

In his latest book, The World until Yesterday: What Can We Learn from Traditional Societies?, Diamond compares the lifestyles of people living in modern industrialized societies with those of people who rely on hunting and gathering or horticultural subsistence strategies. His first aim is simply to highlight the differences, since the way most of us live today is, evolutionarily speaking, a very recent development; his second is to show that certain traditional practices may actually lead to greater well-being, and may thus be advantageous if adopted by those of us living in advanced civilizations.

            Obviously, Diamond’s approach has certain limitations, chief among them that it affords him little space for in-depth explorations of individual cultures. Instead, he attempts to identify general patterns that apply to traditional societies all over the world. What this means in the context of the great divide in anthropology is that no sooner had Diamond set pen to paper than he’d fallen afoul of the most passionately held convictions of the second group, who bristle at any discussion of universal trends in human societies. The anthropologist Wade Davis’s review of The World until Yesterday in The Guardian is extremely helpful for anyone hoping to appreciate the differences between the two camps because it exemplifies nearly all of the features of this type of historical particularism, with one exception: it’s clearly, even gracefully, written. But this isn’t to say Davis is at all straightforward about his own positions, which you have to read between the lines to glean. Situating the commitment to avoid general theories and focus instead on celebrating the details in a historical context, Davis writes,

This ethnographic orientation, distilled in the concept of cultural relativism, was a radical departure, as unique in its way as was Einstein’s theory of relativity in the field of physics. It became the central revelation of modern anthropology. Cultures do not exist in some absolute sense; each is but a model of reality, the consequence of one particular set of intellectual and spiritual choices made, however successfully, many generations before. The goal of the anthropologist is not just to decipher the exotic other, but also to embrace the wonder of distinct and novel cultural possibilities, that we might enrich our understanding of human nature and just possibly liberate ourselves from cultural myopia, the parochial tyranny that has haunted humanity since the birth of memory.

This stance with regard to other cultures sounds viable enough—it even seems admirable. But Davis is saying something more radical than you may think at first glance. He’s claiming that cultural differences can have no explanations because they arise out of “intellectual and spiritual choices.” It must be pointed out as well that he’s profoundly confused about how relativity in physics relates to—or doesn’t relate to—cultural relativity in anthropology. Einstein discovered that time is relative to velocity, since the speed of light is constant for all observers, so the faster one travels the more slowly time advances. Since this rule applies uniformly everywhere in the universe, the theory actually works much better as an analogy for the types of generalization Diamond tries to discover than it does for the idea that no such generalizations can be discovered. Cultural relativism is not a “revelation” about whether cultures can be said to exist; it’s a principle that enjoins us to try to understand other cultures on their own terms, not as deviations from our own. Diamond appreciates this principle—he just doesn’t take it to as great an extreme as Davis and the other anthropologists in his camp.

            The idea that cultures don’t exist in any absolute sense implies that comparing one culture to another won’t result in any meaningful or valid insights. But this isn’t a finding or a discovery, as Davis suggests; it’s an a priori conviction. For anthropologists in Davis’s camp, as soon as you start looking outside of a particular culture for an explanation of how it became what it is, you’re no longer looking to understand that culture on its own terms; you’re instead imposing outside ideas and outside values on it. So the simple act of trying to think about variation in a scientific way automatically makes you guilty of a subtle form of colonization. Davis writes,

The very premise of Guns, Germs, and Steel is that a hierarchy of progress exists in the realm of culture, with measures of success that are exclusively material and technological; the fascinating intellectual challenge is to determine just why the west ended up on top. In the posing of this question, Diamond evokes 19th-century thinking that modern anthropology fundamentally rejects. The triumph of secular materialism may be the conceit of modernity, but it does very little to unveil the essence of culture or to account for its diversity and complexity.

For Davis, comparison automatically implies assignment of relative values. But, if we agree that two things can be different without one being superior, we must conclude that Davis is simply being dishonest, because you don’t have to read beyond the Prelude to Guns, Germs, and Steel to find Diamond’s explicit disavowal of this premise that supposedly underlies the entire book:

…don’t words such as “civilization,” and phrases such as “rise of civilization,” convey the false impression that civilization is good, tribal hunter-gatherers are miserable, and history for the past 13,000 years has involved progress toward greater human happiness? In fact, I do not assume that industrialized states are “better” than hunter-gatherer tribes, or that the abandonment of the hunter-gatherer lifestyle for iron-based statehood represents “progress,” or that it has led to an increase in happiness. My own impression, from having divided my life between United States cities and New Guinea villages, is that the so-called blessings of civilization are mixed. For example, compared with hunter-gatherers, citizens of modern industrialized states enjoy better medical care, lower risk of death by homicide, and a longer life span, but receive much less social support from friendships and extended families. My motive for investigating these geographic differences in human societies is not to celebrate one type of society over another but simply to understand what happened in history. (18)

            For Davis and those sharing his postmodern ideology, this type of dishonesty is acceptable because they believe the political ends of protecting indigenous peoples from exploitation justify their deceitful means. In other words, they’re placing their political goals before their scholarly or scientific ones. Davis argues that the only viable course is to let people from various cultures speak for themselves, since facts and theories in the wrong hands will inevitably lubricate the already slippery slope to colonialism and exploitation. Even Diamond’s theories about environmental influences, in this light, can be dangerous. Davis writes,

In accounting for their simple material culture, their failure to develop writing or agriculture, he laudably rejects notions of race, noting that there is no correlation between intelligence and technological prowess. Yet in seeking ecological and climatic explanations for the development of their way of life, he is as certain of their essential primitiveness as were the early European settlers who remained unconvinced that Aborigines were human beings. The thought that the hundreds of distinct tribes of Australia might simply represent different ways of being, embodying the consequences of unique sets of intellectual and spiritual choices, does not seem to have occurred to him.

Davis is rather deviously suggesting a kinship between Diamond and the evil colonialists of yore, but the connection rests on a non sequitur, that positing environmental explanations of cultural differences necessarily implies primitiveness on the part of the “lesser” culture.

Davis doesn’t explicitly say anywhere in his review that all scientific explanations are colonialist, but once you rule out biological, cognitive, environmental, and climatic theories, well, there’s not much left. Davis’s rival explanation, such as it is, posits a series of collective choices made over the course of history, which in a sense must be true. But it merely begs the question of what precisely led the people to make those choices, and this question inevitably brings us back to all those factors Diamond weighs as potential explanations. Davis could have made the point that not every aspect of every culture can be explained by ecological factors—but Diamond never suggests otherwise. Citing the example of Kaulong widow strangling in The World until Yesterday, Diamond writes that there’s no reason to believe the practice is in any way adaptive and admits that it can only be “an independent historical cultural trait that arose for some unknown reason in that particular area of New Britain” (21).

I hope we can all agree that harming or exploiting indigenous peoples in any part of the world is wrong and that we should support the implementation of policies that protect them and their ways of life (as long as those ways don’t involve violations of anyone’s rights as a human—yes, that moral imperative supersedes cultural relativism, fears of colonialism be damned). But the idea that trying to understand cultural variation scientifically always and everywhere undermines the dignity of people living in non-Western cultures is the logical equivalent of insisting that trying to understand variations in people’s personalities through empirical methods is an affront to their agency and freedom to make choices as individuals. If the position of these political-activist anthropologists had any validity, it would undermine the entire field of psychology, and for that matter the social sciences in general. It’s safe to assume that the opacity that typifies these anthropologists’ writing is meant to protect their ideas from obvious objections like this one.

As well as Davis writes, it’s nonetheless difficult to figure out what his specific problems with Diamond’s book are. At one point he complains, “Traditional societies do not exist to help us tweak our lives as we emulate a few of their cultural practices. They remind us that our way is not the only way.” Fair enough—but then he concludes with a passage that seems startlingly close to a summation of Diamond’s own thesis.

The voices of traditional societies ultimately matter because they can still remind us that there are indeed alternatives, other ways of orienting human beings in social, spiritual and ecological space… By their very existence the diverse cultures of the world bear witness to the folly of those who say that we cannot change, as we all know we must, the fundamental manner in which we inhabit this planet. This is a sentiment that Jared Diamond, a deeply humane and committed conservationist, would surely endorse.

On the surface, it seems like Davis isn’t even disagreeing with Diamond. What he’s not saying explicitly, however, but hopes nonetheless that we understand is that sampling or experiencing other cultures is great—but explaining them is evil.

            Davis’s review was published in January of 2013, and its main points have been echoed by several other anti-scientific anthropologists—but perhaps none so eminent as the Yale Professor of Anthropology and Political Science, James C. Scott, whose review, “Crops, Towns, Government,” appeared in the London Review of Books in November. After praising Diamond’s plea for the preservation of vanishing languages, Scott begins complaining about the idea that modern traditional societies offer us any evidence at all about how our ancestors lived. He writes of Diamond,

He imagines he can triangulate his way to the deep past by assuming that contemporary hunter-gatherer societies are ‘our living ancestors’, that they show what we were like before we discovered crops, towns and government. This assumption rests on the indefensible premise that contemporary hunter-gatherer societies are survivals, museum exhibits of the way life was lived for the entirety of human history ‘until yesterday’–preserved in amber for our examination.

Don’t be fooled by those lonely English quotation marks—Diamond never makes this mistake, nor does his argument rest on any such premise. Scott is simply being dishonest. In the first chapter of The World until Yesterday, Diamond explains why he wanted to write about the types of changes that took place in New Guinea between the first contact with Westerners in 1931 and today. “New Guinea is in some respects,” he writes, “a window onto the human world as it was until a mere yesterday, measured against a time scale of the 6,000,000 years of human evolution.” He follows this line with a parenthetical, “(I emphasize ‘in some respects’—of course the New Guinea Highlands of 1931 were not an unchanged world of yesterday)” (5-6). It’s clear he added this line because he was anticipating criticisms like Davis’s and Scott’s.

The confusion arises from Scott’s conflation of the cultures and lifestyles Diamond describes with the individuals representing them. Diamond assumes that factors like population size, social stratification, and level of technological advancement have a profound influence on culture. So, if we want to know about our ancestors, we need to look to societies living in conditions similar to the ones they must’ve lived in with regard to just these types of factors. In another bid to ward off the types of criticism he knows to expect from anthropologists like Scott and Davis, he includes a footnote in his introduction which explains precisely what he’s interested in.

By the terms “traditional” and “small-scale” societies, which I shall use throughout this book, I mean past and present societies living at low population densities in small groups ranging from a few dozen to a few thousand people, subsisting by hunting-gathering or by farming or herding, and transformed to a limited degree by contact with large, Westernized, industrial societies. In reality, all such traditional societies still existing today have been at least partly modified by contact, and could alternatively be described as “transitional” rather than “traditional” societies, but they often still retain many features and social processes of the small societies of the past. I contrast traditional small-scale societies with “Westernized” societies, by which I mean the large modern industrial societies run by state governments, familiar to readers of this book as the societies in which most of my readers now live. They are termed “Westernized” because important features of those societies (such as the Industrial Revolution and public health) arose first in Western Europe in the 1700s and 1800s, and spread from there overseas to many other countries. (6)

Scott goes on to take Diamond to task for suggesting that traditional societies are more violent than modern industrialized societies. This is perhaps the most incendiary point of disagreement between the factions on either side of the anthropology divide. The political activists worry that if anthropologists claim indigenous peoples are more violent outsiders will take it as justification to pacify them, which has historically meant armed invasion and displacement. Since the stakes are so high, Scott has no compunctions about misrepresenting Diamond’s arguments. “There is, contra Diamond,” he writes, “a strong case that might be made for the relative non-violence and physical well-being of contemporary hunters and gatherers when compared with the early agrarian states.” 

Well, no, not contra Diamond, who only compares traditional societies to modern Westernized states, like the ones his readers live in, not early agrarian ones. Scott is referring to Diamond’s theories about the initial transition to states, claiming that interstate violence negates the benefits of any pacifying central authority. But it may still be better to live under the threat of infrequent state warfare than of much more frequent ambushes or retaliatory attacks by nearby tribes. Scott also suggests that records of high rates of enslavement in early states somehow undermine the case for more homicide in traditional societies, but again Diamond doesn’t discuss early states. Diamond would probably agree that slavery, in the context of his theories, is an interesting topic, but it’s hardly the fatal flaw in his ideas Scott makes it out to be.

The misrepresentations extend beyond Diamond’s arguments to encompass the evidence he builds them on. Scott insists it’s all anecdotal, pseudoscientific, and extremely limited in scope. His biggest mistake here is to pull into the argument Steven Pinker, a psychologist whose name alone may tar Diamond’s book in the eyes of anthropologists who share Scott’s ideology; for anyone who has actually read Pinker’s work, though, that name lends further credence to Diamond’s case. (Pinker has actually done the math on whether your chances of dying a violent death are better or worse in different types of society.) Scott writes,

Having chosen some rather bellicose societies (the Dani, the Yanomamo) as illustrations, and larded his account with anecdotal evidence from informants, he reaches the same conclusion as Steven Pinker in The Better Angels of Our Nature: we know, on the basis of certain contemporary hunter-gatherers, that our ancestors were violent and homicidal and that they have only recently (very recently in Pinker’s account) been pacified and civilised by the state. Life without the state is nasty, brutish and short.

In reality, both Diamond and Pinker rely on evidence from a herculean variety of sources going well beyond contemporary ethnographies. To cite just one example Scott neglects to mention, an article by Samuel Bowles published in the journal Science in 2009 examines the rates of death by violence at several prehistoric sites and shows that they’re startlingly similar to those found among modern hunter-gatherers. Insofar as Scott even mentions archeological evidence, it’s merely to insist on its worthlessness. Anyone who reads The World until Yesterday after reading Scott’s review will be astonished by how nuanced Diamond’s section on violence actually is. Taking up almost a hundred pages, it is far more insightful and better supported than the essay that purports to undermine it. The section also shows, contra Scott, that Diamond is well aware of all the difficulties and dangers of trying to arrive at conclusions based on any one line of evidence—which is precisely why he follows as many lines as are available to him.

However, even if we accept that traditional societies really are more violent, it could still be the case that tribal conflicts are caused, or at least intensified, through contact with large-scale societies. In order to make this argument, though, political-activist anthropologists must shift their position from claiming that no evidence of violence exists to claiming that the evidence is meaningless or misleading. Scott writes,

No matter how one defines violence and warfare in existing hunter-gatherer societies, the greater part of it by far can be shown to be an effect of the perils and opportunities represented by a world of states. A great deal of the warfare among the Yanomamo was, in this sense, initiated to monopolise key commodities on the trade routes to commercial outlets (see, for example, R. Brian Ferguson’s Yanomami Warfare: A Political History, a strong antidote to the pseudo-scientific account of Napoleon Chagnon on which Diamond relies heavily).

It’s true that Ferguson puts forth a rival theory for warfare among the Yanomamö—and the political-activist anthropologists hold him up as a hero for doing so. (At least one Yanomamö man insisted, in response to Chagnon’s badgering questions about why they fought so much, that it had nothing to do with commodities—they raided other villages for women.) But Ferguson’s work hardly settles the debate. Why, for instance, do the patterns of violence appear in traditional societies all over the world, regardless of which state societies they’re in supposed contact with? And state governments don’t just influence violence in an upward direction. As Diamond points out, “State governments routinely adopt a conscious policy of ending traditional warfare: for example, the first goal of 20th-Century Australian patrol officers in the Territory of Papua New Guinea, on entering a new area, was to stop warfare and cannibalism” (133-4).

What is the proper moral stance anthropologists should take with regard to people living in traditional societies? Should they make it their priority to report the findings of their inquiries honestly? Or should they prioritize their role as advocates for indigenous people’s rights? These are fair questions—and they take on a great deal of added gravity when you consider the history, not to mention the ongoing examples, of how indigenous peoples have suffered at the hands of peoples from Western societies. The answers hinge on how much influence anthropologists currently have on policies that impact traditional societies and on whether science, or Western culture in general, is by its very nature somehow harmful to indigenous peoples. Scott’s and Davis’s positions on both of these issues are clear. Scott writes,

Contemporary hunter-gatherer life can tell us a great deal about the world of states and empires but it can tell us nothing at all about our prehistory. We have virtually no credible evidence about the world until yesterday and, until we do, the only defensible intellectual position is to shut up.

Scott’s argument raises two further questions: when and from where can we count on the “credible evidence” to start rolling in? His “only defensible intellectual position” isn’t that we should reserve judgment or hold off trying to arrive at explanations; it’s that we shouldn’t bother trying to judge the merits of the evidence and that any attempts at explanation are hopeless. This isn’t an intellectual position at all—it’s an obvious endorsement of anti-intellectualism. What Scott really means is that he believes making questions about our hunter-gatherer ancestors off-limits is the only morally defensible position.

            It’s easy to conjure up mental images of the horrors inflicted on native peoples by western explorers and colonial institutions. But framing the history of encounters between peoples with varying levels of technological advancement as one long Manichean tragedy of evil imperialists having their rapacious and murderous way with perfectly innocent noble savages risks trivializing important elements of both types of culture. Traditional societies aren’t peaceful utopias. Western societies and Western governments aren’t mere engines of oppression. Most importantly, while it may be true that science can be—and sometimes is—coopted to serve oppressive or exploitative ends, there’s nothing inherently harmful or immoral about science, which can just as well be used to counter arguments for the mistreatment of one group of people by another. To anthropologists like Davis and Scott, human behavior is something to stand in spiritual awe of, indigenous societies something to experience religious guilt about, in any case not anything to profane with dirty, mechanistic explanations. But, for all their declamations about the evils of thinking that any particular culture can in any sense be said to be inferior to another, they have a pretty dim view of our own.

            It may be simple pride that makes it hard for Scott to accept that gold miners in Brazil weren’t sitting around waiting for some prominent anthropologist at the University of Michigan, or UCLA, or Yale, to publish an article in Science about Yanomamö violence to give them proper justification to use their superior weapons to displace the people living on prime locations. The sad fact is, if the motivation to exploit indigenous peoples is strong enough, and if the moral and political opposition isn’t sufficient, justifications will be found regardless of which anthropologist decides to publish on which topics. But the crucial point Scott misses is that our moral and political opposition cannot be founded on dishonest representations or willful blindness regarding the behaviors, good or bad, of the people we would protect. To understand why this is so, and because Scott embarrassed himself with his childishness, embarrassed the London Review of Books, which failed to properly fact-check his article, and did a disservice to the discipline of anthropology by attempting to shout down an honest and humane scholar he disagrees with, it’s only fitting that we turn to a passage in The World until Yesterday Scott should have paid more attention to. “I sympathize with scholars outraged by the mistreatment of indigenous peoples,” Diamond writes,

But denying the reality of traditional warfare because of political misuse of its reality is a bad strategy, for the same reason that denying any other reality for any other laudable political goal is a bad strategy. The reason not to mistreat indigenous people is not that they are falsely accused of being warlike, but that it’s unjust to mistreat them. The facts about traditional warfare, just like the facts about any other controversial phenomenon that can be observed and studied, are likely eventually to come out. When they do come out, if scholars have been denying traditional warfare’s reality for laudable political reasons, the discovery of the facts will undermine the laudable political goals. The rights of indigenous people should be asserted on moral grounds, not by making untrue claims susceptible to refutation. (153-4)

Also read:

NAPOLEON CHAGNON'S CRUCIBLE AND THE ONGOING EPIDEMIC OF MORALIZING HYSTERIA IN ACADEMIA

And:

THE SELF-RIGHTEOUSNESS INSTINCT: STEVEN PINKER ON THE BETTER ANGELS OF MODERNITY AND THE EVILS OF MORALITY

And:

THE PEOPLE WHO EVOLVED OUR GENES FOR US: CHRISTOPHER BOEHM ON MORAL ORIGINS – PART 3 OF A CRASH COURSE IN MULTILEVEL SELECTION THEORY


Encounters, Inc. Part 2 of 2

The relationship between Tom and Ashley (Monster Face) heats up as Jim keeps getting in deeper and deeper, even though Marcus is looking ever more shady.

The way it’s going to go down is that some detective will show up at the local headquarters of Marcus’s phone service provider, produce the necessary documents—or the proper dose of intimidation—and in return receive a stack of paper or a digital file listing the numbers of every call and text sent both to and from his phone. The detective will see that my own number appears in connection with the ill-fated outing enough times to warrant bringing me in for questioning. So now I’m wondering for the first time in my life if I have what it takes to lie to police detectives. Does a town like Fort Wayne employ expert interrogators? Of course, if I happen to have mentioned the murder in any of the texts, I can’t very well claim I didn’t know about it. Maybe I should call a lawyer—but if I do that before the detective knocks on my door I’ll have established for everyone that I’m at least guilty of something.

Then there are the emails, where we conducted most of our business. If the NSA can get into emails, it can’t be that difficult for the FWPD to do it. But will they be admissible? If not, do they point the way to any other evidence that is? Just to be safe, it’s probably time to go in and delete all of Marcus’s messages in my inbox—not that it will do much good, not that it will make me feel much better.

Hey Jim,

I love your ideas for the website, esp the one about a section for all the local lore by region—the stuff about how you cross this bridge in your car and turn off your lights to see the ghost of some lady (Lagrange?), or look over the side of the bridge and say some name three times and the eyes appear in the water (Huntertown?). People love that kind of stuff. It may not appeal to grownups as much, but we can still use it to drive traffic—or hell we could even sell the ebooks like you suggest.

You may be getting carried away with all the independent publishing stuff tho. I’m not sure, but it sounds like it’s just getting too far afield, you know? Let’s just say you haven’t sold me on it yet. Is there some way you can tie it all together? My concern is that if we start pushing out fiction it will detract from the… what’s the word? Authenticity? The authenticity of the experiences we’re providing our clients. I don’t know—tell me more about your vision for how this would work.

Anyway, love the dream stuff. You’re cranking this stuff out faster than I could’ve anticipated. One thing—and I’m very serious here—don’t go down there. Don’t go anywhere near the base of the Thieme Overlook. I’ll just say you don’t have all the details, don’t know the whole story yet. Patience my man.

Best,

Marcus

Jim,

Lol. Well, I’m glad you didn’t come to any harm on your little trek down to the Saint Mary’s. That’s all I’ll say about it for now—except you’ll just have to trust me that there not being anything down there doesn’t mean we don’t have a story. You’ve obviously caught the inquisitive bug that’s going around. But I have to say we should probably respect Tom’s wishes and leave Ashley out of it. Let’s focus on the ghost story and not get too carried away with the sleuthing. Besides, you need to have something ready in just a couple weeks for our first outing.

As far as the final details of the story go—I believe Tom might be eager to do some further unburdening.

Best,

Marcus

I had a hard time accepting that Ashley would do anything as outright sadistic as she’d done that night when she provoked those two men—not without giving Tom at least some indication of what had prompted her to do it. Again and again, he swore he really didn’t know what could’ve motivated her, but as I kept pressing him, throwing out my own attempted explanations one after the other, his dismissals began to converge on a single new most likely explanation. “It was like when I yelled at her that same night,” he said. “I always got this sense that she was slightly threatened by me. I’m pretty opinionated and outspoken—and I’m nothing like the people she and I both used to work with at the restaurants.”

Tom had the rare distinction among his classmates in the MBA program of being a fervent liberal. His views had begun to take shape as early as his undergrad years, when he absorbed the attitude toward religion and so-called family values popular among campus intellectuals. The farther he moved ideologically from the Catholicism he was raised to believe, the more resentful he became of Christianity and religion in general. “It’s too complicated to go into now,” he told me. “But it was like I was realizing that what was supposed to be the source of all morality was in reality fundamentally immoral—hypocritical to the core. You know, what’s all this nonsense about Jesus being tortured and executed because Eve ate some damn apple—this horrible crime you’re somehow still guilty of even if you were born thousands of years after it all happened? I mean, it’s completely insane.”

The economic side to Tom’s political leanings was shaped by his experiences with a woman he’d oscillated between dating and being friends with going all the way back to his sophomore year in high school. In her early twenties, she started having severe pain during her periods. It got so bad on a few occasions that it landed her in the ER. She ended up being treated for endometriosis, and to this day (Tom had spoken to her as recently as a month and a half before she came up in our interview) her ability to conceive is in doubt. The ordeal lasted for over a year and a half, which would have been bad enough, but at the time it began she had just been kicked off her dad’s insurance coverage. Try as she might, it wouldn’t be until five years later, after she’d been forced to declare bankruptcy—and after Tom had devoted a substantial chunk of his own student loans to helping her—that she’d finally have health insurance again. “It’s inhuman,” Tom said. “Tell me, how does supply-and-demand work with healthcare? What’s the demand? Not wanting to die? And how the hell does fucking personal responsibility come into the equation? Do Republicans really think they can will themselves healthy?”

Over the course of all the presidential and midterm elections that resulted in Obamacare becoming the law of the land, Tom acquired a reputation as someone to be avoided, particularly if you were a conservative—or maybe I should say particularly if you were only halfheartedly political. And it wasn’t just healthcare. He was known to have brought people to tears in debates over economic inequality, racial profiling, education reform, wage gaps between men and women, whites and minorities, corporatism, plutocracy, white warlike men wearing the mantle of righteousness as they persisted in wielding their unwarranted power—just in ever more subtle and devious ways. Even though I, your humble narrator, voiced no opposition to his charged declamations, he still managed to make me slightly uneasy. I could even see a type of perverse logic in the way Ashley expressed her exasperation, forcing this saintly advocate for all things rational and humane to kick the shit out of a couple white trash wastoids.

It was a lesson for me in the power of circumstance over personality—he’d originally struck me as so humble and mindfully soft-spoken, or perhaps simply restrained—and it set my mind to work disentangling the various threads of his character: fascinated with combat sports but loath to do harm, a marketer with an MBA who rants about the evils of capitalism, a champion for the cause of women and minorities who gives no indication of having a clue when it comes to either. I even weighed the possibility that he may have been less than forthright with me, and with himself, when he spoke of having no blood in his eye. Had he just been trying to prove something?

On the same night he told me the story of the two men who’d attacked him and Ashley outside Henry’s, Tom told me about another encounter he’d had a year earlier that could also have ended in violence. He was clearly trying to prevent an impression of him as a violent man from becoming cemented in my mind, and he was at the same time emphasizing just how impossible it was to figure out what Ashley ever really wanted from him. But the story came back to mind after I discovered his ardently fractious inclinations, striking me then as possibly holding some key to his paradoxical character.

Tom and Ashley had walked from West Central to the Three-Rivers Natural Food Co-op and Deli (which they referred to affectionately as “The Coop”) for lunch on a sweltering summer afternoon. As he stood in line to order their sandwiches, he let his eyes wander idly around until they lit on those of another man who appeared to be letting his own eyes wander in the same fashion. Tom gave a nod and turned back to face the counter and the person in front of him in line. But the eye contact hadn’t been purely coincidental; after a moment, the man was sidling up and offhandedly inquiring whether Tom would be willing to buy him a pack of cigarettes. Tom leaned back and laughed through his nose. “What’s so funny?” the man asked, making a performance of his indignation. After Tom explained that they were standing in a health food store that most certainly didn’t sell cigarettes, the man said, “You can still help me out with a couple bucks” in a tone conveying his insufferable grievance.

Ashley appeared beside Tom just as he was saying, “I’ll buy you a sandwich or something, but I’m not giving you any money.” She was either already annoyed by something else, or became so immediately upon hearing him, making him wonder if the offer of the sandwich had been inadvisable, or if maybe he hadn’t rebuffed the supplication with enough force.

“I don’t need no damn sandwich,” the man said, making an even bigger production of how offended he was. He stood watching Tom for several beats before wandering off in a silent huff, like a chided but defiant child appalled by his mother’s impudence in disciplining him.  

By the time he was sitting down on the steel mesh chair across from Ashley at one of the Coop’s patio tables, Tom had decided the man was an experienced, though unskillful, grifter. A middle-aged African American, he’d planned to take advantage of Tom’s liberal eagerness to buddy up with any black guy. When that failed, he did his enactment of taking offense, so Tom might feel obliged and be so gracious as to make monetary amends. Finally, he must’ve considered making a scene but decided it was unlikely to do any good now that Tom was stubbornly set on giving him the brush-off. Ashley and Tom hadn’t even finished unwrapping their sandwiches when the guy emerged through the glass doors, walked a few steps to the bike rack on the sidewalk, and half-stood, half-sat with one leg resting atop the frame, glaring at Tom as he drank Sprite from its gleaming green bottle in tiny sips. So now the goal has shifted, Tom thought, to saving face.

The guy had a film on his skin that, along with the trickling sweat, evoked in Tom’s mind an image of rain misting in through a carelessly unclosed window onto a dusty finished-wood surface. His clothes looked as though they hadn’t been washed—or even removed—in weeks. With little more than a nudge, Tom figured he could knock the guy off the bike rack, his spindly legs having no chance of finding their way beneath him in time to keep him from landing on his face. Could he have a weapon? Tom considered his waxy eyes. Even if the guy has a gun or a knife, he reasoned, he wouldn’t be able to get to it in time to stop me from bouncing his head off the parking lot. Assured that the guy had no chance of winning a fight, Tom gave up the idea of fighting him. So the guy stayed there staring at the couple almost the whole time they were eating their lunch. As annoyed as Tom was, he was prepared to dismiss it all as a simple misfortune.

But that evening Ashley let him know she didn’t like how he’d handled it.

“What do you think I should have done?”

“I don’t know. I just didn’t like the way you handled it.”  

 ***

            “What I loved most about being with Ashley,” Tom said to me, “was there was always this sense that whatever we came across or encountered together held some special fascination for us. One of the first times I had the thought that things might be getting serious between us was when she took me to visit her grandmother up in… Ah, but you know, I’m going to keep her personal details out of this I think. I’ll just say there was this feeling I had—like we were both kind of fusing or intertwining our life stories together. We would go on these walks that would end up lasting all day and take us miles and miles from where we started, and we’d never run out of things to talk about, and every little thing we saw seemed important because it automatically became part of the stories we were weaving together.”

            When Tom and Ashley had been together for a couple of months, soon after the visit to her grandmother, he drove her to Union Chapel Road, in what had until recently been the northernmost part of the city, to show her the house he’d lived in for years and years, the place where he said he felt like he’d really become the man he was. After parking on a street in Burning Tree, the neighborhood whose entrance lies only a few dozen yards from his old driveway, they went for one of those walks, heading east on Union Chapel, the same way Tom used to go when embarking on one of his routine runs down to Metea Park and back, a circuit of about nine and a half miles. Already by the time he was introducing the area to Ashley, he knew they would encounter a lot of construction and new housing developments once they crossed the bridge over Interstate 69. But he wasn’t sure if the one thing he most wanted to show her would still be there.

            The area across from his old house had long since been developed, but once you crossed Auburn Road you saw nothing but driveways leading to single residences on either side of the road all the way up the rise to the bridge. On the other side of 69, Tom used to see open expanses of old farmland, but the steady march of development was moving west from Tonkel Road back toward the interstate. Back in the days of his runs, just down the slope on the west side of the bridge, a line of pine trees would come into view beginning on the left side of Union Chapel and heading north. “I ran past it for years and never really noticed it much,” Tom said. “I always go into these trances when I’m running. I think I may have even realized it was a driveway only long after my run one day, when I was back home doing something else.”

            The pines, he would discover, ran along both sides of a gravel drive that took you about a quarter mile up a rise into the old farmland and then split apart to form the base of an enclosed square. “It’s the oddest looking thing, right in the middle of all this open space at the top of a hill you almost can’t see for all the weeds. I can’t believe how many times I just ran right by it and never really thought about how weird it was.” In the middle of the square stood an old two-story house, white with gray splotches, and what used to be a lawn now overtaken by the relentlessly encroaching weeds. Tom had finally decided to turn into the drive one day while on his way back from Metea, and ended up remembering it ever since, even though he never returned—not until the day he walked there with Ashley.

As August was coming to an end that year, Tom had begun to feel a nostalgic longing, not for the classes and schoolwork he’d been so relieved to be done with forever just the year before, but for the anticipation of that abstract sense of personal expansion, the promise every upcoming semester holds for new faces, new books, new journeys. As long as you’re in school, you feel you’re working toward something; your life has a gratifying undercurrent of steady improvement, of progress along the path toward some meaningful goal. Outside of school, as Tom was discovering, every job you start, every habit you allow yourself to carry on, pretty much everything you do comes with the added weight of contemplating that this is what your life is and this is what your life is always going to be—a flat line.

            Back in June of that year, the woman Tom had been in an on-again-off-again relationship with almost his entire adolescent and adult life, the one whose health issues had forced her to declare bankruptcy, had moved to Charlotte to live with her mom and little sister. He wanted to see his freshly unattached and unencumbered life as at long last open to the infinite opportunity he’d associated in his mind with adulthood for as long as he could remember, the blank canvas for the masterpiece he would make of his own biography. Instead, all summer he’d been oppressed by incessant misgivings, a paralytic foreboding sense of already knowing exactly where all the paths open before him ultimately led.

It was on a day when this foreboding weighed on him with a runaway self-feeding intensity that Tom determined to go for his customary run despite the forecasted rain. By the time he’d made it all the way to Metea and more than halfway back, finding himself near the entrance of the remarkable but long unremarked tree-lined driveway, after having been all the while in a blind trance more like a dreamless sleep than the meditative nirvana he’d been counting on, he was having hitches in his breath, as if he were on the verge of breaking into sobs. The sky had gone from that rich glassy blue that heralds early fall to an oppressively overcast gray more reminiscent of deepest summer, the air ominously swelling with a heavy pressurized dankness that crowded out all the oxygen and clung to Tom’s chilled and sweat-drenched shoulders in a way that made him feel as though the skin there had been perforated to allow his watery innards to seep upward in a hopeless effort to evaporate. It was a sensation indistinguishable from his thwarted urge to escape this body of his he knew too well, along with the world and everything in it.

Turning into the drive, he maintained his stride and continued running until he was about halfway up the rise, where he surprised himself by pulling back against the rolling momentum of his legs and feet, tamping down the charged fury of his pace, until his numbly agitated legs were carrying him along with harshly chastened steps. “As I walked up to the house, I wondered what the story of the people who’d lived there was. And, seeing how rundown everything was, how the weeds were growing up all over the place, you know, it was just like, what does it even matter at this point what happened here? I had actually thought about showing the house to some of my friends, like one of those old spooky houses you’re fascinated with when you’re a kid. But, I don’t know, somehow it got folded into my mood, and all I saw was a place someone had probably loved that had gone to seed.”

Having lost all interest in further exploring the place, Tom was gearing up to start running again once he reached the end of the driveway. But just when he was about to turn back something caught his eye. “I remember telling Ashley about it as we were walking along the bridge over I-69 because she gave me this weird look. See, I explained that the flowers looked like some you see all the time in late summer and early fall. But whenever I’d seen them before they were always purple. These ones, I told her, were ‘as blue as the virgin’s veil.’ We’re both pretty anti-religion, so she thought the comparison was a bit suspicious. I had to assure her it was just an expression, that I wasn’t lapsing back into my childhood Catholicism or anything. To this day, I have no idea why that particular image popped into my head.”

Sure enough, when Tom returned that day years later with Ashley, the tree-lined drive, the graying house, and the blue wildflowers were all there just as he remembered them. Ashley recognized the species at a glance: “They’re actually called blue lobelias, but you’re right—they’re usually much closer to purple than blue.” Tom went on to recount how when he’d first discovered these three clusters near the head of the driveway he’d been feeling as though his whole body, his whole life, had somehow turned into a rickety old husk he had to drag around every waking hour. Setting out for his run that day, he’d experienced an upwelling of his longing to break out of it—to free himself. The heaviness of the atmosphere and the sight of the house only made it worse though. He’d felt like he was suffocating. When he saw the lobelias, it was at first simply a matter of thinking they looked unusual. But after squatting down for a closer look he stood up and took this gulping breath deep into the lowermost regions of his lungs.

“It was like I was drinking something in, like I no longer needed to escape because my body was being reinvigorated. All that dead weight was coming back to life. The change was so abrupt—I couldn’t have been in that yard for more than a few minutes—but I ended up running the rest of the way home with the cleared head I’d set out for in the first place. Even more than that, though, I experienced a sense of renewal that brought me out of the funk I’d been sunk in for weeks. It was only a few weeks later that I started working at the restaurant.

“Oh, and I can’t leave out how the rain began to fall just as I was within a hundred feet of the driveway at my own house.”

 ***

It was after conducting the interview about the old house on Union Chapel that I first started thinking about pulling out of Marcus’s haunted house business. Was I doing all this work for a story about a house that wouldn’t even be there six months from now? As far as I knew, construction had already begun on a junction connecting Union Chapel with I-69. Hadn’t Marcus thought to consult with any locals about the location? And how would using the story for some other location affect the “authenticity” he was so concerned with? But the bigger issue was that, while I didn’t yet know the whole story, I was getting the impression it wasn’t really over—and now I was smack-dab in the middle of it. Looking Marcus up on LinkedIn again, I found that since the time he’d first contacted me, which was apparently only about a month after he moved to Fort Wayne from Terre Haute, he’d opened, ironically enough, his own coffee shop on Wells Street.

Back when he’d told me about the money he had saved up for his business venture, I imagined him sitting on a big pile of cash which would allow him to devote all his time to it. Now I was thinking that the whole endeavor, for all his high-wattage salesmanship, could only be a measly side project of his. Yet I was getting all these emails urging me to hurry up and get the story ready to send to all the people who’d already signed up for the first outing the weekend before Halloween. And then there were all the questions surrounding Tom and how he’d ended up coming into Marcus’s ambit. Tom had given us his consent to use his story, his sharing of which I had interpreted as an attempt at unburdening himself, to make of me and Marcus his surrogate confessors. But now I was no longer sure that what Marcus and I were doing met the strictest Capitalism 2.0 standards. I kept asking myself as I listened to the recordings, am I making myself complicit in some kind of exploitation? Should I be doing something more to help Tom—instead of trying to make money off of him?

 ***

In the days after Ashley broke up with Tom, as he settled into the new apartment by himself, he went nearly mad from lack of sleep. No sooner would he lie down and close his eyes than he would be wracked with jealous anxiety powered by images of Ashley with myriad other men he couldn’t help suspecting were the real motivation behind her decision to abandon him. The searing blade of his kicked-in rib, which flared up as if trying to tear free of his body whenever he lay flat or attempted to roll from one side to the other, robbed him of what little of the night’s sleep was left after the jealous heartbreak had taken its share. Desperate, he contacted an old friend from Munchies who came through for him with some mind-bogglingly potent weed. Over the next week, Tom managed to get plenty of sleep, and mysteriously managed as well to gain seven pounds.

One Sunday, Tom heard a knock on his apartment door, which meant one of his neighbors from the three other apartments in the house wanted something. When he opened the door to a tall, very dark-complexioned black man, he couldn’t conceal his surprise. “It’s okay my friend,” the visitor said with a heavy-consonanted accent Tom couldn’t place. “I’m Sara’s boyfriend—from across the hall.” Tom introduced himself and offered his hand. The man’s name was Luca or Lucas, and Tom would later learn that he’d come from the Dominican Republic or Haiti as a high school student, a move that was undertaken under the auspices of the Fort Wayne diocese of the Catholic church. “I noticed the smell of marijuana coming from your apartment the other night and I was wondering if you’d be willing to sell me a small amount.”

Tom gave Lucas all the weed he still had gratis. He would recount to his landlord two days later the story of how he’d been startled by the big black guy knocking on his door (without of course revealing the reason behind the visit), and in return hear what little the landlord knew about him, but, aside from that, he thought nothing further about it—until the following Sunday when he heard another knock on his door. Impatient, Tom opened the door prepared to explain that he had no more weed and didn’t know Lucas well enough to give him his source’s contact information. But Lucas barely let him open the door fully before thrusting a small green pouch into his chest. “Just a little thank you my friend. You’ll only get a couple of hits from that, but trust me. Save it for when you have no work in the morning.” Tom was ready to refuse the gift, but Lucas withdrew his hand and rushed downstairs and out the front door of the house so quickly all he had time to shout after him was thanks. It turned out Lucas and Sara had just broken up. That was the last Tom ever saw of him.

           ***

            Marcus’s coffee shop was smaller and more dimly lit than Old Crown, but it had a certain undeniable charm. When I got there at a little after four in the afternoon, the place was completely empty except for the two young women working the counter. I asked the taller of the two if Marcus was around, but she responded by casting a worried look at her coworker. They both looked to be in their early to mid-twenties, and they were both strikingly attractive in a breezily unkempt sort of way. The shorter one seemed the sharper of the two, exuding a type of evaluating authority, sizing me up, silently challenging me to convince her I was worth the moment of her attention I had requested. Most restaurants and shops like this have a matriarch or two who are counted on to really run the place, the all-but-invariably male general manager’s official title notwithstanding. I guessed I was looking at just such a matriarch—and I suspected she might be romantically linked to Marcus too, an intuition I couldn’t rationally support, except perhaps by pointing to her seeming defensiveness at mention of his name. “No, he’s not in,” she said. “Is there anything we can help you with?”

            I told her I was a friend and had learned about the coffee shop from Marcus’s LinkedIn profile, but leaving a message or even giving my name would be unnecessary. “I’m sure I’ll be back sometime.” I quickly scanned the chalkboard menu on the wall behind her for something to order, hoping to distract her enough to ward off any further questions. “Could I have a pumpkin spice latte please?” I browsed around while the taller barista made my latte and the shorter one returned to what she’d been doing before I arrived—it looked like she was reading from something concealed behind the counter, a book or a magazine perhaps. I constructed an image of her and a sense of her bearing over the course of several nonchalant sweeps of my eyes. She had what the guys in my circle call a monster face: rounded cheeks that lift high on her smoothly protrusive cheekbones when she smiles, pushing her outsize eyes into squinting crescents, a tiny chin topped by an amazingly expandable and dynamic press of lips. As pretty as she was, her face, especially when she smiled, bore the slightest resemblance to the Grinch’s. All in all, she seemed to have a big and formidable personality animating her small, even dainty body. I could see why Marcus would like her.

            I took my latte over to a couch and set it on a coffee table so I could remove my laptop from my backpack and do some editing for the writing project you’re currently reading. Taking a moment to acknowledge the pumpkin spice’s justifiably much-touted powers of evocation, I scanned line after line, simultaneously wondering how much Cute Monster Face knew about Marcus’s side venture. I realized that I couldn’t enquire after it though because I was worried about the legal ramifications if Tom’s deed came to light. And that realization transformed quickly into frustration with Marcus for getting me involved without properly disclosing the crucial details of what I was getting involved in. After finishing, as I stepped outside onto the sidewalk running along Wells Street, I considered going back in and penning a message:

Marcus,

I don’t appreciate you getting me tangled up in your shady operation. I’m out.

Jim

But I decided against it because I thought I should be fair and wait until I knew the rest of the story. It would be pretty low of me to leave him high-and-dry this close to the event. And, I admit it, I was just too damn curious about what had happened to Tom, and too damn curious as well about how Marcus’s endeavor would pan out. I went home and prepared for what would be my second-to-last interview with Tom.

 ***

“I’ve heard so many times about how I supposedly like to bully people,” Tom was telling me. “But I’ll never forget the feeling of my knuckles hacking into the flesh right behind that guy’s jaw. It all happened so fast with the two guys outside Henry’s, like it was over before I knew what was happening. With this guy, though—I’d been thinking about Ashley, about how much she misunderstood me, and how she of all people should’ve been able to see past what on the surface probably did look like bullying. I admit it. But that’s not what it was. That’s not ever what it was. Then I started wondering who she might be with at that same moment—I couldn’t help it. You know, your mind just sort of goes there no matter how hard you try to rein it in. I had all this rage surging up. All the streetlamps are leaving these slow-motion trails, my heart is banging on my ribs like some rabid ape trying to break out of its damn cage, and I keep closing my eyes—and it’s happening right in front of me. Ashley and some guy. Then I smelled somebody’s menthol cigarette—that’s when I opened my eyes.”

            It had begun one evening in June while Tom was out for one of his nightly walks in West Central. What he found most soothing about these listless meanderings of his was being able to look straight up into the sky and see nothing but endless blue, an incomprehensible vastness the mere recognition of which created a sensation like an upward pull, as if the sheer immensity of the emptiness inhered with its own vacuuming force to counter the gravity of the solid earth. He liked going out when the drop in temperature could be most dramatically felt, when the vibrant azure of the day faded before his eyes to the darker, richer, fathomless hues of twilight—the airy insubstantial sea of white-cast blue drawn out through the gracelessly mended seam of the western horizon, a submerged wound which never heals and daily reopens to spill its irradiated blood into the coursing streams of air as they seep out over the edge of the world. Tom, stopping at the overlook each day for weeks to watch the molten pinks and oranges and startling crimsons bleed away the day’s final residue, was always surprised to be the only one in attendance.

            But one night as he stood resting his elbows on the concrete railing he heard someone addressing him. “You guna jump?” Tom looked back over his shoulder without bothering to stand up from his leaning position and saw a heavy-set man with a bushy mess of a goatee waiting for an answer on the sidewalk behind him. He was shortish and had the look of a walking barrel. “He looked,” in Tom’s words, “quite a bit like Tank Abbott from some of the earlier UFCs.” Seeing that no one else was around for the man to be addressing, Tom turned and said, “Probably not today.”

            “Oh, well, I think you should,” the man said, taking a couple of steps forward. He had a backpack slung over one shoulder and looked to be returning from work somewhere on the other side of the river.

            Tom was reticent but couldn’t help asking, “Yeah, why is that?”

            “Well, you’re probably thinking maybe things will be better tomorrow. But it doesn’t work that way as far as I know. Tomorrow, you’ll still be just as much of a faggot.”

            Tom leaned back, resting both elbows on the rail, and leveled a steady glare at the man as he affected the profoundest boredom. The walking barrel responded by leaning slightly to the side and spitting on the ground between them. Then he continued walking down Thieme, leaving Tom free to wonder what had prompted the insult. His first thought was that this man might know one or both of the two guys he’d beat up that night after leaving Henry’s. But if that were the case he probably would’ve gone further than calling him a faggot. Tom figured it was just one of those things, and set to trying to push it out of his mind. He’d noted the knife clipped to the outside of the man’s right pants pocket, and he reasoned that since he was alone and hence not putting on a show for anyone, his goad must have been a genuine invitation to fight. Tom would have to watch out for him in the future.

            But, as was his wont, Tom had gone into one of his walking trances some minutes before he encountered the barrel a second time. It was later that same week, which made Tom realize afterward that he could probably avoid further encounters by heading out for his walks at a different time of night. This second meeting had them crossing each other’s paths as they headed in opposite directions on the sidewalk along Thieme. “What’s up, faggot?” the barrel called out in greeting. Emerging instantly from his trance, Tom glared back at him. When they were face to face, within striking distance, the man jerked forward with his shoulders, trying to trick Tom into reacting as if he were lunging at him. Tom indeed turned sideways into a subtle crouch, flinching, which produced a broad self-amused grin on the barrel’s scraggly, egg-shaped face. “See you tomorrow, faggot,” he said, continuing past.

The barrel thus began haunting Tom while he was still very much alive, in the way that all men are haunted by insults they fail to, or choose not to, redress. His strategy of avoidance felt like a breed of forced effacement, a step toward submission. He was going out later, missing the crepuscular displays over the treetops on the facing side of the Saint Mary’s, returning to his apartment just before going to bed, aggravating his already too wakeful condition. Every time he rounded the corner from Wayne Street onto Thieme Drive, Tom felt like he was stepping onto a sidewalk in a completely different city. Some evenings, he could even close his eyes and reopen them on an entirely different world. Behind the line of trees across the street, there was a type of void, a pressurized humming emptiness hovering over the imperceptibly slow-coursing river separating his neighborhood from the motley houses beyond the opposite bank. You could often hear the throb of bass in the distance, faraway rhythmical percussions made to seem primitive or otherworldly, catch whiffs of smoke from far-off bonfires, or pick up the tail end of some couple’s shouting match. It all resonated through that strange hollow space framed by the trees on each facing bank, making it seem somehow closer and at the same time farther away.

            The old sycamores and random oaks along the walk, compared to all the other new-growth trees in the city, seemed prehistorically gargantuan, their branches reaching up like monstrous undulating tendrils toward the firmament of a world in which no human rightly belongs, at least not one whose flesh has been scoured under gas-heated water, who dons fabrics composed of meticulously complex fibers weekly cleansed in a chemically stewed machine vortex. Bats lurched and dived to snatch unseen prey, crowding the air with hectic, predatory pursuits which mirrored and amplified the groping chaotic alarm of Tom’s most desperately savage thoughts. Into July, he was becoming increasingly frazzled and gnawed at, beset from all sides by silent curses and unvoiced hatreds.

            Then one Friday he saw Ashley at Henry’s with a group of people he didn’t know, a group which included no less than three men who each might’ve been the one who served as his replacement. They played polite, but Tom told the friends he’d arrived with he was feeling ill and excused himself. Walking back to his apartment, he felt his soul building up the volatile force to explode out of his body through a howling roar whose rippling shockwave would shatter every building and house for miles around him, wipe the slate of the earth clean of this taint permeating his existence down to each individual blood cell and neuron. Up the stairs and through his door, he collapsed to his hands and knees, sowing half-imaginary, half-planned destruction on every object his eyes lit upon: the couch that used to be hers, the picture she’d given him as a birthday gift, the old-fashioned TV she used to tease him about. The wooden shelf beside the door, where he dropped his keys and his wallet when returning home—the middle shelf he never used, with forgotten odds and ends, and the little green pouch his friend from the island of Hispaniola had given him as thanks.

 ***

            “I don’t know if it was just the state I was in before I smoked it, or if the shit was laced with something,” Tom said. By the time he was taking the second of three hits, he knew he had to get outside. “I was queasy, claustrophobic. Every time I moved my eyes the light and colors would streak. I thought I should sit still and try not to move, but it was like my skin was on fire. Now it’s hard to remember what actually happened—I don’t remember leaving the house, or where I went at first. I remember turning onto Thieme and stepping into this alien world, this jungle hell that shouted back in flames at every shouting thought in my head. And I wanted to burn. I wanted to be flayed. I needed something to sever my mind from the fucking sinkhole it was trapped in—no matter where I went I was still back in Henry’s, in her apartment, in some guy’s apartment. No matter where I went Ashley was fucking some guy right in front of me.

            “I smelled his cigarette before I saw him. He probably called me a faggot, but if he did I never heard it. I turned and saw him sitting there on that step. Though all I could really see was the orange light of his cigarette. I think I would’ve just walked past him if my eyes hadn’t locked on it, glowing, swaying, bouncing along with the words I never heard. I must’ve walked right up to him. Then he lunged, thrusting his face at me like he’d done before, trying to get me to flinch. Only I didn’t see his face. I saw something more like a bat’s face, something like an African tribal mask. I saw something with blue and red teeth like broken shards of stained glass trying to bite me, to devour me. I was so charged up with rage, and now with fear, and he was coming up from his sitting position. I came down with a right cross, my whole body twisting, all my weight. I probably smashed his jaw with that first punch. But what I saw was this bat’s face with the shards of stained glass for teeth still trying to bite my face off.

            “I remember a lot of pulling and dragging. And I remember pummeling him in the middle of the street, bouncing his head off the pavement. I kept at it because I was sure somehow that I couldn’t really hurt him. I thought I heard him cackling even after his body had gone limp. It made me think it was all part of some trick he was playing on me—and it was infuriating. When my mind first started to clear, when I looked down and saw the guy—the walking barrel—with his face staved in, we were near the overlook. I knew he was dead. He had to be. But then I heard the fucking cackling again—and it scared the hell out of me. I dragged him to the bank and rolled him down, staying long enough to see his body come to rest at the base of the monument just on the edge of the retaining wall. And I ran.”

 ***

            The lobelias were the last piece of the puzzle—or maybe the second to last. Tom had no faith in the law to expiate his sin. He needed to enact some form of penance, and he believed his demon-haunted dreams were guiding him toward it. “I drove my car over there in the middle of the afternoon, in broad fucking daylight. I figured the whole point was to sort of bring what I had done to light, so if someone saw me and called the police, well, so be it. I popped open the hatchback, took a breath, and went down the bank. I was still holding out the hope that I had hallucinated the whole thing. But there he was, rotting, covered in maggots and swirling flies. But you could still see what I’d done, how I’d smashed in his face. I’m afraid that I’ll be seeing that face every time I close my eyes until the day I die. And the smell… I can’t even look at meat anymore. The smell of fish makes me retch.”

            Tom overmastered his repulsion, leaned down over the corpse, and returned to an upright position with it clasped to his chest, surprised by its lightness, by how much flesh had already rotted away. He hoisted one of the perfectly supple dead arms up and twisted under it. He gripped the wrist of the arm thus draped over his shoulder and began the trudge back up the bank. Midway into the climb, the body no longer seemed so light, and his recently mended rib began to prick again. Sweating, panting, wincing against the pain in his side and the stench of putrescence and shit, Tom crested the bank, paused for a breath, and continued toward the passenger side of his car—having decided against using the hatch. He fumbled with the door, and then made a ramp of his body down which the dead one slid into the passenger seat. After making some final adjustments, he belted in the proof of his crime—upright for anyone to see who cared to look.

            Tom walked back to close the hatch, stood for a moment considering the gore spattered and smeared all over his clothes, and then went to the driver’s side door, got in, started the car, and drove away. No one saw him. He made the twenty-minute drive to the north side of town with the raw meat of the walking barrel’s face variously propped and bouncing against the window across from him, made the drive without incident. “I kept having to reach over and keep it from pitching forward, or from rolling over onto me. I thought for sure it was only a matter of time until I heard sirens. But I just kept driving, putting it in fate’s hands. The risk was part of the punishment I guess. Even when I pulled off to the side of the road to vomit, though, no one seemed to care much. I made it all the way to Union Chapel and into the tree-lined driveway without noticing anyone even looking at us funny.”

 ***

            I met Tom the week after he’d buried the walking barrel’s body beside the cluster of blue lobelias at the head of the driveway to the old abandoned house on Union Chapel. I tried to work out the timeline. I’d met Marcus at Old Crown only days after Tom had moved the body. How the hell had Marcus even known any of this was going on? Did Tom have a confidant, another confessor, a mutual acquaintance of Marcus’s? I had by now settled on a policy of giving Marcus the benefit of the doubt and proceeding with the project—at least until I had all the information and could tell for sure whether something was amiss. I wonder how many ongoing crimes throughout history have sustained themselves on just this type of moment-by-moment justification. I wrote Tom’s story as if it were completed, even though I had arranged another interview. I tried to heighten all the ghostly elements, considered suggesting the lobelias at the house had been seen glowing, maybe throw in some sightings of a spirit wandering around amid a cloud of flies, the face stove in. But the leap from fresh wound to fun game was difficult to make, reminding me that all ghost stories begin with personal tragedies.

            Then the final interview: Tom has lost his job at EntSol; the Car-Ride of Horror has failed to quell his hallucinations; he’s seeing the bat-featured, tribal mask face with its mouth full of broken stained glass hovering outside the windows of his apartment; and he’s returned to the house on Union Chapel at least once—to dig up the body he’d buried there, sling it over his shoulder again, and carry it down to the end of the drive and back as further penance. When he told me he took the knife that was still clipped inside the guy’s pocket and used it to carve and dismember the body before reburying it, I wasn’t sure I should believe him. It was too much. Or maybe I just didn’t want to believe it. I’d already written up the story and sent it as a PDF to Marcus, who in turn had sent it via email to the seven participants in the upcoming inaugural event. I told myself we could go through with it and concentrate on helping Tom once it was over.

Marcus had invited me to attend—or rather insisted that I do. He’d already sent me a check for the PDF.

 ***

From Tom’s description of the place, I’d imagined a desolate beige field with nothing but the pine tree borders marking off the old yard, but there was in fact a stand of aged trees in the back providing enough foliage to encircle the fire pit we constructed there. Marcus’s idea was to let everyone settle in and get cozy until it got dark, when we’d walk as a group up to the spot where the earth was disturbed (what an expression!) and the lobelias bloomed. Once we were all gathered there, I would tell the story, and then everyone would return to the safety and warmth of the fire. And it had all actually gone quite well—I was pleased with my somewhat improvised oral rendition of the tale—but I was getting nervous.

A few of the clients were already there when I arrived, and I hadn’t been afforded any opportunity to pull Marcus aside and put my questions to him. Then the two women I’d seen working the counter at his coffee shop showed up. They were introduced simply as helpers for some of the activities planned for later in the night, and they gave no indication of recognizing me. Still, I figured Marcus must know I’d been checking up on him. After the tale-telling, as we sat circling the fire, strange noises began coming from inside the old house. I confess, my nervousness had less to do with how Marcus might respond to my insubordination or the impropriety of profiting from the ongoing suffering of an imperiled man—and more to do with the likelihood that Marcus had arranged for some hokey theatrics to ensure a memorable experience for his clients so he could count on his precious word-of-mouth endorsements. Whatever happened to not being cheap?

There ended up being six clients, two couples and a single of each gender, so with Marcus, me, and the two baristas, we made up a camp of ten, all crowded in a half-circle around the impressive fire Marcus had prudently prepared to keep well fueled long into the night. Now that the principal specter’s story had been rehearsed (hearse?), the ascendancy of the walking barrel’s ghost established, I was turned to for more stories to, as it were, get the ball rolling. I told the one about the house in Garrett where men frequently see a gorgeous young woman standing in a second-story window, holding a candle and beckoning to them; as they stand there struck dumb by her spectral beauty, she transforms by imperceptible increments into first an old hag and then finally into an ashen-faced demon. Next, I extemporized a story about a group of boys from Carroll High School who made a game out of prowling through The Bicentennial Woods, a nature preserve up off Shoaff Road, waylaying hikers and ritualistically torturing them to death in the old farmhouse that still stands in the field behind the park; when one boy’s conscience got to him, he told a football player about the murders; the football player in turn stalked the killers and picked them off one by one—but of course they still haunt the woods and are thought to be behind occasional disappearances.

Passing the baton to one of the clients—who told the story of the witch in Devil’s Hollow—I glanced over at Marcus and saw that his approving smile was more bemused than amused. His preoccupation increased my worry that he was going to try and pull off some kind of stupid stunt. This was only the second time I’d seen the man in person, and my second impression was starkly different from the first. That smile that contended with the sun was all but impossible to imagine now. Even in his physical dimensions, he looked diminished. There were parts of this story I was not privy to, I was sure, and my mind couldn’t help trying to fill in the gaps. But I also couldn’t help trying to think of more ghost stories in case I was called upon to supply them. It occurred to me then that were I inventing Tom’s story, as opposed to reporting it, I would have him return to hang himself from a rafter in the house—that, or something like it, needed to happen for the story to come full circle, for the seed to be properly sown, for the haunting to be thoroughly, um, haunty. It also occurred to me, as I looked over at the house—remarkably nondescript, fading white siding, boxy—past Marcus who sat in my line of sight, that he didn’t just look distracted; he looked a little frightened. I supposed he too might be nervous about how the event would turn out.

I couldn’t help mulling over the question he’d begun his pitch with, the one I assumed he began all his pitches with: Why do people really get scared? I thought of all the normal stuff, public speaking, plane crashes, murders, shark attacks. Then I tried to think of the actual statistics on what people should be the most afraid of—heart disease, cancer, car accidents, or, hell, lifelong loneliness and disappointment. But the thing is, when you start thinking about what people fear—what you personally fear—it’s hard to separate thoughts of real dangers from feelings about deserts, as if we simply can’t imagine coming to any end other than the one we most deserve, which is probably why most people don’t believe they’ll ever die, not really, not in the sense of utterly ceasing to exist. So I was thinking of how I probably would die—heart disease, car crash—but then I started wondering how I might deserve to die. My biggest sin is working with this guy, I thought, making money from murder and exploitation, even if my complicity is only indirect. And that’s kind of the sin, isn’t it? The one we’re all complicit in to one degree or another.

I looked around and felt walled off from the intimate little gathering with each of its individual mystified gazes forming a spoke radiating outward from the hub of the stone-lined fire pit, all of our clients basking in the lively orange radiance of the bonfire, sharing in that nostalgic storytelling atmosphere we designed the scene to evoke—a wheeling symbol of death and rebirth, seasons and ages. But for me the fire gradually resolved into an image of a street protest rendered in orange and yellow light, the raving throngs arhythmically bouncing with their hands thrust up, clamoring for recognition, reaching, yearning, wildly stretching their coiled arms upward as if to lay hold of divine justice and rip it from the sky, sparks and embers variously lolling or sashaying or darting up into the overabundance of moribund leaves, or lifting along the flue of the clearing up over the canopy, taking their leave of our little half-circle altogether, rising up to the black star-specked heavens like so many spent prayers. I could almost hear the protesters’ shouts as I overlaid the flames with images I’d seen from Egypt, Pakistan, Libya. I thought of a Bangladeshi clothing factory flaring up into a surging conflagration that was its own symbolic prefiguring of the outrage it would go on to ignite. Whole regions of the world seething and simmering with the bounding rage so eloquent of critical volatility in every language on this disturbed earth of ours—we are here, we want our share too, we won’t be the worshipful crowd at the rock concert of western civilization’s slave-making sabbath—if you don’t acknowledge us we’ll make you watch as we wrap these tangled masses of coiling arms around everything you love.

And press a button.

Then, returning to the question of how any of us deserves to die, I imagined Tom stepping out from under the trees with an axe or a machete, running up and hacking Marcus and me to pieces and then throwing them one at a time into the fire like logs and kindling for everyone to watch sizzle and ooze and blacken—how’s that for an experience to pass on by word of mouth? But now my attention was being brought back to reality by a female voice commanding more casual authority than those of any of the males who’d been yammering on, including, I suspected, my own.

“This story began at a house a lot like this one but in a smaller town, not far from here. This house had a ghost story to go with it too, a story that went back for at least a generation to a time when a family lived in it. The story was that the husband started hearing voices telling him that his wife was unfaithful. The man couldn’t believe this about his wife. He realized that he must be going crazy, so he struggled to think of the voices as nothing but hallucinations. But then one night the wife was late getting home, and in addition to the voices the husband saw images of his wife with another man. When she got home, he attacked her before she could explain. He ended up beating her to death right in front of their five-year-old daughter. Then the man got a kitchen knife and slit his wrists. When the police arrived, they were all three dead, the five-year-old having died of fright. According to the locals, some nights you can still hear the little girl screaming.”

It was one of Marcus’s baristas telling the story, the one I’ve been referring to as Monster Face, though in the underglow of the flames she looked more leonine than monstrous. She was leveling a steady, placid-faced glare high into the fire as she spoke. I thought I could glean an edge of intensity to her words, despite the slowness with which she rolled them out. Her storytelling was both playful and deliberate—and somehow, I thought, malicious in that deliberateness. It was a unique, uniquely powerful performance, worthy of Cannes.

“That was the original story,” she went on. “The second chapter would lend some ironic symmetry to the developing legend surrounding the old house. What happened was some kids who’d grown up hearing the story of the screaming girl started visiting the house every year around Halloween. Of course, by then everyone was saying that the man who’d beaten his wife to death wasn’t really insane, wasn’t schizophrenic or anything like that, but that it was a demon who’d spoken to him and shown him those images, a demon who still haunted the place. The kids were terrified—but they loved it. By the time they were in high school and had their driver’s licenses, they were starting what would be a yearly tradition of camping in the yard outside the house the weekend before Halloween.”

I felt my mouth fall open as I involuntarily turned my eyes back toward Marcus, who I realized was the focal point through the flames for the barista’s own gaze. His eyes were bulging open, fixed on hers, and he couldn’t have been breathing because there was pressure building up in his neck and behind the strained flesh of his face. Her voice pulled my eyes back in her direction.

“There was a core group of three, two men and one woman. One of the men was a sports jock, a smooth talker, the type who has all kinds of luck with the ladies. Not long after they all three graduated high school, he and the woman fell in love and started planning on getting married. About that time, the other man, the one with all the business sense, came up with a plan for how they could actually make money with their haunted house camping trips. They started bringing other people with them, started hosting the trips not just once a season, but every weekend in October. They even started arranging trips to other houses associated with ghost stories.

“Unfortunately, the smooth-talking jock went on one too many forays into that demon-haunted house to prove his mettle. He became convinced that his fiancée was screwing around with his best friend, the guy with the business sense. He became convinced too that they were planning on pushing him out of the operation and keeping all that money to themselves. One night in October, he showed up for the first outing of the season—and he thought he saw them murmuring to each other, with their faces a little too close together. He flew into a rage and murdered them both. Then he hacked them to pieces and buried them in assorted spots surrounding the house. And then he hosted the night’s guests as if nothing had happened. But afterward, just to be safe, he left town, planning to lie low for a while, maybe get started getting a foot in the door for his business in other nearby cities. The only problem was it wasn’t just business sense he was lacking.”

Marcus was standing. There was movement all about the fire, people standing and backing away, running—but I could hardly take my eyes off of our storyteller. “Does that story resonate, Marcus?” she said, grinning her evilest Grinch’s grin. Hearing him grunt, I turned to see him with someone’s arm wrapped around his neck, the upper arm forming a V with the wrist, the elbow directly under his chin. His assailant was clad head to toe in black, with a mask that erased his features, including his eyes. It was a moment before I realized there were in fact two such black-clad figures wrestling Marcus to the ground as the choke quickly relieved him of his consciousness. I was standing now too, but the guys in masks actually diminished my panic, making me think this really was some kind of stunt after all. Stuff like this doesn’t really happen, does it? Maybe if they’d thrown a black hood over his head.

Amid the bustle and shouting, I hadn’t noticed the other barista, the taller one, the minion, circling behind me. “Oh, you don’t need to get up, Mr. Conway.” She pressed something into my kidney, preventing me from turning around. Was she holding me at gunpoint? It was so preposterous I almost burst out laughing. But then I saw Monster Face glancing back and forth between me and the men dragging Marcus toward the trees. Seeing her now made what was happening seem much more serious—nobody was that good of an actor. “Have a seat please,” the minion said. I sat back down on the sectioned tree trunk that had been my chair. She pulled my hands behind me and bound them with what I guessed was a plastic tie-wrap, and then bound the tie-wrap to another she’d threaded through an eye-bolt screwed into the log. I remember saying to myself, “Jim, this is really happening. You shouldn’t have let her do that.”  

The tree they bound Marcus to was directly behind me, so I had to crane my neck to see any of what was happening to him. Monster Face’s minion was standing behind me the whole time too, occluding my view. I think Marcus’s back was against the trunk, his hands bound behind it with the same kind of cable wraps as mine. Monster Face slapped him on the cheek a few times to bring him to, but he must have had something stuffed in his mouth because aside from sounds of a sort more apt to emerge from a nose I heard little more from him that night. It was the authoritative, matriarchal barista’s voice I heard—and I heard every last fucking word she uttered.

“Sorry, Marcus. But you’re running this little venture into the ground. I’ve decided it’s time for some new management.” She stepped into view beside me. “You did do one thing right, though, I have to say. Mr. Conway here is quite the artistic man of letters. And a good interviewer too. Believe me,” she addressed me now, “I know how hard it can be to get a straight answer out of Tom.”

Staring at her, I dumbly repeated, “Tom.” My imagination was overwhelmed with the task of going back to all those scenes Tom had described with her—filling in the lineaments of her face, the texture of her voice.

“Oh, now I see you’ve figured it out. Wow. That’s a little disappointing. I have to say with your vaunted intellect and all I figured you had it all worked out by the time you came to the coffee shop. How did you think Marcus even knew about Tom in the first place?” She squatted down beside me. “I know you’re worried about him, Jim. But you shouldn’t be. I’ve got some special plans for him. For both of you.

“Now, Mr. Friedman,” she said, standing up and moving to a position where I could no longer see her. “To the question you asked me the first time you told me about your little business. You want to know what scares people? It ain’t the paranormal or otherworldly stuff that scares them—tell them a place is haunted and they’ll be lining up, as you and I both well know. And that’s because what really scares them is their own tiny, pointless existences, the realization that in the great flood of humanity they have nothing special to offer, nothing special to say. Here and gone in a blink and you don’t even leave a mark—not so much as a residue. So they all want to be able to say—they’re fucking desperate to be able to say, ‘You think you know?—well, you don’t know shit. Let me tell you what I’ve seen. Let me tell you what I know.’

“Traumatic experiences leave a trace? Ha! What complete bullshit. You and I both know dark experiences are a commodity, a ticket to instant status. Encounters with evil? Trust me, everyone wishes they could have some. Murder doesn’t scare people. What scares people is the thought that they might not have it in them. Oh, Tom is plenty tore up about killing that tubby piece of trash.” I could tell from her voice that she’d turned and was now addressing Marcus and me jointly. “He’ll lose sleep, bitch and moan, threaten to turn himself in. But Tom—he’s smart. He knows what this means. It means he’s gone far beyond me. It means his life is so much more… interesting. It means, for the moment at least, he wins. And I can’t have that.”

Catching her drift, I began pulling at the cable wraps for the first time, feeling them bread knife into the skin of my wrists.

“Mr. Conway, please don’t scream. I see no need to gag you, as you’re just here to be a witness and chronicler.”

“Tom?” I said. “Why do you even still care about Tom? You broke up with him. Why didn’t he tell me you were still talking to him?”

“Because I told him not to. I broke up with him because I thought he was tame. Christ, you’re all so tame these days. And if they’re not tame they’re just pathetic idiots with no more sense or dignity than fucking gorillas. Tom thought I left him because he argued too much. The truth is, I’d never felt as exhilarated as I did that night when he beat the crap out of those two morons. It was like I was having this revelation, this breaking through to a new type of life—and then instead of sharing it with me he starts going off, telling me how reckless I was, how I could have gotten us killed, or worse. I just looked at him and thought—how can you not feel this? How can I be with someone who’s not capable of having an experience like this? I actually remember thinking, if he doesn’t slap me I have to leave him.”

She came back and squatted down beside me again. “You think coming back here to hack up that body makes him crazy?” she asked. “Well, Mr. Conway, I’ve looked into his eyes, and he looked right back into mine in a way he never could before. And you know what I saw there for the first time? I saw blood. I saw the blood in his eye and it made me feel my own blood pumping in my veins. Oh, don’t worry, Mr. Conway. I know exactly what Tom needs. Tom’s going to be just fine.”

  She went back to Marcus, saying, “Now, Mr. Moneybags.” I heard grunting and scraping, Marcus thrashing about against the tree. “Oh, I know. I know. I’m so horrible, so evil. Whatever. All I really am is honest with myself in a way almost no other woman is willing to be. And just so you know—I really do have a conscience. That’s why I chose you.”

I spent the next few hours squeezing my eyes shut, listening to the sounds of a slender young woman of about five foot four experimenting with methods for killing a large muscular man bound to a tree, with her bare hands. I kept thinking one of the fleeing guests must’ve called the police by now, kept thinking I should be hearing sirens any minute. But she’d obviously planned this whole thing out meticulously. When she finally cut my cable tie, I was shivering, my wrists were dripping blood, my legs were so achy I could barely stand—and Marcus half-sat, half-lay against the tree, lifeless.

I shambled toward the front of the house, past the blue lobelias, heading for my car down by the entrance to the drive on Union Chapel. Just as I was rounding the front toward the driver’s side, I heard Monster Face shouting from beside the house. “I don’t need to tell you it would be bad for you if you went to the police. Don’t worry, though, I’ll be good to you. I’ll be in touch. We’ve got a lot of work to do.” I watched her turn back toward the backyard and the tree.

I opened the door of my car, but just as I was about to lower myself in on my wobbly legs, I heard her shouting again, “Oh yeah, and Jim—don’t fucking forget to change the names before you post this story.”  

Next, read:

CANNONBALL: A HALLOWEEN STORY


Encounters, Inc. Part 1

Aspiring author Jim Conway is recruited by entrepreneur Marcus Friedman to craft stories about the haunted houses his business arranges tours of for Halloween. Jim’s first assignment turns out a bit differently from what everyone expected.

            When it first occurred to me that I was getting in deep myself, that I too might be culpable, I was listening to a recording of Tom describing one of his nightmares—Tom, whom I knew to be a murderer. The job I’d been hired to do consisted of turning the raw material of all these recordings into something that was part literary thriller and part content marketing, something in between a book and a brochure. I don’t have any of the recordings anymore because I made a point of destroying any and all physical evidence of my association with Marcus Friedman. But I’ve held on to all the writing I did. I can always say it’s just fiction, right?

            Anyway, I was listening to Tom’s voice, turning his words into a story as well as I could—and, to be honest, I didn’t really know what I was doing—when it struck me. This guy has admitted, if not to me yet then to Marcus, that he killed a guy a few blocks from here, some guy who’d harassed him one too many times, killed him and left his body to rot at the base of the concrete overlook tucked in the bend between the Main Street bridge over the Saint Mary's River and Thieme Drive, which runs along the east bank. He’s admitted to murdering someone, and here I am interviewing him for some ill-defined marketing strategy this Marcus guy hired me to implement—and I’m not turning him in.

            The thought that I should probably at least consider diming on Tom came not from any sense that he was dangerous or evil or anything like that. On the contrary, I thought I should encourage him to confess because it seemed the only way to save him from the torments of his own mind. Turning him in might be the only way to save him from complete insanity. It was only after having that thought that I realized I could be in some trouble myself for letting things go on this long without saying anything. Maybe, just maybe, I thought, I’ll end up having to bring the whole story to light to save us both.

For whatever reason, though, I just kept on working on the story:

            …After awaking, Tom couldn’t close his eyes again, making him wonder if he’d been sleeping with them open—and how long could that state of affairs have persisted? He sat up in bed and scanned the darkened room for the disturbance that had woken him. When he lay back down, he assured himself the condition was only a momentary dream echo, but his gaze remained locked on the ceiling and the buzzing, wobbling whirl of the dusty fan blades. He hesitated before reaching up to probe his eyes with his fingers, anticipating something awful. Postponing the discovery, he pushed one leg cautiously out from under the duvet, and then the other. Finally, he folded his body up from the bed, holding his head fixed rigidly atop a neck stiff with apprehension.

            On his feet, moving forward on sturdy legs, he felt more together, leaving behind that seldom remarked feeling of vulnerability we all experience in our places of slumber. He tested the sweep of his eyes beneath the fixed-open lids. Each pass from one side to the other brought a peculiar sensation in its wake, a sort of dragging discomfort approaching the threshold of pain. Already having walked as far as the passage from his meager kitchen to the open space of his living room, he thought to try a darting glance upward, only to find it caused a strange drawing at his lower lids down to the skin atop his cheeks and a fleshy bunching up under his brows. Feeling a simultaneous poke above each eye, he halted mid-step in the corner before stepping through the bathroom doorway, quickly leveling his gaze. Now he could no longer resist examining his face with halting, trembling fingers.

            Rushing toward the mirror, he realized he had to turn back for the light switch. When he finally reached a position hovering over the sink, he felt an odd calm descend on him, as if the shock of what he was seeing with his skewered eyes in the black-flecked glass somehow shattered the surface of the dream’s deception—or as if the gruesomeness, the sheer sadistic inventiveness of the procedure, painless though it was, pushed him toward some state beyond panic. He leant in to investigate the surgically precise mechanism, composed of carved slits in the upper and lower lids of each eye, forming tracks for the tiny bars vertically impaling the delicate white sacs of fluid, preventing them from any fleshly occlusion. First came the slowly widening incision of his lips into a smile. Then the chuff of a laugh.

            “Now who would go and do such a thing?” he posed to the white-lit, echoing vacancy…

 ***

            Aside from the baroque dreams, Tom’s was your typical haunting story: he’d killed a man, and now that man’s ghost was insisting on some type of reckoning. To be fair, Tom claimed not to know for sure that the man was dead because the crime had been committed in a hallucinatory whirl of drug-induced confusion. But it wasn’t long before he determined to settle that uncertainty once and for all by clambering down the river bank to see what he’d left to bake and putrefy in the late summer sun amid the weeds growing up through all that dried mud, reeking of decay, at the base of the crumbling, graffiti-marked monument towering over the brown, insalubrious waters of the Saint Mary's, from which would continue to emanate for a couple of months that invisible miasma, redolent of rotting fish, that only the coldest winters could cleanse from the air. But before I get into any of that I have to tell you about Marcus Friedman and why he was having me write this poor guy’s story.

            Marcus found me on LinkedIn. He’d been searching for a local writer when he came across my profile. After exchanging a few emails, we ended up meeting for the first time at Old Crown, a neat little bar and coffee roaster on Anthony Blvd, one of those painted cinderblock buildings with a ceiling of exposed ductwork. I was at a quaintly light-painted wooden table, admiring a two-page spread for iPads—“The experience of a product. Will it make life better? Does it deserve to exist?”—in the previous week’s New Yorker, when I glanced up and saw this huge rugby player masquerading as a businessman making giant, energetic strides toward my small, elevated table, which was then, at an unaccustomed hour for me, set off in a dull gilded aura by the last light of the day issuing meekly through the shop’s inconspicuous row of out-of-reach windows. Surging into that light, this athlete with a Caribbean air smiled a smile that was like its own dawn competing with the gloaming preview of tomorrow’s truer version. When he reached out his hand, I was surprised to see that it was human in scale, not much larger than mine.

            “Jim Conway?” he half questioned, half insisted. “I recognize you from your LinkedIn profile. Marcus Friedman.” He was already pulling back the opposite chair by the time I could gesture toward it. “God, I love this place,” he said, turning this way and that to devour the ambiance with his eyes, all the while making these big swirling and swimming gestures. “It’s so—warm—and intimate. Like we’re wrapped up in the residue of like a thousand great conversations.”

             I had to smile at this, though I hadn’t yet been able to get out a single word.

            Manifestly responding to my smile, he said, “Ah, but here I am throwing out metaphors to the metaphor master. Well, what do you think? What metaphor would you use for this place?” From a position leaning forward with his arms on the table, he leaned back slowly, draping his right arm over the back of his chair, authoritatively opening the exchange for my contribution. This was, after all, a job interview.

            “Well, Mr. Friedman—”

            “—Marcus, please.”

            “Well, Marcus, it would depend on the context and your goals. The idea of walking into a place and sensing past experiences—good times, stimulating conversations—that’s really intriguing. But if I were writing copy for Old Crown I’d stay away from the word ‘residue.’”

            He smiled his aspiringly solar smile, bringing both hands out over the table, showing me his palms, simultaneously offering me something—recognition, praise—and claiming my entire person. I was torn between wanting to allow myself to be drawn in by his energy and charisma and wanting to throw all that smarm back in his face. And he seemed to embody a mass of similar contradictions. What I’d taken at first for some kind of knit hat was actually dreadlocks, but arranged in a way that was somehow much more businesslike than my own give-a-care gladiator cut. That Caribbean air—he looked to be of largely African ancestry, but his skin had this gilded, gleaming pallor against which my own Scotch-Irish sallowness was as dull as day-old dairy. And most of the salesmen I know don’t have deltoids that strain the seams of their blazers.

            “This place,” I began, suddenly, inexplicably inspired, “this place is an old post-industrial warehouse where the last people on earth came together to ride out the apocalypse. Only that has been so long ago now nobody even remembers. It just feels like it’s been here forever—impossible to imagine a time when there was no Old Crown. The people who come to places like this—and there’s another one pretty similar to it just up the road here on Anthony—they don’t just want a cheap cup of coffee and an occasional beer or mixed drink. They want to try new things, like beer from some small town they’ve never heard of in Germany, or coffee made from beans grown in Papua New Guinea. The reason these coffee houses were built where they are is that our community college is only about a mile and a half from here. These people are educated—and for the first time they actually care how their goods are made. Next door is a health food shop where you can get locally grown poultry and produce. This place, unassuming as it is, represents the promise of a new economy—a sort of Capitalism 2.0. Look, right here on the wall next to us we see the work of local artists. In that room back there past the bathrooms, the one with the blue walls, book clubs meet there, writers’ groups, start-up charities, you name it. It’s not a big corporate chain like Starbucks, because we like places with local flavor. Through the exotic beers and coffees and conversation, we get this tiny window into far-flung regions of the globe. But the window’s built into the wall of what’s unmistakably our own house. We’ve been here all along, surviving the ravages of a less human, more predatory economy. The battle’s not over yet—not by a longshot. But places like this are the beginning. This place doesn’t need any metaphors, Mr. Friedman—Marcus—because this place is a symbol in its own right.”

            Marcus granted me the full dazzling radiance of his too-ready smile and shook his head in faux disbelief as he brought his hands together, once, twice, three times in big sweeps of his bulked-up arms. “And you just came up with that on the spot, huh? You got a gift, Jim. God damn it, you got a gift.” To signal that the preliminaries were over and we were getting down to business, he laid his forearms parallel to each other on the table in front of him and leaned toward me. “Do you know why I like you for this job?”

            “Ha! You haven’t even told me what this job is yet. You said in your emails you needed a content marketer who understood storytelling. You said you were looking for a copywriter who wanted to write novels and short stories. My response to that is you’d probably have a harder time finding one who doesn’t.”

            “I definitely need someone who has an ear for noticing things like residue being a poor choice of word. And I definitely need someone who can write awesome stories. But what you have that everyone else I’ve talked to lacks is optimism. No offense, but most of your fellow English majors are a bunch of pinko commie, whining feminazi fucktards who think the world started off shitty and just keeps getting worse because too few people are pinko feminist fucktards like them.”

            My failure to fully stifle the eruption of a belly laugh encouraged him to proceed. As he did, I realized he must’ve spent quite a bit of time on my blog—though I’m probably more of a pinko myself than he seemed able to glean—and that the interview had progressed from the portion in which Marcus was testing me to the one in which he would pitch me his idea.

            “And why,” I asked, “is it important for you to have someone who doesn’t think we should give up on capitalism—or on letting men roam around freely without gelding them?” 

            “It’s not just that,” he said, leaning back to liberate his untamed hands. “I want someone who will be as excited about my business as I am, someone who’s not afraid of money, who doesn’t think it’s evil or any other ridiculous nonsense like that.”

            Looking back, I realize the thought that occurred to me then—that any of my anti-capitalist collegiate colleagues would’ve made quick work of finding a way to justify taking in a little extra revenue, that I was hardly unique in that regard—should have sparked a wider suspicion. But, to retrospectively justify my own obtuseness, I was just too distracted trying to figure out what Marcus was about to try to sell me on trying to help him sell. Was he starting his own rugby league? Hosting a capoeira tournament?

            “Let me ask you a question, Jim,” he said frowning. After a pause to gather his thoughts—which led me to conclude the ensuing performance was something he’d rehearsed—he locked eyes with me and asked, “What do you think scares people most?”

            “For mothers of young children, it’s that their kids will come to harm. For everyone else, it’s humiliation.”

            “Whoa—ha ha. Thought about this before, huh?”

            I thoroughly enjoyed the brief fluster my ready answer produced in Marcus as he worked out how to segue back into his pitch.

            “Interesting that you jump to little children and their mothers,” he picked up at last. “See, I believe fear falls into two categories. One of them I guess you could say includes humiliation—it’s all the practical things we’re afraid of, the fight or flight type of stuff. But there’s another type of fear, closer to the one that had us running from our bedrooms to our parents’ rooms as kids. Now, Jim, I’m curious—do you really believe all that stuff you were saying about Capitalism 2.0?”

            “Well, for the most part, I do. I think the colloquial expression for what I was doing there is ‘laying it on thick.’”

            “Ha ha—fair enough. Now the second part of the question—what was wrong with the first version of capitalism?”

            “I suppose it was focused too exclusively on profits. Every other human concern got coopted and overridden. If 2.0 is going to work, it’ll be because we come up with ways to include other considerations in our business models—things like working conditions, environmental impacts, and consequences for local communities. Instead of subordinating everything to that one number—the profit—that number will have to incorporate a broader array of concerns.”

            “I couldn’t agree with you more, Jim. Businesses today can’t just exploit people’s weaknesses and desires—”

            “Well, a lot of them still do.”

            “But who wants to work like that? I’ll tell you, even the most ruthless Wall Street guy, you sit him down, and even though you and I agree he’s not doing anything but exploiting people, he’ll go on and on about how what he does benefits society.”

            “And how does your business benefit society?”

            Marcus drew himself up, his lips stretching slowly into a proud, fatherly smile. “Well, Jim, it’s like you said. What individuals do has an impact on the broader community. I’m basically in the entertainment industry, but the trend in entertainment is toward more and more personalized, more and more individualized experiences. We don’t go to the movies anymore. We watch Netflix. We don’t go to arcades anymore. We have Xboxes. We don’t even have conversations anymore. We post status updates and tweets. A growing number of people aren’t even going to church anymore. So what’s the impact on communities? What’s the fallout?”

            “Are you saying you want to scare people to bring them together, to foster a sense of community, and make money on it somehow? Please tell me you’re not asking me to help recruit people for a cult.”

            “No, no, not a cult. But in your answer to my question about what scares people you forgot about those kids the mothers are afraid will come to harm. What’s it like for them? You see what the purpose of that fear is for them—it’s to get them to run to their parents’ bedroom. And that second, less practical type of fear stays with us our whole lives too. And it serves the same purpose. I notice some of the most popular posts on your blog are ghost stories. Why do you think that is?”

            “Everybody enjoys a good ghost story—well, nearly everybody.”

            “Yeah, but why? Why would people go out of their way to be scared? I’ll tell you, I’ve been asking that question for a long time, and no one really has a good answer for it. But then I started looking at it from a different angle. You know how every fall you start hearing people—predominantly women—talking about pumpkin spice lattes at Starbucks? You know how everyone gets excited when the leaves start to change colors? Well, what the hell are they excited about? Sure, the colors are spectacular and all. But what’s the next step? The leaves fall off, man. Then the tree stands there, just a big stick in the mud all winter. Fall symbolizes death. Halloween is a time for reconnecting with dead loved ones. What comes next is cold and barren winter—so why do so many people love it all so much?”

            It was at this point that I acknowledged to myself I was finding what Marcus had to say really impressive. “People start thinking about death,” I answered, “and it makes them want to be closer to their loved ones, the ones who are still living. People get frightened of something that goes bump in the night and it makes them want to sleep closer to their parents or spouses. People want to tell scary stories around a campfire because it makes them all feel closer together. That’s why the guy in the movie who says ‘I’ll be right back’ always gets killed. The whole point is to huddle together. The whole point is community.”

            Marcus took this opportunity to ply me with some hyperbolic flattery, something to the effect that anyone who read my work could tell I was smart but seeing my mind work in real time… etc. Then, at long last, he came to his business idea. “We used to have these big harvest celebrations, but not many of us harvest anymore. Even when we do come together for things like football games and concerts, it’s not like we even know most of the people in the crowd. Now, Mr. Conway, I’m inviting you to come in on the ground floor here, though I’ve already done quite a few proof-of-concept outings. It has to start with individuals—that’s where you come in. You’re going to hook them with the stories.”

            For the past six years, Marcus had been organizing camping trips to haunted houses every October for a little extra cash. It had started with a place in his hometown of Terre Haute. He and his friends had been going to this house to pitch their tents in the yard every year around Halloween going back to high school. They built campfires, rehearsed the story of the house, dared each other to go in—alone, of course—and bring something out from inside as proof. Everyone loved it. Whenever he talked about it to people outside his closest circle, they all but invariably said they would love to participate in something like that. Marcus’s eyes turned to dollar signs. First, the outings to the house in Terre Haute started getting bigger. Next, Marcus started scoping out other locations, usually no more than abandoned houses on isolated, modestly forested plots. Before long, he was planning months in advance for three separate expeditions on consecutive weekends.

            “Eighty bucks a head, and I supply the location, arrange things with neighbors and law enforcement, maybe throw in someone who can play guitar or hand drums. Most important, I supply the story. Jim, this is pure word-of-mouth so far, and no matter how hard I try I still end up turning a bunch of people down every year. So I finally decided, I’ve got some money saved up, I’m going to go big with this thing. As for the impact on the community, well, it’s just a step, just a little step, but who knows? If it catches on like I think it will, think of all the variations. Every season has its stories and rituals. So you get everyone you know together and share it. And without any of the hellfire or guilt-tripping or boring shit you get at church.”

            “You may run into some thorny dilemmas trying to mix commerce with what people consider sacred. But personally I think it’s a great idea. Whether it’s aboveboard or not, you’re paying for all the feast days and rituals at church too. At least this way it’s honest. You’re kind of branching the vacation industry out into the market for encounters with the supernatural—or at least the extra-mundane.”  

            “I need two things right now,” Marcus said, standing up from his chair. “I need new locations—I’m working on that as we speak. And I need stories—that’s where you come in. In the next couple of days, I’m going to be sending you contact info for a guy who’s had one of those encounters with the supernatural. What I want you to do—if you’re interested in partnering up with me—is talk to the guy, interview him. Bring something to record it if you need to, however you think it’ll work best. You write the story. We get it out there on social media and wherever else we can get people to listen. And then we sit back and watch this thing blow up.”

            I sat watching him make a production of how urgently he needed to get back on the road, assuming it was an element of his recruitment strategy. We shook hands to seal the partnership. As he was walking toward the front door, I called to him. “Marcus, one more question. These stories—are you envisioning them as more literary writings, or more marketing oriented? Because those two styles can end up being at odds.”

            His smile dawned one last time for the night. “That’s your department now. I only ask one thing—make sure it doesn’t sound cheap.”

 ***

            … A human mass beside him as he eased into consciousness set Tom to channeling through his memory of recent events until he decided it must be Ashley. Immediately, under his ribs, a humming warmth began to gather and flow outward, suffusing his limbs with an airy lightness as a thousand meager but incessant doubts, which dogged him even in sleep, blinked out of existence. His consciousness pulsed piece by piece to life in the still darkened room, like an athlete shaking his limbs into readiness before an event. With this stepwise return from oblivion came the intensifying awareness that he was experiencing the very sensation he’d determined to resist, this warm buzzing hollowness and weightless elation—that this was the very feeling he’d decided was the product of a deadly intoxicant. Pure poison. And with that unspoken word poison still echoing among his mist-cloaked thoughts there came a sharp pricking deep inside his nostrils, causing him to grimace and recoil into his pillow, jerking his face to one side then the other. It wasn’t Ashley sleeping next to him. It was someone who’d just smoked a menthol.

            Finding himself in the middle of the room, his hands held out to check the advance of any attacker, he glared down at the bed with its twisted sheets and undecipherable chaos of mounded folds and depressions, each heartbeat bulging under the skin of his temples, each jagged breath ruling out any hope of remaining quietly inconspicuous. He stood there long enough to calm his breathing before stepping forward and smoothing the comically disheveled sheets with his palms. What kept him from being able to reassure himself that the presence he’d sensed was no more than the remnant of a dream born of his guilty conscience was that he couldn’t recall ever in his life having had a dream that featured a scent of any sort, much less one so recognizable and vividly real. It took him some time to fall asleep again, and when he did he had a perfectly conventional dream about being called before a court, the assembled judges looking over the tops of ridiculously tall and imposing podiums…

 ***

            “I feel like whatever I do or whatever I say it’s bound to be exactly the wrong thing,” Tom said. “It’s like she wants something from me but I never know what it is. Thing is, I don’t even think she knows what it is—what she really wants is for me to figure out what she wants and give it to her as this perfect surprise. So I’m not only supposed to read her mind—I’m supposed to be able to read it so clearly I know more about what’s going to make her happy than she does. All the while, I’m thinking, does this chick even like me? All I get from her are signs of disapproval and disappointment.” He looked down at the table, shaking his head. “I hate that I’m still talking about it in the present tense.”

            Tom’s voice resonates with a soulfulness at odds with his general air of insouciance—which at times borders on impatience. He experiences his inner dramas in solitude. He’s around six feet tall, and at thirty-three still has a young athlete’s gleaming complexion. As he’s speaking, you have the sense that he’s at once minutely aware of your responses—even anticipating those you’ve yet to make—and prejudiced in favor of some other activity or exchange he could be engaged in, almost as if he’d already participated in several conversations exactly like the one you’re currently having. There’s a softness to the flesh around his eyes, but his eyebrows rise outward in subtle curves that create an illusion of severe peaks. The combined effect is of a sympathetic man restraining some bound up energy, perhaps harboring some unspoken rage, one of those generally kind people you know at a glance not to get on the wrong side of. Or maybe these impressions were based on what I already knew. Even through his somewhat loose work shirt you could see his workouts went beyond the simple cardio routines he spoke of to me.

            He was telling me about why he and Ashley had broken up. “We were always at loggerheads, like there was some unresolved issue keeping her from opening up to me—or like I’d done something to really piss her off. That’s what it felt like anyway. But no matter how hard I tried I couldn’t figure out what I’d done, and she wasn’t about to tell me. Once in a while, I’d get pissed off myself—I couldn’t stand her always being ready to go off, having that vague disapproval of hers hanging over my head all the time. We’d have these knockdown-drag-out arguments. I never got physical. Though she hit me and pushed me around quite a bit. For her, I kept getting the sense that it was these arguments that were the deal-breaker. They were pretty intense, and toward the end they were happening pretty often too. But I kept thinking, you know, we can’t work out whatever our issue is if we don’t talk about it, and every time we tried to talk about it we ended up arguing. It’s probably my fault. I always felt like she was just being so unfair so I ended up losing my temper and the next thing you know we’re not talking to each other.”

            Tom and Ashley had been planning on moving in together, at the apartment Tom lives in now, when their final blowup occurred. They had been leaving Henry’s, a low-key old bar on Main Street known for being classier than any of the hole-in-the-wall establishments that predominate in that area, walking back to what was then Tom’s apartment, a one-bedroom on Rock Hill, when two skater kids saw fit to shout a couple of obscenities at them from across the street. “To this day, I can’t figure out why she did it,” Tom told me. “She must’ve already been really pissed off about something—but, if she was, I hadn’t noticed it. And we’d just been talking inside the bar for like an hour.” Ashley had heard the first two or three insults courtesy of the young skaters (an honorary term, since neither had a board) and then stopped to turn toward them. “The weird thing was, I’d never seen that expression on her face before. She had this gleam—it was almost like she was smiling.”

            “Hey,” she shouted back to them. “I know you two.” They stopped, turned, and took a couple of steps back to get a better look at her and hear what she was about to say. “I met these girls who pointed you guys out a while back. They said they tried to date you but you were just too horrible in bed. They said you didn’t know how to fuck.”

            “Ashley, what the hell are you doing?”

            “You must have the wrong guys, cunt. If you want, I’ll show you how I can fuck right now.” The taller of the two kids started walking with these clown-shoe strides toward them, leaning back with his shoulders even as he thrust his hips forward, bobbing his head, and flailing his arms to puff out his elbows. That he was so lanky and dressed in that faux unfashionable apparel that’s so fashionable now—shaved head, wife-beater undershirt, testicle-compressing jeans—made it easier for Tom to reserve enough mental space to marvel at Ashley, and to wonder what could possibly have gotten into her, while all but ignoring the threat.

            “Listen guys,” he started to say before Ashley began again.

            “Yeah, they said it was mostly because you both have really tiny dicks. But of course it doesn’t help that you’re illiterate retards.”

            The bald guy actually stopped in the middle of Main Street to look back at his friend, as if expecting to see him doubled over with laughter at the joke he was playing on his buddy. But this guy was looking straight ahead toward Ashley, taking his hands out of his pockets and moving a step forward. “What the fuck Ashley!” Tom shouted, stepping in front of her, glancing quickly at each of the skater kids’ hands to see if they were reaching for weapons.

            “Yeah, Ashley, what the fuck?” the lanky one said, moving forward again. “Now we’re going to have to fuck up Tinker Bell here and have a chat to find out who’s been spreading these lies about us.”

            Tom turned around to see Ashley backing away. “Even then I swear I saw her grinning.” There was nothing behind them but an empty parking lot. Turning back toward the guy in the wife-beater as he backpedaled, Tom said, “Listen man, I’m not sure why she’s trying to mess with you but there’s no reason for either of us to fuck up our lives. Broken teeth. Broken hands. I see cop cars parked here all the time.”

            Now that the guy was charging toward him, Tom saw that he wasn’t sixteen, as he’d looked from across the street, but probably closer to his mid-twenties. His pocked, roughly shaved face and filthy clothes revealed him to be not the child of privilege given to slumming he’d appeared from a distance, but something closer to a skinny, drug-addled convict. “Hey, don’t worry Tinker Bell,” he said, lifting his hands. “It’s just your life we’re going to fuck up. And we’ll be long gone with Ashley here by the time any cops come around.”

            Tom, halting abruptly in his retreat, stepped forward, planting his weight on his right foot before swinging his left leg around in a wide loop and burying the blade of his shin in the boy convict’s thigh, turning and folding him backward like a three-section chaise longue caught in a torrent of wind. As he collapsed, the convict reached out the arm he’d raised to throw a punch, catching Tom’s collar and pulling him forward. Tom lunged forward, thrusting up with his right knee, blasting it into his assailant’s solar plexus and sending them both tumbling to the asphalt. Taking advantage of the convict’s panic at being struck so hard and knocked from his upright position, Tom made ready and timed a right elbow to coincide with their collision against the asphalt. He threw it with a twisting force gathered from the entire length of his body down to his toes, landing it on the guy’s temple the instant his shoulders hit the ground, feeling that sort of crisp resonating bat-on-ball crack of elbow against temple so familiar to him even though he’d never personally produced it before.

            The boy convict went immediately limp, but his fingers were still wrapped in Tom’s collar. As Tom sat back, pushing the arm aside, sliding a foot into position to push himself back up to his feet, he felt the brutal ax blade of a foot wedging itself into the left side of his torso, lifting him up off his one planted knee. The shock of the blow made everything flash white. Following some vague instinct, Tom rolled onto his back and rotated his body on the asphalt to get his feet between him and this second attacker. This man, whose appearance Tom wouldn’t be able to remember at all, ended up awkwardly forfeiting the brief opening afforded him by his landed shot because, having rushed so frantically to the aid of his fallen comrade, he’d managed to upset his own balance in delivering the kick and was thus forced to scramble after the man he’d just injured in a clumsy attempt to ensure he’d sustained enough damage to render him incapable of any further defense.

            “I would say I threw a triangle on him,” Tom said of the final moments of this seconds-long confrontation, “but it seemed more like he just moved right into it on his own.” As the guy crawled over Tom’s legs so he could climb atop, pin his torso to the ground and pummel him, he quickly found his own torso pinched and immobile. Tom had hooked his right leg over the man’s shoulder, his calf clamped down across the back of the guy’s neck. Reaching up with his hand, Tom tucked his right foot in the crook of his left knee, trapping the man’s head and one of his arms in the constrictive frame of his legs. “I didn’t just choke him out right away like I would have in training. I was so freaked out that these guys were actually attacking us that I wanted to make sure I did some damage. So before really sinking the choke I bloodied up his face pretty good. By then the first guy was trying to stand up on his chicken legs, and I just wanted to get Ashley the hell out of there.

            “I grabbed her wrist and we ran—and I swear I heard her laughing. Once we were a few blocks away from the two skater kids—who were actually more like meth heads as far as I could tell—as soon as we had some houses and buildings between us, I couldn’t help it. I just whipped around and started yelling at her. I mean, I was fucking pissed. At first, she was looking up at me with this dazed look, like she was drunk, or high, or like she’d just been having a fucking ball. But as I explained to her that I’d just given that guy a severe concussion, plus whatever I’d done to his leg—as I’m shouting at her that we were lucky as hell to get away without me getting mauled half to death and worse happening to her, she just starts wilting before my eyes.

            “Pretty soon she’s in tears and I’m starting to notice the little stabbing pain in my ribs. When we finally got to my apartment, she just went straight to her car without saying a word, got in, and drove away. I didn’t hear from her for two days. On the third day, she finally responded to a text asking her to call. She said she couldn’t move in with me, that she didn’t think it could ever work between us. She broke up with me over the phone. I wanted to plead with her to give me an explanation for why she’d done it, why she’d provoked those guys. And I wanted her to explain too what the hell it was she’d wanted that whole time, our whole relationship, that I wasn’t giving her. What had I done to piss her off so damn much? But the call was over before I could say any more. That was it. I moved into this place by myself.”

 ***

            Tom didn’t have any blood in his eye. He’d begun taking taekwondo at age thirteen from a pear-shaped, middle-aged Korean man who barely spoke English. Then at sixteen he’d transferred high schools and found a place he liked better that was closer to home. Here he learned from a diminutive blue-collar, country-music American with an amateur kickboxing record of 40-2 who’d learned karate from a grand master while stationed with the air force in Japan and Wing Chun from a Chinese man he’d partnered with in the States so they could open their own school. This was all in the ’90s. When Tom and his friends saw their first Ultimate Fighting Championship toward the end of the decade, they couldn’t understand why experts in so many different styles were having such a hard time with the skinny and boyish-looking Brazilian named Royce Gracie.

            Before long, they were doing whatever they could to teach themselves Brazilian jujitsu, staying after class at their kickboxing school to practice grappling and submissions, much to their teacher’s consternation. Within a few months, they’d found a guy closer to their own age who traveled around to attend seminars in jujitsu and submission wrestling, and he was looking for guys to train with. They rented a backroom usually reserved for aerobics classes and split the cost of some wrestling mats. A couple years later, they found another guy, one who taught Muay Thai, the style of Thai kickboxing that fighters had the most success with in mixed martial arts competitions, out of a rundown former office building. It had been this guy who’d first taught Tom how to throw leg kicks, knees, and elbows like the ones that would save Ashley and him from their mauling or worse outside Henry’s all those years later.

            Tom discovered he had no blood in his eye after his first and only full-contact fight in the ring. He took a beating nearly the entire five minutes of the first round but landed a big head kick fifteen seconds before the bell—a blow that made his opponent go horrifically rigid before sending him toppling over like a concrete statue, his arms remaining freakishly extended in front of him even after he hit the canvas, bounced, and came to a rest. Tom stood horror-struck. He knew right then he would never step in a ring or octagon or anything else like that again. When he told Mark, one of his best friends back in his corner, that he wouldn’t be pursuing a fight career anymore, Mark responded, “Yeah, we all kind of already knew you never had any blood in your eye,” and went on to explain that was an old boxing expression for fighters who had a hard time overcoming their reluctance to hurt anyone. Tom went on to help a few of his friends prepare for fights, but over time he attended training sessions with diminishing frequency until he was done with martial arts altogether and doing more pacific exercise routines on his own.

            Tom’s single venture into the ring occurred two years after he’d earned his degree in communications at IPFW, which is the affectionate acronym locals apply to the joint satellite campus for Indiana and Purdue Universities in Fort Wayne. Throughout college, he’d delivered pizzas for a place called East of Chicago. After graduating, he moved on to a local franchise called The Munchie Emporium, which had three locations in the city and a reputation for employing and serving hippies and stoners. All the servers and kitchen people Tom worked with were either in a band or had a boyfriend who was. He would go on to remark of his time there, “It was like a second education after college. Everyone was sleeping with everyone else. The whole back of the house was usually taking breaks to pass around a bowl or a joint. The whole front of the house was taking turns going to the bathrooms to do lines off the back of the toilet. So all the servers and bartenders are tweaking and all the cooks are mellowed out. I can’t say I really fit in, but I was having a fucking great time.”

            After it became clear to him that he was never going to be a professional fighter and that he didn’t want to serve and bartend for the rest of his life, Tom decided to go back to IPFW and attend the MBA program that had recently been instituted there. It was as he was nearing completion of his master’s that he began an internship with the three-person marketing department at a web design and custom software company called EntSol (an abbreviation of Enterprise Solutions). Tom finished graduate school, became a project clarity specialist at EntSol, and started dating Ashley, who was working at one of the other Munchies stores across town from the one where he’d worked (though none of them were called Munchies anymore by then), all within the same two-month period. The PCS position, which had him serving as a liaison between EntSol’s tech people and the clients, didn’t really appeal to him. So he decided to take a pay cut and return to the marketing team, where he’s still working on strategy, testing, and analytics—all the stuff that drives copywriters like me a little crazy. He said Ashley was generally supportive, though she let him know she didn’t understand what he found so distasteful about the PCS gig. “You have to do what makes you happy, regardless of the money,” she’d told him. “But I think you could have given it more of a chance.”

 ***

            Tom said he believed the dreams were leading up to something, or trying to tell him something. What he needed, he confided to me, was some form of penance—but then he wasn’t even sure if he’d actually committed any crime. Somehow, notwithstanding his uncertainty, he was convinced the dreams were pointing the way for him. One dream I wrote up would end up being of particular importance in this regard:

            …Tom was on one of the nightly walks he’d started taking after moving into his new apartment alone, whenever he felt like the walls were moving in on him, whenever he feared the heartbreak would suffocate him, whenever he got too antsy from missing workouts as his broken ribs healed. In keeping with the bizarre logic of dreams, he approached the spot on Thieme Drive as if it held no special significance whatsoever, the same spot he passed almost every night for over a month, the spot where the powdery golden light of a streetlamp was split by a thin wedge of darkness edged by an old oak tree standing a few feet away, right between the post and the sidewalk. As he was passing through the wedge, past the three square steps rising away from the tree and along a fenced-in walkway up to a house atop a rise, an aberrant blue light flashed in his periphery, bringing him to a halt. The steps formed the base of a nook enclosed by a low-roofed, maroon-painted garage on the left, a wooden crosshatched fence on the right, and the always-latched gate at the top. Tom had always grinned passing between the oak and the little nook it cast into almost perfect darkness, thinking it was the ideal spot for someone to hide in ambush for lonely night amblers like him.

            Now he stood examining a gleaming cluster of tiny blue flowers rising up out of an orange ceramic pot positioned square in the middle of the step midway up to the gate, trying to discern the source of the illumination—though it appeared as though it was the flowers themselves giving off the glow—and wondering why anyone would leave them in the middle of this staircase. After a few moments, he could no longer resist stepping forward to examine the flowers. He lifted the pot and turned with it to bring it closer to the oak tree. Sure enough, it continued to give off the blue glow, mesmerizing him into tightly focused oblivion, until he heard a voice, vaguely familiar, demanding to know what he was doing.

            Still transfixed by the flowers, he began to say he was simply appreciating the wondrous phenomenon of the blue glow—like open-air bioluminescence—when he heard the sourceless voice muttering something that sounded like a name, as if the woman—yes, it was a woman’s voice for sure—were addressing someone else, and her tone carried an unmistakable note of impatience. Tom finally broke the trance and turned one way, then the other, scanning for the woman whose voice he’d heard. Most of the house was hidden from view by the fence and a hedge running along the inside of it, but he could see that the front door, lit dimly yellow by a porch light, was sealed and inert. Hearing the muttered, indecipherable name again, he turned, looking first toward the far end of the garage, and then farther up the sidewalk and the street that it ran alongside. Before his feet caught up with his side-turned eyes, a shout like an explosion of rage sent him stumbling backward. Fumbling with the flowerpot, he tripped on a sidewalk section pushed up by one of the darkening oak’s roots and began to fall.

            But he didn’t land on the sidewalk. He landed in mud, which was redolent of putrescence. Now with a firm hold of the pot, he started to sit up, and he knew immediately where he was—down by the river across the street from the sidewalk, and down the steep, tree-strewn bank, two blocks up from the oak-shaded nook, at the base of the concrete overlook adjacent to the Main Street Bridge over the Saint Mary's. He knew immediately too that the bioluminescent flowers were no longer in the pot he was holding clasped to his stomach. Desperation overtook him. He had to find those flowers and return them to the pot. Setting the pot aside, he got to his feet, darting glances frantically in all directions. The blue light, he thought. Just look for the blue light. How can you possibly miss it?

            As soon as he stood still for a moment, he noticed a faint glow emanating from around the curved base of the overlook. For some reason, his desperation now turned to apprehension, but he stepped forward to investigate, hoping to find the lost luminescent flowers. Rounding the base of the monument, he had no trouble seeing where they now grew. Tom saw first the light, then the myriad sprouting star-burst petals, and finally the half buried, half rotted human body whose head they were clustered about. The terror didn’t seize him instantly, but rather crept upon him as he approached. As he drew nearer to the body, he could discern the angles of the crowded, tangled stems, right down to where their roots had discovered a new source for their sustenance.

            The left side of the man’s face had decayed down to the skull, but much of the flesh had been replaced by grayish mud that resembled the decomposing skin on the other side. Tom leaned down to see if it would be possible to extricate the roots without disrupting the body—without touching it—but saw that the left eye, partially caked over with mud, partially glaring back at him with that familiar black, empty-socket skull’s glare, had somehow allowed the central stem of a large cluster of glowing blue flowers to grow up from its hollowed depths. Tom had brought himself back to his full height and taken two steps back from the corpse before consciously registering the repugnance and terror which were propelling him away. His awareness of his own intensifying panic grew simultaneously with the dawning realization that he was dreaming. As he hauled himself up from the muddy riverbank and into consciousness, the brightening glow of the flowers merged with the light of the morning sun seeping in through the breaking seal of his lids.

            “Blue lobelias,” he muttered as he sat up in his fully lit bedroom.  

Encounters, Inc. Part 2 of 2

Dennis Junk

The World Perspective in War and Peace: Tolstoy’s Genius for Integrating Multiple Perspectives

As disappointing as the second half of “War and Peace” is, Tolstoy’s genius when it comes to perspective makes the first half one of the truly sublime reading experiences on offer to lovers of literature.

            Sometime around the age of twenty, probably as I was reading James M. Cain’s novel Mildred Pierce, I settled on the narrative strategy I have preferred ever since. At the time, I would have called it third-person limited omniscient, but I would learn later, in a section of a class on nineteenth-century literature devoted to Jane Austen’s Emma, that the narrative style I always felt so compelled by was referred to more specifically by literary scholars as free indirect discourse. Regardless of the label, I had already been unconsciously emulating the style for some time by then in my own short stories. Some years later, I became quite fond of the reviews and essays of the literary critic James Wood, partly because he eschewed all the idiotic and downright fraudulent nonsense associated with postmodern pseudo-theories, but partly too because in his book How Fiction Works he both celebrated and expounded at length upon that same storytelling strategy that I found to be the most effective in pulling me into the dramas of fictional characters.

            Free indirect discourse (or free indirect style, as it’s sometimes called) blends first-person with third-person narration, so that even when descriptions aren’t tagged by the author as belonging to the central character, we readers can still assume that what is being attended to, and how it’s being rendered in words, reveals something of that character’s mind. In other words, the author takes the liberty of moving in and out of the character’s mind, detailing thoughts, actions, and outside conditions or events in whatever way most effectively represents—and even simulates—the drama of the story. It’s a tricky thing to master, demanding a sense of proportion and timing, a precise feeling for the key intersecting points of character and plot. And it has a limitation: you really can’t follow more than one character at a time, because doing so would upset the tone and pacing of the story, or else it would expose the shallowness of the author’s penetration. Jumping from one mind to another makes the details seem not so much like a manifestation of the characters’ psyches as a simple byproduct of the author’s writing habits.

            Fiction writers get around this limitation in a number of ways. Some break their stories into sections or chapters and give each one over to a different character. You have to be really good to pull this off successfully; it usually still ends up lending an air of shallowness to the story. Most really great works rendered in free indirect discourse—Herzog, Sabbath’s Theater, Mantel’s Cromwell novels—stick to just one character throughout, and, since the strategy calls for an intensely thorough imagining of the character, the authors tend to stick to protagonists who are somewhat similar to themselves. John Updike, whose linguistic talents were prodigious enough to set him apart even in an era of great literary masters, barely even attempted to bend his language to his characters, and so his best works, like those in the Rabbit series, featured characters who are at least a bit like Updike himself.  

            But what if an author could so thoroughly imagine an entire cast of characters and have such a keen sense of every scene’s key dramatic points that she could incorporate their several perspectives without turning every page into a noisy and chaotic muddle? What if the trick could be pulled off with such perfect timing and proportion that readers’ attention would wash over the scene, from character to character, spanning all the objects and accidents in between, without being thrown into confusion and without any attention being drawn to the presence of the author? Not many authors try it—it’s usually a mark of inexperience or lack of talent—but Leo Tolstoy somehow managed to master it.

War and Peace is the quintessentially huge and intimidating novel—more of a punch line to jokes about pretentious literature geeks than a great masterwork everyone feels obliged to read at some point in her life. But, as often occurs when I begin reading one of the classics, I was surprised to discover not just how unimposing it is page-by-page but how immersed in the story I became by the end of the first few chapters. My general complaint about novels from the nineteenth century is that the authors wrote at too great a distance from their characters, in prose that’s too formal and wooden. It’s impossible to tell if the lightness of touch in War and Peace, as I’m reading it, is more Tolstoy’s or more the translators Richard Pevear and Larissa Volokhonsky’s, but the original author’s handling of perspective is what shines through most spectacularly.

            I’m only as far into the novel as the beginning of volume II (a little past page 300 of over 1200 pages), but much of Tolstoy’s mastery is already on fine display. The following long paragraph features the tragically plain Princess Marya, who for financial reasons is being presented to the handsome Prince Anatole as a candidate for a mutually advantageous marriage. Marya’s pregnant sister-in-law, Liza, referred to as “the little princess” and described as having a tiny mustache on her too-short upper lip, has just been trying, with the help of the pretty French servant Mademoiselle Bourienne, to make her look as comely as possible for her meeting with the young prince and his father Vassily. But Marya has become frustrated with her own appearance, and, aside from her done-up hair, has decided to present herself as she normally is. The scene begins after the two men have arrived and Marya enters the room.

When Princess Marya came in, Prince Vassily and his son were already in the drawing room, talking with the little princess and Mlle Bourienne. When she came in with her heavy step, planting her heels, the men and Mlle Bourienne rose, and the little princess, pointing to her said, “Voila Marie!” Princess Marya saw them all, and saw them in detail. She saw the face of Prince Vassily, momentarily freezing in a serious expression at the sight of the princess, and the face of the little princess, curiously reading on the faces of the guests the impression Marie made. She also saw Mlle Bourienne with her ribbon, and her beautiful face, and her gaze—lively as never before—directed at him; but she could not see him, she saw only something big, bright, and beautiful, which moved towards her as she came into the room. Prince Vassily went up to her first, and she kissed the bald head that bowed over her hand, and to his words replied that, on the contrary, she remembered him very well. Then Anatole came up to her. She still did not see him. She only felt a gentle hand firmly take hold of her hand, and barely touched the white forehead with beautiful, pomaded blond hair above it. When she looked at him, his beauty struck her. Anatole, the thumb of his right hand placed behind a fastened button of his uniform, chest thrust out, shoulders back, swinging his free leg slightly, and inclining his head a little, gazed silently and cheerfully at the princess, obviously without thinking of her at all. Anatole was not resourceful, not quick and eloquent in conversation, but he had instead a capacity, precious in society, for composure and unalterable assurance. When an insecure man is silent at first acquaintance and shows an awareness of the impropriety of this silence and a wish to find something to say, it comes out badly; but Anatole was silent, swung his leg, and cheerfully observed the princess’s hairstyle. 
It was clear that he could calmly remain silent like that for a very long time. “If anyone feels awkward because of this silence, speak up, but I don’t care to,” his look seemed to say. Besides that, in Anatole’s behavior with women there was a manner which more than any other awakens women’s curiosity, fear, and even love—a manner of contemptuous awareness of his own superiority. As if he were saying to them with his look: “I know you, I know, but why should I bother with you? And you’d be glad if I did!” Perhaps he did not think that when he met women (and it is even probable that he did not, because he generally thought little), but such was his look and manner. The princess felt it, and, as if wishing to show him that she dared not even think of interesting him, turned to the old prince. The conversation was general and lively, thanks to the little princess’s voice and the lip with its little mustache which kept rising up over her white teeth. She met Prince Vassily in that jocular mode often made use of by garrulously merry people, which consists in the fact that, between the person thus addressed and oneself, there are supposed to exist some long-established jokes and merry, amusing reminiscences, not known to everyone, when in fact there are no such reminiscences, as there were none between the little princess and Prince Vassily. Prince Vassily readily yielded to this tone; the little princess also involved Anatole, whom she barely knew, in this reminiscence of never-existing funny incidents. Mlle Bourienne also shared in these common reminiscences, and even Princess Marya enjoyed feeling herself drawn into this merry reminiscence. (222-3)

In this pre-film era, Tolstoy takes an all-seeing perspective that’s at once cinematic and lovingly close up to his characters, suggesting the possibility that much of the deep focus on individual minds in contemporary fiction is owing to an urge for the one narrative art form to occupy a space left untapped by the other. Still, as simple as Tolstoy’s incorporation of so many minds into the scope of his story may seem as it lies neatly inscribed and eternally memorialized on the page, a fait accompli, his uncanny sense of where to point the camera, as it were, to achieve the most evocative and forwardly propulsive impact in the scene is one not many writers can be counted on to possess. Again, the pitfall lesser talents fall prey to when trying to integrate multiple perspectives like this arises out of an inability to avoid advertising their own presence, which entails a commensurate detraction from the naturalness and verisimilitude of the characters. The way Tolstoy maintains his own invisibility in those perilously well-lit spaces between his characters begins with the graceful directness and precision of his prose but relies a great deal as well on his customary method of characterization.

For Tolstoy, each character’s experience is a particular instance of a much larger trend. So, when the lens of his descriptions focuses in on a character in a particular situation, the zooming doesn’t occur merely in the three-dimensional space of what a camera would record but in the landscape of recognizable human experience as well. You see this in the lines above about how "in Anatole’s behavior with women there was a manner which more than any other awakens women’s curiosity, fear, and even love," and the "jocular mode often made use of by garrulously merry people." 

Here is a still more illustrative example from when the Countess Rostov is reflecting on a letter from her son Nikolai informing her that he was wounded in battle but also that he’s been promoted to a higher rank.

How strange, extraordinary, joyful it was that her son—that son who twenty years ago had moved his tiny limbs barely perceptibly inside her, that son over whom she had quarreled with the too-indulgent count, that son who had first learned to say “brush,” and then “mamma,” that this son was now there, in a foreign land, in foreign surroundings, a manly warrior, alone, with no help or guidance, and doing there some manly business of his own. All the worldwide, age-old experience showing that children grow in an imperceptible way from the cradle to manhood, did not exist for the countess. Her son’s maturing had been at every point as extraordinary for her as if there had not been millions upon millions of men who had matured in just the same way. As it was hard to believe twenty years ago that the little being who lived somewhere under her heart would start crying, and suck her breast, and begin to talk, so now it was hard to believe that this same being could be the strong, brave man, an example to sons and people, that he was now, judging by his letter. (237)

There’s only a single person in the history of the world who would have these particular feelings in response to this particular letter, but at the same time these same feelings will be familiar—or at least recognizable—to every last person who reads the book.  

While reading War and Peace, you have the sense, not so much that you’re being told a grand and intricate story by an engagingly descriptive author, but that you’re witnessing snippets of countless interconnected lives, selections from a vast historical multitude that are both arbitrary and yet, owing to that very connectedness, significant. Tolstoy shifts breezily between the sociological and the psychological with such finesse that it’s only in retrospect that you realize what he’s just done. As an epigraph to his introduction, Pevear quotes Isaac Babel: “If the world could write by itself, it would write like Tolstoy.”  

The biggest drawback to this approach (if you don’t count its reliance on ideas about universals in human existence, which are a bit unfashionable of late) is that since there’s no way to know how long the camera will continue to follow any given character, or who it will be pointed at next, emotional investments in any one person have little chance to accrue any interest. For all the forward momentum of looming marriages and battle deaths, there’s little urgency attached to the fate of any single individual. Indeed, there’s a pervasive air of comic inconsequence, sometimes bordering on slapstick, in all the glorious strivings and abrupt pratfalls. (Another pleasant surprise in store for those who tackle this daunting book is how funny it is.) Of course, with a novel that stretches beyond the thousand-page mark, an author has plenty of time to train readers which characters they can expect to hear more about. Once that process begins, it’s difficult to laugh at their disappointments and tragedies. 

Also read:

HOW VIOLENT FICTION WORKS: ROHAN WILSON’S “THE ROVING PARTY” AND JAMES WOOD’S SANGUINARY SUBLIME FROM CONRAD TO MCCARTHY

And:

WHAT'S THE POINT OF DIFFICULT READING?

And:

WHO NEEDS COMPLEX NARRATIVES?: TIM PARKS' ENLIGHTENED CYNICISM

Dennis Junk

Nice Guys with Nothing to Say: Brett Martin’s Difficulty with “Difficult Men” and the Failure of Arts Scholarship

Brett Martin’s book “Difficult Men” contains fascinating sections about the history and politics behind some of our favorite shows. But whenever he reaches for deeper insights about the shows’ appeal, the results range from utterly banal to unwittingly comical. The reason for his failure is his reliance on politically motivated theorizing, which is all too fashionable in academia.

With his book Difficult Men: Behind the Scenes of a Creative Revolution: From “The Sopranos” and “The Wire” to “Mad Men” and “Breaking Bad”, Brett Martin shows that you can apply the whole repertoire of analytic tools furnished by contemporary scholarship in the arts to a cultural phenomenon without arriving at anything even remotely approaching an insight. Which isn’t to say the book isn’t worth reading: if you’re interested in the backstories of how cable TV series underwent their transformation to higher production quality, film-grade acting and directing, greater realism, and multiple, intricately interlocking plotlines, along with all the gossip surrounding the creators and stars, then you’ll be delighted to discover how good Martin is at delivering the dish. 

He had excellent access to some of the showrunners, seems to know everything about the ones he didn’t have access to anyway, and has a keen sense for the watershed moments in shows—as when Tony Soprano snuck away from scouting out a college with his daughter Meadow to murder a man, unceremoniously, with a smile on his face, despite the fears of HBO executives that audiences would turn against the lead character for doing so. And Difficult Men is in no way a difficult read. Martin’s prose is clever without calling too much attention to itself. His knowledge of history and pop culture rivals that of anyone in the current cohort of hipster sophisticates. And his enthusiasm for the topic radiates off the pages while not marring his objectivity with fanboyism. But if you’re more interested in the broader phenomenon of unforgivable male characters audiences can’t help loving you’ll have to look elsewhere for any substantive discussion of it.

Difficult Men would have benefited from Martin being a more difficult man himself. Instead, he seems at several points to be apologizing on behalf of the show creators and their creations, simultaneously ecstatic at the unfettering of artistic freedom and skittish whenever bumping up against questions about what the resulting shows are reflecting about artists and audiences alike. He celebrates the shows’ shucking off of political correctness even as he goes out of his way to brandish his own PC bona fides. With regard to his book’s focus on men, for instance, he writes,

Though a handful of women play hugely influential roles in this narrative—as writers, actors, producers, and executives—there aren’t enough of them. Not only were the most important shows of the era run by men, they were also largely about manhood—in particular the contours of male power and the infinite varieties of male combat. Why that was had something to do with a cultural landscape still awash in postfeminist dislocation and confusion about exactly what being a man meant. (13)

Martin throws multiple explanations at the centrality of “male combat” in high-end series, but the basic fact that he suggests accounts for the prevalence of this theme across so many shows in TV’s Third Golden Age is that most of the artists working on the shows are afflicted with the same preoccupations.

In other words, middle-aged men predominated because middle-aged men had the power to create them. And certainly the autocratic power of the showrunner-auteur scratches a peculiarly masculine itch. (13)

Never mind that women make up a substantial portion of the viewership. If it ever occurred to Martin that this alleged “masculine itch” may have something to do with why men outnumber women in high-stakes competitive fields like TV scriptwriting, he knew better than to put the suspicion in writing.

            The centrality of dominant and volatile male characters in America’s latest creative efflorescence is in many ways a repudiation of the premises underlying the scholarship of the decades leading up to it. With women moving into the workplace after the Second World War, and with the rise of feminism in the 1970s, the stage was set for an experiment in how malleable human culture really was with regard to gender roles. How much change did society’s tastes undergo in the latter half of the twentieth century? Despite his emphasis on “postfeminist dislocation” as a factor in the appeal of TV’s latest crop of bad boys, Martin is savvy enough to appreciate these characters’ long pedigree, up to a point. He writes of Tony Soprano, for instance,

In his self-absorption, his horniness, his alternating cruelty and regret, his gnawing unease, Tony was, give or take Prozac and one or two murders, a direct descendant of Updike’s Rabbit Angstrom. In other words, the American Everyman. (84)

According to the rules of modern criticism, it’s okay to trace creative influences along their historical lineages. And Martin is quite good at situating the Third Golden Age in its historical and technological context:

The ambition and achievement of these shows went beyond the simple notion of “television getting good.” The open-ended, twelve- or thirteen-episode serialized drama was maturing into its own, distinct art form. What’s more, it had become the signature American art form of the first decade of the twenty-first century, the equivalent of what the films of Scorsese, Altman, Coppola, and others had been to the 1970s or the novels of Updike, Roth, and Mailer had been to the 1960s. (11)

What you’re not allowed to do, however—and what Martin knows better than to try to get away with—is notice that all those male filmmakers and novelists of the 60s and 70s were dealing with the same themes as the male showrunners Martin is covering. Is this pre-feminist dislocation? Mad Men could’ve featured Don Draper reading Rabbit, Run right after it was published in 1960. In fact, Don bears nearly as much resemblance to the main character of what was arguably the first novel ever written, The Tale of Genji, by the eleventh-century Japanese noblewoman, Murasaki Shikibu, as Tony Soprano bears to Rabbit Angstrom.

            Missed connections, tautologies, and non sequiturs abound whenever Martin attempts to account for the resonance of a particular theme or show, and at points his groping after insight is downright embarrassing. Difficult Men, as good as it is on history and the politicking of TV executives, can serve as a case study in the utter banality and logical bankruptcy of scholarly approaches to discussing the arts. These politically and academically sanctioned approaches can be summed up concisely, without scanting any important nuances, in the space of a paragraph. While any proposed theory about average gender differences with biological bases must be strenuously and vociferously criticized and dismissed (and its proponents demonized without concern for fairness), any posited connection between a popular theme and contemporary social or political issues is seen not just as acceptable but as automatically plausible, to the point where after drawing the connection the writer need provide no further evidence whatsoever.

One of several explanations Martin throws out for the appeal of characters like Tony Soprano and Don Draper, for instance, is that they helped liberal HBO and AMC subscribers cope with having a president like George W. Bush in office. “This was the ascendant Right being presented to the disempowered Left—as if to reassure it that those in charge were still recognizably human” (87). But most of Mad Men’s run, and Breaking Bad’s too, has been under President Obama. This doesn’t present a problem for Martin’s analysis, though, because there’s always something going on in the world that can be said to resonate with a show’s central themes. Of Breaking Bad, he writes,

Like The Sopranos, too, it uncannily anticipated a national mood soon to be intensified by current events—in this case the great economic unsettlement of the late aughts, which would leave many previously secure middle-class Americans suddenly feeling like desperate outlaws in their own suburbs. (272)

If this strikes you as comically facile, I can assure you that were the discussion taking place in the context of an explanation proposed by a social scientist, writers like Martin would be falling all over themselves trying to be the first to explain the danger of conflating correlation with causation, whether the scientist actually made that mistake or not.

            But arts scholarship isn’t limited to this type of socio-historical loose association because at some point you simply can’t avoid bringing individual artists, characters, and behind-the-scenes players into the discussion. Even when it comes to a specific person or character’s motivation, though, it’s important to focus on upbringing in a given family and sociopolitical climate as opposed to any general trend in human psychology. This willful blindness becomes most problematic when Martin tries to identify commonalities shared by all the leading men in the shows he’s discussing. He writes, for example,

All of them strove, awkwardly at times, for connection, occasionally finding it in glimpses and fragments, but as often getting blocked by their own vanities, their fears, and their accumulated past crimes. (189-90)

This is the closest Martin comes to a valid insight into difficult men in the entire book. The problem is that the rule against recognizing trends in human nature has made him blind to the applicability of this observation to pretty much everyone in the world. You could use this passage as a cold read and convince people you’re a psychic.

            So far, our summation of contemporary arts scholarship includes a rule against referring to human nature and an injunction to focus instead on sociopolitical factors, no matter how implausible their putative influence. But the allowance for social forces playing a role in upbringing provides something of a backdoor for a certain understanding of human nature to enter the discussion. Although the academic versions of this minimalist psychology are byzantine to the point of incomprehensibility, most of the main precepts will be familiar to you from movie and book reviews and criticism: parents, whom we both love and hate, affect nearly every aspect of our adult personalities; every category of desire, interest, or relationship is a manifestation of the sex drive; and we all have subconscious desires—all sexual in one way or another—based largely on forgotten family dramas that we enjoy seeing played out and given expression in art. That’s it. 

            So, if we’re discussing Breaking Bad for instance, a critic might refer to Walt and Jesse’s relationship as either oedipal, meaning they’re playing the roles of father and son who love but want to kill each other, or homoerotic, meaning their partnership substitutes for the homosexual relationship they’d both really prefer. The special attention the show gives to the blue meth and all the machines and gadgets used to make it constitutes a fetish. And the appeal of the show is that all of us in the audience wish we could do everything Walt does. Since we must repress those desires, we come to the show because watching it effects a type of release.

            Not a single element of this theory has any scientific validity. If we were such horny devils, we could just as easily watch internet pornography as tune into Mad Men. Psychoanalysis is to modern scientific psychology what alchemy is to chemistry and what astrology is to astronomy. But the biggest weakness of Freud’s pseudo-theories from a scientific perspective is probably what has made them so attractive to scholars in the humanities over the past century: they don’t lend themselves to testable predictions, so they can easily be applied to a variety of outcomes. As explanations, they can never fail or be definitively refuted—but that’s because they don’t really explain anything. Quoting Craig Wright, a writer for Six Feet Under, Martin writes that

…the left always articulates a critique through the arts.  “But the funny part is that masked by, or nested within, that critique is a kind of helpless eroticization of the power of the Right. They’re still in love with Big Daddy, even though they hate him.”

That was certainly true for the women who made Tony Soprano an unlikely sex symbol—and for the men who found him no less seductive. Wish fulfillment has always been at the queasy heart of the mobster genre, the longing for a life outside the bounds of convention, mingled with the conflicted desire to see the perpetrator punished for the same transgression… Likewise for viewers, for whom a life of taking, killing, and sleeping with whomever and whatever one wants had an undeniable, if conflict-laden, appeal. (88)

So Tony reminds us of W. because they’re both powerful figures, and we’re interested in powerful figures because they remind us of our dads and because we eroticize power. Even if this were true, would it contribute anything to our understanding or enjoyment of the show? Are any of these characters really that much like your own dad? Tony smashes some poor guy’s head because he got in his way, and sometimes we wish we could do that. Don Draper sleeps with lots of attractive women, and all the men watching the show would like to do that too. Startling revelations, those.

What a scholar in search of substantive insights might focus on instead is the universality of the struggle to reconcile selfish desires—sex, status, money, comfort—with the needs and well-being of the groups to which we belong. Don Draper wants to sleep around, but he also genuinely wants Betty and their children to be happy. Tony Soprano wants to be feared and respected, but he doesn’t want his daughter to think he’s a murderous thug. Walter White wants to prove he can provide for his family, but he also wants Skyler and Walter Junior to be safe. These tradeoffs and dilemmas—not the difficult men themselves—are what most distinguish these shows from conventional TV dramas. In most movies and shows, the protagonist may have some selfish desires that compete with his or her more altruistic or communal instincts, but which side ultimately wins out is a foregone conclusion. “Heroes are much better suited for the movies,” Martin quotes Alan Ball saying. “I’m more interested in real people. And real people are fucked up” (106).

Ball is the showrunner behind the HBO series Six Feet Under and True Blood, and though Martin gives him quite a bit of space in Difficult Men he doesn’t seem to notice that Ball’s “feminine style” (102) of showrunning undermines his theory about domineering characters being direct reflections of their domineering creators. The handful of interesting observations about what makes for a good series in Martin’s book is pretty evenly divvied up between Ball and David Simon, the creator of The Wire. Recalling his response to the episode of The Sopranos in which Tony strangles a rat while visiting a college campus with Meadow, Ball says,

I felt like I was watching a movie from the seventies. Where it was like, “You know those cartoon ideas of good and evil? Well, forget them. We’re going to address something that’s really real.” The performances were electric. The writing was spectacular. But it was the moral complexity, the complexity of the characters and their dilemmas, that made it incredibly exciting. (94-5)

The connection between us and the characters isn’t just that we have some of the same impulses and desires; it’s that we have to do similar balancing acts as we face similar dilemmas. No, we don’t have to figure out how to whack a guy without our daughters finding out, but a lot of us probably do want to shield our kids from some of the ugliness of our jobs. And most of us have to weigh career advancement against family obligations in one way or another. What makes for compelling drama isn’t our rooting for a character who knows what’s right and does it—that’s not drama at all. What pulls us into these shows is the process the characters go through of deciding which of their competing desires or obligations they should act on. If we see them do the wrong thing once in a while, well, that just ups the ante for the scenes when doing the right thing really counts.

            On the one hand, parents and sponsors want a show that has a good message, a guy with the right ideas and virtuous motives confronted with people with bad ideas and villainous motives. The good guy wins and the lesson is conveyed to the comfortable audiences. On the other hand, writers, for the most part, want to dispense with this idea of lessons and focus on characters with murderous, adulterous, or self-aggrandizing impulses, allowing for the possibility that they’ll sometimes succumb to them. But sometimes writers face the dilemma of having something they really want to say with their stories. Martin describes David Simon’s struggle to square this circle.

 As late as 2012, he would complain in a New York Times interview that fans were still talking about their favorite characters rather than concentrating on the show’s political message… The real miracle of The Wire is that, with only a few late exceptions, it overcame the proud pedantry of its creators to become one of the greatest literary accomplishments of the early twenty-first century. (135)

But then it’s Simon himself whom Martin quotes to explain how having a message to convey can get in the way of a good story.

Everybody, if they’re trying to say something, if they have a point to make, they can be a little dangerous if they’re left alone. Somebody has to be standing behind them saying, dramatically, “Can we do it this way?” When the guy is making the argument about what he’s trying to say, you need somebody else saying, “Yeah, but…” (207)

The exploration of this tension makes up the most substantive and compelling section of Difficult Men.

            Unfortunately, Martin fails to contribute anything to this discussion of drama and dilemmas beyond these short passages and quotes. And at several points he forgets his own observation about drama not being reducible to any underlying message. The most disappointing part of Difficult Men is the chapter devoted to Vince Gilligan and his show Breaking Bad. Gilligan is another counterexample to the theory that domineering and volatile men in the writer’s seat account for domineering and volatile characters in the shows; the writing room he runs gives the chapter its name, “The Happiest Room in Hollywood.” Martin writes that Breaking Bad is “arguably the best show on TV, in many ways the culmination of everything the Third Golden Age had made possible” (264). In trying to explain why the show is so good, he claims that

…whereas the antiheroes of those earlier series were at least arguably the victims of their circumstances—family, society, addiction, and so on—Walter White was insistently, unambiguously, an agent with free will. His journey became a grotesque magnification of the American ethos of self-actualization, Oprah Winfrey’s exhortation that all must find and “live your best life.” What if, Breaking Bad asked, one’s best life happened to be as a ruthless drug lord? (268)

This is Martin making the very mistake he warns against earlier in the book by finding some fundamental message at the core of the show. (Though he could simply believe that even though it’s a bad idea for writers to try to convey messages it’s okay for critics to read them into the shows.) But he’s doing the best he can with the tools of scholarship he’s allowed to marshal. This assessment is an extension of his point about post-feminist dislocation, turning the entire series into a slap in the face to Oprah, that great fount of male angst.

            To point out that Martin is perfectly wrong about Walter White isn’t merely to offer a rival interpretation. Until the end of season four, as any reasonable viewer who’s paid a modicum of attention to the development of his character will attest, Walter is far more at the mercy of circumstances than any of the other antiheroes in the Third Golden Age lineup. Here’s Walter explaining why he doesn’t want to undergo an expensive experimental cancer treatment in season one:

What I want—what I need—is a choice. Sometimes I feel like I never actually make any of my own. Choices, I mean. My entire life, it just seems I never, you know, had a real say about any of it. With this last one—cancer—all I have left is how I choose to approach this.

He’s already secretly cooking meth to make money for his family at this point, but that’s far more a man making the most of a bad situation than a captain of his own fate. Can you imagine Tony or Don saying anything like this? Even when Walt delivers his famous “I am the danger” speech in season four—which gets my vote for the best moment in TV history (or film history too for that matter)—the statement is purely aspirational; he’s still in all kinds of danger at that point. Did Martin neglect the first four seasons and pick up watching only after Walt finally killed Gus? Either way, it’s a big, embarrassing mistake.

           The dilemmas Walt faces are what make his story so compelling. He’s far more powerless than other bad boy characters at the start of the series, and he’s also far more altruistic in his motives. That’s precisely why it’s so disturbing—and riveting—to see those motives corrupted by his gradually accumulating power. It’s hard not to think of the cartel drug lords we always hear about in Mexico according to those “cartoon ideas of good and evil” Alan Ball was so delighted to see smashed by Tony Soprano. But Breaking Bad goes a long way toward bridging the divide between such villains and a type of life we have no trouble imagining. The show isn’t about free will or self-actualization at all; it’s about how even the nicest guy can be turned into one of the scariest villains by being placed in a not all that far-fetched set of circumstances. In much the same way, Martin, clearly a smart guy and a talented writer, can be made to look like a bit of an idiot by being forced to rely on a bunch of really bad ideas as he explores the inner workings of some really great shows.

            If men’s selfish desires—sex, status, money, freedom—aren’t any more powerful than women’s, their approaches to satisfying them still tend to be more direct, less subtle. But what makes it harder for a woman’s struggles with her own desires to take on the same urgency as a man’s is probably not that far removed from the reasons women are seldom as physically imposing as men. Volatility in a large man can be really frightening. Even today, men are more likely to have high-status careers like Don’s, but they’re also far more likely to end up in prison. These are pretty high stakes. And Don’s actions have ramifications for not just his own family’s well-being, but that of everyone at Sterling Cooper and their families, which is a consequence of that high status. So status works as a proxy for size. Carmela Soprano’s volatility could be frightening too, but she isn’t the time-bomb Tony is. Speaking of bombs, Skyler White is an expert at bullying men, but going head-to-head with Walter she’s way overmatched. Men will always be scarier than women on average, so their struggles to rein in their scarier impulses will seem more urgent. Still, the Third Golden Age is a teenager now, and as anxious as I am to see what happens to Walter White and all his friends and family, I think the bad boy thing is getting a little stale. Anyone seen Damages?

Also read:

The Criminal Sublime: Walter White's Brutally Plausible Journey to the Heart of Darkness in Breaking Bad

and:

HOW VIOLENT FICTION WORKS: ROHAN WILSON’S “THE ROVING PARTY” AND JAMES WOOD’S SANGUINARY SUBLIME FROM CONRAD TO MCCARTHY

And:

SABBATH SAYS: PHILIP ROTH AND THE DILEMMAS OF IDEOLOGICAL CASTRATION

Dennis Junk

Muddling through "Life after Life": A Reflection on Plot and Character in Kate Atkinson’s New Novel

Kate Atkinson’s “Life after Life” is absorbing and thought-provoking. But it also leaves the reader feeling wrung out. The issue is that if you’re going to tinker with one element of storytelling, the other elements must be rock solid to hold the entire structure together.

            Every novelist wants to be the one who rewrites the rules of fiction. But it’s remarkable how for all the experimentations over the past couple of centuries most of the basic elements of storytelling have yet to be supplanted. To be sure, a few writers have won over relatively small and likely ephemeral audiences with their scofflaw writerly antics. But guys like D.F. Wallace and Don DeLillo (and even post-Portrait Joyce) only succeeded by appealing to readers’ desire to fit in with the reigning cohort of sophisticates. If telling stories can be thought of as akin to performing magic, with the chief sleight-of-hand being to make the audience forget for a moment that what they’re witnessing is, after all, just a story, then the meager success of experimental fiction over the past few decades can be ascribed to the way it panders to a subset of readers who like to think of themselves as too cool to believe in magic. In the same way we momentarily marvel, not at a magician’s skillfulness at legerdemain, but at the real magic we’ve just borne witness to, the feat of story magic is accomplished by misdirecting attention away from the mechanics of narrative toward the more compelling verisimilitude of the characters and the concrete immediacy of their dilemmas. The authors of experimental works pointedly eschew misdirection and instead go out of their way to call attention to the inner workings of narrative, making for some painfully, purposefully bad stories which may nonetheless garner a modicum of popularity because each nudge and wink to the reader serves as a sort of secret hipster handshake.

            That the citadel of realism has withstood innumerable full-on assaults suggests that the greats who first codified the rules of story writing—the Homers, the Shakespeares, the Austens, the Flauberts, the pre-Ulysses Joyces—weren’t merely making them up whole-cloth and hoping they would catch on, but rather discovering them as entry points to universal facets of the human imagination. Accordingly, the value of any given attempt at fashioning a new narrative mode isn’t exclusively determined by its popularity or staying power. Negative results in fiction, just as in science, can be as fascinating and as fruitful as positive findings because designs with built-in flaws can foster appreciation for more finely tuned and fully functional works. Aspiring novelists might even view the myriad frustrations of experimental fiction as comprising a trail of clues to follow along the path to achievements more faithful to the natural aims of the art form. Such an approach may strike aficionados of the avant-garde as narrow-minded or overly constraining. But writing must operate within a limited set of parameters to be recognized and appreciated as belonging to the category of literary art. And within that category, both societies and individuals find the experience of reading some stories to be more fulfilling, more impactful, more valuable than others. Tastes, societal and individual, along with other factors extrinsic to the story, cannot be discounted. But, though the firmness with which gradations of quality can be established is disputable, the notion that no reliable basis at all could exist for distinguishing the best of stories from the worst resides in a rather remote region on the plausibility scale.

            As an attempt at innovation, Kate Atkinson’s latest novel is uniquely instructive because it relies on a combination of traditional and experimental storytelling techniques. Life after Life has two design flaws, one built-in deliberately, and the other, more damaging one born either of a misconception or a miscalculation. The deliberate flaw is the central conceit of the plot. Ursula Todd, whose birth in an English house called Fox Corner on a day of heavy snow in February of 1910 we witness again and again, meets with as many untimely demises, only to be granted a new beginning in the next chapter according to the author’s whimsy. Ursula isn’t ever fully aware of what occurred in the previous iterations of her expanding personal multiverse, but she has glimmerings, akin to intense déjà vu, that are at several points vivid enough to influence her decisions. A few tragic occurrences even leave traces on what Ursula describes as the “palimpsest” (506) of time pronounced enough to goad her into drastic measures. One of these instances, when the child Ursula pushes a maid named Bridget down the stairs at Fox Corner to prevent her from attending an Armistice celebration where she’ll contract the influenza that dooms them both, ends up being the point where Ursula as a character comes closest to transcending the abortive contrivances of the plot. But another one, her trying to prevent World War II by killing Hitler before he comes to power, only brings the novel’s second design flaw into sharper focus. Wouldn’t keeping Hitler from playing his now historical role be the first revision that occurred to just about anyone?

But for all the authorial manipulations Life after Life is remarkably readable. Atkinson’s prose and her mastery of scene place her among the best novelists working today. The narration rolls along with a cool precision and a casual sophistication that effortlessly takes on perspective after perspective without ever straying too far from Ursula. And the construction of the scenes as overlapping vignettes, each with interleaved time-travels of its own, often has the effect of engrossing your attention enough to distract you from any concern that the current timeline will be unceremoniously abandoned while also obviating, for the most part, any tedium of repetition. Some of the most devastating scenes occur in the chapters devoted to the Blitz, during which Ursula finds herself in the basement of a collapsed apartment building, once as a resident, and later as a volunteer for a rescue service. The first time through, Ursula is knocked unconscious by the explosion that topples the building. What she sees as she comes to slides with disturbing ease from the mundane to the macabre.

Looking up through the fractured floorboards and the shattered beams she could see a dress hanging limply on a coat hanger, hooked to a picture rail. It was the picture rail in the Miller’s lounge on the ground floor, Ursula recognized the wallpaper of sallow, overblown roses. She had seen Lavinia Nesbit on the stairs wearing the dress only this evening, when it had been the color of pea soup (and equally limp). Now it was a gray bomb-dust shade and had migrated down a floor. A few yards from her head she could see her own kettle, a big brown thing, surplus to requirements in Fox Corner. She recognized it from the thick twine wound around the handle one day long ago by Mrs. Glover. Everything was in the wrong place now, including herself. (272)

The narration then moves backward in time to detail how she ended up amid the rubble of the building, including an encounter with Lavinia on the stairs, before returning to that one hitherto innocuous item. Her neighbor had been wearing a brooch in the shape of a cat with a rhinestone for an eye.

Her attention was caught again by Lavinia Nesbit’s dress hanging from the Miller’s picture rail. But it wasn’t Lavinia Nesbit’s dress. A dress didn’t have arms in it. Not sleeves, but arms. With hands. Something on the dress winked at Ursula, a little cat’s eye caught by the crescent moon. The headless, legless body of Lavinia Nesbit herself was hanging from the Miller’s picture rail. It was so absurd that a laugh began to boil up inside Ursula. It never broke because something shifted—a beam, or part of the wall—and she was sprinkled with a shower of talcum-like dust. Her heart thumped uncontrollably in her chest. It was sore, a time-delay bomb waiting to go off. (286)

It’s hard not to imagine yourself in that basement as you read, right down to the absurd laugh that never makes it into existence. This is Atkinson achieving with élan one of the goals of genre fiction—and where else would we expect to find a line about the heroine’s heart thumping uncontrollably in her chest? But in inviting readers to occupy Ursula’s perspective Atkinson has had to empty some space.

            The seamlessness of the narration and the vivid, often lurid episodes captured in the unfailingly well-crafted scenes of Ursula’s many lives effect a degree of immersion in the story that successfully counterbalances the ejective effects of Atkinson’s experimentations with the plot. The experience of these opposing forces—being simultaneously pulled into and cast out of the story—is what makes Life after Life both so intriguing and so instructive. One of the qualities that make stories good is that their various elements operate outside the audience’s awareness. Just as the best performances in cinema are the ones that embody a broad range of emotion while allowing viewers to forget, at least for the moment, that what they’re witnessing is in fact a performance—you’re not watching Daniel Day-Lewis, for instance, but Abraham Lincoln—the best stories immerse readers to the point where they’re no longer considering the story as a story but anxious to discover what lies in store for the characters. True virtuosos in both cinema and fiction, like magicians, want you to have a direct encounter with what never happens and only marvel afterward at the virtuosity that must’ve gone into arranging the illusion. The trick for an author who wants to risk calling attention to the authored nature of the story is to find a way to enfold her manipulations into the reader’s experiences with the characters. Ursula’s many lives must be accepted and understood as an element of the universe in which the plot of Life after Life unfolds and as part of the struggles we hope to see her through by the end of the novel. Unfortunately, the second design flaw, the weakness of Ursula as a character, sabotages the endeavor.

           The most obvious comparison to the repetitious plot of Life after Life is to the 1993 movie Groundhog Day, in which Bill Murray plays a character, Phil Connors, who keeps waking up to re-live the same day. What makes audiences accept this blatantly unrealistic premise is that Phil responds to his circumstances in such a convincing way, co-opting our own disbelief. As the movie progresses, Phil adjusts to the new nature of reality by adopting a new set of goals, and by this point our attention is focused much more on his evolving values than on the potential distraction of the plot’s impossibility. Eventually, the liberties screenwriters Danny Rubin and Harold Ramis have taken with the plot become so intermingled with the character and his development that witnessing his transformations is as close to undergoing them ourselves as the medium can hope to bring us. While at first we might’ve resisted the contrivance, just as Phil does, by the end its implausibility couldn’t be any more perfectly beside the point. In other words, the character’s struggles and transformation are compelling enough to misdirect our attention away from the author’s manipulations. That’s the magic of the film.

            In calling attention to the authoredness of the story within the confines of the story itself, Life after Life is also similar to Ian McEwan’s 2001 novel Atonement. But McEwan doesn’t drop the veil until near the end of the story; only then do we discover that one of the characters, Briony Tallis, is actually the author of everything we’ve been reading and that she has altered the events to provide a happier and more hopeful ending for two other characters whose lives she had, in her youthful naiveté, set on a tragic course. Giving them the ending they deserve is the only way she knows of now to atone for all the pain she caused them in the past. Just as Phil’s transformation misdirects our attention from the manipulations of the plot in Groundhog Day, the revelation of the terrible tragedy that befell the characters in Atonement covers McEwan’s tracks, as we overlook the fact that he’s tricked us as to the true purpose of the narrative because we’re too busy sympathizing with Briony’s futile urge to set things right. In both cases, the experimentation with plot is thoroughly integrated with the development of a strong, unforgettable character, and any expulsive distraction is subsumed by more engrossing revelations. In both cases, the result is pure magic.

            Ursula Todd on the other hand may have been deliberately conceived of as, if not an entirely empty vessel, then a sparsely furnished one. Atkinson may have intended for her to serve as a type of everywoman to make it easy for readers to take on her perspective as she experiences events like the bombing of her apartment building. While we come to know and sympathize with characters like Phil and Briony, we go some distance toward actually becoming Ursula, letting her serve as our avatar in the various historical moments the story allows us to inhabit. By not filling in the outline of Ursula’s character, Atkinson may have been attempting to make our experience of all the scenes more direct and immediate. But the actual effect is to make them less impactful. We have to care about someone in the scene, someone trying to deal with the dilemma it depicts, before we can invest any emotion in it. Atkinson’s description of Lavinia Nesbit’s body makes it easy to imagine, and dismembered bodies are always disturbing to encounter. But her relationship to Ursula is casual, and in the context of the mulligan-calling plot her death is without consequence.

           Another possible explanation for the weakness of Ursula as a character is that Atkinson created her based on the assumption arising out of folk psychology that personality is reducible to personal history, that what happens to you determines who you become. Many authors and screenwriters fall into this trap of thinking they’re exploring characters when all they’re really doing is recounting a series of tragedies that have befallen them. But things happen to everyone. Character is what you do. Ursula is provisioned with a temperament—introverted, agreeable, conscientious—and she has a couple of habits—she’s a stickler for literary quotation—but she’s apathetic about the myriad revisions her life undergoes, and curiously unconcerned about the plot of her own personal story. For all her references to her shifting past, she has no plans or schemes or ambitions for the future. She exists within an intricate network of relationships, but what loves she has are tepid or taken for granted. And throughout the novel what we take at first to be her private thoughts nearly invariably end up being interrupted by memories of how other characters responded when she voiced them. At many points, especially at the beginning of the novel, she’s little more than a bookish girl waiting around for the next really bad thing to happen to her.

After she pushes the maid Bridget down the stairs to prevent her from bringing home the contagion that killed them both in previous lives, Ursula’s mother, Sylvie, sends her to a psychiatrist named Dr. Kellet who introduces her to Nietzsche’s concept of amor fati, which he defines as, “A simple acceptance of what comes to us, regarding it as neither bad nor good.” He then traces the idea back to Pindar, whose take he translates as, “become such as you are, having learned what that is” (164). What does Ursula become? After the incident with the maid, there are a couple more instances of her taking action to avoid the tragedies of earlier iterations, and as the novel progresses it does seem like she might be becoming a little less passively stoic, a little less inert and defeated. But as a character she seems to be responding to the serial do-overs of the plot by taking on the attitude that it doesn’t matter what she does or what she becomes. In one of the timelines she most proactively shapes for herself, she travels to the continent to study Modern Languages so she can be a teacher, but even in this life she does little but idly wait for something to happen. Before returning to England,

She had deferred for a year, saying she wanted an opportunity to see a little of the world before “settling down” to a lifetime at the blackboard. That was her rationale anyway, the one that she paraded for parental scrutiny, whereas her true hope was that something would happen in the course of her time abroad that would mean she need never take up the place. What that “something” was she had no idea (“Love perhaps,” Millie said wistfully). Anything really would mean she didn’t end up as an embittered spinster in a girls’ grammar school, spooling her way through the conjugation of foreign verbs, chalk dust falling from her clothes like dandruff. (She based this portrait on her own schoolmistresses.) It wasn’t a profession that had garnered much enthusiasm in her immediate circle either. (333-4)

Again, the scenes and the mindset are easy to imagine (or recall), but just as Ursula’s plan fails to garner much enthusiasm, her plight—her fate—fails to arouse much concern, nowhere near enough, at any rate, to misdirect our attention from the authoredness of the plot.

            There’s a scene late in the novel that has Ursula’s father, Hugh, pondering his children’s personalities. “Ursula, of course, was different to all of them,” he thinks. “She was watchful, as if she were trying to drink in the whole world through those little green eyes that were both his and hers.” He can’t help adding, “She was rather unnerving” (486). But to be unnerving she would have to at least threaten to do something; she would have to be nosy or meddlesome, like Briony, instead of just watchful. What Hugh seems to be picking up on is that Ursula simply knows more than she should, a precocity born of her wanderings on the palimpsest of time. But whereas a character like Phil quickly learns to exploit his foreknowledge, it never occurs to Ursula to make any adjustments unless it’s to save her life or the life of a family member. Tellingly, however, there are a couple of characters in Life after Life for whom amor fati amounts to something other than an argument for impassivity.

Most people muddled through events and only in retrospect realized their significance. The Führer was different, he was consciously making history for the future. Only a true narcissist could do that. And Speer was designing buildings for Berlin so that they would look good when they were ruins a thousand years from now, his gift to the Führer. (To think on such a scale! Ursula lived hour by hour, another consequence of motherhood, the future as much a mystery as the past.) (351)

Of course, we know Ursula’s living hour by hour isn’t just a consequence of her being a mother, since this is the only timeline on which she becomes one. The moral shading to the issue of whether one should actually participate in history is cast over Ursula’s Aunt Izzie as well. Both Ursula and the rest of her family express a vague—or for Sylvie not so vague—disapproval of Izzie, which is ironic because she’s the most memorable character in the novel—really the only one. Aunt Izzie actually does things. She elopes to Paris with a married man. She writes a series of children’s books. She moves to California with a playwright. And she’s always there to help Ursula when she gets in trouble.

            Whatever the reason behind Atkinson’s decision to make her protagonist a mere silent watcher, the consequence for the novel as a whole is a story devoid of any sense of progression or momentum. Imagine Groundhog Day without a character whose incandescent sarcasm and unchanneled charisma gradually give way to profound fellow-feeling, replaced by one who re-lives the same day over and over without ever seeming to learn or adjust, who never even comes close to pulling off that one perfect day that proves she’s worthy to wake up to a real tomorrow. Imagine Atonement without Briony’s fierce interiority and simmering loneliness. Most stories are going to seem dull compared to these two, but they demonstrate that however fleeting a story’s impact on audiences may be, it begins and ends with the central character’s active engagement with the world and the transformations they undergo as a result of it. Maybe Atkinson wanted to give her readers an experience of life’s preciousness, the contingent nature of everything we hold dear, an antidote to all the rushing desperation to shape an ideal life for ourselves and the wistful worry that we’re at every moment falling short. Unfortunately, those themes make for a story that, as vivid as it can be at points, is as eminently forgettable as its dreamless protagonist. “You may as well have another tot of rum,” a bartender says to the midwife who is being kept from attending Ursula’s umpteenth birth by a snowstorm in the book’s closing line. “You won’t be going anywhere in a hurry tonight” (529). In other words, you’d better find a way to make the most of it.

Also read:

WHAT MAKES "WOLF HALL" SO GREAT?


And:

SABBATH SAYS: PHILIP ROTH AND THE DILEMMAS OF IDEOLOGICAL CASTRATION


The Self-Righteousness Instinct: Steven Pinker on the Better Angels of Modernity and the Evils of Morality

Is violence really declining? How can that be true? What could be causing it? Why are so many of us convinced the world is going to hell in a hand basket? Steven Pinker attempts to answer these questions in his magnificent and mind-blowing book.


Steven Pinker is one of the few scientists who can write a really long book and still expect a significant number of people to read it. But I have a feeling many who might be vaguely intrigued by the buzz surrounding his 2011 book The Better Angels of Our Nature: Why Violence Has Declined wonder why he had to make it nearly seven hundred outsized pages long. Many curious folk likely also wonder why a linguist who proselytizes for psychological theories derived from evolutionary or Darwinian accounts of human nature would write a doorstop drawing on historical and cultural data to describe the downward trajectories of rates of the worst societal woes. The message that violence of pretty much every variety is at unprecedentedly low rates comes as quite a shock, as it runs counter to our intuitive, news-fueled sense of being on a crash course for Armageddon. So part of the reason behind the book’s heft is that Pinker has to bolster his case with lots of evidence to get us to rethink our views. But flipping through the book you find that somewhere between half and a third of its mass is devoted, not to evidence of the decline, but to answering the questions of why the trend has occurred and why it gives every indication of continuing into the foreseeable future. So is this a book about how evolution has made us violent or about how culture is making us peaceful?

The first thing that needs to be said about Better Angels is that you should read it. Despite its girth, it’s at no point the least bit cumbersome to read, and at many points it’s so fascinating that, weighty as it is, you’ll have a hard time putting it down. Pinker has mastered a prose style that’s simple and direct to the point of feeling casual without ever wanting for sophistication. You can also rest assured that what you’re reading is timely and important because it explores aspects of history and social evolution that impact pretty much everyone in the world but that have gone ignored—if not censoriously denied—by most of the eminences contributing to the zeitgeist since the decades following the last world war.

            Still, I suspect many people who take the plunge into the first hundred or so pages are going to feel a bit disoriented as they try to figure out what the real purpose of the book is, and this may cause them to falter in their resolve to finish reading. The problem is that the resistance Better Angels runs to such a prodigious page-count anticipating and responding to comes not from news media or the blinkered celebrities in the carnivals of sanctimonious imbecility that are political talk shows, but from Pinker’s fellow academics. The overall point of Better Angels remains obscure owing to some deliberate caginess on the author’s part when it comes to identifying the true targets of his arguments. 

            This evasiveness doesn’t make the book difficult to read, but a quality of diffuseness to the theoretical sections, a multitude of strands left dangling, does at points make you doubt whether Pinker had a clear purpose in writing, which makes you doubt your own purpose in reading. With just a little tying together of those strands, however, you start to see that while on the surface he’s merely righting the misperception that over the course of history our species has been either consistently or increasingly violent, what he’s really after is something different, something bigger. He’s trying to instigate, or at least play a part in instigating, a revolution—or more precisely a renaissance—in the way scholars and intellectuals think not just about human nature but about the most promising ways to improve the lot of human societies.

The longstanding complaint about evolutionary explanations of human behavior is that by focusing on our biology as opposed to our supposedly limitless capacity for learning they imply a certain level of fixity to our nature, and this fixedness is thought to further imply a limit to what political reforms can accomplish. The reasoning goes, if the explanation for the way things are is to be found in our biology, then, unless our biology changes, the way things are is the way they’re going to remain. Since biological change occurs at the glacial pace of natural selection, we’re pretty much stuck with the nature we have. 

            Historically, many scholars have made matters worse for evolutionary scientists today by applying ostensibly Darwinian reasoning to what seemed at the time obvious biological differences between human races in intelligence and capacity for acquiring the more civilized graces, making no secret of their conviction that the differences justified colonial expansion and other forms of oppressive rule. As a result, evolutionary psychologists of the past couple of decades have routinely had to defend themselves against charges that they’re secretly trying to advance some reactionary (or even genocidal) agenda. Considering Pinker’s choice of topic in Better Angels in light of this type of criticism, we can start to get a sense of what he’s up to—and why his efforts are discombobulating.

If you’ve spent any time on a university campus in the past forty years, particularly if it was in a department of the humanities, then you have been inculcated with an ideology that was once labeled postmodernism but that eventually became so entrenched in academia, and in intellectual culture more broadly, that it no longer requires a label. (If you took a class with the word "studies" in the title, then you got a direct shot to the brain.) Many younger scholars actually deny any espousal of it—“I’m not a pomo!”—with reference to a passé version marked by nonsensical tangles of meaningless jargon and the conviction that knowledge of the real world is impossible because “the real world” is merely a collective delusion or social construction put in place to perpetuate societal power structures. The disavowals notwithstanding, the essence of the ideology persists in an inescapable but unremarked obsession with those same power structures—the binaries of men and women, whites and blacks, rich and poor, the West and the rest—and the abiding assumption that texts and other forms of media must be assessed not just according to their truth content, aesthetic virtue, or entertainment value, but also with regard to what we imagine to be their political implications. Indeed, those imagined political implications are often taken as clear indicators of the author’s true purpose in writing, which we must sniff out—through a process called “deconstruction,” or its anemic offspring “rhetorical analysis”—lest we complacently succumb to the subtle persuasion.

In the late nineteenth and early twentieth centuries, faith in what we now call modernism inspired intellectuals to assume that the civilizations of Western Europe and the United States were on a steady march of progress toward improved lives for all their own inhabitants as well as the world beyond their borders. Democracy had brought about a new age of government in which rulers respected the rights and freedom of citizens. Medicine was helping ever more people live ever longer lives. And machines were transforming everything from how people labored to how they communicated with friends and loved ones. Everyone recognized that the driving force behind this progress was the juggernaut of scientific discovery. But jump ahead a hundred years to the early twenty-first century and you see a quite different attitude toward modernity. As Pinker explains in the closing chapter of Better Angels,

A loathing of modernity is one of the great constants of contemporary social criticism. Whether the nostalgia is for small-town intimacy, ecological sustainability, communitarian solidarity, family values, religious faith, primitive communism, or harmony with the rhythms of nature, everyone longs to turn back the clock. What has technology given us, they say, but alienation, despoliation, social pathology, the loss of meaning, and a consumer culture that is destroying the planet to give us McMansions, SUVs, and reality television? (692)

The social pathology here consists of all the inequities and injustices suffered by the people on the losing side of those binaries all us closet pomos go about obsessing over. Then of course there’s industrial-scale war and all the other types of modern violence. With terrorism, the War on Terror, the civil war in Syria, the Israel-Palestine conflict, genocides in the Sudan, Kosovo, and Rwanda, and the marauding bands of drugged-out gang rapists in the Congo, it seems safe to assume that science and democracy and capitalism have contributed to the construction of an unsafe global system with some fatal, even catastrophic design flaws. And that’s before we consider the two world wars and the Holocaust. So where the hell is this decline Pinker refers to in his title?

            One way to think about the strain of postmodernism or anti-modernism with the most currency today (and if you’re reading this essay you can just assume your views have been influenced by it) is that it places morality and politics—identity politics in particular—atop a hierarchy of guiding standards above science and individual rights. So, for instance, concerns over the possibility that a negative image of Amazonian tribespeople might encourage their further exploitation trump objective reporting on their culture by anthropologists, even though there’s no evidence to support those concerns. And evidence that the disproportionate number of men in STEM fields reflects average differences between men and women in lifestyle preferences and career interests is ignored out of deference to a political ideal of perfect parity. The urge to grant moral and political ideals veto power over science is justified in part by all the oppression and injustice that abounds in modern civilizations—sexism, racism, economic exploitation—but most of all it’s rationalized with reference to the violence thought to follow in the wake of any movement toward modernity. Pinker writes,

“The twentieth century was the bloodiest in history” is a cliché that has been used to indict a vast range of demons, including atheism, Darwin, government, science, capitalism, communism, the ideal of progress, and the male gender. But is it true? The claim is rarely backed up by numbers from any century other than the 20th, or by any mention of the hemoclysms of centuries past. (193)

He gives the question even more gravity when he reports that all those other areas in which modernity is alleged to be such a colossal failure tend to improve in the absence of violence. “Across time and space,” he writes in the preface, “the more peaceable societies also tend to be richer, healthier, better educated, better governed, more respectful of their women, and more likely to engage in trade” (xxiii). So the question isn’t just about what the story with violence is; it’s about whether science, liberal democracy, and capitalism are the disastrous blunders we’ve learned to think of them as or whether they still just might hold some promise for a better world.

*******

            It’s in about the third chapter of Better Angels that you start to get the sense that Pinker’s style of thinking is, well, way out of style. He seems to be marching to the beat not of his own drummer but of some drummer from the nineteenth century. In the previous chapter, he drew a line connecting the violence of chimpanzees to that in what he calls non-state societies, and the images he’s left you with are savage indeed. Now he’s bringing in the philosopher Thomas Hobbes’s idea of a government Leviathan that once established immediately works to curb the violence that characterizes us humans in states of nature and anarchy. According to sociologist Norbert Elias’s book The Civilizing Process (first published in 1939, though little known before its 1969 reissue), a work whose thesis plays a starring role throughout Better Angels, the consolidation of a Leviathan in England set in motion a trend toward pacification, beginning with the aristocracy no less, before spreading down to the lower ranks and radiating out to the countries of continental Europe and onward thence to other parts of the world. You can measure your feelings of unease in response to Pinker’s civilizing scenario as a proxy for how thoroughly steeped you are in postmodernism.

            The two factors missing from his account of the civilizing pacification of Europe that distinguish it from the self-congratulatory and self-exculpatory sagas of centuries past are the innate superiority of the paler stock and the special mission of conquest and conversion commissioned by a Christian god. In a later chapter, Pinker violates the contemporary taboo against discussing—or even thinking about—the potential role of average group (racial) differences in a propensity toward violence, but he concludes the case for any such differences is unconvincing: “while recent biological evolution may, in theory, have tweaked our inclinations toward violence and nonviolence, we have no good evidence that it actually has” (621). The conclusion that the Civilizing Process can’t be contingent on congenital characteristics follows from the observation of how readily individuals from far-flung regions acquire local habits of self-restraint and fellow-feeling when they’re raised in modernized societies. As for religion, Pinker includes it in a category of factors that are “Important but Inconsistent” with regard to the trend toward peace, dismissing the idea that atheism leads to genocide by pointing out that “Fascism happily coexisted with Catholicism in Spain, Italy, Portugal, and Croatia, and though Hitler had little use for Christianity, he was by no means an atheist, and professed that he was carrying out a divine plan.” Though he cites several examples of atrocities incited by religious fervor, he does credit “particular religious movements at particular times in history” with successfully working against violence (677).

            Despite his penchant for blithely trampling on the taboos of the liberal intelligentsia, Pinker refuses to cooperate with our reflex to pigeonhole him with imperialists or far-right traditionalists past or present. He continually holds up to ridicule the idea that violence has any redeeming effects. In a section on the connection between increasing peacefulness and rising intelligence, he suggests that our violence-tolerant “recent ancestors” can rightly be considered “morally retarded” (658). He singles out George W. Bush as an unfortunate and contemptible counterexample in a trend toward more complex political rhetoric among our leaders. And if either gender comes out of Better Angels not looking virtuous, it ain’t the distaff one. Pinker is difficult to categorize politically because he’s a scientist through and through. What he’s after are reasoned arguments supported by properly weighed evidence.

But there is something going on in Better Angels beyond a mere accounting for the ongoing decline in violence that most of us are completely oblivious of being the beneficiaries of. For one, there’s a challenge to the taboo status of topics like genetic differences between groups, or differences between individuals in IQ, or differences between genders. And there’s an implicit challenge as well to the complementary premises he took on more directly in his earlier book The Blank Slate that biological theories of human nature always lead to oppressive politics and that theories of the infinite malleability of human behavior always lead to progress (communism relies on a blank slate theory, and it inspired guys like Stalin, Mao, and Pol Pot to murder untold millions). But the most interesting and important task Pinker has set for himself with Better Angels is a restoration of the Enlightenment, with its twin pillars of science and individual rights, to its rightful place atop the hierarchy of our most cherished guiding principles, the position we as a society misguidedly allowed to be usurped by postmodernism, with its own dual pillars of relativism and identity politics.

            But, while the book succeeds handily in undermining the moral case against modernism, it does so largely by stealth, with only a few explicit references to the ideologies whose advocates have dogged Pinker and his fellow evolutionary psychologists for decades. Instead, he explores how our moral intuitions and political ideals often inspire us to make profoundly irrational arguments for positions that rational scrutiny reveals to be quite immoral, even murderous. As one illustration of how good causes can be taken to silly, but as yet harmless, extremes, he gives the example of how “violence against children has been defined down to dodgeball” (415) in gym classes all over the US, writing that

The prohibition against dodgeball represents the overshooting of yet another successful campaign against violence, the century-long movement to prevent the abuse and neglect of children. It reminds us of how a civilizing offensive can leave a culture with a legacy of puzzling customs, peccadilloes, and taboos. The code of etiquette bequeathed to us by this and other Rights Revolutions is pervasive enough to have acquired a name. We call it political correctness. (381)

Such “civilizing offensives” are deliberately undertaken counterparts to the fortuitously occurring Civilizing Process Elias proposed to explain the jagged downward slope in graphs of relative rates of violence beginning in the Middle Ages in Europe. The original change Elias describes came about as a result of rulers consolidating their territories and acquiring greater authority. As Pinker explains,

Once Leviathan was in charge, the rules of the game changed. A man’s ticket to fortune was no longer being the baddest knight in the area but making a pilgrimage to the king’s court and currying favor with him and his entourage. The court, basically a government bureaucracy, had no use for hotheads and loose cannons, but sought responsible custodians to run its provinces. The nobles had to change their marketing. They had to cultivate their manners, so as not to offend the king’s minions, and their empathy, to understand what they wanted. The manners appropriate for the court came to be called “courtly” manners or “courtesy.” (75)

And this higher premium on manners and self-presentation among the nobles would lead to a cascade of societal changes.

Elias first lighted on his theory of the Civilizing Process as he was reading some of the etiquette guides which survived from that era. It’s striking to us moderns to see that knights of yore had to be told not to dispose of their snot by shooting it into their host’s table cloth, but that simply shows how thoroughly people today internalize these rules. As Elias explains, they’ve become second nature to us. Of course, we still have to learn them as children. Pinker prefaces his discussion of Elias’s theory with a recollection of his bafflement at why it was so important for him as a child to abstain from using his knife as a backstop to help him scoop food off his plate with a fork. Table manners, he concludes, reside on the far end of a continuum of self-restraint at the opposite end of which are once-common practices like cutting off the nose of a dining partner who insults you. Likewise, protecting children from the perils of flying rubber balls is the product of a campaign against the once-common custom of brutalizing them. The centrality of self-control is the common underlying theme: we control our urge to misuse utensils, including their use in attacking our fellow diners, and we control our urge to throw things at our classmates, even if it’s just in sport. The effect of the Civilizing Process in the Middle Ages, Pinker explains, was that “A culture of honor—the readiness to take revenge—gave way to a culture of dignity—the readiness to control one’s emotions” (72). In other words, diplomacy became more important than deterrence.

            What we’re learning here is that even an evolved mind can adjust to changing incentive schemes. Chimpanzees have to control their impulses toward aggression, sexual indulgence, and food consumption in order to survive in hierarchical bands with other chimps, many of whom are bigger, stronger, and better-connected. Much of the violence in chimp populations takes the form of adult males vying for positions in the hierarchy so they can enjoy the perquisites males of lower status must forgo to avoid being brutalized. Lower-ranking males meanwhile bide their time, deferring gratification until such time as they grow stronger or the alpha grows weaker. In humans, the capacity for impulse-control and the habit of delaying gratification are even more important because we live in even more complex societies. Those capacities can either lie dormant or they can be developed to their full potential depending on exactly how complex the society is in which we come of age. Elias noticed a connection between the move toward more structured bureaucracies, less violence, and an increasing focus on etiquette, and he concluded that self-restraint in the form of adhering to strict codes of comportment was both an advertisement of, and a type of training for, the impulse-control that would make someone a successful bureaucrat.

            Aside from children who can’t fathom why we’d futz with our forks trying to capture recalcitrant peas, we normally take our society’s rules of etiquette for granted, no matter how inconvenient or illogical they are, seldom thinking twice before drawing unflattering conclusions about people who don’t bother adhering to them, the ones for whom they aren’t second nature. And the importance we place on etiquette goes beyond table manners. We judge people according to the discretion with which they dispose of any and all varieties of bodily effluent, as well as the delicacy with which they discuss topics sexual or otherwise basely instinctual. 

            Elias and Pinker’s theory is that, while the particular rules are largely arbitrary, the underlying principle of transcending our animal nature through the application of will, motivated by an appreciation of social convention and the sensibilities of fellow community members, is what marked the transition of certain constituencies of our species from a violent non-state existence to a relatively peaceful, civilized lifestyle. To Pinker, the uptick in violence that ensued once the counterculture of the 1960s came into full blossom was no coincidence. The squares may not have been as exciting as the rock stars who sang their anthems to hedonism and the liberating thrill of sticking it to the man. But a society of squares has certain advantages—a lower probability for each of its citizens of getting beaten or killed foremost among them.

            The Civilizing Process as Elias and Pinker, along with Immanuel Kant, understand it picks up momentum as levels of peace conducive to increasingly complex forms of trade are achieved. To understand why the move toward markets or “gentle commerce” would lead to decreasing violence, we pomos have to swallow—at least momentarily—our animus for Wall Street and all the corporate fat cats in the top one percent of the wealth distribution. The basic dynamic underlying trade is that one person has access to more of something than they need, but less of something else, while another person has the opposite balance, so a trade benefits them both. It’s a win-win, or a positive-sum game. The hard part for educated liberals is to appreciate that economies work to increase the total wealth; there isn’t a set quantity everyone has to divvy up in a zero-sum game, an exchange in which every gain for one is a loss for another. And Pinker points to another benefit:

Positive-sum games also change the incentives for violence. If you’re trading favors or surpluses with someone, your trading partner suddenly becomes more valuable to you alive than dead. You have an incentive, moreover, to anticipate what he wants, the better to supply it to him in exchange for what you want. Though many intellectuals, following in the footsteps of Saints Augustine and Jerome, hold businesspeople in contempt for their selfishness and greed, in fact a free market puts a premium on empathy. (77)

The Occupy Wall Street crowd will want to jump in here with a lengthy list of examples of businesspeople being unempathetic in the extreme. But Pinker isn’t saying commerce always forces people to be altruistic; it merely encourages them to exercise their capacity for perspective-taking. Discussing the emergence of markets, he writes,

The advances encouraged the division of labor, increased surpluses, and lubricated the machinery of exchange. Life presented people with more positive-sum games and reduced the attractiveness of zero-sum plunder. To take advantage of the opportunities, people had to plan for the future, control their impulses, take other people’s perspectives, and exercise the other social and cognitive skills needed to prosper in social networks. (77)

And these changes, the theory suggests, will tend to make merchants less likely on average to harm anyone. As bad as bankers can be, they’re not out sacking villages.

            Once you have commerce, you also have a need to start keeping records. And once you start dealing with distant partners it helps to have a mode of communication that travels. As writing moved out of the monasteries, and as technological advances in transportation brought more of the world within reach, ideas and innovations collided to inspire sequential breakthroughs and discoveries. Every advance could be preserved, dispersed, and ratcheted up. Pinker focuses on two relatively brief historical periods that witnessed revolutions in the way we think about violence, and both came in the wake of major advances in the technologies involved in transportation and communication. The first is the Humanitarian Revolution that occurred in the second half of the eighteenth century, and the second covers the Rights Revolutions in the second half of the twentieth. The Civilizing Process and gentle commerce weren’t sufficient to end age-old institutions like slavery and the torture of heretics. But then came the rise of the novel as a form of mass entertainment, and with all the training in perspective-taking readers were undergoing the hitherto unimagined suffering of slaves, criminals, and swarthy foreigners became intolerably imaginable. People began to agitate and change ensued.

            The Humanitarian Revolution occurred at the tail end of the Age of Reason and is recognized today as part of the period known as the Enlightenment. According to some scholarly scenarios, the Enlightenment, for all its successes like the American Constitution and the abolition of slavery, paved the way for all those allegedly unprecedented horrors in the first half of the twentieth century. Notwithstanding all this ivory tower traducing, the Enlightenment emerged from dormancy after the Second World War and gradually gained momentum, delivering us into a period Pinker calls the New Peace. Just as the original Enlightenment was preceded by increasing cosmopolitanism, improving transportation, and an explosion of literacy, the transformations that brought about the New Peace followed a burst of technological innovation. For Pinker, this is no coincidence. He writes,

If I were to put my money on the single most important exogenous cause of the Rights Revolutions, it would be the technologies that made ideas and people increasingly mobile. The decades of the Rights Revolutions were the decades of the electronics revolutions: television, transistor radios, cable, satellite, long-distance telephones, photocopiers, fax machines, the Internet, cell phones, text messaging, Web video. They were the decades of the interstate highway, high-speed rail, and the jet airplane. They were the decades of the unprecedented growth in higher education and in the endless frontier of scientific research. Less well known is that they were also the decades of an explosion in book publishing. From 1960 to 2000, the annual number of books published in the United States increased almost fivefold. (477)

Violence got slightly worse in the 60s. But the Civil Rights Movement was underway, Women’s Rights were being extended into new territories, and people even began to acknowledge that animals could suffer, prompting arguments that we shouldn’t make them suffer needlessly. Today the push for Gay Rights continues. By 1990, the uptick in violence was over, and so far the move toward peace is looking like an ever greater success. Ironically, though, all the new types of media bringing images from all over the globe into our living rooms and pockets contribute to the sense that violence is worse than ever.

*******

            Three factors, then, brought about a reduction in violence over the course of history: strong government, trade, and communications technology. These factors had the impact they did because they interacted with two of our innate propensities, impulse-control and perspective-taking, giving individuals both the motivation and the wherewithal to develop the two of them to ever greater degrees. It’s difficult to draw a clear delineation between developments that were driven by chance or coincidence and those driven by deliberate efforts to transform societies. But Pinker does credit political movements based on moral principles with having played key roles:

Insofar as violence is immoral, the Rights Revolutions show that a moral way of life often requires a decisive rejection of instinct, culture, religion, and standard practice. In their place is an ethics that is inspired by empathy and reason and stated in the language of rights. We force ourselves into the shoes (or paws) of other sentient beings and consider their interests, starting with their interest in not being hurt or killed, and we ignore superficialities that may catch our eye such as race, ethnicity, gender, age, sexual orientation, and to some extent, species. (475)

Some of the instincts we must reject in order to bring about peace, however, are actually moral instincts.

Pinker is setting up a distinction here between different kinds of morality. The one he describes that’s based on perspective-taking—which evidence he presents later suggests inspires sympathy—and is “stated in the language of rights” is the one he credits with transforming the world for the better. Of the idea that superficial differences shouldn’t distract us from our common humanity, he writes,

This conclusion, of course, is the moral vision of the Enlightenment and the strands of humanism and liberalism that have grown out of it. The Rights Revolutions are liberal revolutions. Each has been associated with liberal movements, and each is currently distributed along a gradient that runs, more or less, from Western Europe to the blue American states to the red American states to the democracies of Latin America and Asia and then to the more authoritarian countries, with Africa and most of the Islamic world pulling up the rear. In every case, the movements have left Western cultures with excesses of propriety and taboo that are deservedly ridiculed as political correctness. But the numbers show that the movements have reduced many causes of death and suffering and have made the culture increasingly intolerant of violence in any form. (475-6)

So you’re not allowed to play dodgeball at school or tell off-color jokes at work, but that’s a small price to pay. The most remarkable part of this passage, though, is that gradient he describes; it suggests the most violent regions of the globe are also the ones where people are the most obsessed with morality, with things like Sharia and so-called family values. It also suggests that academic complaints about the evils of Western culture are unfounded and startlingly misguided. As Pinker casually points out in his section on Women’s Rights, “Though the United States and other Western nations are often accused of being misogynistic patriarchies, the rest of the world is immensely worse” (413).

The Better Angels of Our Nature came out about a year before Jonathan Haidt’s The Righteous Mind, but Pinker’s book beats Haidt’s to the punch by identifying a serious flaw in his reasoning. The Righteous Mind explores how liberals and conservatives conceive of morality differently, and Haidt argues that each conception is equally valid so we should simply work to understand and appreciate opposing political views. It’s not like you’re going to change anyone’s mind anyway, right? But the liberal ideal of resisting certain moral intuitions tends to bring about a rather important change wherever it’s allowed to be realized. Pinker writes that

right or wrong, retracting the moral sense from its traditional spheres of community, authority, and purity entails a reduction of violence. And that retraction is precisely the agenda of classical liberalism: a freedom of individuals from tribal and authoritarian force, and a tolerance of personal choices as long as they do not infringe on the autonomy and well-being of others. (637)

Classical liberalism—which Pinker distinguishes from contemporary political liberalism—can even be viewed as an effort to move morality away from the realm of instincts and intuitions into the more abstract domains of law and reason. The perspective-taking at the heart of Enlightenment morality can be said to consist of abstracting yourself from your identifying characteristics and immediate circumstances to imagine being someone else in unfamiliar straits. A man with a job imagines being a woman who can’t get one. A white man on good terms with law enforcement imagines being a black man who gets harassed. This practice of abstracting experiences and distilling individual concerns down to universal principles is the common thread connecting Enlightenment morality to science.

            So it’s probably no coincidence, Pinker argues, that as we’ve gotten more peaceful, people in Europe and the US have been getting better at abstract reasoning as well, a trend which has been going on for as long as researchers have had tests to measure it. Psychologists over the course of the twentieth century have had to renorm IQ tests every generation, shifting the scale a few points so that the average stays at 100, because scores on certain subsets of questions have kept going up. This steady rise in scores is known as the Flynn Effect, after psychologist James Flynn, who was one of the first researchers to realize the trend was more than methodological noise. Having posited a possible connection between scientific and moral reasoning, Pinker asks, “Could there be a moral Flynn Effect?” He explains,

We have several grounds for supposing that enhanced powers of reason—specifically, the ability to set aside immediate experience, detach oneself from a parochial vantage point, and frame one’s ideas in abstract, universal terms—would lead to better moral commitments, including an avoidance of violence. And we have just seen that over the course of the 20th century, people’s reasoning abilities—particularly their ability to set aside immediate experience, detach themselves from a parochial vantage point, and think in abstract terms—were steadily enhanced. (656)

Pinker cites evidence from an array of studies showing that high-IQ people tend to have high moral IQs as well. One of them, an infamous study by psychologist Satoshi Kanazawa based on data from over twenty thousand young adults in the US, demonstrates that exceptionally intelligent people tend to hold a particular set of political views. And just as Pinker finds it necessary to distinguish between two different types of morality, he suggests we also need to distinguish between two different types of liberalism:

Intelligence is expected to correlate with classical liberalism because classical liberalism is itself a consequence of the interchangeability of perspectives that is inherent to reason itself. Intelligence need not correlate with other ideologies that get lumped into contemporary left-of-center political coalitions, such as populism, socialism, political correctness, identity politics, and the Green movement. Indeed, classical liberalism is sometimes congenial to the libertarian and anti-political-correctness factions in today’s right-of-center coalitions. (662)

And Kanazawa’s findings bear this out. It’s not liberalism in general that increases steadily with intelligence, but a particular kind of liberalism, the type focusing more on fairness than on ideology.

*******

Following the chapters devoted to historical change, from the early Middle Ages to the ongoing Rights Revolutions, Pinker includes two chapters on psychology, the first on our “Inner Demons” and the second on our “Better Angels.” Ideology gets some prime real estate in the Demons chapter, because, he writes, “the really big body counts in history pile up” when people believe they’re serving some greater good. “Yet for all that idealism,” he explains, “it’s ideology that drove many of the worst things that people have ever done to each other.” Christianity, Nazism, communism—they all “render opponents of the ideology infinitely evil and hence deserving of infinite punishment” (556). Pinker’s discussion of morality, on the other hand, is more complicated. It begins, oddly enough, in the Demons chapter, but stretches into the Angels one as well. This is how the section on morality in the Angels chapter begins:

The world has far too much morality. If you added up all the homicides committed in pursuit of self-help justice, the casualties of religious and revolutionary wars, the people executed for victimless crimes and misdemeanors, and the targets of ideological genocides, they would surely outnumber the fatalities from amoral predation and conquest. The human moral sense can excuse any atrocity in the minds of those who commit it, and it furnishes them with motives for acts of violence that bring them no tangible benefit. The torture of heretics and conversos, the burning of witches, the imprisonment of homosexuals, and the honor killing of unchaste sisters and daughters are just a few examples. (622)

The postmodern push to give precedence to moral and political considerations over science, reason, and fairness may seem like a good idea at first. But political ideologies can’t be defended on the grounds of their good intentions—they all have those. And morality has historically caused more harm than good. It’s only the minimalist, liberal morality that has any redemptive promise:

Though the net contribution of the human moral sense to human well-being may well be negative, on those occasions when it is suitably deployed it can claim some monumental advances, including the humanitarian reforms of the Enlightenment and the Rights Revolutions of recent decades. (622)

            One of the problems with ideologies Pinker explores is that they lend themselves too readily to for-us-or-against-us divisions which piggyback on all our tribal instincts, leading to dehumanization of opponents as a step along the path to unrestrained violence. But, we may ask, isn’t the Enlightenment just another ideology? If not, is there some reliable way to distinguish an ideological movement from a “civilizing offensive” or a “Rights Revolution”? Pinker doesn’t answer these questions directly, but it’s in his discussion of the demonic side of morality where Better Angels offers its most profound insights—and it’s also where we start to be able to piece together the larger purpose of the book. He writes,

In The Blank Slate I argued that the modern denial of the dark side of human nature—the doctrine of the Noble Savage—was a reaction against the romantic militarism, hydraulic theories of aggression, and glorification of struggle and strife that had been popular in the late 19th and early 20th centuries. Scientists and scholars who question the modern doctrine have been accused of justifying violence and have been subjected to vilification, blood libel, and physical assault. The Noble Savage myth appears to be another instance of an antiviolence movement leaving a cultural legacy of propriety and taboo. (488)

Since Pinker figured that what he and his fellow evolutionary psychologists kept running up against was akin to the repulsion people feel against poor table manners or kids winging balls at each other in gym class, he reasoned that he ought to be able to simply explain to the critics that evolutionary psychologists have no intention of justifying, or even encouraging complacency toward, the dark side of human nature. “But I am now convinced,” he writes after more than a decade of trying to explain himself, “that a denial of the human capacity for evil runs even deeper, and may itself be a feature of human nature” (488). That feature, he goes on to explain, makes us feel compelled to label as evil anyone who tries to explain evil scientifically—because evil as a cosmic force beyond the reach of human understanding plays an indispensable role in group identity.

            Pinker began to fully appreciate the nature of the resistance to letting biology into discussions of human harm-doing when he read about the work of psychologist Roy Baumeister exploring the wide discrepancies in accounts of anger-inducing incidents between perpetrators and victims. The first studies looked at responses to minor offenses, but Baumeister went on to present evidence that the pattern, which Pinker labels the “Moralization Gap,” can be scaled up to describe societal attitudes toward historical atrocities. Pinker explains,

The Moralization Gap consists of complementary bargaining tactics in the negotiation for recompense between a victim and a perpetrator. Like opposing counsel in a lawsuit over a tort, the social plaintiff will emphasize the deliberateness, or at least the depraved indifference, of the defendant’s action, together with the pain and suffering the plaintiff endures. The social defendant will emphasize the reasonableness or unavoidability of the action, and will minimize the plaintiff’s pain and suffering. The competing framings shape the negotiations over amends, and also play to the gallery in a competition for their sympathy and for a reputation as a responsible reciprocator. (491)

Another of the Inner Demons Pinker suggests plays a key role in human violence is the drive for dominance, which he explains operates not just at the level of the individual but at that of the group to which he or she belongs. We want our group, however we understand it in the immediate context, to rest comfortably atop a hierarchy of other groups. What happens is that the Moralization Gap gets mingled with this drive to establish individual and group superiority. You see this dynamic playing out even in national conflicts. Pinker points out,

The victims of a conflict are assiduous historians and cultivators of memory. The perpetrators are pragmatists, firmly planted in the present. Ordinarily we tend to think of historical memory as a good thing, but when the events being remembered are lingering wounds that call for redress, it can be a call to violence. (493)

Name a conflict and with little effort you’ll likely also be able to recall contentions over historical records associated with it.

            The outcome of the Moralization Gap being taken to the group historical level is what Pinker and Baumeister call the “Myth of Pure Evil.” Harm-doing narratives start to take on religious overtones as what began as a conflict between regular humans pursuing or defending their interests, in ways they probably reasoned were just, transforms into an eternal struggle against inhuman and sadistic agents of chaos. And Pinker has come to realize that it is this Myth of Pure Evil that behavioral scientists ineluctably end up blaspheming:

Baumeister notes that in the attempt to understand harm-doing, the viewpoint of the scientist or scholar overlaps with the viewpoint of the perpetrator. Both take a detached, amoral stance toward the harmful act. Both are contextualizers, always attentive to the complexities of the situation and how they contributed to the causation of the harm. And both believe that the harm is ultimately explicable. (495)

This is why evolutionary psychologists who study violence inspire what Pinker in The Blank Slate called “political paranoia and moral exhibitionism” (106) on the part of us naïve pomos, ravenously eager to showcase our valor by charging once more into the breach against the mythical malevolence. All the while, our impregnable assurance of our own righteousness is borne of the conviction that we’re standing up for the oppressed. Pinker writes,

The viewpoint of the moralist, in contrast, is the viewpoint of the victim. The harm is treated with reverence and awe. It continues to evoke sadness and anger long after it was perpetrated. And for all the feeble ratiocination we mortals throw at it, it remains a cosmic mystery, a manifestation of the irreducible and inexplicable existence of evil in the universe. Many chroniclers of the Holocaust consider it immoral even to try to explain it. (495-6)

We simply can’t help inflating the magnitude of the crime in our attempt to convince our ideological opponents of their folly—though what we’re really inflating is our own, and our group’s, glorification—and so we can’t abide anyone puncturing our overblown conception because doing so lends credence to the opposition, making us look a bit foolish in the process for all our exaggerations.

            Reading Better Angels, you get the sense that Pinker experienced some genuine surprise and some real delight in discovering more and more corroboration for the idea that rates of violence have been trending downward in nearly every domain he explored. But things get tricky as you proceed through the pages because many of his arguments take on opposing positions he avoids naming. He seems to have seen the trove of evidence for declining violence as an opportunity to outflank the critics of evolutionary psychology in leftist, postmodern academia (to use a martial metaphor). Instead of calling them out directly, he circles around to chip away at the moral case for their political mission. We see this, for example, in his discussion of rape, which psychologists get into all kinds of trouble for trying to explain. After examining how scientists seem to be taking the perspective of perpetrators, Pinker goes on to write,

The accusation of relativizing evil is particularly likely when the motive the analyst imputes to the perpetrator appears to be venial, like jealousy, status, or retaliation, rather than grandiose, like the persistence of suffering in the world or the perpetuation of race, class, or gender oppression. It is also likely when the analyst ascribes the motive to every human being rather than to a few psychopaths or to the agents of a malignant political system (hence the popularity of the doctrine of the Noble Savage). (496)

In his earlier section on Women’s Rights and the decline of rape, he attributed the difficulty in finding good data on the incidence of the crime, as well as some of the “preposterous” ideas about what motivates it, to the same kind of overextensions of anti-violence campaigns that lead to arbitrary rules about the use of silverware and proscriptions against dodgeball:

Common sense never gets in the way of a sacred custom that has accompanied a decline in violence, and today rape centers unanimously insist that “rape or sexual assault is not an act of sex or lust—it’s about aggression, power, and humiliation, using sex as the weapon. The rapist’s goal is domination.” (To which the journalist Heather MacDonald replies: “The guys who push themselves on women at keggers are after one thing only, and it’s not a reinstatement of the patriarchy.”) (406)

Jumping ahead to Pinker’s discussion of the Moralization Gap, we see that the theory that rape is about power, as opposed to the much more obvious theory that it’s about sex, is an outgrowth of the Myth of Pure Evil, an inflation of the mundane drives that lead some pathetic individuals to commit horrible crimes into eternal cosmic forces, inscrutable and infinitely punishable.

            When feminists impute political motives to rapists, they’re crossing the boundary from Enlightenment morality to the type of moral ideology that inspires dehumanization and violence. The good news is that it’s not difficult to distinguish between the two. From the Enlightenment perspective, rape is indefensibly wrong because it violates the autonomy of the victim—it’s an act of violence perpetrated by one individual against another. From the ideological perspective, every rape must be understood in the context of the historical oppression of women by men; it transcends the individuals involved as a representation of a greater evil. The rape-as-a-political-act theory also comes dangerously close to implying a type of collective guilt, which is a clear violation of individual rights.

Scholars already make the distinction between three different waves of feminism. The first two fall within Pinker’s definition of Rights Revolutions; they encompassed pushes for suffrage, marriage rights, and property rights, and then the rights to equal pay and equal opportunity in the workplace. The third wave is avowedly postmodern, its advocates committed to the ideas that gender is a pure social construct and that suggesting otherwise is an act of oppression. What you come away from Better Angels realizing, even though Pinker doesn’t say it explicitly, is that somewhere between the second and third waves feminists effectively turned against the very ideas and institutions that had been most instrumental in bringing about the historical improvements in women’s lives from the Middle Ages to the turn of the twenty-first century. And so it is with all the other ideologies on the postmodern roster.

Another misguided propaganda tactic that dogged Pinker’s efforts to identify historical trends in violence can likewise be understood as an instance of inflating the severity of crimes on behalf of a moral ideology, complete with the taboo against puncturing the bubble or vitiating the purity of evil with evidence and theories of venial motives. As he explains in the preface, “No one has ever recruited activists to a cause by announcing that things are getting better, and bearers of good news are often advised to keep their mouths shut lest they lull people into complacency” (xxii). Here again the objective researcher can’t escape the appearance of trying to minimize the evil, and therefore risks being accused of looking the other way, or even of complicity. But in an earlier section on genocide Pinker provides the quintessential Enlightenment rationale for the clear-eyed scientific approach to studying even the worst atrocities. He writes,

The effort to whittle down the numbers that quantify the misery can seem heartless, especially when the numbers serve as propaganda for raising money and attention. But there is a moral imperative in getting the facts right, and not just to maintain credibility. The discovery that fewer people are dying in wars all over the world can thwart cynicism among compassion-fatigued news readers who might otherwise think that poor countries are irredeemable hellholes. And a better understanding of what drove the numbers down can steer us toward doing things that make people better off rather than congratulating ourselves on how altruistic we are. (320)

This passage can be taken as the underlying argument of the whole book. And it gestures toward some far-reaching ramifications to the idea that exaggerated numbers are a product of the same impulse that causes us to inflate crimes to the status of pure evil.

Could it be that the nearly universal misperception that violence is getting worse all over the world, that we’re doomed to global annihilation, and that everywhere you look is evidence of the breakdown in human decency—could it be that the false impression Pinker set out to correct with Better Angels is itself a manifestation of a natural urge in all of us to seek out evil and aggrandize ourselves by unconsciously overestimating it? Pinker himself never goes as far as suggesting the mass ignorance of waning violence is a byproduct of an instinct toward self-righteousness. Instead, he writes of the “gloom” about the fate of humanity,

I think it comes from the innumeracy of our journalistic and intellectual culture. The journalist Michael Kinsley recently wrote, “It is a crushing disappointment that Boomers entered adulthood with Americans killing and dying halfway around the world, and now, as Boomers reach retirement and beyond, our country is doing the same damned thing.” This assumes that 5,000 Americans dying is the same damned thing as 58,000 Americans dying, and that a hundred thousand Iraqis being killed is the same damned thing as several million Vietnamese being killed. If we don’t keep an eye on the numbers, the programming policy “If it bleeds it leads” will feed the cognitive shortcut “The more memorable, the more frequent,” and we will end up with what has been called a false sense of insecurity. (296)

Pinker probably has a point, but the self-righteous undertone of Kinsley’s “same damned thing” is unmistakable. He’s effectively saying, I’m such an outstanding moral being the outrageous evilness of the invasion of Iraq is blatantly obvious to me—why isn’t it to everyone else? And that same message seems to underlie most of the statements people make expressing similar sentiments about how the world is going to hell.

            Though Pinker neglects to tie all the strands together, he still manages to suggest that the drive to dominance, ideology, tribal morality, and the Myth of Pure Evil are all facets of the same disastrous flaw in human nature—an instinct for self-righteousness. Progress on the moral front—real progress like fewer deaths, less suffering, and more freedom—comes from something much closer to utilitarian pragmatism than activist idealism. Yet the activist tradition is so thoroughly enmeshed in our university culture that we’re taught to exercise our powers of political righteousness even while engaging in tasks as mundane as reading books and articles. 

            If the decline in violence and the improvement of the general weal in various other areas are attributable to the Enlightenment, then many of the assumptions underlying postmodernism are turned on their heads. If social ills like warfare, racism, sexism, and child abuse exist in cultures untouched by modernism—and they in fact not only exist but tend to be much worse—then science can’t be responsible for creating them; indeed, if they’ve all trended downward with the historical development of all the factors associated with male-dominated western culture, including strong government, market economies, run-away technology, and scientific progress, then postmodernism not only has everything wrong but threatens the progress achieved by the very institutions it depends on and emerged from, even as it squanders innumerable scholarly careers maligning them.

Of course some Enlightenment figures and some scientists do evil things. Of course living even in the most Enlightened of civilizations is no guarantee of safety. But postmodernism is an ideology based on the premise that we ought to discard a solution to our societal woes for not working perfectly and immediately, substituting instead remedies that have historically caused more problems than they solved by orders of magnitude. The argument that there’s a core to the Enlightenment that some of its representatives have been faithless to when they committed atrocities may seem reminiscent of apologies for Christianity based on the fact that Crusaders and Inquisitors weren’t loving their neighbors as Christ enjoined. The difference is that the Enlightenment works—in just a few centuries it’s transformed the world and brought about a reduction in violence no religion has been able to match in millennia. If anything, the big monotheistic religions brought about more violence.

Embracing Enlightenment morality or classical liberalism doesn’t mean we should give up our efforts to make the world a better place. As Pinker describes the transformation he hopes to encourage with Better Angels,

As one becomes aware of the decline of violence, the world begins to look different. The past seems less innocent; the present less sinister. One starts to appreciate the small gifts of coexistence that would have seemed utopian to our ancestors: the interracial family playing in the park, the comedian who lands a zinger on the commander in chief, the countries that quietly back away from a crisis instead of escalating to war. The shift is not toward complacency: we enjoy the peace we find today because people in past generations were appalled by the violence in their time and worked to reduce it, and so we should work to reduce the violence that remains in our time. Indeed, it is a recognition of the decline of violence that best affirms that such efforts are worthwhile. (xxvi)

Since our task for the remainder of this century is to extend the reach of science, literacy, and the recognition of universal human rights farther and farther along the Enlightenment gradient until they're able to grant the same increasing likelihood of a long peaceful life to every citizen of every nation of the globe, and since the key to accomplishing this task lies in fomenting future Rights Revolutions while at the same time recognizing, so as to be better equipped to rein in, our drive for dominance as manifested in our more deadly moral instincts, I for one am glad Steven Pinker has the courage to violate so many of the outrageously counterproductive postmodern taboos while having the grace to resist succumbing himself, for the most part, to the temptation of self-righteousness.

Also read:

THE FAKE NEWS CAMPAIGN AGAINST STEVEN PINKER AND ENLIGHTENMENT NOW

And:

THE ENLIGHTENED HYPOCRISY OF JONATHAN HAIDT'S RIGHTEOUS MIND

And:

NAPOLEON CHAGNON'S CRUCIBLE AND THE ONGOING EPIDEMIC OF MORALIZING HYSTERIA IN ACADEMIA

Read More
Dennis Junk Dennis Junk

Why the Critics Are Getting Luhrmann's Great Gatsby so Wrong

There’s something fitting about the almost even split among the movie critics sampled on Rotten Tomatoes—and the much larger percentage of lay viewers who liked the movie (48% vs. 84% as of now). This is because, like the novel itself, Luhrmann’s vision of Gatsby is visionary, and, as Denby points out, when the novel was first published it was panned by most critics.

           I doubt I’m the only one who had to be told at first that The Great Gatsby was a great book. Reading it the first time, you’re guaranteed to miss at least two-thirds of the nuance—and the impact of the story, its true greatness, lies in the very nuance that’s being lost on you. Take, for instance, the narrator Nick Carraway’s initial assessment of the title character. After explaining that his habit of open-minded forbearance was taken to its limit and beyond by the events of the past fall he’s about to recount, he writes, “Only Gatsby, the man who gives his name to this book, was exempt from my reaction—Gatsby who represented everything for which I have an unaffected scorn.” Already we see that Nick’s attitude toward Gatsby is complicated, and even though we can eventually work out that his feelings toward his neighbor are generally positive, at least compared to his feelings for the other characters comprising that famously “rotten crowd,” it’s still hard to tell what he really thinks of the man.

When I first saw the previews for the Baz Luhrmann movie featuring Leonardo DiCaprio as Gatsby, I was—well, at first, I was excited, thrilled even. Then came the foreboding. This movie, I was sure, was going to do unspeakable violence to all that nuance, make of it a melodrama that would all but inevitably foreclose on any opportunity for the younger, more vulnerable generation to experience a genuine connection with the story through Nick’s morally, linguistically, existentially tangled telling—contenting them instead with a pop culture cartoon. This foreboding was another thing I wasn’t alone in. David Denby ends his uncharacteristically shaky review in The New Yorker,

Will young audiences go for this movie, with its few good scenes and its discordant messiness? Luhrmann may have miscalculated. The millions of kids who have read the book may not be eager for a flimsy phantasmagoria. They may even think, like many of their elders, that “The Great Gatsby” should be left in peace. The book is too intricate, too subtle, too tender for the movies. Fitzgerald’s illusions were not very different from Gatsby’s, but his illusionless book resists destruction even from the most aggressive and powerful despoilers.

Since Denby’s impulse to protect the book from despoiling so perfectly mirrored my own, I had to wonder in the interval between reading his review and seeing the movie myself if he had settled on this sentiment before or after he’d seen it.

The Great Gatsby is a novel that rewards multiple re-readings like very few others. You don’t simply re-experience it as you might expect; rather, each time feels as if you’re discovering something new, having a richer, more devastating experience than ever before. This investment of time and attention coupled with the sense after each reading of having finally reached some definitive appreciation for the book gives enthusiasts a proprietary attitude toward it. We sneer at tyros who claim to understand its greatness but can’t possibly fathom its deeper profundities the way we do. The main charge so far leveled against Luhrmann by movie critics is that he’s a West Egg tourist—a charge that can only be made convincingly by us natives. This is captured nowhere so well as in the title of Linda Holmes’s review on NPR’s website, “Loving ‘Gatsby’ Too Much and Not Enough.” (Another frequent criticism focuses on the anachronistic music, as if the critics were afraid we might be tricked into believing Jay-Z harks to the Jazz Age.)

            The Rotten Tomatoes headline says the consensus among critics is that the movie is “a Case of Style over Substance,” and the main blurb concludes, “The pundits say The Great Gatsby never lacks for spectacle, but what’s missing is the heart beneath the glitz.”

That wasn’t my experience at all, and I’d wager this pronouncement is going to baffle most people who see the movie.

            Let’s face it, Luhrmann was in a no-win situation. His movie fails to capture Fitzgerald’s novel in all its nuance, but that was inevitable. The question is, does the movie convey something of the essence of the novel? Another important question, though I’m at risk of literary blasphemy merely posing it, is whether the movie contributes anything of value to the story: some added dimension, some more impactful encounter with the characters, a more visceral experience of some of the scenes. Cinema can’t do justice to the line-by-line filigree of literature, but it can offer audiences a more immediate and immersive simulation of actual presence in the scenes. And this is what Luhrmann contributes as reparation for all the paved-over ironic niceties of Fitzgerald’s language. Denby, a literature-savvy film critic, is breathtakingly oblivious to this fundamental difference between the two media, writing of one of the parties,

Fitzgerald’s scene at the apartment gives off a feeling of sinister incoherence; Luhrmann’s version is merely a frantic jumble. The picture is filled with an indiscriminant swirling motion, a thrashing impress of “style” (Art Deco turned to digitized glitz), thrown at us with whooshing camera sweeps and surges and rapid changes of perspective exaggerated by 3-D.

Thus Denby reveals that he’s either never been drunk or that it’s been so long since he was that he doesn’t remember. The woman I saw the movie with and I both made jokes along the lines of “I think I was at that party.”

            Here’s where I reveal my own literary snobbishness by suggesting that it’s not Luhrmann and his screenwriter Craig Pearce who miss the point of the novel—or at least one of its points—but The Great Denby himself, and all the other critics of his supposedly native West Egg ilk. No one argues that the movie doesn’t do justice to the theme of America’s false promise of social mobility and the careless depredations of the established rich on the nouveau riche. The charge is that there’s too much glitz, too much chaos, incoherence, fleeting swirling passes of the 3-D lens and, oh yeah, hip-hop music. The other crucial theme—or maybe it’s the same theme—of The Great Gatsby isn’t portrayed in this hectic collage of glittering excess. But that’s because instead of portraying it, Luhrmann simulates it for us. Fitzgerald’s novel isn’t “illusionless,” as Denby insists; it’s all about illusions. It’s all about the stories people tell, those stories which Nick complains are so often “plagiaristic and marred by obvious suppression”—a complaint he voices as soon as the third paragraph of the novel. We’re not meant, either in Fitzgerald’s rendition or Luhrmann’s, to watch as reality plays out in some way that’s intended to seem natural—we’re meant to experience the story as a dream, a fantasy that keeps getting frayed and interrupted before ultimately being dashed by the crashing tide of reality.

            The genius of Luhrmann’s contribution lies in his recognition of The Great Gatsby as a story about the collision of a dream with the real world. The beginning of the movie is chaotic and swirling, a bit like a night of drunkenness at an extravagant party. But then there are scenes that are almost painfully real, like the one featuring the final confrontation between Gatsby and Tom Buchanan, the scene which Denby describes as “the dramatic highlight of this director’s career.” And for all the purported lack of heart in this swirling dreamworld I was struck by how many times I found myself choking up as I watched it. Nick’s final farewell to Gatsby in the movie actually does Fitzgerald one better (yeah, I said it).

            The bad reviews are nothing but the supercilious preening of literature snobs (I probably would’ve written a similar one myself if I hadn’t read so many before seeing the movie). The movie is of course no substitute for the book. Both Nick and Daisy come across more sympathetically than they do on the page, and though this subtly changes the story, it still works in its own way. Luhrmann’s Great Gatsby is a fine, even visionary complement to Fitzgerald’s. Ten years from now there may be another film version, and it will probably strike a completely different tone—but it could still be just as successful, contribute just as much. That’s the thing with these dreams and stories—they’re impossible to nail down with finality.

Also read

HOW TO GET KIDS TO READ LITERATURE WITHOUT MAKING THEM HATE IT

Read More
Dennis Junk Dennis Junk

Sabbath Says: Philip Roth and the Dilemmas of Ideological Castration

With “Sabbath’s Theater,” Philip Roth has called down the thunder. The story does away with the concept of a likable character while delivering a wildly absorbing experience. And it satirizes all the woeful facets of how literature is taught today.

Sabbath’s Theater is the type of book you lose friends over. Mickey Sabbath, the adulterous title character who follows in the long literary line of defiantly self-destructive, excruciatingly vulnerable, and off-puttingly but eloquently lustful leading males like Holden Caulfield and Humbert Humbert, strains the moral bounds of fiction and compels us to contemplate the nature of our own voyeuristic impulse to see him through to the end of the story—and not only contemplate it but defend it, as if in admitting we enjoy the book, find its irreverences amusing, and think that in spite of how repulsive he often is there still might be something to be said for poor old Sabbath we’re confessing to no minor offense of our own. Fans and admiring critics alike can’t resist rushing to qualify their acclaim by insisting they don’t condone his cheating on both of his wives, the seduction of a handful of his students, his habit of casually violating others’ privacy, his theft, his betrayal of his lone friend, his manipulations, his racism, his caustic, often cruelly precise provocations—but by the time they get to the end of Sabbath’s debt column it’s a near certainty any list of mitigating considerations will fall short of getting him out of the red. Sabbath, once a puppeteer who now suffers crippling arthritis, doesn’t seem like a very sympathetic character, and yet we sympathize with him nonetheless. In his wanton disregard for his own reputation and his embrace, principled in a way, of his own appetites, intuitions, and human nastiness, he inspires a fascination none of the literary nice guys can compete with. So much for the argument that the novel is a morally edifying art form.

            Thus, in Sabbath, Philip Roth has created a character both convincing and compelling who challenges a fundamental—we may even say natural—assumption about readers’ (or viewers’) role in relation to fictional protagonists, one made by everyone from the snarky authors of even the least sophisticated Amazon.com reviews to the theoreticians behind the most highfalutin academic criticism—the assumption that characters in fiction serve as vehicles for some message the author created them to convey, or which some chimerical mechanism within the “dominant culture” created to serve as agents of its own proliferation. The corollary is that the task of audience members is to try to decipher what the author is trying to say with the work, or what element of the culture is striving to perpetuate itself through it. If you happen to like the message the story conveys, or agree with it at some level, then you recommend the book and thus endorse the statement. Only rarely does a reviewer realize or acknowledge that the purpose of fiction is not simply to encourage readers to behave as the protagonists behave or, if the tale is a cautionary one, to expect the same undesirable consequences should they choose to behave similarly. Sabbath does in fact suffer quite a bit over the course of the novel, and much of that suffering comes as a result of his multifarious offenses, so a case can be made on behalf of Roth’s morality. Still, we must wonder if he really needed to write a story in which the cheating husband is abandoned by both of his wives to make the message sink in that adultery is wrong—especially since Sabbath doesn’t come anywhere near to learning that lesson himself. “All the great thoughts he had not reached,” Sabbath muses in the final pages, “were beyond enumeration; there was no bottom to what he did not have to say about the meaning of his life” (779).

           Part of the reason we can’t help falling back on the notions that fiction serves a straightforward didactic purpose and that characters should be taken as models, positive or negative, for moral behavior is that our moral emotions are invariably and automatically engaged by stories; indeed, what we usually mean when we say we got into a story is that we were in suspense as we anticipated whether the characters ultimately met with the fates we felt they deserved. We reflexively size up any character the author introduces the same way we assess the character of a person we’re meeting for the first time in real life. For many readers, the question of whether a novel is any good is interchangeable with the question of whether they liked the main characters, assuming they fare reasonably well in the culmination of the plot. If an author like Roth evinces an attitude drastically different from ours toward a character of his own creation like Sabbath, then we feel that in failing to condemn him, in holding him up as a model, the author is just as culpable as his character. In a recent edition of PBS’s American Masters devoted to Roth, for example, Jonathan Franzen, a novelist himself, describes how even he couldn’t resist responding to his great forebear’s work in just this way. “As a young writer,” Franzen recalls, “I had this kind of moralistic response of ‘Oh, you bad person, Philip Roth’” (54:56).

            That fiction’s charge is to strengthen our preset convictions through a process of narrative tempering, thus catering to our desire for an orderly calculus of just deserts, serves as the basis for a contract between storytellers and audiences, a kind of promise on which most commercial fiction delivers with a bang. And how many of us have wanted to throw a book out of the window when we felt that promise had been broken? The goal of professional and academic critics, we may imagine, might be to ease their charges into an appreciation of more complex narrative scenarios enacted by characters who escape easy categorization. But since scholarship in the humanities, and in literary criticism especially, has been in a century-long sulk over the greater success of science and the greater renown of scientists, professors of literature have scarcely even begun to ponder what anything resembling a valid answer to the questions of how fiction works and what the best strategies for experiencing it might look like. Those who aren’t pouting in a corner about the ascendancy of science—but the Holocaust!—are stuck in the muck of the century-old pseudoscience of psychoanalysis. But the real travesty is that the most popular, politically inspired schools of literary criticism—feminism, Marxism, postcolonialism—actively preach the need to ignore, neglect, and deny the very existence of moral complexity in literature, violently displacing any appreciation of difficult dilemmas with crudely tribal formulations of good and evil.

            For those inculcated with a need to take a political stance with regard to fiction, the only important dynamics in stories involve the interplay of society’s privileged oppressors and their marginalized victims. In 1976, nearly twenty years before the publication of Sabbath’s Theater, the feminist critic Vivian Gornick lumped Roth together with Saul Bellow and Norman Mailer in an essay asking “Why Do These Men Hate Women?” because she took issue with the way women are portrayed in their novels. Gornick, following the methods standard to academic criticism, doesn’t bother devoting any space in her essay to inconvenient questions about how much we can glean about these authors from their fictional works or what it means that the case for her prosecution rests by necessity on a highly selective approach to quoting from those works. And this slapdash approach to scholarship is supposedly justified because she and her fellow feminist critics believe women are in desperate need of protection from the incalculable harm they assume must follow from such allegedly negative portrayals. In this concern for how women, or minorities, or some other victims are portrayed and how they’re treated by their notional oppressors—rich white guys—Gornick and other critics who make of literature a battleground for their political activism are making the same assumption about fiction’s straightforward didacticism as the most unschooled consumers of commercial pulp. The only difference is that the academics believe the message received by audiences is all that’s important, not the message intended by the author. The basis of this belief probably boils down to its obvious convenience.

            In Sabbath’s Theater, the idea that literature, or art of any kind, is reducible to so many simple messages, and that these messages must be measured against political agendas, is dashed in the most spectacularly gratifying fashion. Unfortunately, the idea is so seldom scrutinized, and the political agendas are insisted on so inclemently, clung to and broadcast with such indignant and prosecutorial zeal, that it seems not one of the critics, nor any of the authors, who were seduced by Sabbath were able to fully reckon with the implications of that seduction. Franzen, for instance, in a New Yorker article about fictional anti-heroes, dodges the issue as he puzzles over the phenomenon that “Mickey Sabbath may be a disgustingly self-involved old goat,” but he’s somehow still sympathetic. The explanation Franzen lights on is that

the alchemical agent by which fiction transmutes my secret envy or my ordinary dislike of “bad” people into sympathy is desire. Apparently, all a novelist has to do is give a character a powerful desire (to rise socially, to get away with murder) and I, as a reader, become helpless not to make that desire my own. (63)

If Franzen is right—and this chestnut is a staple of fiction workshops—then the political activists are justified in their urgency. For if we’re powerless to resist adopting the protagonist’s desires as our own, however fleetingly, then any impulse to victimize women or minorities must invade readers’ psyches at some level, conscious or otherwise. The simple fact, however, is that Sabbath has not one powerful desire but many competing desires, ones that shift as the novel progresses, and it’s seldom clear even to Sabbath himself what those desires are. (And is he really as self-involved as Franzen suggests? It seems to me rather that he compulsively tries to get into other people’s heads, reflexively imagining elaborate stories for them.)

            While we undeniably respond to virtuous characters in fiction by feeling anxiety on their behalf as we read about or watch them undergo the ordeals of the plot, and we just as undeniably enjoy seeing virtue rewarded alongside cruelty being punished—the goodies prevailing over the baddies—these natural responses do not necessarily imply that stories compel our interest and engage our emotions by providing us with models and messages of virtue. Stories aren’t sermons. In his interview for American Masters, Roth explained what a writer’s role is vis-à-vis social issues.

My job isn’t to be enraged. My job is what Chekhov said the job of an artist was, which is the proper presentation of the problem. The obligation of the writer is not to provide the solution to a problem. That’s the obligation of a legislator, a leader, a crusader, a revolutionary, a warrior, and so on. That’s not the goal or aim of a writer. You’re not selling it, and you’re not inviting condemnation. You’re inviting understanding. (59:41)

The crucial but overlooked distinction that characters like Sabbath—but none so well as Sabbath—bring into stark relief is the one between declarative knowledge on the one hand and moment-by-moment experience on the other. Consider for a moment how many books and movies we’ve all been thoroughly engrossed in for however long it took to read or watch them, only to discover a month or so later that we can’t remember even the broadest strokes of how their plots resolved themselves—much less what their morals might have been.

            The answer to the question of what the author is trying to say is that he or she is trying to give readers a sense of what it would be like to go through what the characters are going through—or what it would be like to go through it with them. In other words, authors are not trying to say anything; they’re offering us an experience, once-removed and simulated though it may be. This isn’t to say that these simulated experiences don’t engage our moral emotions; indeed, we’re usually only as engaged in a story as our moral emotions are engaged by it. The problem is that in real-time, in real life, political ideologies, psychoanalytic theories, and rigid ethical principles are too often the farthest thing from helpful. “Fuck the laudable ideologies,” Sabbath helpfully insists: “Shallow, shallow, shallow!” Living in a complicated society with other living, breathing, sick, cruel, saintly, conniving, venal, altruistic, deceitful, noble, horny humans demands not so much a knowledge of the rules as a finely honed body of skills—and our need to develop and hone these skills is precisely why we evolved to find the simulated experiences of fictional narratives both irresistibly fascinating and endlessly pleasurable. Franzen was right that desires are important, the desire to be a good person, the desire to do things others may condemn, the desire to get along with our families and friends and coworkers, the desire to tell them all to fuck off so we can be free, even if just for an hour, to breathe… or to fuck an intern, as the case may be. Grand principles offer little guidance when it comes to balancing these competing desires. This is because, as Sabbath explains, “The law of living: fluctuation. For every thought a counterthought, for every urge a counterurge” (518).

            Fiction then is not a conveyance for coded messages—how tedious that would be (how tedious it really is when writers make this mistake); it is rather a simulated experience of moral dilemmas arising from scenarios which pit desire against desire, conviction against reality, desire against conviction, reality against desire, in any and all permutations. Because these experiences are once-removed and, after all, merely fictional, and because they require our sustained attention, the dilemmas tend to play out in the vicinity of life’s extremes. Here’s how Sabbath’s Theater opens:

            Either forswear fucking others or the affair is over.

            This was the ultimatum, the maddeningly improbable, wholly unforeseen ultimatum, that the mistress of fifty-two delivered in tears to her lover of sixty-four on the anniversary of an attachment that had persisted with an amazing licentiousness—and that, no less amazingly, had stayed their secret—for thirteen years. But now with hormonal infusions ebbing, with the prostate enlarging, with probably no more than another few years of semi-dependable potency still his—with perhaps not that much more life remaining—here at the approach of the end of everything, he was being charged, on pain of losing her, to turn himself inside out. (373)

The ethical proposition that normally applies in situations like this is that adultery is wrong, so don’t commit adultery. But these two have been committing adultery with each other for thirteen years already—do we just stop reading? And if we keep reading, maybe nodding once in a while as we proceed, cracking a few wicked grins along the way, does that mean we too must be guilty?

                               *****

            Much of the fiction written by male literary figures of the past generation, guys like Roth, Mailer, Bellow, and Updike, focuses on the morally charged dilemmas instanced by infidelity, while their gen-x and millennial successors, led by guys like Franzen and David Foster Wallace, have responded to shifting mores—and a greater exposure to academic literary theorizing—by completely overhauling how these dilemmas are framed. Whereas the older generation framed the question as how can we balance the intense physical and spiritual—even existential—gratification of sexual adventure on the one hand with our family obligations on the other, for their successors the question has become how can we males curb our disgusting, immoral, intrinsically oppressive lusting after young women inequitably blessed with time-stamped and overwhelmingly alluring physical attributes. “The younger writers are so self-conscious,” Katie Roiphe writes in a 2009 New York Times essay, “so steeped in a certain kind of liberal education, that their characters can’t condone even their own sexual impulses; they are, in short, too cool for sex.” Roiphe’s essay, “The Naked and the Confused,” stands alongside a 2012 essay in The New York Review of Books by Elaine Blair, “Great American Losers,” as the best descriptions of the new literary trend toward sexually repressed and pathetically timid male leads. The typical character in this vein, Blair writes, “is the opposite of entitled: he approaches women cringingly, bracing for a slap.”

            The writers in the new hipster cohort create characters who bury their longings layers-deep in irony because they’ve been assured the failure on the part of men of previous generations to properly check these same impulses played some unspecified role in the abysmal standing of women in society. College students can’t make it past their first semester without hearing about the evils of so-called objectification, but it’s nearly impossible to get a straight answer from anyone, anywhere, to the question of how objectification can be distinguished from normal, non-oppressive male attraction and arousal. Even Roiphe, in her essay lamenting the demise of male sexual virility in literature, relies on a definition of male oppression so broad that it encompasses even the most innocuous space-filling lines in the books of even the most pathetically diffident authors, writing that “the sexism in the work of the heirs apparent” of writers like Roth and Updike,

is simply wilier and shrewder and harder to smoke out. What comes to mind is Franzen’s description of one of his female characters in “The Corrections”: “Denise at 32 was still beautiful.” To the esteemed ladies of the movement I would suggest this is not how our great male novelists would write in the feminist utopia.

How, we may ask, did it get to the point where acknowledging that age influences how attractive a woman is qualifies a man for designation as a sexist? Blair, in her otherwise remarkably trenchant essay, lays the blame for our oversensitivity—though paranoia is probably a better word—at the feet of none other than those great male novelists themselves, or, as David Foster Wallace calls them, the Great Male Narcissists. She writes,

Because of the GMNs, these two tendencies—heroic virility and sexist condescension—have lingered in our minds as somehow yoked together, and the succeeding generations of American male novelists have to some degree accepted the dyad as truth. Behind their skittishness is a fearful suspicion that if a man gets what he wants, sexually speaking, he is probably exploiting someone.

That Roth et al. were sexist, condescending, disgusting, narcissistic—these are articles of faith for feminist critics. Yet when we consider how expansive the definitions of terms like sexism and misogyny have become—in practical terms, they both translate to: not as radically feminist as me—and the laughably low standard of evidence required to convince scholars of the accusations, female empowerment starts to look like little more than a reserved right to stand in self-righteous judgment of men for giving voice to and acting on desires anyone but the most hardened ideologue will agree are only natural.

             The effect on writers of this ever-looming threat of condemnation is that they either allow themselves to be silenced or they opt to participate in the most undignified of spectacles, peevishly sniping at their colleagues, falling all over themselves to be granted recognition as champions for the cause. Franzen, at least early in his career, was more the silenced type. Discussing Roth, he wistfully endeavors to give the appearance of having moved beyond his initial moralistic responses. “Eventually,” he says, “I came to feel as if that was coming out of an envy: like, wow, I wish I could be as liberated of worry about other people’s opinion of me as Roth is” (55:18). We have to wonder if his espousal of the reductive theory that sympathy for fictional characters is based solely on the strength of their desires derives from this same longing for freedom to express his own. David Foster Wallace, on the other hand, wasn’t quite as enlightened or forgiving when it came to his predecessors. Here’s how he explains his distaste for a character in one of Updike’s novels, openly intimating the author’s complicity:

It’s that he persists in the bizarre adolescent idea that getting to have sex with whomever one wants whenever one wants is a cure for ontological despair. And so, it appears, does Mr. Updike—he makes it plain that he views the narrator’s impotence as catastrophic, as the ultimate symbol of death itself, and he clearly wants us to mourn it as much as Turnbull does. I’m not especially offended by this attitude; I mostly just don’t get it. Erect or flaccid, Ben Turnbull’s unhappiness is obvious right from the book’s first page. But it never once occurs to him that the reason he’s so unhappy is that he’s an asshole.

So the character is an asshole because he wants to have sex outside of marriage, and he’s unhappy because he’s an asshole, and it all traces back to the idea that having sex with whomever one wants is a source of happiness? Sounds like quite the dilemma—and one that pronouncing the main player an asshole does nothing to solve. This passage is the conclusion to a review in which Wallace tries to square his admiration for Updike’s writing with his desire to please a cohort of women readers infuriated by the way Updike writes about—portrays—women (which raises the question of why they’d read so many of his books). The troubling implication of his compromise is that if Wallace were himself to freely express his sexual feelings, he’d be open to the charge of sexism too—he’d be an asshole. Better to insist he simply doesn’t “get” why indulging his sexual desires might alleviate his “ontological despair.” What would Mickey Sabbath make of the fact that Wallace hanged himself when he was only forty-six, eleven years after publishing that review? (This isn’t just a nasty rhetorical point; Sabbath has a fascination with artists who commit suicide.)

The inadequacy of moral codes and dehumanizing ideologies when it comes to guiding real humans through life’s dilemmas, along with their corrosive effects on art, is the abiding theme of Sabbath’s Theater. One of the pivotal moments in Sabbath’s life is when a twenty-year-old student he’s in the process of seducing leaves a tape recorder out to be discovered in a ladies’ room at the university. The student, Kathy Goolsbee, has recorded a phone sex session between her and Sabbath, and when the tape finds its way into the hands of the dean, it becomes grounds for the formation of a committee of activists against the abuse of women. At first, Kathy doesn’t realize how bad things are about to get for Sabbath. She even offers to give him a blow job as he berates her for her carelessness. Trying to impress on her the situation’s seriousness, he says,

Your people have on tape my voice giving reality to all the worst things they want the world to know about men. They have a hundred times more proof of my criminality than could be required by even the most lenient of deans to drive me out of every decent antiphallic educational institution in America. (586)

The committee against Sabbath proceeds to make the full recorded conversation available through a call-in line (the nineties equivalent of posting the podcast online). But the conversation itself isn’t enough; one of the activists gives a long introduction, which concludes,

The listener will quickly recognize how by this point in his psychological assault on an inexperienced young woman, Professor Sabbath has been able to manipulate her into thinking that she is a willing participant. (567-8)

Sabbath knows full well that even consensual phone sex can be construed as a crime if doing so furthers the agenda of those “esteemed ladies of the movement” Roiphe addresses. 

Reading through the lens of a tribal ideology ineluctably leads to the refraction of reality beyond recognizability, and any aspiring male writer quickly learns in all his courses in literary theory that the criteria for designation as an enemy to the cause of women are pretty much whatever the feminist critics fucking say they are. Wallace wasn’t alone in acquiescing to feminist rage by denying his own boorish instincts. Roiphe describes the havoc this opportunistic antipathy toward male sexuality wreaks in the minds of male writers and their literary creations:

Rather than an interest in conquest or consummation, there is an obsessive fascination with trepidation, and with a convoluted, postfeminist second-guessing. Compare [Benjamin] Kunkel’s tentative and guilt-ridden masturbation scene in “Indecision” with Roth’s famous onanistic exuberance with apple cores, liver and candy wrappers in “Portnoy’s Complaint.” Kunkel: “Feeling extremely uncouth, I put my penis away. I might have thrown it away if I could.” Roth also writes about guilt, of course, but a guilt overridden and swept away, joyously subsumed in the sheer energy of taboo smashing: “How insane whipping out my joint like that! Imagine what would have been had I been caught red-handed! Imagine if I had gone ahead.” In other words, one rarely gets the sense in Roth that he would throw away his penis if he could.

And what good comes of an ideology that encourages the psychological torture of bookish young men? It’s hard to distinguish the effects of these so-called literary theories from the hellfire scoldings delivered from the pulpits of the most draconian and anti-humanist religious patriarchs. Do we really need to ideologically castrate all our male scholars to protect women from abuse and further the cause of equality?

*****

The experience of sexual relations between older teacher and younger student in Sabbath’s Theater is described much differently when the gender activists have yet to get involved—and not just by Sabbath but by Kathy as well. “I’m of age!” she protests as he chastises her for endangering his job and opening him up to public scorn; “I do what I want” (586). Absent the committee against him, Sabbath’s impression of how his affairs with his students impact them reflects the nuance of feeling inspired by these experimental entanglements, the kind of nuance that the “laudable ideologies” can’t even begin to capture.

There was a kind of art in his providing an illicit adventure not with a boy of their own age but with someone three times their age—the very repugnance that his aging body inspired in them had to make their adventure with him feel a little like a crime and thereby give free play to their budding perversity and to the confused exhilaration that comes of flirting with disgrace. Yes, despite everything, he had the artistry still to open up to them the lurid interstices of life, often for the first time since they’d given their debut “b.j.” in junior high. As Kathy told him in that language which they all used and which made him want to cut their heads off, through coming to know him she felt “empowered.” (566)

Opening up “the lurid interstices of life” is precisely what Roth and the other great male writers—all great writers—are about. If there are easy answers to the questions of what characters should do, or if the plot entails no more than a simple conflict between a blandly good character and a blandly bad one, then the story, however virtuous its message, will go unattended.

            But might there be too much at stake for us impressionable readers to be allowed free rein to play around in imaginary spheres peopled by morally dubious specters? After all, if denouncing the dreamworlds of privileged white men, however unfairly, redounds to the benefit of women and children and minorities, then perhaps it’s to the greater good. In fact, though, the increasing availability of ever more graphic media portrayals of sex and violence has coincided with marked decreases in actual violence and in the abuse of women. And does anyone really believe it’s the least literate, least media-saturated societies that are the kindest to women? The simple fact is that the theory of literature subtly encouraging oppression can’t be valid. But the problem is, once ideologies are institutionalized, once a threshold number of people depend on their perpetuation for their livelihoods, people whose scholarly work and reputations are staked on them, then victims of oppression will be found, their existence insisted on, regardless of whether they truly exist.

In another scandal Sabbath was embroiled in long before his flirtation with Kathy Goolsbee, he was brought up on charges of indecency because in the course of a street performance he’d exposed a woman’s nipple. The woman herself, Helen Trumbull, maintains from the outset of the imbroglio that whatever Sabbath had done, he’d done it with her consent—just as will be the case with his “psychological assault” on Kathy. But even as Sabbath sits assured that the case against him will collapse once the jury hears the supposed victim testify on his behalf, the prosecution takes a bizarre twist:

In fact, the victim, if there even is one, is coming this way, but the prosecutor says no, the victim is the public. The poor public, getting the shaft from this fucking drifter, this artist. If this guy can walk along a street, he says, and do this, then little kids think it’s permissible to do this, and if little kids think it’s permissible to do this, then they think it’s permissible to blah blah banks, rape women, use knives. If seven-year-old kids—the seven nonexistent kids are now seven seven-year-old kids—are going to see that this is fun and permissible with strange women… (663-4)

Here we have Roth’s dramatization of the fundamental conflict between artists and moralists. Even if no one is directly hurt by a playful scenario, the notion that it carries a message, one that threatens to corrupt susceptible minds, is so seemingly obvious as to be all but impossible to refute. Since the audience for art is “the public,” the acts of depravity and degradation it depicts are, if anything, even more fraught with moral and political peril than any offense against an individual victim, real or imagined.

            This theme of the oppressive nature of ideologies devised to combat oppression, the victimizing proclivity of movements originally fomented to protect and empower victims, is most directly articulated by a young man named Donald, dressed in all black and sitting atop a file cabinet in a nurse’s station when Sabbath happens across him at a rehab clinic. Donald “vaguely resembled the Sabbath of some thirty years ago,” and Sabbath will go on to apologize for interrupting him, referring to him as “a man whose aversions I wholeheartedly endorse.” What he was saying before the interruption:

“Ideological idiots!” proclaimed the young man in black. “The third great ideological failure of the twentieth century. The same stuff. Fascism. Communism. Feminism. All designed to turn one group of people against another group of people. The good Aryans against the bad others who oppress them. The good poor against the bad rich who oppress them. The good women against the bad men who oppress them. The holder of ideology is pure and good and clean and the other wicked. But do you know who is wicked? Whoever imagines himself to be pure is wicked! I am pure, you are wicked… There is no human purity! It does not exist! It cannot exist!” he said, kicking the file cabinet for emphasis. “It must not and should not exist! Because it’s a lie. … Ideological tyranny. It’s the disease of the century. The ideology institutionalizes the pathology. In twenty years there will be a new ideology. People against dogs. The dogs are to blame for our lives as people. Then after dogs there will be what? Who will be to blame for corrupting our purity?” (620-1)

It’s noteworthy that this rant is made by a character other than Sabbath. By this point in the novel, we know Sabbath wouldn’t speak so artlessly—unless he was really frightened or angry. As effective and entertaining an indictment of “Ideological tyranny” as Sabbath’s Theater is, we shouldn’t expect to encounter anywhere in a novel by a storyteller as masterful as Roth a character operating as a mere mouthpiece for some argument. Even Donald himself, Sabbath quickly gleans, isn’t simply spouting off; he’s trying to impress one of the nurses.

            And it’s not just the political ideologies that conscript complicated human beings into simple roles as oppressors and victims. The pseudoscientific psychological theories that both inform literary scholarship and guide many non-scholars through life crises and relationship difficulties function according to the same fundamental dynamic of tribalism; they simply substitute abusive family members for more generalized societal oppression and distorted or fabricated crimes committed in the victim’s childhood for broader social injustices. Sabbath is forced to contend with this particular brand of depersonalizing ideology because his second wife, Roseanna, picks it up through her AA meetings, and then becomes further enmeshed in it through individual treatment with a therapist named Barbara. Sabbath, who considers himself a failure, and who is carrying on an affair with the woman we meet in the opening lines of the novel, is baffled as to why Roseanna would stay with him. Her therapist provides an answer of sorts.

But then her problem with Sabbath, the “enslavement,” stemmed, according to Barbara, from her disastrous history with an emotionally irresponsible mother and a violent alcoholic father for both of whom Sabbath was the sadistic doppelganger. (454)

Roseanna’s father was a geology professor who hanged himself when she was a young teenager. Sabbath is a former puppeteer with crippling arthritis. Naturally, he’s confused by the purported identity of roles.

These connections—between the mother, the father, and him—were far clearer to Barbara than they were to Sabbath; if there was, as she liked to put it, a “pattern” in it all, the pattern eluded him. In the midst of a shouting match, Sabbath tells his wife, “As for the ‘pattern’ governing a life, tell Barbara it’s commonly called chaos” (455).

When she protests, “You are shouting at me like my father,” Sabbath asserts his individuality: “The fuck that’s who I’m shouting at you like! I’m shouting at you like myself!” (459). Whether you see his resistance as heroic or not probably depends on how much credence you give to those psychological theories.

            From the opening lines of Sabbath’s Theater, when we’re presented with the dilemma of the teary-eyed mistress demanding monogamy in their adulterous relationship, the simple response would be to stand in easy judgment of Sabbath and, as Wallace did with Updike’s character, declare him an asshole. It’s clear that he loves this woman, a Croatian immigrant named Drenka, a character who at points steals the show even from the larger-than-life protagonist. And it’s clear his fidelity would mean a lot to her. Is his freedom to fuck other women really so important? Isn’t he just being selfish? But only a few pages later our easy judgment suddenly gets more complicated:

As it happened, since picking up Christa several years back Sabbath had not really been the adventurous libertine Drenka claimed she could no longer endure, and consequently she already had the monogamous man she wanted, even if she didn’t know it. To women other than her, Sabbath was by now quite unalluring, not just because he was absurdly bearded and obstinately peculiar and overweight and aging in every obvious way but because, in the aftermath of the scandal four years earlier with Kathy Goolsbee, he’s become more dedicated than ever to marshaling the antipathy of just about everyone as though he were, in fact, battling for his rights. (394)

Christa was a young woman who participated in a threesome with Sabbath and Drenka, an encounter to which Sabbath’s only tangible contribution was to hand the younger woman a dildo.

            One of the central dilemmas for a character who loves the thrill of sex, who seeks in it a rekindling of youthful vigor—“the word’s rejuvenation,” Sabbath muses at one point (517)—is that the adrenaline boost born of being in the wrong and the threat of getting caught, what Roiphe calls “the sheer energy of taboo smashing,” becomes ever more indispensable as libido wanes with age. Even before Sabbath had to contend with the ravages of aging, he reveled in this added exhilaration that attends any expedition into forbidden realms. What makes Drenka so perfect for him is that she has not just a similarly voracious appetite but a similar fondness for outrageous sex and the smashing of taboo. And it’s this mutual celebration of the verboten that Sabbath is so reluctant to relinquish. Of Drenka, he thinks,

The secret realm of thrills and concealment, this was the poetry of her existence. Her crudeness was the most distinguishing force in her life, lent her life its distinction. What was she otherwise? What was he otherwise? She was his last link with another world, she and her great taste for the impermissible. As a teacher of estrangement from the ordinary, he had never trained a more gifted pupil; instead of being joined by the contractual they were interconnected by the instinctual and together could eroticize anything (except their spouses). Each of their marriages cried out for a countermarriage in which the adulterers attack their feelings of captivity. (395)

Those feelings of captivity, the yearnings to experience the flow of the old juices, are anything but adolescent, as Wallace suggests of them; adolescents have a few decades before they have to worry about dwindling arousal. Most of them have the opposite problem.

            The question of how readers are supposed to feel about a character like Sabbath doesn’t have any simple answers. He’s an asshole at several points in the novel, but at several points he’s not. One of the reasons he’s so compelling is that working out what our response to him should be poses a moral dilemma of its own. Whether or not we ultimately decide that adultery is always and everywhere wrong, the experience of being privy to Sabbath’s perspective can help us prepare ourselves for our own feelings of captivity, lusting nostalgia, and sexual temptation. Most of us will never find ourselves in a dilemma like Sabbath gets himself tangled in with his friend Norman’s wife, for instance, but it would be to our detriment to automatically discount the old hornball’s insights.

He could discern in her, whenever her husband spoke, the desire to be just a little cruel to Norman, saw her sneering at the best of him, at the very best things in him. If you don’t go crazy because of your husband’s vices, you go crazy because of his virtues. He’s on Prozac because he can’t win. Everything is leaving her except for her behind, which her wardrobe informs her is broadening by the season—and except for this steadfast prince of a man marked by reasonableness and ethical obligation the way others are marked by insanity or illness. Sabbath understood her state of mind, her state of life, her state of suffering: dusk is descending, and sex, our greatest luxury, is racing away at a tremendous speed, everything is racing off at a tremendous speed and you wonder at your folly in having ever turned down a single squalid fuck. You’d give your right arm for one if you are a babe like this. It’s not unlike the Great Depression, not unlike going broke overnight after years of raking it in. “Nothing unforeseen that happens,” the hot flashes inform her, “is likely ever again going to be good.” Hot flashes mockingly mimicking the sexual ecstasies. Dipped, she is, in the very fire of fleeting time. (651)

Welcome to messy, chaotic, complicated life.

Sabbath’s Theater is, in part, Philip Roth’s raised middle finger to the academic moralists whose idiotic and dehumanizing ideologies have spread like a cancer into all the venues where literature is discussed and all the avenues through which it’s produced. Unfortunately, the unrecognized need for culture-wide chemotherapy hasn’t gotten any less dire in the nearly two decades since the novel was published. With literature now drowning in the devouring tide of new media, the tragic course set by the academic custodians of art toward bloodless prudery and impotent sterility in the name of misguided political activism promises to do nothing but ensure the ever greater obsolescence of epistemologically doomed and resoundingly pointless theorizing. It makes of college courses the places where you go to become, at best, profoundly confused about where you stand in relation to fiction and fictional characters, and, at worst, a self-righteous demagogue denouncing the chimerical evils allegedly encoded into every text or cultural artifact. All the conspiracy theorizing about the latent evil urgings of literature has amounted to little more than another reason not to read, another reason to tune in to Breaking Bad or Mad Men instead. But the only reason Roth’s novel makes such a successful case is that it at no point allows itself to be reduced to a mere case, just as Sabbath at no point allows himself to be conscripted as a mere argument. We don’t love or hate him; we love and hate him. But we sort of just love him because he leaves us free to do both as we experience his antics, once removed and simulated, but still just as complicatedly eloquent in their message of “Fuck the laudable ideologies”—or not, as the case may be.

Also read

JUST ANOTHER PIECE OF SLEAZE: THE REAL LESSON OF ROBERT BOROFSKY'S "FIERCE CONTROVERSY"

And

PUTTING DOWN THE PEN: HOW SCHOOL TEACHES US THE WORST POSSIBLE WAY TO READ LITERATURE

And

HOW VIOLENT FICTION WORKS: ROHAN WILSON’S “THE ROVING PARTY” AND JAMES WOOD’S SANGUINARY SUBLIME FROM CONRAD TO MCCARTHY
