Dennis Junk

Nice Guys with Nothing to Say: Brett Martin’s Difficulty with “Difficult Men” and the Failure of Arts Scholarship

Brett Martin’s book “Difficult Men” contains fascinating sections about the history and politics behind some of our favorite shows. But whenever he reaches for deeper insights about the shows’ appeal, the results range from utterly banal to unwittingly comical. The reason for his failure is his reliance on politically motivated theorizing, which is all too fashionable in academia.

With his book Difficult Men: Behind the Scenes of a Creative Revolution: From “The Sopranos” and “The Wire” to “Mad Men” and “Breaking Bad”, Brett Martin shows that you can apply the whole repertoire of analytic tools furnished by contemporary scholarship in the arts to a cultural phenomenon without arriving at anything even remotely approaching an insight. Which isn’t to say the book isn’t worth reading: if you’re interested in the backstories of how cable TV series underwent their transformation to higher production quality, film-grade acting and directing, greater realism, and multiple, intricately interlocking plotlines, along with all the gossip surrounding the creators and stars, then you’ll be delighted to discover how good Martin is at delivering the dish. 

He had excellent access to some of the showrunners, seems to know everything about the ones he didn’t have access to anyway, and has a keen sense for the watershed moments in shows—as when Tony Soprano snuck away from scouting out a college with his daughter Meadow to murder a man, unceremoniously, with a smile on his face, despite the fears of HBO executives that audiences would turn against the lead character for doing so. And Difficult Men is in no way a difficult read. Martin’s prose is clever without calling too much attention to itself. His knowledge of history and pop culture rivals that of anyone in the current cohort of hipster sophisticates. And his enthusiasm for the topic radiates off the pages while not marring his objectivity with fanboyism. But if you’re more interested in the broader phenomenon of unforgivable male characters audiences can’t help loving, you’ll have to look elsewhere for any substantive discussion of it.

Difficult Men would have benefited from Martin being a more difficult man himself. Instead, he seems at several points to be apologizing on behalf of the show creators and their creations, simultaneously ecstatic at the unfettering of artistic freedom and skittish whenever bumping up against questions about what the resulting shows are reflecting about artists and audiences alike. He celebrates the shows’ shucking off of political correctness even as he goes out of his way to brandish his own PC bona fides. With regard to his book’s focus on men, for instance, he writes,

Though a handful of women play hugely influential roles in this narrative—as writers, actors, producers, and executives—there aren’t enough of them. Not only were the most important shows of the era run by men, they were also largely about manhood—in particular the contours of male power and the infinite varieties of male combat. Why that was had something to do with a cultural landscape still awash in postfeminist dislocation and confusion about exactly what being a man meant. (13)

Martin throws multiple explanations at the centrality of “male combat” in high-end series, but the basic explanation he offers for the prevalence of this theme across so many shows in TV’s Third Golden Age is that most of the artists working on the shows are afflicted with the same preoccupations.

In other words, middle-aged men predominated because middle-aged men had the power to create them. And certainly the autocratic power of the showrunner-auteur scratches a peculiarly masculine itch. (13)

Never mind that women make up a substantial portion of the viewership. If it ever occurred to Martin that this alleged “masculine itch” may have something to do with why men outnumber women in high-stakes competitive fields like TV scriptwriting, he knew better than to put the suspicion in writing.

            The centrality of dominant and volatile male characters in America’s latest creative efflorescence is in many ways a repudiation of the premises underlying the scholarship of the decades leading up to it. With women moving into the workplace after the Second World War, and with the rise of feminism in the 1970s, the stage was set for an experiment in how malleable human culture really was with regard to gender roles. How much change did society’s tastes undergo in the latter half of the twentieth century? Despite his emphasis on “postfeminist dislocation” as a factor in the appeal of TV’s latest crop of bad boys, Martin is savvy enough to appreciate these characters’ long pedigree, up to a point. He writes of Tony Soprano, for instance,

In his self-absorption, his horniness, his alternating cruelty and regret, his gnawing unease, Tony was, give or take Prozac and one or two murders, a direct descendant of Updike’s Rabbit Angstrom. In other words, the American Everyman. (84)

According to the rules of modern criticism, it’s okay to trace creative influences along their historical lineages. And Martin is quite good at situating the Third Golden Age in its historical and technological context:

The ambition and achievement of these shows went beyond the simple notion of “television getting good.” The open-ended, twelve- or thirteen-episode serialized drama was maturing into its own, distinct art form. What’s more, it had become the signature American art form of the first decade of the twenty-first century, the equivalent of what the films of Scorsese, Altman, Coppola, and others had been to the 1970s or the novels of Updike, Roth, and Mailer had been to the 1960s. (11)

What you’re not allowed to do, however—and what Martin knows better than to try to get away with—is notice that all those male filmmakers and novelists of the 60s and 70s were dealing with the same themes as the male showrunners Martin is covering. Is this pre-feminist dislocation? Mad Men could’ve featured Don Draper reading Rabbit, Run right after it was published in 1960. In fact, Don bears nearly as much resemblance to the main character of what was arguably the first novel ever written, The Tale of Genji, by the eleventh-century Japanese noblewoman, Murasaki Shikibu, as Tony Soprano bears to Rabbit Angstrom.

Missed connections, tautologies, and non sequiturs abound whenever Martin attempts to account for the resonance of a particular theme or show, and at points his groping after insight is downright embarrassing. Difficult Men, as good as it is on history and the politicking of TV executives, can serve as a case study in the utter banality and logical bankruptcy of scholarly approaches to discussing the arts. These politically and academically sanctioned approaches can be summed up concisely, without scanting any important nuances, in the space of a paragraph. While any proposed theory about average gender differences with biological bases must be strenuously and vociferously criticized and dismissed (and its proponents demonized without concern for fairness), any posited connection between a popular theme and contemporary social or political issues is seen not just as acceptable but as automatically plausible, to the point where after drawing the connection the writer need provide no further evidence whatsoever.

One of several explanations Martin throws out for the appeal of characters like Tony Soprano and Don Draper, for instance, is that they helped liberal HBO and AMC subscribers cope with having a president like George W. Bush in office. “This was the ascendant Right being presented to the disempowered Left—as if to reassure it that those in charge were still recognizably human” (87). But most of Mad Men’s run, and Breaking Bad’s too, has been under a President Obama. This doesn’t present a problem for Martin’s analysis, though, because there’s always something going on in the world that can be said to resonate with a show’s central themes. Of Breaking Bad, he writes,

Like The Sopranos, too, it uncannily anticipated a national mood soon to be intensified by current events—in this case the great economic unsettlement of the late aughts, which would leave many previously secure middle-class Americans suddenly feeling like desperate outlaws in their own suburbs. (272)

If this strikes you as comically facile, I can assure you that were the discussion taking place in the context of an explanation proposed by a social scientist, writers like Martin would be falling all over themselves trying to be the first to explain the danger of conflating correlation with causation, whether the scientist actually made that mistake or not.

But arts scholarship isn’t limited to this type of socio-historical loose association because at some point you simply can’t avoid bringing individual artists, characters, and behind-the-scenes players into the discussion. Even when it comes to a specific person or character’s motivation, though, the approved approach is to focus on upbringing in a given family and sociopolitical climate as opposed to any general trend in human psychology. This willful blindness becomes most problematic when Martin tries to identify commonalities shared by all the leading men in the shows he’s discussing. He writes, for example,

All of them strove, awkwardly at times, for connection, occasionally finding it in glimpses and fragments, but as often getting blocked by their own vanities, their fears, and their accumulated past crimes. (189-90)

This is the closest Martin comes to a valid insight into difficult men in the entire book. The problem is that the rule against recognizing trends in human nature has made him blind to the applicability of this observation to pretty much everyone in the world. You could use this passage as a cold read and convince people you’re a psychic.

            So far, our summation of contemporary arts scholarship includes a rule against referring to human nature and an injunction to focus instead on sociopolitical factors, no matter how implausible their putative influence. But the allowance for social forces playing a role in upbringing provides something of a backdoor for a certain understanding of human nature to enter the discussion. Although the academic versions of this minimalist psychology are byzantine to the point of incomprehensibility, most of the main precepts will be familiar to you from movie and book reviews and criticism: parents, whom we both love and hate, affect nearly every aspect of our adult personalities; every category of desire, interest, or relationship is a manifestation of the sex drive; and we all have subconscious desires—all sexual in one way or another—based largely on forgotten family dramas that we enjoy seeing played out and given expression in art. That’s it. 

            So, if we’re discussing Breaking Bad for instance, a critic might refer to Walt and Jesse’s relationship as either oedipal, meaning they’re playing the roles of father and son who love but want to kill each other, or homoerotic, meaning their partnership substitutes for the homosexual relationship they’d both really prefer. The special attention the show gives to the blue meth and all the machines and gadgets used to make it constitutes a fetish. And the appeal of the show is that all of us in the audience wish we could do everything Walt does. Since we must repress those desires, we come to the show because watching it effects a type of release.

            Not a single element of this theory has any scientific validity. If we were such horny devils, we could just as easily watch internet pornography as tune into Mad Men. Psychoanalysis is to modern scientific psychology what alchemy is to chemistry and what astrology is to astronomy. But the biggest weakness of Freud’s pseudo-theories from a scientific perspective is probably what has made them so attractive to scholars in the humanities over the past century: they don’t lend themselves to testable predictions, so they can easily be applied to a variety of outcomes. As explanations, they can never fail or be definitively refuted—but that’s because they don’t really explain anything. Quoting Craig Wright, a writer for Six Feet Under, Martin writes that

…the left always articulates a critique through the arts.  “But the funny part is that masked by, or nested within, that critique is a kind of helpless eroticization of the power of the Right. They’re still in love with Big Daddy, even though they hate him.”

That was certainly true for the women who made Tony Soprano an unlikely sex symbol—and for the men who found him no less seductive. Wish fulfillment has always been at the queasy heart of the mobster genre, the longing for a life outside the bounds of convention, mingled with the conflicted desire to see the perpetrator punished for the same transgression… Likewise for viewers, for whom a life of taking, killing, and sleeping with whomever and whatever one wants had an undeniable, if conflict-laden, appeal. (88)

So Tony reminds us of W. because they’re both powerful figures, and we’re interested in powerful figures because they remind us of our dads and because we eroticize power. Even if this were true, would it contribute anything to our understanding or enjoyment of the show? Are any of these characters really that much like your own dad? Tony smashes some poor guy’s head because he got in his way, and sometimes we wish we could do that. Don Draper sleeps with lots of attractive women, and all the men watching the show would like to do that too. Startling revelations, those.

What a scholar in search of substantive insights might focus on instead is the universality of the struggle to reconcile selfish desires—sex, status, money, comfort—with the needs and well-being of the groups to which we belong. Don Draper wants to sleep around, but he also genuinely wants Betty and their children to be happy. Tony Soprano wants to be feared and respected, but he doesn’t want his daughter to think he’s a murderous thug. Walter White wants to prove he can provide for his family, but he also wants Skyler and Walter Junior to be safe. These tradeoffs and dilemmas—not the difficult men themselves—are what most distinguish these shows from conventional TV dramas. In most movies and shows, the protagonist may have some selfish desires that compete with his or her more altruistic or communal instincts, but which side ultimately wins out is a foregone conclusion. “Heroes are much better suited for the movies,” Martin quotes Alan Ball saying. “I’m more interested in real people. And real people are fucked up” (106).

Ball is the showrunner behind the HBO series Six Feet Under and True Blood, and though Martin gives him quite a bit of space in Difficult Men he doesn’t seem to notice that Ball’s “feminine style” (102) of showrunning undermines his theory about domineering characters being direct reflections of their domineering creators. The handful of interesting observations about what makes for a good series in Martin’s book is pretty evenly divvied up between Ball and David Simon, the creator of The Wire. Recalling his response to the episode of The Sopranos in which Tony strangles a rat while visiting a college campus with Meadow, Ball says,

I felt like I was watching a movie from the seventies. Where it was like, “You know those cartoon ideas of good and evil? Well, forget them. We’re going to address something that’s really real.” The performances were electric. The writing was spectacular. But it was the moral complexity, the complexity of the characters and their dilemmas, that made it incredibly exciting. (94-5)

The connection between us and the characters isn’t just that we have some of the same impulses and desires; it’s that we have to do similar balancing acts as we face similar dilemmas. No, we don’t have to figure out how to whack a guy without our daughters finding out, but a lot of us probably do want to shield our kids from some of the ugliness of our jobs. And most of us have to weigh career advancement against family obligations in one way or another. What makes for compelling drama isn’t our rooting for a character who knows what’s right and does it—that’s not drama at all. What pulls us into these shows is the process the characters go through of deciding which of their competing desires or obligations they should act on. If we see them do the wrong thing once in a while, well, that just ups the ante for the scenes when doing the right thing really counts.

            On the one hand, parents and sponsors want a show that has a good message, a guy with the right ideas and virtuous motives confronted with people with bad ideas and villainous motives. The good guy wins and the lesson is conveyed to the comfortable audiences. On the other hand, writers, for the most part, want to dispense with this idea of lessons and focus on characters with murderous, adulterous, or self-aggrandizing impulses, allowing for the possibility that they’ll sometimes succumb to them. But sometimes writers face the dilemma of having something they really want to say with their stories. Martin describes David Simon’s struggle to square this circle.

 As late as 2012, he would complain in a New York Times interview that fans were still talking about their favorite characters rather than concentrating on the show’s political message… The real miracle of The Wire is that, with only a few late exceptions, it overcame the proud pedantry of its creators to become one of the greatest literary accomplishments of the early twenty-first century. (135)

But then it’s Simon himself whom Martin quotes to explain how having a message to convey can get in the way of a good story.

Everybody, if they’re trying to say something, if they have a point to make, they can be a little dangerous if they’re left alone. Somebody has to be standing behind them saying, dramatically, “Can we do it this way?” When the guy is making the argument about what he’s trying to say, you need somebody else saying, “Yeah, but…” (207)

The exploration of this tension makes up the most substantive and compelling section of Difficult Men.

            Unfortunately, Martin fails to contribute anything to this discussion of drama and dilemmas beyond these short passages and quotes. And at several points he forgets his own observation about drama not being reducible to any underlying message. The most disappointing part of Difficult Men is the chapter devoted to Vince Gilligan and his show Breaking Bad. Gilligan is another counterexample to the theory that domineering and volatile men in the writer’s seat account for domineering and volatile characters in the shows; the writing room he runs gives the chapter its name, “The Happiest Room in Hollywood.” Martin writes that Breaking Bad is “arguably the best show on TV, in many ways the culmination of everything the Third Golden Age had made possible” (264). In trying to explain why the show is so good, he claims that

…whereas the antiheroes of those earlier series were at least arguably the victims of their circumstances—family, society, addiction, and so on—Walter White was insistently, unambiguously, an agent with free will. His journey became a grotesque magnification of the American ethos of self-actualization, Oprah Winfrey’s exhortation that all must find and “live your best life.” What if, Breaking Bad asked, one’s best life happened to be as a ruthless drug lord? (268)

This is Martin making the very mistake he warns against earlier in the book by finding some fundamental message at the core of the show. (Though he could simply believe that even though it’s a bad idea for writers to try to convey messages, it’s okay for critics to read them into the shows.) But he’s doing the best he can with the tools of scholarship he’s allowed to marshal. This assessment is an extension of his point about post-feminist dislocation, turning the entire series into a slap in the face to Oprah, that great fount of male angst.

            To point out that Martin is perfectly wrong about Walter White isn’t merely to offer a rival interpretation. Until the end of season four, as any reasonable viewer who’s paid a modicum of attention to the development of his character will attest, Walter is far more at the mercy of circumstances than any of the other antiheroes in the Third Golden Age lineup. Here’s Walter explaining why he doesn’t want to undergo an expensive experimental cancer treatment in season one:

What I want—what I need—is a choice. Sometimes I feel like I never actually make any of my own. Choices, I mean. My entire life, it just seems I never, you know, had a real say about any of it. With this last one—cancer—all I have left is how I choose to approach this.

By this point he’s already secretly cooking meth to make money for his family, but that’s more a matter of making the most of a bad situation than of being the captain of his own fate. Can you imagine Tony or Don saying anything like this? Even when Walt delivers his famous “I am the danger” speech in season four—which gets my vote for the best moment in TV history (or film history too for that matter)—the statement is purely aspirational; he’s still in all kinds of danger at that point. Did Martin neglect the first four seasons and pick up watching only after Walt finally killed Gus? Whatever the explanation, it’s a big, embarrassing mistake.

The dilemmas Walt faces are what make his story so compelling. He’s far more powerless than other bad boy characters at the start of the series, and he’s also far more altruistic in his motives. That’s precisely why it’s so disturbing—and riveting—to see those motives corrupted by his gradually accumulating power. It’s hard not to think of the cartel drug lords we always hear about in Mexico in terms of those “cartoon ideas of good and evil” Alan Ball was so delighted to see smashed by Tony Soprano. But Breaking Bad goes a long way toward bridging the divide between such villains and a type of life we have no trouble imagining. The show isn’t about free will or self-actualization at all; it’s about how even the nicest guy can be turned into one of the scariest villains by being placed in a not all that far-fetched set of circumstances. In much the same way, Martin, clearly a smart guy and a talented writer, can be made to look like a bit of an idiot by being forced to rely on a bunch of really bad ideas as he explores the inner workings of some really great shows.

If men’s selfish desires—sex, status, money, freedom—aren’t any more powerful than women’s, their approaches to satisfying them still tend to be more direct, less subtle. But what makes it harder for a woman’s struggles with her own desires to take on the same urgency as a man’s is probably not that far removed from the reasons women are seldom as physically imposing as men. Volatility in a large man can be really frightening. Even today, men are more likely to have high-status careers like Don’s, but they’re also far more likely to end up in prison. These are pretty high stakes. And Don’s actions have ramifications not just for his own family’s well-being but for that of everyone at Sterling Cooper and their families, which is a consequence of that high status. So status works as a proxy for size. Carmela Soprano’s volatility could be frightening too, but she isn’t the time bomb Tony is. Speaking of bombs, Skyler White is an expert at bullying men, but going head-to-head with Walter she’s way overmatched. Men will always be scarier than women on average, so their struggles to rein in their scarier impulses will seem more urgent. Still, the Third Golden Age is a teenager now, and as anxious as I am to see what happens to Walter White and all his friends and family, I think the bad boy thing is getting a little stale. Anyone seen Damages?

Also read:

The Criminal Sublime: Walter White's Brutally Plausible Journey to the Heart of Darkness in Breaking Bad

and:

How Violent Fiction Works: Rohan Wilson’s “The Roving Party” and James Wood’s Sanguinary Sublime from Conrad to McCarthy

and:

Sabbath Says: Philip Roth and the Dilemmas of Ideological Castration

Dennis Junk

Sabbath Says: Philip Roth and the Dilemmas of Ideological Castration

With “Sabbath’s Theater,” Philip Roth has called down the thunder. The story does away with the concept of a likable character while delivering a wildly absorbing experience. And it satirizes all the woeful facets of how literature is taught today.

Sabbath’s Theater is the type of book you lose friends over. Mickey Sabbath, the adulterous title character who follows in the long literary line of defiantly self-destructive, excruciatingly vulnerable, and off-puttingly but eloquently lustful leading males like Holden Caulfield and Humbert Humbert, strains the moral bounds of fiction and compels us to contemplate the nature of our own voyeuristic impulse to see him through to the end of the story—and not only contemplate it but defend it, as if in admitting we enjoy the book, find its irreverences amusing, and think that in spite of how repulsive he often is there still might be something to be said for poor old Sabbath, we’re confessing to no minor offense of our own. Fans and admiring critics alike can’t resist rushing to qualify their acclaim by insisting they don’t condone his cheating on both of his wives, the seduction of a handful of his students, his habit of casually violating others’ privacy, his theft, his betrayal of his lone friend, his manipulations, his racism, his caustic, often cruelly precise provocations—but by the time they get to the end of Sabbath’s debt column it’s a near certainty any list of mitigating considerations will fall short of getting him out of the red. Sabbath, once a puppeteer who now suffers crippling arthritis, doesn’t seem like a very sympathetic character, and yet we sympathize with him nonetheless. In his wanton disregard for his own reputation and his embrace, principled in a way, of his own appetites, intuitions, and human nastiness, he inspires a fascination none of the literary nice guys can compete with. So much for the argument that the novel is a morally edifying art form.

            Thus, in Sabbath, Philip Roth has created a character both convincing and compelling who challenges a fundamental—we may even say natural—assumption about readers’ (or viewers’) role in relation to fictional protagonists, one made by everyone from the snarky authors of even the least sophisticated Amazon.com reviews to the theoreticians behind the most highfalutin academic criticism—the assumption that characters in fiction serve as vehicles for some message the author created them to convey, or which some chimerical mechanism within the “dominant culture” created to serve as agents of its own proliferation. The corollary is that the task of audience members is to try to decipher what the author is trying to say with the work, or what element of the culture is striving to perpetuate itself through it. If you happen to like the message the story conveys, or agree with it at some level, then you recommend the book and thus endorse the statement. Only rarely does a reviewer realize or acknowledge that the purpose of fiction is not simply to encourage readers to behave as the protagonists behave or, if the tale is a cautionary one, to expect the same undesirable consequences should they choose to behave similarly. Sabbath does in fact suffer quite a bit over the course of the novel, and much of that suffering comes as a result of his multifarious offenses, so a case can be made on behalf of Roth’s morality. Still, we must wonder if he really needed to write a story in which the cheating husband is abandoned by both of his wives to make the message sink in that adultery is wrong—especially since Sabbath doesn’t come anywhere near to learning that lesson himself. “All the great thoughts he had not reached,” Sabbath muses in the final pages, “were beyond enumeration; there was no bottom to what he did not have to say about the meaning of his life” (779).

           Part of the reason we can’t help falling back on the notions that fiction serves a straightforward didactic purpose and that characters should be taken as models, positive or negative, for moral behavior is that our moral emotions are invariably and automatically engaged by stories; indeed, what we usually mean when we say we got into a story is that we were in suspense as we anticipated whether the characters ultimately met with the fates we felt they deserved. We reflexively size up any character the author introduces the same way we assess the character of a person we’re meeting for the first time in real life. For many readers, the question of whether a novel is any good is interchangeable with the question of whether they liked the main characters, assuming they fare reasonably well in the culmination of the plot. If an author like Roth evinces an attitude drastically different from ours toward a character of his own creation like Sabbath, then we feel that in failing to condemn him, in holding him up as a model, the author is just as culpable as his character. In a recent edition of PBS’s American Masters devoted to Roth, for example, Jonathan Franzen, a novelist himself, describes how even he couldn’t resist responding to his great forebear’s work in just this way. “As a young writer,” Franzen recalls, “I had this kind of moralistic response of ‘Oh, you bad person, Philip Roth’” (54:56).

            That fiction’s charge is to strengthen our preset convictions through a process of narrative tempering, thus catering to our desire for an orderly calculus of just deserts, serves as the basis for a contract between storytellers and audiences, a kind of promise on which most commercial fiction delivers with a bang. And how many of us have wanted to throw a book out of the window when we felt that promise had been broken? The goal of professional and academic critics, we may imagine, might be to ease their charges into an appreciation of more complex narrative scenarios enacted by characters who escape easy categorization. But since scholarship in the humanities, and in literary criticism especially, has been in a century-long sulk over the greater success of science and the greater renown of scientists, professors of literature have scarcely even begun to ponder what anything resembling a valid answer to the questions of how fiction works and what the best strategies for experiencing it might look like. Those who aren’t pouting in a corner about the ascendancy of science—but the Holocaust!—are stuck in the muck of the century-old pseudoscience of psychoanalysis. But the real travesty is that the most popular, politically inspired schools of literary criticism—feminism, Marxism, postcolonialism—actively preach the need to ignore, neglect, and deny the very existence of moral complexity in literature, violently displacing any appreciation of difficult dilemmas with crudely tribal formulations of good and evil.

            For those inculcated with a need to take a political stance with regard to fiction, the only important dynamics in stories involve the interplay of society’s privileged oppressors and their marginalized victims. In 1976, nearly twenty years before the publication of Sabbath’s Theater, the feminist critic Vivian Gornick lumped Roth together with Saul Bellow and Norman Mailer in an essay asking “Why Do These Men Hate Women?” because she took issue with the way women are portrayed in their novels. Gornick, following the methods standard to academic criticism, doesn’t bother devoting any space in her essay to inconvenient questions about how much we can glean about these authors from their fictional works or what it means that the case for her prosecution rests by necessity on a highly selective approach to quoting from those works. And this slapdash approach to scholarship is supposedly justified because she and her fellow feminist critics believe women are in desperate need of protection from the incalculable harm they assume must follow from such allegedly negative portrayals. In this concern for how women, or minorities, or some other victims are portrayed and how they’re treated by their notional oppressors—rich white guys—Gornick and other critics who make of literature a battleground for their political activism are making the same assumption about fiction’s straightforward didacticism as the most unschooled consumers of commercial pulp. The only difference is that the academics believe the message received by audiences is all that’s important, not the message intended by the author. The basis of this belief probably boils down to its obvious convenience.

In Sabbath’s Theater, the idea that literature, or art of any kind, is reducible to so many simple messages, and that these messages must be measured against political agendas, is dashed in the most spectacularly gratifying fashion. Unfortunately, the idea is so seldom scrutinized, and the political agendas are insisted on so inclemently, clung to and broadcast with such indignant and prosecutorial zeal, that it seems not one of the critics, nor any of the authors, who were seduced by Sabbath was able to fully reckon with the implications of that seduction. Franzen, for instance, in a New Yorker article about fictional anti-heroes, dodges the issue as he puzzles over the phenomenon that “Mickey Sabbath may be a disgustingly self-involved old goat,” but he’s somehow still sympathetic. The explanation Franzen lights on is that

the alchemical agent by which fiction transmutes my secret envy or my ordinary dislike of “bad” people into sympathy is desire. Apparently, all a novelist has to do is give a character a powerful desire (to rise socially, to get away with murder) and I, as a reader, become helpless not to make that desire my own. (63)

If Franzen is right—and this chestnut is a staple of fiction workshops—then the political activists are justified in their urgency. For if we’re powerless to resist adopting the protagonist’s desires as our own, however fleetingly, then any impulse to victimize women or minorities must invade readers’ psyches at some level, conscious or otherwise. The simple fact, however, is that Sabbath has not one powerful desire but many competing desires, ones that shift as the novel progresses, and it’s seldom clear even to Sabbath himself what those desires are. (And is he really as self-involved as Franzen suggests? It seems to me rather that he compulsively tries to get into other people’s heads, reflexively imagining elaborate stories for them.)

            While we undeniably respond to virtuous characters in fiction by feeling anxiety on their behalf as we read about or watch them undergo the ordeals of the plot, and we just as undeniably enjoy seeing virtue rewarded alongside cruelty being punished—the goodies prevailing over the baddies—these natural responses do not necessarily imply that stories compel our interest and engage our emotions by providing us with models and messages of virtue. Stories aren’t sermons. In his interview for American Masters, Roth explained what a writer’s role is vis-à-vis social issues.

My job isn’t to be enraged. My job is what Chekhov said the job of an artist was, which is the proper presentation of the problem. The obligation of the writer is not to provide the solution to a problem. That’s the obligation of a legislator, a leader, a crusader, a revolutionary, a warrior, and so on. That’s not the goal or aim of a writer. You’re not selling it, and you’re not inviting condemnation. You’re inviting understanding. (59:41)

The crucial but overlooked distinction that characters like Sabbath—but none so well as Sabbath—bring into stark relief is the one between declarative knowledge on the one hand and moment-by-moment experience on the other. Consider for a moment how many books and movies we’ve all been thoroughly engrossed in for however long it took to read or watch them, only to discover a month or so later that we can’t remember even the broadest strokes of how their plots resolved themselves—much less what their morals might have been.

The answer to the question of what the author is trying to say is that he or she is trying to give readers a sense of what it would be like to go through what the characters are going through—or what it would be like to go through it with them. In other words, authors are not trying to say anything; they’re offering us an experience, once-removed and simulated though it may be. This isn’t to say that these simulated experiences don’t engage our moral emotions; indeed, we’re usually only as engaged in a story as our moral emotions are engaged by it. The problem is that in real time, in real life, political ideologies, psychoanalytic theories, and rigid ethical principles are too often the farthest thing from helpful. “Fuck the laudable ideologies,” Sabbath helpfully insists: “Shallow, shallow, shallow!” Living in a complicated society with other living, breathing, sick, cruel, saintly, conniving, venal, altruistic, deceitful, noble, horny humans demands not so much a knowledge of the rules as a finely honed body of skills—and our need to develop and hone these skills is precisely why we evolved to find the simulated experiences of fictional narratives both irresistibly fascinating and endlessly pleasurable. Franzen was right that desires are important: the desire to be a good person, the desire to do things others may condemn, the desire to get along with our families and friends and coworkers, the desire to tell them all to fuck off so we can be free, even if just for an hour, to breathe… or to fuck an intern, as the case may be. Grand principles offer little guidance when it comes to balancing these competing desires. This is because, as Sabbath explains, “The law of living: fluctuation. For every thought a counterthought, for every urge a counterurge” (518).

            Fiction then is not a conveyance for coded messages—how tedious that would be (how tedious it really is when writers make this mistake); it is rather a simulated experience of moral dilemmas arising from scenarios which pit desire against desire, conviction against reality, desire against conviction, reality against desire, in any and all permutations. Because these experiences are once-removed and, after all, merely fictional, and because they require our sustained attention, the dilemmas tend to play out in the vicinity of life’s extremes. Here’s how Sabbath’s Theater opens:

            Either forswear fucking others or the affair is over.

            This was the ultimatum, the maddeningly improbable, wholly unforeseen ultimatum, that the mistress of fifty-two delivered in tears to her lover of sixty-four on the anniversary of an attachment that had persisted with an amazing licentiousness—and that, no less amazingly, had stayed their secret—for thirteen years. But now with hormonal infusions ebbing, with the prostate enlarging, with probably no more than another few years of semi-dependable potency still his—with perhaps not that much more life remaining—here at the approach of the end of everything, he was being charged, on pain of losing her, to turn himself inside out. (373)

The ethical proposition that normally applies in situations like this is that adultery is wrong, so don’t commit adultery. But these two have been committing adultery with each other for thirteen years already—do we just stop reading? And if we keep reading, maybe nodding once in a while as we proceed, cracking a few wicked grins along the way, does that mean we too must be guilty?

                               *****

Much of the fiction written by male literary figures of the past generation, guys like Roth, Mailer, Bellow, and Updike, focuses on the morally charged dilemmas instanced by infidelity, while their Gen X and millennial successors, led by guys like Franzen and David Foster Wallace, have responded to shifting mores—and a greater exposure to academic literary theorizing—by completely overhauling how these dilemmas are framed. Whereas the older generation framed the question as how can we balance the intense physical and spiritual—even existential—gratification of sexual adventure on the one hand with our family obligations on the other, for their successors the question has become how can we males curb our disgusting, immoral, intrinsically oppressive lusting after young women inequitably blessed with time-stamped and overwhelmingly alluring physical attributes. “The younger writers are so self-conscious,” Katie Roiphe writes in a 2009 New York Times essay, “so steeped in a certain kind of liberal education, that their characters can’t condone even their own sexual impulses; they are, in short, too cool for sex.” Roiphe’s essay, “The Naked and the Conflicted,” stands alongside a 2012 essay in The New York Review of Books by Elaine Blair, “Great American Losers,” as the best descriptions of the new literary trend toward sexually repressed and pathetically timid male leads. The typical character in this vein, Blair writes, “is the opposite of entitled: he approaches women cringingly, bracing for a slap.”

            The writers in the new hipster cohort create characters who bury their longings layers-deep in irony because they’ve been assured the failure on the part of men of previous generations to properly check these same impulses played some unspecified role in the abysmal standing of women in society. College students can’t make it past their first semester without hearing about the evils of so-called objectification, but it’s nearly impossible to get a straight answer from anyone, anywhere, to the question of how objectification can be distinguished from normal, non-oppressive male attraction and arousal. Even Roiphe, in her essay lamenting the demise of male sexual virility in literature, relies on a definition of male oppression so broad that it encompasses even the most innocuous space-filling lines in the books of even the most pathetically diffident authors, writing that “the sexism in the work of the heirs apparent” of writers like Roth and Updike,

is simply wilier and shrewder and harder to smoke out. What comes to mind is Franzen’s description of one of his female characters in “The Corrections”: “Denise at 32 was still beautiful.” To the esteemed ladies of the movement I would suggest this is not how our great male novelists would write in the feminist utopia.

How, we may ask, did it get to the point where acknowledging that age influences how attractive a woman is qualifies a man for designation as a sexist? Blair, in her otherwise remarkably trenchant essay, lays the blame for our oversensitivity—though paranoia is probably a better word—at the feet of none other than those great male novelists themselves, or, as David Foster Wallace calls them, the Great Male Narcissists. She writes,

Because of the GMNs, these two tendencies—heroic virility and sexist condescension—have lingered in our minds as somehow yoked together, and the succeeding generations of American male novelists have to some degree accepted the dyad as truth. Behind their skittishness is a fearful suspicion that if a man gets what he wants, sexually speaking, he is probably exploiting someone.

That Roth et al were sexist, condescending, disgusting, narcissistic—these are articles of faith for feminist critics. Yet when we consider how expansive the definitions of terms like sexism and misogyny have become—in practical terms, they both translate to: not as radically feminist as me—and the laughably low standard of evidence required to convince scholars of the accusations, female empowerment starts to look like little more than a reserved right to stand in self-righteous judgment of men for giving voice to and acting on desires anyone but the most hardened ideologue will agree are only natural.

The effect on writers of this ever-looming threat of condemnation is that they either allow themselves to be silenced or they opt to participate in the most undignified of spectacles, peevishly sniping at their colleagues, falling all over themselves to be granted recognition as champions for the cause. Franzen, at least early in his career, was more the silenced type. Discussing Roth, he wistfully endeavors to give the appearance of having moved beyond his initial moralistic responses. “Eventually,” he says, “I came to feel as if that was coming out of an envy: like, wow, I wish I could be as liberated of worry about other people’s opinion of me as Roth is” (55:18). We have to wonder if his espousal of the reductive theory that sympathy for fictional characters is based solely on the strength of their desires derives from this same longing for freedom to express his own. David Foster Wallace, on the other hand, wasn’t quite as enlightened or forgiving when it came to his predecessors. Here’s how he explains his distaste for a character in one of Updike’s novels, openly intimating the author’s complicity:

It’s that he persists in the bizarre adolescent idea that getting to have sex with whomever one wants whenever one wants is a cure for ontological despair. And so, it appears, does Mr. Updike—he makes it plain that he views the narrator’s impotence as catastrophic, as the ultimate symbol of death itself, and he clearly wants us to mourn it as much as Turnbull does. I’m not especially offended by this attitude; I mostly just don’t get it. Erect or flaccid, Ben Turnbull’s unhappiness is obvious right from the book’s first page. But it never once occurs to him that the reason he’s so unhappy is that he’s an asshole.

So the character is an asshole because he wants to have sex outside of marriage, and he’s unhappy because he’s an asshole, and it all traces back to the idea that having sex with whomever one wants is a source of happiness? Sounds like quite the dilemma—and one that pronouncing the main player an asshole does nothing to solve. This passage is the conclusion to a review in which Wallace tries to square his admiration for Updike’s writing with his desire to please a cohort of women readers infuriated by the way Updike writes about—portrays—women (which raises the question of why they’d read so many of his books). The troubling implication of his compromise is that if Wallace were himself to freely express his sexual feelings, he’d be open to the charge of sexism too—he’d be an asshole. Better to insist he simply doesn’t “get” why indulging his sexual desires might alleviate his “ontological despair.” What would Mickey Sabbath make of the fact that Wallace hanged himself when he was only forty-six, eleven years after publishing that review? (This isn’t just a nasty rhetorical point; Sabbath has a fascination with artists who commit suicide.)

The inadequacy of moral codes and dehumanizing ideologies when it comes to guiding real humans through life’s dilemmas, along with their corrosive effects on art, is the abiding theme of Sabbath’s Theater. One of the pivotal moments in Sabbath’s life is when a twenty-year-old student he’s in the process of seducing leaves a tape recorder out to be discovered in a ladies’ room at the university. The student, Kathy Goolsbee, has recorded a phone sex session between her and Sabbath, and when the tape finds its way into the hands of the dean, it becomes grounds for the formation of a committee of activists against the abuse of women. At first, Kathy doesn’t realize how bad things are about to get for Sabbath. She even offers to give him a blow job as he berates her for her carelessness. Trying to impress on her the situation’s seriousness, he says,

Your people have on tape my voice giving reality to all the worst things they want the world to know about men. They have a hundred times more proof of my criminality than could be required by even the most lenient of deans to drive me out of every decent antiphallic educational institution in America. (586)

The committee against Sabbath proceeds to make the full recorded conversation available through a call-in line (the nineties equivalent of posting the podcast online). But the conversation itself isn’t enough; one of the activists gives a long introduction, which concludes,

The listener will quickly recognize how by this point in his psychological assault on an inexperienced young woman, Professor Sabbath has been able to manipulate her into thinking that she is a willing participant. (567-8)

Sabbath knows full well that even consensual phone sex can be construed as a crime if doing so furthers the agenda of those “esteemed ladies of the movement” Roiphe addresses. 

Reading through the lens of a tribal ideology ineluctably leads to the refraction of reality beyond recognizability, and any aspiring male writer quickly learns in all his courses in literary theory that the criteria for designation as an enemy to the cause of women are pretty much whatever the feminist critics fucking say they are. Wallace wasn’t alone in acquiescing to feminist rage by denying his own boorish instincts. Roiphe describes the havoc this opportunistic antipathy toward male sexuality wreaks in the minds of male writers and their literary creations:

Rather than an interest in conquest or consummation, there is an obsessive fascination with trepidation, and with a convoluted, postfeminist second-guessing. Compare [Benjamin] Kunkel’s tentative and guilt-ridden masturbation scene in “Indecision” with Roth’s famous onanistic exuberance with apple cores, liver and candy wrappers in “Portnoy’s Complaint.” Kunkel: “Feeling extremely uncouth, I put my penis away. I might have thrown it away if I could.” Roth also writes about guilt, of course, but a guilt overridden and swept away, joyously subsumed in the sheer energy of taboo smashing: “How insane whipping out my joint like that! Imagine what would have been had I been caught red-handed! Imagine if I had gone ahead.” In other words, one rarely gets the sense in Roth that he would throw away his penis if he could.

And what good comes of an ideology that encourages the psychological torture of bookish young men? It’s hard to distinguish the effects of these so-called literary theories from the hellfire scoldings delivered from the pulpits of the most draconian and anti-humanist religious patriarchs. Do we really need to ideologically castrate all our male scholars to protect women from abuse and further the cause of equality?

*****

The experience of sexual relations between older teacher and younger student in Sabbath’s Theater is described much differently when the gender activists have yet to get involved—and not just by Sabbath but by Kathy as well. “I’m of age!” she protests as he chastises her for endangering his job and opening him up to public scorn; “I do what I want” (586). Absent the committee against him, Sabbath’s impression of how his affairs with his students impact them reflects the nuance of feeling inspired by these experimental entanglements, the kind of nuance that the “laudable ideologies” can’t even begin to capture.

There was a kind of art in his providing an illicit adventure not with a boy of their own age but with someone three times their age—the very repugnance that his aging body inspired in them had to make their adventure with him feel a little like a crime and thereby give free play to their budding perversity and to the confused exhilaration that comes of flirting with disgrace. Yes, despite everything, he had the artistry still to open up to them the lurid interstices of life, often for the first time since they’d given their debut “b.j.” in junior high. As Kathy told him in that language which they all used and which made him want to cut their heads off, through coming to know him she felt “empowered.” (566)

Opening up “the lurid interstices of life” is precisely what Roth and the other great male writers—all great writers—are about. If there are easy answers to the questions of what characters should do, or if the plot entails no more than a simple conflict between a blandly good character and a blandly bad one, then the story, however virtuous its message, will go unattended.

But might there be too much at stake for us impressionable readers to be allowed free rein to play around in imaginary spheres peopled by morally dubious specters? After all, if denouncing the dreamworlds of privileged white men, however unfairly, redounds to the benefit of women and children and minorities, then perhaps it’s to the greater good. In fact, though, the increasing availability of increasingly graphic media portrayals of sex and violence has coincided with marked decreases in actual violence and the abuse of women. And does anyone really believe it’s the least literate, least media-saturated societies that are the kindest to women? The simple fact is that the theory of literature subtly encouraging oppression can’t be valid. But the problem is that once ideologies are institutionalized, once a threshold number of people depend on their perpetuation for their livelihoods, people whose scholarly work and reputations are staked on them, then victims of oppression will be found, their existence insisted on, regardless of whether they truly exist or not.

In another scandal Sabbath was embroiled in long before his flirtation with Kathy Goolsbee, he was brought up on charges of indecency because in the course of a street performance he’d exposed a woman’s nipple. The woman herself, Helen Trumbull, maintains from the outset of the imbroglio that whatever Sabbath had done, he’d done it with her consent—just as will be the case with his “psychological assault” on Kathy. But even as Sabbath sits assured that the case against him will collapse once the jury hears the supposed victim testify on his behalf, the prosecution takes a bizarre twist:

In fact, the victim, if there even is one, is coming this way, but the prosecutor says no, the victim is the public. The poor public, getting the shaft from this fucking drifter, this artist. If this guy can walk along a street, he says, and do this, then little kids think it’s permissible to do this, and if little kids think it’s permissible to do this, then they think it’s permissible to blah blah banks, rape women, use knives. If seven-year-old kids—the seven nonexistent kids are now seven seven-year-old kids—are going to see that this is fun and permissible with strange women… (663-4)

Here we have Roth’s dramatization of the fundamental conflict between artists and moralists. Even if no one is directly hurt by playful scenarios, that they carry a message, one that threatens to corrupt susceptible minds, is so seemingly obvious it’s all but impossible to refute. Since the audience for art is “the public,” the acts of depravity and degradation it depicts are, if anything, even more fraught with moral and political peril than any offense against an individual victim, real or imagined.  

            This theme of the oppressive nature of ideologies devised to combat oppression, the victimizing proclivity of movements originally fomented to protect and empower victims, is most directly articulated by a young man named Donald, dressed in all black and sitting atop a file cabinet in a nurse’s station when Sabbath happens across him at a rehab clinic. Donald “vaguely resembled the Sabbath of some thirty years ago,” and Sabbath will go on to apologize for interrupting him, referring to him as “a man whose aversions I wholeheartedly endorse.” What he was saying before the interruption:

“Ideological idiots!” proclaimed the young man in black. “The third great ideological failure of the twentieth century. The same stuff. Fascism. Communism. Feminism. All designed to turn one group of people against another group of people. The good Aryans against the bad others who oppress them. The good poor against the bad rich who oppress them. The good women against the bad men who oppress them. The holder of ideology is pure and good and clean and the other wicked. But do you know who is wicked? Whoever imagines himself to be pure is wicked! I am pure, you are wicked… There is no human purity! It does not exist! It cannot exist!” he said, kicking the file cabinet for emphasis. “It must not and should not exist! Because it’s a lie. … Ideological tyranny. It’s the disease of the century. The ideology institutionalizes the pathology. In twenty years there will be a new ideology. People against dogs. The dogs are to blame for our lives as people. Then after dogs there will be what? Who will be to blame for corrupting our purity?” (620-1)

It’s noteworthy that this rant is made by a character other than Sabbath. By this point in the novel, we know Sabbath wouldn’t speak so artlessly—unless he was really frightened or angry. As effective and entertaining an indictment of “Ideological tyranny” as Sabbath’s Theater is, we shouldn’t expect to encounter anywhere in a novel by a storyteller as masterful as Roth a character operating as a mere mouthpiece for some argument. Even Donald himself, Sabbath quickly gleans, isn’t simply spouting off; he’s trying to impress one of the nurses.

            And it’s not just the political ideologies that conscript complicated human beings into simple roles as oppressors and victims. The pseudoscientific psychological theories that both inform literary scholarship and guide many non-scholars through life crises and relationship difficulties function according to the same fundamental dynamic of tribalism; they simply substitute abusive family members for more generalized societal oppression and distorted or fabricated crimes committed in the victim’s childhood for broader social injustices. Sabbath is forced to contend with this particular brand of depersonalizing ideology because his second wife, Roseanna, picks it up through her AA meetings, and then becomes further enmeshed in it through individual treatment with a therapist named Barbara. Sabbath, who considers himself a failure, and who is carrying on an affair with the woman we meet in the opening lines of the novel, is baffled as to why Roseanna would stay with him. Her therapist provides an answer of sorts.

But then her problem with Sabbath, the “enslavement,” stemmed, according to Barbara, from her disastrous history with an emotionally irresponsible mother and a violent alcoholic father for both of whom Sabbath was the sadistic doppelganger. (454)

Roseanna’s father was a geology professor who hanged himself when she was a young teenager. Sabbath is a former puppeteer with crippling arthritis. Naturally, he’s confused by the purported identity of roles.

These connections—between the mother, the father, and him—were far clearer to Barbara than they were to Sabbath; if there was, as she liked to put it, a “pattern” in it all, the pattern eluded him. In the midst of a shouting match, Sabbath tells his wife, “As for the ‘pattern’ governing a life, tell Barbara it’s commonly called chaos” (455).

When she protests, “You are shouting at me like my father,” Sabbath asserts his individuality: “The fuck that’s who I’m shouting at you like! I’m shouting at you like myself!” (459). Whether you see his resistance as heroic or not probably depends on how much credence you give to those psychological theories.

            From the opening lines of Sabbath’s Theater, when we’re presented with the dilemma of the teary-eyed mistress demanding monogamy in their adulterous relationship, the simple response would be to stand in easy judgment of Sabbath and, as Wallace did with Updike’s character, declare him an asshole. It’s clear that he loves this woman, a Croatian immigrant named Drenka, a character who at points steals the show even from the larger-than-life protagonist. And it’s clear his fidelity would mean a lot to her. Is his freedom to fuck other women really so important? Isn’t he just being selfish? But only a few pages later our easy judgment suddenly gets more complicated:

As it happened, since picking up Christa several years back Sabbath had not really been the adventurous libertine Drenka claimed she could no longer endure, and consequently she already had the monogamous man she wanted, even if she didn’t know it. To women other than her, Sabbath was by now quite unalluring, not just because he was absurdly bearded and obstinately peculiar and overweight and aging in every obvious way but because, in the aftermath of the scandal four years earlier with Kathy Goolsbee, he’d become more dedicated than ever to marshaling the antipathy of just about everyone as though he were, in fact, battling for his rights. (394)

Christa was a young woman who participated in a threesome with Sabbath and Drenka, an encounter to which Sabbath’s only tangible contribution was to hand the younger woman a dildo.

            One of the central dilemmas for a character who loves the thrill of sex, who seeks in it a rekindling of youthful vigor—“the word’s rejuvenation,” Sabbath muses at one point (517)—is that the adrenaline boost born of being in the wrong and the threat of getting caught, what Roiphe calls “the sheer energy of taboo smashing,” becomes ever more indispensable as libido wanes with age. Even before Sabbath ever had to contend with the ravages of aging, he reveled in this added exhilaration that attends any expedition into forbidden realms. What makes Drenka so perfect for him is that she has not just a similarly voracious appetite but a similar fondness for outrageous sex and the smashing of taboo. And it’s this mutual celebration of the verboten that Sabbath is so reluctant to relinquish. Of Drenka, he thinks,

The secret realm of thrills and concealment, this was the poetry of her existence. Her crudeness was the most distinguishing force in her life, lent her life its distinction. What was she otherwise? What was he otherwise? She was his last link with another world, she and her great taste for the impermissible. As a teacher of estrangement from the ordinary, he had never trained a more gifted pupil; instead of being joined by the contractual they were interconnected by the instinctual and together could eroticize anything (except their spouses). Each of their marriages cried out for a countermarriage in which the adulterers attack their feelings of captivity. (395)

Those feelings of captivity, the yearnings to experience the flow of the old juices, are anything but adolescent, as Wallace suggests of them; adolescents have a few decades before they have to worry about dwindling arousal. Most of them have the opposite problem.

            The question of how readers are supposed to feel about a character like Sabbath doesn’t have any simple answers. He’s an asshole at several points in the novel, but at several points he’s not. One of the reasons he’s so compelling is that working out what our response to him should be poses a moral dilemma of its own. Whether or not we ultimately decide that adultery is always and everywhere wrong, the experience of being privy to Sabbath’s perspective can help us prepare ourselves for our own feelings of captivity, lusting nostalgia, and sexual temptation. Most of us will never find ourselves in a dilemma like the one Sabbath gets tangled in with his friend Norman’s wife, for instance, but it would be to our detriment to automatically discount the old hornball’s insights.

He could discern in her, whenever her husband spoke, the desire to be just a little cruel to Norman, saw her sneering at the best of him, at the very best things in him. If you don’t go crazy because of your husband’s vices, you go crazy because of his virtues. He’s on Prozac because he can’t win. Everything is leaving her except for her behind, which her wardrobe informs her is broadening by the season—and except for this steadfast prince of a man marked by reasonableness and ethical obligation the way others are marked by insanity or illness. Sabbath understood her state of mind, her state of life, her state of suffering: dusk is descending, and sex, our greatest luxury, is racing away at a tremendous speed, everything is racing off at a tremendous speed and you wonder at your folly in having ever turned down a single squalid fuck. You’d give your right arm for one if you are a babe like this. It’s not unlike the Great Depression, not unlike going broke overnight after years of raking it in. “Nothing unforeseen that happens,” the hot flashes inform her, “is likely ever again going to be good.” Hot flashes mockingly mimicking the sexual ecstasies. Dipped, she is, in the very fire of fleeting time. (651)

Welcome to messy, chaotic, complicated life.

Sabbath’s Theater is, in part, Philip Roth’s raised middle finger to the academic moralists whose idiotic and dehumanizing ideologies have spread like a cancer into all the venues where literature is discussed and all the avenues through which it’s produced. Unfortunately, the unrecognized need for culture-wide chemotherapy hasn’t gotten any less dire in the nearly two decades since the novel was published. With literature now drowning in the devouring tide of new media, the tragic course set by the academic custodians of art toward bloodless prudery and impotent sterility in the name of misguided political activism promises to do nothing but ensure the ever greater obsolescence of epistemologically doomed and resoundingly pointless theorizing, making of college courses the places where you go to become, at best, profoundly confused about where you should stand with relation to fiction and fictional characters, and, at worst, a self-righteous demagogue denouncing the chimerical evils allegedly encoded into every text or cultural artifact. All the conspiracy theorizing about the latent evil urgings of literature has amounted to little more than another reason not to read, another reason to tune in to Breaking Bad or Mad Men instead. But the only reason Roth’s novel makes such a successful case is that it at no point allows itself to be reducible to a mere case, just as Sabbath at no point allows himself to be conscripted as a mere argument. We don’t love or hate him; we love and hate him. But we sort of just love him because he leaves us free to do both as we experience his antics, once removed and simulated, but still just as complicatedly eloquent in their message of “Fuck the laudable ideologies”—or not, as the case may be. 

Also read

JUST ANOTHER PIECE OF SLEAZE: THE REAL LESSON OF ROBERT BOROFSKY'S "FIERCE CONTROVERSY"

And

PUTTING DOWN THE PEN: HOW SCHOOL TEACHES US THE WORST POSSIBLE WAY TO READ LITERATURE

And

HOW VIOLENT FICTION WORKS: ROHAN WILSON’S “THE ROVING PARTY” AND JAMES WOOD’S SANGUINARY SUBLIME FROM CONRAD TO MCCARTHY

Dennis Junk

Freud: The Falsified Cipher

Upon entering a graduate program in literature, I was appalled to find that Freud’s influence was alive and well in the department. Didn’t they know that nearly all of Freud’s theories have been disproven? Didn’t they know psychoanalysis is pseudoscience?

[As I'm hard at work on a story, I thought I'd post an essay from my first course as a graduate student on literary criticism. It was in the fall of 2009, and I was shocked and appalled that not only were Freud's ideas still being taught but there was no awareness whatsoever that psychology had moved beyond them. This is my attempt at righting the record while keeping my tone in check.]

The matter of epistemology in literary criticism is closely tied to the question of what end the discipline is supposed to serve. How critics decide what standard of truth to adhere to is determined by the role they see their work playing, both in academia and beyond. Freud stands apart as a literary theorist, professing in his works a commitment to scientific rigor in a field that generally holds belief in even the possibility of objectivity as at best naïve and at worst bourgeois or fascist. For the postmodernists, both science and literature are suspiciously shot through with the ideological underpinnings of capitalist European male hegemony, which they take as their duty to undermine. Their standard of truth, therefore, seems to be whether a theory or application effectively exposes one or another element of that ideology to “interrogation.” Admirable as the values underlying this patently political reading of texts are, the science-minded critic might worry lest such an approach merely lead straight back to the a priori assumptions from which it set forth. Now, a century after Freud revealed the theory and practice of psychoanalysis, his attempt to interpret literature scientifically seems like one possible route of escape from the circularity (and obscurantism) of postmodernism. Unfortunately, Freud’s theories have suffered multiple devastating empirical failures, and Freud himself has been shown to be less a committed scientist than an ingenious fabulist, but it may be possible to salvage from the failures of psychoanalysis some key to a viable epistemology of criticism.

A text dating from early in the development of psychoanalysis shows both the nature of Freud’s methods and some of the most important substance of his supposed discoveries. Describing his theory of the Oedipus complex in The Interpretation of Dreams, Freud refers vaguely to “observations on normal children,” to which he compares his experiences with “psychoneurotics” to arrive at his idea that both display, to varying degrees, “feelings of love and hatred to their parents” (920). There is little to object to in this rather mundane observation, but Freud feels compelled to write that his discovery is confirmed by a legend,

…a legend whose profound and universal power to move can only be understood if the hypothesis I have put forward in regard to the psychology of children has an equally universal validity. (920)

He proceeds to relate the Sophocles drama from which his theory gets its name. In the story, Oedipus is tricked by fate into killing his father and marrying his mother. Freud takes this as evidence that the love and hatred he has observed in children are of a particular kind. According to his theory, any male child is fated to “direct his first sexual impulse towards his mother” and his “first murderous wish against his father” (921). But Freud originally poses this idea as purely hypothetical. What settles the issue is evidence he gleans from dream interpretations. “Our dreams,” he writes, “convince us that this is so” (921). Many men, it seems, confided to him that they dreamt of having sex with their mothers and killing their fathers.

Freud’s method, then, was to seek a thematic confluence between men’s dreams, the stories they find moving, and the behaviors they display as children, which he knew mostly through self-reporting years after the fact. Indeed, the entire edifice of psychoanalysis is purported to have been erected on this epistemic foundation. In a later essay on “The Uncanny,” Freud makes the sources of his ideas even more explicit. “We know from psychoanalytic experience,” he writes, “that the fear of damaging or losing one’s eyes is a terrible one in children” (35). A few lines down, he claims that, “A study of dreams, phantasies and myths has taught us that anxiety about one’s eyes…is a substitute for the dread of being castrated” (36). Here he’s referring to another facet of the Oedipus complex which theorizes that the child keeps his sexual desire for his mother in check because of the threat of castration posed by his jealous father. It is through this fear of his father, which transforms into grudging respect, and then into emulation, that the boy learns his role as a male in society. And it is through the act of repressing his sexual desire for his mother that he first develops his unconscious, which will grow into a general repository of unwanted desires and memories (Eagleton 134).

But what led Freud to this theory of repression, which suggests that we have the ability to willfully banish troubling incidents and urges to some portion of our minds to which we have no conscious access? He must have arrived at an understanding of this process in the same stroke that led to his conclusions about the Oedipus complex, because, in order to put forth the idea that as children we all hated one parent and wanted to have sex with the other, he had to contend with the fact that most people find the idea repulsive. What accounts for the dramatic shift between childhood desires and those of adults? What accounts for our failure to remember the earlier stage? The concept of repression had to be firmly established before Freud could make such claims. Of course, he could have simply imported the idea from another scientific field, but there is no evidence he did so. So it seems that he relied on the same methods—psychoanalysis, dream interpretation, and the study of myths and legends—to arrive at his theories as he did to test them. Inspiration and confirmation were one and the same.

Notwithstanding Freud’s claim that the emotional power of the Oedipus legend “can only be understood” if his hypothesis about young boys wanting to have sex with their mothers and kill their fathers has “universal validity,” there is at least one alternative hypothesis which has the advantage of not being bizarre. It could be that the point of Sophocles’s drama was that fate is so powerful it can bring about exactly the eventualities we most desire to avoid. What moves audiences and readers is not any sense of recognition of repressed desires, but rather compassion for the man who despite, even because of, his heroic efforts fell into this most horrible of traps. (Should we assume that the enduring popularity of W.W. Jacobs’s story, “The Monkey’s Paw,” which tells a similar story of fate about a couple who inadvertently wish their son dead, proves that all parents want to kill their children?) The story could be moving because it deals with events we would never want to happen. It is true, however, that this hypothesis fails to account for why people enjoy watching such a tragedy being enacted—but then so does Freud’s. If we have spent our conscious lives burying the memory of our childhood desires because they are so unpleasant to contemplate, it makes little sense that we should find pleasure in seeing those desires acted out on stage. And assuming this alternative hypothesis is at least as plausible as Freud’s, we are left with no evidence whatsoever to support his theory of repressed childhood desires.

To be fair, Freud did look beyond the dreams and myths of men of European descent to test the applicability of his theories. In his book Totem and Taboo he inventories “savage” cultures and adduces the universality among them of a taboo against incest as further proof of the Oedipus complex. He even goes so far as to cite a rival theory put forth by a contemporary:

Westermarck has explained the horror of incest on the ground that “there is an innate aversion to sexual intercourse between persons living very closely together from early youth, and that, as such persons are in most cases related by blood, this feeling would naturally display itself in custom and law as a horror of intercourse between near kin.” (152)

To dismiss Westermarck’s theory, Freud cites J. G. Frazer, who argues that laws exist only to prevent us from doing things we would otherwise do or prod us into doing what we otherwise would not. That there is a taboo against incest must therefore signal that there is no innate aversion to incest but rather a proclivity for it. Here it must be noted that the incest Freud had in mind includes not just lust for the mother but for sisters as well. “Psychoanalysis has taught us,” he writes, again vaguely referencing his clinical method, “that a boy’s earliest choice of objects for his love is incestuous and that those objects are forbidden ones—his mother and sister” (22). Frazer’s argument is compelling, but Freud’s test of the applicability of his theories is not the same as a test of their validity (though it seems customary in literary criticism to conflate the two).

As linguist and cognitive neuroscientist Steven Pinker explains in How the Mind Works, in tests of validity Westermarck beats Freud hands down. Citing the research of Arthur Wolf, he explains that, without setting out to do so, several cultures have conducted experiments on the nature of incest aversion. Israeli kibbutzim, in which children grew up in close proximity to several unrelated agemates, and the Chinese and Taiwanese practice of adopting future brides for sons and raising them together as siblings are just two such natural experiments that Wolf examined. When children from the kibbutzim reached sexual maturity, even though there was no discouragement from adults for them to date or marry, they showed a marked distaste for each other as romantic partners. And compared to more traditional marriages, those in which the bride and groom grew up in conditions mimicking siblinghood were overwhelmingly “unhappy, unfaithful, unfecund, and short” (459). The effect of proximity in early childhood seems to apply to parents as well, at least when it comes to fathers’ sexual feelings for their daughters. Pinker cites research showing that the fathers who sexually abuse their daughters tend to be the ones who have spent the least time with them as infants, while the stepdads who actually do spend a lot of time with their stepdaughters are no more likely to abuse them than biological fathers are. These studies not only favor Westermarck’s theory; they also provide a counter to Frazer’s objection to it. Human societies are so complex that we often grow up in close proximity with people who are unrelated, or don’t grow up with people who are, and therefore it is necessary for there to be a cultural proscription—a taboo—against incest in addition to the natural mechanism of aversion.

Among biologists and anthropologists, what is now called the Westermarck effect has displaced Freud’s Oedipus complex as the best explanation for incest avoidance. Since Freud’s theory of childhood sexual desires has been shown to be false, the question arises of where this leaves his concept of repression. According to literary critic—and critic of literary criticism—Frederick Crews, repression came to serve, in the 1980s and ’90s, a role equivalent to that of the “spectral evidence” used in the Salem witch trials. Several psychotherapists latched on to the idea that children repress memories of traumatic events, precisely because the information is too terrible for them to consciously handle, and that those memories can later be recovered intact. And the testimony of these therapists has led to many convictions and prison sentences. But the evidence for this notion of repression is solely clinical—modern therapists base their conclusions on interactions with patients, just as Freud did. Unfortunately, researchers outside the clinical setting are unable to find any phenomenon answering to the description of repressed but retrievable memories. Crews points out that there are plenty of people who are known to have survived traumatic experiences: “Holocaust survivors make up the most famous class of such subjects, but whatever group or trauma is chosen, the upshot of well-conducted research is always the same” (158). That upshot:

Unless a victim received a physical shock to the brain or was so starved or sleep deprived as to be thoroughly disoriented at the time, those experiences are typically better remembered than ordinary ones. (159, emphasis in original)

It seems that here, as with incest aversion, Freud got the matter exactly wrong—and with devastating fallout for countless families and communities. But Freud was vague about whether what gets repressed is memories of actual events or merely fantasies. The crux of his argument was that we repress unacceptable and inappropriate drives and desires.

And the concept of repressed desires is integral to the use of psychoanalysis in literary criticism. In The Interpretation of Dreams, Freud distinguishes between the manifest content of dreams and their latent content. Having been exiled from consciousness, troublesome desires press against the bounds of the ego, Freud’s notional agent in charge of tamping down uncivilized urges. In sleep, the ego relaxes, allowing the desires of the id, from which all animal drives emerge, an opportunity for free play. Even in dreams, though, full transparency of the id would be too disconcerting for the conscious mind to accept, so the ego disguises, with a kind of code, all the elements that surface. Breaking this code is the work of psychoanalytic dream interpretation. It is also the basis for Freud’s analysis of myths and the underlying principle of Freudian literary criticism. (In fact, the distinction between manifest and latent content is fundamental to many schools of literary criticism, though they each have their own version of the true nature of the latent content.) Science writer Steven Johnson compares Freud’s conception of repressed impulses to compressed gas seeping through the cracks of the ego’s defenses, emerging as slips of the tongue or baroque dream imagery. “Build up enough pressure in the chamber, though, and the whole thing explodes—into uncontrolled hysteria, anxiety, madness” (191). The release of pressure, as it were, through dreams and through various artistic media, is sanity-saving.

Johnson’s book, Mind Wide Open: Your Brain and the Neuroscience of Everyday Life, takes the popular currency of Freud’s ideas as a starting point for his exploration of modern science. The subtitle is an homage to Freud’s influential work The Psychopathology of Everyday Life. Perhaps because he is not a working scientist, Johnson is able to look past the shaky methodological foundations of psychoanalysis and examine how accurately its tenets map onto the modern findings of neuroscience. Though he sees areas of convergence, like the idea of psychic conflict and that of the unconscious in general, he has to admit in his conclusion that “the actual unconscious doesn’t quite look like the one Freud imagined” (194). Rather than a repository of repressed fantasies, the unconscious is more of a store of implicit, or procedural, knowledge. Johnson explains, “Another word for unconscious is ‘automated’—the things you do so well you don’t even notice doing them” (195). And what happens to all the pressurized psychic energy resulting from our repression of urges? “This is one of those places,” Johnson writes, “where Freud’s metaphoric scaffolding ended up misleading him” (198). Instead of a steam engine, neuroscientists view the brain as a type of ecosystem, with each module competing for resources; if a module goes unused—its neurons failing to fire—the strength of its connections diminishes.

What are the implications of this new conception of how the mind works for the interpretation of dreams and works of art? Without the concept of repressed desires, is it still possible to maintain a distinction between the manifest and latent content of mental productions? Johnson suggests that there are indeed meaningful connections that can be discovered in dreams and slips of the tongue. To explain them, he points again to the neuronal ecosystem, and to the theory that “Neurons that fire together wire together.” He writes:

These connections are not your unconscious speaking in code. They’re much closer to free-associating. These revelations aren’t the work of some brilliant cryptographer trying to get a message to the frontlines without enemy detection. They’re more like echoes, reverberations. One neuronal group fires, and a host of others join in the chorus. (200-201)

Mind Wide Open represents Johnson’s attempt to be charitable to the century-old, and now popularly recognized, ideas of psychoanalysis. But in this description of the shortcomings of Freud’s understanding of the unconscious and how it reveals itself, he effectively discredits the epistemological underpinnings of any application of psychoanalysis to art. It’s not only the content of the unconscious that Freud got outrageously wrong, but the very nature of its operations. And if Freud could so confidently look into dreams and myths and legends and find in them material that simply wasn’t there, it is cause for us to marvel at the power of his preconceptions to distort his perceptions.
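
Since so much in these last paragraphs hangs on the Hebbian slogan that neurons that fire together wire together, a toy version of that update rule may make the contrast with Freud’s pressurized-steam model concrete. This is only an illustrative sketch in Python; the learning rate, decay rate, and activity patterns are invented for the example and come from neither Johnson nor the research he summarizes:

```python
# A toy Hebbian update: connections strengthen when two units are active
# together and decay when they go unused. All numbers here are arbitrary
# illustrative choices, not parameters from any study.
import numpy as np

def hebbian_step(weights, activity, lr=0.1, decay=0.02):
    """Strengthen co-active connections; let idle ones fade."""
    weights = weights + lr * np.outer(activity, activity)  # fire together, wire together
    weights = weights * (1 - decay)                        # disuse weakens connections
    np.fill_diagonal(weights, 0.0)                         # ignore self-connections
    return weights

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    w = np.zeros((5, 5))
    for _ in range(200):
        pattern = np.zeros(5)
        pattern[[0, 1]] = 1.0          # units 0 and 1 habitually fire together
        if rng.random() < 0.1:
            pattern[4] = 1.0           # unit 4 joins in only occasionally
        w = hebbian_step(w, pattern)
    print(np.round(w, 2))              # strong 0-1 link; faint links to unit 4
```

On this picture, a “revelation” in a dream is just a strongly wired association echoing when a related group of neurons fires, not a coded message slipping past a censor.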

Ultimately, psychoanalysis failed to move from the realm of proto-science to that of methodologically well-founded science, relegated instead to the backwater of pseudoscience by the hubris of its founder. And yet, had Freud relied on good science, his program of interpreting literature in terms of the basic themes of human nature, and even his willingness to let literature inform his understanding of those themes, might have matured into a critical repertoire free of the obscurantist excesses and reality-denying absurdities of postmodernism. (Anthropologist Clifford Geertz once answered a postmodernist critic of his work by acknowledging that perfect objectivity is indeed impossible, but then so is a perfectly germ-free operating room; that shouldn’t stop us from trying to be as objective and as sanitary as our best methods allow.)

            Critics could feasibly study the production of novels by not just one or a few authors but a sample large enough—possibly extending across cultural divides—to analyze statistically. They could pose questions systematically to even larger samples of readers. And they could identify the themes in any poem or novel that demonstrate the essential (in the statistical sense) concerns of humanity studied by behavioral scientists, themes like status-seeking, pair-bonding, jealousy, and even the overwhelming strength of the mother-infant bond. “The human race has produced only one successfully validated epistemology,” writes Frederick Crews (362). That epistemology encompasses a great variety of specific research practices, but they all hold as inviolable the common injunction “to make a sharp separation between hypothesis and evidence” (363). Despite his claims to scientific legitimacy, Freud failed to distinguish himself from other critical theorists because he relied too heavily on his own intuitive powers, a reliance that all but guarantees succumbing to the natural human tendency to discover in complex fields precisely what you’ve come to them seeking.
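
To give one concrete, if crude, sense of how such a study might begin, the sketch below counts how often keywords tied to a few of those themes occur across a folder of plain-text novels and reports a per-thousand-word rate. The theme lists, the folder name, and the keyword-counting measure itself are assumptions invented for the illustration; a serious study would need validated coding schemes and human raters, not bare word counts:

```python
# A naive corpus pass: count keyword hits for a handful of themes across a
# folder of plain-text novels. Keyword lists, folder name, and the
# per-1,000-word measure are illustrative assumptions only.
from collections import Counter
from pathlib import Path
import re

THEME_KEYWORDS = {
    "status-seeking": ["status", "rank", "prestige", "ambition"],
    "pair-bonding": ["marriage", "lover", "fidelity", "devotion"],
    "jealousy": ["jealous", "envy", "rival", "betrayal"],
}

def tokenize(text):
    """Lowercase word list; good enough for a rough count."""
    return re.findall(r"[a-z']+", text.lower())

def theme_counts(words):
    """Keyword hits per theme for one novel's word list."""
    return Counter({theme: sum(words.count(k) for k in keywords)
                    for theme, keywords in THEME_KEYWORDS.items()})

def corpus_theme_rates(folder):
    """Hits per 1,000 words for each theme, aggregated over all .txt files."""
    totals, word_total = Counter(), 0
    for path in Path(folder).glob("*.txt"):
        words = tokenize(path.read_text(encoding="utf-8", errors="ignore"))
        word_total += len(words)
        totals.update(theme_counts(words))
    return {t: 1000 * totals[t] / max(word_total, 1) for t in THEME_KEYWORDS}

if __name__ == "__main__":
    print(corpus_theme_rates("novels"))  # hypothetical folder of plain-text novels
```

Even something this naive would let rival interpretive claims be checked against the same corpus, which is the point of Crews’s injunction to keep hypothesis and evidence separate.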

Also read:

WHY SHAKESPEARE NAUSEATED DARWIN: A REVIEW OF KEITH OATLEY'S "SUCH STUFF AS DREAMS"

NICE GUYS WITH NOTHING TO SAY: BRETT MARTIN’S DIFFICULTY WITH “DIFFICULT MEN” AND THE FAILURE OF ARTS SCHOLARSHIP

GETTING GABRIEL WRONG: PART 1 OF 3

Dennis Junk

The Mental Illness Zodiac: Why the DSM 5 Won't Be Anything But More Pseudoscience

That the diagnostic categories are necessarily ambiguous and can’t be tied to any objective criteria like biological markers has been much discussed, as have the corruptions of the mental health industry, including clinical researchers who make their livings treating the same disorders they lobby to have included in the list of official diagnoses. What’s not being discussed, however, is the propensity in humans to take on roles, to play parts, even tragic ones, even horrific ones, without being able to recognize they’re doing so.

            Thinking you can diagnose psychiatric disorders using checklists of symptoms means taking for granted a naïve model of the human mind and human behavior. How discouraging to those in emotional distress, or to those doubting their own sanity, that the guides they turn to for help and put their faith in to know what’s best for them embrace this model. The DSM has taken it for granted since its inception, and the latest version, the DSM 5, due out next year, despite all the impediments to practical usage it does away with, despite all the streamlining, and despite all the efforts to adhere to common sense, only perpetuates the mistake. That the diagnostic categories are necessarily ambiguous and can’t be tied to any objective criteria like biological markers has been much discussed, as have the corruptions of the mental health industry, including pharmaceutical companies’ reluctance to publish failed trials for their blockbuster drugs, and clinical researchers who make their livings treating the same disorders they lobby to have included in the list of official diagnoses. Indeed, there’s good evidence that prognoses for mental disorders have actually gotten worse over the past century. What’s not being discussed, however, is the propensity in humans to take on roles, to play parts, even tragic ones, even horrific ones, without being able to recognize they’re doing so.

            In his lighthearted, mildly satirical but severely important book on self-improvement 59 Seconds: Change Your Life in Under a Minute, psychologist Richard Wiseman describes an experiment he conducted for the British TV show The People Watchers. A group of students spending an evening in a bar with their friends was given a series of tests and then given access to an open bar. The tests included memorizing a list of numbers, walking along a line on the floor, and catching a ruler dropped by experimenters as quickly as possible. Memory, balance, and reaction time—all areas in which our performance predictably diminishes as we drink. The outcomes of the tests were well in keeping with expectations as they were repeated over the course of the evening. All the students did progressively worse the more they drank. And the effects of the alcohol were consistent throughout the entire group of students. It turns out, however, that only half of them were drinking alcohol.

At the start of the study, Wiseman had given half the participants a blue badge and the other half a red badge. The bartenders poured regular drinks for everyone with red badges, but for those with blue ones they made drinks which looked, smelled, and tasted like their alcoholic counterparts but were actually non-alcoholic. Now, were the students with the blue badges faking their drunkenness? They may have been hamming it up for the cameras, but that would be true of the ones who were actually drinking too. What they were doing instead was taking on the role—you might even say taking on the symptoms—of being drunk. As Wiseman explains,

Our participants believed that they were drunk, and so they thought and acted in a way that was consistent with their beliefs. Exactly the same type of effect has emerged in medical experiments when people exposed to fake poison ivy developed genuine rashes, those given caffeine-free coffee became more alert, and patients who underwent a fake knee operation reported reduced pain from their “healed” tendons. (204)

After being told they hadn’t actually consumed any alcohol, the students in the blue group “laughed, instantly sobered up, and left the bar in an orderly and amused fashion.” But not all the natural role-playing humans engage in is this innocuous and short-lived.

            In placebo studies like the one Wiseman conducted, participants are deceived. You could argue that actually drinking a convincing replica of alcohol or taking a realistic-looking pill is the important factor behind the effects. People who seek treatment for psychiatric disorders aren’t tricked in this way, so what would cause them to take on the role associated with, say, depression or bipolar disorder? But plenty of research shows that pills or potions aren’t necessary. We take on different roles in different settings and circumstances all the time. We act much differently at football games and rock concerts than we do at work or school. These shifts are deliberate, though, and we’re aware of them, at least to some degree, when they occur. But many cues are more subtle. It turns out that just being made aware of the symptoms of a disease can make you suspect that you have it. What’s called Medical Student Syndrome afflicts those studying both medical and psychiatric diagnoses. For the most part, you either have a biological disease or you don’t, so the belief that you have one is contingent on the heightened awareness that comes from studying the symptoms. But is there a significant difference between believing you’re depressed and having depression? The answer, according to checklist diagnosis, is no.

            In America, we all know the symptoms of depression because we’re bombarded with commercials, like the one that uses squiggly circle faces to explain that it’s caused by a deficit of the neurotransmitter serotonin—a theory that had already been ruled out by the time that commercial began to air. More insidious, though, are the portrayals of psychiatric disorders in movies, TV series, or talk shows—more insidious because they embed the role-playing instructions in compelling stories. These shows profess to be trying to raise awareness so more people will get help to end their suffering. They profess to be trying to remove the stigma so people can talk about their problems openly. They profess to be trying to help people cope. But, from a perspective of human behavior that acknowledges the centrality of role-playing to our nature, what all these shows are actually doing is shilling for the mental health industry, and they are probably helping to cause much of the suffering they claim to be trying to assuage.

            Multiple Personality Disorder, or Dissociative Identity Disorder as it’s now called, was an exceedingly rare diagnosis until the late 1970s and early 1980s, when its incidence spiked drastically. Before the spike, there were only ever around a hundred cases. Between 1985 and 1995, there were around 40,000 new cases. What happened? There was a book and a miniseries called Sybil, starring Sally Field, that aired in 1976. Much of the real-life story on which Sybil was based has been cast into doubt through further investigation (or has been shown to be completely fabricated). But if you’re one to give credence to the validity of the DID diagnosis (and you shouldn’t), then we can look at another strange behavioral phenomenon whose incidence spiked after a certain movie hit the box offices in the 1970s. Prior to the release of The Exorcist, the Catholic church had pretty much consigned the eponymous ritual to the dustbin of history. Lately, though, the church has had to dust it off.

The Skeptic’s Dictionary says of a TV series devoted to the exorcism ritual (or the performance of it, rather) on the Sci-Fi Channel,

The exorcists' only prop is a Bible, which is held in one hand while they talk down the devil in very dramatic episodes worthy of Jerry Springer or Jenny Jones. The “possessed” could have been mentally ill, actors, mentally ill actors, drug addicts, mentally ill drug addicts, or they may have been possessed, as the exorcists claimed. All the participants shown being exorcized seem to have seen the movie “The Exorcist” or one of the sequels. They all fell into the role of husky-voiced Satan speaking from the depths, who was featured in the film. The similarities in speech and behavior among the “possessed” has led some psychologists such as Nicholas Spanos to conclude that both “exorcist” and “possessed” are engaged in learned role-playing.

If people can somehow inadvertently fall into the role of having multiple personalities or being possessed by demons, it’s not hard to imagine them hearing about, say, bipolar, briefly worrying that they may have some of the symptoms, and then subsequently taking on the role, even the identity of someone battling bipolar disorder.

            Psychologist Dan McAdams theorizes that everyone creates his or her own “personal myth,” which serves to give life meaning and trajectory. The character we play in our own myth is what we recognize as our identity, what we think of when we try to answer the question “Who am I?” in all its profundity. But, as McAdams explains in The Stories We Live By: Personal Myths and the Making of the Self,

Stories are less about facts and more about meanings. In the subjective and embellished telling of the past, the past is constructed—history is made. History is judged to be true or false not solely with respect to its adherence to empirical fact. Rather, it is judged with respect to such narrative criteria as “believability” and “coherence.” There is a narrative truth in life that seems quite removed from logic, science, and empirical demonstration. It is the truth of a “good story.” (28-9)

The problem when it comes to diagnosing psychiatric disorders is that the checklist approach tries to use objective, scientific criteria, when the only answers it will ever get will be in terms of narrative criteria. But why, if people are prone to taking on roles, wouldn’t they take on something pleasant, like the role of a king or a princess?

            Since our identities are made up of the stories we tell about ourselves—even to ourselves—it’s important that those stories be compelling. And if nothing ever goes wrong in the stories we tell, well, they’d be pretty boring. As Jonathan Gottschall writes in The Storytelling Animal: How Stories Make Us Human,

This need to see ourselves as the striving heroes of our own epics warps our sense of self. After all, it’s not easy to be a plausible protagonist. Fiction protagonists tend to be young, attractive, smart, and brave—all the things that most of us aren’t. Fiction protagonists usually live interesting lives that are marked by intense conflict and drama. We don’t. Average Americans work retail or cubicle jobs and spend their nights watching protagonists do interesting things on television. (171)

            Listen to the way a talk show host like Oprah talks about mental disorders, and count how many times in an episode she congratulates the afflicted guests for their bravery in keeping up the struggle. Sometimes, the word hero is even bandied about. Troublingly, the people who cast themselves as heroes spreading awareness, countering stigmas, and helping people cope even like to do really counterproductive things like publishing lists of celebrities who supposedly suffer from the disorder in question. Think you might have bipolar disorder? Kay Redfield Jamison thinks you’re in good company. In her book Touched with Fire, she suggests everyone from rocker Kurt Cobain to fascist Mel Gibson is in that same boatful of heroes.

            The reason medical researchers insist a drug must not only be shown to make people feel better but must also be shown to work better than a placebo is that even a sham treatment will make people report feeling better between 60 and 90% of the time, depending on several well-documented factors. What psychiatrists fail to acknowledge is that the placebo dynamic can be turned on its head—you can give people illnesses, especially mental illnesses, merely by suggesting they have the symptoms, or even by increasing their awareness of and attention to those symptoms past a certain threshold. If you tell someone a fact about themselves, they’ll usually believe it, especially if you claim that a test or an official diagnostic manual allowed you to determine it. This is how frauds convince people they’re psychics. An experiment you can do yourself involves giving horoscopes to a group of people and asking them how true the readings ring. After most of them endorse their reading, reveal that you switched the labels and that they all in fact read the wrong sign’s description.

            Psychiatric diagnoses, to be considered at all valid, would need to be double-blind, just like drug trials: the patient shouldn’t know the diagnosis being considered; the rater shouldn’t know the diagnosis being considered; only a final scorer, who has no contact with the patient, should determine the diagnosis. The categories themselves are, however, equally problematic. In order to be properly established as valid, they need to have predictive power. Trials would have to be conducted in which subjects assigned to the prospective categories using double-blind protocols were monitored for long periods of time to see if their behavior adheres to what’s expected of the disorder. For instance, bipolar is supposedly marked by cyclical mood swings. Where are the mood diary studies? (The last time I looked for them was six months ago, so if you know of any, please send a link.) Smart phones offer all kinds of possibilities for monitoring and recording behaviors. Why aren’t they being used to do actual science on mental disorders?
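
As a gesture toward the kind of mood-diary study the previous paragraph asks for, here is a minimal sketch of how daily self-ratings, once collected, could be checked for the cyclical pattern the bipolar diagnosis predicts. The data are simulated, and the autocorrelation test, the 90-day lag window, and the 60-day cycle are assumptions chosen purely for illustration, not an established protocol:

```python
# A toy check for periodicity in daily mood ratings: find the lag (in days)
# at which the series best correlates with itself. The simulated 60-day
# cycle, the noise level, and the 90-day search window are arbitrary.
import numpy as np

def autocorrelation(series, lag):
    """Pearson correlation of the series with itself shifted by `lag` days."""
    return float(np.corrcoef(series[:-lag], series[lag:])[0, 1])

def strongest_cycle(series, max_lag=90):
    """Return the lag with the highest autocorrelation, plus that correlation."""
    best = max(range(2, max_lag + 1), key=lambda lag: autocorrelation(series, lag))
    return best, autocorrelation(series, best)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    days = np.arange(365)
    # Simulated diary: a 60-day mood cycle buried in day-to-day noise.
    diary = np.sin(2 * np.pi * days / 60) + rng.normal(0, 0.5, days.size)
    lag, r = strongest_cycle(diary)
    print(f"strongest cycle is about {lag} days (r = {r:.2f})")
```

The point is not that this particular test is the right one, only that once behavior is actually recorded, a diagnosis becomes a prediction that can fail.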

            To research the role-playing dimension of mental illness, one (completely unethical) approach would be to design from scratch a really bizarre disorder, publicize its symptoms, maybe make a movie starring Mel Gibson, and monitor incidence rates. Let’s call it Puppy Pregnancy Disorder. We all know dog saliva is chock-full of gametes, right? So, let’s say the disorder is caused when a canine, in a state of sexual arousal of course, bites the victim, thus impregnating her—or even him. Let’s say it affects men too. Wouldn’t that be funny? The symptoms would be abdominal pain, and something just totally out there, like, say, small pieces of puppy feces showing up in your urine. Now, this might be too outlandish, don’t you think? There’s no way we could get anyone to believe this. Unfortunately, I didn’t really make this up. And there are real people in India who believe they have Puppy Pregnancy Disorder.

Also read:

THE STORYTELLING ANIMAL: A LIGHT READ WITH WEIGHTY IMPLICATIONS 

And:

THE SELF-TRANSCENDENCE PRICE TAG: A REVIEW OF ALEX STONE'S FOOLING HOUDINI

Dennis Junk

What's the Point of Difficult Reading?

For every John Galt, Tony Robbins, and Scheherazade, we may need at least half a Proust. We are still, however, left with quite a dilemma. Some authors really are just assholes who write worthless tomes designed to trick you into wasting your time. But some books that seem impenetrable on the first attempt will reward your efforts to decipher them.

You sit reading the first dozen or so pages of some celebrated classic and gradually realize that having to sort out how the ends of the long sentences fix to their beginnings is taking just enough effort to distract you entirely from the setting or character you’re supposed to be getting to know. After a handful of words you swear are made up and a few tangled metaphors you find yourself riddling over with nary a resolution, the dread sinks in. Is the whole book going to be like this? Is it going to be one of those deals where you get to what’s clearly meant to be a crucial turning point in the plot but for you is just another riddle without a solution, sending you paging back through the forest of verbiage in search of some key succession of paragraphs you spaced out while reading the first time through? Then you wonder if you’re missing some other kind of key, like maybe the story’s an allegory, a reference to some historical event like World War II or some Revolution you once had to learn about but have since lost all recollection of. Maybe the insoluble similes are allusions to some other work you haven’t read or can’t recall. In any case, you’re not getting anything out of this celebrated classic but frustration leading to the dual suspicion that you’re too ignorant or stupid to enjoy great literature and that the whole “great literature” thing is just a conspiracy to trick us into feeling dumb so we’ll defer to the pseudo-wisdom of Ivory Tower elites.

If enough people of sufficient status get together and agree to extol a work of fiction, they can get almost everyone else to agree. The readers who get nothing out of it but frustration and boredom assume that since their professors or some critic in a fancy-pants magazine or the judges of some literary award committee think it’s great they must simply be missing something. They dutifully continue reading it, parrot a few points from a review that sound clever, and afterward toe the line by agreeing that it is indeed a great work of literature, clearly, even if it doesn’t speak to them personally. For instance, James Joyce’s Ulysses, utterly nonsensical to anyone without at least a master’s degree, tops the Modern Library’s list of 100 best novels in the English language. Responding to the urging of his friends to write out an explanation of the novel, Joyce scoffed, boasting,

I’ve put in so many enigmas and puzzles that it will keep the professors busy for centuries arguing over what I meant, and that’s the only way of ensuring one’s immortality.

He was right. To this day, professors continue to love him even as Ulysses and the even greater monstrosity Finnegans Wake do nothing but bore and befuddle everyone else—or else, more fittingly, sit inert or unchecked-out on the shelf, gathering well-deserved dust.

Joyce’s later novels are not literature; they are lengthy collections of loosely connected literary puzzles. But at least his puzzles have actual solutions—or so I’m told. Ulysses represents the apotheosis of the tradition in literature called modernism. What came next, postmodernism, is even more disconnected from the universal human passion for narrative. Even professors aren’t sure what to do with it, so they simply throw their hands up, say it’s great, and explain that the source of its greatness is its very resistance to explanation. Jonathan Franzen, whose 2001 novel The Corrections represented a major departure from the postmodernism he began his career experimenting with, explained the following year in The New Yorker how he’d turned away from the tradition. He’d been reading the work of William Gaddis “as a kind of penance” (101) and not getting any meaning out of it. Of the final piece in the celebrated author’s oeuvre, Franzen writes,

The novel is an example of the particular corrosiveness of literary postmodernism. Gaddis began his career with a Modernist epic about the forgery of masterpieces. He ended it with a pomo romp that superficially resembles a masterpiece but punishes the reader who tries to stay with it and follow its logic. When the reader finally says, Hey, wait a minute, this is a mess, not a masterpiece, the book instantly morphs into a performance-art prop: its fraudulence is the whole point! And the reader is out twenty hours of good-faith effort. (111)

In other words, reading postmodern fiction means not only forgoing the rewards of narrative, having them replaced by the more taxing endeavor of solving multiple riddles in succession; it means discovering that those riddles don’t even have answers. What’s the point of reading this crap? Exactly. Get it?

You can dig deeper into the meaningless meanderings of pomos and discover there is in fact an ideology inspiring all the infuriating inanity. The super smart people who write and read this stuff point to the willing, eager complicity of the common reader in the propagation of all the lies that sustain our atrociously unjust society (but atrociously unjust compared to what?). Franzen refers to this as the Fallacy of the Stupid Reader,

wherein difficulty is a “strategy” to protect art from cooptation and the purpose of art is to “upset” or “compel” or “challenge” or “subvert” or “scar” the unsuspecting reader; as if the writer’s audience somehow consisted, again and again, of Charlie Browns running to kick Lucy’s football; as if it were a virtue in a novelist to be the kind of boor who propagandizes at friendly social gatherings. (109)

But if the author is worried about art becoming a commodity, does making the art shitty really amount to a solution? And if the goal is to make readers rethink something they take for granted, why not bring the matter up directly, or have a character wrestle with it, or have a character argue with another character about it? The sad fact is that these authors probably just suck, that, as Franzen suspects, “literary difficulty can operate as a smoke screen for an author who has nothing interesting, wise, or entertaining to say” (111).

Not all difficulty in fiction is a smoke screen though. Not all the literary emperors are naked. Franzen writes that “there is no headache like the headache you get from working harder on deciphering a text than the author, by all appearances, has worked on assembling it.” But the essay, titled “Mr. Difficult,” begins with a reader complaint sent not to Gaddis but to Franzen himself. And the reader, a Mrs. M. from Maryland, really gives him the business:

Who is it that you are writing for? It surely could not be the average person who enjoys a good read… The elite of New York, the elite who are beautiful, thin, anorexic, neurotic, sophisticated, don’t smoke, have abortions tri-yearly, are antiseptic, live in penthouses, this superior species of humanity who read Harper’s and The New Yorker. (100)

In this first part of the essay, Franzen introduces a dilemma that sets up his explanation of why he turned away from postmodernism—he’s an adherent of the “Contract model” of literature, whereby the author agrees to share, on equal footing, an entertaining or in some other way gratifying experience, as opposed to the “Status model,” whereby the author demonstrates his or her genius and if you don’t get it, tough. But his coming to a supposed agreement with Mrs. M. about writers like Gaddis doesn’t really resolve Mrs. M.’s conflict with him.

The Corrections, after all, the novel she was responding to, represents his turning away from the tradition Gaddis wrote in. (It must be said, though, that Freedom, Franzen’s next novel, is written in a still more accessible style.)

The first thing we must do to respond properly to Mrs. M. is break down each of Franzen’s models into two categories. The status model includes writers like Gaddis whose difficulty serves no purpose but to frustrate and alienate readers. But Franzen’s own type specimen for this model is Flaubert, much of whose writing, though difficult at first, rewards any effort to re-read and further comprehend with a more profound connection. So it is with countless other writers—the one behind number two on the Modern Library’s ranking, for instance: Fitzgerald and Gatsby. As for the contract model, Franzen admits,

Taken to its free-market extreme, Contract stipulates that if a product is disagreeable to you the fault must be the product’s. If you crack a tooth on a hard word in a novel, you sue the author. If your professor puts Dreiser on your reading list, you write a harsh student evaluation… You’re the consumer; you rule. (100)

Franzen, in declaring himself a “Contract kind of person,” assumes that the free-market extreme can be dismissed for its extremity. But Mrs. M. would probably challenge him on that. For many readers, particularly right-leaning ones, the market not only can but should be relied on to determine which books are good and which ones belong in some tiny niche. When the Modern Library conducted a readers’ poll to create a popular ranking to balance the one made by experts, the ballot was stuffed by Ayn Rand acolytes and Scientologists. Mrs. M. herself leaves little doubt as to her political sympathies. For her and her fellow travelers, things like literature departments, National Book Awards—like the one The Corrections won—Nobels, and Pulitzers are all an evil form of intervention into the sacred workings of the divine free market: un-American, sacrilegious, communist. According to this line of thinking, authors aren’t much different from whores—except of course literal whoring is condemned in the Bible (except when it isn’t).

A contract with readers who score high on the personality dimension of openness to new ideas and experiences (who tend to be liberal), those who have spent a lot of time in the past reading books like The Great Gatsby or Heart of Darkness or Lolita (the horror!), those who read enough to have developed finely honed comprehension skills—that contract is going to look quite a bit different from one with readers who attend Beck University, those for whom Atlas Shrugged is the height of literary excellence. At the same time, though, the cult of self-esteem is poisoning schools and homes with the idea that suggesting that a student or son or daughter is anything other than a budding genius is a form of abuse. Heaven forbid a young person feel judged or criticized while speaking or writing. And if an author makes you feel the least bit dumb or ignorant, well, it’s an outrage—heroes like Mrs. M. to the rescue.

One of the problems with the cult of self-esteem is that anticipating criticism tends to make people more, not less, creative. And the link between low self-esteem and mental disorders is almost purely mythical. High self-esteem is correlated with school performance, but as far as researchers can tell it’s the performance causing the esteem, not the other way around. More invidious, though, is the tendency to view anything that takes a great deal of education or intelligence to accomplish as an affront to everyone less educated or intelligent. Conservatives complain endlessly about class warfare and envy of the rich—the financially elite—but they have no qualms about decrying intellectual elites and condemning them for flaunting their superior literary achievements. They see the elitist mote in the eye of Nobel laureates without noticing the beam in their own.

         What’s the point of difficult reading? Well, what’s the point of running five or ten miles? What’s the point of eating vegetables as opposed to ice cream or Doritos? Difficulty need not preclude enjoyment. And discipline in the present is often rewarded in the future. It very well may be that the complexity of the ideas you’re capable of understanding is influenced by how many complex ideas you attempt to understand. No matter how vehemently true believers in the magic of markets insist otherwise, markets don’t have minds. And though an individual’s intelligence need not be fixed, a good way to ensure children never get any smarter than they already are is to make them feel fantastically wonderful about their mediocrity. We just have to hope that despite these ideological traps there are enough people out there determined to wrap their minds around complex situations depicted in complex narratives about complex people told in complex language, people who will in the process develop the types of minds and intelligence necessary to lead the rest of our lazy asses into a future that’s livable and enjoyable. For every John Galt, Tony Robbins, and Scheherazade, we may need at least half a Proust. We are still, however, left with quite a dilemma. Some authors really are just assholes who write worthless tomes designed to trick you into wasting your time. But some books that seem impenetrable on the first attempt will reward your efforts to decipher them. How do we get the rewards without wasting our time?

Read More
Dennis Junk

Madness and Bliss: Critical vs Primitive Readings in A.S. Byatt's Possession: A Romance 2

Critics responded to A.S. Byatt’s challenge to their theories in her book Possession by insisting that the work fails to achieve high-brow status and fails to do anything but bolster old-fashioned notions about stories and language—not to mention the roles of men and women. What they don’t realize is how silly their theories come across to the non-indoctrinated.

Read part one.

The critical responses to the challenges posed by Byatt in Possession fit neatly within the novel’s satire. Louise Yelin, for instance, unselfconsciously divides the audience for the novel into “middlebrow readers” and “the culturally literate” (38), placing herself in the latter category. She overlooks Byatt’s challenge to her methods of criticism and the ideologies underpinning them, for the most part, and suggests that several of the themes, like ventriloquism, actually support poststructuralist philosophy. Still, Yelin worries about the novel’s “homophobic implications” (39). (A lesbian, formerly straight character takes up with a man in the end, and Christabel LaMotte’s female lover commits suicide after the dissolution of their relationship, but no one actually expresses any fear or hatred of homosexuals.) Yelin then takes it upon herself to “suggest directions that our work might take” while avoiding the “critical wilderness” Byatt identifies. She proposes a critical approach to a novel that “exposes its dependencies on the bourgeois, patriarchal, and colonial economies that underwrite” it (40). And since all fiction fails to give voice to one or another oppressed minority, it is the critic’s responsibility to “expose the complicity of those effacements in the larger order that they simultaneously distort and reproduce” (41). This is not in fact a response to Byatt’s undermining of critical theories; it is instead an uncritical reassertion of their importance.

Yelin and several other critics respond to Possession as if Byatt had suggested that “culturally literate” readers should momentarily push to the back of their minds what they know about how literature is complicit in various forms of political oppression so they can get more enjoyment from their reading. This response is symptomatic of an astonishing inability to even imagine what the novel is really inviting literary scholars to imagine—that the theories implicating literature are flat-out wrong. Monica Flegel, for instance, writes that “What must be privileged and what must be sacrificed in order for Byatt’s Edenic reading (and living) state to be achieved may give some indication of Byatt’s own conventionalizing agenda, and the negative enchantment that her particular fairy tale offers” (414). Flegel goes on to show that she does in fact appreciate the satire on academic critics; she even sympathizes with the nostalgia for simpler times, before political readings became mandatory. But she ends her critical essay with another reassertion of the political, accusing Byatt of “replicating” through her old-fashioned novel “such negative qualities of the form as its misogyny and its omission of the lower class.” Flegel is particularly appalled by Maud’s treatment in the final scene, since, she claims, “stereotypical gender roles are reaffirmed” (428). “Maud is reduced in the end,” Flegel alleges, “to being taken possession of by her lover…and assured that Roland will ‘take care of her’” (429). This interpretation places Flegel in the company of the feminists in the novel who hiss at Maud for trying to please men, forcing her to bind her head.

Flegel believes that her analysis proves Byatt is guilty of misogyny and mistreatment of the poor. “Byatt urges us to leave behind critical readings and embrace reading for enjoyment,” she warns her fellow critics, “but the narrative she offers shows just how much is at stake when we leave criticism behind” (429). Flegel quotes Yelin to the effect that Possession is “seductive,” and goes on to declaim that

it is naïve, and unethical, to see the kind of reading that Byatt offers as happy. To return to an Edenic state of reading, we must first believe that such a state truly existed and that it was always open to all readers of every class, gender, and race. Obviously, such a belief cannot be held, not because we have lost the ability to believe, but because such a space never really did exist. (430)

In her preening, self-righteous zealotry, Flegel represents a current in modern criticism that’s only slightly more extreme than the one represented by Byatt’s misguided but generally harmless scholars. It is a short step from using dubious theories to decode alleged justifications for political oppression in literature to Flegel’s frightening brand of absurd, condemnatory moralizing leveled at authors and readers alike.

Another way critics have attempted to respond to Byatt’s challenge is by denying that she is in fact making any such challenge. Christien Franken suggests that Byatt’s problems with theories like poststructuralism stem from her dual identity as a critic and an author. In a lecture Byatt once gave titled “Identity and the Writer” which was later published as an essay, Franken finds what she believes is evidence of poststructuralist thinking, even though Byatt denies taking the theory seriously. Franken believes that in the essay, “the affinity between post-structuralism and her own thought on authorship is affirmed and then again denied” (18). Her explanation is that

the critic in A.S. Byatt begins her lecture “Identity and the Writer” with a recognition of her own intellectual affinity with post-structuralist theories which criticize the paramount importance of “the author.” The writer in Byatt feels threatened by the same post-structuralist criticism. (17)

Franken claims that this ambivalence runs throughout all of Byatt’s fiction and criticism. But Ann Marie Adams disagrees, writing that “When Byatt does delve into poststructuralist theory in this essay, she does so only to articulate what ‘threatens’ and ‘beleaguers’ her as a writer, not to productively help her identify the true ‘identity’ of the writer” (349). In Adams’s view, Byatt belongs in the humanist tradition of criticism going back to Matthew Arnold and the romantics. In her own response to Byatt, Adams manages to come closer than any of her fellow critics to being able to imagine that the ascendant literary theories are simply wrong. But her obvious admiration for Byatt doesn’t prevent her from suggesting that “Yelin and Flegel are right to note that the conclusion of Possession, with its focus on closure and seeming transcendence of critical anxiety, affords a particularly ‘seductive’ and ideologically laden pleasure to academic readers” (120). And, while she seems to find some value in Arnoldian approaches, she fails to engage in any serious reassessment of the theories Byatt targets.

Frederick Holmes, in his attempt to explain Byatt’s attitude toward history as evidenced by the novel, agrees with the critics who see in Possession clear signs of the author’s embrace of postmodernism in spite of the parody and explicit disavowals. “It is important to acknowledge,” he writes,

that the liberation provided by Roland’s imagination from the previously discussed sterility of his intellectual sophistication is never satisfactorily accounted for in rational terms. It is not clear how he overcomes the post-structuralist positions on language, authorship, and identity. His claim that some signifiers are concretely attached to signifieds is simply asserted, not argued for. (330)

While Holmes is probably mistaken in taking this absence of rational justification as a tacit endorsement of the abandoned theory, the observation is the nearest any of the critics comes to a rebuttal of Byatt’s challenge. What Holmes is forgetting, though, is that structuralist and poststructuralist theorists themselves, from Saussure through Derrida, have been insisting on the inadequacy of language to describe the real world, a radical idea that flies in the face of every human’s lived experience, without ever providing any rational, empirical, or even coherent support for the departure.

The stark irony to which Holmes is completely oblivious is that he’s asking for a rational justification to abandon a theory that proclaims such justification impossible. The burden remains on poststructuralists to prove that people shouldn’t trust their own experiences with language. And it is precisely this disconnect with experience that Byatt shows to be problematic.

Holmes, like the other critics, simply can’t imagine that critical theories have absolutely no validity, so he’s forced to read into the novel the same chimerical ambivalence Franken tries so desperately to prove:

Roland’s dramatic alteration is validated by the very sort of emotional or existential experience that critical theory has conditioned him to dismiss as insubstantial. We might account for Roland’s shift by positing, not a rejection of his earlier thinking, but a recognition that his psychological well-being depends on his living as if such powerful emotional experiences had an unquestioned reality. (330)

Adams quotes from an interview in which Byatt discusses her inspiration for the characters in Possession, saying, “poor moderns are always asking themselves so many questions about whether their actions are real and whether what they say can be thought to be true […] that they become rather papery and are miserably aware of this” (111-2). Byatt believes this type of self-doubt is unnecessary. Indeed, Maud’s notion that “there isn’t a unitary ego” (290) and Roland’s thinking of the “idea of his ‘self’ as an illusion” (459)—not to mention Holmes’s conviction that emotional experiences are somehow unreal—are simple examples of the reductionist fallacy. While it is true that an individual’s consciousness and sense of self rest on a substrate of unconscious mental processes and mechanics that can be traced all the way down to the firing of neurons, to suggest this mechanical accounting somehow delegitimizes selfhood is akin to saying that water being made up of hydrogen and oxygen atoms means the feeling of wetness can only be an illusion.

       Just as silly are the ideas that romantic love is a “suspect ideological construct” (290), as Maud calls it, and that “the expectations of Romance control almost everyone in the Western world” (460), as Roland suggests. Anthropologist Helen Fisher writes in her book Anatomy of Love, “some Westerners have come to believe that romantic love is an invention of the troubadours… I find this preposterous. Romantic love is far more widespread” (49). After a long list of examples of love-strickenness from all over the world from west to east to everywhere in-between, Fisher concludes that it “must be a universal human trait” (50). Scientists have found empirical support as well for Roland’s discovery that words can in fact refer to real things. Psychologist Nicole Speer and her colleagues used fMRI to scan people’s brains as they read stories. The actions and descriptions on the page activated the same parts of the brain as witnessing or perceiving their counterparts in reality. The researchers report, “Different brain regions track different aspects of a story, such as a character’s physical location or current goals. Some of these regions mirror those involved when people perform, imagine, or observe similar real-world activities” (989).

Critics like Flegel insist on joyless reading because happy endings necessarily overlook the injustices of the world. But this is like saying anyone who savors a meal is complicit in world hunger (or for that matter anyone who enjoys reading about a character savoring a meal). If feminist poststructuralists were right about how language functions as a vehicle for oppressive ideologies, then the most literate societies would be the most oppressive, instead of the other way around. Jacques Lacan is the theorist Byatt has the most fun with in Possession—and he is also the main target of the book Fashionable Nonsense: Postmodern Intellectuals’ Abuse of Science by the scientists Alan Sokal and Jean Bricmont. “According to his disciples,” they write, Lacan “revolutionized the theory and practice of psychoanalysis; according to his critics, he is a charlatan and his writings are pure verbiage” (18). After assessing Lacan’s use of concepts in topological mathematics, like the Möbius strip, which he sets up as analogies for various aspects of the human psyche, Sokal and Bricmont conclude that Lacan’s ideas are complete nonsense. They write,

The most striking aspect of Lacan and his disciples is probably their attitude toward science, and the extreme privilege they accord to “theory”… at the expense of observations and experiments… Before launching into vast theoretical generalizations, it might be prudent to check the empirical adequacy of at least some of its propositions. But, in Lacan’s writings, one finds mainly quotations and analyses of texts and concepts. (37)

Sokal and Bricmont wonder if the abuses of theorists like Lacan “arise from conscious fraud, self-deception, or perhaps a combination of the two” (6). The question resonates with the poem Randolph Henry Ash wrote about his experience exposing a supposed spiritualist as a fraud, in which he has a mentor assure her protégée, a fledgling spiritualist with qualms about engaging in deception, “All Mages have been tricksters” (444).

There’s even some evidence that Byatt is right about postmodern thinking making academics into “papery” people. In a 2006 lecture titled “The Inhumanity of the Humanities,” William van Peer reports on research he conducted with a former student comparing the emotional intelligence of students in the humanities to students in the natural sciences.

Although the psychologists Raymond Mar and Keith Oatley (407) have demonstrated that reading fiction increases empathy and emotional intelligence, van Peer found that humanities students had no advantage in emotional intelligence over students of natural science. In fact, there was a weak trend in the opposite direction—this despite the fact that the humanities students were reading more fiction. Van Peer attributes the deficit to common academic approaches to literature:

Consider the ills flowing from postmodern approaches, the “posthuman”: this usually involves the hegemony of “race/class/gender” in which literary texts are treated with suspicion. Here is a major source of that loss of emotional connection between student and literature. How can one expect a certain humanity to grow in students if they are continuously instructed to distrust authors and texts? (8)

Whether it derives from her early reading of Arnold and his successors, as Adams suggests, or simply from her own artistic and readerly sensibilities, Byatt has an intense desire to revive that very humanity so many academics sacrifice on the altar of postmodern theory. Critical theory urges students to assume that any discussion of humanity, or universal traits, or human nature can only be exclusionary, oppressive, and, in Flegel’s words, “naïve” and “unethical.” The cognitive linguist Steven Pinker devotes his book The Blank Slate: The Modern Denial of Human Nature to debunking the radical cultural and linguistic determinism that is the foundation of modern literary theory. In a section on the arts, Pinker credits Byatt for playing a leading role in what he characterizes as a “revolt”:

Museum-goers have become bored with the umpteenth exhibit on the female body featuring dismembered torsos or hundreds of pounds of lard chewed up and spat out by the artist. Graduate students in the humanities are grumbling in emails and conference hallways about being locked out of the job market unless they write in gibberish while randomly dropping the names of authorities like Foucault and Butler. Maverick scholars are doffing the blinders that prevented them from looking at exciting developments in the sciences of human nature. And younger artists are wondering how the art world got itself into the bizarre place in which beauty is a dirty word. (Pinker 416)

Pinker’s characterization of modern art resonates with Roland Mitchell’s complaints about how psychoanalysis transforms his perceptions of landscapes. And the idea that beauty has become a dirty word is underscored by the critical condemnations of Byatt’s “fairy tale ending.” Pinker goes on to quote Byatt’s response, in the New York Times Magazine, to the question of what the best story ever told was.

Her answer—The Arabian Nights:

The stories in “The Thousand and One Nights”… are about storytelling without ever ceasing to be stories about love and life and death and money and food and other human necessities. Narration is as much a part of human nature as breath and the circulation of the blood. Modernist literature tried to do away with storytelling, which it thought vulgar, replacing it with flashbacks, epiphanies, streams of consciousness. But storytelling is intrinsic to biological time, which we cannot escape. Life, Pascal said, is like living in a prison from which every day fellow prisoners are taken away to be executed. We are all, like Scheherazade, under sentences of death, and we all think of our lives as narratives, with beginnings, middles, and ends. (quoted in Pinker 419)

Byatt’s satire of papery scholars and her portrayals of her characters’ transcendence of nonsensical theories are but the simplest and most direct ways she celebrates the power of language to transport readers—and the power of stories to possess them. Though she incorporates an array of diverse genres, from letters to poems to diaries, and though some of the excerpts’ meanings subtly change in light of discoveries about their authors’ histories, all these disparate parts nonetheless “hook together,” collaborating in the telling of this magnificent tale. This cooperation would be impossible if the postmodern truism about the medium being the message were actually true. Meanwhile, the novel’s intimate engagement with the mythologies of wide-ranging cultures thoroughly undermines the paradigm according to which myths are deterministic “repeating patterns” imposed on individuals, showing instead that these stories simultaneously emerge from and lend meaning to our common human experiences. As the critical responses to Possession make abundantly clear, current literary theories are completely inadequate in any attempt to arrive at an understanding of Byatt’s work. While new theories may be better suited to the task, it is incumbent on us to put forth a good faith effort to imagine the possibility that true appreciation of this and other works of literature will come only after we’ve done away with theory altogether.

Read More
Dennis Junk

Intuition vs. Science: What's Wrong with Your Thinking, Fast and Slow

Kahneman has no faith in our ability to clean up our thinking. He’s an expert on all the ways thinking goes awry, and even he catches himself making all the common mistakes time and again. So he proposes a way around the impenetrable wall of cognitive illusion and self-justification. If all the people gossiping around the water cooler are well-versed in the language of biases and heuristics and errors of intuition, we may all benefit because anticipating gossip can have a profound effect on behavior. No one wants to be spoken of as the fool.

From Completely Useless to Moderately Useful

            In 1955, a twenty-one-year-old Daniel Kahneman was assigned the formidable task of creating an interview procedure to assess the fitness of recruits for the Israeli army. Kahneman’s only qualification was his bachelor’s degree in psychology, but the state of Israel had only been around for seven years at the time so the Defense Forces were forced to satisfice. In the course of his undergraduate studies, Kahneman had discovered the writings of a psychoanalyst named Paul Meehl, whose essays he would go on to “almost memorize” as a graduate student. Meehl’s work gave Kahneman a clear sense of how he should go about developing his interview technique.

If you polled psychologists today to get their predictions for how successful a young lieutenant inspired by a book written by a psychoanalyst would be in designing a personality assessment protocol—assuming you left out the names—you would probably get some dire forecasts. But Paul Meehl wasn’t just any psychoanalyst, and Daniel Kahneman has gone on to become one of the most influential psychologists in the world. The book whose findings Kahneman applied to his interview procedure was Clinical vs. Statistical Prediction: A Theoretical Analysis and a Review of the Evidence, which Meehl lovingly referred to as “my disturbing little book.” Kahneman explains,

Meehl reviewed the results of 20 studies that had analyzed whether clinical predictions based on the subjective impressions of trained professionals were more accurate than statistical predictions made by combining a few scores or ratings according to a rule. In a typical study, trained counselors predicted the grades of freshmen at the end of the school year. The counselors interviewed each student for forty-five minutes. They also had access to high school grades, several aptitude tests, and a four-page personal statement. The statistical algorithm used only a fraction of this information: high school grades and one aptitude test. (222)

The findings for this prototypical study are consistent with those arrived at by researchers over the decades since Meehl released his book:

The number of studies reporting comparisons of clinical and statistical predictions has increased to roughly two hundred, but the score in the contest between algorithms and humans has not changed. About 60% of the studies have shown significantly better accuracy for the algorithms. The other comparisons scored a draw in accuracy, but a tie is tantamount to a win for the statistical rules, which are normally much less expensive to use than expert judgment. No exception has been convincingly documented. (223)       
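To get an intuitive feel for why such a bare-bones rule can hold its own, here's a minimal simulation sketch in Python (my own illustration, not anything Meehl or Kahneman ran). It invents a few hundred hypothetical freshmen, builds the "statistical" prediction from just two noisy scores, and models the "clinical" prediction as the same two scores plus the interviewer's vivid but irrelevant impressions, which is one deliberately unflattering way of dramatizing the finding:

```python
import random
import statistics

def pearson(xs, ys):
    """Plain Pearson correlation, written out so the sketch needs no extra libraries."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)
students = []
for _ in range(500):
    ability = random.gauss(0, 1)               # the thing everyone is trying to estimate
    hs_gpa = ability + random.gauss(0, 0.7)    # high school grades (noisy)
    aptitude = ability + random.gauss(0, 0.7)  # one aptitude test (noisy)
    outcome = ability + random.gauss(0, 0.7)   # grades at the end of freshman year

    rule = 0.5 * hs_gpa + 0.5 * aptitude       # the "statistical" prediction: a fixed rule
    clinician = rule + random.gauss(0, 1.2)    # the "clinical" prediction: same scores plus
                                               # impressions, modeled here as pure noise
    students.append((outcome, rule, clinician))

outcomes, rules, clinicians = zip(*students)
print("rule vs. outcome:     ", round(pearson(rules, outcomes), 2))
print("clinician vs. outcome:", round(pearson(clinicians, outcomes), 2))
```

Because the extra "clinical" information here is noise by construction, the rule wins by design; Meehl's disturbing point was that in the real comparisons the experts' extra deliberation so often behaved as if it were.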

            Kahneman designed the interview process by coming up with six traits he thought would have direct bearing on a soldier’s success or failure, and he instructed the interviewers to assess the recruits on each dimension in sequence. His goal was to make the process as systematic as possible, thus reducing the role of intuition. The response of the recruitment team will come as no surprise to anyone: “The interviewers came close to mutiny” (231). They complained that their knowledge and experience were being given short shrift, that they were being turned into robots. Eventually, Kahneman was forced to compromise, creating a final dimension that was holistic and subjective. The scores on this additional scale, however, seemed to be highly influenced by scores on the previous scales.
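The design is simple enough to caricature in a few lines. In the sketch below, the trait names, the 1-to-5 scale, and the 80/20 weighting are all my own stand-ins rather than details from Kahneman's actual protocol; what it preserves is the shape of the procedure: rate each dimension in sequence, combine the ratings mechanically, and let the holistic impression enter only at the end, with a fixed and limited weight:

```python
from statistics import mean

# Hypothetical trait names; the post doesn't list Kahneman's actual six dimensions.
TRAITS = ["punctuality", "responsibility", "sociability",
          "self-discipline", "resilience", "initiative"]

def structured_score(ratings, holistic=None):
    """Rate each trait on a 1-5 scale, then combine the ratings mechanically.
    The optional holistic rating stands in for the subjective dimension added
    after the interviewers' near-mutiny; the 80/20 split is illustrative only."""
    missing = set(TRAITS) - set(ratings)
    if missing:
        raise ValueError(f"rate every trait, one at a time: {sorted(missing)}")
    base = mean(ratings[t] for t in TRAITS)
    if holistic is None:
        return base
    return 0.8 * base + 0.2 * holistic

recruit = {"punctuality": 4, "responsibility": 3, "sociability": 5,
           "self-discipline": 2, "resilience": 4, "initiative": 3}
print(round(structured_score(recruit), 2))              # purely mechanical
print(round(structured_score(recruit, holistic=4), 2))  # with the compromise dimension
```

The point of the exercise isn't the particular weights; it's that the interviewer's judgment is confined to supplying the ratings rather than delivering the final verdict.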

When commanding officers evaluated the new recruits a few months later, the team compared the evaluations with their predictions based on Kahneman’s six scales. “As Meehl’s book had suggested,” he writes, “the new interview procedure was a substantial improvement over the old one… We had progressed from ‘completely useless’ to ‘moderately useful’” (231).   

Kahneman recalls this story at about the midpoint of his magnificent, encyclopedic book Thinking, Fast and Slow. This is just one in a long series of run-ins with people who don’t understand or can’t accept the research findings he presents to them, and it is neatly woven into his discussions of those findings. Each topic and each chapter feature a short test that allows you to see where you fall in relation to the experimental subjects. The remaining thread in the tapestry is the one most readers familiar with Kahneman’s work most anxiously anticipated—his friendship with Amos Tversky, whose collaboration with him led to the Nobel prize in economics Kahneman received in 2002.

Most of the ideas that led to experiments that led to theories which made the two famous and contributed to the founding of an entire new field, behavioral economics, were born of casual but thrilling conversations both men found intrinsically rewarding. Reading this book, as intimidating as it appears at a glance, you get glimmers of Kahneman’s wonder at the bizarre intricacies of his own and others’ minds, flashes of frustration at how obstinately or casually people avoid the implications of psychology and statistics, and intimations of the deep fondness and admiration he felt toward Tversky, who died in 1996 at the age of 59.

Pointless Punishments and Invisible Statistics

When Kahneman begins a chapter by saying, “I had one of the most satisfying eureka experiences of my career while teaching flight instructors in the Israeli Air Force about the psychology of effective training” (175), it’s hard to avoid imagining how he might have relayed the incident to Amos years later. It’s also hard to avoid speculating about what the book might’ve looked like, or whether it ever would have been written, if Tversky were still alive. The eureka experience Kahneman had in this chapter came about, as many of them apparently did, when one of the instructors objected to his assertion, in this case that “rewards for improved performance work better than punishment of mistakes.” The instructor insisted that over the long course of his career he’d routinely witnessed pilots perform worse after praise and better after being screamed at. “So please,” the instructor said with evident contempt, “don’t tell us that reward works and punishment does not, because the opposite is the case.” Kahneman, characteristically charming and disarming, calls this “a joyous moment of insight” (175).

            The epiphany came from connecting a familiar statistical observation with the perceptions of an observer, in this case the flight instructor. The problem is that we all have a tendency to discount the role of chance in success or failure. Kahneman explains that the instructor’s observations were correct, but his interpretation couldn’t have been more wrong.

What he observed is known as regression to the mean, which in that case was due to random fluctuations in the quality of performance. Naturally, he only praised a cadet whose performance was far better than average. But the cadet was probably just lucky on that particular attempt and therefore likely to deteriorate regardless of whether or not he was praised. Similarly, the instructor would shout into the cadet’s earphones only when the cadet’s performance was unusually bad and therefore likely to improve regardless of what the instructor did. The instructor had attached a causal interpretation to the inevitable fluctuations of a random process. (175-6)

The list of domains in which we fail to account for regression to the mean is disturbingly long. Even after you’ve learned about the phenomenon, it’s still difficult to recognize the situations you should apply your understanding of it to. Kahneman quotes the statistician David Freedman to the effect that whenever regression becomes pertinent in a civil or criminal trial, the side that has to explain it will pretty much always lose the case. Not understanding regression, however, and not appreciating how it distorts our impressions, has implications for even the minutest details of our daily experiences. “Because we tend to be nice to other people when they please us,” Kahneman writes, “and nasty when they do not, we are statistically punished for being nice and rewarded for being nasty” (176). Probability is a bitch.
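The flight-school example is easy to reproduce for yourself. The toy simulation below (my own illustration, with arbitrary numbers) gives every cadet the same fixed skill and lets luck do the rest; praise and punishment don't exist in the model, yet unusually good landings are still followed by worse ones, and unusually bad landings by better ones:

```python
import random
import statistics

random.seed(2)

def landing():
    """One landing: a fixed skill level plus a big dose of luck."""
    skill = 0.0
    luck = random.gauss(0, 1)
    return skill + luck

after_best, after_worst = [], []
for _ in range(10_000):
    first, second = landing(), landing()
    if first > 1.5:        # good enough that the instructor would have praised it
        after_best.append(second - first)
    elif first < -1.5:     # bad enough that the instructor would have screamed
        after_worst.append(second - first)

print("average change after an unusually good landing:", round(statistics.mean(after_best), 2))
print("average change after an unusually bad landing: ", round(statistics.mean(after_worst), 2))
# The first number comes out negative and the second positive on any run,
# even though nothing in this model responds to praise or punishment at all.
```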

The Illusion of Skill in Stock-Picking

Probability can be expensive too. Kahneman recalls being invited to give a lecture to advisers at an investment firm. To prepare for the lecture, he asked for some data on the advisers’ performances and was given a spreadsheet of investment outcomes covering eight years. When he compared the numbers statistically, he found that none of the advisers was consistently more successful than the others. The correlation between the outcomes from year to year was nil. When he attended a dinner the night before the lecture “with some of the top executives of the firm, the people who decide on the size of bonuses,” he knew from experience how tough a time he was going to have convincing them that “at least when it came to building portfolios, the firm was rewarding luck as if it were a skill.” Still, he was amazed by the execs’ lack of shock:

We all went on calmly with our dinner, and I have no doubt that both our findings and their implications were quickly swept under the rug and that life in the firm went on just as before. The illusion of skill is not only an individual aberration; it is deeply ingrained in the culture of the industry. Facts that challenge such basic assumptions—and thereby threaten people’s livelihood and self-esteem—are simply not absorbed. (216)

The scene that follows echoes the first chapter of Carl Sagan’s classic paean to skepticism, The Demon-Haunted World, in which Sagan recounts being bombarded with questions about science by a driver taking him from the airport to the auditorium where he was scheduled to lecture. He found himself explaining to the driver again and again that what the man thought was science—Atlantis, aliens, crystals—was, in fact, not. "As we drove through the rain," Sagan writes, "I could see him getting glummer and glummer. I was dismissing not just some errant doctrine, but a precious facet of his inner life" (4). In Kahneman’s recollection of his drive back to the airport after his lecture, he writes of a conversation he had with his own driver, one of the execs he’d dined with the night before.

He told me, with a trace of defensiveness, “I have done very well for the firm and no one can take that away from me.” I smiled and said nothing. But I thought, “Well, I took it away from you this morning. If your success was due mostly to chance, how much credit are you entitled to take for it?” (216)
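For anyone curious what "the correlation between the outcomes from year to year was nil" looks like when you run the numbers, here's a minimal sketch of the same kind of check on made-up data. The post only says the spreadsheet covered eight years, so the number of advisers and the assumption that outcomes are pure luck are mine:

```python
import random
import statistics

random.seed(3)
N_ADVISERS, N_YEARS = 25, 8   # eight years is from the post; 25 advisers is a stand-in

# If outcomes are pure luck, each year's results are independent draws.
results = [[random.gauss(0, 1) for _ in range(N_ADVISERS)] for _ in range(N_YEARS)]

def pearson(xs, ys):
    """Correlation between two years' outcomes across the same advisers."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One way to run the check the post describes: correlate every pair of years.
pairs = [pearson(results[i], results[j])
         for i in range(N_YEARS) for j in range(i + 1, N_YEARS)]
print("pairs of years compared:", len(pairs))
print("average year-to-year correlation:", round(statistics.mean(pairs), 2))  # hovers near zero
```

Skill would show up as consistency, advisers who beat their colleagues in one year tending to beat them again in the next, and consistency is exactly what the pairwise correlations fail to find when luck is doing the work.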

Blinking at the Power of Intuitive Thinking

            It wouldn’t surprise Kahneman at all to discover how much stories like these resonate. Indeed, he must’ve considered it a daunting challenge to conceive of a sensible, cognitively easy way to get all of his vast knowledge of biases and heuristics and unconscious, automatic thinking into a book worthy of the science—and worthy too of his own reputation—while at the same time tying it all together with some intuitive overarching theme, something that would make it read more like a novel than an encyclopedia.

Malcolm Gladwell faced a similar challenge in writing Blink: The Power of Thinking Without Thinking, but he had the advantages of a less scholarly readership, no obligation to be comprehensive, and the freedom afforded a writer covering a field in which he isn’t one of the acknowledged leaders and creators. Ultimately, Gladwell’s book painted a pleasing if somewhat incoherent picture of intuitive thinking. The power he refers to in the title is power over the thoughts and actions of the thinker, not, as many must have presumed, the power to arrive at accurate conclusions.

It’s entirely possible that Gladwell’s misleading title came about deliberately, since there’s a considerable market for the message that intuition reigns supreme over science and critical thinking. But there are points in his book where it seems like Gladwell himself is confused. Robert Cialdini, Steve Martin, and Noah Goldstein cover some of the same research Kahneman and Gladwell do, but their book Yes!: 50 Scientifically Proven Ways to Be Persuasive is arranged in a list format, with each chapter serving as its own independent mini-essay.

Early in Thinking, Fast and Slow, Kahneman introduces us to two characters, System 1 and System 2, who pass the controls of our minds back and forth between themselves according to the expertise and competency demanded by the current exigency or enterprise. System 1 is the more intuitive, easygoing guy, the one who does what Gladwell refers to as “thin-slicing,” the fast thinking of the title. System 2 works deliberately and takes effort on the part of the thinker. Most people find having to engage their System 2—multiply 17 by 24—unpleasant to one degree or another.

The middle part of the book introduces readers to two other characters, ones whose very names serve as a challenge to the field of economics. Econs are the beings market models and forecasts are based on. They are rational, selfish, and difficult to trick. Humans, the other category, show inconsistent preferences, changing their minds depending on how choices are worded or presented, are much more sensitive to the threat of loss than the promise of gain, are sometimes selfless, and not only can be tricked with ease but routinely trick themselves. Finally, Kahneman introduces us to our “Two Selves,” the two ways we have of thinking about our lives, either moment-to-moment—experiences he, along with Mihaly Csikszentmihalyi (author of Flow), pioneered the study of—or in abstract hindsight. It’s not surprising at this point that there are important ways in which the two selves tend to disagree.

Intuition and Cerebration

  The Econs versus Humans distinction, with its rhetorical purpose embedded in the terms, is plenty intuitive. The two selves idea, despite being a little too redolent of psychoanalysis, also works well. But the discussions about System 1 and System 2 are never anything but ethereal and abstruse. Kahneman’s stated goal was to discuss each of the systems as if they were characters in a plot, but he’s far too concerned with scientifically precise definitions to run with the metaphor. The term system is too bloodless and too suggestive of computer components; it’s too much of the realm of System 2 to be at all satisfying to System 1. The collection of characteristics Thinking links to the first system (see a list below) is lengthy and fascinating and not easily summed up or captured in any neat metaphor. But we all know what Kahneman is talking about. We could use mythological figures, perhaps Achilles or Orpheus for System 1 and Odysseus or Hephaestus for System 2, but each of those characters comes with his own narrative baggage. Not everyone’s System 1 is full of rage like Achilles, or musical like Orpheus. Maybe we could assign our System 1s idiosyncratic totem animals.

But I think the most familiar and the most versatile term we have for System 1 is intuition. It is a hairy and unpredictable beast, but we all recognize it. System 2 is actually the harder of the two to name because people so often mistake their intuitions for logical thought. Kahneman explains why this is the case—because our cognitive resources are limited, our intuition often offers up simple questions as substitutes for more complicated ones—but we must still have a term that doesn’t suggest complete independence from intuition and that doesn’t imply deliberate thinking operates flawlessly, like a calculator. I propose cerebration. The cerebral cortex rests on a substrate of other complex neurological structures. It’s more developed in humans than in any other animal. And the way it rolls trippingly off the tongue is as eminently appropriate as the swish of intuition. Both terms work well as verbs too. You can intuit, or you can cerebrate. And when your intuition is working in integrated harmony with your cerebration you are likely in the state of flow Csikszentmihalyi pioneered the study of.

While Kahneman’s division of thought into two systems never really resolves into an intuitively manageable dynamic, something he does throughout the book, which I initially thought was silly, now seems a stroke of brilliance. Kahneman has no faith in our ability to clean up our thinking. He’s an expert on all the ways thinking goes awry, and even he catches himself making all the common mistakes time and again. In the introduction, he proposes a way around the impenetrable wall of cognitive illusion and self-justification. If all the people gossiping around the water cooler are well-versed in the language describing biases and heuristics and errors of intuition, we may all benefit, because anticipating gossip can have a profound effect on behavior. No one wants to be spoken of as the fool.

Kahneman writes, “it is much easier, as well as far more enjoyable, to identify and label the mistakes of others than to recognize our own.” It’s not easy to tell from his straightforward prose, but I imagine him writing lines like that with a wry grin on his face. He goes on,

Questioning what we believe and want is difficult at the best of times, and especially difficult when we most need to do it, but we can benefit from the informed opinions of others. Many of us spontaneously anticipate how friends and colleagues will evaluate our choices; the quality and content of these anticipated judgments therefore matters. The expectation of intelligent gossip is a powerful motive for serious self-criticism, more powerful than New Year resolutions to improve one’s decision making at work and at home. (3)

So we encourage the education of others to trick ourselves into trying to be smarter in their eyes. To that end, Kahneman closes each chapter with a list of sentences in quotation marks—lines you might overhear passing that water cooler if everyone where you work read his book. I think he’s overly ambitious. At some point in the future, you may hear lines like “They’re counting on denominator neglect” (333) in a boardroom—where people are trying to impress colleagues and superiors—but I seriously doubt you’ll hear it in the break room. Really, what he’s hoping is that people will start talking more like behavioral economists. Though some undoubtedly will, Thinking, Fast and Slow probably won’t ever be as widely read as, say, Freud’s lurid, pseudoscientific The Interpretation of Dreams. That’s a tragedy.

Still, it’s pleasant to think about a group of friends and colleagues talking about something other than football and American Idol.

Characteristics of System 1 (105). Try to come up with a good metaphor:

· generates impressions, feelings, and inclinations; when endorsed by System 2 these become beliefs, attitudes, and intentions
· operates automatically and quickly, with little or no effort, and no sense of voluntary control
· can be programmed by System 2 to mobilize attention when particular patterns are detected (search)
· executes skilled responses and generates skilled intuitions, after adequate training
· creates a coherent pattern of activated ideas in associative memory
· links a sense of cognitive ease to illusions of truth, pleasant feelings, and reduced vigilance
· distinguishes the surprising from the normal
· infers and invents causes and intentions
· neglects ambiguity and suppresses doubt
· is biased to believe and confirm
· exaggerates emotional consistency (halo effect)
· focuses on existing evidence and ignores absent evidence (WYSIATI)
· generates a limited set of basic assessments
· represents sets by norms and prototypes, does not integrate
· matches intensities across scales (e.g., size and loudness)
· computes more than intended (mental shotgun)
· sometimes substitutes an easier question for a difficult one (heuristics)
· is more sensitive to changes than to states (prospect theory)
· overweights low probabilities
· shows diminishing sensitivity to quantity (psychophysics)
· responds more strongly to losses than to gains (loss aversion)
· frames decision problems narrowly, in isolation from one another

Also read:

LAB FLIES: JOSHUA GREENE’S MORAL TRIBES AND THE CONTAMINATION OF WALTER WHITE

“THE WORLD UNTIL YESTERDAY” AND THE GREAT ANTHROPOLOGY DIVIDE: WADE DAVIS’S AND JAMES C. SCOTT’S BIZARRE AND DISHONEST REVIEWS OF JARED DIAMOND’S WORK

Read More
Dennis Junk

The Upper Hand in Relationships

What began as an exercise in SEO (search engine optimization) became such a success it’s probably my most widely read piece of writing (somewhat to my chagrin). Apparently, my background in psychology and long history of dating equipped me with some helpful insight. Bottom line: stay away from people playing zero-sum games. And don’t give away too much too soon. You’ll have to read to find out more about what made this post so popular.

People perform some astoundingly clever maneuvers in pursuit of the upper hand in their romantic relationships, and some really stupid ones too. They try to make their partners jealous. They feign lack of interest. They pretend to have enjoyed wild success in the realm of dating throughout their personal histories, right up until the point at which they met their current partners. The edge in cleverness, however, is usually enjoyed by women—though you may be inclined to call it subtlety, or even deviousness.

Some of the most basic dominance strategies used in romantic relationships are based either on one partner wanting something more than the other, or on one partner being made to feel more insecure than the other. We all know couples whose routine revolves around the running joke that the man is constantly desperate for sex, which allows the woman to set the terms he must meet in order to get some. His greater desire for sex gives her the leverage to control him in other domains. I’ll never forget being nineteen and hearing a friend a few years older say of her husband, “Why would I want to have sex with him when he can’t even remember to take out the garbage?” Traditionally, men held the family purse strings, so they—assuming they or their families had money—could hold out the promise of things women wanted more. Of course, some men still do this, giving their wives little reminders of how hard they work to provide financial stability, or dropping hints of their extravagant lifestyles to attract prospective dates.

You can also get the upper hand on someone by taking advantage of his or her insecurities. (If that fails, you can try producing some.) Women tend to be the most vulnerable to such tactics at the moment of choice, wanting their features and graces and wiles to make them more desirable than any other woman prospective partners are likely to see. The woman who gets passed up in favor of another goes home devastated, likely lamenting the crass superficiality of our culture.

Most of us probably know a man or two who, deliberately or not, manages to keep his girlfriend or wife in constant doubt when it comes to her ability to keep his attention. These are the guys who can’t control their wandering eyes, or who let slip offhand innuendos about incremental weight gain. Perversely, many women respond by expending greater effort to win his attention and his approval.

Men tend to be the most vulnerable just after sex, in the Was-it-good-for-you moments. If you found yourself seething at some remembrance of masculine insensitivity reading the last paragraph, I recommend a casual survey of your male friends in which you ask them how many of their past partners at some point compared them negatively to some other man, or men, they had been with prior to the relationship. The idea that the woman is settling for a man who fails to satisfy her as others have plays into the narrative that he wants sex more—and that he must strive to please her outside the bedroom.

If you can put your finger on your partner’s insecurities, you can control him or her by tossing out reassurances like food pellets to a trained animal. The alternative would be for a man to be openly bowled over by a woman’s looks, or for a woman to express in earnest her enthusiasm for a man’s sexual performances. These options, since they disarm, can be even more seductive; they can be tactics in their own right—but we’re talking next-level expertise here so it’s not something you’ll see very often.

I give the edge to women when it comes to subtly attaining the upper hand in relationships because I routinely see them using a third strategy they seem to have exclusive rights to. Being the less interested party, or the most secure and reassuring party, can work wonders, but for turning proud people into sycophants nothing seems to work quite as well as a good old-fashioned guilt-trip.

To understand how guilt-trips work, just consider the biggest example in history: Jesus died on the cross for your sins, and therefore you owe your life to Jesus. The illogic of this idea is manifold, but I don’t need to stress how many people it has seduced into a lifetime of obedience to the church. The basic dynamic is one of reciprocation: because one partner in a relationship has harmed the other, the harmer owes the harmed some commensurate sacrifice.

I’m probably not the only one who’s witnessed a woman catching on to her man’s infidelity and responding almost gleefully—now she has him. In the first instance of this I watched play out, the woman, in my opinion, bore some responsibility for her husband’s turning elsewhere for love. She was brutal to him. And she believed his guilt would only cement her ascendancy. Fortunately, around that time they both realized she must not really love him, and they divorced.

But the guilt need not be tied to anything as substantive as cheating. Our puritanical Christian tradition has joined forces in America with radical feminism to birth a bastard lovechild we encounter in the form of a groundless conviction that sex is somehow inherently harmful—especially to females. Women are encouraged to carry with them stories of the traumas they’ve suffered at the hands of monstrous men. And, since men are of a tribe, a pseudo-logic similar to the Christian idea of collective guilt comes into play. Whenever a man courts a woman steeped in this tradition, he is put on early notice—you’re suspect; I’m a trauma survivor; you need to be extra nice, i.e. submissive.

It’s this idea of trauma, which can be attributed mostly to Freud, that can really make a relationship, and life, fraught and intolerably treacherous. Behaviors that would otherwise be thought inconsiderate or rude—a hurtful word, a wandering eye—are instead taken as malicious attempts to cause lasting harm. But the most troubling thing about psychological trauma is that belief in it is its own proof, even as it implicates a guilty party who therefore has no way to establish his innocence.

Over the course of several paragraphs, we’ve gone from amusing but nonetheless real struggles many couples get caught up in to some that are just downright scary. The good news is that there is a subset of people who don’t see relationships as zero-sum games. (Zero-sum is a game theory term for interactions in which every gain for one party is a loss for the other. Non-zero-sum games are those in which cooperation can lead to mutual benefits.) The bad news is that they can be hard to find.
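If the game-theory jargon is unfamiliar, a toy example makes the distinction concrete. The payoffs below are invented, but they show the defining property: in a zero-sum game the partners' payoffs always cancel out, while in a non-zero-sum game cooperating can make both people better off:

```python
# Two toy "couple's standoff" games; each entry is (partner_a_payoff, partner_b_payoff).

# Zero-sum: every point one partner wins, the other loses.
zero_sum = {
    ("push", "push"):   (0, 0),
    ("push", "yield"):  (+1, -1),
    ("yield", "push"):  (-1, +1),
    ("yield", "yield"): (0, 0),
}

# Non-zero-sum: mutual cooperation grows the pie instead of just splitting it.
non_zero_sum = {
    ("cooperate", "cooperate"): (+2, +2),
    ("cooperate", "defect"):    (-1, +1),
    ("defect", "cooperate"):    (+1, -1),
    ("defect", "defect"):       (0, 0),
}

for name, game in [("zero-sum", zero_sum), ("non-zero-sum", non_zero_sum)]:
    totals = {moves: a + b for moves, (a, b) in game.items()}
    print(name, "joint payoffs:", totals)
# In the zero-sum game every joint payoff is 0: one partner's gain is exactly
# the other's loss. In the non-zero-sum game, cooperating together yields a
# joint payoff no pair of competing moves can match.
```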

There are a couple of things you can do now, though, that will help you avoid chess-match relationships—or minimize the machinations in your current romance. First, ask yourself what dominance tactics you tend to rely on. Be honest with yourself. Recognizing your bad habits is the first step toward breaking them. And remember, the question isn’t whether you use tactics to try to get the upper hand; it’s which ones you use, and how often.

           The second thing you can do is cultivate the habit and the mutual attitude of what’s good for one is good for the other. Relationship researcher Arthur Aron says that celebrating your partner’s successes is one of the most important things you can do in a relationship. “That’s even more important,” he says, “than supporting him or her when things go bad.” Watch out for zero-sum responses, in yourself and in your partner. And beware of zero-summers in the realm of dating.

Ladies, you know the guys who seem vaguely resentful of the power you have over them by dint of your good looks and social graces. And, guys, you know the women who make you feel vaguely guilty and set-upon every time you talk to them. The best thing to do is stay away.

     But you may be tempted, once you realize a dominance tactic is being used on you, to perform some kind of countermove. It’s one of my personal failings to be too easily provoked into these types of exchanges. It is a dangerous indulgence.

Also read

Anti-Charm - Its Powers and Perils

Read More
Dennis Junk

Eric Harris: Antisocial Aggressor or Narcissistic Avenger?

Conventional wisdom after the Columbine shooting was that the perpetrators had lashed out after being bullied. They were supposed to have had low self-esteem and to represent the dangers of letting kids go about feeling bad about themselves. But is it possible at least one of the two was in fact a narcissist? Eric Harris definitely had some fantasies of otherworldly grandeur.

Coincident with my writing a paper defending Gabriel Conroy in James Joyce’s story “The Dead” from charges of narcissism leveled by Lacanian critics, my then-girlfriend was preparing a presentation on the Columbine shooter Eric Harris, which had her trying to determine whether he would have better fit the DSM-IV diagnostic criteria for Narcissistic or for Antisocial Personality Disorder. Everything about Harris screamed narcissist, but there was a deal-breaker for the diagnosis: people who hold themselves in astronomical esteem seem unlikely candidates for suicide, and Harris turned his gun on himself in culmination of his murder spree.

Clinical diagnoses are mere descriptive categorizations which don’t in any way explain behavior; at best, they may pave the way for explanations by delineating the phenomenon to be explained. Yet the nature of Harris’s thinking about himself has important implications for our understanding of other types of violence. Was he incapable of empathizing with others, unable to see and unwilling to treat them as feeling, sovereign beings, in keeping with an antisocial diagnosis? Or did he instead believe himself to be so superior to his peers that they simply didn’t merit sympathy or recognition, suggesting narcissism? His infamous journals suggest pretty unequivocally that the latter was the case. But again we must ask whether a real narcissist would kill himself.

This seeming paradox was brought to my attention again this week as I was reading 50 Great Myths of Popular Psychology: Shattering Widespread Misconceptions about Human Behavior (about which I will very likely be writing more here). Myth #33 is that “Low Self-Esteem Is a Major Cause of Psychological Problems” (162). The authors make use of the common misconception that the two boys responsible for the shootings were meek and shy and got constantly picked on until their anger boiled over into violence. (It turns out the boiling-over metaphor is wrong too, as explained under Myth #30: “It’s Better to Express Anger to Others than to Hold It in.”) The boys were indeed teased and taunted, but the experience didn’t seem to lower their view of themselves. “Instead,” the authors write, “Harris and Klebold’s high self-esteem may have led them to perceive the taunts of their classmates as threats to their inflated sense of self-worth, motivating them to seek revenge” (165).

Narcissists, they explain, “believe themselves deserving of special privileges” or entitlements. “When confronted with a challenge to their perceived worth, or what clinical psychologists term a ‘narcissistic injury,’ they’re liable to lash out at others” (165). We usually think of school shootings as random acts of violence, but maybe the Columbine massacre wasn’t exactly random. It may rather have been a natural response to perceived offenses—just one that went atrociously beyond the realm of what anyone would consider fair. If what Harris did on that day in April of 1999 was not an act of aggression but one of revenge, it may be useful to consider it in terms of costly punishment, a special instance of costly signaling.

The strength of a costly signal is commensurate with its cost, so Harris’s willingness both to kill and to die might have been his way of insisting that the offense he was punishing was deathly serious. What the authors of 50 Great Myths argue is that the perceived crime consisted of his classmates not properly recognizing and deferring to his superiority. Instead of contradicting the idea that Harris held himself in great esteem, then, his readiness to die for the sake of his message demonstrates just how superior he thought he was: in his mind, the punishment was justified by the offense, and how seriously he took the slights of his classmates can be seen as an index of how far above them he placed himself. The greater the difference in relative worth between Harris and his schoolmates, the greater the injustice.

Perceived relative status plays a role in all punishments. Between two people of equal status, such factors as any uncertainty regarding guilt, mitigating circumstances surrounding the offense, and concern for making the punishment fit the crime will enter into any consideration of just deserts. But the degree to which these factors are ignored can be used as an index of the size of the power differential between the two individuals—or at least of the perceived power differential. Someone who feels infinitely superior will be willing to dish out infinite punishment. Absent a truly horrendous crime, revenge is a narcissistic undertaking.

Also read

SYMPATHIZING WITH PSYCHOS: WHY WE WANT TO SEE ALEX ESCAPE HIS FATE AS A CLOCKWORK ORANGE

And:

THE MENTAL ILLNESS ZODIAC: WHY THE DSM 5 WON'T BE ANYTHING BUT MORE PSEUDOSCIENCE

Read More