READING SUBTLY

This was the domain of my Blogger site from 2009 to 2018, when I moved to this domain and started The Storytelling Ape. The search option should help you find any of the old posts you're looking for.
 

Dennis Junk

How to be Interesting: Dead Poets and Causal Inferences

Henry James famously wrote, “The only obligation to which in advance we may hold a novel without incurring the accusation of being arbitrary, is that it be interesting.” But how does one be interesting? The answer probably lies in staying one step ahead of your readers, but every reader moves at their own pace.

No one writes a novel without ever having read one. Though storytelling comes naturally to us as humans, our appreciation of the lengthy, intricately rendered narratives we find spanning the hundreds of pages between book covers is contingent on a long history of crucial developments, literacy for instance. In the case of an individual reader, the faithfulness with which ontogeny recapitulates phylogeny will largely determine the level of interest taken in any given work of fiction. In other words, to appreciate a work, it is necessary to have some knowledge of the literary tradition to which it belongs. T.S. Eliot’s famous 1919 essay “Tradition and the Individual Talent” eulogizes great writers as breathing embodiments of the entire history of their art. “The poet must be very conscious of the main current,” Eliot writes,

which does not at all flow invariably through the most distinguished reputations. He must be quite aware of the obvious fact that art never improves, but that the material of art is never quite the same. He must be aware that the mind of Europe—the mind of his own country—a mind which he learns in time to be much more important than his own private mind—is a mind which changes, and that this change is a development which abandons nothing en route, which does not superannuate either Shakespeare, or Homer, or the rock of the Magdalenian draughtsmen.

Though Eliot probably didn’t mean to suggest that to write a good poem or novel you have to have thoroughly mastered every word of world literature, a condition that would’ve excluded most efforts even at the time he wrote the essay, he did believe that to fully understand a work you have to be able to place it in its proper historical context. “No poet,” he wrote,

no artist of any art, has his complete meaning alone. His significance, his appreciation is the appreciation of his relation to the dead poets and artists. You cannot value him alone; you must set him, for contrast and comparison, among the dead.

If this formulation for what goes into the appreciation of art is valid, then as time passes and historical precedents accumulate, the burden of knowledge that must be shouldered to sustain adequate interest in or appreciation for works in the tradition grows ever heavier. Accordingly, the number of people who can manage it grows ever smaller.

But what if there is something like a threshold awareness of literary tradition—or even of current literary convention—beyond which the past ceases to be the most important factor influencing your appreciation for a particular work? Once your reading comprehension is up to snuff and you’ve learned how to deal with some basic strategies of perspective—first person, third person omniscient, etc.—then you’re free to interpret stories not merely as representative of some tradition but of potentially real people and events, reflective of some theme that has real meaning in most people’s lives. Far from seeing the task of the poet or novelist as serving as a vessel for artistic tradition, Henry James suggests in his 1884 essay “The Art of Fiction” that

The only obligation to which in advance we may hold a novel without incurring the accusation of being arbitrary, is that it be interesting. That general responsibility rests upon it, but it is the only one I can think of. The ways in which it is at liberty to accomplish this result (of interesting us) strike me as innumerable and such as can only suffer from being marked out, or fenced in, by prescription. They are as various as the temperament of man, and they are successful in proportion as they reveal a particular mind, different from others. A novel is in its broadest definition a personal impression of life; that, to begin with, constitutes its value, which is greater or less according to the intensity of the impression.

Writing for dead poets the way Eliot suggests may lead to works that are historically interesting. But a novel whose primary purpose is to represent, say, Homer’s Odyssey in some abstract way, a novel which, in other words, takes a piece of literature as its subject matter rather than some aspect of life as it is lived by humans, will likely only ever be interesting to academics. This isn’t to say that writers of the past ought to be ignored; rather, their continuing relevance is likely attributable to their works’ success in being interesting. So when you read Homer you shouldn’t be wondering how you might artistically reconceptualize his epics—you should be attending to the techniques that make them interesting and wondering how you might apply them in your own work, which strives to artistically represent some aspect of life. You go to past art for technical or thematic inspiration, not for traditions with which to carry on some dynamic exchange.

Representation should, as a rule of thumb, take priority over tradition. And to insist, as Eliot does, as an obvious fact or otherwise, that artistic techniques never improve is to admit defeat before taking on the challenge. But this leaves us with the question of how, beyond a devotion to faithful representations of recognizably living details, one manages to be interesting. Things tend to interest us when they’re novel or surprising. That babies direct their attention to incidents which go against their expectations is what allows us to examine what those expectations are. Babies, like their older counterparts, stare longer at bizarre occurrences. If a story consisted of nothing but surprising incidents, however, we would probably lose interest in it pretty quickly because it would strike us as chaotic and incoherent. Citing research showing that surprise is necessary but not sufficient for securing the interest of readers, Sung-Il Kim, a psychologist at Korea University, explains that whatever incongruity causes the surprise must somehow be resolved. In other words, the surprise has to make sense in the shifted context.

In Aspects of the Novel, E.M. Forster makes his famous distinction between flat and round characters with reference to the latter’s ability to surprise readers. He notes however that surprise is only half the formula, since a character who only surprises would seem merely erratic—or would seem like something other than a real person. He writes,

The test of a round character is whether it is capable of surprising in a convincing way. If it never surprises, it is flat. If it does not convince, it is a flat pretending to be round. It has the incalculability of life about it—life within the pages of a book. And by using it sometimes alone, more often in combination with the other kind, the novelist achieves his task of acclimatization and harmonizes the human race with the other aspects of his work. (78)

Kim discovered that this same dynamic is at play even in the basic unit of a single described event, suggesting that the convincing surprise is important for all aspects of the story, not just character. He went on to test the theory that what lies at the heart of our interest in these seeming incongruities that are in time resolved is our tendency to anticipate the resolution. When a brief description involves some element that must be inferred, it is considered more interesting, and it proves more memorable, than when the same incident is described in full detail without any demand for inference. However, when researchers deliberately distract readers in experiments, keeping them from being able to infer, the differences in recall and reported interest vanish.

Kim proposes a “causal bridging inference” theory to explain what makes a story interesting. If there aren’t enough inferences to be made, the story seems boring and banal. But if there are too many then the reader gets overwhelmed and spaces out. “Whether inferences are drawn or not,” Kim writes,

depends on two factors: the amount of background knowledge a reader possesses and the structure of a story… In a real life situation, for example, people are interested in new scientific theories, new fashion styles, or new leading-edge products only when they have an adequate amount of background knowledge on the domain to fill the gap between the old and the new… When a story contains such detailed information that there is no gap to fill in, a reader does not need to generate inferences. In this case, the story would not be interesting even if the reader possessed a great deal of background knowledge. (69)

One old-fashioned and intuitive way of thinking about causal bridging inference theory is to see the task of a writer as keeping one or two steps ahead of the reader. If the story runs ahead by more than a few steps it risks being too difficult to follow and the reader gets lost. If it falls behind, it drags, like the boor who relishes the limelight and so stretches out his anecdotes with excruciatingly superfluous detail.

For a writer, the takeaway is that you want to shock and surprise your readers, which means making your story take unexpected, incongruous turns, but you should also seed the narrative with what in hindsight can be seen as hints to what’s to come so that the surprises never seem random or arbitrary—and so that the reader is trained to seek out further clues to make further inferences. This is what Forster meant when he said characters should change in ways that are both surprising and convincing. It’s perhaps a greater achievement to have character development, plot, and language integrated so that an inevitable surprise in one of these areas has implications for or bleeds into both the others. But we as readers can enjoy on its own an unlikely observation or surprising analogy that we discover upon reflection to be fitting. And of course we can enjoy a good plot twist in isolation too—witness Hollywood and genre fiction.

Naturally, some readers can be counted on to be better at making inferences than others. As Kim points out, this greater ability may be based on a broader knowledge base; if the author makes an allusion, for instance, it helps to know about the subject being alluded to. It can also be based on comprehension skills, awareness of genre conventions, understanding of the physical or psychological forces at play in the plot, and so on. The implication is that keeping those crucial two steps ahead, no more, no less, means targeting readers who are just about as good at making inferences as you are and working hard through inspiration, planning, and revision to maintain your lead. If you’re erudite and agile of mind, you’re going to bore yourself trying to write for those significantly less so—and those significantly less so are going to find what is keenly stimulating for you to write all but impossible to comprehend.

Interestingness is also influenced by fundamental properties of stories like subject matter—Percy Fawcett explores the Amazon in search of the lost City of Z is more interesting than Margaret goes grocery shopping—and the personality traits of characters that influence the degree to which we sympathize with them. But technical virtuosity often supersedes things like topic and character. A great writer can write about a boring character in an interesting way. Interestingly, however, the benefit in interest won through mastery of technique will only be appreciated by those capable of inferring the meaning hinted at by the narration, those able to make the proper conceptual readjustments to accommodate surprising shifts in context and meaning. When mixed martial arts first became popular, for instance, audiences roared over knockouts and body slams, and yawned over everything else. But as Joe Rogan has remarked from ringside at events over the past few years, fans have become so sophisticated that they cheer when one fighter passes the other’s guard.

What this means is that no matter how steadfast your devotion to representation, assuming your skills continually develop, there will be a point of diminishing returns, a point where improving as a writer will mean your work has greater interest but to a shrinking audience. My favorite illustration of this dilemma is Steven Millhauser’s parable “In the Reign of Harad IV,” in which “a maker of miniatures” carves and sculpts tiny representations of a king’s favorite possessions. Over time, though, the miniaturist ceases to care about any praise he receives from the king or anyone else at court and begins working to satisfy an inner “stirring of restlessness.” His creations become smaller and smaller, necessitating greater and greater magnification tools to appreciate. No matter how infinitesimal he manages to make his miniatures, upon completion of each work he seeks “a farther kingdom.” It’s one of the most interesting short stories I’ve read in a while.

Causal Bridging Inference Model:

Dennis Junk

What's the Point of Difficult Reading?

For every John Galt, Tony Robbins, and Scheherazade, we may need at least half a Proust. We are still, however, left with quite a dilemma. Some authors really are just assholes who write worthless tomes designed to trick you into wasting your time. But some books that seem impenetrable on the first attempt will reward your efforts to decipher them.

You sit reading the first dozen or so pages of some celebrated classic and gradually realize that having to sort out how the ends of the long sentences fix to their beginnings is taking just enough effort to distract you entirely from the setting or character you’re supposed to be getting to know. After a handful of words you swear are made up and a few tangled metaphors you find yourself riddling over with nary a resolution, the dread sinks in. Is the whole book going to be like this? Is it going to be one of those deals where you get to what’s clearly meant to be a crucial turning point in the plot but for you is just another riddle without a solution, sending you paging back through the forest of verbiage in search of some key succession of paragraphs you spaced out while reading the first time through? Then you wonder if you’re missing some other kind of key, like maybe the story’s an allegory, a reference to some historical event like World War II or some Revolution you once had to learn about but have since lost all recollection of. Maybe the insoluble similes are allusions to some other work you haven’t read or can’t recall. In any case, you’re not getting anything out of this celebrated classic but frustration leading to the dual suspicion that you’re too ignorant or stupid to enjoy great literature and that the whole “great literature” thing is just a conspiracy to trick us into feeling dumb so we’ll defer to the pseudo-wisdom of Ivory Tower elites.

If enough people of sufficient status get together and agree to extol a work of fiction, they can get almost everyone else to agree. The readers who get nothing out of it but frustration and boredom assume that since their professors or some critic in a fancy-pants magazine or the judges of some literary award committee think it’s great they must simply be missing something. They dutifully continue reading it, parrot a few points from a review that sound clever, and afterward toe the line by agreeing that it is indeed a great work of literature, clearly, even if it doesn’t speak to them personally. For instance, James Joyce’s Ulysses, utterly nonsensical to anyone without at least a master’s degree, tops the Modern Library’s list of 100 best novels in the English language. Responding to the urging of his friends to write out an explanation of the novel, Joyce scoffed, boasting,

I’ve put in so many enigmas and puzzles that it will keep the professors busy for centuries arguing over what I meant, and that’s the only way of ensuring one’s immortality.

He was right. To this day, professors continue to love him even as Ulysses and the even greater monstrosity Finnegans Wake do nothing but bore and befuddle everyone else—or else, more fittingly, sit inert or unchecked-out on the shelf, gathering well-deserved dust.

Joyce’s later novels are not literature; they are lengthy collections of loosely connected literary puzzles. But at least his puzzles have actual solutions—or so I’m told. Ulysses represents the apotheosis of the tradition in literature called modernism. What came next, postmodernism, is even more disconnected from the universal human passion for narrative. Even professors aren’t sure what to do with it, so they simply throw their hands up, say it’s great, and explain that the source of its greatness is its very resistance to explanation. Jonathan Franzen, whose 2001 novel The Corrections represented a major departure from the postmodernism he began his career experimenting with, explained the following year in The New Yorker how he’d turned away from the tradition. He’d been reading the work of William Gaddis “as a kind of penance” (101) and not getting any meaning out of it. Of the final piece in the celebrated author’s oeuvre, Franzen writes,

The novel is an example of the particular corrosiveness of literary postmodernism. Gaddis began his career with a Modernist epic about the forgery of masterpieces. He ended it with a pomo romp that superficially resembles a masterpiece but punishes the reader who tries to stay with it and follow its logic. When the reader finally says, Hey, wait a minute, this is a mess, not a masterpiece, the book instantly morphs into a performance-art prop: its fraudulence is the whole point! And the reader is out twenty hours of good-faith effort. (111)

In other words, reading postmodern fiction means not only forgoing the rewards of narratives, having them replaced by the more taxing endeavor of solving multiple riddles in succession, but those riddles don’t even have answers. What’s the point of reading this crap? Exactly. Get it?

You can dig deeper into the meaningless meanderings of pomos and discover there is in fact an ideology inspiring all the infuriating inanity. The super smart people who write and read this stuff point to the willing, eager complicity of the common reader in the propagation of all the lies that sustain our atrociously unjust society (but atrociously unjust compared to what?). Franzen refers to this as the Fallacy of the Stupid Reader,

wherein difficulty is a “strategy” to protect art from cooptation and the purpose of art is to “upset” or “compel” or “challenge” or “subvert” or “scar” the unsuspecting reader; as if the writer’s audience somehow consisted, again and again, of Charlie Browns running to kick Lucy’s football; as if it were a virtue in a novelist to be the kind of boor who propagandizes at friendly social gatherings. (109)

But if the author is worried about art becoming a commodity does making the art shitty really amount to a solution? And if the goal is to make readers rethink something they take for granted why not bring the matter up directly, or have a character wrestle with it, or have a character argue with another character about it? The sad fact is that these authors probably just suck, that, as Franzen suspects, “literary difficulty can operate as a smoke screen for an author who has nothing interesting, wise, or entertaining to say” (111).

Not all difficulty in fiction is a smoke screen though. Not all the literary emperors are naked. Franzen writes that “there is no headache like the headache you get from working harder on deciphering a text than the author, by all appearances, has worked on assembling it.” But the essay, titled “Mr. Difficult,” begins with a reader complaint sent not to Gaddis but to Franzen himself. And the reader, a Mrs. M. from Maryland, really gives him the business:

Who is it that you are writing for? It surely could not be the average person who enjoys a good read… The elite of New York, the elite who are beautiful, thin, anorexic, neurotic, sophisticated, don’t smoke, have abortions tri-yearly, are antiseptic, live in penthouses, this superior species of humanity who read Harper’s and The New Yorker. (100)

In this first part of the essay, Franzen introduces a dilemma that sets up his explanation of why he turned away from postmodernism—he’s an adherent of the “Contract model” of literature, whereby the author agrees to share, on equal footing, an entertaining or in some other way gratifying experience, as opposed to the “Status model,” whereby the author demonstrates his or her genius and if you don’t get it, tough. But his coming to a supposed agreement with Mrs. M. about writers like Gaddis doesn’t really resolve Mrs. M.’s conflict with him.

The Corrections, after all, the novel she was responding to, represents his turning away from the tradition Gaddis wrote in. (It must be said, though, that Freedom, Franzen’s next novel, is written in a still more accessible style.)

The first thing we must do to respond properly to Mrs. M. is break down each of Franzen’s models into two categories. The status model includes writers like Gaddis whose difficulty serves no purpose but to frustrate and alienate readers. But Franzen’s own type specimen for this model is Flaubert, much of whose writing, though difficult at first, rewards any effort to re-read and further comprehend with a more profound connection. So it is for countless other writers, the one behind number two on the Modern Library’s ranking for instance—Fitzgerald and Gatsby. As for the contract model, Franzen admits,

Taken to its free-market extreme, Contract stipulates that if a product is disagreeable to you the fault must be the product’s. If you crack a tooth on a hard word in a novel, you sue the author. If your professor puts Dreiser on your reading list, you write a harsh student evaluation… You’re the consumer; you rule. (100)

Franzen, in declaring himself a “Contract kind of person,” assumes that the free-market extreme can be dismissed for its extremity. But Mrs. M. would probably challenge him on that. For many, particularly right-leaning readers, the market not only can but should be relied on to determine which books are good and which ones belong in some tiny niche. When the Modern Library conducted a readers' poll to create a popular ranking to balance the one made by experts, the ballot was stuffed by Ayn Rand acolytes and Scientologists. Mrs. M. herself leaves little doubt as to her political sympathies. For her and her fellow travelers, things like literature departments, National Book Awards—like the one The Corrections won—Nobels and Pulitzers are all an evil form of intervention into the sacred workings of the divine free market, un-American, sacrilegious, communist. According to this line of thinking, authors aren’t much different from whores—except of course literal whoring is condemned in the Bible (except when it isn’t).

A contract with readers who score high on the personality dimension of openness to new ideas and experiences (who tend to be liberal), those who have spent a lot of time in the past reading books like The Great Gatsby or Heart of Darkness or Lolita (the horror!), those who read enough to have developed finely honed comprehension skills—that contract is going to look quite a bit different from one with readers who attend Beck University, those for whom Atlas Shrugged is the height of literary excellence. At the same time, though, the cult of self-esteem is poisoning schools and homes with the idea that suggesting that a student or son or daughter is anything other than a budding genius is a form of abuse. Heaven forbid a young person feel judged or criticized while speaking or writing. And if an author makes you feel the least bit dumb or ignorant, well, it’s an outrage—heroes like Mrs. M. to the rescue.

One of the problems with the cult of self-esteem is that anticipating criticism tends to make people more, not less, creative. And the link between low self-esteem and mental disorders is almost purely mythical. High self-esteem is correlated with school performance, but as far as researchers can tell it’s the performance causing the esteem, not the other way around. More invidious, though, is the tendency to view anything that takes a great deal of education or intelligence to accomplish as an affront to everyone less educated or intelligent. Conservatives complain endlessly about class warfare and envy of the rich—the financially elite—but they have no qualms about decrying intellectual elites and condemning them for flaunting their superior literary achievements. They see the elitist mote in the eye of Nobel laureates without noticing the beam in their own.

What’s the point of difficult reading? Well, what’s the point of running five or ten miles? What’s the point of eating vegetables as opposed to ice cream or Doritos? Difficulty need not preclude enjoyment. And discipline in the present is often rewarded in the future. It very well may be that the complexity of the ideas you’re capable of understanding is influenced by how many complex ideas you attempt to understand. No matter how vehemently true believers in the magic of markets insist otherwise, markets don’t have minds. And though an individual’s intelligence need not be fixed, a good way to ensure children never get any smarter than they already are is to make them feel fantastically wonderful about their mediocrity. We just have to hope that despite these ideological traps there are enough people out there determined to wrap their minds around complex situations depicted in complex narratives about complex people told in complex language, people who will in the process develop the types of minds and intelligence necessary to lead the rest of our lazy asses into a future that’s livable and enjoyable. For every John Galt, Tony Robbins, and Scheherazade, we may need at least half a Proust. We are still, however, left with quite a dilemma. Some authors really are just assholes who write worthless tomes designed to trick you into wasting your time. But some books that seem impenetrable on the first attempt will reward your efforts to decipher them. How do we get the rewards without wasting our time?

Dennis Junk

Can’t Win for Losing: Why There Are So Many Losers in Literature and Why It Has to Change

David Lurie’s position in Disgrace is similar to that of John Proctor in The Crucible (although this doesn’t come out nearly as much in the movie version). And it’s hard not to see feminism in its current manifestations—along with Marxism and postcolonialism—as a pernicious new breed of McCarthyism infecting academia and wreaking havoc with men and literature alike.

Ironically, the author of The Golden Notebook, celebrating its 50th anniversary this year, and considered by many a “feminist bible,” happens to be an outspoken critic of feminism. When asked in a 1982 interview with Lesley Hazelton about her response to readers who felt some of her later works were betrayals of the women whose cause she once championed, Doris Lessing replied,

What the feminists want of me is something they haven't examined because it comes from religion. They want me to bear witness. What they would really like me to say is, ‘Ha, sisters, I stand with you side by side in your struggle toward the golden dawn where all those beastly men are no more.’ Do they really want people to make oversimplified statements about men and women? In fact, they do. I've come with great regret to this conclusion.

Lessing has also been accused of being overly harsh—“castrating”—to men, too many of whom she believes roll over a bit too easily when challenged by women aspiring to empowerment. As a famous novelist, however, who would go on to win the Nobel prize in literature in 2007, she got to visit a lot of schools, and it gradually dawned on her that it wasn’t so much that men were rolling over but rather that they were being trained from childhood to be ashamed of their maleness. In a lecture she gave to the Edinburgh book festival in 2001, she said,

Great things have been achieved through feminism. We now have pretty much equality at least on the pay and opportunities front, though almost nothing has been done on child care, the real liberation. We have many wonderful, clever, powerful women everywhere, but what is happening to men? Why did this have to be at the cost of men? I was in a class of nine- and 10-year-olds, girls and boys, and this young woman was telling these kids that the reason for wars was the innately violent nature of men. You could see the little girls, fat with complacency and conceit while the little boys sat there crumpled, apologising for their existence, thinking this was going to be the pattern of their lives.

Lessing describes how the teacher kept casting glances expectant of her approval as she excoriated these impressionable children. 

Elaine Blair, in “Great American Losers,” an essay that’s equal parts trenchant and infuriatingly obtuse, describes a dynamic in contemporary fiction that’s similar to the one Lessing saw playing out in the classroom.

The man who feels himself unloved and unlovable—this is a character that we know well from the latest generation or two of American novels. His trials are often played for sympathetic laughs. His loserdom is total: it extends to his stunted career, his squalid living quarters, his deep unease in the world.

At the heart of this loserdom is his auto-manifesting knowledge that women don’t like him. As opposed to men of earlier generations who felt entitled to a woman’s respect and admiration, Blair sees this modern male character as being “the opposite of entitled: he approaches women cringingly, bracing for a slap.” This desperation on the part of male characters to avoid offending women, to prove themselves capable of sublimating their own masculinity so they can be worthy of them, finds its source in the authors themselves. Blair writes,

Our American male novelists, I suspect, are worried about being unloved as writers—specifically by the female reader. This is the larger humiliation looming behind the many smaller fictional humiliations of their heroes, and we can see it in the way the characters’ rituals of self-loathing are tacitly performed for the benefit of an imagined female audience.

Blair quotes a review David Foster Wallace wrote of a John Updike novel to illustrate how conscious males writing literature today are of their female readers’ hostility toward men who write about sex and women without apologizing for liking sex and women—sometimes even outside the bounds of caring, committed relationships. Labeling Updike as a “Great Male Narcissist,” a distinction he shares with writers like Philip Roth and Norman Mailer, Wallace writes,

Most of the literary readers I know personally are under forty, and a fair number are female, and none of them are big admirers of the postwar GMNs. But it’s John Updike in particular that a lot of them seem to hate. And not merely his books, for some reason—mention the poor man himself and you have to jump back:

“Just a penis with a thesaurus.”

“Has the son of a bitch ever had one unpublished thought?”

“Makes misogyny seem literary the same way Rush [Limbaugh] makes fascism seem funny.”

And trust me: these are actual quotations, and I’ve heard even worse ones, and they’re all usually accompanied by the sort of facial expressions where you can tell there’s not going to be any profit in appealing to the intentional fallacy or talking about the sheer aesthetic pleasure of Updike’s prose.

Since Wallace is ready to “jump back” at the mere mention of Updike’s name, it’s no wonder he’s given to writing about characters who approach women “cringingly, bracing for a slap.”

Blair goes on to quote from Jonathan Franzen’s novel The Corrections, painting a plausible picture of male writers who fear not only that their books will be condemned if too misogynistic—a relative term that has come to mean “not as radically feminist as me”—but that they themselves will be rejected. In Franzen’s novel, Chip Lambert has written a screenplay and asked his girlfriend Julia to give him her opinion. She holds off doing so, however, until after she breaks up with him and is on her way out the door. “For a woman reading it,” she says, “it’s sort of like the poultry department. Breast, breast, breast, thigh, leg” (26). Franzen describes his character’s response to the critique:

It seemed to Chip that Julia was leaving him because “The Academy Purple” had too many breast references and a draggy opening, and that if he could correct these few obvious problems, both on Julia’s copy of the script and, more important, on the copy he’d specially laser-printed on 24-pound ivory bond paper for [the film producer] Eden Procuro, there might be hope not only for his finances but also for his chances of ever again unfettering and fondling Julia’s own guileless, milk-white breasts. Which by this point in the day, as by late morning of almost every day in recent months, was one of the last activities on earth in which he could still reasonably expect to take solace for his failures. (28)

If you’re reading a literary work like The Corrections, chances are you’ve at some point sat in a literature class—or even a sociology or cultural studies class—and been instructed that the proper way to fulfill your function as a reader is to critically assess the work in terms of how women (or minorities) are portrayed. Both Chip and Julia have sat through such classes. And you’re encouraged to express disapproval, even outrage, if something like a traditional role is enacted—or, gasp, objectification occurs. Blair explains how this affects male novelists:

When you see the loser-figure in a novel, what you are seeing is a complicated bargain that goes something like this: yes, it is kind of immature and boorish to be thinking about sex all the time and ogling and objectifying women, but this is what we men sometimes do and we have to write about it. We fervently promise, however, to avoid the mistake of the late Updike novels: we will always, always, call our characters out when they’re being self-absorbed jerks and louts. We will make them comically pathetic, and punish them for their infractions a priori by making them undesirable to women, thus anticipating what we imagine will be your judgments, female reader. Then you and I, female reader, can share a laugh at the characters’ expense, and this will bring us closer together and forestall the dreaded possibility of your leaving me.

In other words, these male authors are the grownup versions of those poor school boys Lessing saw forced to apologize for their own existence. Indeed, you can feel this dynamic, this bargain, playing out when you’re reading these guys’ books. Blair’s description of the problem is spot on. Her theory of what caused it, however, is laughable.

Because of the GMNs, these two tendencies—heroic virility and sexist condescension—have lingered in our minds as somehow yoked together, and the succeeding generations of American male novelists have to some degree accepted the dyad as truth. Behind their skittishness is a fearful suspicion that if a man gets what he wants, sexually speaking, he is probably exploiting someone.

The dread of slipping down the slope from attraction to exploitation has nothing to do with John Updike. Rather, it is embedded in terms at the very core of feminist ideology. Misogyny, for instance, is frequently deemed an appropriate label for men who indulge in lustful gazing, even in private. And the term objectification implies that the female whose subjectivity isn’t being properly revered is the victim of oppression. The main problem with this idea—and there are several—is that the term objectification is synonymous with attraction. The deluge of details about the female body in fiction by male authors can just as easily be seen as a type of confession, an unburdening of guilt by the offering up of sins. Female readers respond by assigning the writers some form of penance, like never writing, never thinking like that again without flagellating themselves.

The conflict between healthy male desire and disapproving feminist prudery doesn’t just play out in the tortured psyches of geeky American male novelists. A.S. Byatt, in her Booker Prize-winning novel Possession, satirizes scholars steeped in literary theory as “papery” and sterile. But the novel ends with a male scholar named Roland overcoming his theory-induced self-consciousness to initiate sex with another scholar named Maud. Byatt describes the encounter:

And very slowly and with infinite gentle delays and delicate diversions and variations of indirect assault Roland finally, to use an outdated phrase, entered and took possession of all her white coolness that grew warm against him, so that there seemed to be no boundaries, and he heard, towards dawn, from a long way off, her clear voice crying out, uninhibited, unashamed, in pleasure and triumph. (551)

The literary critic Monica Flegel cites this passage as an example of how Byatt’s old-fashioned novel features “such negative qualities of the form as its misogyny and its omission of the lower class.” Flegel is particularly appalled by how “stereotypical gender roles are reaffirmed” in the sex scene. “Maud is reduced in the end,” Flegel alleges, “to being taken possession of by her lover…and assured that Roland will ‘take care of her.’” How, we may wonder, did a man assuring a woman he would take care of her become an act of misogyny?

Perhaps critics like Flegel occupy some radical fringe; Byatt’s book was, after all, a huge success with audiences and critics alike, and it did win the Booker. The novelist Martin Amis, however, isn’t one to describe his assaults as indirect. He routinely dares to feature men who actually do treat women poorly in his novels—without any authorial condemnation.

Martin Goff, the non-intervening director of the Booker Prize committee, tells the story of the 1989 controversy over whether or not Amis’s London Fields should be on the shortlist. Maggie Gee, a novelist, and Helen McNeil, a professor, simply couldn’t abide Amis’s treatment of his women characters. “It was an incredible row,” says Goff.

Maggie and Helen felt that Amis treated women appallingly in the book. That is not to say they thought books which treated women badly couldn't be good, they simply felt that the author should make it clear he didn't favour or bless that sort of treatment. Really, there was only two of them and they should have been outnumbered as the other three were in agreement, but such was the sheer force of their argument and passion that they won. David [Lodge] has told me he regrets it to this day, he feels he failed somehow by not saying, “It's two against three, Martin's on the list”.

In 2010, Amis explained his career-spanning failure to win a major literary award, despite enjoying robust book sales, thus:

There was a great fashion in the last century, and it's still with us, of the unenjoyable novel. And these are the novels which win prizes, because the committee thinks, “Well it's not at all enjoyable, and it isn't funny, therefore it must be very serious.”

Brits like Hilary Mantel and, especially, Ian McEwan are working to turn this dreadful trend around. But when McEwan dared to write a novel about a neurosurgeon who prevails in the end over an afflicted, less privileged tormenter, he was condemned by the critic Jennifer Szalai in the pages of Harper’s Magazine for his “blithe, bourgeois sentiments.” If you’ve read Saturday, you know the sentiments are anything but blithe, and if you read Szalai’s review you’ll be taken aback by her articulate blindness.

Amis is probably right in suggesting that critics and award committees have a tendency to mistake misery for profundity. But his own case, along with several others like it, hints at something even more disturbing: a shift in the very idea of what role fictional narratives play in our lives.

The sad new reality is that, owing to the growing influence of ideologically extreme and idiotically self-righteous activist professors, literature is no longer read for pleasure and enrichment—it’s no longer even read as a challenging exercise in outgroup empathy. Instead, reading literature is supposed by many to be a ritual of male western penance. Prior to taking an interest in literary fiction, you must first be converted to the proper ideologies, made to feel sufficiently undeserving yet privileged, the beneficiary of a long history of theft and population displacement, the scion and gene-carrier of rapists and genocidaires—the horror, the horror. And you must be taught to systematically overlook and remain woefully oblivious of all the evidence that the Enlightenment was the best fucking thing that ever happened to the human species. Once you’re brainwashed into believing that so-called western culture is evil and that you’ve committed the original sin of having been born into it, you’re ready to perform your acts of contrition by reading horrendously boring fiction that forces you to acknowledge and reflect upon your own fallen state.

Fittingly, the apotheosis of this new literary tradition won the Booker in 1999, and its author, like Lessing, is a Nobel laureate. J.M. Coetzee’s Disgrace chronicles in exquisite free indirect discourse the degradation of David Lurie, a white professor in Cape Town, South Africa, beginning with his somewhat pathetic seduction of a black student, a crime for which he pays with the loss of his job, his pension, and his reputation, and moving on to the aftermath of his daughter’s rape at the hands of three black men, who proceed to rob her, steal his car, douse him with spirits, and light him on fire. What’s unsettling about the novel—and it is a profoundly unsettling novel—is that its structure implies that everything David and Lucy suffer flows from his original offense of lusting after a young black woman. This woman, Melanie, is twenty years old, and though she is clearly reluctant at first to have sex with her teacher, there’s never any force involved. At one point, she shows up at David’s house and asks to stay with him. It turns out she has a boyfriend who is refusing to let her leave him without a fight. It’s only after David unheroically tries to wash his hands of the affair to avoid further harassment from this boyfriend—while stooping so low as to insist that Melanie make up a test she missed in his class—that she files a complaint against him.

David immediately comes clean to university officials and admits to taking advantage of his position of authority. But he stalwartly refuses to apologize for his lust, or even for his seduction of the young woman. This refusal makes him complicit, the novel suggests, in all the atrocities of colonialism. As he’s awaiting a hearing to address Melanie’s complaint, David gets a message:

On campus it is Rape Awareness Week. Women Against Rape, WAR, announces a twenty-four-hour vigil in solidarity with “recent victims”. A pamphlet is slipped under his door: ‘WOMEN SPEAK OUT.’ Scrawled in pencil at the bottom is a message: ‘YOUR DAYS ARE OVER, CASANOVA.’ (43)

During the hearing, David confesses to doctoring the attendance ledgers and entering a false grade for Melanie. As the attendees become increasingly frustrated with what they take to be evasions, he goes on to confess to becoming “a servant of Eros” (52). But this confession only enrages the social sciences professor Farodia Rassool:

Yes, he says, he is guilty; but when we try to get specificity, all of a sudden it is not abuse of a young woman he is confessing to, just an impulse he could not resist, with no mention of the pain he has caused, no mention of the long history of exploitation of which this is part. (53)

There’s also no mention, of course, of the fact that David has already gone through more suffering than Melanie has, or that her boyfriend deserves a great deal of the blame, or that David is an individual, not a representative of his entire race who should be made to answer for the sins of his forefathers.

After resigning from his position in disgrace, David moves out to the country to live with his daughter on a small plot of land. The attack occurs only days after he’s arrived. David wants Lucy to pursue some sort of justice, but she refuses. He wants her to move away because she’s clearly not safe, but she refuses. She even goes so far as to accuse him of being in the wrong for believing he has any right to pronounce what happened an injustice—and for thinking it is his place to protect his daughter. And if there’s any doubt about the implication of David’s complicity she clears it up. As he’s pleading with her to move away, they begin talking about the rapists’ motivation. Lucy says to her father,

When it comes to men and sex, David, nothing surprises me anymore. Maybe, for men, hating the woman makes sex more exciting. You are a man, you ought to know. When you have sex with someone strange—when you trap her, hold her down, get her under you, put all your weight on her—isn’t it a bit like killing? Pushing the knife in; exiting afterwards, leaving the body behind covered in blood—doesn’t it feel like murder, like getting away with murder? (158)

The novel is so engrossing and so disturbing that it’s difficult to tell what the author’s position is vis-à-vis his protagonist’s degradation or complicity. You can’t help sympathizing with him and feeling that his treatment at the hands of Melanie, Farodia, and Lucy is an injustice. But are you supposed to question that feeling in light of the violence Melanie is threatened with and Lucy is subjected to? Are you supposed to reappraise altogether your thinking about the very concept of justice in light of the atrocities of history? Are we to see David Lurie as an individual or as a representative of western male colonialism, deserving of whatever he’s made to suffer and more?

Personally, I think David Lurie’s position in Disgrace is similar to that of John Proctor in The Crucible (although this doesn’t come out nearly as much in the movie version). And it’s hard not to see feminism in its current manifestations—along with Marxism and postcolonialism—as a pernicious new breed of McCarthyism infecting academia and wreaking havoc with men and literature alike. It’s really no surprise that the most significant developments in the realm of narrative lately haven’t occurred in novels at all. Insofar as the cable series contributing to the new golden age of television can be said to adhere to a formula, it’s this: begin with a badass male lead who doesn’t apologize for his own existence and has no qualms about expressing his feelings toward women. As far as I know, these shows are just as popular with women viewers as they are with the guys.

When David first arrives at Lucy’s house, they take a walk and he tells her a story about a dog he remembers from a time when they lived in a neighborhood called Kenilworth.

It was a male. Whenever there was a bitch in the vicinity it would get excited and unmanageable, and with Pavlovian regularity the owners would beat it. This went on until the poor dog didn’t know what to do. At the smell of a bitch it would chase around the garden with its ears flat and its tail between its legs, whining, trying to hide…There was something so ignoble in the spectacle that I despaired. One can punish a dog, it seems to me, for an offence like chewing a slipper. A dog will accept the justice of that: a beating for a chewing. But desire is another story. No animal will accept the justice of being punished for following its instincts.

Lucy breaks in, “So males must be allowed to follow their instincts unchecked? Is that the moral?” David answers,

No, that is not the moral. What was ignoble about the Kenilworth spectacle was that the poor dog had begun to hate its own nature. It no longer needed to be beaten. It was ready to punish itself. At that point it would be better to shoot it.

“Or have it fixed,” Lucy offers. (90)

Also Read:

SABBATH SAYS: PHILIP ROTH AND THE DILEMMAS OF IDEOLOGICAL CASTRATION

And:
FROM DARWIN TO DR. SEUSS: DOUBLING DOWN ON THE DUMBEST APPROACH TO COMBATTING RACISM

Dennis Junk

The Storytelling Animal: a Light Read with Weighty Implications

The Storytelling Animal is not groundbreaking. But the style of the book contributes something both surprising and important. Gottschall could simply tell his readers that stories almost invariably feature some kind of conflict or trouble and then present evidence to support the assertion. Instead, he takes us on a tour from children’s highly gendered, highly trouble-laden play scenarios, through an examination of the most common themes enacted in dreams, through some thought experiments on how intensely boring so-called hyperrealism, the rendering of real life as it actually occurs, would be in fiction. The effect is that we actually feel how odd it is to devote so much of our lives to obsessing over anxiety-inducing fantasies fraught with looming catastrophe.

A review of Jonathan Gottschall's The Storytelling Animal: How Stories Make Us Human

Vivian Paley, like many other preschool and kindergarten teachers in the 1970s, was disturbed by how her young charges always separated themselves by gender at playtime. She was further disturbed by how closely the play of each gender group hewed to the old stereotypes about girls and boys. Unlike most other teachers, though, Paley tried to do something about it. Her 1984 book Boys and Girls: Superheroes in the Doll Corner demonstrates in microcosm how quixotic social reforms can be when they’re inspired by the assumption that all behaviors are shaped solely by upbringing and culture. Eventually, Paley realized that it wasn’t the children who needed to learn new ways of thinking and behaving, but herself. What happened in her classrooms in the late 70s, developmental psychologists have reliably determined, is the same thing that happens when you put kids together anywhere in the world. As Jonathan Gottschall explains,

Dozens of studies across five decades and a multitude of cultures have found essentially what Paley found in her Midwestern classroom: boys and girls spontaneously segregate themselves by sex; boys engage in more rough-and-tumble play; fantasy play is more frequent in girls, more sophisticated, and more focused on pretend parenting; boys are generally more aggressive and less nurturing than girls, with the differences being present and measurable by the seventeenth month of life. (39)

Paley’s study is one of several you probably wouldn’t expect to find discussed in a book about our human fascination with storytelling. But, as Gottschall makes clear in The Storytelling Animal: How Stories Make Us Human, there really aren’t many areas of human existence that aren’t relevant to a discussion of the role stories play in our lives. Those rowdy boys in Paley’s classes were playing recognizable characters from current action and sci-fi movies, and the fantasies of the girls were right out of Grimm’s fairy tales (it’s easy to see why people might assume these cultural staples were to blame for the sex differences). And the play itself was structured around one of the key ingredients—really the key ingredient—of any compelling story, trouble, whether in the form of invading pirates or people trying to poison babies.

The Storytelling Animal is the book to start with if you have yet to cut your teeth on any of the other recent efforts to bring the study of narrative into the realm of cognitive and evolutionary psychology. Gottschall covers many of the central themes of this burgeoning field without getting into the weedier territories of game theory or selection at multiple levels. While readers accustomed to more technical works may balk at wading through all the author’s anecdotes about his daughters, Gottschall’s keen sense of measure and the light touch of his prose keep the book from getting bogged down in frivolousness. This applies as well to the sections in which he succumbs to a temptation any writer faces when trying to explain one or another aspect of storytelling: making a few forays into penning abortive, experimental plots of his own.

None of the central theses of The Storytelling Animal is groundbreaking. But the style and layout of the book contribute something both surprising and important. Gottschall could simply tell his readers that stories almost invariably feature some kind of conflict or trouble and then present evidence to support the assertion, the way most science books do. Instead, he takes us on a tour from children’s highly gendered, highly trouble-laden play scenarios, through an examination of the most common themes enacted in dreams—which, contra Freud, are seldom centered on wish-fulfillment—through some thought experiments on how intensely boring so-called hyperrealism, the rendering of real life as it actually occurs, would be in fiction (or actually is, if you’ve read any of D.F. Wallace’s last novel about an IRS clerk). The effect is that instead of simply having a new idea to toss around we actually feel how odd it is to devote so much of our lives to obsessing over anxiety-inducing fantasies fraught with looming catastrophe. And we appreciate just how integral story is to almost everything we do.

This gloss of Gottschall’s approach gives a sense of what is truly original about The Storytelling Animal—it doesn’t seal off narrative as discrete from other features of human existence but rather shows how stories permeate every aspect of our lives, from our dreams to our plans for the future, even our sense of our own identity. In a chapter titled “Life Stories,” Gottschall writes,

This need to see ourselves as the striving heroes of our own epics warps our sense of self. After all, it’s not easy to be a plausible protagonist. Fiction protagonists tend to be young, attractive, smart, and brave—all of the things that most of us aren’t. Fiction protagonists usually live interesting lives that are marked by intense conflict and drama. We don’t. Average Americans work retail or cubicle jobs and spend their nights watching protagonists do interesting things on television, while they eat pork rinds dipped in Miracle Whip. (171)

If you find this observation a tad unsettling, imagine it situated on a page underneath a mug shot of John Wayne Gacy with a caption explaining how he thought of himself “more as a victim than as a perpetrator.” For the most part, though, stories follow an easily identifiable moral logic, which Gottschall demonstrates with a short plot of his own based on the hypothetical situations Jonathan Haidt designed to induce moral dumbfounding. This almost inviolable moral underpinning of narratives suggests to Gottschall that one of the functions of stories is to encourage a sense of shared values and concern for the wider community, a role similar to the one D.S. Wilson sees religion as having played, and continuing to play in human evolution.

Though Gottschall stays away from the inside baseball stuff for the most part, he does come down firmly on one issue in opposition to at least one of the leading lights of the field. Gottschall imagines a future “exodus” from the real world into virtual story realms that are much closer to the holodecks of Star Trek than to current World of Warcraft interfaces. The assumption here is that people’s emotional involvement with stories results from audience members imagining themselves to be the protagonist. But interactive videogames are probably much closer to actual wish-fulfillment than the more passive approaches to attending to a story—hence the god-like powers and grandiose speechifying.

William Flesch challenges the identification theory in his own (much more technical) book Comeuppance. He points out that films that have experimented with a first-person approach to camera work have failed to capture audiences (think of the complicated contraption that filmed Will Smith’s face as he was running from the zombies in I Am Legend). Flesch writes, “If I imagined I were a character, I could not see her face; thus seeing her face means I must have a perspective on her that prevents perfect (naïve) identification” (16). One of the ways we sympathize with other people, though, is to mirror them—to feel, at least to some degree, their pain. That makes the issue a complicated one. Flesch believes our emotional involvement comes not from identification but from a desire to see virtuous characters come through the troubles of the plot unharmed, vindicated, maybe even rewarded. Attending to a story therefore entails tracking characters’ interactions to see if they are in fact virtuous, then hoping desperately to see their virtue rewarded.

Gottschall does his best to avoid dismissing the typical obsessive LARPer (live-action role-player) as the “stereotypical Dungeons and Dragons player” who “is a pimply, introverted boy who isn’t cool and can’t play sports or attract girls” (190). And he does his best to end his book on an optimistic note. But the exodus he writes about may be an example of another phenomenon he discusses. First the optimism:

Humans evolved to crave story. This craving has, on the whole, been a good thing for us. Stories give us pleasure and instruction. They simulate worlds so we can live better in this one. They help bind us into communities and define us as cultures. Stories have been a great boon to our species. (197)

But he then makes an analogy with food cravings, which likewise evolved to serve a beneficial function yet in the modern world are wreaking havoc with our health. Just as there is junk food, so there is such a thing as “junk story,” possibly leading to what Brian Boyd, another luminary in evolutionary criticism, calls a “mental diabetes epidemic” (198). In the context of America’s current education woes, and with how easy it is to conjure images of glazy-eyed zombie students, the idea that video games and shows like Jersey Shore are “the story equivalent of deep-fried Twinkies” (197) makes an unnerving amount of sense.

Here, as in the section on how our personal histories are more fictionalized rewritings than accurate recordings, Gottschall manages to achieve something the playful tone and off-handed experimentation don't prepare you for. The surprising accomplishment of this unassuming little book (200 pages) is that it never stops being a light read even as it takes on discoveries with extremely weighty implications. The temptation to eat deep-fried Twinkies is only going to get more powerful as story-delivery systems become more technologically advanced. Might we have already begun the zombie apocalypse without anyone noticing—and, if so, are there already heroes working to save us we won’t recognize until long after the struggle has ended and we’ve begun weaving its history into a workable narrative, a legend?

Also read:

WHAT IS A STORY? AND WHAT ARE YOU SUPPOSED TO DO WITH ONE?

And:

HOW TO GET KIDS TO READ LITERATURE WITHOUT MAKING THEM HATE IT

Dennis Junk

Madness and Bliss: Critical vs Primitive Readings in A.S. Byatt's Possession: A Romance 2

Critics responded to A.S. Byatt’s challenge to their theories in her book Possession by insisting that the work fails to achieve high-brow status and fails to do anything but bolster old-fashioned notions about stories and language—not to mention the roles of men and women. What they don’t realize is how silly their theories come across to the non-indoctrinated.

Read part one.

The critical responses to the challenges posed by Byatt in Possession fit neatly within the novel’s satire. Louise Yelin, for instance, unselfconsciously divides the audience for the novel into “middlebrow readers” and “the culturally literate” (38), placing herself in the latter category. She overlooks Byatt’s challenge to her methods of criticism and the ideologies underpinning them, for the most part, and suggests that several of the themes, like ventriloquism, actually support poststructuralist philosophy. Still, Yelin worries about the novel’s “homophobic implications” (39). (A lesbian, formerly straight character takes up with a man in the end, and Christabel LaMotte’s female lover commits suicide after the dissolution of their relationship, but no one actually expresses any fear or hatred of homosexuals.) Yelin then takes it upon herself to “suggest directions that our work might take” while avoiding the “critical wilderness” Byatt identifies. She proposes a critical approach to a novel that “exposes its dependencies on the bourgeois, patriarchal, and colonial economies that underwrite” it (40). And since all fiction fails to give voice to one or another oppressed minority, it is the critic’s responsibility to “expose the complicity of those effacements in the larger order that they simultaneously distort and reproduce” (41). This is not in fact a response to Byatt’s undermining of critical theories; it is instead an uncritical reassertion of their importance.

Yelin and several other critics respond to Possession as if Byatt had suggested that “culturally literate” readers should momentarily push to the back of their minds what they know about how literature is complicit in various forms of political oppression so they can get more enjoyment from their reading. This response is symptomatic of an astonishing inability to even imagine what the novel is really inviting literary scholars to imagine—that the theories implicating literature are flat-out wrong. Monica Flegel for instance writes that “What must be privileged and what must be sacrificed in order for Byatt’s Edenic reading (and living) state to be achieved may give some indication of Byatt’s own conventionalizing agenda, and the negative enchantment that her particular fairy tale offers” (414). Flegel goes on to show that she does in fact appreciate the satire on academic critics; she even sympathizes with the nostalgia for simpler times, before political readings became mandatory. But she ends her critical essay with another reassertion of the political, accusing Byatt of “replicating” through her old-fashioned novel “such negative qualities of the form as its misogyny and its omission of the lower class.” Flegel is particularly appalled by Maud’s treatment in the final scene, since, she claims, “stereotypical gender roles are reaffirmed” (428). “Maud is reduced in the end,” Flegel alleges, “to being taken possession of by her lover…and assured that Roland will ‘take care of her’” (429). This interpretation places Flegel in the company of the feminists in the novel who hiss at Maud for trying to please men, forcing her to bind her head.

Flegel believes that her analysis proves Byatt is guilty of misogyny and mistreatment of the poor. “Byatt urges us to leave behind critical readings and embrace reading for enjoyment,” she warns her fellow critics, “but the narrative she offers shows just how much is at stake when we leave criticism behind” (429). Flegel quotes Yelin to the effect that Possession is “seductive,” and goes on to declaim that

it is naïve, and unethical, to see the kind of reading that Byatt offers as happy. To return to an Edenic state of reading, we must first believe that such a state truly existed and that it was always open to all readers of every class, gender, and race. Obviously, such a belief cannot be held, not because we have lost the ability to believe, but because such a space never really did exist. (430)

In her preening self-righteous zealotry, Flegel represents a current in modern criticism that’s only slightly more extreme than that represented by Byatt’s misguided but generally harmless scholars. The step from using dubious theories to decode alleged justifications for political oppression in literature to Flegel’s frightening brand of absurd condemnatory moralizing leveled at authors and readers alike is a short one.

Another way critics have attempted to respond to Byatt's challenge is by denying that she is making any such challenge at all. Christien Franken suggests that Byatt's problems with theories like poststructuralism stem from her dual identity as critic and author. In a lecture Byatt once gave, titled "Identity and the Writer" and later published as an essay, Franken finds what she believes is evidence of poststructuralist thinking, even though Byatt denies taking the theory seriously. Franken believes that in the essay "the affinity between post-structuralism and her own thought on authorship is affirmed and then again denied" (18). Her explanation is that

the critic in A.S. Byatt begins her lecture "Identity and the Writer" with a recognition of her own intellectual affinity with post-structuralist theories which criticize the paramount importance of "the author." The writer in Byatt feels threatened by the same post-structuralist criticism. (17)

Franken claims that this ambivalence runs throughout all of Byatt’s fiction and criticism. But Ann Marie Adams disagrees, writing that “When Byatt does delve into poststructuralist theory in this essay, she does so only to articulate what ‘threatens’ and ‘beleaguers’ her as a writer, not to productively help her identify the true ‘identity’ of the writer” (349). In Adams’s view, Byatt belongs in the humanist tradition of criticism going back to Matthew Arnold and the romantics. In her own response to Byatt, Adams comes closer than any of her fellow critics to being able to imagine that the ascendant literary theories are simply wrong. But her obvious admiration for Byatt doesn’t prevent her from suggesting that “Yelin and Flegel are right to note that the conclusion of Possession, with its focus on closure and seeming transcendence of critical anxiety, affords a particularly ‘seductive’ and ideologically laden pleasure to academic readers” (120). And, while she seems to find some value in Arnoldian approaches, she fails to engage in any serious reassessment of the theories Byatt targets.

Frederick Holmes, in his attempt to explain Byatt’s attitude toward history as evidenced by the novel, agrees with the critics who see in Possession clear signs of the author’s embrace of postmodernism in spite of the parody and explicit disavowals. “It is important to acknowledge,” he writes,

that the liberation provided by Roland’s imagination from the previously discussed sterility of his intellectual sophistication is never satisfactorily accounted for in rational terms. It is not clear how he overcomes the post-structuralist positions on language, authorship, and identity. His claim that some signifiers are concretely attached to signifieds is simply asserted, not argued for. (330)

While Holmes is probably mistaken in taking this absence of rational justification as a tacit endorsement of the abandoned theory, the observation is the nearest any of the critics comes to a rebuttal of Byatt’s challenge. What Holmes is forgetting, though, is that structuralist and poststructuralist theorists themselves, from Saussure through Derrida, have been insisting on the inadequacy of language to describe the real world, a radical idea that flies in the face of every human’s lived experience, without ever providing any rational, empirical, or even coherent support for the departure.

The stark irony to which Holmes is completely oblivious is that he’s asking for a rational justification to abandon a theory that proclaims such justification impossible. The burden remains on poststructuralists to prove that people shouldn’t trust their own experiences with language. And it is precisely this disconnect with experience that Byatt shows to be problematic.

Holmes, like the other critics, simply can’t imagine that critical theories have absolutely no validity, so he’s forced to read into the novel the same chimerical ambivalence Franken tries so desperately to prove:

Roland’s dramatic alteration is validated by the very sort of emotional or existential experience that critical theory has conditioned him to dismiss as insubstantial. We might account for Roland’s shift by positing, not a rejection of his earlier thinking, but a recognition that his psychological well-being depends on his living as if such powerful emotional experiences had an unquestioned reality. (330)

Adams quotes from an interview in which Byatt discusses her inspiration for the characters in Possession, saying, “poor moderns are always asking themselves so many questions about whether their actions are real and whether what they say can be thought to be true […] that they become rather papery and are miserably aware of this” (111-2). Byatt believes this type of self-doubt is unnecessary. Indeed, Maud’s notion that “there isn’t a unitary ego” (290) and Roland’s thinking of the “idea of his ‘self’ as an illusion” (459)—not to mention Holmes’s conviction that emotional experiences are somehow unreal—are simple examples of the reductionist fallacy. While it is true that an individual’s consciousness and sense of self rest on a substrate of unconscious mental processes and mechanics that can be traced all the way down to the firing of neurons, to suggest this mechanical accounting somehow delegitimizes selfhood is akin to saying that water being made up of hydrogen and oxygen atoms means the feeling of wetness can only be an illusion.

       Just as silly are the ideas that romantic love is a “suspect ideological construct” (290), as Maud calls it, and that “the expectations of Romance control almost everyone in the Western world” (460), as Roland suggests. Anthropologist Helen Fisher writes in her book Anatomy of Love, “some Westerners have come to believe that romantic love is an invention of the troubadours… I find this preposterous. Romantic love is far more widespread” (49). After a long list of examples of love-strickenness from all over the world, from west to east and everywhere in between, Fisher concludes that it “must be a universal human trait” (50). Scientists have found empirical support as well for Roland’s discovery that words can in fact refer to real things. Psychologist Nicole Speer and her colleagues used fMRI to scan people’s brains as they read stories. The actions and descriptions on the page activated the same parts of the brain as witnessing or perceiving their counterparts in reality. The researchers report, “Different brain regions track different aspects of a story, such as a character’s physical location or current goals. Some of these regions mirror those involved when people perform, imagine, or observe similar real-world activities” (989).

       Critics like Flegel insist on joyless reading because happy endings necessarily overlook the injustices of the world. But this is like saying anyone who savors a meal is complicit in world hunger (or, for that matter, anyone who enjoys reading about a character savoring a meal). If feminist poststructuralists were right about how language functions as a vehicle for oppressive ideologies, then the most literate societies would be the most oppressive, instead of the other way around. Jacques Lacan is the theorist Byatt has the most fun with in Possession—and he is also the main target of the book Fashionable Nonsense: Postmodern Intellectuals’ Abuse of Science by the scientists Alan Sokal and Jean Bricmont. “According to his disciples,” they write, Lacan “revolutionized the theory and practice of psychoanalysis; according to his critics, he is a charlatan and his writings are pure verbiage” (18). After assessing Lacan’s use of concepts in topological mathematics, like the Möbius strip, which he sets up as analogies for various aspects of the human psyche, Sokal and Bricmont conclude that Lacan’s ideas are complete nonsense. They write,

The most striking aspect of Lacan and his disciples is probably their attitude toward science, and the extreme privilege they accord to “theory”… at the expense of observations and experiments… Before launching into vast theoretical generalizations, it might be prudent to check the empirical adequacy of at least some of its propositions. But, in Lacan’s writings, one finds mainly quotations and analyses of texts and concepts. (37)

Sokal and Bricmont wonder if the abuses of theorists like Lacan “arise from conscious fraud, self-deception, or perhaps a combination of the two” (6). The question resonates with the poem Randolph Henry Ash wrote about his experience exposing a supposed spiritualist as a fraud, in which he has a mentor assure her protégée, a fledgling spiritualist with qualms about engaging in deception, “All Mages have been tricksters” (444).

There’s even some evidence that Byatt is right about postmodern thinking making academics into “papery” people. In a 2006 lecture titled “The Inhumanity of the Humanities,” William van Peer reports on research he conducted with a former student comparing the emotional intelligence of students in the humanities to students in the natural sciences.

Although the psychologists Raymond Mar and Keith Oatley (407) have demonstrated that reading fiction increases empathy and emotional intelligence, van Peer found that humanities students had no advantage in emotional intelligence over students of natural science. In fact, there was a weak trend in the opposite direction—this despite the fact that the humanities students were reading more fiction. Van Peer attributes the deficit to common academic approaches to literature:

Consider the ills flowing from postmodern approaches, the “posthuman”: this usually involves the hegemony of “race/class/gender” in which literary texts are treated with suspicion. Here is a major source of that loss of emotional connection between student and literature. How can one expect a certain humanity to grow in students if they are continuously instructed to distrust authors and texts? (8)

Whether it derives from her early reading of Arnold and his successors, as Adams suggests, or simply from her own artistic and readerly sensibilities, Byatt has an intense desire to revive that very humanity so many academics sacrifice on the altar of postmodern theory. Critical theory urges students to assume that any discussion of humanity, or universal traits, or human nature can only be exclusionary, oppressive, and, in Flegel’s words, “naïve” and “unethical.” The cognitive linguist Steven Pinker devotes his book The Blank Slate: The Modern Denial of Human Nature to debunking the radical cultural and linguistic determinism that is the foundation of modern literary theory. In a section on the arts, Pinker credits Byatt for playing a leading role in what he characterizes as a “revolt”:

Museum-goers have become bored with the umpteenth exhibit on the female body featuring dismembered torsos or hundreds of pounds of lard chewed up and spat out by the artist. Graduate students in the humanities are grumbling in emails and conference hallways about being locked out of the job market unless they write in gibberish while randomly dropping the names of authorities like Foucault and Butler. Maverick scholars are doffing the blinders that prevented them from looking at exciting developments in the sciences of human nature. And younger artists are wondering how the art world got itself into the bizarre place in which beauty is a dirty word. (Pinker 416)

Pinker’s characterization of modern art resonates with Roland Mitchell’s complaints about how psychoanalysis transforms perceptions of landscapes. And the idea that beauty has become a dirty word is underscored by the critical condemnations of Byatt’s “fairy tale ending.” Pinker goes on to quote Byatt’s response, in the New York Times Magazine, to the question of what the best story ever told was.

Her answer—The Arabian Nights:

The stories in “The Thousand and One Nights”… are about storytelling without ever ceasing to be stories about love and life and death and money and food and other human necessities. Narration is as much a part of human nature as breath and the circulation of the blood. Modernist literature tried to do away with storytelling, which it thought vulgar, replacing it with flashbacks, epiphanies, streams of consciousness. But storytelling is intrinsic to biological time, which we cannot escape. Life, Pascal said, is like living in a prison from which every day fellow prisoners are taken away to be executed. We are all, like Scheherazade, under sentences of death, and we all think of our lives as narratives, with beginnings, middles, and ends. (quoted in Pinker 419)

Byatt’s satire of papery scholars and her portrayals of her characters’ transcendence of nonsensical theories are but the simplest and most direct ways she celebrates the power of language to transport readers—and the power of stories to possess them. Though she incorporates an array of diverse genres, from letters to poems to diaries, and though some of the excerpts’ meanings subtly change in light of discoveries about their authors’ histories, all these disparate parts nonetheless “hook together,” collaborating in the telling of this magnificent tale. This cooperation would be impossible if the postmodern truism about the medium being the message were actually true. Meanwhile, the novel’s intimate engagement with the mythologies of wide-ranging cultures thoroughly undermines the paradigm according to which myths are deterministic “repeating patterns” imposed on individuals, showing instead that these stories simultaneously emerge from and lend meaning to our common human experiences. As the critical responses to Possession make abundantly clear, current literary theories are completely inadequate in any attempt to arrive at an understanding of Byatt’s work. While new theories may be better suited to the task, it is incumbent on us to put forth a good faith effort to imagine the possibility that true appreciation of this and other works of literature will come only after we’ve done away with theory altogether.


Madness and Bliss: Critical versus Primitive Readings in A.S. Byatt’s Possession: a Romance

Possession can be read as the novelist’s narrative challenge to the ascendant critical theories, an “undermining of facile illusions” about language and culture and politics—a literary refutation of current approaches to literary criticism.

Part 1 of 2

“You have one of the gifts of the novelist at least,” Christabel LaMotte says to her cousin Sabine de Kercoz in A.S. Byatt’s Possession: a Romance, “you persist in undermining facile illusions” (377). LaMotte is staying with her uncle and cousin, Sabine later learns, because she is carrying the child of the renowned, and married, poet Randolph Henry Ash. The affair began when the two met at a breakfast party where they struck up an impassioned conversation that later prompted Ash to instigate a correspondence. LaMotte too was a poet, so each turned out to be an ideal reader for the other’s work. Just over a hundred years after this initial meeting, in the present day of Byatt’s narrative, the literary scholar Roland Mitchell finds two drafts of Ash’s first letter to LaMotte tucked away in the pages of a book he’s examining for evidence about the great poet’s life, and the detective work begins.

Roland, an unpaid research assistant financially dependent on the girlfriend he’s in a mutually unfulfilling relationship with, is overtaken with curiosity and embarks on a quest to piece together the story of what happened between LaMotte and Ash. Knowing next to nothing about LaMotte, Mitchell partners with the feminist scholar Maud Bailey, whom one character describes as “a chilly mortal” (159), and a stilted romance develops between them as they seek out the clues to the earlier, doomed relationship. Through her juxtaposition of the romance between the intensely passionate, intensely curious nineteenth century couple and the subdued, hyper-analytic, and sterile modern one, the novelist Byatt does some undermining of facile illusions of her own.

       Both of the modern characters are steeped in literary theory, but Byatt’s narrative suggests that their education and training are more a hindrance than an aid to true engagement with literature, and with life. It is only by breaking with professional protocol—by stealing the drafts of the letter from Ash to LaMotte—and breaking away from his mentor and fellow researchers that Roland has a chance to read, and experience, the story that transforms him. “He had been taught that language was essentially inadequate, that it could never speak what was there, that it only spoke itself” (513). But over the course of the story Roland comes to believe that this central tenet of poststructuralism is itself inadequate, along with the main tenets of other leading critical theories, including psychoanalysis. Byatt, in a later book of criticism, counts herself among the writers of fiction who “feel that powerful figures in the modern critical movements feel almost a gladiatorial antagonism to the author and the authority the author claims” (6).

Indeed, Possession can be read as the novelist’s narrative challenge to the ascendant critical theories, an “undermining of facile illusions” about language and culture and politics—a literary refutation of current approaches to literary criticism. In the two decades since the novel’s publication, critics working in these traditions have been unable to adequately respond to Byatt’s challenge because they’ve been unable to imagine that their ideas are not merely impediments to pleasurable reading but are both wrong and harmful to the creation and appreciation of literature.

       The possession of the title refers initially to how the story of LaMotte and Ash’s romance takes over Maud and Roland—in defiance of the supposed inadequacy of language. If words only speak themselves, then true communication would be impossible. But, as Roland says to Maud after they’ve discovered some uncanny correspondences between each of the two great poets’ works and the physical setting the modern scholars deduce they must’ve visited together, “People’s minds do hook together” (257). This hooking-together is precisely what inspires them to embark on their mission of discovery in the first place. “I want to—to—follow the path,” Maud says to Roland after they’ve read the poets’ correspondence together.

I feel taken over by this. I want to know what happened, and I want it to be me that finds out. I thought you were mad when you came to Lincoln with your piece of stolen letter. Now I feel the same. It isn’t professional greed. It’s something more primitive. (239)

Roland interrupts to propose the label “Narrative curiosity” for her feeling of being taken over, to which she responds, “Partly” (239). Later in the story, after several more crucial discoveries, Maud proposes revealing all they’ve learned to their academic colleagues and returning to their homes and their lives. Roland worries doing so would mean going back “Unenchanted.” “Are we enchanted?” Maud replies. “I suppose we must start thinking again, sometime” (454). But it’s the primitive, enchanted, supposedly unthinking reading of the biographical clues about the poets that has brought the two scholars to where they are, and their journey ends up resulting in a transformation that allows Maud and Roland to experience the happy ending LaMotte and Ash were tragically deprived of.

Before discovering and being possessed by the romance of the nineteenth century poets, both Maud and Roland were living isolated and sterile lives. Maud, for instance, always has her hair covered in a kind of “head-binding” and twisted in tightly regimented braids that cause Roland “a kind of sympathetic pain on his own skull-skin” (282). She later reveals that she has to cover it because her fellow feminists always assume she’s “dyeing it to please men.” “It’s exhausting,” Roland has just said. “When everything’s a deliberate political stance. Even if it’s interesting” (295). Maud’s bound head thus serves as a symbol (if read in precisely the type of way Byatt’s story implicitly admonishes her audience to avoid) of the burdensome and even oppressive nature of an ideology that supposedly works for the liberation and wider consciousness of women.

Meanwhile, Roland is troubling himself about the implications of his budding romantic feelings for Maud. He has what he calls a “superstitious dread” of “repeating patterns,” a phrase he repeats over and over again throughout the novel. Thinking of his relations with Maud, he muses,

“Falling in love,” characteristically, combs the appearances of the world, and of the particular lover’s history, out of a random tangle and into a coherent plot. Roland was troubled that the opposite might be true. Finding themselves in a plot, they might suppose it appropriate to behave as though it was a sort of plot. And that would be to compromise some kind of integrity they had set out with. (456)

He later wrestles with the idea that “a Romance was one of the systems that controlled him, as the expectations of Romance control almost everyone in the Western world” (460). Because of his education, he cannot help doubting his own feelings, suspecting that giving in to their promptings would have political implications, and worrying that doing so would result in a compromising of his integrity (which he must likewise doubt) and his free will. Roland’s self-conscious lucubration forms a stark contrast to what Randolph Henry Ash wrote in an early letter to his wife Ellen: “I cannot get out of my mind—as indeed, how should I wish to, whose most ardent desire is to be possessed entirely by the pure thought of you—I cannot get out of my mind the entire picture of you” (500). It is only by reading letters like this, and by becoming more like Ash, turning away in the process from his modern learning, that Roland can come to an understanding of himself and accept his feelings for Maud as genuine and innocent.

Identity for modern literary scholars, Byatt suggests, is a fraught and complicated issue. At different points in the novel, both Maud and Roland engage in baroque, abortive efforts to arrive at a sense of who they are. Maud, reflecting on how another scholar’s writing about Ash says more about the author than about the subject, meditates,

Narcissism, the unstable self, the fractured ego, Maud thought, who am I? A matrix for a susurration of texts and codes? It was both a pleasant and an unpleasant idea, this requirement that she think of herself as intermittent and partial. There was the question of the awkward body. The skin, the breath, the eyes, the hair, their history, which did seem to exist. (273)

Roland later echoes this head-binding poststructuralist notion of the self as he continues to dither over whether or not he should act on his feelings for Maud.

Roland had learned to see himself, theoretically, as a crossing-place for a number of systems, all loosely connected. He had been trained to see his idea of his “self” as an illusion, to be replaced by a discontinuous machinery and electrical message-network of various desires, ideological beliefs and responses, language forms and hormones and pheromones. Mostly he liked this. He had no desire for any strenuous Romantic self-assertion. (459)

But he mistakes that lack of desire for self-assertion as genuine, when in fact it is born of his theory-induced self-doubt. He will have to discover in himself that very desire to assert or express himself if he wants to escape his lifeless, menial occupation and end his sexless isolation. He and Maud both have to learn how to integrate their bodies and their desires into their conceptions of themselves.

Unfortunately, thinking about sex is even more fraught with exhausting political implications for Byatt’s scholars than thinking about the self. While on a trek to retrace the steps they believe LaMotte and Ash took in the hills of Yorkshire, Roland considers the writing of a psychoanalytic theorist. Disturbed, he asks Maud, “Do you never have the sense that our metaphors eat up our world?” (275). He goes on to explain that, no matter what they tried to discuss,

It all reduced like boiling jam to—human sexuality… And then, really, what is it, what is this arcane power we have, when we see everything is human sexuality? It’s really powerlessness… We are so knowing… Everything relates to us and so we’re imprisoned in ourselves—we can’t see things. (276)

The couple is coming to realize that they can in fact see things, the same things that the couple whose story they're tracking down saw over a century ago. This budding realization inspires in Roland an awareness of how limiting, even incapacitating, the dubious ideas of critical theorizing can be. Through the distorting prism of psychoanalysis, “Sexuality was like thick smoked glass; everything took on the same blurred tint through it. He could not imagine a pool with stones and water” (278).

The irony is that, for all the faux sophistication of psychoanalytic sexual terminology, it engenders in both Roland and Maud nothing but bafflement and aversion to actual sex. Roland highlights this paradox later, thinking,

They were children of a time and culture that mistrusted love, “in love,” romantic love, romance in toto, and which nevertheless in revenge proliferated sexual language, linguistic sexuality, analysis, dissection, deconstruction, exposure. (458)

Maud sums up the central problem when she says to Roland, “And desire, that we look into so carefully—I think all the looking-into has some very odd effects on the desire” (290). In that same scene, while still in Yorkshire trying to find evidence of LaMotte’s having accompanied Ash on his trip, the two modern scholars discover they share a fantasy, not a sexual fantasy, but one involving “An empty clean bed,” “An empty bed in an empty room,” and they wonder if “they’re symptomatic of whole flocks of exhausted scholars and theorists” (290-1).

Guided by their intense desire to be possessed by the two poets of the previous century, Maud and Roland try to imagine how they would have seen the world, and in so doing they try to imagine what it would be like not to believe in the poststructuralist and psychoanalytic theories they’ve been inculcated with. At first Maud tells Roland, “We live in the truth of what Freud discovered. Whether or not we like it. However we’ve modified it. We aren’t really free to suppose—to imagine—he could possibly have been wrong about human nature” (276). But after they’ve discovered a cave with a pool whose reflected light looks like white fire, a metaphor that both LaMotte and Ash used in poems written around the time they would’ve come to that very place, prompting Maud to proclaim, “She saw this. I’m sure she saw this” (289), the two begin trying in earnest to imagine what it would be like to live without their theories. Maud explains to Roland,

We know all sorts of things, too—about how there isn’t a unitary ego—how we’re made up of conflicting, interacting systems of things—and I suppose we believe that? We know we’re driven by desire, but we can’t see it as they did, can we? We never say the word Love, do we—we know it’s a suspect ideological construct—especially Romantic Love—so we have to make a real effort of imagination to know what it felt like to be them, here, believing in these things—Love—themselves—that what they did mattered—(290)

       Though many critics have pointed out how the affair between LaMotte and Ash parallels the one between Maud and Roland, in some ways the trajectories of the two relationships run in opposite directions. For instance, LaMotte leaves Ash as even more of a “chilly mortal” (310) than she was when she first met him. It turns out the term derives from a Mrs. Cammish, who lodged LaMotte and Ash while they were on their trip, and was handed down to the Lady Bailey, Maud’s relative, who applies it to her in a conversation with Roland. And whereas the ultimate falling out between LaMotte and Ash comes in the wake of Ash exposing a spiritualist, whose ideas and abilities LaMotte had invested a great deal of faith in, as a fraud, Roland’s counterpart disillusionment, his epiphany that literary theory as he has learned it is a fraud, is what finally makes the consummation of his relationship with Maud possible. Maud too has to overcome, to a degree, her feminist compunctions to be with Roland. Noting how this chilly mortal is warming over the course of their quest, Roland thinks how, “It was odd to hear Maud Bailey talking wildly of madness and bliss” (360). But at last she lets her hair down.

Sabine’s journal of the time her cousin Christabel stayed with her and her father on the Brittany coast, where she’d sought refuge after discovering she was pregnant, offers Roland and Maud a glimpse at how wrongheaded it can be to give precedence to their brand of critical reading over what they would consider a more primitive approach. Ironically, it is the young aspiring writer who gives them this glimpse as she chastises her high-minded poet cousin for her attempts to analyze and explain the meanings of the myths and stories she’s grown up with. “The stories come before the meanings,” Sabine insists to Christabel. “I do not believe all these explanations. They diminish. The idea of Woman is less than brilliant Vivien, and the idea of Merlin will not allegorise into male wisdom. He is Merlin” (384). These words come from the same young woman whom LaMotte earlier credited for her persistence “in undermining facile illusions” (377).

Readers of Byatt’s novel, though not Maud and Roland, both of whom likely already know of the episode, learn how Ash attended a séance and, reaching up to grab a supposedly levitating wreath, revealed it to be attached to a set of strings connected to the spiritualist. In a letter to Ruskin, which another modern scholar reads for Byatt’s readers, Ash expresses his outrage that someone would exploit the credulity and longing of the bereaved, especially mothers who’ve lost children. “If this is fraud, playing on a mother’s harrowed feelings, it is wickedness indeed” (423). He also wonders what the ultimate benefit would be if spiritualist studies into other realms proved to be valid. “But if it were so, if the departed spirits were called back—what good does it do? Were we meant to spend our days sitting and peering into the edge of the shadows?” (422). LaMotte and Ash part ways for good after his exposure of the spiritualist as a charlatan because she is so disturbed by the revelation. And, for the reader, the interlude serves as a reminder of past follies that today are widely acknowledged to have depended on trickery and impassioned credulity. So it might be for the ideas of Freud and Derrida and Lacan.

Roland arrives at the conclusion that this is indeed the case. Having been taught that language is inadequate and only speaks itself, he gradually comes to realize that this idea is nonsense. Reflecting on how he was taught that language couldn’t speak about what really existed in the world, he suddenly realizes that he’s been disabused of the idea. “What happened to him was that the ways in which it could be said had become more interesting than the idea that it could not” (513). He has learned through his quest to discover what had occurred between LaMotte and Ash that “It is possible for a writer to make, or remake at least, for a reader, the primary pleasures of eating, or drinking, or looking on, or sex.” People’s minds do in fact “hook together,” as he’d observed earlier, and they do it through language. The novel’s narrator intrudes to explain here near the end of the book what Roland is coming to understand.

Now and then there are readings that make the hairs on the neck, the non-existent pelt, stand on end and tremble, when every word burns and shines hard and clear and infinite and exact, like stones of fire, like points of stars in the dark—readings when the knowledge that we shall know the writing differently or better or satisfactorily, runs ahead of any capacity to say what we know, or how. In these readings, a sense that the text has appeared to be wholly new, never before seen, is followed, almost immediately, by the sense that it was always there, that we the readers, knew it was always there, and have always known it was as it was, though we have now for the first time recognised, become fully cognisant of, our knowledge. (512) (Neuroscientists agree.)

The recognition the narrator refers to—which Roland is presumably experiencing in the scene—is of a shared human nature, and shared human experience, the notions of which are considered by most literary critics to be politically reactionary.

Though he earlier claimed to have no desire to assert himself, Roland discovers he has a desire to write poetry. He decides to turn away from literary scholarship altogether and become a poet. He also asserts himself by finally taking charge and initiating sex with Maud.

And very slowly and with infinite gentle delays and delicate diversions and variations of indirect assault Roland finally, to use an outdated phrase, entered and took possession of all her white coolness that grew warm against him, so that there seemed to be no boundaries, and he heard, towards dawn, from a long way off, her clear voice crying out, uninhibited, unashamed, in pleasure and triumph. (551)

This is in fact, except for a postscript focusing on Ash, the final scene of the novel, and it represents Roland’s total, and Maud’s partial, transcendence of the theories and habits that hitherto made their lives so barren and lonely.

Read part 2

Related posts:

Read

POSTSTRUCTURALISM: BANAL WHEN IT'S NOT BUSY BEING ABSURD

Read

CAN’T WIN FOR LOSING: WHY THERE ARE SO MANY LOSERS IN LITERATURE AND WHY IT HAS TO CHANGE

Or

SABBATH SAYS: PHILIP ROTH AND THE DILEMMAS OF IDEOLOGICAL CASTRATION

Dennis Junk

Who Needs Complex Narratives? : Tim Parks' Enlightened Cynicism

Identities can be burdensome, as Parks intimates his is when he reveals his story has brought him to a place where he’s making a living engaging in an activity that serves no need—and yet he can’t bring himself to seek out other employment. In an earlier, equally fascinating post titled "The Writer's Job," Parks interprets T.S. Eliot's writing about writers as suggesting that "only those who had real personality, special people like himself, would appreciate what a burden personality was and wish to shed it."

One of my professors asked our class last week how many of us were interested in writing fiction of our own. She was trying to get us to consider the implications of using one strategy for telling a story based on your own life over another. But I was left thinking instead about the implications of nearly everyone in the room raising a hand. Is the audience for any aspiring author’s work composed exclusively of other aspiring authors? If so, does that mean literature is no more than an exclusive society of the published and consumed, forever screening would-be initiates, forever dangling the prize of admission to their ranks, allowing only the elite to enter, and effectively sealed off from the world of the non-literary?

Most of our civilization has advanced beyond big books. People still love their stories, but everyone’s time is constrained, and the choices of entertainment are infinite. Reading The Marriage Plot is an extravagance. Reading Of Human Bondage, the book we’re discussing in my class, is only for students of college English and the middle-class white guys trying to impress them. Nevertheless, Jonathan Franzen, who’s written two lengthy, too lengthy works of fiction that enjoy a wide readership, presumably made up primarily of literary aspirants like me (I read and enjoyed both), told an Italian interviewer that “There is an enormous need for long, elaborate, complex stories, such as can only be written by an author concentrating alone, free from the deafening chatter of Twitter.”

British author Tim Parks quotes Franzen in a provocative post at The New York Review of Books titled “Do We Need Stories?” Parks notes that “as a novelist it is convenient to think that by the nature of the job one is on the side of the good, supplying an urgent and general need.” Though he’s written some novels of his own, and translated several others from Italian to English, Parks suspects that Franzen is wrong, that as much as we literary folk may enjoy them, we don’t really need complex narratives. We should note that just as Franzen is arguing on behalf of his own vocation Parks is arguing against his, thus effecting a type of enlightened cynicism toward his own work and that of others in the same field. “Personally,” he says, “I fear I’m too enmired in narrative and self narrative to bail out now. I love an engaging novel, I love a complex novel; but I am quite sure I don’t need it.”

         Parks’ argument is fascinating for what it reveals about what many fiction writers and aficionados believe they’re doing when they’re telling stories. It’s also fascinating for what it represents about authors and their attitudes toward writing. Parks rubs up against some profound insights, but then succumbs to some old-fashioned humanities nonsense. Recalling a time when he served as a judge for a literary award, Parks quotes the case made by a colleague on behalf of his or her favored work, which is excellent, it was insisted, “because it offers complex moral situations that help us get a sense of how to live and behave.” As life becomes increasingly complex, then, fraught with distractions like those incessant tweets, we need fictional accounts of complex moral dilemmas to help us train our minds to be equal to the task of living in the modern world. Parks points out two problems with this view: fiction isn’t the only source of stories, and behind all that complexity is the author’s take on the moral implications of the story’s events which readers must decide whether to accept or reject. We can’t escape complex moral dilemmas, so we may not really need any simulated training. And we have to pay attention lest we discover our coach has trained us improperly. The power of stories can, as Parks suggests, be “pernicious.” “In this view of things, rather than needing stories we need to learn how to smell out their drift and resist them.” (Yeah, but does anyone read Ayn Rand who isn't already convinced?)

But Parks doesn’t believe the true goal of either authors or readers is moral development or practical training. Instead, complex narratives give pleasure because they bolster our belief in complex selves. Words like God, angel, devil, and ghost, Parks contends, have to come with stories attached to them to be meaningful because they don’t refer to anything we can perceive. From this premise of one-word stories, he proceeds,

Arguably the most important word in the invented-referents category is “self.” We would like the self to exist perhaps, but does it really? What is it? The need to surround it with a lexical cluster of reinforcing terms—identity, character, personality, soul—all with equally dubious referents suggests our anxiety. The more words we invent, the more we feel reassured that there really is something there to refer to.

When my classmates and I raised our hands and acknowledged our shared desire to engage in the creative act of storytelling, what we were really doing, according to Parks, was expressing our belief in that fictional character we refer to reverentially as ourselves. 

One of the accomplishments of the novel, which as we know blossomed with the consolidation of Western individualism, has been to reinforce this ingenious invention, to have us believe more and more strongly in this sovereign self whose essential identity remains unchanged by all vicissitudes. Telling the stories of various characters in relation to each other, how something started, how it developed, how it ended, novels are intimately involved with the way we make up ourselves. They reinforce a process we are engaged in every moment of the day, self creation. They sustain the idea of a self projected through time, a self eager to be a real something (even at the cost of great suffering) and not an illusion.

Parks is just as much a product of that “Western individualism” as the readers he’s trying to enlighten as to the fictional nature of their essential being. As with his attempt at undermining the ultimate need for his own profession, there’s a quality of self-immolation in this argument—except of course there’s nothing, really, to immolate.

What exactly, we may wonder, is doing the reading, is so desperate to believe in its own reality? And why is that belief in its own reality so powerful that this thing, whatever it may be, is willing to experience great suffering to reinforce it? Parks suggests the key to the self is some type of unchanging and original coherence. So we like stories because we like characters who are themselves coherent and clearly delineated from other coherent characters.

The more complex and historically dense the stories are, the stronger the impression they give of unique and protracted individual identity beneath surface transformations, conversions, dilemmas, aberrations. In this sense, even pessimistic novels—say, J.M. Coetzee’s Disgrace—can be encouraging: however hard circumstances may be, you do have a self, a personal story to shape and live. You are a unique something that can fight back against all the confusion around. You have pathos.

In this author’s argument for the superfluity of authors, the centrality of pain and suffering to the story of the self is important to note. He makes the point even more explicit, albeit inadvertently, when he says, “If we asked the question of, for example, a Buddhist priest, he or she would probably tell us that it is precisely this illusion of selfhood that makes so many in the West unhappy.”

I don’t pretend to have all the questions surrounding our human fascination with narrative—complex and otherwise—worked out, but I do know Parks’ premise is faulty.

Unlike many professional scholars in the Humanities, Parks acknowledges that at least some words can refer to things in the world. But he goes wrong when he assumes that if there exists no physical object to refer to the word must have a fictional story attached to it. There is good evidence, for instance, that our notions of God and devils and spirits are not in fact based on stories, though stories clearly color their meanings. Our interactions with invisible beings are based on the same cognitive mechanisms that help us interact with completely visible fellow humans. What psychologists call theory of mind, our reading of intentions and mental states into others, likely extends into realms where no mind exists to have intentions and states. That’s where our dualistic philosophy comes from.

While Parks is right in pointing out that the words God and self don’t have physical referents—though most of us, I assume, think of our bodies as ourselves to some degree—he’s completely wrong in inferring these words only work as fictional narratives. People assume, wrongly, that God is a real being because they have experiences with him. In the same way, the self isn’t an object but an experience—and a very real experience. (Does the word fun have to come with a story attached?) The consistency across time and circumstance, the sense of unified awareness, these are certainly exaggerated at times. So too is our sense of transformation though, as anyone knows who’s discovered old writings from an earlier stage of life and thought, “Wow, I was thinking about the same stuff back then as I am now—even my writing style is similar!”

Parks is wrong too about so-called Western society, as pretty much everyone who uses that term is. It’s true that some Asian societies have a more collectivist orientation, but I’ve heard rumors that a few Japanese people actually enjoy reading novels. (The professor of the British Lit course I'm taking is Chinese.) Those Buddhist monks are deluded too. Ruut Veenhoven surveyed 43 nations in the early 1990s and discovered that as individualism increases, so too does happiness. “There is no pattern of diminishing returns,” Veenhoven writes. “This indicates that individualization has not yet passed its optimum.” What this means is that, assuming Parks is right in positing that novel-reading increases individualism, reading novels could make you happier. Unfortunately, a lot of high-brow, literary authors would bristle at this idea because it makes of their work less a heroic surveying of the abyss and more of a commodity.

Parks doesn’t see any meaningful distinction between self and identity, but psychologists would use the latter term to label his idea of a coherent self-story. Dan McAdams is the leading proponent of the idea that in addition to a unified and stable experience of ourselves we each carry with us a story whose central theme is our own uniqueness and how it developed. He writes in his book The Stories We Live By: Personal Myths and the Making of the Self that identity is “an inner story of the self that integrates the reconstructed past, perceived present, and anticipated future to provide a life with unity, purpose, and meaning.” But we don’t just tell these stories to ourselves, nor are we solely interested in our own story. One of the functions of identity is to make us seem compelling and attractive to other people. Parks, for instance, tells us the story of how he provides a service, writing and translating, he understands isn’t necessary to anyone. And, if you’re like me, at least for a moment, you’re impressed with his ability to shoulder the burden of this enlightened cynicism. He’s a bit like those Buddhist monks who go to such great lengths to eradicate their egos.

The insight that Parks never quite manages to arrive at is that suffering is integral to stories of the self. If my story of myself, my identity, doesn’t feature any loss or conflict, then it’s not going to be very compelling to anyone. But what’s really compelling are the identities which somehow manage to cause the self whose stories they are their own pain. Identities can be burdensome, as Parks intimates his is when he reveals his story has brought him to a place where he’s making a living engaging in an activity that serves no need—and yet he can’t bring himself to seek out other employment. In an earlier, equally fascinating post titled "The Writer's Job," Parks interprets T.S. Eliot's writing about writers as suggesting that "only those who had real personality, special people like himself, would appreciate what a burden personality was and wish to shed it."

If we don’t suffer for our identities, then we haven’t earned them. Without the pain of initiation, we don’t really belong. We’re not genuinely who we claim to be. We’re tourists. We’re poseurs. Mitt Romney, for instance, is thought to be an inauthentic conservative because he hasn’t shown sufficient willingness to lose votes—and possibly elections—for the sake of his convictions. We can’t help but assume equivalence between cost and value. If your identity doesn’t entail some kind of cost, well, then it’s going to come off as cheap. So a lot of people play up, or even fabricate, the suffering in their lives.

What about Parks’ question? Are complex narratives necessary? Maybe, like identities, the narratives we tell, as well as the narratives we enjoy, work as costly signals, so that the complexity of the stories you like serves as a reliable indication of the complexity of your identity. If you can truly appreciate a complex novel, you can truly appreciate a complex individual. Maybe our complicated modern civilization, even with its tweets and Kindles, is more a boon than a hindrance to complexity and happiness. What this would mean is that if two people on the subway realize they’re both reading the same complex narrative they can be pretty sure they’re compatible as friends or lovers. Either that, or they’re both English professors and they have no idea what’s going on, in which case they’re still compatible but they’ll probably hate each other regardless.

At least, that's the impression I get from David Lodge's Small World, the latest complex narrative assigned in my English Lit course taught by a professor from an Eastern society.

Also read:

WHAT IS A STORY? AND WHAT ARE YOU SUPPOSED TO DO WITH ONE?

And:

WHAT'S THE POINT OF DIFFICULT READING?

Dennis Junk

Life's White Machine: James Wood and What Doesn't Happen in Fiction

For James Wood, fiction is communion. This view has implications about what constitutes the best literature—all the elements from description to dialogue should work to further the dramatic development of the connection between reader and character.

No one is a better reader of literary language than James Wood. In his reviews, he conveys with grace and precision his uncanny feel for what authors set out to say, what they actually end up saying, and what any discrepancy might mean for their larger literary endeavor. He effortlessly and convincingly infers from the lurch of faulty lines the confusions and pretensions and lacunae in understanding of struggling writers. Some take steady aim at starkly circumscribed targets, his analysis suggests, while others, desperate to achieve some greater, more devastating impact, shoot wistfully into the clouds. He can even listen to the likes of Republican presidential candidate Rick Santorum and explain, with his seemingly eidetic knowledge of biblical history, what is really meant when the supposed Catholic uses the word steward.

As a critic, Wood’s ability to see character in narration and to find the author, with all his conceits and difficulties, in the character is often downright unsettling. For him there exists no divide between language and psychology—literature is the struggle of conflicted minds to capture the essence of experiences, their own and others’.

When Robert Browning describes the sound of a bird singing its song twice over, in order to ‘recapture/ The first fine careless rapture,’ he is being a poet, trying to find the best poetic image; but when Chekhov, in his story ‘Peasants,’ says that a bird’s cry sounded as if a cow had been locked up in a shed all night, he is being a fiction writer: he is thinking like one of his peasants. (24)

This is from Wood’s How Fiction Works. In the midst of a long paean to the power of free indirect style, the technique that allows the language of the narrator to bend toward and blend with the thoughts and linguistic style of characters—moving in and out of their minds—he deigns to mention, in a footnote, an actual literary theory, or rather Literary Theory. Wood likes Nabokov’s scene in the novel Pnin that has the eponymous professor trying to grasp a nutcracker in a sink full of dishes. The narrator awkwardly calls it a “leggy thing” as it slips through his grasp. “Leggy” conveys the image. “But ‘thing’ is even better, precisely because it is vague: Pnin is lunging at the implement, and what word in English better conveys a messy lunge, a swipe at verbal meaning, than ‘thing’?” (25) The vagueness makes of the psychological drama a contagion. There could be no symbol more immediately felt.

The Russian Formalists come into Wood’s discussion here. Their theory focused on metaphors that bring about an “estranging” or “defamiliarizing” effect. Wood would press them to acknowledge that this making strange of familiar objects and experiences works in the service of connecting the minds of the reader with the mind of the character—it’s anything but random:

But whereas the Russian Formalists see this metaphorical habit as emblematic of the way that fiction does not refer to reality, is a self-enclosed machine (such metaphors are the jewels of the author’s freakish, solipsistic art), I prefer the way that such metaphors, as in Pnin’s “leggy thing,” refer deeply to reality: because they emanate from the characters themselves, and are fruits of free indirect style. (26)

Language and words and metaphors, Wood points out, by their nature carry us toward something that is diametrically opposed to collapsing in on ourselves. Indeed, there is something perverse about the insistence of so many professional scholars devoted to the study of literature that the main thrust of language is toward some unacknowledged agenda of preserving an unjust status quo—with the implication that the only way to change the world is to torture our modes of expression, beginning with literature (even though only a tiny portion of most first world populations bother to read any).

For Wood, fiction is communion. This view has implications about what constitutes the best literature—all the elements from description to dialogue should work to further the dramatic development of the connection between reader and character. Wood even believes that the emphasis on “round” characters is overstated, pointing out that many of the most memorable—Jean Brodie, Mr. Biswas—are one-dimensional and unchanging. Nowhere in the table of contents of How Fiction Works, or even in the index, does the word plot appear. He does, however, discuss plot in his response to postmodernists’ complaints about realism. Wood quotes author Rick Moody:

It’s quaint to say so, but the realistic novel still needs a kick in the ass. The genre, with its epiphanies, its rising action, its predictable movement, its conventional humanisms, can still entertain and move us on occasion, but for me it’s politically and philosophically dubious and often dull. Therefore, it needs a kick in the ass.

Moody is known for a type of fiction that intentionally sabotages the sacred communion Wood sees as essential to the experience of reading fiction. He begins his response by unpacking some of the claims in Moody’s fussy pronouncement:

Moody’s three sentences efficiently compact the reigning assumptions. Realism is a “genre” (rather than, say, a central impulse in fiction-making); it is taken to be mere dead convention, and to be related to a certain kind of traditional plot, with predictable beginnings and endings; it deals in “round” characters, but softly and piously (“conventional humanisms”); it assumes that the world can be described, with a naively stable link between word and referent (“philosophically dubious”); and all this will tend toward a conservative or even oppressive politics (“politically… dubious”).

Wood begins the section following this analysis with a one-sentence paragraph: “This is all more or less nonsense” (224-5) (thus winning my devoted readership).

That “more or less” refers to Wood’s own frustrations with modern fiction. Conventions, he concedes, tend toward ossification, though a trope’s status as a trope, he maintains, doesn’t make it untrue. “I love you,” is the most clichéd sentence in English. That doesn’t nullify the experience of falling in love (236). Wood does believe, however, that realistic fiction is too eventful to live up to the label.

Reviewing Ben Lerner’s exquisite short novel Leaving the Atocha Station, Wood lavishes praise on the postmodernist poet’s first work of fiction. He writes of the author and his main character Adam Gordon,

Lerner is attempting to capture something that most conventional novels, with their cumbersome caravans of plot and scene and "conflict," fail to do: the drift of thought, the unmomentous passage of undramatic life. Several times in the book, he describes this as "that other thing, the sound-absorbent screen, life’s white machine, shadows massing in the middle distance… the texture of et cetera itself." Reading Tolstoy, Adam reflects that even that great master of the texture of et cetera itself was too dramatic, too tidy, too momentous: "Not the little miracles and luminous branching injuries, but the other thing, whatever it was, was life, and was falsified by any way of talking or writing or thinking that emphasized sharply localized occurrences in time." (98)

Wood is suspicious of plot, and even of those epiphanies whereby characters are rendered dynamic or three-dimensional or “round,” because he seeks in fiction new ways of seeing the world he inhabits according to how it might be seen by lyrically gifted fellow inhabitants. Those “cumbersome caravans of plot and scene and ‘conflict’" tend to be implausible distractions, forcing the communion into narrow confessionals, breaking the spell.

As a critic who has garnered wide acclaim from august corners conferring a modicum of actual authority, and one who's achieved something quite rare for public intellectuals, a popular following, Wood is (too) often criticized for his narrow aestheticism. Once he closes the door on goofy postmodern gimcrack, it remains closed to other potentially relevant, potentially illuminating cultural considerations—or so his detractors maintain. That popular following of his is, however, composed of a small subset of fiction readers. And the disconnect between consumers of popular fiction and the more literary New Yorker subscribers speaks not just to the cultural issue of declining literacy or growing apathy toward fictional writing but to the more fundamental question of why people seek out narratives, along with the question Wood proposes to address in the title of his book, how does fiction work?

While Wood communes with synesthetic flaneurs, many readers are looking to have their curiosity piqued, their questing childhood adventurousness revived, their romantic and nightmare imaginings played out before them. “If you look at the best of literary fiction," Benjamin Percy said in an interview with Joe Fassler,

you see three-dimensional characters, you see exquisite sentences, you see glowing metaphors. But if you look at the worst of literary fiction, you see that nothing happens. Somebody takes a sip of tea, looks out the window at a bank of roiling clouds and has an epiphany.

The scene Percy describes is even more eventful than what Lerner describes as “life’s white machine”—it features one of those damn epiphanies. But Percy is frustrated with heavy-handed plots too.

In the worst of genre fiction, you see hollow characters, you see transparent prose, you see the same themes and archetypes occurring from book to book. If you look at the best of genre fiction, you see this incredible desire to discover what happens next.

The interview is part of Fassler’s post on the Atlantic website, “How Zombies and Superheroes Conquered Highbrow Fiction.” Percy is explaining the appeal of a new class of novel.

So what I'm trying to do is get back in touch with that time of my life when I was reading genre, and turning the pages so quickly they made a breeze on my face. I'm trying to take the best of what I've learned from literary fiction and apply it to the best of genre fiction, to make a kind of hybridized animal.

Is it possible to balance the two impulses: the urge to represent and defamiliarize, to commune, on the one hand, and the urge to create and experience suspense on the other? Obviously, if the theme you’re taking on is the struggle with boredom or the meaningless wash of time—white machine reminds me of a washer—then an incident-rich plot can only be ironic.

The solution to the conundrum is that no life is without incident. Fiction’s subject has always been births, deaths, comings-of-age, marriages, battles. I’d imagine Wood himself is often in the mood for something other than idle reflection. Ian McEwan, whose Atonement provides Wood an illustrative example of how narration brilliantly captures character, is often taken to task for overplotting his novels. Citing Henry James in a New Yorker interview with Daniel Zalewski to the effect that novels have an obligation to “be interesting,” McEwan admits finding “most novels incredibly boring. It’s amazing how the form endures. Not being boring is quite a challenge.” And if he thinks most novels are boring he should definitely stay away from the short fiction that gets published in the New Yorker nowadays.

A further implication of Wood’s observation about narration’s capacity for connecting reader to character is that characters who live eventful lives should inhabit eventful narratives. This shifts the issue of plot back to the issue of character, so the question is not what types of things should or shouldn’t happen in fiction but rather what type of characters do we want to read about? And there’s no question that literary fiction over the last century has been dominated by a bunch of passive losers, men and women flailing desperately about before succumbing to societal or biological forces. In commercial fiction, the protagonists beat the odds; in literature, the odds beat the protagonists.

There’s a philosophy at play in this dynamic. Heroes are thought to lend themselves to a certain view of the world, where overcoming sickness and poverty and cultural impoverishment is more of a rite of passage than a real gauge of how intractable those impediments are for nearly everyone who faces them. If audiences are exposed to too many tales of heroism, then hardship becomes a prop in personal development. Characters overcoming difficulties trivializes those difficulties. Winston Smith can’t escape O’Brien and Room 101 or readers won’t appreciate the true threat posed by Big Brother. The problem is that the ascent of the passive loser and the fiction of acquiescence don’t exactly inspire reform-minded action either.

Adam Gordon, the narrator of Leaving the Atocha Station, is definitely a loser. He worries all day that he’s some kind of impostor. He’s whiny and wracked with self-doubt. But even he doesn’t sit around doing nothing. The novel is about his trip to Spain. He pursues women with mixed success. He does readings of his poetry. He witnesses a terrorist attack. And these activities and events are interesting, as James insisted they must be. Capturing the feel of uneventful passages of time may be a worthy literary ambition, but most people seek out fiction to break up periods of nothingness. It’s never the case in real life that nothing is happening anyway—we’re at every instance getting older. I for one don’t find the prospect of spending time with people or characters who just sit passively by as that happens all that appealing.

In a remarkably lame failure of a lampoon in Harper's, Colson Whitehead targets Wood's enthusiasm for Saul Bellow. And Bellow was indeed one of those impossibly good writers who could describe eating Corn Flakes and make it profound and amusing. Still, I'm a little suspicious of anyone who claims to enjoy (though enjoyment shouldn't be the only measure of literary merit) reading about the Bellow characters who wander around Chicago as much as reading about Henderson wandering around Africa.

  Henderson: I'm actually looking forward to the next opportunity I get to hang out with that crazy bastard.

Dennis Junk

The Adaptive Appeal of Bad Boys

From the intro to my master’s thesis where I explore the evolved psychological dynamics of storytelling and witnessing, with a special emphasis on the paradox that the most compelling characters are often less than perfect human beings. Why do audiences like Milton’s Satan, for instance? Why did we all fall in love with Tyler Durden from Fight Club? It turns out both of these characters give indications that they just may be more altruistic than they appear at first.

Excerpt from Hierarchies in Hell and Leaderless Fight Clubs: Altruism, Narrative Interest, and the Adaptive Appeal of Bad Boys, my master’s thesis

In a New York Times article published in the spring of 2010, psychologist Paul Bloom tells the story of a one-year-old boy’s remarkable response to a puppet show. The drama the puppets enacted began with a central character’s demonstration of a desire to play with a ball. After revealing that intention, the character rolls the ball to a second character who likewise wants to play and so rolls the ball back to the first. When the first character rolls the ball to a third, however, this puppet snatches it up and quickly absconds. The second, nice puppet and the third, mean one are then placed before the boy, who’s been keenly attentive to their doings, and they both have placed before them a few treats. The boy is now instructed by one of the adults in the room to take a treat away from one of the puppets. Most children respond to the instructions by taking the treat away from the mean puppet, and this particular boy is no different. He’s not content with such a meager punishment, though, and after removing the treat he proceeds to reach out and smack the mean puppet on the head.

Brief stage shows like the one featuring the nice and naughty puppets are part of an ongoing research program led by Karen Wynn, Bloom’s wife and colleague, and graduate student Kiley Hamlin at Yale University’s Infant Cognition Center. An earlier permutation of the study was featured on PBS’s Nova series The Human Spark (jump to chapter 5), which shows host Alan Alda looking on as an infant named Jessica attends to a puppet show with the same script as the one that riled the boy Bloom describes. Jessica is so tiny that her ability to track and interpret the puppets’ behavior on any level is impressive, but when she demonstrates a rudimentary capacity for moral judgment by reaching with unchecked joy for the nice puppet while barely glancing at the mean one, Alda—and Nova viewers along with him—can’t help but demonstrate his own delight. Jessica shows unmistakable signs of positive emotion in response to the nice puppet’s behaviors, and Alda in turn feels positive emotions toward Jessica. Bloom attests that “if you watch the older babies during the experiments, they don’t act like impassive judges—they tend to smile and clap during good events and frown, shake their heads and look sad during the naughty events” (6). Any adult witnessing the children’s reactions can be counted on to mirror these expressions and to feel delight at the babies’ incredible precocity.

The setup for these experiments with children is very similar to experiments with adult participants that assess responses to anonymously witnessed exchanges. In their research report, “Third-Party Punishment and Social Norms,” Ernst Fehr and Urs Fischbacher describe a scenario inspired by economic game theory called the Dictator Game. It begins with an experimenter giving a first participant, or player, a sum of money. The experimenter then explains to the first player that he or she is to propose a cut of the money to the second player. In the Dictator Game—as opposed to other similar game theory scenarios—the second player has no choice but to accept the cut from the first player, the dictator. The catch is that the exchange is being witnessed by a third party, the analogue of little Jessica or the head-slapping avenger in the Yale experiments. This third player is then given the opportunity to reward or punish the dictator. As Fehr and Fischbacher explain, “Punishment is, however, costly for the third party so a selfish third party will never punish” (3).
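The payoff structure behind Fehr and Fischbacher's claim can be sketched in a few lines. The specific numbers below (endowment, punishment cost, punishment impact) are illustrative assumptions of mine, not the parameters from their study; the point is only that any punishment the third party buys comes straight out of its own payoff.

```python
# A minimal sketch of the three-player Dictator Game payoffs described above.
# All numbers are illustrative assumptions, not Fehr and Fischbacher's values.

def dictator_game(offer, punishment_points, endowment=100, third_party_budget=50,
                  cost_per_point=1, impact_per_point=3):
    """Return (dictator, receiver, third_party) payoffs.

    The dictator keeps `endowment - offer`; the receiver must accept `offer`.
    The third party pays `cost_per_point` for each punishment point it assigns,
    and each point deducts `impact_per_point` from the dictator.
    """
    dictator = endowment - offer - punishment_points * impact_per_point
    receiver = offer
    third_party = third_party_budget - punishment_points * cost_per_point
    return dictator, receiver, third_party

# A purely selfish third party never punishes, since punishing only lowers
# its own payoff; yet real participants routinely pay to punish stingy offers.
print(dictator_game(offer=10, punishment_points=0))   # → (90, 10, 50)
print(dictator_game(offer=10, punishment_points=10))  # → (60, 10, 40)
```

The second call shows the "altruistic" accounting: the third party ends up ten units poorer for the sole privilege of knocking the dictator down thirty.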

It turns out, though, that adults, just like the infants in the Yale studies, are not selfish—at least not entirely. Instead, they readily engage in indirect, or strong, reciprocity. Evolutionary literary theorist William Flesch explains that “the strong reciprocator punishes and rewards others for their behavior toward any member of the social group, and not just or primarily for their interactions with the reciprocator” (21-2). According to Flesch, strong reciprocity is the key to solving what he calls “the puzzle of narrative interest,” the mystery of why humans so readily and eagerly feel “anxiety on behalf of and about the motives, actions, and experiences of fictional characters” (7). The human tendency toward strong reciprocity reaches beyond any third party witnessing an exchange between two others; as Alda, viewers of Nova, and even readers of Bloom’s article in the Times watch or read about Wynn and Hamlin’s experiments, they have no choice but to become participants in the experiments themselves, because their own tendency to reward good behavior with positive emotion and to punish bad behavior with negative emotion is automatically engaged. Audiences’ concern, however, is much less with the puppets’ behavior than with the infants’ responses to it.

The studies of social and moral development conducted at the Infant Cognition Center pull at people’s heartstrings because they demonstrate babies’ capacity to behave in a way that is expected of adults. If Jessica had failed to discern between the nice and the mean puppets, viewers probably would have readily forgiven her. When older people fail to make moral distinctions, however, those in a position to witness and appreciate that failure can be counted on to withdraw their favor—and may even engage in some type of sanctioning, beginning with unflattering gossip and becoming more severe if the immorality or moral complacency persists. Strong reciprocity opens the way for endlessly branching nth–order reciprocation, so not only will individuals be considered culpable for offenses they commit but also for offenses they passively witness. Flesch explains,

Among the kinds of behavior that we monitor through tracking or through report, and that we have a tendency to punish or reward, is the way others monitor behavior through tracking or through report, and the way they manifest a tendency to punish and reward. (50)

Failing to signal disapproval makes witnesses complicit. On the other hand, signaling favor toward individuals who behave altruistically simultaneously signals to others the altruism of the signaler. What’s important to note about this sort of indirect signaling is that it does not necessarily require the original offense or benevolent act to have actually occurred. People take a proclivity to favor the altruistic as evidence of altruism—even if the altruistic character is fictional. 

That infants less than a year old respond to unfair or selfish behavior with negative emotions—and a readiness to punish—suggests that strong reciprocity has deep evolutionary roots in the human lineage. Humans’ profound emotional engagement with fictional characters and fictional exchanges probably derives from a long history of adapting to challenges whose Darwinian ramifications were far more serious than any attempt to while away some idle afternoons. Game theorists and evolutionary anthropologists have a good idea what those challenges might have been: for cooperativeness or altruism to be established and maintained as a norm within a group of conspecifics, some mechanism must be in place to prevent the exploitation of cooperative or altruistic individuals by selfish and devious ones. Flesch explains,

Darwin himself had proposed a way for altruism to evolve through the mechanism of group selection. Groups with altruists do better as a group than groups without. But it was shown in the 1960s that, in fact, such groups would be too easily infiltrated or invaded by nonaltruists—that is, that group boundaries are too porous—to make group selection strong enough to overcome competition at the level of the individual or the gene. (5)

If, however, individuals given to trying to take advantage of cooperative norms were reliably met with slaps on the head—or with ostracism in the wake of spreading gossip—any benefits they (or their genes) might otherwise count on to redound from their selfish behavior would be much diminished. Flesch’s theory is “that we have explicitly evolved the ability and desire to track others and to learn their stories precisely in order to punish the guilty (and somewhat secondarily to reward the virtuous)” (21). Before strong reciprocity was driving humans to bookstores, amphitheaters, and cinemas, then, it was serving the life-and-death cause of ensuring group cohesion and sealing group boundaries against neighboring exploiters. 

Game theory experiments that have been conducted since the early 1980s have consistently shown that people are willing, even eager to punish others whose behavior strikes them as unfair or exploitative, even when administering that punishment involves incurring some cost for the punisher. Like the Dictator Game, the Ultimatum Game involves two people, one of whom is given a sum of money and told to offer the other participant a cut. The catch in this scenario is that the second player must accept the cut or neither player gets to keep any money. “It is irrational for the responder not to accept any proposed split from the proposer,” Flesch writes. “The responder will always come out better by accepting than vetoing” (31). What the researchers discovered, though, was that a line exists beneath which responders will almost always refuse the cut. “This means they are paying to punish,” Flesch explains. “They are giving up a sure gain in order to punish the selfishness of the proposer” (31). Game theorists call this behavior altruistic punishment because “the punisher’s willingness to pay this cost may be an important part in enforcing norms of fairness” (31). In other words, the punisher is incurring a cost to him or herself in order to ensure that selfish actors don’t have a chance to get a foothold in the larger, cooperative group. 
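Flesch's point about the "irrationality" of vetoing can be made concrete with a short sketch. The rejection threshold below is an illustrative assumption (observed thresholds vary across studies); the logic simply contrasts a payoff-maximizing responder with the behavior experimenters actually observe.

```python
# A minimal sketch of the Ultimatum Game logic described above. The fairness
# threshold is an illustrative assumption, not a figure from the literature.

def ultimatum_payoffs(offer, accepted, endowment=100):
    """Return (proposer, responder) payoffs; a veto leaves both with nothing."""
    if accepted:
        return endowment - offer, offer
    return 0, 0

def rational_responder(offer):
    """A pure payoff-maximizer accepts any positive offer."""
    return offer > 0

def human_like_responder(offer, threshold=25):
    """Observed behavior: reject (i.e., pay to punish) insultingly low offers."""
    return offer >= threshold

offer = 10
print(ultimatum_payoffs(offer, rational_responder(offer)))    # → (90, 10)
print(ultimatum_payoffs(offer, human_like_responder(offer)))  # → (0, 0)
```

The second line is Flesch's "paying to punish": the responder forfeits a sure gain of 10 in order to strip the selfish proposer of 90.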

The economic logic notwithstanding, it seems natural to most people that second players in Ultimatum Game experiments should signal their disapproval—or stand up for themselves, as it were—by refusing to accept insultingly meager proposed cuts. The cost of the punishment, moreover, can be seen as a symbol of various other types of considerations that might prevent a participant or a witness from stepping up or stepping in to protest. Discussing the Three-Player Dictator Game experiments conducted by Fehr and Fischbacher, Flesch points out that strong reciprocity is even more starkly contrary to any selfish accounting:

Note that the third player gets nothing out of paying to reward or punish except the power or agency to do just that. It is highly irrational for this player to pay to reward or punish, but again considerations of fairness trump rational self-interest. People do pay, and pay a substantial amount, when they think that someone has been treated notably unfairly, or when they think someone has evinced marked generosity, to affect what they have observed. (33)

Neuroscientists have even zeroed in on the brain regions that correspond to our suppression of immediate self-interest in the service of altruistic punishment, as well as those responsible for the pleasure we take in anticipating—though not in actually witnessing—free riders meeting with their just deserts (Knoch et al. 829; de Quervain et al. 1254). Outside of laboratories, though, the cost punishers incur can range from the risks associated with a physical confrontation to time and energy spent convincing skeptical peers a crime has indeed been committed.

Flesch lays out his theory of narrative interest in a book aptly titled Comeuppance: Costly Signaling, Altruistic Punishment, and Other Biological Components of Fiction. A cursory survey of mainstream fiction, in both blockbuster movies and best-selling novels, reveals the good guys versus bad guys dynamic as preeminent in nearly every plot, and much of the pleasure people get from the most popular narratives can quite plausibly be said to derive from the goodie prevailing—after a long, harrowing series of close calls and setbacks—while the baddie simultaneously gets his or her comeuppance. Audiences love to see characters get their just deserts. When the plot fails to deliver on this score, they walk away severely disturbed. That disturbance can, however, serve the author’s purposes, particularly when the goal is to bring some danger or injustice to readers’ or viewers’ attention, as in the case of novels like Orwell’s 1984. Plots, of course, seldom feature simple exchanges with meager stakes on the scale of game theory experiments, and heroes can by no means count on making it to the final scene both vindicated and rewarded—even in stories designed to give audiences exactly what they want. The ultimate act of altruistic punishment, and hence the most emotionally poignant behavior a character can engage in, is martyrdom. It’s no coincidence that the hero dies in the act of vanquishing the villain in so many of the most memorable books and movies.

If narrative interest really does emerge out of a propensity to monitor each other’s behaviors for signs of a capacity for cooperation and to volunteer affect on behalf of altruistic individuals and against selfish ones they want to see get their comeuppance, the strong appeal of certain seemingly bad characters emerges as a mystery calling for explanation. From England’s tradition of Byronic heroes like Rochester to America’s fascination with bad boys like Tom Sawyer, these characters win over audiences and stand out as perennial favorites even though at first blush they seem anything but eager to establish their nice guy bona fides. On the other hand, Rochester was eventually redeemed in Jane Eyre, and Tom Sawyer, though naughty to be sure, shows no sign whatsoever of being malicious. Tellingly, though, these characters, and a long list of others like them, also demonstrate a remarkable degree of cleverness: Rochester passing for a gypsy woman, for instance, or Tom Sawyer making fence painting out to be a privilege. One hypothesis that could account for the appeal of bad boys is that their badness demonstrates undeniably their ability to escape the negative consequences most people expect to result from their own bad behavior.

This type of demonstration likely functions in a way similar to another mechanism that many evolutionary biologists theorize must have been operating for cooperation to have become established in human societies, a process referred to as the handicap principle, or costly signaling. A lone altruist in any group is unlikely to fare well in terms of survival and reproduction. So the question arises as to how the minimum threshold of cooperators in a population was first surmounted. Flesch’s fellow evolutionary critic, Brian Boyd, in his book On the Origin of Stories, traces the process along a path from mutualism, or coincidental mutual benefits, to inclusive fitness, whereby organisms help others who are likely to share their genes—primarily family members—to reciprocal altruism, a quid pro quo arrangement in which one organism will aid another in anticipation of some future repayment (54-57). However, a few individuals in our human ancestry must have benefited from altruism that went beyond familial favoritism and tit-for-tat bartering.

In their classic book The Handicap Principle, Amotz and Avishag Zahavi suggest that altruism serves a function in cooperative species similar to the one served by a peacock’s feathers. The principle could also help account for the appeal of human individuals who routinely risk suffering consequences which deter most others. The idea is that conspecifics have much to gain from accurate assessments of each other’s fitness when choosing mates or allies. Many species have thus evolved methods for honestly signaling their fitness, and as the Zahavis explain, “in order to be effective, signals have to be reliable; in order to be reliable, signals have to be costly” (xiv). Peacocks, the iconic examples of the principle in action, signal their fitness with cumbersome plumage because their ability to survive in spite of the handicap serves as a guarantee of their strength and resourcefulness. Flesch and Boyd, inspired by evolutionary anthropologists, find in this theory of costly signaling the solution to the mystery of how altruism first became established; human altruism is, if anything, even more elaborate than the peacock’s display.

Humans display their fitness in many ways. Not everyone can be expected to have the wherewithal to punish free-riders, especially when doing so involves physical conflict. The paradoxical result is that humans compete for the status of best cooperator. Altruism is a costly signal of fitness. Flesch explains how this competition could have emerged in human populations:

If there is a lot of between-group competition, then those groups whose modes of costly signaling take the form of strong reciprocity, especially altruistic punishment, will outcompete those whose modes yield less secondary gain, especially less secondary gain for the group as a whole. (57)

Taken together, the evidence Flesch presents suggests the audiences of narratives volunteer affect on behalf of fictional characters who show themselves to be altruists and against those who show themselves to be selfish actors or exploiters, experiencing both frustration and delight in the unfolding of the plot as they hope to see the altruists prevail and the free-riders get their comeuppance. This capacity for emotional engagement with fiction likely evolved because it serves as a signal to anyone monitoring individuals as they read or view the story, or as they discuss it later, that they are disposed either toward altruistic punishment or toward third-order free-riding themselves—and altruism is a costly signal of fitness.
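The honesty condition at the heart of the handicap principle can be illustrated with a toy model. The fitness scores and signal cost below are my own invented numbers, not anything from Flesch or the Zahavis; the sketch only shows why a signal stays reliable when its cost exceeds what low-fitness individuals can afford to pay.

```python
# A toy illustration of costly signaling: the signal is honest precisely
# because only high-fitness individuals can pay its cost and still thrive.
# All names and numbers here are illustrative assumptions.

def can_afford_signal(fitness, signal_cost=50):
    """An individual signals only if it can absorb the cost and survive."""
    return fitness - signal_cost > 0

population = {"strong": 80, "average": 40, "weak": 20}
signalers = {name for name, fit in population.items() if can_afford_signal(fit)}
print(signalers)  # → {'strong'}
```

If the cost were trivial, "weak" could fake the display and the signal would carry no information; the handicap is what keeps it trustworthy, whether the display is plumage or conspicuous, expensive punishment of cheats.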

The hypothesis emerging from this theory of social monitoring and volunteered affect to explain the appeal of bad boy characters is that their bad behavior will tend to redound to the detriment of still worse characters. Bloom describes the results of another series of experiments with eight-month-old participants:

When the target of the action was itself a good guy, babies preferred the puppet who was nice to it. This alone wasn’t very surprising, given that the other studies found an overall preference among babies for those who act nicely. What was more interesting was what happened when they watched the bad guy being rewarded or punished. Here they chose the punisher. Despite their overall preference for good actors over bad, then, babies are drawn to bad actors when those actors are punishing bad behavior. (5)

These characters’ bad behavior will also likely serve an obvious function as costly signaling; they’re bad because they’re good at getting away with it. Evidence that the bad boy characters are somehow truly malicious—for instance, clear signals of a wish to harm innocent characters—or that they’re irredeemable would severely undermine the theory. As the first step toward a preliminary survey, the following sections examine two infamous instances in which literary characters whom their creators intended audiences to recognize as bad nonetheless managed to steal the show from the supposed good guys.



Campaigning Deities: Justifying the ways of Satan

Why do readers tend to admire Satan in Milton’s Paradise Lost? It’s one of the instances where a nominally bad character garners more attention and sympathy than the good guy, a conundrum I researched through an evolutionary lens as part of my master’s thesis.

[Excerpt from Hierarchies in Hell and Leaderless Fight Clubs: Altruism, Narrative Interest, and the Adaptive Appeal of Bad Boys, my master’s thesis]

Milton believed Christianity more than worthy of a poetic canon in the tradition of the classical poets, and Paradise Lost represents his effort at establishing one. What his Christian epic has offered many readers over the centuries, however, is an invitation to weigh the actions and motivations of immortals in mortal terms. In the story, God becomes a human king, albeit one with superhuman powers, while Satan becomes an upstart subject. As Milton attempts to “justify the ways of God to Man,” he is taking it upon himself simultaneously, and inadvertently, to justify the absolute dominion of a human dictator. One of the consequences of this shift in perspective is the transformation of a philosophical tradition devoted to parsing the logic of biblical teachings into something akin to a political campaign between two rival leaders, each laying out his respective platform alongside a case against his rival. What was hitherto recondite and academic becomes in Milton’s work immediate and visceral.

Keats famously penned the wonderfully self-proving postulate, “Axioms in philosophy are not axioms until they are proved upon our pulses,” which leaves open the question of how an axiom might be so proved. Milton’s God responds to Satan’s approach to Earth, and his foreknowledge of Satan’s success in tempting the original pair, with a preemptive defense of his preordained punishment of Man:

…Whose fault?

Whose but his own? Ingrate! He had of Me

All he could have. I made him just and right,

Sufficient to have stood though free to fall.

Such I created all th’ ethereal pow’rs

And spirits, both them who stood and who failed:

Freely they stood who stood and fell who fell.

Not free, what proof could they have giv’n sincere

Of true allegiance, constant faith or love

Where only what they needs must do appeared,

Not what they would? What praise could they receive?

What pleasure I from such obedience paid

When will and reason… had served necessity,

Not me? (3.96-111)

God is defending himself against the charge that his foreknowledge of the fall implies that Man’s decision to disobey was born of something other than his free will. What choice could there have been if the outcome of Satan’s temptation was predetermined? If it wasn’t predetermined, how could God know what the outcome would be in advance? God’s answer—of course I granted humans free will because otherwise their obedience would mean nothing—only introduces further doubt. Now we must wonder why God cherishes Man’s obedience so fervently. Is God hungry for political power? If we conclude he is—and that conclusion seems eminently warranted—then we find ourselves on the side of Satan. It’s not so much God’s foreknowledge of Man’s fall that undermines human freedom; it’s God’s insistence on our obedience, under threat of terrible punishment.

Milton faces a still greater challenge in his attempt to justify God’s ways “upon our pulses” when it comes to the fallout of Man’s original act of disobedience. The Son argues on behalf of Man, pointing out that the original sin was brought about through temptation. If God responds by turning against Man, then Satan wins. The Son thus argues that God must do something to thwart Satan: “Or shall the Adversary thus obtain/ His end and frustrate Thine?” (3.156-7). Before laying out his plan for Man’s redemption, God explains why punishment is necessary:

…Man disobeying

Disloyal breaks his fealty and sins

Against the high supremacy of Heav’n,

Affecting godhead, and so, losing all,

To expiate his treason hath naught left

But to destruction sacred and devote

He with his whole posterity must die. (3.203-9)

The potential contradiction between foreknowledge and free choice may be abstruse enough for Milton’s character to convincingly discount: “If I foreknew/ Foreknowledge had no influence on their fault/ Which had no less proved certain unforeknown” (3.116-9). There is another contradiction, however, that Milton neglects to take on. If Man is “Sufficient to have stood though free to fall,” then God must justify his decision to punish the “whole posterity” as opposed to the individuals who choose to disobey. The Son agrees to redeem all of humanity for the offense committed by the original pair. His knowledge that every last human will disobey may not be logically incompatible with their freedom to choose; if every last human does disobey, however, the case for that freedom is severely undermined. The axiom of collective guilt precludes the axiom of freedom of choice both logically and upon our pulses.

In characterizing disobedience as a sin worthy of severe punishment—banishment from paradise, shame, toil, death—an offense he can generously expiate for Man by sacrificing the (his) Son, God seems to be justifying his dominion by pronouncing disobedience to him evil, allowing him to claim that Man’s evil made it necessary for him to suffer a profound loss, the death of his offspring. In place of a justification for his rule, then, God resorts to a simple guilt trip.

Man shall not quite be lost but saved who will,

Yet not of will in him but grace in me

Freely vouchsafed. Once more I will renew

His lapsed pow’rs though forfeit and enthralled

By sin to foul exorbitant desires.

Upheld by me, yet once more he shall stand

On even ground against his mortal foe,

By me upheld that he may know how frail

His fall’n condition is and to me owe

All his deliv’rance, and to none but me. (3.173-83)

Having decided to take on the burden of repairing the damage wrought by Man’s disobedience to him, God explains his plan:

Die he or justice must, unless for him

Some other as able and as willing pay

The rigid satisfaction, death for death. (3.210-3)

He then asks for a volunteer. In an echo of an earlier episode in the poem which has Satan asking for a volunteer to leave hell on a mission of exploration, there is a moment of hesitation before the Son offers himself up to die on Man’s behalf.

…On Me let thine anger fall.

Account Me Man. I for his sake will leave

Thy bosom and this glory next to Thee

Freely put off and for him lastly die

Well pleased. On Me let Death wreck all his rage! (3.237-42)

This great sacrifice, which is supposed to be the basis of the Son’s privileged status over the angels, is immediately undermined because he knows he won’t stay dead for long: “Yet that debt paid/ Thou wilt not leave me in the loathsome grave” (246-7). The Son will only die momentarily. This sacrifice doesn’t stack up well against the real risks and sacrifices made by Satan.

All the poetry about obedience and freedom and debt never takes on the central question Satan’s rebellion forces readers to ponder: Does God deserve our obedience? Or are the labels of good and evil applied arbitrarily? The original pair was forbidden from eating from the Tree of Knowledge—could they possibly have been right to contravene the interdiction? Since it is God being discussed, however, the assumption that his dominion requires no justification, that it is instead simply in the nature of things, might prevail among some readers, as it does for the angels who refuse to join Satan’s rebellion. The angels, after all, owe their very existence to God, as Abdiel insists to Satan. Who, then, are any of them to question his authority? This argument sets the stage for Satan’s remarkable rebuttal:

…Strange point and new!

Doctrine which we would know whence learnt: who saw

When this creation was? Remember’st thou

Thy making while the Maker gave thee being?

We know no time when we were not as now,

Know none before us, self-begot, self-raised

By our own quick’ning power…

Our puissance is our own. Our own right hand

Shall teach us highest deeds by proof to try

Who is our equal. (5.855-66)

Just as a pharaoh could claim credit for all the monuments and infrastructure he had commissioned the construction of, any king or dictator might try to convince his subjects that his deeds far exceed what he is truly capable of. If there’s no record and no witness—or if the records have been doctored and the witnesses silenced—the subjects have to take the king’s word for it.

That God’s dominion depends on some natural order, which he himself presumably put in place, makes his tendency to protect knowledge deeply suspicious. Even the angels ultimately have to take God’s claims to have created the universe and them along with it solely on faith. Because that same unquestioning faith is precisely what Satan and the readers of Paradise Lost are seeking a justification for, they could be forgiven for finding the answer tautological and unsatisfying. It is the Tree of Knowledge of Good and Evil that Adam and Eve are forbidden to eat fruit from. When Adam, after hearing Raphael’s recounting of the war in heaven, asks the angel how the earth was created, he does receive an answer, but only after a suspicious preamble:

…such commission from above

I have received to answer thy desire

Of knowledge with bounds. Beyond abstain

To ask nor let thine own inventions hope

Things not revealed which the invisible King

Only omniscient hath suppressed in night,

To none communicable in Earth or Heaven:

Enough is left besides to search and know. (7.118-125)

Raphael goes on to compare knowledge to food, suggesting that excessively indulging curiosity is unhealthy. This proscription of knowledge reminded Shelley of the Prometheus myth. It might remind modern readers of The Wizard of Oz—“Pay no attention to that man behind the curtain”—or of the space monkeys in Fight Club, who repeatedly remind us that “The first rule of Project Mayhem is, you do not ask questions.” It may also resonate with news about dictators in Asia or the Middle East trying desperately to keep social media outlets from spreading word of their atrocities.

Like the protesters of the Arab Spring, Satan is putting himself at great risk by challenging God’s authority. If God’s dominion over Man and the angels is evidence not of his benevolence but of his supreme selfishness, then Satan’s rebellion becomes an attempt at altruistic punishment. The extrapolation from economic experiments like the ultimatum and dictator games to efforts to topple dictators may seem like a stretch, especially if humans are predisposed to forming and accepting positions in hierarchies, as a casual survey of virtually any modern organization suggests is the case.

Organized institutions, however, are a recent development in terms of human evolution. The English missionary Lucas Bridges wrote about his experiences with the Ona foragers in Tierra del Fuego in his 1948 book Uttermost Part of the Earth, and he expresses his amusement at his fellow outsiders’ befuddlement when they learn about the Ona’s political dynamics:

A certain scientist visited our part of the world and, in answer to his inquiries on this matter, I told him that the Ona had no chieftains, as we understand the word. Seeing that he did not believe me, I summoned Kankoat, who by that time spoke some Spanish. When the visitor repeated his question, Kankoat, too polite to answer in the negative, said: “Yes, señor, we, the Ona, have many chiefs. The men are all captains and all the women are sailors” (quoted in Boehm 62).

At least among Ona men, it seems there was no clear hierarchy. The anthropologist Richard Lee discovered a similar dynamic operating among the !Kung foragers of the Kalahari. In order to ensure that no one in the group can attain an elevated status which would allow him to dominate the others, several leveling mechanisms are in place. Lee quotes one of his informants:

When a young man kills much meat, he comes to think of himself as a chief or a big man, and he thinks of the rest of us as his servants or inferiors. We can’t accept this. We refuse one who boasts, for someday his pride will make him kill somebody. So we always speak of his meat as worthless. In this way we cool his heart and make him gentle. (quoted in Boehm 45)

These examples of egalitarianism among nomadic foragers are part of anthropologist Christopher Boehm’s survey of every known group of hunter-gatherers. His central finding is that “A distinctively egalitarian political style is highly predictable wherever people live in small, locally autonomous social and economic groups” (35-36). This finding bears on any discussion of human evolution and human nature because small groups like these constituted the whole of humanity for all but what amounts to the final instants of geological time.

Also read:

THE ADAPTIVE APPEAL OF BAD BOYS

SYMPATHIZING WITH PSYCHOS: WHY WE WANT TO SEE ALEX ESCAPE HIS FATE AS A CLOCKWORK ORANGE

THE PEOPLE WHO EVOLVED OUR GENES FOR US: CHRISTOPHER BOEHM ON MORAL ORIGINS – PART 3 OF A CRASH COURSE IN MULTILEVEL SELECTION THEORY


Seduced by Satan

As long as the group they belong to is small enough for each group member to monitor the actions of the others, people can maintain strict egalitarianism, giving up whatever dominance they may desire for the assurance of not being dominated themselves. Satan very likely speaks to this natural ambivalence in humans. Benevolent leaders win our love and admiration through their selflessness and charisma. But no one wants to be a slave.

[Excerpt from Hierarchies in Hell and Leaderless Fight Clubs: Altruism, Narrative Interest, and the Adaptive Appeal of Bad Boys, my master’s thesis]

Why do we like the guys who seem not to care whether or not what they’re doing is right, but who often manage to do what’s right anyway? In the Star Wars series, Han Solo is introduced as a mercenary, concerned only with monetary reward. In the first episode of Mad Men, audiences see Don Draper telling a woman that they should get married, and then in the final scene he arrives home to his actual wife. Tony Soprano, Jack Sparrow, Tom Sawyer: the list of male characters who flout rules and conventions, who lie, cheat, and steal, but who nevertheless compel the attention, the favor, even the love of readers and moviegoers would be difficult to exhaust.

John Milton has been accused of both betraying his own and inspiring others' sympathy and admiration for what should be the most detestable character imaginable. When he has Satan, in Paradise Lost, say, “Better to reign in hell than serve in heaven,” many believed he was signaling his support of the king of England’s overthrow. Regicidal politics are well and good—at least from the remove of many generations—but voicing your opinions through such a disreputable mouthpiece? That’s difficult to defend. Imagine using a fictional Hitler to convey your stance on the current president.

Stanley Fish theorizes that Milton’s game was a much subtler one: he didn’t intend for Satan to be sympathetic so much as seductive, so that in being persuaded and won over to him readers would be falling prey to the same temptation that brought about the fall. As humans, all our hearts are marked with original sin. So if many readers of Milton’s magnum opus come away thinking Satan may have been in the right all along, the failure wasn’t the author’s unconstrained admiration for the rebel angel so much as it was his inability to adequately “justify the ways of God to men.” God’s ways may follow a certain logic, but the appeal of Satan’s ways is deeper, more primal.

In the “Argument,” or summary, prefacing Book Three, Milton relays some of God’s logic: “Man hath offended the majesty of God by aspiring to godhead and therefore, with all his progeny devoted to death, must die unless someone can be found sufficient to answer for his offence and undergo his punishment.” The Son volunteers. This reasoning has been justly characterized as “barking mad” by Richard Dawkins. But the lines give us an important insight into what Milton saw as the principal failing of the human race, their ambition to be godlike. It is this ambition which allows us to sympathize with Satan, who incited his fellow angels to rebellion against the rule of God.

In Book Five, we learn that what provoked Satan to rebellion was God’s arbitrary promotion of his own Son to a status higher than the angels: “by Decree/ Another now hath to himself ingross’t/ All Power, and us eclipst under the name/ Of King anointed.” Citing these lines, William Flesch explains, “Satan’s grandeur, even if it is the grandeur of archangel ruined, comes from his iconoclasm, from his desire for liberty.” At the same time, however, Flesch insists that, “Satan’s revolt is not against tyranny. It is against a tyrant whose place he wishes to usurp.” So, it’s not so much freedom from domination he wants, according to Flesch, as the power to dominate.

Anthropologist Christopher Boehm describes the political dynamics of nomadic peoples in his book Hierarchy in the Forest: The Evolution of Egalitarian Behavior, and his descriptions suggest that parsing a motive of domination from one of preserving autonomy is much more complicated than Flesch’s analysis assumes. “In my opinion,” Boehm writes, “nomadic foragers are universally—and all but obsessively—concerned with being free from the authority of others” (68). As long as the group they belong to is small enough for each group member to monitor the actions of the others, people can maintain strict egalitarianism, giving up whatever dominance they may desire for the assurance of not being dominated themselves.

Satan very likely speaks to this natural ambivalence in humans. Benevolent leaders win our love and admiration through their selflessness and charisma. But no one wants to be a slave. Does Satan’s admirable resistance and defiance shade into narcissistic self-aggrandizement and an unchecked will to power? If so, is his tyranny any more savage than that of God? And might there even be something not altogether off-putting about a certain degree of self-indulgent badness?

Also read:

CAMPAIGNING DEITIES: JUSTIFYING THE WAYS OF SATAN

THE ADAPTIVE APPEAL OF BAD BOYS

SYMPATHIZING WITH PSYCHOS: WHY WE WANT TO SEE ALEX ESCAPE HIS FATE AS A CLOCKWORK ORANGE


T.J. Eckleburg Sees Everything: The Great God-Gap in Gatsby part 2 of 2

The simple explanation for Fitzgerald’s decision not to gratify his readers but rather to disappoint and disturb them is that he wanted his novel to serve as an indictment of the types of behavior that are encouraged by the social conditions he describes in the story, conditions which would have been easily recognizable to many readers of his day and which persist into the Twenty-First Century.

Read part 1

Though The Great Gatsby does indeed tell a story of punishment, readers are left with severe doubts as to whether those who receive punishment actually deserve it. Gatsby is involved in criminal activities, and he has an affair with a married woman. Myrtle likewise is guilty of adultery. But does either deserve to die? What about George Wilson? His is the only attempt in the novel at altruistic punishment. So natural is his impulse toward revenge, however, and so given are readers to take that impulse for granted, that its function in preserving a broader norm of cooperation requires explanation. Flesch describes a series of experiments in the field of game theory centering on an exchange called the ultimatum game. One participant is given a sum of money and told he or she must propose a split with a second participant, with the proviso that if the second person rejects the cut neither will get to keep anything. Flesch points out, however, that

It is irrational for the responder not to accept any proposed split from the proposer. The responder will always come out better by accepting than by vetoing. And yet people generally veto offers of less than 25 percent of the original sum. This means they are paying to punish. They are giving up a sure gain in order to punish the selfishness of the proposer. (31)
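
The payoff logic behind the experiment Flesch cites can be sketched in a few lines of Python. This is only an illustration of the game’s structure, not code from any of the studies; the function name and the 25 percent veto threshold (taken from the figure Flesch reports) are illustrative:

```python
def ultimatum(total, offer, threshold=0.25):
    """Return (proposer_payoff, responder_payoff).

    The responder vetoes any offer below `threshold` of the total,
    in which case neither player keeps anything.
    """
    if offer < threshold * total:
        return (0, 0)  # veto: both walk away empty-handed
    return (total - offer, offer)

# A 40% offer is accepted, and both players keep their shares...
print(ultimatum(100, 40))
# ...but a 10% offer is vetoed: the responder gives up a sure $10
# to punish the proposer's selfishness.
print(ultimatum(100, 10))
```

The second call is the crux of Flesch’s point: a purely self-interested responder would never return `(0, 0)` when accepting yields a positive payoff, so the veto is, in effect, a payment made to punish.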

To understand why George’s attempt at revenge is altruistic, consider that he had nothing to gain, from a purely selfish and rational perspective, and much to lose by killing the man he believed killed his wife. He was risking physical harm if a fight ensued. He was risking arrest for murder. Yet if he failed to seek revenge readers would likely see him as somehow less than human. His quest for justice, as futile and misguided as it is, would likely endear him to readers, were it not that they discover how futile and misguided it is before they even learn it has taken place. Readers, in fact, would probably respond more favorably toward George than any other character in the story, including the narrator. But the author deliberately prevents this outcome from occurring.

The simple explanation for Fitzgerald’s decision not to gratify his readers but rather to disappoint and disturb them is that he wanted his novel to serve as an indictment of the types of behavior that are encouraged by the social conditions he describes in the story, conditions which would have been easily recognizable to many readers of his day and which persist into the Twenty-First Century. Though the narrator plays the role of second-order free-rider, the author clearly signals his own readiness to punish by publishing his narrative about such bad behavior perpetrated by characters belonging to a particular group of people, a group corresponding to one readers might encounter outside the realm of fiction.

Fitzgerald makes it obvious in the novel that beyond Tom’s simple contempt for George there exist several more severe impediments to what biologists would call group cohesion but that most readers would simply refer to as a sense of community. The idea of a community as a unified entity whose interests supersede those of the individuals who make it up is something biological anthropologists theorize religion evolved to encourage. In his book Darwin’s Cathedral, in which he attempts to explain religion in terms of group selection theory, David Sloan Wilson writes:

A group of people who abandon self-will and work tirelessly for a greater good will fare very well as a group, much better than if they all pursue their private utilities, as long as the greater good corresponds to the welfare of the group. And religions almost invariably do link the greater good to the welfare of the community of believers, whether an organized modern church or an ethnic group for whom religion is thoroughly intermixed with the rest of their culture. Since religion is such an ancient feature of our species, I have no problem whatsoever imagining the capacity for selflessness and longing to be part of something larger than ourselves as part of our genetic and cultural heritage. (175)

One of the main tasks religious beliefs evolved to handle would have been addressing the same “free-rider problem” William Flesch discovers at the heart of narrative. What religion offers beyond the social monitoring of group members is the presence of invisible beings whose concerns are tied to the collective concerns of the group.

Obviously, Tom Buchanan’s sense of community has clear demarcations. “Civilization is going to pieces,” he warns Nick as prelude to his recommendation of a book titled “The Rise of the Coloured Empires.” “The idea,” Tom explains, “is that if we don’t look out the white race will be—will be utterly submerged” (17). “We’ve got to beat them down,” Daisy helpfully, mockingly chimes in (18). While this animosity toward members of other races seems immoral at first glance, in the social context the Buchanans inhabit it actually represents a concern for the broader group, “the white race.” But Tom’s animosity isn’t limited to other races. What prompts Catherine to tell Nick how her sister “can’t stand” her husband during the gathering in Tom and Myrtle’s apartment is in fact Tom’s ridiculing of George. In response to another character’s suggestion that he’d like to take some photographs of people in Long Island “if I could get the entry,” Tom jokingly insists to Myrtle that she should introduce the man to her husband. Laughing at his own joke, Tom imagines a title for one of the photographs: “‘George B. Wilson at the Gasoline Pump,’ or something like that” (37). Disturbingly, Tom’s contempt for George based on his lowly social status has contaminated Myrtle as well. Asked by her sister why she married George in the first place, she responds, “I married him because I thought he was a gentleman…I thought he knew something about breeding but he wasn’t fit to lick my shoe” (39). Her sense of superiority, however, is based on the artificial plan for her and Tom to get married.

That Tom’s idea of who belongs to his own superior community is determined more by “breeding” than by economic success—i.e. by birth and not accomplishment—is evidenced by his attitude toward Gatsby. In a scene that has Tom stopping with two friends, a husband and wife, at Gatsby’s mansion while riding horses, he is shocked when Gatsby shows an inclination to accept an invitation to supper extended by the woman, who is quite drunk. Both the husband and Tom show their disapproval. “My God,” Tom says to Nick, “I believe the man’s coming…Doesn’t he know she doesn’t want him?” (109). When Nick points out that the woman just said she did want him, Tom answers, “he won’t know a soul there.” Gatsby’s statement in the same scene that he knows Tom’s wife provokes him, as soon as Gatsby has left the room, to say, “By God, I may be old-fashioned in my ideas but women run around too much these days to suit me. They meet all kinds of crazy fish” (110). In a later scene that has Tom accompanying Daisy, with Nick in tow, to one of Gatsby’s parties, he asks, “Who is this Gatsby anyhow?... Some big bootlegger?” When Nick says he’s not, Tom says, “Well, he certainly must have strained himself to get this menagerie together” (114). Even when Tom discovers that Gatsby and Daisy are having an affair, he still doesn’t take Gatsby seriously. He calls Gatsby “Mr. Nobody from Nowhere” (137), and says, “I’ll be damned if I see how you got within a mile of her unless you brought the groceries to the back door” (138). Once he’s succeeded in scaring Daisy with suggestions of Gatsby’s criminal endeavors, Tom insists the two drive home together, saying, “I think he realizes that his presumptuous little flirtation is over” (142).

When George Wilson looks to the eyes of Dr. Eckleburg in supplication after that very car ride leads to Myrtle’s death, the fact that this “God” is an advertisement, a supplication in its own right to viewers on behalf of the optometrist to boost his business, symbolically implicates the substitution of markets for religion—or a sense of common interest—as the main factor behind Tom’s superciliously careless sense of privilege. The eyes seem such a natural stand-in for an absent God that it’s easy to take the symbolic logic for granted without wondering why George might mistake them as belonging to some sentient agent. Evolutionary psychologist Jesse Bering takes on that very question in The God Instinct: The Psychology of Souls, Destiny, and the Meaning of Life, where he cites research suggesting that “attributing moral responsibility to God is a sort of residual spillover from our everyday social psychology dealing with other people” (138). Bering theorizes that humans’ tendency to assume agency behind even random physical events evolved as a by-product of our profound need to understand the motives and intentions of our fellow humans: “When the emotional climate is just right, there’s hardly a shape or form that ‘evidence’ cannot assume. Our minds make meaning by disambiguating the meaningless” (99). In place of meaningless events, humans see intentional signs.

According to Bering’s theory, George Wilson’s intense suffering would have made him desperate for some type of answer to the question of why such tragedy has befallen him. After discussing research showing that suffering, as defined by societal ills like infant mortality and violent crime, and “belief in God were highly correlated,” Bering suggests that thinking of hardship as purposeful, rather than random, helps people cope because it allows them to place what they’re going through in the context of some larger design (139). What he calls “the universal common denominator” to all the permutations of religious signs, omens, and symbols, is the same cognitive mechanism, “theory of mind,” that allows humans to understand each other and communicate so effectively as groups. “In analyzing things this way,” Bering writes,

we’re trying to get into God’s head—or the head of whichever culturally constructed supernatural agent we have on offer… This is to say, just like other people’s surface behaviors, natural events can be perceived by us human beings as being about something other than their surface characteristics only because our brains are equipped with the specialized cognitive software, theory of mind, that enables us to think about underlying psychological causes. (79)

So George, in his bereaved and enraged state, looks at a billboard of a pair of eyes and can’t help imagining a mind operating behind them, one whose identity he’s learned to associate with a figure whose main preoccupation is the judgment of individual humans’ moral standings. According to both David Sloan Wilson and Jesse Bering, though, the deity’s obsession with moral behavior is no coincidence.

Covering some of the same game theory territory as Flesch, Bering points out that the most immediate purpose to which we put our theory of mind capabilities is to figure out how altruistic or selfish the people around us are. He explains that

in general, morality is a matter of putting the group’s needs ahead of one’s own selfish interests. So when we hear about someone who has done the opposite, especially when it comes at another person’s obvious expense, this individual becomes marred by our social judgment and grist for the gossip mills. (183)

Having arisen as a by-product of our need to monitor and understand the motives of other humans, religion would have been quickly co-opted in the service of solving the same free-rider problem Flesch finds at the heart of narratives. Alongside our concern for the reputations of others is a close guarding of our own reputations. Since humans are given to assuming agency is involved even in random events like shifts in weather, group cohesion could easily have been optimized with the subtlest suggestion that hidden agents engage in the same type of monitoring as other, fully human members of the group. Bering writes:

For many, God represents that ineradicable sense of being watched that so often flares up in moments of temptation—He who knows what’s in our hearts, that private audience that wants us to act in certain ways at critical decision-making points and that will be disappointed in us otherwise. (191)

Bering describes some of his own research that demonstrates this point. Coincident with the average age at which children begin to develop a theory of mind (around four), the children in his studies began responding to suggestions that they were being watched by an invisible agent—named Princess Alice in honor of Bering’s mother—by more frequently resisting the temptation to avail themselves of opportunities to cheat that were built into the experimental design of a game they were asked to play (Piazza et al. 311-20). An experiment with adult participants, who were this time told that the ghost of a dead graduate student had been seen in the lab, showed the same results; when competing in a game for fifty dollars, they were much less likely to cheat than others who weren’t told the ghost story (Bering 193).

Bering also cites a study that has even more immediate relevance to George Wilson’s odd behavior vis-à-vis Dr. Eckleburg’s eyes. In “Cues of Being Watched Enhance Cooperation in a Real-World Setting,” the authors describe an experiment in which they tested the effects of various pictures placed near an “honesty box,” where people were supposed to be contributing money in exchange for milk and tea. What they found is that when the pictures featured human eyes more people contributed more money than when they featured abstract patterns of flowers. They theorize that

images of eyes motivate cooperative behavior because they induce a perception in participants of being watched. Although participants were not actually observed in either of our experimental conditions, the human perceptual system contains neurons that respond selectively to stimuli involving faces and eyes…, and it is therefore possible that the images exerted an automatic and unconscious effect on the participants’ perception that they were being watched. Our results therefore support the hypothesis that reputational concerns may be extremely powerful in motivating cooperative behavior. (2) (But also see Sparks et al. for failed replications)

This study also suggests that, while Fitzgerald may have meant the Dr. Eckleburg sign as a nod toward religion being supplanted by commerce, there is an alternate reading of the scene that focuses on the sign’s more direct impact on George Wilson. In several scenes throughout the novel, Wilson shows his willingness to acquiesce in the face of Tom’s bullying. Nick describes him as “spiritless” and “anemic” (29). It could be that when he says “God sees everything” he’s in fact addressing himself because he is tempted not to pursue justice—to let the crime go unpunished and thus be guilty himself of being a second-order free-rider. He doesn’t, after all, exert any great effort to find and kill Gatsby, and he kills himself immediately thereafter anyway.

Religion in Gatsby does, of course, go beyond some suggestive references to an empty placeholder. Nick ends the story with a reflection on how “Gatsby believed in the green light,” the light across the bay that he knew signaled Daisy’s presence in the mansion where she lived. But for Gatsby it was also “the orgastic future that year by year recedes before us. It eluded us then, but that’s no matter—tomorrow we will run faster, stretch out our arms farther… And one fine morning—” (189). Earlier Nick had explained how Gatsby “talked a lot about the past and I gathered that he wanted to recover something, some idea of himself perhaps, that had gone into loving Daisy.” What that idea was becomes apparent in the scene describing Gatsby and Daisy’s first kiss, which occurred years prior to the events of the plot. “He knew that when he kissed this girl, and forever wed his unutterable visions to her perishable breath, his mind would never romp again like the mind of God… At his lips’ touch she blossomed for him like a flower and the incarnation was complete” (117). In place of some mind in the sky, the design Americans are encouraged to live by is one they have created for themselves. Unfortunately, just as there is no mind behind the eyes of Doctor T.J. Eckleburg, the designs many people come up with for themselves are based on tragically faulty premises.

The replacement of religiously inspired moral principles with selfish economic and hierarchical calculations, which Dr. Eckleburg so perfectly represents, is what ultimately leads to all the disgraceful behavior Nick describes. He writes, “They were careless people, Tom and Daisy—they smashed up things and people and creatures and then retreated back into their money or their vast carelessness or whatever it was that kept them together, and let other people clean up the mess” (188). Game theorist and behavioral economist Robert Frank, whose earlier work greatly influenced William Flesch’s theories of narrative, has recently written about how the same social dynamics Fitzgerald lamented are in place again today. In The Darwin Economy, he describes what he calls an “expenditure cascade”:

The explosive growth of CEO pay in recent decades, for example, has led many executives to build larger and larger mansions. But those mansions have long since passed the point at which greater absolute size yields additional utility… Top earners build bigger mansions simply because they have more money. The middle class shows little evidence of being offended by that. On the contrary, many seem drawn to photo essays and TV programs about the lifestyles of the rich and famous. But the larger mansions of the rich shift the frame of reference that defines acceptable housing for the near-rich, who travel in many of the same social circles… So the near-rich build bigger, too, and that shifts the relevant framework for others just below them, and so on, all the way down the income scale. By 2007, the median new single-family house built in the United States had an area of more than 2,300 square feet, some 50 percent more than its counterpart from 1970. (61-2)

How exactly people are straining themselves to afford these houses would be a fascinating topic for Fitzgerald’s successors. But one thing is already abundantly clear: it’s not the CEOs who are cleaning up the mess.

Also read:

HOW TO GET KIDS TO READ LITERATURE WITHOUT MAKING THEM HATE IT

WHY THE CRITICS ARE GETTING LUHRMANN'S GREAT GATSBY SO WRONG

WHY SHAKESPEARE NAUSEATED DARWIN: A REVIEW OF KEITH OATLEY'S "SUCH STUFF AS DREAMS"


T.J. Eckleburg Sees Everything: The Great God-Gap in Gatsby part 1 of 2

So profound is humans’ concern for their reputations that they can even be nudged toward altruistic behaviors by the mere suggestion of invisible witnesses or the simplest representation of watching eyes. The billboard featuring Dr. Eckleburg’s eyes, however, holds no sway over George’s wife Myrtle, or the man she has an affair with. That this man, Tom Buchanan, has such little concern for his reputation—or that he simply feels entitled to exploit Myrtle—serves as an indictment of the social and economic inequality in the America of Fitzgerald’s day.

When George Wilson, in one of the most disturbing scenes in F. Scott Fitzgerald’s classic The Great Gatsby, tells his neighbor that “God sees everything” while staring disconsolately at the weathered advertisement of some long-ago optometrist named T.J. Eckleburg, his longing for a transcendent authority who will mete out justice on his behalf pulls at the hearts of readers who realize his plea will go unheard. Anthropologists and psychologists studying the human capacity for cooperation and altruism are coming to view religion as an important factor in our evolution. Since the cooperative are always at risk of being exploited by the selfish, mechanisms to enforce altruism had to be in place for any tendency to behave for the benefit of others to evolve. The most basic of these mechanisms is a constant awareness of our own and our neighbors’ reputations. Humans, research has shown, are far more tempted to behave selfishly when they believe it won’t harm their reputations—i.e. when they believe no witnesses are present.

So profound is humans’ concern for their reputations that they can even be nudged toward altruistic behaviors by the mere suggestion of invisible witnesses or the simplest representation of watching eyes. The billboard featuring Dr. Eckleburg’s eyes, however, holds no sway over George’s wife Myrtle, or the man she has an affair with. That this man, Tom Buchanan, has such little concern for his reputation—or that he simply feels entitled to exploit Myrtle—serves as an indictment of the social and economic inequality in the America of Fitzgerald’s day, which carved society into hierarchically arranged echelons and exposed the have-nots to the careless depredations of the haves.

Nick Carraway, the narrator, begins the story by recounting a lesson he learned from his father as part of his Midwestern upbringing. “Whenever you feel like criticizing anyone,” Nick’s father had told him, “just remember that all the people in this world haven’t had the advantages that you’ve had” (5). This piece of wisdom serves at least two purposes: it explains Nick’s self-proclaimed inclination to “reserve all judgments,” highlighting the severity of the wrongdoings which have prompted him to write the story; and it provides an ironic moral lens through which readers view the events of the plot. What is to be made, in light of Nick’s father’s reminder about unevenly parceled out advantages, of the crimes committed by wealthy characters like Tom and Daisy Buchanan?

The focus on morality notwithstanding, religion plays a scant, but surprising, role in The Great Gatsby. It first appears in a conversation between Nick and Catherine, the sister of Myrtle Wilson. Catherine explains to Nick that neither Tom nor Myrtle “can stand the person they’re married to” (37). To the obvious question of why they don’t simply leave their spouses, Catherine responds that it’s Daisy, Tom’s wife, who represents the sole obstacle to the lovers’ happiness. “She’s a Catholic,” Catherine says, “and they don’t believe in divorce” (38). However, Nick explains that “Daisy was not a Catholic,” and he goes on to admit, “I was a little shocked by the elaborateness of the lie.” The conversation takes place at a small gathering hosted by Tom and Myrtle in an apartment rented, it seems, for the sole purpose of giving the two a place to meet. Before Nick leaves the party, he witnesses an argument between the hosts over whether Myrtle has any right to utter Daisy’s name which culminates in Tom striking her and breaking her nose. Obviously, Tom doesn’t despise his wife as much as Myrtle does her husband. And the lie about Daisy’s religious compunctions serves simply to justify Tom’s refusal to leave her and facilitate his continued exploitation of Myrtle.

The only other scene in which a religious belief is asserted explicitly is the one featuring the conversation between George and his neighbor. It comes after Myrtle, whose dalliance had finally aroused her husband’s suspicion, has been struck by a car and killed. George, upon discovering that something had been going on behind his back, locked Myrtle in his garage, and it was when she escaped and ran out into the road to stop the car she thought Tom was driving that she got hit. As the dust from the accident settles—literally, since the garage and the stretch of road are situated in a “valley of ashes” created by the remnants of the coal powering the nearby city being dumped alongside the adjacent rail tracks—George is left alone with a fellow inhabitant of the valley, a man named Michaelis, who asks if he belongs to a church where there might be a priest he can call to come comfort him. “Don’t belong to one,” George answers (165). He does, however, describe a religious belief of sorts to Michaelis. Having explained why he’d begun to suspect Myrtle was having an affair, George goes on to say, “I told her she might fool me but she couldn’t fool God. I took her to the window.” He walks to the window again as he’s telling the story to his neighbor. “I said, ‘God knows what you’ve been doing, everything you’ve been doing. You may fool me but you can’t fool God!’” (167). Michaelis, who is by now fearing for George’s sanity, notices something disturbing as he stands listening to this rant: “Standing behind him Michaelis saw with a shock that he was looking at the eyes of Doctor T.J. Eckleburg which had just emerged pale and enormous from the dissolving night” (167). When George speaks again, repeating, “God sees everything,” Michaelis feels compelled to assure him, “That’s an advertisement” (167).

Though when George first expresses the sentiment, part declaration, part plea, he was clearly thinking of Myrtle’s crime against him, when he repeats it he seems to be thinking of the driver’s crime against Myrtle. God may have seen it, but George takes it upon himself to deliver the punishment.

George Wilson’s turning to God for some moral accounting, despite his general lack of religious devotion, mirrors Nick Carraway’s efforts to settle the question of culpability, despite his own professed reluctance to judge, through the telling of this tragic story. Nick learns from Gatsby that it was in fact Daisy, with whom Gatsby has been carrying on an affair, who was behind the wheel of the car that killed Myrtle. But Gatsby, who was in the passenger seat, assures him it was an accident, not revenge for the affair Myrtle was carrying on with Daisy’s husband. Yet when George finally leaves his garage and turns to Tom to find out who owns the car that killed his wife, assuming it is the same man his wife was cheating on him with, Tom informs him the car belongs to Gatsby, leaving out the crucial fact that Gatsby never met Myrtle. George goes to Gatsby’s mansion, finds him in his pool, shoots and kills him, and then turns the gun on himself. Three people end up dead, Myrtle, George, and Gatsby. Despite their clear complicity, though, Tom and Daisy experience nary a repercussion beyond the natural grief of losing their lovers. Insofar as Nick believes the Buchanans’ perfect getaway is an intolerable injustice, he must realize he holds the power to implicate them, to damage their reputations, by writing and publishing his account of the incidents leading up to the deaths.

Evolutionary critic William Flesch sees our human passion for narrative as a manifestation of our obsession with our own and our fellow humans’ reputations, which evolved at least in part to keep track of each other’s propensities for moral behavior. In Comeuppance: Costly Signaling, Altruistic Punishment, and Other Biological Components of Fiction, Flesch lays out his attempt at solving what he calls “the puzzle of narrative interest,” by which he means the question of why people feel “anxiety on behalf of and about the motives, actions, and experiences of fictional characters” (7). He finds the key to solving this puzzle in a concept called “strong reciprocity,” whereby “the strong reciprocator punishes and rewards others for their behavior toward any member of the social group, and not just or primarily for their individual interactions with the reciprocator” (22). An example of this phenomenon takes place in the novel when the guests at Gatsby’s parties gossip and ardently debate about which of the rumors circling their host are true—particularly of interest is the one saying that “he killed a man” (48). Flesch cites reports from experiments demonstrating that in uneven exchanges, participants with no stake in the outcome are actually willing to incur some cost to themselves in an effort to enforce fairness (31-5). He then goes on to give a compelling account of how this tendency goes a long way toward an explanation of our human fascination with storytelling.

Flesch’s theory of narrative interest begins with models of the evolution of cooperation. For the first groups of human ancestors to evolve cooperative or altruistic traits, they would have had to solve what biologists and game theorists call “the free-rider problem.” Flesch explains:

Darwin himself had proposed a way for altruism to evolve through a mechanism of group selection. Groups with altruists do better as a group than groups without. But it was shown in the 1960s that, in fact, such groups would be too easily infiltrated or invaded by nonaltruists—that is, that group boundaries were too porous—to make group selection strong enough to overcome competition at the level of the individual or the gene. (5)

Strong, or indirect reciprocity, coupled with a selfish concern for one’s own reputation, may have evolved as mechanisms to address this threat of exploitative non-cooperators. For instance, in order for Tom Buchanan to behave selfishly by sleeping with George Wilson’s wife, he had to calculate his chances of being discovered in the act and punished. Interestingly, after “exchanging a frown with Doctor Eckleburg” while speaking to Nick in an early scene in Wilson’s garage, Tom suggests his motives for stealing away with Myrtle are at least somewhat noble. “Terrible place,” he says of the garage and the valley of ashes. “It does her good to get away” (30). Nick, clearly uncomfortable with the position Tom has put him in, where he has to choose whether to object to Tom’s behavior or play the role of second-order free-rider himself, poses the obvious question: “Doesn’t her husband object?” To which Tom replies, “He’s so dumb he doesn’t know he’s alive” (30). Nick, inclined to reserve judgment, keeps Tom and Myrtle’s secret. Later in the novel, though, he keeps the same secret for Daisy and Gatsby.

What makes Flesch’s theory so compelling is that it sheds light on the roles played by everyone from the author, in this case Fitzgerald, to the readers, to the characters, whose nonexistence beyond the pages of the novel is little obstacle to their ability to arouse sympathy or ire. Just as humans are keen to ascertain the relative altruism of their neighbors, so too are they given to broadcasting signals of their own altruism. Flesch explains, “we track not only the original actor whose actions we wish to see reciprocated, whether through reward or more likely punishment; we track as well those who are in a position to track that actor, and we track as well those in a position to track those tracking the actor” (50). What this means is that even if the original “actor” is fictional, readers can signal their own altruism by becoming emotionally engaged in the outcome of the story, specifically by wanting to see altruistic characters rewarded and selfish characters punished.

Nick Carraway is tracking Tom Buchanan’s actions, for instance. Reading the novel, we have little doubt what Nick’s attitude toward Tom is, especially as the story progresses. Though we may favor Nick over Tom, Nick’s failure to sufficiently punish Tom when the degree of his selfishness first becomes apparent tempers any positive feelings we may have toward him. As Flesch points out, “altruism could not sustain an evolutionarily stable system without the contribution of altruistic punishers to punish the free-riders who would flourish in a population of purely benevolent altruists” (66).

On the other hand, through the very act of telling the story, the narrator may be attempting to rectify his earlier moral complacence. According to Flesch’s model of the dynamics of fiction, “The story tells a story of punishment; the story punishes as story; the storyteller represents him- or herself as an altruistic punisher by telling it” (83). However, many readers of Gatsby probably find Nick’s belated punishment insufficient, and if they fail to see the novel as a comment on the real injustice Fitzgerald saw going on around him, they may well be confused and disappointed by the way the story ends.

Read part 2

Read More
Dennis Junk

I am Jack’s Raging Insomnia: The Tragically Overlooked Moral Dilemma at the Heart of Fight Club

There’s a lot of weird theorizing about what the movie Fight Club is really about and why so many men find it appealing. The answer is actually pretty simple: the narrator can’t sleep because his job has him doing something he knows is wrong, but he’s so emasculated by his consumerist obsessions that he won’t risk confronting his boss and losing his job. He needs someone to teach him to man up, so he creates Tyler Durden. Then Tyler gets out of control.

Image by Canva’s Magic Media

[This essay is a brief distillation of ideas explored in much greater depth in Hierarchies in Hell and Leaderless Fight Clubs: Altruism, Narrative Interest, and the Adaptive Appeal of Bad Boys, my master’s thesis]

If you were to ask one of the millions of guys who love the movie Fight Club what the story is about, his answer would most likely emphasize the violence. He might say something like, “It’s about men returning to their primal nature and getting carried away when they find out how good it feels.” Actually, this is an answer I would expect from a guy with exceptional insight. A majority would probably just say it’s about a bunch of guys who get together to beat the crap out of each other and pull a bunch of pranks. Some might remember all the talk about IKEA and other consumerist products. Our insightful guy may even connect the dots and explain that consumerism somehow made the characters in the movie feel emasculated, and so they had to resort to fighting and vandalism to reassert their manhood. But, aside from ensuring they would know what a duvet is—“It’s a fucking blanket”—what is it exactly about shopping for household décor and modern conveniences that makes men less manly?

Maybe Fight Club is just supposed to be fun, with all the violence, and the weird sex scene with Marla, and all the crazy mischief the guys get into, but also with a few interesting monologues and voiceovers to hint at deeper meanings. And of course there’s Tyler Durden—fearless, clever, charismatic, and did you see those shredded abs? Not only does he not take shit from anyone, he gets a whole army to follow his lead, loyal to the death. On the other hand, there’s no shortage of characters like this in movies, and if that’s all men liked about Fight Club they wouldn’t sit through all the plane flights, support groups, and soap-making. It just may be that, despite the rarity of fans who can articulate what they are, the movie actually does have profound and important resonances.

If you recall, the Edward Norton character, whom I’ll call Jack (following the convention of the script), decides that his story should begin with the advent of his insomnia. He goes to the doctor but is told nothing is wrong with him. His first night’s sleep comes only after he goes to a support group and meets Bob, he of the “bitch tits,” and cries a smiley face onto his t-shirt. But along comes Marla, who, like Jack, is visiting support groups but is not in fact recovering, sick, or dying. She is another tourist. As long as she's around, he can’t cry, and so he can’t sleep. Soon after Jack and Marla make a deal to divide the group meetings and avoid each other, Tyler Durden shows up and we’re on our way to Fight Clubs and Project Mayhem. Now, why the hell would we accept these bizarre premises and continue watching the movie unless at some level Jack’s difficulties, as well as their solutions, make sense to us?

So why exactly was it that Jack couldn’t sleep at night? The simple answer, the one that Tyler gives later in the movie, is that he’s unhappy with his life. He hates his job. Something about his “filing cabinet” apartment rankles him. And he’s alone. Jack’s job is to fly all over the country to investigate accidents involving his company’s vehicles and to apply “the formula.” I’m going to quote from Chuck Palahniuk’s book:

You take the population of vehicles in the field (A) and multiply it by the probable rate of failure (B), then multiply the result by the average cost of an out-of-court settlement (C).

A times B times C equals X. This is what it will cost if we don’t initiate a recall.

If X is greater than the cost of a recall, we recall the cars and no one gets hurt.

If X is less than the cost of a recall, then we don’t recall (30).

Palahniuk's inspiration for Jack's job was an actual case involving the Ford Pinto. What this means is that Jack goes around trying to protect his company's bottom line to the detriment of people who drive his company's cars. You can imagine the husband or wife or child or parent of one of these accident victims hearing about this job and asking Jack, "How do you sleep at night?"
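The decision rule quoted above is simple enough to sketch in a few lines of code. (The numbers below are invented purely for illustration; they come from neither the book nor the actual Pinto case.)

```python
def should_recall(vehicles_in_field, failure_rate, avg_settlement, recall_cost):
    """Apply 'the formula': X = A * B * C.

    Recall only when the expected settlement payout X
    exceeds the cost of a recall.
    """
    x = vehicles_in_field * failure_rate * avg_settlement
    return x > recall_cost

# Illustrative numbers only: 500,000 cars in the field,
# a 1-in-10,000 failure rate, $200,000 average settlement,
# and a $15 million recall cost. X works out to $10 million,
# less than the recall cost, so the company doesn't recall.
print(should_recall(500_000, 1 / 10_000, 200_000, 15_000_000))  # prints False
```

The chilling point, of course, is that in this calculation deaths and injuries figure only as settlement dollars.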

Going to support groups makes life seem pointless, short, and horrible. Ultimately, we all have little control over our fates, so there's no good reason to take responsibility for anything. When Jack bursts into tears as Bob pulls his face into his enlarged breasts, he's relinquishing all accountability; he's, in a sense, becoming a child again. Accordingly, he's able to sleep like a baby. When Marla shows up, not only is he forced to confront the fact that he's healthy and perfectly able to behave responsibly, but he is also provided with an incentive to grow up because, as his fatuous grin informs us, he likes her. And, even though the support groups eventually fail to assuage his guilt, they do inspire him with the idea of hitting bottom, losing all control, losing all hope.

Here’s the crucial point: If Jack didn't have to worry about losing his apartment, or losing all his IKEA products, or losing his job, or falling out of favor with his boss, well, then he would be free to confront that same boss and tell him what he really thinks of the operation that has supported and enriched them both. Enter Tyler Durden, who systematically turns all these conditionals into realities. In game theory terms, Jack is both a first-order and a second-order free-rider because he both gains at the expense of others and knowingly allows others to gain in the same way. He carries on like this because he's more motivated by comfort and safety than he is by any assurance that he's doing right by other people.

This is where Jack being of "a generation of men raised by women" becomes important (50). Fathers and mothers tend to treat children differently. A study that functions well symbolically in this context examined the ways moms and dads tend to hold their babies in pools. Moms hold them facing themselves. Dads hold them facing away. Think of the way Bob's embrace of Jack changes between the support group and the fight club. When picked up by moms, babies' breathing and heart rates slow. Just the opposite happens when dads pick them up--they get excited. And if you inventory the types of interactions that go on between the two parents it's easy to see why.

Not only do dads engage children in more rough-and-tumble play; they are also far more likely to encourage children to take risks. In one study, fathers who were told they'd have to observe their child climbing a slope from a distance, making any kind of rescue impossible in the event of a fall, set the slopes at a much steeper angle than mothers in the same setup did.

Fight Club isn't about dominance or triumphalism or white males' reaction to losing control; it's about men learning that they can't really live if they're always playing it safe. Jack actually says at one point that winning or losing doesn't much matter. Indeed, one of the homework assignments Tyler gives everyone is to start a fight and lose. The point is to be willing to risk a fight when it's necessary--i.e. when someone attempts to exploit or seduce you based on the assumption that you'll always act according to your rational self-interest.

And the disturbing truth is that we are all lulled into hypocrisy and moral complacency by the allures of consumerism. We may not be "recall campaign coordinators" like Jack. But do we know or care where our food comes from? Do we know or care how our soap is made? Do we bother to ask why Disney movies are so devoid of the gross mechanics of life? We would do just about anything for comfort and safety. And that is precisely how material goods and material security have emasculated us. It's easy to imagine Jack's mother soothing him to sleep some night, saying, "Now, the best thing to do, dear, is to sit down and talk this out with your boss."

There are two scenes in Fight Club that I can't think of any other word to describe but sublime. The first is when Jack finally confronts his boss, threatening to expose the company's practices if he is not allowed to leave with full salary. At first, his boss reasons that Jack's threat is not credible, because bringing his crimes to light would hurt Jack just as much. But the key element to what game theorists call altruistic punishment is that the punisher is willing to incur risks or costs to mete out justice. Jack, having been well-fathered, as it were, by Tyler, proceeds to engage in costly signaling of his willingness to harm himself by beating himself up, literally. In game theory terms, he's being rationally irrational, making his threat credible by demonstrating he can't be counted on to pursue his own rational self-interest. The money he gets through this maneuver goes, of course, not into anything for Jack, but into Fight Club and Project Mayhem.

The second sublime scene, and for me the best in the movie, is the one in which Jack is himself punished for his complicity in the crimes of his company. How can a guy with stitches in his face and broken teeth, a guy with a chemical burn on his hand, be punished? Fittingly, he lets Tyler get them both in a car accident. At this point, Jack is in control of his life, he's no longer emasculated. And Tyler flees.

One of the confusing things about the movie is that it has two overlapping plots. The first, which I've been exploring up to this point, centers on Jack's struggle to man up and become an altruistic punisher. The second is about the danger of violent reactions to the murder machine of consumerism. The male ethic of justice through violence can all too easily morph into fascism. And so, once Jack has created this father figure and been initiated into manhood by him, he then has to rein him in--specifically, he has to keep him from killing Marla. This second plot entails what anthropologist Christopher Boehm calls a "domination episode," in which an otherwise egalitarian group gets taken over by a despot who must then be defeated. Interestingly, only Jack knows for sure how much authority Tyler has, because Tyler seemingly undermines that authority by giving contradictory orders. But by now Jack is well schooled on how to beat Tyler--pretty much the same way he beat his boss.

It's interesting to think about possible parallels between the way Fight Club ends and what happened a couple years later on 9/11. The violent reaction to the criminal excesses of consumerism and capitalism wasn't, as it actually occurred, homegrown. And it wasn't inspired by any primal notion of manhood but by religious fanaticism. Still, in the minds of the terrorists, the attacks were certainly a punishment, and there's no denying the cost to the punishers.

Also read:
WHAT MAKES "WOLF HALL" SO GREAT?

HOW VIOLENT FICTION WORKS: ROHAN WILSON’S “THE ROVING PARTY” AND JAMES WOOD’S SANGUINARY SUBLIME FROM CONRAD TO MCCARTHY

THE ADAPTIVE APPEAL OF BAD BOYS

Read More
Dennis Junk

Review of "Building Great Sentences," a "Great Courses" Lecture Series by Brooks Landon

Brooks Landon’s course is well worth the effort and cost, though he makes some debatable suggestions about what constitutes a great sentence. To him, greatness has to do with the structure of the language, but truly great sentences—truly great writing—get their power from their role as conveyances of meaning, i.e. the words’ connection to the real world.

You’ve probably received catalogues in the mail advertising “Great Courses.” I’ve been flipping through them for years thinking I should try a couple but have always been turned off by the price. Recently, I saw that they were on sale, and one in particular struck me as potentially worthwhile. “Building Great Sentences: Exploring the Writer’s Craft” is taught by Brooks Landon, who is listed as part of the faculty at the University of Iowa. It turns out, however, he’s not in any way affiliated with the august Creative Writing Workshop, and though he uses several example sentences from literature I’d say his primary audience is people interested in Rhetoric and Composition—and that makes the following criticisms a bit unfair. So let me first say that I enjoyed the lectures and think it well worth the money (about thirty bucks) and time (twenty-four half-hour-long lectures).

            Landon is obviously reading from a teleprompter, and he’s standing behind a lectern in what looks like Mr. Rogers’s living room decked out to look scholarly. But he manages nonetheless to be animated, enthusiastic, and engaging. He gives plenty of examples of the principles he discusses, all of which appear in text form and are easy to follow—though they do at times veer toward the eye-glazingly excessive.

            The star of the show is what Landon calls “cumulative sentences,” those long developments from initial capitalized word through a series of phrases serving as free modifiers, each building on its predecessor, focusing in, panning out, or taking it as a point of departure as the writer moves forward into unexplored territory. After watching several lectures, I went to the novel I’m working on and indeed discovered more than a few instances where I’d seen fit to let my phrases accumulate into a stylistic flourish. The catch is that these instances were distantly placed from one another. Moving from my own work to some stories in the Summer Fiction Issue of The New Yorker, I found the same trend. The vast majority of sentences follow Strunk and White’s dictum to be simple and direct, a point Landon acknowledges. Still, for style and rhetorical impact, the long sentences Landon describes are certainly effective.

            Landon and I part ways, though, when it comes to “acrobatic” sentences which “draw attention to themselves.” Giving William Gass a high seat in his pantheon of literary luminaries, Landon explains that “Gass always sees language as a subject every bit as interesting and important as is the referential world his language points to, invokes, or stands for.” While this poststructuralist sentiment seems hard to object to, it misses the point of what language does and how it works. Sentences can call attention to themselves for performing their functions well, but calling attention to themselves should never be one of their functions.

            Writers like Gass and Pynchon and Wallace fail in their quixotic undertakings precisely because they perform too many acrobatics. While it is true that many readers, particularly those who appreciate literary as opposed to popular fiction—yes, there is a difference—are attuned to the pleasures of language, luxuriating in precise and lyrical writing, there’s something perverse about fixating on sentences to the exclusion of things like character. Great words in great sentences incorporating great images and suggestive comparisons can make the world in which a story takes place come alive—so much so that the life of the story escapes the page and transforms the way readers see the world beyond it. But the prompt for us to keep reading is not the promise of more transformative language; it’s the anticipation of transforming characters. Great sentences in literature owe their greatness to the moments of inspiration, from tiny observation to earth-shattering epiphany, experienced by the people at the heart of the story. Their transformations become our transformations. And literary language may seem to derive whatever greatness it achieves from precision and lyricism, but at a more fundamental level of analysis it must be recognized that writing must be precise and lyrical in its detailing of the thoughts and observations of the characters readers seek to connect with. This takes us to a set of considerations that transcend the workings of any given sentence.

            Landon devotes an entire lecture to the rhythm of prose, acknowledging it must be thought of differently from meter in poetry, but failing to arrive at an adequate, objective definition. I wondered all the while why we speak about rhythm at all when we’re discussing passages that don’t follow one. Maybe the rhythm is variable. Maybe it’s somehow progressive and evolving. Or maybe we should simply find a better word to describe this inscrutable quality of impactful and engaging sentences. I propose grace. Indeed, a singer demonstrates grace by adhering to a precisely measured series of vocal steps. Noting a similar type of grace in writing, we’re tempted to hear it as rhythmical, even though its steps are in no way measured. Grace is that quality of action that leaves audiences with an overwhelming sense of its having been well-planned and deftly executed, well-planned because its deft execution appeared so effortless—but with an element of surprise just salient enough to suggest spontaneity. Grace is a delicate balance between the choreographed and the extemporized.

            Grace in writing is achieved insofar as the sequential parts—words, phrases, clauses, sentences, paragraphs, sections, chapters—meet the demands of their surroundings, following one another seamlessly and coherently, performing the function of conveying meaning, in this case of connecting the narrator’s thoughts and experiences to the reader. A passage will strike us as particularly graceful when it conveys a great deal of meaning in a seemingly short chain of words, a feat frequently accomplished with analogies (a point on which Landon is eloquent), or when it conveys a complex idea or set of impressions in a way that’s easily comprehended. I suspect Landon would agree with my definition of grace. But his focus on lyrical or graceful sentences, as opposed to sympathetic or engaging characters—or any of the other aspects of literary writing—precludes him from lighting on the idea that grace can be strategically laid aside for the sake of more immediate connections with the people and events of the story, connections functioning in real-time as the reader’s eyes take in the page.

            Sentences in literature like to function mimetically, though this observation goes unmentioned in the lectures. Landon cites the beautifully graceful line from Gatsby,

Slenderly, languidly, their hands set lightly on their hips the two young women preceded us out onto a rosy-colored porch open toward the sunset where four candles flickered on the table in the diminished wind (16).

The multiple L’s roll out at a slow pace, mimicking the women and the scene being described. This is indeed a great sentence. But so too is the later sentence in which Nick Carraway recalls being chagrined upon discovering the man he’s been talking to about Gatsby is in fact Gatsby himself.

Nick describes how Gatsby tried to reassure him: “He smiled understandingly—much more than understandingly.” The first notable thing about this sentence is that it stutters. Even though Nick is remembering the scene at a more comfortable future time, he re-experiences his embarrassment, and readers can’t help but sympathize. The second thing to note is that this one sentence, despite serving as a crucial step in the development of Nick’s response to meeting Gatsby and forming an impression of him, is just that, a step. The rest of the remarkable passage comes in the following sentences:

It was one of those rare smiles with a quality of eternal reassurance in it, that you may come across four or five times in life. It faced—or seemed to face—the whole external world for an instant, and then concentrated on you with an irresistible prejudice in your favor. It understood you just so far as you wanted to be understood, believed in you as you would like to believe in yourself and assured you that it had precisely the impression of you that, at your best, you hoped to convey. Precisely at that point it vanished—and I was looking at an elegant young rough-neck, a year or two over thirty, whose elaborate formality of speech just missed being absurd. Some time before he introduced himself I’d got a strong impression that he was picking his words with care (52-3).

            Beginning with a solecism (“reassurance in it, that…”) that suggests Nick’s struggle to settle on the right description, moving on to another stutter (or seemed to face) which indicates his skepticism creeping in beside his appreciation of the regard, the passage then moves into one of those cumulative passages Landon so appreciates. But then there’s the jarring incongruity of the smile’s vanishing. This is, as far as I can remember, the line that sold me on the book when I first read it. You can really feel Nick’s confusion and astonishment. And the effect is brought about by sentences, an irreducible sequence of them, that are markedly ungraceful. (Dashes are wonderful for those break-ins so suggestive of spontaneity and advance in real-time.)

Also read:

POSTSTRUCTURALISM: BANAL WHEN IT'S NOT BUSY BEING ABSURD

WHAT TO LEAVE OUT: MINIMALISM AND THE HEMINGWAY MYSTIQUE

Read More
Dennis Junk

Art as Altruism: Lily Briscoe and the Ghost of Mrs. Ramsay in To the Lighthouse Part 2 of 2

Because evolution took advantage of our concern for our reputations and our ability to reason about the thoughts and feelings of others to ensure cooperation, Lily’s predicament, her argument with the ghost of Mrs. Ramsay over the proper way for a woman to live, could only be resolved through proof that she was not really free-riding or cheating, but was in fact altruistic in her own way.

The question remains, though, of why Virginia Woolf felt it necessary to recall scenes from her childhood in order to lay to rest her inner conflict over her chosen way of life—if that is indeed what To the Lighthouse did for her. She did not, in fact, spend her entire life single but married her husband Leonard in 1912 at the age of thirty and stayed with him until her death in 1941. The Woolfs had been married fifteen years by the time Lighthouse was published (Lee 314). But Virginia’s marriage was quite different from her mother Julia’s. For one, as is made abundantly clear in her diaries, Leonard Woolf was much more supportive and much less demanding than her father Leslie Stephen. More important, though, Julia had seven children of her own and cared for one of Leslie’s from a previous marriage (Lee xx), whereas Virginia remained childless all her life. But, even if she felt her lifestyle represented such a cataclysmic break from her mother’s cultural tradition, it is remarkable that the pain of this partition persisted from the time of Julia’s death when Virginia was thirteen, until the writing of Lighthouse when she was forty-four—the same age as Lily in the last section of the novel. Lily returns to the Ramsays’ summer house ten years after the visit described in the first section, Mrs. Ramsay having died rather mysteriously in the interim, and sets to painting the same image she struggled to capture before. “She had never finished that picture. She would paint that picture now. It had been knocking about in her mind all these years” (147). But why should Lily experience such difficulty handling a conflict of views with a woman who has been dead for years?

Wilson sees the universal propensity among humans to carry on relationships with supernatural beings—like the minds and personalities of the dead, but also including disembodied characters like deities—as one of a host of mechanisms, partly cultural, partly biological, devoted to ensuring group cohesion. In his book Darwin’s Cathedral, in which he attempts to explain religion in terms of his group selection theory, he writes,

A group of people who abandon self-will and work tirelessly for a greater good will fare very well as a group, much better than if they all pursue their private utilities, as long as the greater good corresponds to the welfare of the group. And religions almost invariably do link the greater good to the welfare of the community of believers, whether an organized modern church or an ethnic group for whom religion is thoroughly intermixed with the rest of their culture. Since religion is such an ancient feature of our species, I have no problem whatsoever imagining the capacity for selflessness and longing to be part of something larger than ourselves as part of our genetic and cultural heritage. (175)

One of the main tasks religious beliefs must handle is the same “free-rider problem” William Flesch discovers at the heart of narrative. What religion offers beyond the social monitoring of group members is the presence of invisible beings whose concerns are tied in to the collective concerns of the group. Jesse Bering contributes to this perspective by positing a specific cognitive mechanism which paved the way for the evolution of beliefs about invisible agents, and his theory provides a crucial backdrop for any discussion of the role the dead play for the living, in life or in literature. Of course, Mrs. Ramsay is not a deity, and though Lily feels as she paints “a sense of some one there, of Mrs. Ramsay, relieved for a moment of the weight that the world had put on her” (181), which she earlier describes as, “Ghost, air, nothingness, a thing you could play with easily and safely at any time of day or night, she had been that, and then suddenly she put her hand out and wrung the heart thus” (179), she does not believe Mrs. Ramsay is still around in any literal sense. Bering suggests this “nothingness” with the power to wring the heart derives from the same capacity humans depend on to know, roughly, what other humans are thinking. Though there is much disagreement about whether apes understand differences in each other’s knowledge and intentions, it is undeniably the case that humans far outshine any other creature in their capacity to reason about the inner, invisible workings of the minds of their conspecifics. We are so predisposed to this type of reasoning that, according to Bering, we apply it to natural phenomena in which no minds are involved. He writes,

just like other people’s surface behaviors, natural events can be perceived by us human beings as being about something other than their surface characteristics only because our brains are equipped with the specialized cognitive software, theory of mind, that enables us to think about underlying psychological causes. (79)

As Lily reflects, “this making up scenes about them, is what we call ‘knowing’ people” (173). And we must make up these scenes because, like the bees hovering about the hive she compares herself to in the first section, we have no direct access to the minds of others. Yet if we are to coordinate our actions adaptively—even competitively when other groups are involved—we have no choice but to rely on working assumptions, our theories of others’ knowledge and intentions, updating them when necessary.

The reading of natural events as signs of some mysterious mind, as well as the continued importance of minds no longer attached to bodies capable of emitting signs, might have arisen as a mere byproduct of humans’ need to understand one another, but at some point in the course of our evolution our theories of disembodied minds were co-opted in the service of helping to solve the free-rider problem. In his book The God Instinct, Bering describes a series of experiments known as “The Princess Alice studies,” which have young children perform various tasks after being primed to believe an invisible agent (named Alice in honor of Bering’s mother) is in the room with them. What he and his colleagues found was that Princess Alice’s influence only emerged as the children’s theory of mind developed, suggesting “the ability to be superstitious actually demands some mental sophistication” (96). But once a theory of mind is operating, the suggestion of an invisible presence has a curious effect. First in a study of college students casually told about the ghost of a graduate student before taking a math test, and then in a study of children told Princess Alice was watching them as they performed a difficult task involving Velcro darts, participants primed to consider the mind of a supernatural agent were much less likely to take opportunities to cheat which were built into the experimental designs (193-4).

Because evolution took advantage of our concern for our reputations and our ability to reason about the thoughts and feelings of others to ensure cooperation, Lily’s predicament, her argument with the ghost of Mrs. Ramsay over the proper way for a woman to live, could only be resolved through proof that she was not really free-riding or cheating, but was in fact altruistic in her own way. Considering the fate of a couple Mrs. Ramsay had encouraged to marry, Lily imagines, “She would feel a little triumphant, telling Mrs. Ramsay that the marriage had not been a success.” But, she would go on, “They’re happy like that; I’m happy like this. Life has changed completely.” Thus Lily manages to “over-ride her wishes, improve away her limited, old-fashioned ideas” (174-5). Lily’s ultimate redemption, though, can only come through acknowledgement that the life she has chosen is not actually selfish. The difficulty in this task stems from the fact that “one could not imagine Mrs. Ramsay standing painting, lying reading, a whole morning on the lawn” (196). Mrs. Ramsay has no appreciation for art or literature, but for Lily it is art—and for Woolf it is literature—that is both the product of all that time alone and her contribution to society as a whole. Lily is redeemed when she finishes her painting, and that is where the novel ends. At the same time, Virginia Woolf, having completed this great work of literature, bequeathed it to society, to us, and in so doing proved her own altruism, thus laying to rest the ghost of Julia Stephen.

Also read:
THEY COMES A DAY: CELEBRATING COOPERATION IN A GATHERING OF OLD MEN AND HORTON HEARS A WHO!

T.J. ECKLEBURG SEES EVERYTHING: THE GREAT GOD-GAP IN GATSBY

MADNESS AND BLISS: CRITICAL VERSUS PRIMITIVE READINGS IN A.S. BYATT’S POSSESSION: A ROMANCE

Dennis Junk

Art as Altruism: Lily Briscoe and the Ghost of Mrs. Ramsay in To the Lighthouse Part 1 of 2

Woolf’s struggle with her mother, and its manifestation as Lily’s struggle with Mrs. Ramsay, represents a sort of trial in which the younger living woman defends herself against a charge of selfishness leveled by her deceased elder. And since Woolf’s obsession with her mother ceased upon completion of the novel, she must have been satisfied that she had successfully exonerated herself.

Virginia Woolf underwent a transformation in the process of writing To the Lighthouse the nature of which has been the subject of much scholarly inquiry. At the center of the novel is the relationship between the beautiful, self-sacrificing, and yet officious Mrs. Ramsay, and the retiring, introverted artist Lily Briscoe. “I wrote the book very quickly,” Woolf recalls in “Sketch of the Past,” “and when it was written, I ceased to be obsessed by my mother. I no longer hear her voice; I do not see her.” Quoting these lines, biographer Hermione Lee suggests the novel is all about Woolf’s parents, “a way of pacifying their ghosts” (476). But how exactly did writing the novel function to end Woolf’s obsession with her mother? And, for that matter, why would she, at forty-four, still be obsessed with a woman who had died when she was only thirteen? Evolutionary psychologist Jesse Bering suggests that while humans are uniquely capable of imagining the inner workings of each other’s minds, the cognitive mechanisms underlying this capacity, which psychologists call “theory of mind,” simply fail to comprehend the utter extinction of those other minds. However, the lingering presence of the dead is not merely a byproduct of humans’ need to understand and communicate with other living humans. Bering argues that the watchful gaze of disembodied minds—real or imagined—serves a type of police function, ensuring that otherwise selfish and sneaky individuals cooperate and play by the rules of society. From this perspective, Woolf’s struggle with her mother, and its manifestation as Lily’s struggle with Mrs. Ramsay, represents a sort of trial in which the younger living woman defends herself against a charge of selfishness leveled by her deceased elder. And since Woolf’s obsession with her mother ceased upon completion of the novel, she must have been satisfied that she had successfully exonerated herself.

Woolf made no secret of the fact that Mr. and Mrs. Ramsay were fictionalized versions of her own parents, and most critics see Lily as a stand-in for the author—even though she is merely a friend of the Ramsay family. These complex relationships between author and character, and between daughter and parents, lie at the heart of a dynamic which readily lends itself to psychoanalytic explorations. Jane Lilienfeld, for instance, suggests Woolf created Lily as a proxy to help her accept her parents, both long dead by the time she began writing, “as monumental but flawed human beings,” whom she both adored and detested. Having reduced the grand, archetypal Mrs. Ramsay to her proper human dimensions, Lily is free to acknowledge her own “validity as a single woman, as an artist whose power comes not from manipulating others’ lives in order to fulfill herself, but one whose mature vision encapsulates and transcends reality” (372). But for all the elaborate dealings with mythical and mysterious psychic forces, the theories of Freud and Jung explain very little about why writers write and why readers read. And they explain very little about how people relate to the dead, or about what role the dead play in narrative. Freud may have been right about humans’ intense ambivalence toward their parents, but why should this tension persist long after those parents have ceased to exist? And Jung may have been correct in his detection of mythic resonances in his patients’ dreams, but what accounts for such universal narrative patterns? What do they explain?

Looking at narrative from the perspective of modern evolutionary biology offers several important insights into why people devote so much time and energy to, and get so much gratification from, immersing themselves in the plights and dealings of fictional characters. Anthropologists believe the primary concern for our species at the time of its origin was the threat of rival tribes vying for control of limited resources. The legacy of this threat is the persistent proclivity for tribal—us versus them—thinking among modern humans. But alongside our penchant for dehumanizing members of out-groups arose a set of mechanisms designed to encourage—and when necessary to enforce—in-group cooperation for the sake of out-competing less cohesive tribes. Evolutionary literary theorist William Flesch sees in narrative a play of these cooperation-enhancing mechanisms. He writes, “our capacity for narrative developed as a way for us to keep track of cooperators” (67), and he goes on to suggest we tend to align ourselves with those we perceive as especially cooperative or altruistic while feeling an intense desire to see those who demonstrate selfishness get their comeuppance. This is because “altruism could not sustain an evolutionarily stable system without the contribution of altruistic punishers to punish the free-riders who would flourish in a population of purely benevolent altruists” (66). Flesch cites the findings of numerous experiments which demonstrate people’s willingness to punish those they see as exploiting unspoken social compacts and implicit rules of fair dealing, even when meting out that punishment involves costs or risks to the punisher (31-34). Child psychologist Karen Wynn has found that even infants too young to speak prefer to play with puppets or blocks with crude plastic eyes that have in some way demonstrated their altruism over the ones they have seen behaving selfishly or aggressively (557-560).
Such experiments lead Flesch to posit a social monitoring and volunteered affect theory of narrative interest, whereby humans track the behavior of others, even fictional others, in order to assess their propensity for altruism or selfishness and are anxious to see that the altruistic are vindicated while the selfish are punished. In responding thus to other people’s behavior, whether they are fictional or real, the individual signals his or her own propensity for second- or third-order altruism.

The plot of To the Lighthouse is unlike anything else in literature, and yet a great deal of information is provided regarding the relative cooperativeness of each of the characters. Foremost among them in her compassion for others is Mrs. Ramsay. While it is true from the perspective of her own genetic interests that her heroic devotion to her husband and their eight children can be considered selfish, she nonetheless extends her care beyond the sphere of her family. She even concerns herself with the tribulations of complete strangers, something readers discover early in the novel, as

she ruminated the other problem, of rich and poor, and the things she saw with her own eyes… when she visited this widow, or that struggling wife in person with a bag on her arm, and a note- book and pencil with which she wrote down in columns carefully ruled for the purpose wages and spendings, employment and unemployment, in the hope that thus she would cease to be a private woman whose charity was half a sop to her own indignation, half relief to her own curiosity, and become what with her untrained mind she greatly admired, an investigator, elucidating the social problem. (9)

No sooner does she finish reflecting on this social problem than she catches sight of her husband’s friend Charles Tansley, who is feeling bored and “out of things,” because no one staying at the Ramsays’ summer house likes him. Regardless of the topic Tansley discusses with them, “until he had turned the whole thing around and made it somehow reflect himself and disparage them—he was not satisfied” (8). And yet Mrs. Ramsay feels compelled to invite him along on an errand so that he does not have to be alone. Before leaving the premises, though, she has to ask yet another houseguest, Augustus Carmichael, “if he wanted anything” (10). She shows this type of exquisite sensitivity to others’ feelings and states of mind throughout the first section of the novel.

Mrs. Ramsay’s feelings about Lily, another houseguest, are at once dismissive and solicitous. Readers are introduced to Lily only through Mrs. Ramsay’s sudden realization, after prolonged absentmindedness, that she is supposed to be holding still so Lily can paint her. Mrs. Ramsay’s son James, who is sitting with her as he cuts pictures out of a catalogue, makes a strange noise she worries might embarrass him. She turns to see if anyone has heard: “Only Lily Briscoe, she was glad to find; and that did not matter.” Mrs. Ramsay is doing Lily the favor of posing, but the gesture goes no further than mere politeness. Still, there is a quality the younger woman possesses that she admires. “With her little Chinese eyes,” Mrs. Ramsay thinks, “and her puckered-up face, she would never marry; one could not take her painting very seriously; she was an independent little creature, and Mrs. Ramsay liked her for it” (17). Lily’s feelings toward her hostess, on the other hand, though based on a similar recognition that the other enjoys aspects of life utterly foreign to her, are much more intense. At one point early in the novel, Lily wonders, “what could one say to her?” The answer she hazards is “I’m in love with you?” But she decides that is not true and settles on, “‘I’m in love with this all,’ waving her hand at the hedge, at the house, at the children” (19). What Lily loves, and what she tries to capture in her painting, is the essence of the family life Mrs. Ramsay represents, the life Lily herself has rejected in pursuit of her art. It must be noted too that, though Mrs. Ramsay is not related to Lily, Lily has only an elderly father, and so some of the appeal of the large, intact Ramsay family to Lily is the fact that she has been for some time without a mother.

Apart from admiring in the other what each lacks herself, the two women share little in common. The tension between them derives from Lily’s having resigned herself to life without a husband, life in the service of her art and caring for her father, while Mrs. Ramsay simply cannot imagine how any woman could be content without a family. Underlying this conviction is Mrs. Ramsay’s unique view of men and her relationship to them:

Indeed, she had the whole of the other sex under her protection; for reasons she could not explain, for their chivalry and valour, for the fact that they negotiated treaties, ruled India, controlled finance; finally for an attitude towards herself which no woman could fail to feel or to find agreeable, something trustful, childlike, reverential; which an old woman could take from a young man without loss of dignity, and woe betide the girl—pray Heaven it was none of her daughters!—who did not feel the worth of it, and all that it implied, to the marrow of her bones! (6)

In other words, woe betide Lily Briscoe. Anthropologists Peter Richerson and Robert Boyd, whose work on the evolution of cooperation in humans provides the foundation for Flesch’s theory of narrative, put forth the idea that culture functions both to maintain group cohesion and to help the group adapt to whatever environment it inhabits. “Human cultures,” they point out, “can change even more quickly than the most rapid examples of genetic evolution by natural selection” (43). What underlies the divergence of views about women’s roles between the two women in Woolf’s novel is that their culture is undergoing major transformations owing to political and economic upheaval in the lead-up to the First World War.

Lily has no long-established tradition of women artists in which to find solace and guidance; rather, the most salient model of womanhood is the family-minded, self-sacrificing Mrs. Ramsay. It is therefore to Mrs. Ramsay that Lily must justify her attempt at establishing a new tradition. She reads the older woman as making the implicit claim that “an unmarried woman has missed the best of life.” In response, Lily imagines how

gathering a desperate courage she would urge her own exemption from the universal law; plead for it; she liked to be alone; she liked to be herself; she was not made for that; and so have to meet a serious stare from eyes of unparalleled depth, and confront Mrs. Ramsay’s simple certainty… that her dear Lily, her little Brisk, was a fool. (50)

Living alone, being herself, and refusing to give up her time or her being to any husband or children strikes even Lily herself as both selfish and illegitimate, lacking cultural sanction and therefore doubly selfish. Trying to figure out the basis of her attraction to Mrs. Ramsay, beyond her obvious beauty, Lily asks herself, “did she lock up within her some secret which certainly Lily Briscoe believed people must have for the world to go on at all? Every one could not be as helter skelter, hand to mouth as she was” (50). Lily’s dilemma is that she can either be herself, or she can be a member of a family, because being a member of a family means she cannot be wholly herself; like Mrs. Ramsay, she would have to make compromises, and her art would cease to have any more significance than the older woman’s note-book with all its writing devoted to social problems. But she must justify devoting her life only to herself. Meanwhile, she’s desperate for some form of human connection beyond the casual greetings and formal exchanges that take place under the Ramsays’ roof.

Lily expresses a desire not just for knowledge from Mrs. Ramsay but for actual unity with her because what she needs is “nothing that could be written in any language known to men.” She wants to be intimate with the “knowledge and wisdom… stored up in Mrs. Ramsay’s heart,” not any factual information that could be channeled through print. The metaphor Lily uses for her struggle is particularly striking for anyone who studies human evolution.

How then, she had asked herself, did one know one thing or another thing about people, sealed as they were? Only like a bee, drawn by some sweetness or sharpness in the air intangible to touch or taste, one haunted the dome-shaped hive, ranged the wastes of the air over the countries of the world alone, and then haunted the hives with their murmurs and their stirrings; the hives, which were people. (51)

According to evolutionary biologist David Sloan Wilson, bees are one of only about fifteen species of social insect that have crossed the “Cooperation Divide,” beyond which natural selection at the level of the group supersedes selection at the level of the individual. “Social insect colonies qualify as organisms,” Wilson writes, “not because they are physically bounded but because their members coordinate their activities in organ-like fashion to perpetuate the whole” (144). The main element that separates humans from their ancestors and other primates, he argues, “is that we are evolution’s newest transition from groups of organisms to groups as organisms. Our social groups are the primate equivalent of bodies and beehives” (154). The secret locked away from Lily in Mrs. Ramsay’s heart, the essence of the Ramsay family that she loves so intensely and feels compelled to capture in her painting, is that human individuals are adapted to life in groups of other humans who together represent a type of unitary body. In trying to live by herself and for herself, Lily is going not only against the cultural traditions of the previous generation but even against her own nature.

Part 2.

Dennis Junk

How to Read Stories--You're probably doing it wrong

Your efforts to place each part into the context of the whole will, over time, as you read more stories, give you a finer appreciation for the strategies writers use to construct their work, one scene or one section at a time. And as you try to anticipate the parts to come from the parts you’ve read you will be training your mind to notice patterns, laying down templates for how to accomplish the types of effects—surprise, emotional resonance, lyricism, profundity—the author has accomplished.

There are whole books out there about how to read like a professor or a writer, or how to speed-read and still remember every word. For the most part, you can discard all of them. Studies have shown speed readers are frauds—the faster they read the less they comprehend and remember. The professors suggest applying the wacky theories they use to write their scholarly articles, theories which serve to cast readers out of the story into some abstract realm of symbols, psychological forces, or politics. I find the endeavor offensive.

Writers writing about how to read like a writer are operating on good faith. They just tend to be a bit deluded. Literature is very much like a magic trick, but of course it’s not real magic. They like to encourage people to stand in awe of great works and great passages—something I frankly don’t need any encouragement to do (what is it about the end of “Mr. Sammler’s Planet”?). But to get to those mystical passages you have to read a lot of workaday prose, even in the work of the most lyrical and crafty writers. Awe simply can’t be used as a reading strategy.

Good fiction is like a magic trick because it’s constructed of small parts that our minds can’t help responding to holistically. We read a few lines and all of a sudden we have a person in mind; after a few pages we find ourselves caring about what happens to this person. Writers often avoid talking about the trick and the methods and strategies that go into it because they’re afraid once the mystery is gone the trick will cease to convince. But even good magicians will tell you that well-performed routines frequently astonish even the one performing them. Focusing on the parts does not diminish appreciation for the whole.

The way to read a piece of fiction is to use the information you’ve already read to anticipate what will happen next. Most contemporary stories are divided into several sections, which offer readers the opportunity to pause after each one, reflecting on how it may fit into the whole of the work. The author had a purpose in including each section: furthering the plot, revealing the character’s personality, developing a theme, or playing with perspective. Practice posing these questions to yourself at the end of each section: what has the author just done, and what does it suggest she’ll likely do in the sections to come?

In the early sections, questions will probably be general: What type of story is this? What type of characters are these? But by the time you reach about the two-thirds point they will be much more specific: What’s the author going to do with this character? How is this tension going to be resolved? Efforts to classify and anticipate the elements of the story will, if nothing else, lead to greater engagement with it. Every new character should be memorized—even if doing so requires a mnemonic (practice coming up with one on the fly).

The larger goal, though, is a better understanding of how the type of fiction you read works. Your efforts to place each part into the context of the whole will, over time, as you read more stories, give you a finer appreciation for the strategies writers use to construct their work, one scene or one section at a time. And as you try to anticipate the parts to come from the parts you’ve read you will be training your mind to notice patterns, laying down templates for how to accomplish the types of effects—surprise, emotional resonance, lyricism, profundity—the author has accomplished.

By trying to get ahead of the author, as it were, you won’t be learning to simply reproduce the same effects. By internalizing the strategies, making them automatic, you’ll be freeing up your conscious mind for new flights of creative re-working. You’ll be using the more skilled author’s work to bootstrap your own skill level. But once you’ve accomplished this there’ll be nothing stopping you from taking your own writing to the next level. Anticipation makes reading a challenge in real time—like a video game. And games can be conquered.

Finally, if a story moves you strongly, re-read it immediately. And then put it in a stack for future re-reading.

Also read:

PUTTING DOWN THE PEN: HOW SCHOOL TEACHES US THE WORST POSSIBLE WAY TO READ LITERATURE

Dennis Junk

Productivity as Practice: An Expert Performance Approach to Creative Writing Pedagogy Part 3

Psychologist K. Anders Ericsson’s central finding in his research on expert achievement is that what separates those who attain a merely sufficient level of proficiency in a performance domain from those who reach higher levels of excellence is the amount of time devoted over the course of training to deliberate practice. But in a domain like creative writing, where the criteria for success can only be abstractly defined, it is difficult to say what would constitute deliberate practice.

Start reading at part one.

            But the question of what standards of success the instructor is to apply to students’ work, as well as the ones instructors will encourage the students to apply to each other’s work, has yet to be addressed. The skills students develop through practice evaluating their own work will both be based on their evaluations of the works of others and be applied to them. The first step toward becoming a creative writer is recognizing how much one likes the writing of another. The work the student is initially exposed to will almost certainly have gone through a complex series of assessments, beginning with the author’s assessment of his own work, moving on to commenters and editors working on behalf of the author, then to editors working on behalf of publishers, and finally to the publishers themselves. Even upon publication, any given work is unlikely to be read by a majority of readers who appreciate the type of writing it represents until a critical threshold is reached, beyond which its likelihood of becoming recommended reading increases. At some point in the process it may even reach the attention of critics and reviewers, who will themselves evaluate the work either positively or negatively. (This leaves out the roles of branding and author reputation because they probably aren’t practicable skills.) Since feedback cannot be grounded in any absolute or easily measurable criteria, Ericsson advocates a “socially based definition of creativity” (330). And, since students develop their evaluative skills through internalizing and anticipating the evaluations of others, the choice of which workshop to attend is paramount. The student should seek out those most versed in and most appreciative of the type of writing he aspires to master.

            Simply reading theoretical essays on poetry or storytelling, as Vikil has his students do, is probably far less effective than sampling a theorist’s or critic’s work and then trying to anticipate that evaluator’s response to a work he or she has written about. Some critics’ work lends itself to this type of exercise more readily than others; those who focus on literary as opposed to political elements, and those who put more effort into using sound methods to ensure the validity of their psychological or sociological theories—if they must theorize—will be much more helpful than those who see each new work as an opportunity to reiterate their favorite ideas in a fresh context. It may be advisable, in other words, to concentrate on reviewers rather than critics and theorists. After having learned to anticipate the responses of a few reviewers whose work is influential, the student will be better equipped to evaluate his or her own work in terms of how it will be received in the social context that will be the final arbiter of success or failure.

            Anticipation, as it allows for feedback, could form the basis for several types of practice exercises. Ericsson cites his own and others’ research demonstrating that chess players improve not as a function of how much time they spend playing chess but through studying past games between chess masters. “By trying to select the best move for each position of the game,” Ericsson writes, “and comparing their selected move to the actual move of the game, the players identify discrepancies where they must study the chess position more deeply to uncover the reason for the master’s move” (37). In a similar way, pausing in the process of reading to anticipate a successful author’s next move in a story or novel should offer an opportunity for creative writing students to compare their ideas with the author’s. Of course, areas of divergence between the reader’s ideas for a next move and the one the author actually made need not be interpreted as mistakes on the part of the reader—the reader’s idea may even be better. However, in anticipating what will happen next in a story, the student is generating ideas and therefore getting practice in the area of productivity. And, whether or not the author’s ideas are better, the student will develop greater familiarity with her methods through such active engagement with them. Finally, the students will be getting practice evaluating ideas as they compare their own to those of the author.

            A possible objection to implementing this anticipatory reading method in a creative writing curriculum is that a student learning to anticipate an author’s moves would simply be learning to make moves like the ones that author makes—which amounts to reproduction, not creativity. Indeed, one of the theories Ericsson has explored to explain how expertise develops posits a sort of rote memorization of strategies and their proper application to a limited set of situations. “For a long time it was believed that experts acquired a large repertoire of patterns,” he explains, “and their superior performance could be attributed to simple pattern matching and recall of previously stored actions from memory in an effortless and automatic manner” (331). If expertise relies on memory and pattern recognition, though, then experts would fare no better in novel situations than non-experts. Ericsson has found just the opposite to be the case.

Superior expert performers in domains such as music, chess, and medicine can generate better actions than their less skilled peers even in situations they have never directly experienced. Expert performers have acquired refined mental representations that maintain access to relevant information about the situation and support more extensive, flexible reasoning to determine the appropriate actions demanded by the encountered situation. (331)

What the creative writer would be developing through techniques for practice such as anticipation-based reading likely goes beyond a simple accumulation of fixed strategies—a bigger bag of tricks appropriated from other authors. They would instead be developing a complex working model of storytelling as well as a greater capacity for representing and manipulating the various aspects of their own stories in working memory.

            Skepticism about whether literary writing of any sort can be taught—or learned in any mundane or systematic way—derives from a real and important insight: authors are judged not by how well they reproduce the formulas of poetry and storytelling but by how successful they are in reformulating the conventional techniques of the previous generation of writers. No one taught Cervantes his epic-absurd form of parody. No one taught Shakespeare how to explore the inner workings of his characters’ minds through monologues. No one taught Virginia Woolf how to shun external trappings and delve so exquisitely into the consciousness of her characters. Yet observations of where authors came to reside in relation to prevailing literary cultures don’t always offer clues to the mode of transportation. Woolf, for instance, wrote a great deal in her reviews for the Times Literary Supplement about the fashion for representing characters through references to their physical appearances and lists of their possessions. She didn’t develop her own approach oblivious of what she called “materialism”; fully understanding the method, she found it insufficient for what she hoped to accomplish with her own work. And she’d spent a lot of time in her youth reading Shakespeare, with those long, eminently revealing monologues (Wood 110). Supposing creative genius is born of mastery of conventions and techniques and not ignorance of or antipathy toward them, the emphasis on the works of established authors in creative writing pedagogy ceases to savor of hidebound conservatism.

            The general pedagogical outline focusing on practice devoted to productivity, as well as the general approach to reading based on anticipation, can be refined to accommodate any student’s proclivities or concerns. A student who wants to develop skill in describing characters’ physical appearances in a way that captures something of the essence of their personalities may begin by studying the work of authors from Charles Dickens to Saul Bellow. Though it’s difficult to imagine how such descriptions might be anticipated, the characters’ later development over the course of the plot does offer opportunities to test predictions. Coming away from studies of past works, the student need not be limited to exercises on blank screens or sheets of paper; practice might entail generating multiple ideas for describing some interesting individual she knows in real life, or describing multiple individuals she knows. She may set herself the task of coming up with a good description for everyone interviewed during the course of a television news program. She can practice describing random people who pass on a campus sidewalk, imagining details of their lives and personalities, or characters in shows and movies. By the time the aspiring author is sitting down to write about her own character in a story or novel, she will all but automatically produce a number of possible strategies for making that character come alive through words, increasing the likelihood that she’ll light on one that resonates strongly, first with her own memories and emotions and then with those of her readers. And, if Simonton’s theory has any validity, the works produced according to this strategy need not resemble each other any more than one species resembles another.

            All of the conventional elements of craft—character, plot, theme, dialogue, point of view, and even higher-order dimensions like voice—readily lend themselves to this qualitative approach to practice. A creative writing instructor may coach a student who wants to be better able to devise compelling plots to read stories recognized as excelling in that dimension, encouraging her to pause along the way to write a list of possible complications, twists, and resolutions to compare with the ones she’ll eventually discover in the actual text. If the student fails to anticipate the author’s moves, she can then compare her ideas with the author’s, giving her a deeper sense of why one works better than the others. She may even practice anticipating the plots of television shows and movies, or trying to conceive of how stories in the news might be rendered as fictional plots. To practice describing settings, students could be encouraged to come up with multiple metaphors and similes based on one set and then another of the physical features they observe in real places. How many ways, a student may be prompted, can you have characters exchange the same basic information in a dialogue? Which ones reveal more of the characters’ personalities? Which ones most effectively reprise and develop the themes you’re working with? Any single idea generated in these practice sessions is unlikely to represent a significant breakthrough. But the more ideas one generates, the more likely it is that one of them will strike her as capable of garnering wider recognition for its superior quality. The productivity approach can also be applied to revision and would consist of the writer identifying weak passages or scenes in an early draft and generating several new versions of each one so that a single, best version can be chosen for later drafts.

            What I’ve attempted here is a sketch of one possible approach to teaching. With so many worried about the future of literature, fearing that the growing influence of workshops will lead to insularity and standardization, it seems too few teachers are coming forward with ideas on how to help their students improve, as if whatever methods they admit to using would inevitably lend credence to the image of workshops as assembly lines for the production of mediocre and tragically uninspired poems and short stories. But, if creative writing is in danger of being standardized into obsolescence, the publishing industry is the more likely culprit, as every starry-eyed would-be author knows full well that publication is the one irreducible factor underlying professional legitimacy. And research has pretty thoroughly ruled out the notion that familiarity with the techniques of the masters in any domain inevitably precludes original, truly creative thinking. The general outline for practice based on productivity and evaluation can be personalized and refined in countless ways, and students can be counted on to bring an endless variety of experiences and perspectives to workshops, variety that would be difficult, to say the least, to completely eradicate in the span of the two or three years allotted to MFA programs.

            The productivity and evaluation model for creative writing pedagogy also holds a great deal of potential for further development. For instance, a survey of successful poets and fiction writers asking them how they practice—after providing them a précis of Ericsson’s and Simonton’s findings on what constitutes practice—may lead to the development of an enormously useful and surprising repertoire of training techniques. How many authors engage in activities they think of as simple games or distractions but that in fact contribute to their ability to write engaging and moving stories or poems? Ultimately, though, the discovery of increasingly effective methods will rely on rigorously designed research comparing approaches to each other. The popular rankings for MFA programs based on the professional success of students who graduate from them are a step in the direction of this type of research, but they have the rather serious flaw of sampling bias owing to higher-ranking schools having the advantage of larger applicant pools. At this stage, though, even the subjective ratings of individuals experimenting with several practice techniques would be a useful guide for adjusting and refining teaching methods.

            Applying the expert performance framework developed by Ericsson, Simonton, Csikszentmihalyi, and their colleagues to creative writing pedagogy would probably not drastically revolutionize teaching and writing practices. It would rather represent a shift in focus from the evaluation-heavy workshop model onto methods for generating ideas. And of course activities like brainstorming and free writing are as old as the hills. What may be new is the conception of these invention strategies as a form of practice to be engaged in for the purpose of developing skills, and the idea that this practice can and should be engaged in independent of any given writing project. Even if a writing student isn’t working on a story or novel, even if he doesn’t have an idea for one yet, he should still be practicing to be a better storyteller or novelist. It’s probably the case, too, that many or most professional writers already habitually engage in activities fitting the parameters of practice laid out by the expert performance model. Such activities probably already play at least some role in classrooms. Since the basic framework can be tailored to any individual’s interests, passions, weaknesses, and strengths, and since it stresses the importance of the quantity of new ideas, it’s not inconceivable that more of this type of practice will lead to greater rather than less originality.

Also read:

WHAT IS A STORY? AND WHAT ARE YOU SUPPOSED TO DO WITH ONE?

HOW TO GET KIDS TO READ LITERATURE WITHOUT MAKING THEM HATE IT

PUTTING DOWN THE PEN: HOW SCHOOL TEACHES US THE WORST POSSIBLE WAY TO READ LITERATURE


Productivity as Practice: An Expert Performance Approach to Creative Writing Pedagogy Part 2

Psychologist K. Anders Ericsson’s central finding in his research on expert achievement is that what separates those who attain a merely sufficient level of proficiency in a performance domain from those who reach higher levels of excellence is the amount of time devoted over the course of training to deliberate practice. But, in a domain with criteria for success that can only be abstractly defined, like creative writing, what would constitute deliberate practice is difficult to pin down.

            Of course, productivity alone cannot account for impactful ideas and works; at some point the most promising ideas must be culled from among the multitude. Since foresight seems to play little if any role in the process, Simonton, following D.T. Campbell, describes it as one of “blind variation and selective retention” (310). Simonton thus theorizes that creativity is Darwinian. Innovative and valuable ideas are often born of non-linear or “divergent” thinking, which means their future use may not be at all apparent when they are originally conceived. So, Csikszentmihalyi follows his advice to produce multiple ideas with the suggestion, “Try to produce unlikely ideas” (369). Ignoring future utility, then, seems to be important for the creative process, at least until the stage is reached when one should “Shift from openness to closure.” Csikszentmihalyi explains:

Good scientists, like good artists, must let their minds roam playfully or they will not discover new facts, new patterns, new relationships. At the same time, they must also be able to evaluate critically every novelty they encounter, forget immediately the spurious ones, and then concentrate their minds on developing and realizing the few that are promising. (361)

So, two sets of skills appear to lie at the heart of creative endeavors, and they suggest themselves as focal areas for those hoping to build on their talents. In the domain of creative writing, it would seem the most important things to practice are producing multiple and unlikely ideas, and evaluating those ideas to see which are the most viable.

            The workshop method prevalent in graduate writing programs probably involves at least some degree of practice in both of these skills. Novelist and teacher Ardashir Vakil, in his thoughtful and candid essay, “Teaching Creative Writing,” outlines what happens in his classrooms.

In the first part of the workshop students are given writing exercises. These vary from the most basic—write about a painful experience from your childhood—to more elaborate games in which you get pairs to tell stories to each other and then write the other person’s story with some bits invented. Then we look at texts by established writers and try to analyse what makes them work—what has the writer done?—in terms of character, language, voice and structure to capture our attention and how have they brought forth a visceral emotional response from the reader. (158)

            These exercises amount to efforts to generate ideas by taking inspiration from life, the writer’s or someone else’s, or from the work of successful writers. The analysis of noteworthy texts also shades into the practice of evaluating ideas and methods. Already, though, it seems the focus leans more toward evaluation, the ability to recognize useful ideas, than productivity. And this emphasis becomes even more pronounced as the course progresses.

Along the way, we read essays, interviews and extracts by established writers, reflecting on their practice. Sometimes I use an essay by Freud, Bakhtin or Benjamin on theories of storytelling. Finally, there is a group workshop in which we read and discuss each others’ writing. Each week, someone offers up a story or a few poems or an extract to the group, who go away and read, annotate and comment on it. (158)

Though the writers’ reflections on their practices may describe methods for generating ideas, those methods don’t seem to comprise an integral part of the class. Vakil reports that “with minor variations” this approach is common to creative writing programs all over England and America.

            Before dismissing any of these practices, though, it is important to note that creative writing must rely on skills beyond the production and assessment of random ideas. One could have a wonderful idea for a story but not have the language or storytelling skills necessary to convey it clearly and movingly. Or one may have a great idea for how to string words together into a beautiful sentence but lack any sense of how to fit it into a larger plot or poem. In a critique of the blind-variation and selective-retention model, Ericsson points out that productivity in terms of published works, which is what Simonton used to arrive at his equal odds rule, takes a certain level of expertise for granted. Whether students learn to develop multiple new ideas by writing down each other’s stories or not, it is likely important that they get practice with the basic skills of stringing words together into narratives. As Ericsson explains, “Unless the individual has the technical mastery to develop his or her ideas or products fully, it is unlikely that judges will be able to recognize their value and potential” (330). Though mere productivity may be what separates the professional from the game-changer, getting to the level of expertise required to reliably produce professional-grade work takes more than a bunch of blindly conceived ideas. As if defending Vakil’s assigned reading of “established writers,” Ericsson argues that “anyone interested in being able to anticipate better what is valued by experts in a domain should study the teachings and recognized masterpieces of master teachers in that domain” (330).

            Part of the disagreement between Simonton and Ericsson stems from their focusing on different levels of the creative process. In the domain of creative writing, even the skills underlying “technical mastery” are open to revision. Writers can—and certainly have—experimented with every aspect of storytelling from word choice and syntax at the fundamental level to perspective and voice at higher-order levels. The same is true for poetry. Assuming the equal odds principle can be extrapolated to each of these levels, teachers of creative writing might view their role as centering on the assessment of their students’ strengths and weaknesses, and then encouraging them to practice in the areas where their skills are weakest. “Develop what you lack” (360) is another of Csikszentmihalyi’s prescriptions. The teacher also has at her disposal the evaluations of each student’s classmates, which she might collate into a more unified assessment. Rather than focusing solely on a student’s text, then, the teacher could ask for the class’s impressions of the writer’s skills as evidenced by the text in relation to others offered by that student in the past. Once a few weaknesses have been agreed upon, the student can then devote practice sessions to generating multiple ideas in that area and subsequently evaluating them with reference to the works of successful authors.

            The general outline for creative writing workshops based on the expert performance framework might entail the following: Get the workshop students to provide assessments, based on each of the texts submitted for the workshop, of their colleagues’ strengths and weaknesses to supplement those provided by the instructor. If a student learns from such assessments that, say, his plots are engaging but his characters are eminently forgettable, he can then devote practice sessions to characterization. These sessions should consist of 1) studying the work of authors particularly strong in this area, 2) brainstorming exercises in which the student generates a large number of ideas in the area, and 3) an exercise at a later time involving a comparative assessment of the ideas produced in the prior stage. This general formula can be applied to developing skills in any aspect of creative writing, from word choice and syntax to plot and perspective. As the workshop progresses and the student receives more feedback from the evaluations, he will get better at anticipating the responses of the instructor and his classmates, thus honing the evaluative skills necessary for the third practice phase.

            The precedence of quantity of ideas over quality may be something of a dirty little secret for those with romantic conceptions of creative writing. Probably owing to these romantic or mystical notions about creativity, workshops focus on assessment and evaluation to the exclusion of productivity. One objection to applying the productivity approach within the expert performance framework, likely to be leveled by those with romantic leanings, is that such methods ignore the emotional aspects of creative writing. Where in the process of developing a slew of random words and images and characters does the heart come in? Many writers report that they are routinely struck with ideas and characters—some of whom are inspired by real people—that they simply have to write about. And these ideas that come equipped with preformed emotional resonance are the same ones that end up striking a chord with readers. Applying a formula to the process of conceiving ideas, even assuming it doesn’t necessarily lead to formulaic works, might simply crowd out or somehow dissipate the emotional pull of these moments of true inspiration.

            This account may, however, go further toward supporting the expert performance methods than toward casting doubt on them. For one, it leaves unanswered the question of how many ideas the writer was developing when the big inspiration struck. How many other real people had the author considered as possible characters before lighting on the one deemed perfect? Like the dreamer who becomes convinced of her dreams’ prophetic powers by dint of forgetting the much greater proportion of dreams that don’t come true, these writers are likely discounting a large number of ideas they generated before settling on the one with the greatest potential. Far from being left out of the training methods entirely, emotional resonance can just as easily be given priority as an evaluative criterion. It can even play a central role in the other two phases of practice.

            If the student reads a passage from another author’s work that evokes a strong emotion, she can then analyze the writing to see what made it so powerful. Also, since the emotion the passage evoked will likely prime the reader’s mind to recall experiences that aroused similar feelings—memories which resonate with the passage—it offers an opportunity for brainstorming ideas linked with those feelings, which will in turn have greater potential for evoking them in other readers. And of course the student need not limit herself to memories of actual events; she can elaborate on those events or extemporize to come up with completely new scenes. The fear that a cognitive exercise must preclude any emotional component is based on a false dichotomy between feeling and thought and the assumption that emotions are locked off from thinking and thus not amenable to deliberate training. Any good actor can attest that this assumption is wrong.

Part 3 of this essay.
