Do stories have to appeal to either the intellect or the emotions? Can they do both? Can they do neither and still work as stories?
I am instinctively suspicious of setting up a duality of intellect and emotion. What we know shapes what we feel, and what we feel shapes what we know. Consider this passage from Night by Elie Wiesel:
“But we had reached a station. Those who were next to the windows told us its name:
‘Auschwitz.’
No one had ever heard that name.”
This is a gut-punch. But only if you know what Auschwitz was. Without that knowledge, the lines are bland.
All good stories have to appeal to our emotions, I think. That is to say, they have to engage us, make us care and want to read on. The most fundamental story technique for doing that is to make us empathise with the characters. But empathy is not the only technique or the only emotion stories deploy.
Consider the well-known “hook”. This usually comes right at the beginning of the story: the device that makes us sit up and take the bait. The normal emotion here is intrigue, or curiosity. For example, this opening to Dodie Smith’s I Capture the Castle:
“I write this sitting in the kitchen sink.”
Who can resist reading on to discover why she is in the sink?
Curiosity is an emotion with a heavy dose of intellect. It is the emotion that drives scientific enquiry. Even in empathetic reading, there is a strong dose of curiosity. The reader asks themselves “If I were in this situation, how would I react?”, because reading fiction is, among other things, a rehearsal for social life. We may enter story worlds to engage with situations we have never experienced (at least not in quite the same form), to learn how we might behave, to exercise greater courage, or to discover a more authentic way of being ourselves.
I would argue that stories that deploy emotion without intellect are almost always composed of “easy” emotional ploys: tropes we instantly recognise without occasioning any need for examination or self-examination. The king is good, the stepmother is bad, the innocent princess is imperilled. Such stories are almost always sentimental, giving us a simple and affirming “hit” of emotion without troubling us in any way. The emotions have bulk, but they fail to nourish us. Similarly, stories can appeal to intellect without engaging emotion: they deploy puzzles where we are interested in discovering the solution, even if the characters are flat. Detective fiction often falls into this category.
Finally, can a story appeal neither to emotion nor to intellect? I would argue not, but I stand open to persuasion.
“Verbing” (or denominalisation) is the practice of turning nouns into verbs. For example, Matt Damon in The Martian saying “I’m gonna have to science the shit out of this.” The noun “science” here becomes a verb.
My enquiry into the habit began when I queried the use of the verb “to mirror” in a friend’s novel set in the early nineteenth century. I wondered if this was a modern habit, perhaps derived from the compressed speech of texts and Twitter.
Verbing is old. Really old.
Some verbing is so old, we no longer recognize it. We are unfazed by “rain” as a verb, or by the act of “buttering bread”. And the practice goes even further back. The verb “enchant” is a borrowing from Old French.
Techniques for verbing
As with enchant, putting the prefix “en” in front of a noun is a common way of making new verbs. Shakespeare was a great one for doing this. For example, Iago says to Othello “Do but encave yourself”. Many other examples of Shakespearean coinages can be found in David Crystal’s article Verbing: Shakespeare’s linguistic innovation.
Substituting a name for an action is another common technique. Hence we get the verb boycott (after Charles C Boycott, an English land agent in 19th century Ireland who refused to reduce rents for his tenants and was, in consequence, ignored by local residents). We also get the verbs hoover and google in this way.
Finally, of course, a noun can simply become a verb. Consider these lines from Shakespeare’s Richard II: “Within my mouth you have enjailed my tongue, / Doubly portcullised with my teeth and lips.” There are two examples here: enjailed and portcullised.
A powerful and direct, if uncreative, example of the latter technique comes from the exasperated threat of parents to importuning children. “Can I have an ice cream? Please?” “Ice cream? I’ll ice cream you.”
Why do we verb?
One motive may be impact. Compressed expression has an immediacy that a full exposition may not. Consider Burt Lancaster’s demand for a light, “Match me,” in the 1957 film Sweet Smell of Success. The brevity of the line conveys his character’s power over, and contempt for, the man he’s addressing far more effectively than “Will you light my cigarette, please?” would have.
The desire for brevity has two sources: the linguistic shortcut, and the attempt to collide words together, as if in a particle accelerator, to see what new meanings come off. The shortcutting (the word is itself a verbing) is most evident in acronyms. LOL is a text abbreviation for Laugh Out Loud (and not, as David Cameron believed, Lots of Love). Like verbing, acronyms have a long history. Consider, for example, the ancient Roman SPQR (Senatus Populusque Romanus—the Senate and People of Rome), stamped on anything official.
But, finally, let’s consider the particle-smashing element of verbing. When a noun (a thing) and a verb (an action) are smashed together, we get something new: a process, a thing that changes and evolves in time. Nouns and verbs (as well as pronouns, conjunctions, prepositions, adjectives and adverbs) are just convenient ways of cataloguing our linguistic world, not necessarily a reflection of the real world. It’s possible everything may be a process rather than an object or an action. What would our world be like if we saw it this way? Here’s to the catting sitting matting!
Could an AI write a story? Yes, such stories already exist. 1 the Road was published in 2018. Here is a list of some others. World Clock was published in 2013, Dinner Depression in 2019. The Day a Computer Writes a Novel was entered in 2015 for the third Hoshi Shinichi Award, a Japanese sci-fi competition, and passed the first judging round. There is also a collection of books entirely written by AIs.
None of these stories is perfect, and those that were not edited by humans tend to be rambling and incoherent. At that stage, AI-generated fiction was still not very good: the plots tended to be prosaic and the characterisation shallow. But the field is advancing by leaps and bounds.
ChatGPT
The third generation of the language-generating AI Generative Pre-trained Transformer (GPT3), introduced in 2020, can hold remarkably human-like conversations and write passable fiction. You can play with GPT3 and explore its abilities through ChatGPT (though you’ll have to surrender both your e-mail address and your telephone number) or through writing apps such as Sudowrite and Jasper.
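For the curious, GPT3 can also be prompted programmatically. Here is a minimal sketch, assuming the pre-1.0 openai Python package and the “text-davinci-003” model that were current at the time of writing; both the interface and the model name have since been superseded, so treat the details as illustrative rather than definitive:

```python
# A minimal sketch of prompting GPT3 from code. The model name and the
# pre-1.0 `openai` interface are assumptions of this illustration.
import openai

openai.api_key = "YOUR_API_KEY"  # issued with your OpenAI account

response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Continue a story that opens: "
           "'I write this sitting in the kitchen sink.'",
    max_tokens=150,     # cap the length of the continuation
    temperature=0.9,    # higher values give more surprising output
)
print(response.choices[0].text.strip())
```

The temperature parameter is, in effect, a dial for predictability: turn it down and the model plays safe; turn it up and it gambles on less likely words.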
The consensus of technical opinion is that GPT3 is “scary good” at tasks such as copywriting, composing essays, and holding human-like conversations. However, it also makes mistakes, so don’t rely on its output. Its makers admit it can create “plausible-sounding but incorrect or nonsensical answers.” This is perhaps because, as some critics observe, it is very good at putting words into an order that makes sense from a statistical point of view, but has no awareness of their meaning or correctness. That may be an overly harsh judgment: ChatGPT was good enough to score between B and B- on an MBA exam, though it made a fairly monumental arithmetical error.
Here’s where the field gets philosophically interesting.
The debate
There has been quite a debate about whether an AI might surpass the abilities of a human writer. Below is a flavour of some of the positions from a writing website to which I belong:
“Better than quite a few writers, for sure. I’ve literally seen worse. So, impossible that AI could someday be indistinguishable from a real human? It’s already on par with a lot of real people, if not better than the worst writers.”
“It also doesn’t matter to those looking for literary fiction if the genre readers are getting their books through a computer software program. What these readers want, the software cannot provide.”
“As writers, we are somehow biased by the ethical question – some of us see AI-generated text as a mere tool (as Photoshop is for visual artists), while others consider it cheating. For me, speaking solely as a reader, if the book is good, I wouldn’t mind that it’s written partly or entirely by an AI.”
“I think this is hella cool. It’s at least a basic foundation some writers can use upon which to flesh out their ideas, if they so choose. They’re still writing the story, executing the ideas in their own unique way.”
“I can’t think of any technology that has, or could, replace human creativity.”
“Can it ever deliver emotionally and philosophically illuminating stories in ways that skilled and experienced authors can? Personally, I doubt it, because story telling isn’t just plot. Or characters. Or subplots and twists and opening sentences and all the “rules” people like to clutter their imaginations with.”
“Compared to my very limited life, and the pathetically tiny amount of literature I have consumed, ChatGPT in its current form already has vastly more experience to draw upon than me. Even with just a short time playing with it, it has written short stories with characters and settings that I could never have dreamed of writing, and come up with ideas that I could never have thought of.”
“I really can’t believe the people who are saying AI will never write better than humans. It literally writes better than me at this moment.”
“But what about meaning? What about the illuminating ideas of self and behaviour and memory and emotion and justice? Do you believe that personal expression – the epiphany of the author in the scenes they write and the meaning they are trying to share with others is something that software can create?”
There are several views here. One holds that AI is a tool for authors, much as dictionaries, thesauri, word processors, grammar and spelling checkers are tools. Another holds that a sufficiently complex AI should be able to write works that would satisfy readers. Still another holds that, while AI may be capable of writing formulaic genre fiction, only a human writer can be truly creative.
The second and third positions are philosophical arguments about what it is to be human. To be human, the third position argues, is to attach meaning to things and manipulate them symbolically to create new things. The second position implicitly denies there is anything particularly special about creativity: that it’s just a highly complex set of mental operations.
Brain and consciousness
Let’s explore these two positions about humanity. I acknowledge from the start that machines are not conscious (at least not yet) and do not “understand”. GPT3 is a language program trained on a huge data set of writing. There is a reason that understanding consciousness is labelled “the hard problem” by philosophers and neuroscientists: we know quite a lot about what brains are and how they work, but consciousness has so far evaded scientific explanation. Machine learning, then, is not capable of understanding meaning. Instead, GPT works by detecting language patterns, following rules it has generated about which words are likely to follow other words.
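To make that concrete, here is a toy sketch of statistical next-word prediction. It is emphatically not how GPT works internally (GPT uses a vast neural network, not a simple word-pair table), but it illustrates the underlying idea: generating text from patterns of what follows what, with no model of meaning anywhere in the machinery.

```python
import random
from collections import defaultdict

def train_bigram_model(text):
    """Count, for each word, which words follow it and how often."""
    model = defaultdict(lambda: defaultdict(int))
    words = text.lower().split()
    for current, following in zip(words, words[1:]):
        model[current][following] += 1
    return model

def generate(model, start, length=12):
    """Produce text by repeatedly sampling a statistically likely next word."""
    word, output = start, [start]
    for _ in range(length):
        followers = model.get(word)
        if not followers:
            break
        candidates, counts = zip(*followers.items())
        word = random.choices(candidates, weights=counts)[0]
        output.append(word)
    return " ".join(output)

corpus = ("it was the best of times it was the worst of times "
          "it was the age of wisdom it was the age of foolishness")
model = train_bigram_model(corpus)
print(generate(model, "it"))
# e.g. "it was the age of wisdom it was the best of times it"
```

Feed such a model enough text and its output begins to look fluent, yet at no point does anything in it “know” what the words mean.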
I’m going to present three concepts here that may help in unravelling the problem. The first is the Turing Test; the second is the Chinese Room problem; and the third is the role of metaphor in creativity.
First, the Turing Test. Proposed in 1950 by the mathematician Alan Turing, the test assesses whether people can tell when they are conversing with a machine. If the evaluator cannot reliably tell the machine from the human, the machine would be said to have passed the test.
Second, the Chinese Room. This is a 1980 thought experiment by the philosopher John Searle in rebuttal of the Turing Test. He imagines himself sealed in a room with access to the instructions used by a language computer which can answer questions in Chinese. Questions are fed in through a slot and he follows the instructions, enabling him to write out entirely correct answers without speaking a word of Chinese. This, Searle argues, is what artificial intelligence is doing. You will see that my position concurs with Searle’s: the machine does not “understand” anything.
The question is, does it matter that the machine understands nothing? Since we don’t know what consciousness is, we can’t measure it directly, but we can infer (though again without proof) that other people possess it. If our judgment of a respondent’s humanity is all we can rely on, we would have to conclude that the ability to perform as if conscious is indistinguishable from being conscious. In the case of creative writing, the reader’s response is the arbiter. The Turing Test becomes: could a sufficiently discerning reader tell that a piece of fiction was written by a computer? Already, this may be difficult and will certainly become more so as AI advances.
The nature of creativity
This brings me to the third element: the nature of creativity. The quotes from the writers’ discussion above contain the view that while a machine can follow the rules of a formula, it would be incapable of investing this with original meaning and creativity. Let us grant that many readers enjoy repetitions of formulae; that is what the strictures of genre mean. There is no shortage of formulae available to writers, the Hero’s Quest being among the most popular. So let’s consider only writing that possesses greater literary “depth” and that explores complex meaning.
Where does that depth and meaning come from? It would, in principle, be possible to write a set of rules for deep writing by specifying what the meaning behind the story is, and some recurring motifs to express this. But would a machine be able to use these effectively and creatively? What is creativity? The Cambridge Dictionary defines it as “the ability to produce or use original and unusual ideas”, which is good enough for my purpose here. Understanding creativity is, arguably, almost as difficult a problem as consciousness. But there are techniques and routines for developing the habit of creativity, such as Edward de Bono’s methods. If your thinking is stuck, try adding a wild card to free up creativity (for example, “how could you use spaghetti to solve this problem?”). One of the GPT apps, Sudowrite, offers a facility for “adding a twist” to a story.
Most spaghetti ideas don’t work, but a few do. And exploring them frees up creativity.
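The wild-card trick is mechanical enough to sketch in a few lines of code, which is rather the point: a jolt of randomness is easy to automate. The word list and prompt shape below are my own illustration, not de Bono’s published method or Sudowrite’s feature.

```python
import random

# Random provocations in the spirit of de Bono's wild-card technique:
# pair the problem at hand with an unrelated noun and force a connection.
WILD_CARDS = ["spaghetti", "lighthouse", "glacier", "violin",
              "compost", "trampoline", "origami", "tide"]

def provoke(problem):
    """Return a deliberately incongruous prompt to jolt fresh thinking."""
    card = random.choice(WILD_CARDS)
    return f"How could '{card}' help solve this: {problem}?"

print(provoke("my protagonist's motivation feels flat"))
# e.g. "How could 'origami' help solve this: my protagonist's motivation feels flat?"
```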
I want to finish by suggesting a mechanistic answer to how creativity works, which is an extension of the spaghetti idea. It’s not my concept but one developed by Donald Schon in his book Invention and the Evolution of Ideas. He argues, and this pleases me as a writer, that metaphor is at the root of creativity, whether in the arts or the sciences. A metaphor or simile relates unlike things (“my love is like a red, red rose”). We know they’re unlike, but in conjoining them, our sense of each of them changes, the one illuminates our understanding of the other. James Clerk Maxwell used the well-understood properties of waves to explore the mathematics of electricity and magnetism and uncovered the physics of electro-magnetism. He used water waves as a metaphor for electrical waves.
If there are, indeed, “algorithms” for creativity, a machine should be programmable to replicate it.
In the skies above the port, the neon lights and holographic advertisements flickered and pulsed like the synapses of some vast, artificial brain, the electric nerves of the city stretched taut against the darkness. The port itself was a glittering hive of activity, a mass of chrome and steel, the beating heart of the sprawling metropolis.
Was this passage written by a person or a machine? It has metaphor. As does this one:
Sometimes the scattered thoughts of their deaths run like a jagged red seam of fire inside me and I burn from the inside out, like a lightning-struck tree; the outside whole, the inside, that carried the lightning’s charge, a coal. At other times, I feel empty, transparent, a child of the wind…they are gone, I tell myself. Nothing comes back
One of these passages was written by ChatGPT. The other by a prize-winning human author. Can you tell the difference? It would be great to hear your decision and the reasons for it.
Judges for the 2018 Man Booker Prize appealed to authors to edit. “Occasionally we felt that inside the book we read, was a better one – sometimes a thinner one – wildly signalling to be let out,” said chair of the judging panel, Kwame Anthony Appiah.
I have great sympathy with this appeal. A few years ago, I wrote a blog post bemoaning the sharp increase in the length of novels after 1950. I also remarked there that this is an odd phenomenon in an age in which, we are told, attention span is shortening and instant gratification is the norm.
The modern reader, confronted with this opening of Dickens’s A Tale of Two Cities, might be expected to scrawl TL;DR (too long; didn’t read) and move on:
“It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to heaven, we were all going direct the other way — in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only.”
The reason for the discrepancy between short attention span and long novels is probably that the novels I used for comparative analysis are those that received critical approval, not necessarily those from the mass market.
Though I enjoy brevity, I want to make a few points here in defence of verbosity:
To be verbose is not necessarily to be imprecise. The opposite of verbose is “succinct”, not “precise”.
There is pleasure to be had in lush description.
Verbosity has real narrative function.
Look back at the Dickens excerpt. What is he telling us? He is writing about the era of the French revolution, and contrasting two opposed worlds: one of radical change and the other of conservative stasis. He could, of course, have just said that. Instead, he makes us experience the contradiction. In the mode of the writing coach, I might say he “shows” us, rather than “tells” us. And, surely, that is the job of the writer—to allow us to live for a moment in another’s reality.
Excessive description (with a few exceptions like the Biblical Song of Solomon) is largely absent from literature before the advent of the modern novel. Just take a look at Homer’s spare style, if you doubt this.
The classics of earlier eras are plot-rich. There was little drive to explore the inner life of protagonists because their goodness or evil was a function of their actions, not their thoughts and feelings. These stories were created for homogeneous communities with a shared understanding of the world. So, exploration of inner worlds would have been superfluous. They also describe worlds where change generally came slowly and yesterday was much like today.
Today, many of us live in diverse communities where the pace of change is dizzyingly fast. Understanding that diversity and capturing the fleeting present is one of the functions of description in fiction.
If I want to understand you, I need to appreciate your perception and your motivation, not just your actions. We all experience things differently. And, perhaps, the modern obsession with recording everything, and of “making memories” rather than simply experiencing things, is the clue to why verbosity matters. If the slow world of the past generated stories full of fast action, our fast world needs slow stories that capture the moments before they’re gone.
I wondered why these three posts (on descriptive writing, magic realism, and writing sex) should be the most popular. They’re not necessarily my best posts. The answer, I thought, probably lay in the way search engines work. A post is likely to be read if it (a) deals with a topic that is often searched for; and (b) has relatively few competitors, so it’s thrown up on the first page of a search.
My post on descriptive writing appears on the first page in a search for “The art of descriptive writing”, and my Magic Realism post appears as number one on the list generated by the search string “Magic Realism, Fantasy and Surrealism”.
And yet my sex writing post does not appear in the first three pages of results in response to either “How to Write Sex” or “Writing Sex”. That was curious, and I investigated further. For the first three years after it was posted, it averaged 23 reads a year. And then it took off, rising to 165 reads in the fourth year, 196 the following year, and 395 so far this year. So perhaps this post achieved its popularity by word of mouth. But the descriptive writing post showed the same pattern, while the magic realism post rose and then declined in popularity.
My uncle was a headmaster and an English teacher by profession. I once asked him if he knew the poet e.e. cummings. He replied, “No, I haven’t had that pleasure. If you mean, do I know of him, the answer is yes.” We’ve all encountered grammar snobs, and writers’ inboxes bulge with well-meaning suggested corrections from friends and colleagues. The interesting thing is that, often, the suggestions invoke rules that don’t exist.
Rule 1: Never start a sentence with a conjunction
Can you start a sentence with a conjunction? Many people believe the answer is no. But, grammatically, it’s fine. See? I started that sentence with the conjunction “but” (“and” and “or” are other common conjunctions).
What was the effect of using “but” there? I could have written it as “Many people believe the answer is no, but, grammatically, it’s fine.” Separating the thoughts with a full stop rather than a comma is a way of emphasising the contrast (“On the one hand this. But, in fact, on the other hand, that”).
The truth is you can start a sentence with a conjunction if it feels right. Bear in mind, though, that some people will think it’s a grammatical mistake and be pulled out of the flow. Also (strictly a conjunctive adverb rather than a conjunction), don’t overuse the device, or your writing will start to seem choppy.
Rule 2: Never end a sentence with a preposition
Ending a sentence with a preposition is a mistake up with which you should not put. Sounds odd, doesn’t it? It’s much more natural to say “Ending a sentence with a preposition is a mistake which you should not put up with.” Again, this is perfectly grammatical. There is no rule that you can’t end a sentence with a preposition. It’s simply less formal. Prepositions are words like to, up, at, in, of, for, with, etc. They show the relationship between one thing and another. If you are writing formally (such as in a report) you might want to avoid ending sentences with prepositions. For example, you’d probably be better advised to say in court “That’s the town in which I live”, rather than “That’s the town which I live in.”
Rule 3: Never split an infinitive
This rule is an odd one. It wasn’t introduced formally until the nineteenth century and was gone by the end of the twentieth. Three generations were taught that it is grammatically incorrect “to boldly go”.
An infinitive is a verb preceded by its particle “to”. When an adverb is inserted between the “to” and the verb, the infinitive is “split”. The Merriam-Webster dictionary says “the objection to the split infinitive has never had a rational basis.”
Rule 4: Never use adverbs
Don’t even get me started on this one. Stephen King famously said “the road to hell is paved with adverbs”. Yet, across 51 books, he used an average of 105 adverbs per 10,000 words. That’s more than Ernest Hemingway (at 80 per 10,000 words), and more than six other famous writers, though less than E.L. James at 155 per 10,000 words. The rule is silly. Adverbs do a job—modifying verbs, as adjectives modify nouns. Of course, they should not be used where all they’re doing is strengthening weak verbs. Enough said.
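Out of curiosity, here is roughly how such a rate might be computed. Counting words that end in “-ly” is a crude heuristic (it misses adverbs like “often” and miscounts nouns like “family”), and I don’t know the methodology behind the published figures, so treat this as an illustration only:

```python
import re

def adverb_rate(text, per=10_000):
    """Estimate adverbs per `per` words using the crude -ly heuristic."""
    words = re.findall(r"[a-zA-Z']+", text.lower())
    # Require more than four letters so that "only", "fly", etc. are skipped.
    ly_words = [w for w in words if w.endswith("ly") and len(w) > 4]
    return len(ly_words) / len(words) * per

sample = "He ran quickly and quietly through the silent, empty house."
print(f"{adverb_rate(sample):.0f} adverbs per 10,000 words")  # 2000 here
```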
Rule 5:
There really should be a fifth rule to debunk, if only for reasons of number magic. Five sounds complete, whereas four sounds slapdash. Oh dear.
By convention, we think of character as the essence of the novel. At least of the literary novel. Or, at least of the Western literary novel.
I picked out some writing advice from a random internet search. Masterclass has this to say:
“While a mastery of plot can help you develop exciting twists and turns, great character development draws readers in by giving them strong characters with whom they can identify…A novel consists of a character interacting with events over time.”
Someone told me that once you have a character with a want, you have the beginnings of a plot.
But, of course, the idea that stories should be about characters striving to achieve their goal is a comparatively new one in story-telling, and depends on historical preconditions. These include the development of the belief that what goes on in other people’s heads is interesting, and the existence of societies in which it’s meaningful for individuals to strive towards goals.
Orhan Pamuk, the Turkish novelist and winner of the Nobel Prize for Literature, has an interesting take on the primacy of character.
“The aspect of the human being we call ‘character’ is a historical construct and,…just like our own psychological and emotional makeup, the character of literary figures is an artifice we choose to believe in… Since I believe that the essential aim of the art of the novel is to present an accurate depiction of life, let me be forthright. People do not actually have as much character as we find portrayed in novels, especially in nineteenth- and twentieth-century novels…
“Furthermore, human character is not nearly as important in the shaping of our lives as it is made out to be in the novels and literary criticism of the West. To say that character-creation should be the primary goal of the novelist runs counter to what we know about everyday human life…In the beginning, there are patterns formed by people, objects, stories, images, situations, beliefs, history, and the juxtaposition of all these things—in other words, a texture…
“The character of my novel’s main protagonist is determined the same way a person’s character is formed in life: by the situations and events he lives through…The defining question of the art of the novel is not the personality or character of the protagonists, but rather how the universe within the tale appears to them…
“Novelists do not first invent a protagonist with a very special soul, and then get pulled along, according to the wishes of this figure, into specific subjects or experiences. The desire to explore particular topics comes first. Only then do novelists conceive the figures who would be most suitable for elucidating these topics.”
Orhan Pamuk, The Naïve and Sentimental Novelist, Faber & Faber, 2011, pp. 67–77
Note that he is not denying the premise that readers become immersed through characters. He is saying that character is a product of a wider interplay of forces (what he calls texture) than simply a want. This texture is shaped by landscape, class, gender, and ethnicity. Character brings those forces to life. As a writer who was shaped by successive periods of military rule in Turkey, he is probably more sensitive to historical and political forces than are his Western counterparts.
Does a story have to be original to succeed? Indeed, can a story ever be original?
The answer may depend on how we define the term, story. I’m going to distinguish here between the terms, “plot” and “story”. A plot is the WHAT of a tale: what happens and to whom. A story is the HOW, the way the plot is related: in what sequence, with what stylistic devices.
There have been repeated claims that there’s only one story (Joseph Campbell’s idea of the Monomyth), or seven (Christopher Booker), or 31 (Vladimir Propp). Other thinkers suggest other numbers. In all cases, what they are talking about is plot, not story.
Even if there were only one plot, there might be a large (perhaps infinite) number of ways of telling it. Consider these examples of basic plots and their realisations:
Characters converge (usually in high school or university) and then diverge, but changed: The Group, Mary McCarthy; The Interestings, Meg Wolitzer; Private Citizens, Tony Tulathimutte.
A swindler is double-crossed, either out of vengeance or greed: The Grifters, Jim Thompson; The Mark Inside, Amy Reading.
A character’s fall: Tender is the Night, F. Scott Fitzgerald; Mrs Dalloway, Virginia Woolf.
The protagonist is deceived until the scales fall from her eyes: The Portrait of a Lady, Henry James; Gone Girl, Gillian Flynn.
Forbidden love: Romeo and Juliet, William Shakespeare; Anna Karenina, Leo Tolstoy.
A woman turns down proposals of convenience and settles on an unexpected Mr. Right: Pride and Prejudice, Jane Austen; Jane Eyre, Charlotte Brontë; Bridget Jones’s Diary, Helen Fielding.
More at https://www.vulture.com/2016/08/encyclopedia-of-every-literary-plot-ever.html
So, while plots may not be original, stories can be. What devices turn a plot into an original story?
The words. Evocative language can turn a pedestrian plot into a thing of beauty.
Characterisation. People are individual, and endlessly fascinating in the way they act and see the world.
The point of view. Relating The Three Little Pigs from the Wolf’s point of view creates a fresh story.
The point of telling. This is the location in the plot from which the author starts to tell the story. A tale told from the middle or the end can feel very different from one told in sequence.
In these senses, a story can be original. Indeed, it must be original. If a story doesn’t resonate and hum, give me a sense of something fresh, a new insight, a new way of seeing a familiar problem, why would I want to read it?
I remember the future I was promised so long ago. There would be flying cars, and Dick Tracy watches that allowed you to communicate by speaking into your wrist. We got the Dick Tracy watches (cellphones), but not the flying cars or personal jet packs. Nor did we get colonies in space or universal plenty. But we got other stuff that nobody predicted.
We got a climate crisis and degradation of the environment: wildfires, floods, and famine. And a pandemic. And, so far from universal plenty, it seems we don’t have enough truck drivers to guarantee food delivery to the shops, or enough carers to look after the frail and elderly.
It would be easy to be scared. Apocalypse novels sell by the thousands. There’s a lot around that’s scary. But perhaps there always has been. A generation ago, we were terrified by the threat of nuclear annihilation. And we survived. A generation before that, there was the World War to defend civilization. And a generation before that, the War to End All Wars. There was a pandemic in that generation too.
Yet, we survived. To survive now we need to be able to re-imagine a future we want to live in. Because crises don’t just go away by themselves—we have to want a change and work for it. A world without the vested interests of big oil and the snooping of big tech. Clean cheap energy. Food, shelter and a meaningful life for everyone. That seems worth working for, and it’s within our grasp. What would be the path towards that? Perhaps we need story-tellers to help us visualize that future.
What are stories for? According to a provocative book by Angus Fletcher, they are technologies invented (or discovered) to help us deal with life experience. They are psychotherapeutic tools.
Wonderworks: The 25 Most Powerful Inventions in the History of Literature reviews the blueprints for literary technologies that Fletcher claims can be scientifically shown to alleviate grief, trauma, loneliness, anxiety, numbness, depression, pessimism, and ennui, while sparking creativity, courage, love, empathy, hope, joy, and positive change. He argues that they can be found throughout literature, from ancient Chinese lyrics to Shakespeare’s plays.
His aim is to subvert centuries of literary scholarship, asking not “what is this story about?” but rather “what are its effects on us?”. You might say, at the risk of anti-intellectualism, that narratives are not to be studied but experienced.
He gives each of these technologies an annoying name. For example, the Hurt Delay (giving us distance on trauma), with which he explores Sophocles’ Oedipus; and the Almighty Heart (instilling courage), with which he explores Homer’s Iliad. He explores the neurobiology of fear and courage through the origins of fear in the amygdala, and the neuropharmacological response in the counterbalancing releases of adrenaline and oxytocin.
I must first acknowledge that the book is beautifully and engagingly written, and applaud his sentiment that stories are there to be enjoyed. And then I have to confess to a strong distaste for its underlying framework.
The distaste isn’t that of a literary scholar. Granted, Fletcher takes immense liberties with the context of the works he cites and with their authors’ probable intents. He claims, for example, that Shakespeare’s Hamlet stages a play about a king murdered by his brother as a tribute to his dead father, when generations of literature students have known that the play-within-the-play is there so that Hamlet can observe his uncle’s reaction and gauge his guilt for the murder.
He may well also be accused of playing fast and loose with the chronology of literary creation. But these sins would be easily forgivable if they served to expose a deeper reality.
Nor is my distaste for the neurobiological exploration of narrative. Anything in the universe is a potential object for scientific investigation, though I’m not always persuaded by Fletcher’s piecing together of the neuronal circuitry.
Fletcher’s universalism recalls ideas of an earlier time, when the world, and consequently scholarship, were in the grip of totalizing systems. Those ideas were superseded by ones of a later time, which emphasised context and cultural diversity. I make no claim that the earlier ideas were wrong and the modern fashion right, merely that the latter is more to my taste. I’m not even sure it’s possible to prove ideas in literary criticism right or wrong.
What concerns me about Fletcher’s reprise of an earlier era’s concerns is that it is forced to strip stories of their specificity. And that specificity is the source of their delight. The one-sentence summary “The spoiled Emma’s pride makes her prejudiced against Mr. Darcy, though they eventually realize they’re perfect for each other” (https://www.huffpost.com/entry/one-sentence-guides-to-16_b_98480) tells the reader little about whether they’ll enjoy Pride and Prejudice. The specificity of the story-telling is everything.
Stories are produced by cultures, not neurons, as Laura Miller observes in a critical essay on the book (https://slate.com/culture/2021/03/wonderworks-angus-fletcher-review.html). She notes that culture determines “who in a society is permitted to read and write, who (if anyone) pays the author for her work, how the work is circulated, what its audience expects of it, etc.” She accuses Fletcher of a “calculating utilitarianism” which reduces literature and reading to a feel-good therapy. His enterprise rests on the totalising universalism that argues that human beings face relatively unchanging problems created by the way our brains work, and that a set of enterprising literary entrepreneurs have been steadily inventing solutions to these problems.
He has ignored all the specificity and cultural diversity that created the epic poetry of the classical and pre-classical eras. These epics were about gods, about fate, about heroism. He has ignored all the factors that had to come into existence before stories could be about individual people and their feelings. Laura Ashe (https://www.historytoday.com/miscellanies/invention-fiction), for example, argues that for fiction, as we know it today in the West, to come into existence, we had first to develop the notion that individuals and their feelings mattered. She writes:
“In Old English poetry, to be an individual, cut off from these collective bonds, is to be lost. More than this, there is no attention to an inner life that can be meaningfully distinguished from exterior action. Will the warrior make good on his boasts in the mead hall? Only in action is a man’s value known; intention is nothing.”
What changed, she argues, is a set of economic, political and theological conditions in twelfth-century England that permitted a literature in which the Romance could flourish. “This,” she says, “is the literary paradigm which gives us the novel: access to the unknowable inner lives of others, moving through a world in which their interior experience is as significant as their exterior action.”
This is not to ignore the fact that in other places and at other times, writers have explored the theme of love, its joys and its sadness. But let us not forget that literary forms are inventions that partake of their cultures. Many story traditions in Asia are still indifferent to the idea of a “protagonist” and the changes he or she undergoes. Rather, stories are about the unfolding of circumstances.
There are many other cultural variations in story-telling. For example, while the normal Western story is composed of three parts (beginning, middle or climax, end or resolution), some Asian cultures have a four-part form, known as kishōtenketsu in Japanese. The structure here is beginning, middle, twist, end. All of this specificity and cultural detail is irrelevant to Fletcher’s project. And the acid test is whether, as a writer, it equips me to write better stories or, as a reader, to gain more enjoyment. The answer to both is no.