The Bulwark

AI and the Nature of Literary Creativity

Are ChatGPT and its competitors pushing literature toward its destruction—or just another in a long line of technological transformations?

Corbin K. Barthold
Sep 27, 2023
(The Bulwark / Midjourney)

IN 1968, ROLAND BARTHES pronounced “the death of the author.” David Foster Wallace distilled the French literary theorist’s rather screwy point a quarter-century later: “It is really critical readers who decide and thus determine what a piece of writing means.” Perhaps the “author” (the “entity whose intentions are taken to be responsible for a text’s meaning”) was “dead” in some postmodern sense; but the “writer” (the “person whose choices and actions account for a text’s features”) remained very much alive. Whatever “the death of the author” might mean, “one thing which it cannot mean,” Wallace assured us, quoting the novelist William Gass, “is that no one did it.”

Then, however, along came a wondrous and terrible technology—artificial intelligence. Barthes’s glib theatricality now looks surprisingly prophetic. The author is at risk, suddenly, of becoming dead dead. Or, to put it in Wallace’s terms, we are faced with the death of the writer. There really will be a text of which it can be said: no one did it.

Naturally, there are skeptics. (Wired headline: “Why Generative AI Won’t Disrupt Books.”) But for the moment, let’s focus on the believers. (Wired headline: “One Day, AI Will Seem as Human as Anyone. What Then?”) They are not few in number. “Eventually, no particular human skill”—not even creativity—“is going to differentiate us from AI.” (So opines a computer scientist.) Large language models like ChatGPT already have “a surprisingly nimble grasp of prose style.” (A novelist.) Before long, they might “allow readers to endlessly generate their own books.” (Another writer.) The “reading public” will soon “miss the days when a bit of detective work could identify completely fictitious authors.” (An internet researcher.) And at that point, we’re told (by a technologist), as AI “blur[s] the line between human and machine,” we will have to grapple with “some difficult questions” about who we are.

Will we, though? The camera has, in many ways, displaced the painter. Does that keep you up at night? Generative AI might, in many ways, displace the author, a transition that would indeed have profound aesthetic, economic, and even political consequences. But why should it make you feel any less human?


TO BEGIN WITH, THIS BUSINESS of “authorship” is contingent—historically, culturally, and otherwise. In his treatise on law and literature, Richard Posner considers the matter in the context of the birth of copyright. “Authorship,” he explains, “is an ascribed status rather than a natural kind”:

The notion that one is an author only if one wrote the work rather than having discovered, copied, improved, praised, financed, commanded, or sponsored it is a convention of particular cultures, a convention the causes of which may be as much material as ideological. A medieval writer of books was a member of a team of equally skilled craftsmen (others being the binder, the scribe or later the printer, the illustrator, the seller, perhaps the censor) engaged in the production of a book.

In the tradition-bound societies of the ancient and medieval worlds, Posner claims, a writer’s lodestar was “not original creation but creative imitation.” Then, however, came a wondrous and terrible technology—the printing press. Now a writer could make a living by selling a lot of books, rather than by relying on a patron. It became profitable for an author to speak in an original voice and attract a public following. (At the same time, it became costly to lose sales to cheap knockoffs—hence the copyrights.)

This simplified account can, of course, be elaborated, refined, complicated. Take, for instance, the wealth of factors that contributed to the rise of the novel as we know it. Jonathan Franzen mentions “the expansion of a literate bourgeoisie eager to read about itself, the rise in social mobility (inviting writers to exploit its anxieties), the specialization of labor (creating a society of interesting differences), . . . and, of course, among the newly comfortable middle class, the dramatic increase in leisure for reading.” Rounding out his list are the fall of “the old social order,” the rise of “the enterprising individual,” and rapid secularization.

We lived without novels before, and we could do it again. One of the great living authors, Zadie Smith, has this to say: “Most mornings I think: death of the novel? Yeah, sure, why not? The novel is not an immutable fact of human artistic life, after all, just a historically specific phenomenon.”


NOR IS THE CONCEPT OF AUTHORSHIP theoretically indispensable. Ideas can outlast castles and empires; isn’t it obvious, on some level, that they live their own lives? That they are distinct from, and independent of, their frail mortal hosts? For that matter, how does a single sentence form? True enough, most writers plan and prepare; but composition—putting the words on the page—is a mysterious process. There is an element of inspiration—of the author as medium for, rather than master of, what is set down. To return to Posner, a work’s meaning tends to “emerge” from “the act of creation or completion.”

A text and its author are not neat representations of each other. Smith once called Philip Larkin’s writing style “ethical.” Presented with the problem of Larkin’s chauvinism, she responded that she was praising “the things his poems believe in.” Her own best essays, she added, are “smarter than me in every way.” The intuition is a common one—that a writer can write better than he lives, or better than she knows. This is one reason, Posner observes, why trying to understand a work by reference to an author’s life, beliefs, or intent can diminish the work, making it less interesting, less insightful, less universal.

Some of the most distinguished novels are the most indeterminate. Because they see all time at once, like a mountain range, the Tralfamadorians of Slaughterhouse-Five reject free will. Does this mean Kurt Vonnegut did so as well? Not necessarily. He “leaves that question open,” Salman Rushdie concludes, “as a good writer should. That openness is the space in which the reader is allowed to make up his or her own mind.” Posner agrees: “Much great literature . . . achieves an equipoise, rather than a resolution, of opposing forces.” But then why should we get worked up about what the author “really thinks”?

Those poststructuralists—Barthes, Derrida, and the rest—were on to something. They were not wrong to ask whether literary interpretation is boundless, elusive, inexhaustible. One generation “depicted Shakespeare as a subversive writer,” notes Posner, another “as an orthodox spokesman of medieval Christian values.” The truth, in Smith’s view, is that Shakespeare is “a writer sullied by our attempts to define him.” Like all literary genius, he “is a gift we give ourselves, a space so wide we can play in it forever.”

In Crime and Punishment, the murderer Raskolnikov juggles motivations, endlessly disputing himself. Was Dostoevsky boldly crafting the first polyphonic novel? Or was he vacillating in his own mind, on the nature of his character, as he rushed to meet deadlines for a story published in installments? Does it really matter? Crime and Punishment escapes not only its time and place but even its author’s designs. And even if you insist on treating Dostoevsky’s biography—the mock execution; the mystical pre-seizure visions—as a compass for his fiction, you’re out of luck with Shakespeare. We know remarkably little about the Bard’s life, and he is famously removed from his own oeuvre; he lacks authorial presence. Yet we don’t need a map of the walking, talking Shakespeare to navigate his plays. They’d be just as enriching if they were written by someone else—or if they’d fallen from the sky.

Doubts are traitors. Wounds heal by degrees. Beware the daggers in men’s smiles. Do not try to read the book of fate. A man told us these things. A man who, as best we can tell, lived a mundane life. From where did these sublime remarks originate? Thoughts arise from the void; their source is unknown. There’s magic at play here. What is the fount of transcendence? Does it really matter?


THESE QUESTIONS PRECEDED artificial intelligence and they will persist alongside it. Decades ago, the deconstructionists tried to kill the author. Decades from now, many will still deny that she is dead. If you believe in authorship today, you can expect to go on believing in it tomorrow. In this instance, technology is unlikely to succeed where philosophy has failed.

Just ask authors themselves. In recent days, upon learning that their books appear in the massive datasets used to train large language models, many authors have expressed outrage and unleashed their lawyers. Understandably, they feel linked to their work, to their art, and no abstract argument is going to budge them.

Humans are not logic machines. A basketball is worth something. A basketball that a star player used for a game-winning shot is worth more. We have deep intuitions about human connection, and many of us will continue to apply those instincts to the act of storytelling. “For those of us civilians who know in our gut that writing is an act of communication between one human being and another,” Wallace said, “the whole question” of authorship seems “sort of arcane.” It was a given, for him, that “serious fiction’s purpose” is to provide human beings “imaginative access” to each other. Here’s Smith: “The true reason I read is to feel less alone, to make a connection with a consciousness other than my own.”

People will continue to write, just as people continue to play chess. And people will remain interested in other people’s writing, just as people remain interested in other people’s chess-playing. AlphaZero can run circles around the world’s grandmasters—yet humans keep watching humans. If anything, AI has made the best players more interesting to watch. Literature will likely head down a similar path. Even if a pile of code is someday hailed as the next Shakespeare, we’ll still be reading stories crafted by the Smiths, the Franzens, and the Wallaces. AI might simply make the best authors even more interesting to read. (Meanwhile, a few of us will continue to write with a pad and a pen—for pleasure, if not for profit.)

How much help from AI renders a story “non-human”? That will be a topic of debate—for a time. In 1719, Daniel Defoe published The Life and Strange Surprizing Adventures of Robinson Crusoe . . . Written by Himself. As Franzen points out, “many of its first readers took the story as nonfiction.” The eighteenth century was the moment, Franzen continues, when authors not only “abandoned the pretense that their narratives weren’t fictional,” but also “began taking pains to make their narratives seem not fictional—when verisimilitude became paramount.” From the start, the novel has been marked by evolution. Did the author make up her story? Did she write it with the help of generative AI? One day, these two questions may come to look equally quaint.

AI will unsettle received ideas. That is what wondrous and terrible technologies do. Yet when it comes to literature—to what fiction can teach us—much will stay the same. Or, if not, we’ll find fresh ways to understand ourselves. A new technology comes along, and we suffer an identity crisis. We worry about losing our humanity. But what’s new and alarming to one generation is old and ordinary to the next. We sort it out, we adapt. That, indeed, might be the most human trait of all.

The human being who wrote this article would appreciate it if you would pass it along to as many friends as possible.

A guest post by
Corbin K. Barthold
Internet Policy Counsel at TechFreedom.