Why GPT Won't Be Writing War And Peace

AI systems homogenise language, whereas language runs like hot blood through a writer's veins

No one can predict the future. Even Nostradamus, the "man who saw through time", got it wrong off and on. But I feel pretty confident in making this prediction: GPTs (Generative Pre-trained Transformers)—the neural network models that enable apps like ChatGPT to produce 'human-like' text, images and music—won't be wiping out the tribe of fiction writers, poets and artists from the planet any time soon.
Aren’t GPT models a stunning breakthrough in Artificial Intelligence (AI)? Of course they are. Aren’t the scale and speed at which they execute tasks amazing? Of course, they are mighty gods of efficiency. They excel at data extraction, translation, document summarisation; generate marketing material, targeted blog posts and animations; build websites and produce specific content for social media campaigns. Businesses across the world are using them. Many a hotshot marketing campaign swears by them. But this is not about what the machines do. This is about what they are not—human.
In his 2023 New York Times essay, 'The False Promise of ChatGPT', Noam Chomsky said, “OpenAI’s ChatGPT, Google’s Bard and Microsoft’s Sydney are marvels of machine learning…These programs have been hailed as the first glimmers on the horizon of artificial general intelligence—that long-prophesied moment when mechanical minds surpass human brains not only quantitatively but also qualitatively in terms of intellectual insight, artistic creativity and every other distinctively human faculty. That day may come, but its dawn is not yet breaking…”
The creation of literature and art, however flawed, is a human activity forged in the crucible of lived experience. A story or a novel or a piece of art springs from a particular consciousness. An individual sensibility at work. In the heart of the Russian winter, Anton Chekhov made a set of choices. In early twentieth-century England, Virginia Woolf made a whole other set of choices. And so we have The Seagull and Uncle Vanya and The Cherry Orchard; we have To the Lighthouse and Mrs Dalloway and A Room of One's Own.
AI systems homogenise language, whereas language runs like hot blood through a writer's veins. A writer's sensory experiences—touch and taste, sights and sounds of the world around her—shape her stories. They breathe life into settings, into characters and their inner lives. When I write a story, I channel every sight I have seen, every emotion that has pierced my heart, for better or for worse. Creative writing teachers tend to talk a lot about "voice", how voice can make or break your short story or novel. What is voice? It is the life that animates your writing, the flesh that grows on the bare bones of your ideas, the electric spark that makes your sentences crackle and sing. You can mine all the data on the planet and feed it to Large Language Models, you can finesse the models till kingdom come, but the chances are next to zero that they will create another War and Peace or Gone With the Wind or Gitanjali. My point: human-like is not the same as human. Technological mimicry has its limits. Especially when it comes to creativity and originality.
Ted Chiang, the uncannily prescient sci-fi writer, pointed out in an interview that AI engineers who link words such as "learn", "know" and "understand", and personal pronouns like "I", with chatbots such as ChatGPT create the false perception that these AI tools are sentient entities. "The machines we have now, they're not conscious," said Chiang. Without consciousness, creation is not even a remote possibility. Art is marinated in individual consciousness. It cannot bloom in the confines of a programmed machine, no matter how state-of-the-art, efficient, or marketable the machine may be. Chiang's novella The Lifecycle of Software Objects, a brilliantly imagined work of fiction, depicts artificial intelligences built in a digital world as entities that need human "parenting" and a whole lot of handholding to understand the real world. Despite all the help humans provide, they still struggle. I'd say it's recommended reading for anyone who fears that the machines will take over and that human creativity is doomed!
According to Jeff Hancock, Professor of Communication at Stanford University, the ability of AI systems to generate text is both a blessing and a curse for communication. It has major implications for how humans communicate and for human-computer interaction. This is a grey area right now, as not much research has been done on it yet. But it is safe to say that AI systems will change the skillsets writers need in the future. Writers who learn to work with these tools and use them to their advantage will have an edge in the job market. A recent news report about job losses among copywriters at a UK-based company came with an interesting twist. Several members of a team of copywriters lost their jobs to automation. Since the AI model was cheaper, the company gave it the task of generating outlines and writing articles. This did not go as planned, though. The AI-generated articles turned out to have a weird, overly formal tone and did not sound "human" enough. To solve the problem, some of the flesh-and-blood copywriters were brought back to humanise the AI-generated text and make it sound less stilted!