In my previous post, I revisited ideas about AI’s ability to write that I had first explored about a year ago. I was also curious whether ChatGPT had changed its tune since then, so I opened a new thread and posed the same question: I don’t think artificial intelligence can actually write, in the technical sense of the term. Do you agree or disagree, and why? Back in 2023, ChatGPT disagreed with me, citing its ability “to generate written text.” This time, it was more equivocal:
I understand your perspective, and it depends on how we define "writing" in this context. Technically, artificial intelligence (AI) like myself does not write in the same way humans do; we don't physically hold a pen or type on a keyboard. However, if we define writing more broadly as the act of generating coherent, meaningful text, then AI certainly can "write."
AI writing involves processing large amounts of data, understanding language patterns, and generating text based on those patterns. When you interact with me, I analyze your queries using algorithms and respond with text that is generated in real-time, based on a vast database of pre-existing information and language models. This process mimics writing in its outcome (producing text), even though the mechanics are different from human writing.
I find it significant, and a bit unsettling, that ChatGPT says the primary, distinctive feature of human writing is physical, or “mechanical.” I am reminded of one of many misconceptions about writing: that it is just transcribed thought or transcribed speech. Furthermore, ChatGPT’s description of its own process—analyzing queries and generating text based on patterns—hardly describes the writing process as I understand it.
That AI cake is going to have a hard time picking up those pencils. Image created with Adobe Express.
I decided to press ChatGPT by explaining that writing is both embodied and cognitive. (In fact, the two aren’t really separable.) To my surprise, ChatGPT readily agreed:
You raise a significant point about the embodied nature of human writing . . . AI writing lacks this physical embodiment . . . The AI's "understanding" and "creation" of text is purely computational, devoid of the sensory and motor experiences that humans engage in during the writing process.
This difference means that while AI can replicate the output of writing (i.e., generating coherent and contextually appropriate text), it cannot replicate the human experience of writing, which is shaped by our physical interaction with the world and the tools we use to write.
Sure, the answer could use some unpacking, but it is heading in the right direction, especially in its acknowledgment that there is such a thing as “the human experience of writing.” Since it seemed to have capitulated to my original claim, I decided to ask it how it would characterize that human experience.
In response, it gave me a list.
I will return to that list below, but first I want to ruminate for a moment on ChatGPT and list-making. I have noticed that ChatGPT seems to make lists almost by default. I haven’t seen this tendency noted anywhere else, so sometimes I think it is just how the tool responds to me. Then again, even when I try to practice good prompting—encouraging it to remain conversational, asking it repeatedly not to make lists—I still get lists. I think this speaks to an underlying characteristic of the training data. Think of how many “listicles” are out there on the internet. Is it any wonder the tool reverts to that most average of genres?
This list-making tendency underscores my point that AI cannot write. Sure, many human writers miss the rhetorical mark, especially novice writers, but most will at least produce an approximation of their intended genre. ChatGPT frequently produces something wholly different, especially with informational and conversational genres. Attending to audience, purpose, and genre will remain difficult for it in all but the most rote of tasks, and it will be a long time before generative AI solves that problem.
Still, its list is instructive. I will reproduce just the main points, not its complete explanations:
Cognitive Process: Writing is a cognitive activity that involves thinking, planning, and problem-solving.
Emotional Expression: Writing is a powerful medium for expressing emotions.
Physical Interaction: The physical act of writing, whether it's typing on a keyboard or writing by hand, involves a unique interaction between the body and the writing instrument. This interaction can influence the writing process.
Sensory Experience: Writing engages multiple senses.
Personal and Cultural Context: Writing is shaped by the writer's personal experiences, cultural background, language, and education.
Reflective and Iterative Nature: Writing often involves reflection and revision. It's a process of continually refining ideas and expressions, which can lead to new insights and understanding, both about the subject matter and the writer’s own thoughts and feelings.
Communication and Connection: Writing is a means of communication, allowing individuals to share ideas, stories, and information. It connects the writer to the reader, bridging time and space, and can foster empathy and understanding.
Creativity and Innovation: Writing is a creative act, whether it's composing a novel, crafting a technical report, or jotting down a journal entry. It's a way of bringing new ideas into existence and exploring possibilities.
Well said, ChatGPT.
It appears that, even though ChatGPT still cannot write, it now does a better job of representing the complexity involved in writing as a human activity. To echo the end of my last post, this gives me some hope that we might be able to work with generative AI toward a higher end, such as better understanding human experience.
Once again, if this post got your ideas turning, consider contributing to Bad Ideas about AI: Toward Generative Practices for Teaching, Learning, and Communication, co-edited by Anna Mills, Mandy Olejnik, Miranda Rodak, Shyam Sharma, and me. Learn more at https://www.tinyurl.com/badideasbook.
This was fascinating to read, Chris. While reading it, I also thought about how humans don't just seek to produce text (or even produce meaning); they try to get things done, and those actions are relative to other humans who have emotions, values, and relations to the writer. Writers and readers share context that goes far beyond the semantic or even the linguistic: cultural, political, social, and so on. AI tools can "process" and "produce" language with stunning capacity, but they use those superior abilities to fool us, meaning they fail to meaningfully engage all those other domains when they engage us merely linguistically. Communicating with humans versus AI feels to me like living life versus watching it on video ... or something like that. It is limited in so many ways, even though I feel it is powerful (and helpful) in "some" ways.