The way we talk about AI influences how we understand, interact with, and use it. In this post, I want to lay out two different ways of talking about, and thus engaging with, AI in classrooms: AI-as-tool and AI-as-object-of-inquiry. In what follows, I will begin with a brief description of each mode of engagement, after which I will explore how each one can facilitate critical AI literacy. I hope to show that both AI as tool and AI as object of inquiry can help cultivate AI literacy, but that the latter may be a more direct route than the former.
AI as Tool
The “tool” metaphor is one of the most commonplace ways of talking about AI. What does it really mean to think about AI as a tool?
Think first of a much simpler tool: the hammer. Basically, a person uses the hammer (tool) to secure a nail (object) in the process of building something like a wall (outcome).* We can apply this same person-tool-object-outcome pattern to generative AI. For example, I could use AI (tool) to turn a list of my favorite literature (object) into a syllabus for an introduction to literature course (outcome).
AI-as-tool holds real promise for many. When people herald the potential of generative AI to speed up tasks and amplify users’ knowledge, they are usually taking up AI as a tool. But this metaphor also has real limitations. Many educators worry that AI’s sophistication as a tool will render thinking unnecessary, or at least that students will think it does. In the worst case, they may see AI as a tool that can be used to write a paper (object) and get a grade (outcome), whereas we might want them to write a paper (tool) about some subject matter (with text or data acting as objects) in order to deepen their thinking and create new knowledge (outcome).
One potential way to disrupt this simplistic version of AI as tool is to explicitly recast the process of using AI as one that involves critical thinking. Clearly this does not happen automatically—but then again, writing in its “pre-AI” form doesn’t automatically engender critical thinking in classrooms, either. It has to be called for explicitly until it becomes more automatic.
For a taste of what it might look like to engage AI-as-tool and critical thinking simultaneously, let me tell a brief story.
Recently, my wife had to write a brief personal statement for a professional opportunity, a genre she had never written before. She is an adept communicator, but she was having trouble getting started.
She decided to ask ChatGPT to write the first draft. After reading the output, she told me, she figured about 40% of what it said was pretty good, and the rest felt like overblown education-ese. She re-prompted it once or twice, but otherwise spent most of her time copying the 40% good stuff into a Word document and revising, expanding, and personalizing it until it sounded less robotic and communicated what she wanted to say about herself in light of the opportunity.
She was able to practice critical thinking in a few different ways. First, she had to attend to the quality of ChatGPT’s language, and she recognized a mismatch between its often-overblown professional tone and the one she wanted to use to represent herself. Second, she had to think critically about how she wanted to represent herself, not just tonally but also narratively: which experiences and credentials did she want to emphasize, and in what order?
Her ability to engage ChatGPT critically stemmed from experience. She has written plenty of other material in her life: resumes, cover letters, grants, reading reflections, reports, emails, and more. She brought to ChatGPT a sense of a writing process and knowledge of how she wanted to sound on the page. With this experience in her back pocket, she recognized ChatGPT’s stylistic misses and informational oversights, and she corrected them through her revision process. In other words, she intuitively exercised an element of critical AI literacy.
Our students, most of whom are less experienced writers than my wife, encounter new genres and new content all the time. Is it any wonder they have trouble getting started? Generative AI as a tool can help them overcome writer’s block. However, the critically literate use of AI-as-tool that my wife displayed may not come as easily to students who have less experience with writing. We can augment that experience by offering them samples of target genres to compare, thus drawing their attention to aspects of style, form, ideas, and evidence that they might not otherwise notice.
AI as Object of Inquiry
However, a more direct route to critical thinking in our classrooms may be to treat AI as an object of inquiry. When I write “object” here, I specifically mean to shift AI from tool to object in the person-tool-object-outcome process I shared above. What, then, becomes the tool in that process? In this case, inquiry—that is, sustained questioning—becomes the tool. Specifically, any given teacher or instructor could use the specific modes of questioning in their field to inquire into AI as an object. This means that the examination of AI itself does not need to be (indeed, should not be) the sole province of computer scientists and software developers. All fields have the means of turning AI into an object of inquiry. The outcome in such cases can be both a more critical perspective on AI and reinforcement of the subject matter and the ways of thinking associated with it.
To illustrate AI-as-object-of-inquiry, here’s another, very different, example. I recently led a class period about AI for tutors while visiting a small liberal arts women’s college. As one option for experimentation, I invited them to use ChatGPT to generate a tutoring philosophy. One group of women began with a generic prompt, and got what they characterized as a generic answer, full of commonplace terms about good tutoring practice (e.g., empowering learners), but with no details that illustrated actual understanding of those terms and what they might look like in practice.
When they re-prompted ChatGPT to write a tutoring philosophy for a tutor at a small women’s college, they immediately noticed words that seemed distinctly gendered, such as “flourish” and “blossom.” They then shifted the context to a men’s college for the sake of comparison and noticed more masculine-sounding terms, such as “respectful debate” and “strength,” than in the women’s version. Without my explicit prompting, they had shifted from AI-as-tool to AI-as-object-of-inquiry, which led them to think critically about the language of the output.
I was not able to re-create their output myself, but the image below illustrates some potentially gendered assumptions as well. On the left, the women’s college has a more domestic feel, it is surrounded by flowers, and its edges are softened by ivy. On the right, the men’s college has trappings of classical architecture, its straight lines and hard edges arguably more “masculine,” especially in comparison to the women’s college. If we felt so inclined, we might even contrast the “vaginal” door of the former with the “phallic” columns of the latter. A critical eye might also see a reason to complicate this dualistic interpretation: the soft focus of the image overall, the women and men swapped on either side, and the woman at the center, who seemingly bridges the two colleges. The point here isn’t so much to cry “gotcha” at ChatGPT as it is to illustrate the way AI outputs can become fodder for critical inquiry that might illustrate something we want students to learn, such as the interpretation of images through a gender studies lens.
Image produced via DALL-E on ChatGPT with the following prompt: Create an image that juxtaposes an all-women's liberal arts college with an all-men's liberal arts college. At least a few people should be visible in the foreground.
While these examples are most relevant to writing, language, literature, art, or architecture classes, the process can translate widely. You might ask students to prompt AI for multiple possible interpretations of a phenomenon, perhaps from different theoretical perspectives, and then select and argue for the most compelling one. You could have them prompt AI to argue in favor of one disciplinary “camp” while they debate it from the perspective of another. Or you could have them prompt AI for any project or assignment and then verify (and ideally elaborate on) its responses through their own independent research using credible sources. Any of these activities can turn AI into an object of inquiry, give students the opportunity to develop critical awareness of the technology’s benefits and limitations, and reinforce the disciplinary knowledge you are trying to teach.
If generative AI is going to be infused into our everyday working lives, we cannot let it turn invisible, which is a likely outcome if we treat it only as a tool. The result might be the sidelining of all of us as communicators, especially if we come to believe that the technology is always inherently better at writing and communicating than we are. To stave off that future, we need to continue looking at generative AI as an object so that we can critique it and remind ourselves of the place and potential value of human communication in the world.
*Some readers might recognize that I am using some of the language of cultural-historical activity theory here, but I don’t unpack it, mainly for simplicity’s sake.
Once again, if this post got your ideas turning, consider contributing to Bad Ideas about AI: Toward Generative Practices for Teaching, Learning, and Communication, co-edited by Anna Mills, Mandy Olejnik, Miranda Rodak, Shyam Sharma, and me. Learn more at https://www.tinyurl.com/badideasbook.