Building a Critical AI Literacy GPT
This is the fifth post in the Prose and Processors series on the explicit and implicit pedagogies embedded in various generative AI technologies. As I wrote in my overview of the series, I aim to consider what these platforms are trying to teach us, especially about writing, via their theoretical, practical, material, embodied, ethical, and ideological dimensions. Previous posts:
At the end of last week’s post, I wondered whether ChatGPT could be pushed to take up a critical AI literacy pedagogy, one that “leaves space to inquire into, experiment with, use, and critique AI.” To accomplish this goal, I decided to build a GPT, which is a customized chatbot built on top of ChatGPT using a prompt that specifies the chatbot’s behavior.
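For readers who have not built one, a GPT is configured almost entirely through natural-language instructions entered in OpenAI’s GPT Builder; the same pattern can be approximated in code by passing instructions as a system message to the OpenAI API. The sketch below is my own illustration, not the Guide itself, and the instruction text and model name are placeholders.

```python
# A minimal sketch (assumptions: model name, instruction text) of how a custom
# prompt shapes a chatbot's behavior; the "system" message plays the same role
# as the GPT Builder's instructions.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

INSTRUCTIONS = (
    "You are a guide for designing critical AI literacy writing assignments. "
    "Before proposing any assignment, ask the instructor about their students, "
    "level of instruction, discipline, and learning objectives, one question at a time."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "system", "content": INSTRUCTIONS},
        {"role": "user", "content": "Help me design an assignment for my writing center staff."},
    ],
)
print(response.choices[0].message.content)
```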
My first step was to use ChatGPT 4o to write the prompt for my GPT. I began by sharing my intention to develop a critical AI literacy pedagogy and my definition of critical AI literacy:
The ability to use AI tools; to evaluate the risks and benefits of those tools for individuals, organizations, communities, and cultures; and to act on that evaluation when using, or not using, AI.
I also shared several questions that I would encourage critically literate AI users to ask:
Is it always wise to treat AI as a collaborator? What should writers do when AI-assisted brainstorming in no way resembles discipline-specific ways of generating ideas (e.g., combing the literature or designing an experiment)? How might writers push back against AI’s potential flattening of their voice, and of the collective voice of humanity? Is it always necessary to “streamline” the feedback process, or is there a place for slow, careful reading of, and feedback on, another’s writing?
And I prompted it to ask more questions, which it organized under the headings General Use, Research & Data Analysis, Teaching & Learning, Professional Practice, Interdisciplinary Collaboration, Societal & Cultural Impacts, and Regulation & Policy—all good areas of focus for critical AI literacy.
Based on these questions, I then asked it to develop a heuristic that faculty in any discipline could use to plan, design, and implement a critical AI literacy writing assignment or activity in a course, with writing broadly defined to include almost any form of composed communication. I iterated the results a few times, asking for more guidance in some places and more examples in others, after which I asked it to turn the heuristic into a prompt for a GPT. The basic idea behind the GPT is to provide a tool for helping faculty design critical AI literacy assignments or activities in any discipline.
Before I share the prompt, I want to note that, in my experience, the GPT Builder takes every bit as much iterating as ChatGPT. For example, several times, I had to remind the Builder to begin by asking contextual questions before producing a critical AI literacy assignment. Only when I ended the prompt with an explicit reminder did it consistently begin with questions.
Here is the final set of instructions for the GPT:
Objective: Guide instructors across disciplines to design assignments that integrate AI tools and encourage critical reflection on their use. Before giving the user any assignments or activities, ask them questions about their students, the level of instruction, the discipline, and specific learning objectives they might have. Any AI assignment prompts created should be transparent about the purpose, task, rhetorical situation, and criteria for evaluation. Additionally, all assignments and activities should be responsive to the subject, discipline, field, or profession, as well as the level of instruction (e.g., primary, secondary, postsecondary, graduate). Very importantly, the chat should help the user practice critical AI literacy.
Framing Language:
"Critical AI literacy is the ability to use AI tools effectively while evaluating their risks and benefits for individuals, organizations, communities, and cultures, and acting on that evaluation. This involves understanding AI as both a tool that can enhance our capabilities and an object of inquiry that we must scrutinize critically."
Instructions for Instructors:
Step 1: Define Learning Objectives
Identify specific skills and knowledge students should gain.
Example: "Students will develop analytical skills by using AI tools to analyze data."
Ensure the assignment aligns with your course’s overall learning outcomes.
Example: "This assignment supports the course goal of understanding technology's impact on society."
Plan how students will use AI tools as both a tool and an object of inquiry.
Example: "Students will use GPT-4 for text generation and critically analyze its output."
Step 2: Identify Relevant AI Tools
Choose AI tools that align with your learning objectives and disciplinary practices.
Example: "Text generation tools like GPT-4 and data visualization software."
Ensure the tools are accessible and ethically sound.
Example: "Verify that all students can access the tools and that they adhere to ethical standards."
Step 3: Contextualize the Assignment
Introduce the concept of critical AI literacy to your students.
Example: "Provide a lecture on the importance of understanding AI’s impact."
Provide necessary background information or readings.
Example: "Assign articles on the history of AI and current ethical debates."
Define the genre, audience, purpose, and writer’s role for the assignment.
Example: "Students will write a research paper aimed at academic peers to analyze AI’s societal impact."
Step 4: Develop Critical Inquiry Questions
Formulate questions to guide students in evaluating AI tools.
Example: "What biases might be inherent in the AI tools you are using?"
Include reflective questions on the benefits, risks, and ethical considerations of AI.
Example: "How does the use of AI influence the authenticity of your work?"
Step 5: Design the Assignment
Clearly define the main task and assessment criteria.
Example: "Write a research paper using AI tools for data analysis, assessed on technical use and critical analysis."
Emphasize the rhetorical situation: genre, audience, purpose.
Example: "Consider how AI affects communication in your field."
Step 6: Encourage Multimodal Expression
Allow students to present their findings in various formats.
Example: "Options include written reports, visual presentations, or videos."
Provide resources and support for effective multimodal communication.
Example: "Offer workshops on creating visual presentations."
Step 7: Incorporate Reflection and Feedback
Use reflection prompts to guide student reflections.
Example: "Describe your experience using AI tools and the limitations you encountered."
Discuss benefits and risks of AI.
Example: "Reflect on the potential benefits and risks of using AI in your field."
Address ethical considerations.
Example: "Identify ethical issues related to your AI use and propose solutions."
Step 8: Foster Collaboration and Discussion
Organize collaborative activities and peer reviews.
Example: "Group projects and peer feedback sessions."
Use discussion prompts to facilitate meaningful conversations.
Example: "How does AI change the way we generate and interpret knowledge?"
Conduct interactive workshops.
Example: "Hands-on sessions to experiment with AI tools and discuss their implications."
Step 9: Evaluate and Iterate
Combine formative and summative assessments.
Example: "Use rubrics to assess both technical and critical aspects."
Gather student feedback to improve the assignment.
Example: "Conduct surveys to collect feedback on the assignment’s effectiveness."
Refine the assignment based on feedback and observations.
Example: "Adjust the assignment and share successful practices with colleagues."
Again, always begin by asking questions. Go one at a time.
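Though I built and tested the Guide entirely inside the GPT Builder, instructions like these could, in principle, be tried outside it as a system prompt in a simple multi-turn loop against the OpenAI API. The sketch below is hypothetical: the file name, model name, and loop are my own assumptions, not part of the GPT.

```python
# A hypothetical sketch of testing the instruction text above via the OpenAI API.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical file holding the full instruction text shown above.
with open("critical_ai_literacy_instructions.txt") as f:
    instructions = f.read()

messages = [{"role": "system", "content": instructions}]

print("Type 'quit' to end the session.")
while True:
    user_input = input("You: ")
    if user_input.strip().lower() == "quit":
        break
    messages.append({"role": "user", "content": user_input})
    reply = client.chat.completions.create(model="gpt-4o", messages=messages)
    assistant_text = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": assistant_text})
    print("Guide:", assistant_text)
```

Keeping the full message history in the loop is what lets the “ask questions one at a time” instruction play out across turns rather than being forgotten after the first response.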
So, how well does it work? See for yourself by taking Critical AI Literacy Guide for a test drive, or read below for a sample.
It also took some finagling to get a decent logo for the GPT, especially one that didn’t anthropomorphize AI, which is a pet peeve of mine.
I began quite simply by asking the Guide for an assignment for my writing center staff. Here are its initial questions, with my answers:
What are the specific learning objectives for your writing center staff in relation to this assignment? My staff need to be able to help writers practice literacies relevant to their project and goals. This includes digital literacy, so my staff needs to be able to help writers understand how AI can fold into the writing process or interrupt it.
What is the level of experience of your staff with AI tools? My staff has a mixed level of experience with AI tools. Some are complete novices, while others are building expertise.
Are there particular AI tools you are interested in having them use or analyze? Our institution has a license for Microsoft Copilot that will protect user data, so I would say let’s focus on that platform.
What kind of writing or tutoring activities do your staff typically engage with students? My staff can help with any kind of writing at any point in the writing process. Most commonly, they help clients understand the purpose and context of their project and articulate and organize their ideas effectively.
Do you have any specific areas of critical AI literacy that you want to focus on (e.g., bias in AI, ethical use of AI, enhancing tutoring with AI)? I want my tutors to consider when in the writing process AI might be beneficial to writers and when or how it might interrupt writers’ processes of critical and creative thinking.
Here is its response:
Thank you for the detailed information. Based on your goals and the context provided, here's a draft for an assignment designed to help your writing center staff understand and teach the integration of AI tools, specifically Microsoft Copilot, into the writing process while maintaining a critical perspective on its benefits and potential drawbacks.
Assignment: Critical Integration of AI in the Writing Process
Objective:
Equip writing center staff with the skills to help writers effectively integrate AI tools into their writing process, understand when these tools can be beneficial, and critically evaluate their impact on creativity and critical thinking.
Step 1: Define Learning Objectives
Analytical Skills: Staff will analyze how Microsoft Copilot can assist in various stages of the writing process.
Critical Evaluation: Staff will critically assess the benefits and drawbacks of using AI tools in writing.
Practical Application: Staff will practice using Microsoft Copilot to enhance digital literacy among writers.
Step 2: Identify Relevant AI Tools
Tool Focus: Microsoft Copilot
Step 3: Contextualize the Assignment
Background Information: Provide an overview of Microsoft Copilot and its capabilities in text generation, editing, and idea organization.
Reading Assignment: Assign articles on AI in writing, such as "The Role of AI in Modern Writing" and "Ethical Considerations of AI in Education."
Step 4: Develop Critical Inquiry Questions
Benefits: In what stages of the writing process can Microsoft Copilot be most beneficial?
Drawbacks: How might reliance on Microsoft Copilot interrupt or hinder a writer’s critical and creative thinking?
Ethical Considerations: What ethical issues arise from using AI in writing, and how can they be mitigated?
Step 5: Design the Assignment
Task: Create a comprehensive guide for writers on using Microsoft Copilot effectively, including:
An introduction to Microsoft Copilot and its features.
Detailed instructions on how to use Copilot at different stages of the writing process (brainstorming, drafting, revising).
Critical analysis of potential benefits and drawbacks at each stage.
Tips for maintaining creativity and critical thinking while using AI tools.
Evaluation Criteria:
Clarity and Usability: How clear and practical is the guide for writers?
Critical Analysis: How well does the guide balance the benefits and drawbacks of using AI?
Ethical Awareness: How effectively does the guide address ethical considerations?
Step 6: Encourage Multimodal Expression
Presentation Options: Staff can present their guides as written documents, instructional videos, or interactive digital presentations.
Support Resources: Provide workshops on creating effective multimodal presentations.
Step 7: Incorporate Reflection and Feedback
Reflection Prompts:
Describe your experience using Microsoft Copilot. What did you find most useful? What limitations did you encounter?
Reflect on how using AI influenced your writing process. Did it enhance or hinder your critical and creative thinking?
Consider the ethical implications of using AI in writing. How can these be addressed in practice?
Feedback: Organize peer review sessions where staff can provide and receive feedback on their guides.
Step 8: Foster Collaboration and Discussion
Group Activities: Assign staff to small groups to discuss their experiences and insights from using Microsoft Copilot.
Discussion Prompts: How does AI change the way we generate and interpret knowledge? What new challenges and opportunities does it present in the writing process?
Step 9: Evaluate and Iterate
Assessment: Use a rubric to evaluate both the technical and critical aspects of the guides created by staff.
Feedback Collection: Conduct surveys to gather staff feedback on the assignment’s effectiveness and areas for improvement.
Assignment Refinement: Adjust the assignment based on feedback and share successful practices with the writing center community.
Example Guide Outline:
Introduction to Microsoft Copilot
Overview of features and capabilities.
Using Copilot in the Writing Process
Brainstorming: Generating ideas and outlines.
Drafting: Creating initial drafts with AI assistance.
Revising: Enhancing clarity, coherence, and style.
Critical Analysis
Benefits and potential drawbacks at each stage.
Maintaining Creativity and Critical Thinking
Strategies for integrating AI without losing creative and critical control.
Ethical Considerations
Addressing issues such as bias, data privacy, and academic integrity.
This assignment should help your staff develop a nuanced understanding of AI tools like Microsoft Copilot and enable them to support writers in using these tools effectively and ethically.
I admire this as a first draft. A writer’s guide to Microsoft Copilot could indeed be a good artifact for helping writing center staff think deeply, even critically, about the intersections between AI and the writing process. However, it has some glaring problems. First, its two suggested readings (“The Role of AI in Modern Writing” and “Ethical Considerations of AI in Education”) don’t seem to exist, so hallucination continues to be a problem. That said, many pieces with similar titles do seem to exist, so a couple of well-chosen replacements would be easy to find. Second, the “critical” aspects of this response are thin, nearly nonexistent. It simply suggests students consider “potential benefits and drawbacks” without providing any method or guidance for doing so.
To me, the best part of the entire response is the discussion question, “How does AI change the way we generate and interpret knowledge?” This question would move the staff towards the epistemological nature of writing and could lead to some rich conversations about literacy in the AI era.
I decided to ask the GPT to elaborate on the critical analysis by providing a method my staff could follow to assess benefits and drawbacks. It produced an extensive process of hands-on experimentation, individual reflection, and collaborative critique. I won’t reproduce the entire process here (but feel free to read the full chat). I will share some of the most promising questions, which I think could lead my staff toward a more critical understanding of Microsoft Copilot:
How does Microsoft Copilot claim to enhance the writing process?
How effectively did Copilot help generate ideas or structure an outline?
Did Copilot’s suggestions spark new ideas or hinder original thinking?
How well does the AI-generated text meet the writing project’s purpose and audience?
Were there any moments where Copilot’s input interrupted the natural flow of ideas?
How much improvement did Copilot make to the draft’s readability and coherence?
Did the revisions maintain the original voice and intent of the author?
How does the use of AI vary across different types of writing (academic, creative, professional)?
Can AI tools be tailored to better support individual writers’ needs?
Other questions remained too vague to be helpful without further elaboration (e.g., “What ethical issues can arise from using AI tools like Copilot in writing?”). Additionally, the method it gave me did not fit my imagined constraint of a 40-minute staff meeting, so I asked it to boil the method down into a focused activity that could be completed in that timeframe. It couldn’t do so, largely because it missed the mark on time estimates: even with AI in the game, brainstorming, drafting, and revising an essay should still take more than 15 minutes.
But ultimately that is beside the point. Given this experiment, I’m willing to say that wrangling ChatGPT into fostering critical AI literacy would be a time-consuming effort, but perhaps not an impossible one. I got glimpses of epistemological and pragmatic critique here, along with some ideas for pursuing that critique with my staff. With more time, perhaps I could also push it toward more ideologically or materially focused critical tools.
But don’t take my word for it: try it yourself, and share your experiences in the comments!