Creating and Using Rubrics for Assessment 

A rubric is a scoring tool that breaks down the expectations for an assignment into grading criteria. Rubrics serve as a guide for students to complete an assignment successfully and as a measurement tool for instructors to determine to what degree students have met the assignment’s expectations. Rubrics are highly flexible and can be used for a wide variety of assessments. Besides instructor grading, rubrics can also be useful for peer review and student self-evaluation. This guide provides an overview of the different types of rubrics, considerations for creating and using them, as well as examples you can borrow from.

Advantages of Using Rubrics

Using rubrics for assessment can benefit both the student and the instructor. Rubrics provide transparency in how an assignment will be graded, helping students understand their instructor’s expectations. For instructors, using rubrics can help ensure that their assignment’s grading criteria are aligned with course and assignment outcomes. Grading with a rubric can also increase consistency and objectivity, whether you are the sole grader or are working with a co-teacher or TA. Although creating a rubric requires an initial time investment, it can make your grading process more efficient in the long run.

Rubric Types and Components

Rubrics can be used to assess a wide range of activities – formative and summative assessments, written and oral reports, individual and group projects, and everything in between. Most rubrics list the criteria along the left side and performance level categories (e.g., “meets” or “does not meet” expectations) along the top, creating a matrix for scoring. Other rubrics may omit these performance level categories in favor of freeform comments. A rubric may or may not have points attached to each criterion, depending on how the rubric is being used to assess a student’s work.

Criteria

A rubric defines the criteria used to assess an activity, project, or performance. On a typical rubric, the criteria are listed along the left side, and the document is divided into rows. The number of criteria a rubric contains will vary greatly depending on the complexity of the task being assessed and how granular the instructor would like the grade breakdown to be. A rubric for a simple activity might only have two or three criteria, whereas a rubric for a complex summative assessment might have ten.

Generally speaking, a rubric’s criteria should be:

  • Mutually exclusive. Criteria should not overlap with one another to avoid awarding or detracting points for the same category more than once.
  • Objective. Criteria should be measurable and rely on concrete, observable evidence. Try to avoid using subjective terminology like “interesting” or “good.”
  • Exhaustive. The listed criteria should cover all aspects that an assignment is designed to assess. Likewise, the point total for a rubric should match the point total for the activity.

Additionally, a rubric’s criteria should align with the assignment and course outcomes. As you develop a rubric, compare its criteria with the outcomes of the assignment. Are there any elements you need to assess that are not captured in the rubric? Are there elements in the rubric that are irrelevant to the assignment’s purpose? If you answered “yes” to either of these questions, consider revising your rubric’s criteria to more accurately reflect the assignment’s learning outcomes.

Performance Levels

Most rubrics are broken down into performance levels that describe the quality of a student’s work and/or the level of completeness. Like criteria, the number of performance level categories can vary greatly depending on the type of assessment and the preferences of the instructor. Including more performance levels allows for more granular grading, but also makes a rubric more complex. Performance levels are usually listed as a scale along the top of a rubric, dividing the document into columns.

Example Performance Level Scales

2 Performance Levels

  • Meets Expectations / Does Not Meet Expectations
  • Complete / Incomplete
  • Yes / No

3+ Performance Levels

  • Exceeds Expectations / Meets Expectations / Does Not Meet Expectations
  • Advanced / Proficient / Developing / Beginner
  • Excellent / Good / Fair / Poor / Unacceptable/Inadequate

Descriptions

If you include performance levels, you should also explain what these levels look like for each criterion. For example, if “organization” is a criterion for a written report, what exactly does “excellent” organization in a report look like? What about a paper with “good” or “fair” organization? These descriptions should clarify any ambiguity about the criteria and the performance levels, guiding students in their successful completion of the assignment.

Points

It is common for each criterion of a rubric to have a point value tied to it. The point values can be the same for each criterion, or they can vary if some criteria are a bigger contributing factor to students’ success on the assignment compared to the other criteria. If the rubric uses performance levels, each performance level should be assigned a point value as well. The highest performance level is awarded the maximum point value for a given criterion, with the rest of the performance levels assigned decreasing amounts of points accordingly.
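
To make this concrete, a rubric with points can be thought of as a small lookup table: each criterion maps each performance level to a point value, and the score is the sum over criteria of the points for the level selected. The sketch below is purely illustrative (the data structure, names, and point values are assumptions, not any particular tool's format):

```python
# A rubric as a nested mapping: criterion -> performance level -> points.
# "Evidence" is weighted more heavily than "Organization," since criteria
# may carry different point values.
rubric = {
    "Organization": {"Excellent": 10, "Good": 8, "Fair": 6},
    "Evidence":     {"Excellent": 15, "Good": 12, "Fair": 9},
}

def score(rubric, selections):
    """Total the points for the performance level selected per criterion."""
    return sum(rubric[criterion][level] for criterion, level in selections.items())

total = score(rubric, {"Organization": "Good", "Evidence": "Excellent"})  # 8 + 15 = 23
```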

If you’re not sure how to define point categories on a rubric, first determine the maximum number of points you’d like to award for a given criterion. Then, set a point value scale based on this maximum point value and the number of performance categories. Not every point scale will contain “0”, but if the criterion is something that a student could hypothetically earn no points on, you may want to factor that into your point scale.

You will also need to decide if you want each performance level to correlate to a single point value or encompass a range of point values. Using point value ranges allows for more flexibility in terms of scoring but it can also make grading more complicated than using set values. For example, if the “excellent” performance level is worth between 8 and 10 points, that allows you to assign a score of “9,” “9.5”, or any score that falls within that range when grading.

Example Point Scales

Let’s say you have a criterion worth 10 points and five performance level categories. Here are a few ways you could go about setting your point value scale depending on your grading needs. Notice that for the “Point Value Range” example there must not be any gaps or overlaps in the score ranges.

Set Point Values (Omitting Zero)

  • Excellent: 10 pts
  • Good: 8 pts
  • Fair: 6 pts
  • Poor: 4 pts
  • Incomplete/No Submission: 2 pts

Set Point Values (Including Zero)

  • Excellent: 10 pts
  • Good: 7.5 pts
  • Fair: 5 pts
  • Poor: 2.5 pts
  • Incomplete/No Submission: 0 pts

Point Value Range

  • Excellent: 10 to >8 pts
  • Good: 8 to >6 pts
  • Fair: 6 to >4 pts
  • Poor: 4 to >2 pts
  • Incomplete/No Submission: 2 to 0 pts
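
An evenly spaced scale like the examples above can be computed directly from the maximum point value and the number of performance levels. The helper below is a minimal sketch (the function name and signature are illustrative, not part of any rubric tool):

```python
def point_scale(max_points, n_levels, include_zero=False):
    """Return evenly spaced point values, highest level first.

    If include_zero is True, the lowest level is worth 0 points;
    otherwise the scale stops one step above zero.
    """
    # The spacing between adjacent levels depends on whether 0 is on the scale.
    step = max_points / (n_levels - 1 if include_zero else n_levels)
    return [round(max_points - i * step, 2) for i in range(n_levels)]

# A 10-point criterion with five performance levels:
point_scale(10, 5)                     # [10.0, 8.0, 6.0, 4.0, 2.0]  (omitting zero)
point_scale(10, 5, include_zero=True)  # [10.0, 7.5, 5.0, 2.5, 0.0]  (including zero)
```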

Using Rubrics Without Points

It is also possible to use a rubric without point values. If you’d like, you can grade students using just the performance categories or by writing freeform comments for each criterion. This can be useful for low-stakes formative assessments, in-class practice activities, and peer review exercises. Using rubrics without points also allows you to provide qualitative feedback for work graded on a complete/incomplete basis.

Recommendations for Using Rubrics

In addition to the decisions outlined above regarding criteria, performance levels, descriptions, and points, here are a few recommendations to consider when using rubrics. These strategies can help you make the most out of rubrics as both a teaching tool and an assessment tool.

One of the key advantages of using rubrics for assessment is that they can make your expectations more transparent to students. By sharing the rubric for an assignment in advance, students can use it as a guide to successfully complete the assignment. This practice is beneficial for all students but has particularly positive impacts for certain demographics that may require additional transparency in assignment directions, like first-gen students and neurodivergent students. 

There is quite a bit of research that supports the idea of involving students in the assessment creation process to enhance their engagement and learning (Stiggins & Chappuis, 2005; Lubicz-Nawrocka, 2018). One way to achieve this is by developing rubrics together as a class. This work can be done synchronously through a brainstorming session during class or asynchronously through a discussion board or survey. By co-authoring rubrics with your students, you allow them to develop a deeper understanding of their own learning and the nature of assessment. If you’d like to learn more about this strategy, this model for collaborative rubric construction from the Journal of University Teaching & Learning Practice is a good place to start. 

Sometimes showing is more powerful than telling. In addition to providing written descriptions of your expectations within the rubric itself, consider providing a couple examples of what exemplary, satisfactory, or unsatisfactory work looks like. These could be resources that you’ve created, examples sourced online, or anonymized student work from a previous semester that you've received consent to use. Keep in mind that you can share just part of a work sample if you want it to serve as an example for a specific criterion. 

Although you can grade with rubrics in Microsoft Word or write out comments on paper, using the rubrics tool in Canvas has its own unique advantages. When you attach a Canvas rubric to an assignment or graded discussion, the rubric will also show up in SpeedGrader, making it very quick and easy to grade online submissions. To grade with a Canvas rubric, simply click a box for each criterion to indicate the student’s performance level. You can also type comments for each criterion. If you check the box “use rubric for assignment grading” when attaching a rubric to an assignment, Canvas will even auto-calculate the point total as you fill out the rubric. Plus, once you’ve built a rubric in Canvas, you can easily reuse it in any of your other courses. You can learn more about creating and using Canvas rubrics in the Canvas instructor guides or by setting up a consultation with CATL.

Example Rubrics

Not sure where to start? See the examples below for rubrics for various assessments, projects, and disciplines. You might also consider using a GAI tool like Microsoft Copilot to speed up the drafting process.

Questions?

CATL is available for consultations if you have any questions about rubrics or are wondering how to create your own. Send us an email or fill out our consultation form to set up a meeting with a CATL member. Or drop us a comment below to let us know how you’ve used rubrics in your own courses!


Generative Artificial Intelligence: Updates and Articles for Instructors

Welcome to our GAI resource-sharing blog page! Here you’ll find some of the latest updates and articles on generative AI, curated especially for faculty and instructional staff. While there are numerous resources available out there, CATL will share a select, timely sample of articles and perspectives to help instructors stay informed about new changes in AI technology and education.

For more in-depth, instructor-focused articles on generative AI by CATL, explore our AI Toolbox Articles.

Generative AI Tools

UWGB faculty, staff, and students have access to Microsoft Copilot, a university-supported tool that can be used with your UWGB account. Learn more about using and signing into Copilot below!

Stay updated on the different AI tools being created and discover what your peers or others in your field might be using!

Monthly Resources for Educators

(Resources in this section are updated for each month)

September 2024

Tips for Teachers

  • On September 23, 2024, the Division of Student Access & Success at UWGB launched Phlash, a new AI bot designed to assist students by answering questions, providing resources, and offering support through SMS text messaging. We encourage you to familiarize yourself with how this tool can support your students and consider sharing it with them.

Latest Educational Updates

  • Must-Have Competencies and Skills in Our New AI World: A Synthesis for Educational Reform, September 17, 2024. This EDUCAUSE Review article explores educational reform strategies to equip students for an “AI-integrated world.” The article highlights key competencies that institutions could consider emphasizing like intelligent design, human interaction, and data management.
  • The Impact of AI in Advancing Accessibility for Learners with Disabilities, September 10, 2024. This EDUCAUSE Review article examines how AI technology could enhance accessibility, helping create more inclusive and equitable learning environments. The article discusses AI’s potential to remove educational barriers by providing tailored support for students with disabilities. It also offers examples of current and future AI applications aimed at improving accessibility in testing, content delivery, and personalized learning experiences.

Latest AI Tech Advancements

  • Ask Microsoft Accessibility is a new Bing AI-powered tool designed to help users find accessibility information for Microsoft products and services. Try it out the next time you are working in Microsoft 365 if you have questions on how to make your materials digitally accessible.
  • Additionally, Microsoft has released an update to Copilot so that it now provides precise information based on specific timeframes (e.g., yesterday, last month). Note: UWGB currently only has access to Microsoft Copilot through www.copilot.microsoft.com and does not have Copilot features in MS 365 products like Word, PowerPoint, or Excel.

August 2024

Tips for Teachers

  • Make sure your syllabus is ready for the upcoming semester! If you haven’t drafted an AI policy yet, now’s the time. Your students will have different experiences with generative AI—some may avoid it, while others are well-versed. Including a syllabus statement and a brief discussion on AI, even just 5 minutes, will help them understand your expectations.

Latest Educational Updates

July 2024

Tips for Teachers

  • Now that you’re signed into Copilot, give it a try with one of your assignments. You can ask Copilot to compare your assignments to the TILT framework, generate a prompt or activity with examples, or even see how well it completes the assignment and where it might miss the mark. This kind of practice will help build your confidence and comfort with using Copilot and might also offer new insights into your assignments.

Latest AI Tech Advancements

June 2024

Tips for Teachers

  • If you haven’t signed into Copilot with your UWGB account, now is the time! Microsoft Copilot, accessible through any browser and soon integrated into Windows 11, avoids using your personal email, which makes it a better alternative for classes. It doesn’t require providing, for example, a personal cellphone number for use, and it is available to all UWGB faculty, staff, and students with an institutional login and ID. Copilot also offers enhanced data protection when logged in using your UWGB account, although FERPA-protected and personally identifiable information should still not be entered. Watch this short video on how to log in. Remember, use any AI tool responsibly and always vet outputs for accuracy.

Latest Educational Updates

  • Latest AI Announcements Mean Another Big Adjustment for Educators, June 6, 2024. This article from EdSurge recaps some of the latest AI advancements that will heavily impact education and provides advice from instructors and ed tech experts on how to adapt.
  • AI Detectors Don’t Work. Here’s What to Do Instead, 2024. MIT’s Teaching & Learning Technologies Center critiques AI detection software and suggests better alternatives. The article advocates for clear guidelines, open dialogue, creative assignment design, and equitable assessment practices to effectively engage students and maintain academic standards.

May 2024

Tip for Teachers

  • Subscribe to the “One Useful Thing” blog by Ethan Mollick, an Associate Professor at the Wharton School of the University of Pennsylvania and Co-Director of the Generative AI Lab at Wharton.

Latest Educational Updates

Latest AI Tech Advancements

(Resources in this section are updated biannually)

May 2023 – June 2024

  • Artificial Intelligence and the Future of Teaching and Learning, May 2023. This report by the Office of Educational Technology provides insights on how AI can be integrated into education practices, and recommended responses for educators.
  • The AI Index Report: Measuring trends in AI, April 2024. Created by the Institute for Human-Centered AI at Stanford University, this report provides an analysis of AI trends and metrics, including important insights into the current state and future direction of AI for educators grappling with the rapidly evolving technology and what it means for their teaching practices.
  • AI in 2024: Major Developments & Innovations, Jan. 3, 2024. This article provides a timeline of AI developments during 2023 and newest updates in 2024.
  • 2024 AI Business Predictions, 2024. This report by PwC describes how businesses are preparing for and incorporating AI, with predictions on future trends and AI strategies in the corporate world.

Generative Artificial Intelligence (GAI) and Acknowledging or Citing Use

UW-Green Bay’s libraries have an excellent student-facing webpage on how to acknowledge or formally cite the use of GAI. This blog is intended to supplement that resource with information more specific to instructors. Professors will be vital in helping students understand both the ethics and practicalities of transparency when employing GAI tools in our work. Please keep the following caveats in mind as you explore this resource.

  • As with all things GAI, new developments are rapid and commonplace, which means everyone needs to be on the alert for changes.
  • Instructors are the ones who decide their specific course policies on disclosing or citing GAI. The information below provides some options for formatting acknowledgments, but they are not exhaustive.
  • Providing acknowledgment for the use of GAI may seem straightforward, but it is actually a very nuanced topic. Questions about copyright implications, whether AI can be considered an “author,” and the ethics of relationships between large AI entities and publishing houses are beyond the scope of this blog. Know, though, that such issues are being discussed.
  • Please remember that it is not only important for students to acknowledge or cite the use of GAI. Instructors need to do so with their use of it, as well.

Acknowledgment or Citation of GAI

There is a difference between acknowledging the use of GAI with a simple statement at the end of a paper, requiring students to submit a full transcript of their GAI chat in an appendix, and providing a formal citation in APA, MLA, or Chicago styles.

  • UWGB Libraries have some excellent acknowledgment examples on their page.
  • UWM’s library page provides basic templates for citations intended to be consistent with APA, MLA, and Chicago styles.
  • There are also lengthy blog explanations and detailed citation examples available directly from APA, MLA, and the Chicago Manual of Style.

Regardless of the specific format being used, the information likely to be required to acknowledge or cite GAI includes:

  1. The name of the GAI tool (e.g., Copilot, ChatGPT)
Microsoft Copilot, OpenAI’s ChatGPT 4o (May 23, 2024 version), etc.
  2. The specific use of the GAI tool
    “to correct grammar and reduce the length in one paragraph of a 15-page paper”
  3. The precise prompts entered (initial and follow-up)
    “please reduce this paragraph by 50 words and correct grammatical errors”; follow-up prompt: “now cut 50 words from this revised version”
  4. The specific output and how it was used (perhaps even a full transcript)
    “specific suggestions, some of which were followed, of words to cut and run-on sentences to revise”
  5. The date the content was created
    August 13, 2024
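
If you ask students to supply these five pieces of information, a consistent wording can help. The helper below is a hypothetical sketch (the function name and sentence template are assumptions, not an official acknowledgment format from any style guide):

```python
def gai_acknowledgment(tool, use, prompts, output_use, date):
    """Assemble a plain-language acknowledgment statement from the five
    components listed above (tool, use, prompts, output, date)."""
    return (
        f"I acknowledge using {tool} on {date} {use}. "
        f"Prompt(s) entered: {'; '.join(prompts)}. "
        f"Output and how it was used: {output_use}."
    )

statement = gai_acknowledgment(
    tool="Microsoft Copilot",
    use="to correct grammar and reduce the length of one paragraph",
    prompts=["please reduce this paragraph by 50 words and correct grammatical errors"],
    output_use="specific suggestions, some of which were followed",
    date="August 13, 2024",
)
```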

Ultimately, instructors decide what format is best for their course based on their field of study, the nature and extent of GAI use permitted, and the purpose of the assignment. It is important to proactively give students specific expectations for each assignment. Professors who are particularly interested in whether students are using GAI effectively may focus on the prompts used or even ask for the full transcript of a session. If, in a specific assignment, the instructor is more interested in students learning their discipline’s citation style, then they might ask for a formal citation in APA format. Although the decision is up to the professor, they should tell students in advance and strongly encourage them to keep a separate Word document for each of their classes in which they save any GAI chats (including prompts, output, and the date of use). That way, students have records to go back to; this is especially important because Copilot with data protection does not save the content of sessions.

What Messages Might I Give to Students about Using, Disclosing, or Citing GAI?

Instructors should consider how they will apply this information about acknowledgments and citations in their own classes. CATL encourages you to do the following in your work with students.

  1. Decide on a policy for acknowledging/citing GAI use for each course assignment and communicate it in your syllabus and any applicable handouts, Canvas pages, etc.
  2. Reinforce for students that GAI makes mistakes. Students are ultimately responsible for the accuracy of the work they submit and for not using others’ intellectual property without proper acknowledgment. They should be encouraged to check on the actual existence of any sources cited by a GAI tool because they are sometimes “hallucinated,” not genuine.
  3. Talk to students about the peer review and publication processes and what those mean for source credibility compared to the “scraping” process used to train GAI models.
  4. Explain that GAI is not objective. It can contain bias. It has been created by humans and trained on data primarily produced by humans, which means it can reflect their very real biases.
  5. Communicate that transparency in GAI use is critical. Instructors should be clear with their students about when and how they may use GAI to complete specific assignments. At the same time, one of the best ways instructors can share the importance of transparency and attribution is through modeling it themselves (e.g., an instructor disclosing that they used Copilot to create a case study for their course and modeling how to format the disclosure).
  6. Remind students that even if the specific format varies, the information they are most likely to have to produce for a disclosure/acknowledgment or citation is: a) the name of the tool, b) the specific use of the tool, c) the prompts used, d) the output produced, and e) the date of use.
  7. Finally, encourage students to copy and paste all GAI interaction information, including the entire chat history, into a Word document for each course and to save it for future reference. One advantage of Microsoft Copilot with data protection is that it does not retain chat histories. That’s wonderful from a security perspective, but it makes it impossible to re-create that information once a session has ended. Students should also know that even GAI tools that do save interactions and use them to train their models are unlikely to reproduce a session even if the same prompt is entered.

How Will Generative AI Change My Course (GenAI Checklist)?

With the growing prevalence of generative AI applications and the ongoing discussions surrounding their integration in higher education, it can be overwhelming to contemplate their impact on your courses, learning materials, and field. As we navigate these new technologies, it is crucial to reflect on how generative AI can either hinder or enhance your teaching methods. CATL has created a checklist designed to help instructors consider how generative artificial intelligence (GAI) products may affect your courses and learning materials (syllabi, learning outcomes, and assessment).

Each step provides guidance on how to make strategic course adaptations and set course expectations that address these tools. As you go through the checklist, you may find yourself revisiting previous steps as you reconsider your course specifics and understanding of GAI.

Checklist for Assessing the Impact of Generative AI on your Course

View an abridged, printable version of the checklist to work through on your own.

Step One: Experiment with Generative AI

  • Experiment with GAI tools. Test Copilot (available to UWGB faculty, staff, and students) by inputting your own assignment prompts and assessing its performance in completing your assignments.
  • Research the potential benefits, concerns, and use cases regarding generative AI to gain a sense of the potential applications and misuses of this technology.

Step Two: Review Your Learning Outcomes

  • Reflect on your course learning outcomes. A good place to start is by reviewing this resource on AI and Bloom’s Taxonomy which considers AI capabilities for each learning level. Which outcomes lend themselves well to the use of generative AI and which outcomes emphasize your students’ distinctive human skills? Keep this in mind as you move on to steps three and four, as the way students demonstrate achieved learning outcomes may need to be revised.

Step Three: Assess the Extent of GAI Use in Class

  • Assess to what extent your course or discipline will be influenced by AI advancements. Are experts in your discipline already collaborating with GAI tools? Will current or future careers in your field work closely with these technologies? If so, consider what that means about your responsibility to prepare students for using generative AI effectively and ethically.
  • Determine the extent of usage appropriate for your course. Will you allow students to use GAI all the time or not at all? If students can use it, is it appropriate only for certain assignments/activities with guidance and permission from the instructor? If students can use GAI, how and when should they cite their use of these technologies (MLA, APA, Chicago)? Be specific and clear with your students.
  • Revisit your learning outcomes (step two). After assessing the impact of advancements in generative AI on your discipline and determining how the technology will be used (or not used) in your course, return to your learning outcomes and reassess if they align with course changes/additions you may have identified in this step.

Step Four: Review Your Assignments/Assessments

  • Evaluate your assignments to determine how AI can be integrated to support learning outcomes. The previous steps asked you to consider the relevance of AI to your field and its potential impact on students’ future careers. How are professionals in your discipline using AI, and how might you include AI-related skills in your course? What types of skills will students need to develop independently of AI, such as creativity, interpersonal skills, judgment, metacognitive reflection, and contextual reasoning? Can using AI for some parts of an assignment free up students’ time to focus more on the parts that develop these skills?
  • View, again, this resource on AI capabilities versus distinctive human skills as they relate to the levels of Bloom’s Taxonomy.
  • Define AI’s role in your course assignments and activities. Like step three, you’ll want to be clear with your students on how AI may be used for specific course activities. Articulate which parts of an assignment students can use AI assistance for and which parts students need to complete without AI. If AI use doesn’t benefit an assignment, explain to your students why it’s excluded and how the assignment work will develop relevant skills that AI can’t assist with. If you find AI is beneficial, consider how you will support your students’ usage for tasks like editing, organizing information, brainstorming, and formatting. In your assignment instructions, explain how students should cite or otherwise disclose their use of AI.
  • Apply the TILT framework to your assignments to help students understand the value of the work and the criteria for success.

Step Five: Update Your Syllabus

  • Add a syllabus statement outlining the guidelines you’ve determined pertaining to generative AI in your course. You can refer to our syllabus snippets for examples of generative AI-related syllabi statements.
  • Include your revised or new learning outcomes in your syllabus and consider how you will emphasize the importance of those course outcomes for students’ career/skill development.
  • Address and discuss your guidelines and expectations for generative AI usage with students on day one of class and put them in your syllabus. Inviting your students to provide feedback on course AI guidelines can help increase their understanding and buy-in.

Step Six: Seek Support and Resources

  • Engage with your colleagues to exchange experiences and practices for incorporating or navigating generative AI.
  • Stay informed about advancements and applications of generative AI technology.

Checklist for Assessing the Impact of Generative AI on Your Course © 2024 by Center for the Advancement of Teaching and Learning is licensed under Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International

Want More Resources?

Visit the CATL blog, The Cowbell, for more resources related to generative AI in higher education.

Need Help?

CATL is available to offer assistance and support at every step of the checklist presented above. Contact CATL for a consultation or by email at CATL@uwgb.edu if you have questions, have concerns, or feel apprehensive about working through this checklist.


My Resistance (and Maybe Yours): Help Me Explore Generative AI

Article by Tara DaPra, CATL Instructional Development Consultant & Associate Teaching Professor of English & Writing Foundations

I went to the OPID conference in April to learn from colleagues across the Universities of Wisconsin who know much more than I do about Generative AI. I was looking for answers, for insight, and maybe for a sense that it’s all going to be okay.

I picked up a few small ideas. One group of presenters disclosed that AI had revised their PowerPoint slides for concision, something that, let’s be honest, most presentations could benefit from. Bryan Kopp, an English professor at UW-La Crosse, opened his presentation “AI & Social Inequity” by plainly stating that discussions of AI are discussions of power. He went on to describe his senior seminar that explored these social dynamics and offered the reassurance that we can figure this thing out with our students.

I also heard a lot of noise: AI is changing everything! Students are already using it! Other students are scared, so you have to give them permission. But don’t make them use it, which means after learning how to teach it, and teaching them how to use it, you must also create an alternate assessment. And you have to use it, too! But you can’t use it to grade or write LORs or in any way compromise FERPA. Most of all, don’t wait! You’re already sooo behind!

In sum: AI is everywhere. It’s in your car, inside the house, in your pocket. And (I think?) it’s coming for your refrigerator and your grocery shopping.

I left the conference with a familiar ache behind my right shoulder blade. This is the place where stress lives in my body, the place of “you really must” and “have to.” And my body is resisting.

I am not an early adopter. I let the first gen of any new tech tool come and go, waiting for the bugs to be worked out, to see if it will survive the Hype Cycle. This year, my syllabus policy on AI essentially read, “I don’t know how to use this thing, so please just don’t.” Though, in my defense, the fact that I even had a policy on Generative AI might actually make me an early adopter, since a recent national survey of provosts found only 20% were at the helm of institutions with formal, published policy on the use of AI.

So I still don’t have very many answers, but I am remembering to breathe through my resistance, which has helped me develop a few questions: How can I break down this big scary thing into smaller pieces? How might I approach these tools with a sense of play? How can I experiment in the classroom with students? How can I help them understand the limitations of AI and the essential nature of their human brains, their human voices?

To those ends, I’d like to hear from you. Send me your anxieties, your moral outrage, your wildest hopes and dreams. What have you been puzzling over this year? Have you found small ways to use Generative AI in your teaching or writing? Have your ethical questions shifted or deepened? And should I worry that maybe, in about two hundred years, AI is going to destroy us all?

This summer and next year, CATL will publish additional materials and blog posts exploring Generative AI. CATL has already covered some of the “whats,” and will continue to do so, as AI changes rapidly. But, just as we understand that to motivate students, we must also talk about “the why,” we must make space for these questions ourselves. In the meantime, as I explore these questions, I’m leaning into human companionship, as members of my unit (Applied Writing & English) will read Co-Intelligence: Living and Working with AI by Ethan Mollick. We’re off contract this summer, so it’s not required, but, you know, we have to figure this out. So if we must, let’s at least do it over dinner.