Generative Artificial Intelligence: Updates and Articles for Instructors

Welcome to our GAI resource-sharing blog page! Here you’ll find some of the latest updates and articles on generative AI, curated especially for faculty and instructional staff. While there are numerous resources available, CATL will share a select, timely sample of articles and perspectives to help instructors stay informed about changes in AI technology and education.

For more in-depth, instructor-focused articles on generative AI by CATL, explore our AI Toolbox Articles.

Generative AI Tools Directory

Stay updated on the different AI tools being created and discover what your peers and others in your field might be using!

(Resources in this section are updated biannually)

May 2023 – June 2024

  • Artificial Intelligence and the Future of Teaching and Learning, May 2023. This report by the Office of Educational Technology provides insights into how AI can be integrated into educational practice, along with recommended responses for educators.
  • The AI Index Report: Measuring trends in AI, April 2024. Created by the Institute for Human-Centered AI at Stanford University, this report analyzes AI trends and metrics, including insights into the current state and future direction of AI, which is useful context for educators grappling with this rapidly evolving technology and what it means for their teaching practices.
  • AI in 2024: Major Developments & Innovations, Jan. 3, 2024. This article provides a timeline of AI developments during 2023 and the latest updates from early 2024.
  • 2024 AI Business Predictions, 2024. This report by PwC describes how businesses are preparing for and incorporating AI, with predictions on future trends and AI strategies in the corporate world.

Monthly Resources for Educators

(Resources in this section are updated for each month)

June 2024

Tips for Teachers

  • If you haven’t signed into Copilot with your UWGB account, now is the time! Microsoft Copilot, accessible through any browser and soon integrated into Windows 11, lets you sign in with your institutional account rather than a personal email, which makes it a better alternative for classes. It doesn’t require providing personal information such as a cellphone number, and it is available to all UWGB faculty, staff, and students with an institutional login and ID. Copilot also offers enhanced data protection when you are logged in with your UWGB account, although FERPA-protected and personally identifiable information should still not be entered. Watch this short video on how to log in. Remember, use any AI tool responsibly and always vet outputs for accuracy.

Latest Educational Updates

  • Latest AI Announcements Mean Another Big Adjustment for Educators, June 6, 2024. This article from EdSurge recaps some of the latest AI advancements that will heavily impact education and provides advice from instructors and ed tech experts on how to adapt.
  • AI Detectors Don’t Work. Here’s What to Do Instead, 2024. MIT’s Teaching & Learning Technologies Center critiques AI detection software and suggests better alternatives. The article advocates for clear guidelines, open dialogue, creative assignment design, and equitable assessment practices to effectively engage students and maintain academic standards.

May 2024

Tip for Teachers

  • Subscribe to the “One Useful Thing” blog by Ethan Mollick, an Associate Professor at the Wharton School of the University of Pennsylvania and Co-Director of the Generative AI Lab at Wharton.

Latest Educational Updates

Latest AI Tech Advancements

My Resistance (and Maybe Yours): Help Me Explore Generative AI

Article by Tara DaPra, CATL Instructional Development Consultant & Associate Teaching Professor of English & Writing Foundations

I went to the OPID conference in April to learn from colleagues across the Universities of Wisconsin who know much more than I do about Generative AI. I was looking for answers, for insight, and maybe for a sense that it’s all going to be okay.

I picked up a few small ideas. One group of presenters disclosed that AI had revised their PowerPoint slides for concision, something that, let’s be honest, most presentations could benefit from. Bryan Kopp, an English professor at UW-La Crosse, opened his presentation “AI & Social Inequity” by plainly stating that discussions of AI are discussions of power. He went on to describe his senior seminar that explored these social dynamics and offered the reassurance that we can figure this thing out with our students.

I also heard a lot of noise: AI is changing everything! Students are already using it! Other students are scared, so you have to give them permission. But don’t make them use it, which means after learning how to teach it, and teaching them how to use it, you must also create an alternate assessment. And you have to use it, too! But you can’t use it to grade or write LORs or in any way compromise FERPA. Most of all, don’t wait! You’re already sooo behind!

In sum: AI is everywhere. It’s in your car, inside the house, in your pocket. And (I think?) it’s coming for your refrigerator and your grocery shopping.

I left the conference with a familiar ache behind my right shoulder blade. This is the place where stress lives in my body, the place of “you really must” and “have to.” And my body is resisting.

I am not an early adopter. I let the first gen of any new tech tool come and go, waiting for the bugs to be worked out, to see if it will survive the Hype Cycle. This year, my syllabus policy on AI essentially read, “I don’t know how to use this thing, so please just don’t.” Though, in my defense, the fact that I even had a policy on Generative AI might actually make me an early adopter, since a recent national survey of provosts found only 20% were at the helm of institutions with formal, published policy on the use of AI.

So I still don’t have very many answers, but I am remembering to breathe through my resistance, which has helped me develop a few questions: How can I break down this big scary thing into smaller pieces? How might I approach these tools with a sense of play? How can I experiment in the classroom with students? How can I help them understand the limitations of AI and the essential nature of their human brains, their human voices?

To those ends, I’d like to hear from you. Send me your anxieties, your moral outrage, your wildest hopes and dreams. What have you been puzzling over this year? Have you found small ways to use Generative AI in your teaching or writing? Have your ethical questions shifted or deepened? And should I worry that maybe, in about two hundred years, AI is going to destroy us all?

This summer and next year, CATL will publish additional materials and blog posts exploring Generative AI. CATL has already covered some of the “whats,” and will continue to do so, as AI changes rapidly. But, just as we understand that to motivate students, we must also talk about “the why,” we must make space for these questions ourselves. In the meantime, as I explore these questions, I’m leaning into human companionship, as members of my unit (Applied Writing & English) will read Co-Intelligence: Living and Working with AI by Ethan Mollick. We’re off contract this summer, so it’s not required, but, you know, we have to figure this out. So if we must, let’s at least do it over dinner.

What is ChatGPT? Exploring AI Tools and Their Relationship with Education

Generative Artificial Intelligence (AI) and machine-generated content have become prominent in educational discussions. Amidst technical jargon and concerns about the impact on traditional learning, writing, and other facets of education, understanding what these tools are and what they can do can be overwhelming. This toolbox guide provides insights into some commonly used generative AI tools and explains how they are changing the landscape of higher education.

What is Generative AI?

CATL created a short video presentation in Fall 2023 that provides instructors with an introduction to generative AI. The video and the linked PowerPoint slides below can help you understand how generative AI tools work, their capabilities, and their limitations. Please note that some minor details about specific tools in the video have been corrected below in the ‘Common Generative AI Tools’ section.

Introduction to Generative AI – CATL Presentation Slides (PDF)

Common Generative AI Tools

One of the most popular text-based generative AI tools is ChatGPT by OpenAI. Since its November 2022 release, various companies have developed their own generative AI applications based on or in direct competition with OpenAI’s framework. Learn more about a few common, browser-based generative AI tools below.

  • ChatGPT is an AI-powered chatbot created by OpenAI. The "GPT" in "ChatGPT" stands for Generative Pre-trained Transformer.
  • ChatGPT previously required users to sign up for an account and verify it with a phone number, but it can now be used without an account. Users can access the chatbot features of ChatGPT with or without an account (currently ChatGPT 3.5) or access more advanced models and features with a paid account (currently ChatGPT 4.0). For more information or to try it yourself, visit chatgpt.com.
  • Microsoft has created its own AI, called Copilot, using a customized version of OpenAI’s large language model and incorporating many of the features of ChatGPT. Users can interact with the AI through a chatbot, a compose feature, or the Bing search engine. Microsoft is also rolling out Copilot-powered features in many of its Office 365 products, but these features are currently only available for an additional subscription fee.
  • Faculty, staff, and students can access Copilot (which uses both ChatGPT 4.0 and Bing Chat) with their UWGB account. Visit bing.com to try out Copilot or watch our short video on how to log in using a different browser. When you log in with UWGB credentials, a green shield and the word “protected” should appear on the screen. The specifics of what is and is not protected can be complicated, but this Microsoft document is intended to provide guidance. Regardless of potential protections, FERPA- and HIPAA-protected information (student or employee) should not be entered.
  • Google has created its own AI tool called Gemini (formerly Google Bard). Similar to ChatGPT and Copilot, Gemini can generate content based on users’ inputs. Outputs may also include sources fetched from Google.
  • Using Gemini requires a free Google account. If you have a personal Google account, you can try out Gemini at gemini.google.com.

 

The Microsoft Copilot home page as of May 2024

Note: For UWGB faculty, staff, and students, we recommend using Microsoft Copilot and other tools that do not require users to provide personal information in the sign-up process. We are also learning more about potential access to Adobe Express and Firefly (including their image generation features) with UWGB login credentials, at least for employees. Watch this space for additional details as they become available.

What Can Generative AI Tools Do?

The generative AI tools we’ve discussed so far are all trained on large datasets and produce outputs based on patterns in that data. User prompts and feedback can be used to improve their outputs and models, so these tools are constantly evolving. Explore below to learn about some use cases and limitations of text-based generative AI tools.

Generative AI tools can be used in a multitude of ways. Some common use cases for text-based generative AI tools include the following (see the short example after this list):

  • Language generation: Users can ask the AI to write essays, poems, emails, or code snippets on a given topic.  
  • Information retrieval: Users can ask the AI simple questions like “explain the rules of football to me” or “what is the correct way to use a semicolon?”.
  • Language translation: Users can use the AI to translate words or phrases into different languages.  
  • Text summarization: Users can ask the AI to condense long texts, including lecture notes or entire books, into shorter summaries.
  • Idea generation: Users can use the AI to brainstorm and generate ideas for a story, research outline, email, or cover letter. 
  • Editorial assistance: Users can input their own writing and then ask the AI to provide feedback or rewrite it to make it more concise or formal.
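
For those curious about what happens behind the chat window, the same kinds of prompts can also be sent to these models programmatically. The snippet below is a minimal sketch, assuming the openai Python library and an API key stored in the OPENAI_API_KEY environment variable; the model name shown is only an illustrative example, and none of this is needed to use the browser-based tools described above.

    # A minimal sketch of sending a prompt to a text-based generative AI tool.
    # Assumes the `openai` Python library is installed and an API key is set in
    # the OPENAI_API_KEY environment variable; the model name is only an example.
    from openai import OpenAI

    client = OpenAI()  # picks up the API key from the environment

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # example model; substitute whichever model you have access to
        messages=[
            {"role": "system", "content": "You are a helpful writing assistant."},
            {"role": "user", "content": "Summarize the rules of football in three sentences."},
        ],
    )

    # The reply is generated from patterns in the model's training data,
    # so it should always be reviewed for accuracy before use.
    print(response.choices[0].message.content)

Whether a prompt is typed into a chat window or sent through code like this, the underlying model handles it the same way, so the limitations described in the next list apply in both cases.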

These tools are constantly evolving and improving, but in their current state, many have the following limitations:

  • False or hallucinated responses: Most AI-powered text generators produce the response they deem most likely based on complex algorithms and probability, which is not always the correct answer. As a result, AI may produce outputs that are misleading or incorrect. When asked a complex question, it may also generate an output that is grammatically correct but logically nonsensical or contradictory. These incorrect responses are sometimes called AI “hallucinations.”
  • Limited frame of reference: Outputs are generated based on the user’s input and the data the AI has been trained on. When asked about current events or information not widely circulated on the internet, an AI may produce outputs that are not accurate, relevant, or current because its frame of reference is limited to its training data.
  • Citation: Although the idea behind generative AI is to generate unique responses, there have been documented cases in which an AI produced outputs containing unchanged, copyrighted content from its dataset. Even when an AI produces a unique response, many tools are unable to verify the accuracy of their outputs or provide sources supporting their claims.
  • Machine learning bias: AI tools may produce outputs that are discriminatory or harmful due to pre-existing bias in the data they have been trained on.

The potential for tools like ChatGPT seems almost endless: writing complete essays, creating poetry, summarizing books and large texts, creating games, translating languages, analyzing data, and more. ChatGPT and its contemporaries can interpret and analyze language in ways similar to how human beings do. These tools have become more conversational and adaptive with each update, making it increasingly difficult to discern what is generated by an AI from what is produced by a human. Because the machine-learning models they are built on imitate the way humans learn, their accuracy and utility will likely continue to improve over time.

What Does This Mean for Educators?

The existence of this technology raises questions about which tasks will be completed, in whole or in part, by machines in the future and what that means for our learning outcomes, assessments, and even disciplines. Some experts are discussing to what extent it should become part of the educational enterprise to teach students how to write effective AI prompts and use tools like ChatGPT to produce work that balances quality with efficiency. Other instructors are considering integrating lessons on AI ethics or information literacy into their teaching. Meanwhile, organizations like Inside Higher Ed have rushed to conduct research and surveys on current and prospective AI usage in higher ed, offering leaders in higher education a clearer picture of the benefits and challenges of generative AI as they make informed decisions about AI guidance and policy.

Next Steps for UWGB Instructors

The Universities of Wisconsin have issued official guidance on the use of generative AI, but the extent to which courses will engage with this technology is largely left up to the individual instructor. Instructors may wish to mitigate, support, or even elevate students’ use of generative AI depending on their discipline and courses.

Those interested in using these tools in the classroom should familiarize themselves with these considerations for using generative AI, especially regarding a tool’s accuracy, privacy, and security. As with any tool we incorporate into our teaching, we must be thoughtful about how and when to use AI and then provide students with proper scaffolding, framing, and guardrails to encourage responsible and effective usage.

Still, even those who don’t want to incorporate this technology into their courses right now can’t ignore its existence. All instructors, regardless of their philosophy on AI, are highly encouraged to consider how generative AI will impact their assessments, incorporate explicit guidance on AI tool usage in their syllabi, and continue to engage in conversations around these topics with their colleagues, chairs, and deans.

Learn More

Explore even more CATL resources related to AI in education:

If you have questions, concerns, or ideas specific to generative AI tools in education and the classroom, please email us at catl@uwgb.edu or set up a consultation!