Considerations for Using Generative AI Tools

Staying current with rapidly evolving generative-AI tools can be challenging, and educators may hold divergent (and strong!) views about them. In a previous article, we introduced generative-AI technologies, their capabilities, and their implications for higher education. While some educators are enthusiastic about incorporating AI into their teaching, others may harbor doubts, apprehensions, or simply lack interest in exploring these tools. Regardless of one’s stance, understanding the disruptive impact of these technologies is crucial as educators grapple with their ethical and pedagogical implications.

In this article, we will explore some considerations for using generative-AI tools in the classroom, including preliminary precautions and ethical concerns. The more we understand these technologies, the better we can adapt to maximize their potential benefits while minimizing their negative impact.

Things to Consider When Using AI-Powered Tools in Your Courses

Understand the inevitability of advancing AI technology

AI, like many other recent technologies (e.g., personal computers or the internet), is here to stay. These tools will continue to advance, with each generation improving on previous models. This is not something we can “outrun.”

Encourage dialogue on the impact of AI in education

Consider discussing AI technology and its implications with your department, colleagues, and students. In what ways will generative-AI tools change the nature of learning outcomes and even careers in your discipline? How are other instructors responding? In what ways can instructors support each other as they each grapple with these questions?

Provide clear communication with your students on expectations

Whichever camp or situation you fall into, it is always important to provide students with clear expectations for their use of AI in the classroom. Be specific in your syllabi and assignment descriptions about where and when you will allow or prohibit the use of these tools. Make sure whatever guidance you provide is also consistent with UWS Chapter 14 and communications from the Provost’s Office. For example statements, view our Syllabus Snippets related to generative AI.

Use generative-AI tools with caution

Exercise caution when using generative-AI tools, because the information they provide is not always accurate. AI creators, like OpenAI, are upfront about the fact that ChatGPT’s answers aren’t always correct. Because these tools can “hallucinate” facts and sources, it’s best to avoid using them as a primary source. Also watch for potential bias in their outputs, since they are trained on human-generated data.

Offer alternatives for privacy-minded students

If you are asking students to complete an assignment using generative-AI technology, you will also want to provide an opt-out or alternative assignment, because students may legitimately not want to provide personal information to sign up for and use certain AI technologies. Many tools openly state they will sell that information.

AI detection tools are not perfect

When using Turnitin’s AI writing detection indicator, it is important to note that there is currently insufficient data to validate its effectiveness. Therefore, results from such reports should be treated as signals that additional review may be necessary. If you suspect academic misconduct, be prepared to support the claim with additional information beyond the detection tool’s report.

Consider ethical and legal issues when using AI tools

As instructors, it is also important to consider the potential ethical, legal, and security risks of AI technologies. Many generative-AI tools are “trained” on the data we put into them, so we must exercise caution when writing prompts. For example, never put students’ personal information into an AI-powered tool, as this may violate FERPA. Asking students to submit their work (or submitting it yourself) for feedback from ChatGPT or a similar tool hands their intellectual property to the tool’s provider, potentially for use as training data. This should not be done without their explicit consent.

Prepare students to use AI effectively

If you assign tasks that require students to use AI technology, it is important to provide clear instructions about how to do so and not assume students already know. Consider incorporating a discussion on the benefits, limitations, cautions, and ethics of using generative-AI. This could be a valuable in-class activity.

Don’t get caught up in the smoke

Although the capabilities of generative-AI can be scary or worrying at this point, it is best not to get bogged down in the negatives or to focus solely on detecting cheating through AI use. Are you worried about what AI tools mean for your course materials? Schedule a consultation with us. CATL is here to help!


What is ChatGPT? Exploring AI Tools and Their Relationship with Education

Artificial Intelligence (AI) and machine-generated content have become prominent in educational discussions. Amidst technical jargon and concerns about the impact on traditional writing and learning, understanding these topics can be overwhelming. This toolbox guide simplifies the generative AI landscape, providing clear definitions and insights into some commonly used generative AI tools.

What is Generative AI?

To provide an introduction to generative AI, CATL has created an informative video presentation. This video, paired with interactive PowerPoint slides, serves as a valuable resource for understanding how generative AI tools work, their capabilities, and their limitations.

Introduction to Generative AI – CATL Presentation Slides (PDF)

Common Generative AI Tools

One of the most popular AI-powered text generators is ChatGPT by OpenAI. Since its November 2022 release, various companies have developed their own generative AI applications based on, or in direct competition with, OpenAI’s framework. Learn more about common generative AI tools below.

Note: For UWGB faculty, staff, and students, we recommend using Microsoft Copilot and other tools that do not require users to provide personal information in the sign-up process.

What Can Generative AI Tools Do?

The generative AI tools we’ve discussed so far are all trained on large datasets and produce outputs based on patterns in that data. User prompts and feedback can be used to improve their outputs and underlying models, so these tools are constantly evolving. Explore below to learn about general use cases for generative AI tools and their limitations.

Generative AI tools can be used in a multitude of ways. Some common use cases for text-based generative AI tools include:

  • Language generation: Users can ask them to create essays, poems, or code snippets on a given topic.  
  • Information retrieval: Users can ask them to answer simple non-academic questions like “explain the rules of football to me” or “what is the correct way to use a semicolon?”  
  • Language translation: Users can ask them to translate words or phrases into different languages.  
  • Text summarization: Users can ask them to condense lecture notes or long texts, including entire books, into shorter summaries.
  • Idea generation & editorial assistance: Users can ask them to brainstorm and generate ideas for a story or a research outline or provide feedback on writing to make it more concise or formal.  

However, these tools also have some limitations, including but not limited to:  

  • Lack of real-world understanding: They do not understand the context or logic of the real world, nor sarcasm, analogies, jokes, or satire. For example, an output may be grammatically correct but semantically nonsensical or contradictory.
  • Dependence on training data: They may produce outputs that are inaccurate, irrelevant, or out of date because they rely on the data they were trained on.
  • False results or hallucinated responses: They may produce outputs that are false, misleading, or plagiarized from other sources, and are unable to verify the accuracy of their outputs.  
  • Machine learning bias: They may produce outputs that are discriminatory or harmful due to bias in the data they are trained on.  

The possibilities of tools like ChatGPT seem almost endless: writing complete essays, creating poetry, summarizing books and other long texts, creating games, and translating languages and data. ChatGPT and its contemporaries can process text and spoken language in ways that resemble human comprehension, and each update makes them more conversational and better at self-correction, which makes it harder to discern what was generated by an AI from what was produced by a human. In addition, the data and algorithms they draw on imitate the way humans learn, so their accuracy can gradually improve the more you interact with them. As explored above, these tools offer great potential, yet they come with their own set of limitations to consider.
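For readers curious about what an exchange with one of these tools actually looks like, the sketch below shows one hypothetical way to send the text-summarization use case described above to a generative AI model programmatically. It assumes the OpenAI Python client library is installed and an API key is configured; the model name and the placeholder lecture notes are examples only, and typing the same prompt into the ChatGPT or Copilot chat window accomplishes the same thing.

```python
# Minimal illustrative sketch (not CATL-endorsed code): prompting a text-based
# generative AI model to summarize lecture notes. Assumes `pip install openai`
# and an API key in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

# Placeholder text standing in for a student's or instructor's lecture notes.
lecture_notes = "Photosynthesis converts light energy into chemical energy..."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name; substitute whatever is available
    messages=[
        {"role": "system", "content": "You are a concise study assistant."},
        {
            "role": "user",
            "content": f"Summarize these lecture notes in three bullet points:\n{lecture_notes}",
        },
    ],
)

print(response.choices[0].message.content)  # the generated summary text
```

The key point is that the entire interaction is a plain-language prompt and a generated reply; no programming knowledge is required to do the same thing through a chat interface.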

What Does This Mean for Educators?

The existence of this technology raises questions about which tasks will be completed, in whole or in part, by machines in the future and what that means for our learning outcomes and assessments. Some experts are also discussing to what extent it should become part of the educational enterprise to teach students how to write effective AI prompts and to use tools like ChatGPT thoughtfully to produce work that balances quality with efficiency.

One way to approach the conversation surrounding AI technology is to consider these applications as tools that educators can choose to work with, or without, in their classes. Some may also consider teaching their students how to use these tools effectively and/or integrating lessons on AI ethics into their teaching. As with any teaching tool we look to incorporate, we must provide proper thought, scaffolding, and framing around what it can do and where it falls short so that students can use it responsibly.


Reading and Resources About AI in Education

To help you as you research and explore AI tools, we have provided a list of resources and additional readings on the topic of Generative AI technology below.

Additionally, CATL developed a GenAI checklist for instructors that will help you assess the extent to which generative AI will affect your courses and provide guidance on steps for moving forward.

Generative Artificial Intelligence In the Classroom

ChatGPT, built on OpenAI’s GPT-4 system, and other generative AI platforms offer unique opportunities for instructors and students to leverage the technology while still providing robust, comprehensive learning experiences. However, some instructors are apprehensive about the potential for misuse in learning activities. Below you will find a variety of resources on how to use generative AI in classroom activities, along with examples of activities that do not require any use of AI.

  • Add a Generative AI Syllabus Statement
  • Incorporating Generative AI
  • Working Around Generative AI
  • Additional Resources on Assessment and Generative AI
  • Learning to Use AI Yourself
  • Playing Around with AI
  • Additional Commentary on AI (Articles, Podcasts, etc.)
  • Other Center Resources

Learn More

Explore even more CATL resources related to AI in education.

If you have questions, concerns, or ideas specific to generative AI tools in education and the classroom, please email us at catl@uwgb.edu or set up a consultation!