[Image: A colorful, geometric, somewhat abstract illustration of buildings and streets covered with arrows, numbers, and the text "AI"]

Generative AI and Assessments Workshop (June 28, July 18, Aug. 8, & Aug. 30, 2023)

Please join CATL for a virtual summer workshop focused on creating assessments in the age of generative AI (e.g., ChatGPT)! CATL facilitators will work with instructors to review their learning objectives, discuss the implications of emerging AI products, and brainstorm creative, high-quality, aligned, and feasible strategies for adapting course materials and assessments.

To participate in this virtual workshop, CATL asks that instructors bring a course syllabus with learning outcomes, ideas for at least two assessments for that course, and a willingness to engage in a reflective process that includes thinking about how generative AI technologies might impact those course materials. This workshop, “Generative AI and Assessments,” will be offered multiple times throughout the summer months, with more offerings to come in the fall. While registration is not required to attend, we encourage you to register today to receive a calendar reminder for the timeslot that works best for you!

Workshop Dates and Times:

All sessions are fully virtual and will meet via Microsoft Teams. Each workshop will cover the same content, so please sign up for only one timeslot.

If you need accommodation for this virtual event, please contact CATL at CATL@uwgb.edu.

Register

Strategies for Creating “Generative AI-Resistant” Assessments

The use of generative AI tools in education has recently garnered significant attention, prompting educators to reconsider their roles in higher education and how students engage with these tools. In a previous blog post, we introduced AI technologies, their wide-ranging capabilities, and their potential implications for higher education. We also offered advice on considerations, precautions, and ethical concerns for using generative AI in the classroom.

While some educators are excited about integrating AI collaboration into their teaching practices, others are apprehensive about its potential misuse by students. To address these concerns, this blog post presents assignment strategies that can be more “generative AI-resistant.” There are no “AI-proof” assessments, but these suggestions should serve as starting points for creating authentic assignments and/or ones that require students to demonstrate original and critical thinking.

Assessment Strategies

Be specific and personal

Consider creating assessments that involve tasks ChatGPT struggles with, such as drawing on personal anecdotes or student reflections, referencing current events and recent developments in the field, conducting interviews, or making references to specific course materials. Asking students to connect specific details from course materials (readings, lectures, experiments, etc.) to their personal lives or career paths can also help students see the relevance of their course activities and engage with them on a deeper level.

Go beyond research papers

Instead of asking students to write a research paper that ChatGPT could produce for them, ask them to create an annotated reference list that demonstrates their ability to apply proper research methods and analyze the resources they collected. Alternatively, consider asking students to write a paper analyzing a case study that you have created yourself.

Mix up the medium

Incorporate assignments that make use of multimedia content, such as writing, recording, and producing a podcast episode relevant to course content and ideas.

Flip your classroom

Consider using in-class time for activities like classroom debates and/or peer-to-peer feedback on projects, and grade the work that happens in real time and under your observation (without generative AI). A flipped classroom approach also lets students practice higher-order thinking in class by applying the content they have learned through homework.

Look at grading

Consider reviewing your course grading criteria and using growth-oriented rubrics that prioritize process over product.

Ask students to show their process

Consider adding assignment elements that ask students to think about the process of their work. This could be done by requiring students to submit notes they took on sources to prepare for their papers or presentations. You could also ask students to show their work in progress as they move toward a final draft (e.g., require submission of a project outline or proposal, annotated bibliography, and multiple drafts).

Allow for growth and resubmission

Consider adding in some flexibility when students fall short of an objective by allowing for revisions or resubmissions on certain assignments. This can reduce the “high stakes” nature of assessments and associated pressures.

Make adjustments to current assessments

Review your existing assignments to see if there are areas where you can have students demonstrate their holistic growth and development. For exams, you could add supplemental reflection questions or even consider adopting oral exams.

A Note on Academic Honesty

It can be tempting to spend a lot of effort trying to catch instances of academic dishonesty involving tools like ChatGPT. Although detection tools such as Turnitin are available, the reliability of AI detection reports remains uncertain due to insufficient data. Please review our blog post on Considerations for Using Generative AI Tools to learn more, and remember how important it is to communicate explicitly with students about if, when, and how they may use AI in your class.

Learn More

Explore even more CATL resources related to AI in education:

If you have questions, concerns, or ideas specific to generative AI tools in education and the classroom, please email us at catl@uwgb.edu or set up a consultation!

Oral Exams as Alternative (and Authentic) Assessments

Article by Amy J. & James E. Kabrhel, Ph.D., Associate Professors of Chemistry

In the summer of 2020, during the height of the COVID-19 pandemic, we learned that we would be allowed to return to face-to-face instruction in Fall 2020 as long as we used methods that allowed for social distancing and flexibility in student attendance. I knew I wanted to return to face-to-face teaching as much as possible, but I also understood that many students might contract COVID and be unable to attend in-person class sessions, thereby potentially missing exams or other assessments. I considered several assessment options to replace the exams I traditionally gave in my CHEM 211 & CHEM 212 (Principles of Chemistry I & II) courses, including online exams and take-home exams. Each has its own pros and cons; for instance, students can easily cheat on online exams thanks to websites like Chegg, and the same goes for take-home exams. After weighing those pros and cons, I decided to try oral exams.

I scheduled the oral exams for the week following the conclusion of each unit. This led to four oral exams per student throughout the semester and one oral final exam per student during finals week. The in-semester exams were 30 minutes each, and the final exam was 60 minutes. I googled how to create an Appointment Group in Canvas and created several 30-minute timeslots for students to choose from at times that worked for me but hopefully also worked for my students. I tried to offer a variety of days and times throughout each exam week, and I made the Appointment Group available one week before each exam week. I also provided students with instructions on how to select a timeslot in Canvas. The students then went into the Canvas Calendar and chose the timeslot that worked best for them on a first-come, first-served basis. Canvas emails instructors each time an appointment is selected, so I would then create a Zoom link for that oral exam session and send it to the student. I used a unique Zoom link for each student to ensure privacy during each exam.
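If it helps to sketch out slot options before building the Appointment Group in Canvas, here is a minimal example in plain Python (no Canvas or Zoom API calls; the example week and daily windows are hypothetical) of generating the kind of 30-minute timeslot list described above:

from datetime import date, datetime, timedelta

def generate_slots(exam_days, windows, slot_minutes=30):
    """Build (start, end) oral-exam slots for each day and time window."""
    slots = []
    for day in exam_days:
        for start_hour, end_hour in windows:
            current = datetime.combine(day, datetime.min.time()).replace(hour=start_hour)
            window_end = datetime.combine(day, datetime.min.time()).replace(hour=end_hour)
            while current + timedelta(minutes=slot_minutes) <= window_end:
                slots.append((current, current + timedelta(minutes=slot_minutes)))
                current += timedelta(minutes=slot_minutes)
    return slots

# Hypothetical exam week (Tuesday through Friday) with a morning and an
# afternoon window each day (9:00-12:00 and 1:00-4:00).
exam_days = [date(2020, 10, 6), date(2020, 10, 7), date(2020, 10, 8), date(2020, 10, 9)]
windows = [(9, 12), (13, 16)]

for start, end in generate_slots(exam_days, windows):
    print(start.strftime("%a %b %d, %I:%M %p"), "-", end.strftime("%I:%M %p"))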

The student and I logged into Zoom at the scheduled exam time. I had several questions prepared to ask them that I also shared in Zoom for added accessibility. In essence, we had a conversation about the chemistry they had learned for the past month. I was able to give them immediate feedback on their answers and explanations, allowing them to correct their thinking on the spot. I had a grading rubric with me to keep track of how many times I needed to help them answer a question. Each bit of help was a loss of a point (see example assessment table). If a student was stuck on a question, we could move on to the next and return to any left at the end. In most cases, students finished in less than 30 min. In some cases, however, students needed more time, and I emailed them the questions to complete as a take-home exam.

Example assessment table:
Overall Understanding    20 pts
Ch. 13                   30 pts
Ch. 14                   30 pts
Prompting                20 pts
TOTAL                   100 pts
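To illustrate how the point-per-prompt scoring described above might be tallied, here is a minimal sketch (the category maximums follow the example assessment table; the student's prompt count and content scores are hypothetical):

# Tally an oral exam score where each prompt from the instructor costs one point.
MAX_POINTS = {
    "Overall Understanding": 20,
    "Ch. 13": 30,
    "Ch. 14": 30,
    "Prompting": 20,
}

def score_exam(prompts_needed, content_scores):
    """Return per-category scores and the 100-point total."""
    scores = dict(content_scores)
    scores["Prompting"] = max(MAX_POINTS["Prompting"] - prompts_needed, 0)
    scores = {cat: min(pts, MAX_POINTS[cat]) for cat, pts in scores.items()}  # clamp to maxima
    return scores, sum(scores.values())

# Hypothetical student: needed 3 prompts and did well on the content questions.
scores, total = score_exam(
    prompts_needed=3,
    content_scores={"Overall Understanding": 18, "Ch. 13": 27, "Ch. 14": 29},
)
print(scores)  # {'Overall Understanding': 18, 'Ch. 13': 27, 'Ch. 14': 29, 'Prompting': 17}
print(total)   # 91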

I did wonder if students would share exam questions with classmates taking the exam later in the week, so I made slight changes to each question for each day of the exam week. For example: 

Which ______ diffuses through air most ______ and why?
Tuesday:    Halogen      quickly
Wednesday:  Halogen      slowly
Thursday:   Noble Gas    quickly
Friday:     Noble Gas    slowly
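If you would rather generate this kind of rotation automatically, a minimal sketch (the template and substitutions mirror the example above; assigning one combination per exam day is the only added assumption) could look like this:

from itertools import product

# Template and substitutions mirror the example question above.
TEMPLATE = "Which {group} diffuses through air most {rate} and why?"
GROUPS = ["halogen", "noble gas"]
RATES = ["quickly", "slowly"]
DAYS = ["Tuesday", "Wednesday", "Thursday", "Friday"]

# Pair each exam day with one (group, rate) combination: four variants, four days.
for day, (group, rate) in zip(DAYS, product(GROUPS, RATES)):
    print(f"{day}: {TEMPLATE.format(group=group, rate=rate)}")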

The students and I were understandably nervous each time we met on Zoom, but after a few minutes we would ease into the setting. Students became much more adept at explaining their chemical knowledge to me, and by the final exam they seemed much less nervous and much more comfortable. Student evaluations confirmed this observation, as students said they felt their oral communication skills improved over the semester. Overall, I think the students may not have loved the oral exams, but they appreciated the flexibility and immediate feedback. Due to this success, my husband and fellow instructor, James Kabrhel, decided to incorporate some oral assessments into his CHEM 302 & 303 (Organic Chemistry I & II) courses; his impressions are given in the following paragraph.

Organic Chemistry originating from Manitowoc/Sheboygan has been taught through the point-to-point (P2P) modality since the 2000s, but with the addition of the Marinette and Green Bay campuses to the class, providing exams and finding proctors is a much more complicated problem. One solution to the problem was to shift all exams to the take-home format, but as previously mentioned, take-home exams have inherent risks. To balance those risks, an oral assignment and an oral final exam have been added to the course to provide multiple assessment modes for the students. Students must complete an oral assignment in the middle of the semester as a practice with the format, so they are then somewhat comfortable when the final exam comes around. The oral assignment also acts as a mid-semester check-in with the students to see how they are coping with organic chemistry and their classes overall. The addition of the oral exams has been successful enough for me to consider adding an oral part to every exam, not just the final. – J. Kabrhel, personal interview, 2023

You may be thinking that giving oral exams is way too much of a time commitment. I thought it would be as well, but it was not as bad as I expected. I found that writing an oral exam took me the same amount of time as writing a traditional exam. The difference came in the time to administer the exam versus the time to grade it (see below). A traditional exam takes one class session to administer but takes much longer to grade, depending on the number of students in the course and the difficulty of grading each question. Administering the oral exams, however, took 30 minutes per student (9 hours for me, as I had 18 students in Fall 2020) but took almost no time at all to grade because I was grading them while administering them, so the only extra time needed was to type those scores into Canvas. In the end, I found oral exams to be slightly less time-intensive than traditional exams.
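To make that comparison concrete, here is a small back-of-the-envelope sketch (the 18 students and 30-minute slots come from the account above; the traditional-exam figures are placeholder assumptions):

# Rough time comparison for one exam cycle (assumed numbers marked as such).
students = 18
oral_slot_min = 30

oral_admin_hours = students * oral_slot_min / 60   # 9.0 hours across the exam week
oral_grading_hours = 0.5                           # assumed: entering scores into Canvas

traditional_admin_hours = 1.25                     # assumed: one 75-minute class session
traditional_grading_hours = students * 30 / 60     # assumed: ~30 minutes of grading per student

print(f"Oral exam total:        {oral_admin_hours + oral_grading_hours:.1f} hours")
print(f"Traditional exam total: {traditional_admin_hours + traditional_grading_hours:.1f} hours")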

We (Amy & James) have found the following pros and cons of using oral exams in our courses:

Pros                                          Cons
Assessment as a conversation.                 Big time commitment during exam weeks.
Opportunity for 2nd chances on questions.     Not easy to assess complex problem-solving questions.
Misunderstandings corrected immediately.      Technology issues.
Immediate feedback; faster grading.
Greater connection with students.
Improvement in oral communication skills.

As mentioned in the table above, one con is the huge time commitment during exam weeks; I had to block off nearly my entire week for oral exams. It was also not possible to ask complex problem-solving questions, as it simply took students too long to answer them during the exams. Finally, there were occasional technical issues. Many of my students live in rural areas where internet connections are not strong, so we would sometimes lose our connection, which took time away from the exams. (This is often when I would have to resort to giving the exam as a take-home, which was not my preference but was a suitable backup option.)

Overall, we feel the pros outweigh the cons, and oral exams are an excellent assessment option if they work for your pedagogical style. They do work best in smaller classes; we feel the maximum number of students for this method to remain manageable is 24. Beyond that, you would need a co-instructor or teaching assistant to help you complete all the exams within one week. In case you are curious, I (Amy) did not continue using oral exams after we returned to fully face-to-face courses, but that is mostly due to my introverted personality 🙂. James, however, is planning on adding more oral assessments to his courses because of their equitable nature and the way they allow him to better connect with his students, especially those at campuses he cannot visit regularly. Oral exams are a valid and valuable assessment method, and if you have any questions about using oral exams in your courses, we would be more than happy to chat with you about them.