Navigating New Realms: Generative AI’s Impact on Students in Higher Education

By Emma Plank, Charlotte Steckart & Rayney Wilson

In 2019, an article published by Brooklyn College predicted that Artificial Intelligence (AI) would soon be a problem in classrooms worldwide. Years later, in 2024, we can look back and say they were correct. Before 2022, teachers did not have to question whether a student’s work was uniquely their own or written by tools like ChatGPT. AI has since become freely available to the general public, able to create something from instructions alone. Whether users want a photo of themselves with Taylor Swift or an essay justifying the use of AI in school, generative AI programs can help. The question on everyone’s mind is: how far is too far, and what ethical lines need to be drawn?

AI’s early beginnings trace back to Alan Turing and his notable Turing test. Turing was an English mathematician and computer scientist who theorized that machines could learn to imitate human responses from experience. He devised the Turing test to help determine whether a machine could successfully do this. In the test, a participant poses questions, and both a real person and a computer respond. The test is repeated to see whether, a majority of the time, the participant can correctly identify which response came from the computer. According to Britannica, no computer came close to passing the Turing test until the advent of ChatGPT, whose success in passing is still debated by experts.

An AI image generated by Charlotte Tenebrini Steckart using Canva’s Magic Media Tool.

Since then, AI has been ever-evolving. To quote Dr. Phil Clampitt, a Communication and Information Technology & Data Science professor at the University of Wisconsin-Green Bay (UWGB), AI is the Turing test “on mega steroids.” Clampitt explained that the important experiment now is to take a subject people know a lot about, ask AI to write an essay on it, and then consider to what extent they agree or disagree with the response. AI will not give opposing views unless it is asked to. Because of this, Clampitt says, “equivocation is purposeful vagueness.”

At UWGB, students were asked to participate in an anonymous survey regarding the use of AI in their studies and potential career paths. The survey included statements and questions with five response options expressing levels of agreement or frequency: a rating of five indicated strong agreement or “always,” while a rating of one indicated strong disagreement or “never.” Initially, it was thought there would be a specific major that used ChatGPT or other AI sources more than others; however, that was not the case.

The above chart represents the average answers for each major at UWGB regarding their usage and perceptions of ChatGPT’s usefulness.
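As a rough illustration of the arithmetic behind charts like this one, the short sketch below averages 1–5 ratings by major. The responses listed are hypothetical examples for demonstration only; they are not the actual survey data or the analysis used for the charts.

from collections import defaultdict

# Hypothetical (major, rating) pairs; ratings run from 1 (never / strongly disagree)
# to 5 (always / strongly agree), matching the survey's scale.
responses = [
    ("economics", 4), ("economics", 5),
    ("social work", 1), ("social work", 2),
    ("mathematics", 4), ("mathematics", 3),
]

# Group ratings by major, then print each major's average.
ratings_by_major = defaultdict(list)
for major, rating in responses:
    ratings_by_major[major].append(rating)

for major, ratings in ratings_by_major.items():
    print(f"{major}: {sum(ratings) / len(ratings):.1f}")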

The majors that thought they were most likely to use ChatGPT or AI for help were chemistry, economics, HR management, information science, and mathematics. The majors that were least likely to use ChatGPT or AI for help were social work, human biology, art studies, and biology. This aligned with the usage results: the majors that used AI the most for homework were economics and mathematics.

The above chart represents the average answers for each major at UWGB regarding support for AI use, frequency of use, and whether it should be allowed for schoolwork.

Again, economics ranked among the highest majors, not only supporting the use of ChatGPT and other AI sources but also using them the most and believing they should be allowed for schoolwork. Ranking low again were social work majors. Marketing majors support AI use and think it should be allowed in schools, but when looking only at frequency of use, they rank lowest alongside biology and social work majors.

The above chart represents the average answers for each major at UWGB regarding whether using AI counts as plagiarism and how guilty students feel when using it.

When examining whether views on plagiarism corresponded with guilt, political science and social work majors felt strongly that using AI for homework or quiz answers constituted plagiarism and felt mildly guilty when they used it. Social work majors felt guiltier than political science majors. The rest of the majors did not feel as guilty and did not strongly consider it plagiarism.

The above chart represents the average answers for each major at UWGB comparing whether AI should be allowed in schools with whether its use is considered plagiarism.

It is interesting to compare survey results about whether AI should be allowed in schools with results about whether using it is considered plagiarism. The comparison suggests conflicting ideas: many majors felt that using AI was plagiarism, yet almost all felt it should be allowed in schools. HR and information science students felt it was not plagiarism and should be allowed in schools; social work majors felt the opposite.

An AI image generated by Charlotte Tenebrini Steckart using Canva’s Magic Media Tool.

Universities are leading the discussion today. Many are reviewing their policies on AI, and those policies are in constant flux as the technology keeps growing. When looking into UWGB’s academic integrity policy, The Fourth Estate found only one line about AI, under the examples section: “Taking credit for the work or efforts of another without authorization or citation (this includes using, without Instructor authorization, generative artificial intelligence software or websites).”

This policy leaves a lot of room for interpretation, and this is where students are getting into trouble. Each professor has their own opinion and policy listed in the class syllabus, but should there be one policy that fits all? Based on our survey results, use varies from major to major.

In higher education, the integration of generative AI has opened a new era of innovation and opportunity, reshaping the educational landscape. From personalized learning experiences to enhanced research capabilities, its impact spans disciplines and institutions. Amid the excitement and potential, however, it is crucial to acknowledge the complexities and challenges that accompany this technology, including ensuring equitable access and mitigating potential biases.
