A.I. and Academic Integrity-WCC STEM Scholars Newsletter

Martin Gargaro

Multi-Month STEM Topic
Artificial Intelligence: A Double-Edged Sword
By Martin Gargaro
In a day and age where technology is rapidly advancing, it may come as no surprise that we are becoming more fixated on making our machines more intelligent. This takes a variety of forms, such as converting speech to text and vice versa, recommended searches, language translation, and creative tools. Although artificial intelligence can provide many benefits that make our lives easier, it has also created many drawbacks and risks in terms of morality and ethics. Over the next several months, I will be taking us on a tour exploring both the positives and the negatives that these advancements in technology have provided, and will continue to provide, to our society in various areas, ranging from academics to creativity, to employment, to even our democracy. The issue of "just because we can, doesn't mean we should" hangs in the balance in every aspect that AI impacts.

Part 1: ChatGPT and Academic Integrity

If you look at any class syllabus, whether for grade school or college, almost all of them place a strong focus on academic integrity. The work you turn in must be your own and represent your own ideas while, at the same time, giving credit where it's due when featuring the work and ideas of other people. Should you fail to comply with these rules, you risk harsh consequences ranging from failing the class to suspension, and even expulsion. Not long ago, compliance could be taken for granted, but now it's flat-out foolish to assume that students will follow these rules. According to an article published by a student at Columbia University, "[t]here's a remarkable disconnect between how those with influence over education systems…think students use generative AI on written work and how we actually use it."¹ The student claims that it is very naïve to assume that an educator would be able to tell whether a paper was written by the student or by ChatGPT, an AI chatbot designed to answer questions and help with writing tasks.
The fact is, ChatGPT can easily do most of the work for the student, and it is not easy to tell whether a paper was written with or without it. As a result, schools are going to need to change to accommodate this new technology while still ensuring that students are learning and understanding the material.
Considering how new ChatGPT is, there are many concerns about how educators should address the matter, or even where to start. According to Tricia Bertram Gallant, Director of UC San Diego's Academic Integrity Office, many professors have said that they are trying to "prohibit the use…because [they currently] don't have time to redesign [their] assessments."² The biggest concern is how schools will have to adapt and learn to work with the technology. Gallant's bottom line is that the main challenge is how the whole system will need to readjust to a new paradigm. Some of the questions she feels need to be answered are "'What do I do in the interim to both ensure that students are learning?' and [how do I] ensure that I'm assessing that learning and not giving out grades for work that GPT has done?'"² Sarah Eaton, a professor of education at the University of Calgary, has been hearing many different viewpoints, ranging from positive to negative, both on her campus and on others. However, she also finds that many of those people have "[made] a judgment about that utility without having tried it."² She and other faculty members suggest experimenting with a variety of "different apps and tools, [because] ChatGPT is by far not the only tool out there…accessible to our students."² While many people may assume that "generative AI has appeared out of nowhere,"² the reality is that this technology has been in the making for a long time. The main issue for both students and faculty is a lack of understanding: faculty need to know what ChatGPT is and is not capable of, and students need to know what they can and can't do with it, depending on the circumstances.
Many educators would like to prevent students from using this technology, considering how much we still don't know about it, but given the digitized age we are living in, simply banning ChatGPT doesn't seem feasible. Since these tools are already here, and sooner or later we are going to have to use them, it makes sense that the solution should focus on how we use them.
The purpose of higher education is to develop the basic skills that one needs, both in an academic and a professional career. Unsurprisingly, many of these skills are very human in nature and cannot be replaced by machines. That said, there is still a place for learning skills related to artificial intelligence, in particular knowing which tool or application is appropriate and, most importantly, how to keep up with technological advancements. Thomas Lancaster, a computer scientist at Imperial College London, recognizes that as AI progresses, so will the quality of its output, so we need to examine how we "[think] about appropriate assessments for the future, and not [assume that] just because everything has always been assessed in one way that is going to be the right way for the [future]."² Regardless, these advancements won't change the reasons students cheat, one of the biggest being that they don't find any meaning in their work. Unsurprisingly, one of the main things to examine is how students can actually find meaning in their class assignments, such as by obtaining certain skills. The natural instinct is to focus on the skills that are exclusive to humans, such as critical thinking and analysis of what they are learning. Lancaster suggests using oral exams, as opposed to mere research papers, as a means of examining how students apply their knowledge and skills.
One of the obvious concerns about ChatGPT in regard to academic integrity is the higher likelihood of plagiarism. A few years ago, Sarah Eaton published a book called Plagiarism in Higher Education, which ends by considering how plagiarism could be carried out in the future, including with the use of artificial intelligence. ChatGPT has a language generator that allows it to produce text very much like that of a student. As such, it raises a lot of questions: if much of the writing was done by the student, but the rest was written by AI, does it still qualify as original work? If an essay was completely written by AI, but the student did all the coding, does it still qualify as original work? Furthermore, where should the line be drawn between AI assistance and cheating?³ The prospect that hybrid AI-human writing will become more commonplace may require us to redefine plagiarism, as well as the values of academic effort and morality. However, Eaton also believes it could work both ways. In contrast to much of the main discourse, she suggests that AI could "give us prompts, write drafts, and inspire us [to think critically]. There's no evidence that I can see that there's going to be any threat to the human imagination or creativity. Our ability to continue to inspire and imagine and create will remain boundless. And so, I don't actually see AI as any kind of a threat, in that way."² She also finds it problematic to completely deny access to these AI tools, as it would be similar to denying access to the Internet or to spell and grammar checkers. Thomas Lancaster, meanwhile, finds this to be a double-edged sword. Although there are reasons to allow the use of ChatGPT and other AI-related technology, students still have to know how to write without it, particularly for exams.
Furthermore, while it may be easier than ever to translate writing from other languages, students who rely on such tools "[also] miss out on the opportunity to learn [such languages]."² As such, educators have to answer the question of how students can become proficient writers without needing to use ChatGPT.
To address this situation, Tricia Bertram Gallant believes that students need to be taught the basic skills of writing so they understand how it works and can strengthen those skills, much as students are taught basic mathematics despite having calculators. If anything, those calculators are intended to build on the skills students have already learned. Likewise, artificial intelligence should be used to build on and strengthen the writing skills that students were already taught. If writing is used as a way of thinking and communicating, instructors may have to "flip everything that we assume on its head and assume that it might be different, that it might be different in the future."² Furthermore, educators may also have to consider avenues outside of writing, such as "oral assessments, presentations, active learning activities, and group assessments."²
One of our own students from Washtenaw Community College has more than his fair share of experience using ChatGPT. Matt Strang is a STEM Scholar in the math and science program. As somebody who used to work with the neural networks behind AI, he has been following the development of the technology for quite a few years. He uses ChatGPT to help him work out math problems for chemistry, such as stoichiometry, as well as to help write formal emails. He finds it very helpful as a tutoring service, as it can show him how to solve problems step by step, quickly and effectively. While he understands the apprehensions people have regarding plagiarism and the ethics of having it do your work for you, he believes those ethics come down to the individual. The real challenge he sees is that, since it's a new technology, schools are going to have to find a way to work with it and readjust their courses and education plans to accommodate it. He believes boundaries need to be set that keep people from having it do their work for them while still allowing people to learn with it.
It is inevitable that ChatGPT is here to stay, and it poses an ethical dilemma to completely bar students from using it in order to prevent cheating. While we want to promote academic integrity, we also want to make sure that students have the resources they need to be successful. The best solution appears to be for educators to take the time to learn about the software and modify their curriculum to accommodate it. A good start would be to require a class on how to properly use ChatGPT and integrate it into learning, as opposed to having it do the work for you.
Works Cited
1. Terry, O. K. (2023, May 12). I'm a Student. You Have No Idea How Much We're Using ChatGPT. The Chronicle of Higher Education. https://www.chronicle.com/article/im-a-student-you-have-no-idea-how-much-were-using-chatgpt
2. Wilhelm, I., Gallant, T. B., Eaton, S. E., & Lancaster, T. (2023, May 17). Academic Integrity and AI [Webinar]. The Chronicle of Higher Education. https://chronicle.zoom.us/rec/play/r0gEt02sKIIqC7pX4pxiLXBHbPhg4MVfSRiSd1p1RsHsF7qdWIpNvjG6EneYQ8kgsGEqAloOdjS9Ky-0.KxvCZ-XtrSQvNzoM?canPlayFromShare=true&from=share_recording_detail&startTime=1684259957000&componentName=rec-play&originRequestUrl=https%3A%2F%2Fchronicle.zoom.us%2Frec%2Fshare%2FXDUwoNDYO_ckTtYe-CuY6W1XA4A93XRNYSCh0uqB8Z6530Hus2OZ1jqsYIecNTWs.5u2g11Rdke0_Ov5S%3FstartTime%3D1684259957000%26mkt_tok%3DOTMxLUVLQS0yMTgAAAGLyax0xA4jgKB73zTO89J5Zob7QtqYTXzpLSezWmSXotYJamxJv5qcdTrqfhdYNQJjFW80lbGd6I5Qi8p1IaqOkF-UEku1lAePHTU5IH8nLXUqg18
3. Mindzak, M., & Eaton, S. E. (2021, November 4). Artificial Intelligence Is Getting Better at Writing, and Universities Should Worry About Plagiarism. The Conversation. https://theconversation.com/artificial-intelligence-is-getting-better-at-writing-and-universities-should-worry-about-plagiarism-160481

Posted Mar 18, 2025

This was a part of a monthly newsletter for the STEM Scholars program at Washtenaw Community College. The complete article can be found starting on page 5.


Timeline

May 1, 2023 - Jun 1, 2023

Clients

Washtenaw Community College

Tags

Graphic Designer

Writer

Adobe Acrobat

Canva

Microsoft Word

Education
