With assistive AI tools making their way into every facet of our daily lives, it’s no wonder that students and faculty are becoming increasingly aware of how they can affect education.
The Daily Egyptian did an informal survey at SIU to ask students, faculty and staff about their take on the use of AI on assignments.
Of 96 student respondents, 51 percent said they do use AI for assignments and 49 percent said they do not.
For the students who do use the tool, 74 percent said their instructors do not know about their use of it, while 26 percent said their instructors do know.
More than half of students who responded to the anonymous survey, 58 percent, said they use ChatGPT for research.
One person who has found the tool helpful for research purposes is a graduate student studying clinical mental health counseling.
“We have to take a comprehensive exam in order to graduate. I was given a study guide, and with all the things that I didn’t know, I would just type into ChatGPT ‘give me more information about this topic,’ and then it would give me all the information I needed. It gave me one spot to go to instead of if I Google it,” the student said.
Using ChatGPT this way, to consolidate research and cut down on time spent clicking through Google links, may be one of the more responsible approaches to the tool, but users need to keep in mind that AI tools are not always accurate.
Anas AlSobeh, an assistant professor of information technology at SIU, said, “Most of the students don’t have the patience when looking for information. For example, I would like to find information about Mark Lothar. I don’t want to go to Google right now and get ten or hundreds of links and visit these links. It takes too much time. Now, I can suggest AI tools to help summarize where students can find this information. You like a summary, okay. Now that Gemini, ChatGPT, and Copilot give you the citation where this is coming from directly, with one click you will be there.”
The Daily Egyptian’s survey was conducted after the paper was presented with data from an Intelligent.com survey of 588 college students, which found that 37 percent were currently using ChatGPT. Asked how they used the tool, 96 percent said they used it for schoolwork, while 80 percent used it for other tasks, including communication and job searching. The full results of that study can be found at intelligent.com.
One student in the SIU survey cautioned against relying on AI fully, noting that ChatGPT’s recommendations may conflict with what a user is actually looking for.
“For example, my son has some behavioral issues, and so one of the things I do with ChatGPT is try to look at different outside resources for how to handle it because I don’t like behavioral therapy, it focuses more on the behavior rather than what’s behind the behavior.”
This respondent said ChatGPT strongly supports behavioral therapy.
“I’ve got this other guy that I’ve been studying, and he’s really reliable too but he doesn’t use behavioral therapy and I can search him in ChatGPT, but then ChatGPT went back to saying ‘if you want something reliable, you need to go to behavioral therapy’ and to me, that’s not reliable. So that’s where it’s kind of limited because it doesn’t think outside the box. It’s just like, no, behavioral therapy, that’s where the answer is. And I disagree.”
There is also the matter of privacy.
How much information is too much to give to this AI tool while performing searches?
“It will write an entire case conception paper. You just give it a little information. You just say, this is the client, age, what they’re dealing with, what their background is, and it’ll write an entire case conception paper for you,” a student said, “but I wouldn’t put someone’s specific information in it, like where you would break confidentiality.”
Privacy is a big issue with AI because it’s a black box, and the level of personal information within it is a mystery to its users.
AlSobeh said, “So far, this is the big issue in the large models, the ethical issues. It’s not recommended to use AI tools to expose personal information because some AI tools are self-learning. Maybe they take this data and keep it in their records. Especially healthcare. Maybe someone can take this data and use it somewhere else.”
The SIU survey found 60.3 percent of students who use ChatGPT use it for writing, making writing the tool’s most common reported use.
This has sparked concerns among students and faculty.
“I think it’s a very disingenuous tool to be using in an academic setting. To me, it feels like another form of plagiarism, almost. You’re doing work that isn’t really your own,” SIU student Liam Groves said.
This idea seems to be a common issue among students and educators.
“People think it’s not plagiarism because it’s not borrowing from someone else’s work or someone else’s ideas, but that’s exactly what ChatGPT does. It borrows from everything it sees. From everything it reads,” Vicki Kreher, a senior lecturer in the School of Journalism and Advertising, said. “So it’s picking from here and there and then if you need a bibliography in a class and you ask it to make the bibliography, it’s just going to make shit up, right? I mean, you can ask ChatGPT ‘is using ChatGPT plagiarism?’ and you can get it to argue for it and against it.”
AlSobeh said a major ethical issue with AI is the lack of regulation addressing what constitutes plagiarism.
But there are students who find other, more responsible ways of going about using the tool for writing assistance.
“Sometimes I have something I want to write, but I don’t have the smoothness. So, I will have it rewrite it for me. It comes across more, like, using the transitions and finesse that I want,” one student said.
“I’ll mainly use it for fluff,” another said. “AI in general is not really great at creative work, but it’s great for filling.”
AlSobeh said, “Large language models are mathematical, and when you ask it a question, your question will be converted to the mathematical model. They try to translate this mathematical model to the English text. So basically, a mathematical model in computer science is logical. It still can write a whole essay for the student and it’s going to sound logical, but it’s just not correct.”
Ultimately, ChatGPT writing has its flaws, and users need to be vigilant with its responses.
In other areas, 26 percent of students said they use it for quizzes and/or tests.
“I would use ChatGPT for things like quizzes mainly for core curriculum classes that I have no interest in,” a student said.
Inside the classroom, using ChatGPT for quizzes and tests is likely less common, but for remote students the tool is easier to access.
Kreher said there are ways to avoid this problem, like proctoring exams for remote students.
Other students have also been required to purchase external webcams for their remote quizzes and tests and to use lockdown browsers so they cannot open other windows during their exams.
If AI use is suspected, Kreher said, “I’ll write a few prompts that they probably would have stuck in ChatGPT. I’ll run those and if they come out sounding exactly like – which they will sound darn close – to what the students did, I’ll give them two weeks to resubmit the assignment. No AI this time.”
Educators are responsible for addressing the use of AI like ChatGPT in their coursework, but faculty like Kreher hope the university will offer AI training for professors and teaching assistants and create policies on how to incorporate ChatGPT in the classroom.
Of the 105 respondents, 90 percent were students, nearly 8 percent were faculty, and the rest were staff.
Out of the students surveyed, 49 percent were seniors, 24 percent were juniors, 17 percent were sophomores, and 8 percent were freshmen. Respondents were encouraged to answer each of the questions but were not required to, explaining any gaps in the data.
Whether students agreed or disagreed with the use of AI like ChatGPT in education, for the most part, they weren’t hearing much about the topic from their instructors.
“Our program director was like, ‘I don’t know what to say about that,’ it was so new and like, they didn’t know how to handle it either,” one student said.
Another said, “I’m taking a web development class this semester, and there’s a notice that says ‘do not use AI’ and personally I’ve never used AI in the class because problem solving is one of the most fun parts of computer science. That teacher acknowledges that it’s real, but in my more advanced classes my instructors realize that ChatGPT can’t really be used, so they’re kind of like ‘whatever.’”
Groves said, “I know this has been a big discussion point for years, but for the past couple of years with just how big it’s gotten and how quickly it’s developing, I think we’re kind of at the stages where we’re still trying to see where it’s going to go ultimately. I honestly haven’t really heard a lot of conversation about it.”
In Kreher’s opinion, “Educators are hopeful that their students are building their own critical thinking skills and are becoming people that can go out into the working force and not have to rely on their phone to answer a question because that shouldn’t be, you shouldn’t have to pick up your phone for it to answer a question for you when you’re at work.”
But at the same time, she has observed that AI experience is becoming a requirement for those entering the advertising industry, and believes faculty need to address this in students’ education.
So the tension between integrating AI tools and discouraging reliance on them is easy to understand.
“We certainly can’t have our heads in the sand, which seems to be happening in some places. ChatGPT is a tool we need to learn how to manage,” Kreher said.
“I have relatives in the Bay Area, and friends and relatives who work at Meta, Google, and Microsoft, and I’m hearing the same thing from all of them, that students need to understand how to use AI because AI writers is a job now,” she added.
Kreher said other universities have policies in place.
“The university [SIU] needs to have training on this for professors and for teaching assistants. Like, ‘here’s what it is, here’s what to do with it’, there are ways,” she said.
AlSobeh thinks students and educators can benefit from an integrative approach. Some options include awareness workshops, adding IT or general computer science to the core curriculum, and collaboration between colleges.
“I would recommend students involving AI in all studies, a plan in all departments, not just here in the engineering or information technology or computer science departments, but in human science, in animal science, and others to know how we can employ these technologies to improve students’ skills,” he said, “and a collaboration between areas like computer science, engineering, and IT with the other colleges to maybe customize or adapt or create a material that addresses this.”
Students are hopeful that AI can be integrated into education in ways that benefit them when it’s time to enter the workforce.
“My girlfriend is in a creative writing class right now, and one of her assignments was to use AI to create a story. So I think that the tide is changing,” one student said.
“I mean, things are evolving. So, instead of fighting it and saying no, let’s not use it, it’s better to figure out how best to use it and let people learn about it,” another student added. “These are the pitfalls. These are the negative sides of it. So you kind of educate students about it. So they know what they’re getting into. Instead of just trying to be like, no, we’re not going to do it at all.”
In terms of pitfalls, deepfakes are an example both Kreher and AlSobeh agree students must be aware of.
Deepfakes are digitally manipulated media that can replace or alter a person’s face, body or even surroundings, and they are typically used to spread false information.
Kreher teaches her students about the topic within one of her courses, and AlSobeh’s department is working on a survey that will be distributed to more than 500 students at SIU to test their ability to spot deepfakes.
At the end of the day, these technological advancements are part of our lives.
“I’d say a lot of the fears that people had of those new technologies was that they would just completely remove the human part of the equation from the workforce as a whole, and it didn’t happen,” Groves said, “So I’m starting to wonder if maybe we will get to a point where we see AI being used in ways to help guide people instead of just relying on it to do everything for us.”
AlSobeh and Kreher agree. “AI tools are things you would have to learn to be equipped for the employment or labor market. If you go to an interview, I think now most of the interviews will be having questions about the AI tools and how we can use them,” AlSobeh said. “Many people are scared that AI tools will replace or take their jobs, right? Small jobs, but not that many jobs. But AI can open new jobs too.”
So what can SIU students do to advance alongside this technological disruptor?
“Talk with your colleges and your educators that aren’t really proactive about using it, or don’t really have set rules,” Kreher suggested. “Discuss with them your concerns and try and make it more talked about.”
With a proper grasp of AI tools, students can be sure they are prepared after graduation and can use these tools to their benefit.
Richmond B. Adams, Ph.D., English, 2011 • Mar 23, 2024 at 11:16 am
As an alumnus (Ph.D., Class of 2011), I am beyond appalled. It’s not even so much that most or many students rationalize taking the easiest route possible to complete an assignment, but once again, university administrators and compliant faculty do not insist on individual research and work to do so. To extend one student’s thoughts herein, AI is plagiarism, and will only accelerate what began 50 years ago throughout higher education: the collapse of intellectual life and the decline of individualized creativity in the name of what is now called “recruitment and retention.”
Tony Williams • Mar 26, 2024 at 5:01 pm
I wholeheartedly support the above comment. At this moment I’ve just finished reading a course module from Stockholm University where it clearly states that using AI and ChatGPT are grounds for dismissal for plagiarism. The fact that SIUC is fully embracing AI while other countries (including the USA) are investigating its implications shows how far this University has fallen. This will damage the reputation of SIUC, making its degrees questionable and probably encouraging future students to look elsewhere for a valid education and qualifications.