AI Raises Bigger Concerns for Students Than Teachers, Admins: Study

By John P. Mello Jr.

A multinational study on the state of artificial intelligence in secondary and higher education, released Tuesday, found that students are more concerned about the technology's impact on their learning than academic administrators and educators are.
The study, based on a survey of 3,500 academic administrators, students, and educators in seven nations, found that more than three out of five students (64%) worry about the use of AI in education, compared to 50% of educators and 41% of academic administrators.
Top AI risks for educators and students were overreliance on the technology and potential loss of critical thinking skills, while prime risks for administrators were data privacy and security breaches, according to the survey conducted by Vanson Bourne for Turnitin, an academic integrity and assessment solutions company, in Oakland, Calif.
“I’m not surprised that students were concerned, but the depth of their concern was surprising to me,” said Turnitin’s Senior Director for Customer Engagement, Patti West-Smith.
“I was surprised at the way the numbers shook out, that they expressed more concern about AI than admins or instructors,” she told TechNewsWorld.
Erosion of Critical Thinking
Karen Kovacs North, a clinical professor of communication at the Annenberg School for Communication and Journalism at the University of Southern California, said that if students are concerned about AI, it isn’t stopping them from using it to complete assigned work.
Nevertheless, she told TechNewsWorld, “It’s heartening to know that students are now embracing the idea that critical thinking is what’s lost when they hand off problem-solving to AI.”
“If students have an increasing understanding and appreciation of their own critical thinking and their own ability to problem solve, then we’re at least moving toward a better world where people will do their own problem solving and critical thinking.”
Nearly half the students in the survey (49%) said they worry about becoming overreliant on AI, while more than half (59%) fret about overreliance on AI reducing their critical thinking skills.
“Reading and writing are two fundamental ways that people make meaning of new information, so if you get into a situation where a student is over-relying on AI, they’re outsourcing a lot of meaning-making that they would normally be doing,” West-Smith explained.
“The danger there is that if you outsource the reading and writing, you might also outsource the thinking that goes into reading and writing,” she said. “The result of that is you might learn less because you are turning over your thinking process to the technology.”
Overuse of AI Weakens Thinking Skills
Kaveh Vahdat, founder and president of RiseOpp, a San Francisco marketing agency specializing in chief marketing officer services, maintained that overreliance on AI risks displacing the cognitive friction that critical thinking depends on.
“When students defer too quickly to machine-generated answers, they may engage less in evaluating assumptions, weighing evidence, or forming independent judgments,” he told TechNewsWorld. “These are foundational to learning.”
“I think of critical thinking as a muscle. It needs to be regularly exercised,” added Ryan Trattner, co-founder of StudyFetch, an AI-powered learning platform in Los Angeles. “If students are not properly reasoning and evaluating information on a regular basis, this muscle will atrophy.”
“We’re starting to see an entire generation of students who just magically make answers appear without any effort,” he told TechNewsWorld. “When browsing the internet and looking for answers, it was still similar to a library. You had to find and read the information, but it wasn’t exactly answering your question, so you had to interpret and understand it, and then apply what you learned to answer a question. That is not the case with AI, where it’s just a simple copy and paste with zero critical thinking.”
Wide Agreement on AI Misuse
An overwhelming number of survey participants (95%) felt that AI was being misused. “The risk of intentional misuse will always exist with generative AI,” Turnitin Chief Product Officer Annie Chechitelli said in a statement. “Transparency throughout the student writing process enables educators to leverage the opportunities that AI technologies present while upholding the integrity of original student work.”
“It can be helpful to have students do more of their writing in class, where it’s more difficult for them to use AI since they risk getting caught, but should we be teaching defensively and coming up with strategies that anticipate cheating?” asked Dan Kennedy, a professor of journalism at Northeastern University in Boston.
“I don’t think so,” he told TechNewsWorld. “Maybe having students produce a few writing samples in class at the beginning of the semester would be helpful, but overall, I’m uncomfortable with the idea of assuming my students will cheat.”
AI poses a huge challenge for universities because it is so enticing for students to use it to get their homework done fast, North added. “It puts a burden on faculty to come up with assignments that will challenge students to approach a problem in a uniquely individual way,” she said. “I always try to come up with problems that would be very hard for AI to solve, but it’s exhausting to try to figure out how to circumvent AI.”
Mark N. Vena, president and principal analyst at SmartTech Research in Las Vegas, maintains that misuse can be reduced through clear guidelines, ethical training, and early integration of AI literacy into the curriculum. “Educators should model responsible AI use and encourage students to see AI as a tool for exploration, not automation,” he told TechNewsWorld.
Getting the Most From AI
The survey also noted that while organizations may be expecting an AI-ready future workforce, more than two-thirds of the students surveyed (67%) felt they are shortcutting their learning by using AI. In addition, 50% of students report not knowing how to get the most benefit from AI in their studies.
“Often schools are falling behind in their understanding of how to use these systems, and by extension, how to help students get the most out of AI,” said Matt Mittelsteadt, a technology policy research fellow at the Cato Institute, a Washington, D.C. think tank.
“To fill the gap, I’d encourage students to look to YouTube and other free-to-use online learning platforms,” he told TechNewsWorld. “Today there is a growing wealth of free resources on AI use cases, prompting techniques and limitations that students should dive into to fill the gaps in their education.”
“I’d also recommend students investigate more ‘bespoke’ use cases of AI that could really supercharge their learning,” he added. “Machine translation, for instance, is now largely mature, opening the door for students to translate novel primary sources, follow international news, and engage with unique information sources that previously would have been walled behind a language barrier.”
Vena added: “Students can derive the greatest benefit from employing artificial intelligence to augment — not supplant — their learning process. This entails utilizing AI for feedback, brainstorming, and conceptual clarification while simultaneously maintaining engagement and reflective thinking throughout the learning journey.”
Need To Reimagine Education for the AI Era
David Bader, director of the Institute for Data Science at the New Jersey Institute of Technology in Newark, N.J., maintained that society is at an inflection point that requires reimagining education, not just adding AI as another tool in the existing framework.
“The key question isn’t whether to use AI, but how to evolve education to prepare students for a world where AI is ubiquitous,” he told TechNewsWorld. “This means shifting emphasis from fact memorization to higher-order thinking skills that complement rather than compete with AI capabilities.”
“It means reconsidering assessment fundamentally — what are we measuring and why?” he continued. “It means acknowledging that literacy now includes understanding algorithmic influences on information.”
“Most importantly, we need to maintain focus on the uniquely human aspects of education — the creativity, ethical reasoning, and interpersonal skills that remain distinctly human domains,” he said. “AI should amplify these capacities, not replace them.”
“Educational institutions have a responsibility to model thoughtful AI implementation, being neither uncritically enthusiastic nor fearfully resistant,” he added. “The decisions we make now about AI in education will shape not just learning outcomes but society’s relationship with these powerful technologies for years to come.”