Who fares better in the AI era: students who use it or those who don’t?

As academia adapts to a world where students can generate flawless work with a click, the job market now expects AI literacy; in a special interview, Dr. Amir Gefen explains why grades are here to stay, why exams are making a comeback and whether overusing AI or avoiding it altogether is the bigger concern

Three years after ChatGPT entered our lives, and against the backdrop of Gemini’s meteoric rise over the past year, academia has found itself in a crisis whose outcome is still unclear. On the one hand, universities are still trying to understand how to assess knowledge in a world where almost any student can generate a flawless paper at the click of a button.
On the other hand, it is clear to everyone that proficiency in AI tools in today’s job market is equivalent to the ability to use a computer. In other words, basic command of AI tools is not only desirable but increasingly expected of candidates for many positions, especially in the high-tech industry. Between the desire to preserve academic standards and the pressure to remain relevant to the labor market, it is clear that change will not stop at reports, pilot programs or statements of intent.
Students in class (Photo: Shutterstock)

Between grades and artificial intelligence

So have grades lost their meaning? Is the solution surprise in-class exams? And where is the line between legitimate use of artificial intelligence and abandoning independent thinking? A conversation with Dr. Amir Gefen, a researcher and lecturer in artificial intelligence and online safety at Gordon Academic College of Education and an academic adviser to Israel’s Education Ministry, reveals a far more complex picture than the simple question of whether students are copy-pasting their assignments.
Today, when AI tools are available to everyone, do academic grades still reflect ability and talent, or mainly indicate who knows how to use AI better?
“We need to say this very clearly: in academia, we still have to give grades. That isn’t going away,” Gefen says. “For now, we are still operating within existing academic frameworks, even if they are shifting and changing. From the academy’s perspective, as long as there are students, we need to assess them and their abilities and ultimately assign a grade.
“The real challenge is ensuring that the grade reflects the student’s knowledge, not just their ability to operate one model or another. At the same time, it’s important to say that using artificial intelligence is legitimate, even welcome. We encourage students to use these tools as part of the learning process, and the world of work expects it as well.
“But at the end of the day, artificial intelligence is not what’s supposed to receive a grade from me. I need to assess the person in front of me. The question is how we separate what the student knows from what the AI knows how to generate for them, and that is the challenge facing academia right now.”
Artificial intelligence: a threshold requirement in many workplaces (Photo: Shutterstock)
Today there are tools that can make AI-generated text sound ‘human.’ How can an academic institution even know who actually learned and who just submitted a product? Do we need a new assessment model?
“Absolutely. We need a new assessment model, or more accurately, a new-old one. In the end, I need to know what’s in the student’s head. I have no way to get inside it, so I have to create an assessment event that allows the student to express their knowledge.
“There are two ways to do that, in writing or orally. It can be done through a closed-book exam, that ancient method used by every educational institution since forever, or through a conversation, a kind of oral defense.
“In that conversation, you can ask the student to explain what they wrote, what research method they used and how they reached their conclusions. If they can describe their work and answer questions they didn’t know in advance, then as far as I’m concerned, they know the material and I can give them a grade.”
That makes sense in small classes, but universities have courses with 200 or 300 students. A lecturer can’t have those conversations with everyone.
“We need to separate the pedagogical principle from the logistical challenge. Pedagogically, there’s no question here: to give a grade, I need to know what the student knows, and the way to express knowledge is in writing or orally. Exams can also be held in large groups, and there’s no problem with that. The student can use AI in their learning, but at the moment of truth, they are alone with the paper.
“An oral exam does create an operational difficulty in large classes, but that’s a logistical issue, not a pedagogical one. You can think of creative solutions like a short video, real-time questions or formats that don’t require a one-on-one meeting. It’s not simple, but it’s solvable. The question is whether the system is ready to change and adapt itself to reality.”
Dr. Amir Gefen (Photo: Shlomi Amsalem)
How can academia, in your view, maintain standards on the one hand and remain relevant on the other?
“People come to academia because they believe it will benefit them. There is no compulsory higher education law. They’re looking for understanding, employment, advancement and an intellectual experience. Universities have two parallel roles: preparing students for the job market and cultivating academic research.
“The world may be changing, but the job market still looks at academic degrees, including in the humanities and social sciences, and therefore expects grades to reflect real knowledge. But it’s important to understand that it isn’t only assessment that is changing; teaching is changing too.
“As a lecturer, for example, I have a smart personal teaching assistant. At the same time, there are things I choose to teach myself and things I allow students to learn independently with the help of AI. We are in the midst of a deep change across three dimensions: teaching, learning and assessment. Everything is changing together. It’s complex, but there is no choice. If we ignore AI, we will become irrelevant both to students and to the job market.”
Who worries you more, students who don’t use AI at all or those who use it too much?
“Both, but in different ways. Students who don’t use artificial intelligence at all and lack AI literacy worry me a lot, because they will struggle to find jobs. Today, in job interviews, more and more places ask candidates how proficient they are with AI tools. Employers understand how important this is.
“In the end, the ones who will replace people at work won’t be AI itself, but people who know how to use AI and use it to improve their efficiency and the quality of their work. That’s why I have a duty as a lecturer to promote AI literacy among my students. If academia doesn’t do that, it will only deepen existing gaps.
“At the same time, a student who uses AI but becomes overly dependent on it or misuses it harms themselves, because they don’t develop critical thinking. But that’s less serious, because it’s something you can work on: you can improve it and teach the student how to use the tools properly.
Artificial intelligence and the labor market (Photo: Orion Production, Shutterstock)
“It’s much harder to deal with someone who says today that they don’t know how to use artificial intelligence at all. Three years after ChatGPT’s breakthrough, that’s no longer legitimate. A student who doesn’t know how to use AI today is like a student who doesn’t know how to turn on a computer.
“At the end of the day, AI literacy doesn’t fall from the sky. You need to work at it intentionally and want to learn as much as possible. That’s where academia comes in, among other things, and it’s also our responsibility to reduce these gaps among students.
“Alongside that, academic faculty also need training. Lecturers and teachers weren’t born with AI knowledge. The Education Ministry and academic institutions are investing in this, but there is always more that can and should be done. Only in this way can artificial intelligence be properly integrated and passed on to students.”