‘You’re basically stupid if you don’t use ChatGPT’
Many lecturers are concerned about students using AI tools. But public philosopher Bas Haring sees mostly opportunities: “Outsourcing part of the thinking to AI shouldn’t be forbidden.” Which tasks should be outsourced to AI in education?

Image: Ivar Pel
Bas Haring has ruffled quite a few feathers with his ‘provocative’ experiment. The Leiden philosopher and professor of public understanding of science outsourced his role as graduation supervisor to AI for one student last academic year. She discussed the progress of her thesis not with him, but with ChatGPT. And it went surprisingly well.
Haring is enthusiastic about the outcome of the experiment, though not everyone shares his excitement. Unethical, irresponsible, intellectually impoverished, even disgusting – these were just some of the written reactions. It would, critics say, give populists an excuse to cut education budgets even further.
Perspective
Is it really such a bad idea to leave student supervision to AI – even partially? And just as important: what is a graduation thesis still worth if AI takes over much of the student’s thinking?
Haring is keen to put his experiment in perspective, he says when we ask him about it. “Twenty-five years ago, there was also something new in the world: the internet. It had exactly the same vibe as now. Back then, I gave classes on the question: what can we do with it? How does the internet change our thinking? Everyone was busy with that then, and now we are again.”
“Back then, it was about knowledge becoming available everywhere. Now, with AI, it’s about thinking. Do we still do all our own thinking, or can we outsource some of it to the computer, because it simply does it better?”
ChatGPT can do that
According to the most recent figures from Statistics Netherlands (CBS), nearly a quarter of Dutch people use an AI program from time to time. Among 18- to 25-year-olds, it’s almost half. But those figures are already over a year old and don’t distinguish between students and other young people.
“There’s no getting around it”, says Leiden assistant professor of linguistics Alex Reuneker. “It’s not allowed in thesis writing, but how can you guarantee it’s really written by students? Some things are bound to slip through without us noticing they came from a tool like that.”
“I was reading more and more student reports that were clearly largely written by AI”, says Meryem Janse, former lecturer at Saxion University of Applied Sciences. “That’s one of the reasons I quit teaching. I noticed educational institutions weren’t flexible enough to adapt to the rise of AI – but they should be. Otherwise, what is a diploma still worth?”
For many students, AI has become an integral part of daily life. “I use AI tools for almost everything”, says ‘Mark’, a master’s student in biomolecular sciences. He prefers not to use his real name in the media. “Officially, you’re not allowed to use AI for writing texts, but I do it anyway – and so does everyone I know.”
‘Milou’ (also not her real name), a master’s student in healthcare management, uses AI tools several times a week – as a search engine, to write emails or to format citations correctly. “For practical things that take me a lot of time, I quickly think: ChatGPT can do that.”
Speculative
Meanwhile, Bas Haring has seen thesis quality improve significantly since the arrival of ChatGPT. “That’s because almost all students use AI. The education system just hasn’t yet figured out how to handle it.”
Haring was one of the first AI students in the Netherlands back in 1988. “It was still a very speculative field. What AI can do today seemed impossible then.”
What does the rise of AI mean for higher education?
“We need to think carefully about what we want students to learn. Students use AI – they’re not stupid. ChatGPT is available 24 hours a day – you’re almost stupid if you don’t use it. So how do we make sure a student continues to learn?”
“I always encourage my students to watch lots of films, read literature and visit museums to prepare for their thesis. They sometimes find it frustrating and inefficient. But if you talk to ChatGPT for a moment, it can also give you all kinds of suggestions and ideas. I’m not sure that’s such a bad thing.”
Isn’t coming up with ideas yourself valuable?
“Yes, but as a supervisor I also give prompts. I might say: maybe try looking at it this way. And ChatGPT happens to be very good at that too. It’s not exactly the same process – with AI it goes faster and you get more varied input. It can help with thinking.”
But AI doesn’t encourage thinking if it writes your thesis for you, right?
“No, but if students use it correctly, AI can be stimulating. Like a critical supervisor that offers ideas without spelling everything out.”
But that means it’s the student’s responsibility?
“For now, yes. I can imagine that in a few years we’ll have an AI tool designed specifically for education that provides this kind of supervision without spoon-feeding. That’s how I use AI myself. When I write a text, I ask: could I phrase this differently, what counterarguments are there, what am I overlooking? It always gives interesting ideas.”
You trusted your student to use AI. But can we trust all students?
“No, I wouldn’t say that. I didn’t suggest we fully automate graduation supervision. With the experiment, I merely wanted to raise the question: what can we still teach students, and what tasks are we willing to outsource? That said, I don’t think it’s wise in the long run to use a commercial product like ChatGPT for that purpose.”
You compare this to the rise of the internet. Back then, the idea was that we would need to know less by heart and could focus more on critical thinking.
“That’s not such a bad idea, is it? Everyone has an encyclopedia in their phone now. Knowledge has become less important. And now thinking is becoming less important too.”
Does that apply only to ‘thinking’, or also to ‘critical thinking’?
“Critical thinking is still relevant, but perhaps less so for certain types of reasoning tasks.”
What’s the difference?
“Critical thinking is more inquisitive, more precise. It’s hard to define the exact difference between academic reflection and constructing a logical argument, but they’re not the same.”
So what will be the most important role of education in future?
“I wouldn’t be surprised if, in education, the interpersonal becomes more important than the intellectual. Sitting next to someone, looking someone in the eye, relating to each other. The social aspect. I’ve spoken to GPs about AI in their work. They said a GP has three main tasks: 1) making an initial diagnosis, 2) treating or referring, and 3) offering a listening ear. It’s quite plausible that the first two – the ‘intellectual’ tasks – will become less important. They’ll increasingly be handled by machines. But the ‘human’ work will remain ours.”
Open letter
The reality, however, is that many students take the path of least resistance – and many lecturers are worried. Frans Prins, associate professor at Utrecht University and director of education advice & training: “Students can now outsource some routine tasks, but we still need to safeguard education quality. There’s a risk students will miss out on learning moments.”
Lecturer Reuneker says something similar: “We need to teach students what the risks are and how to engage critically with AI.”
Some lecturers, however, see no role for AI in the classroom at all and advocate a complete ban. At the end of June, 500 researchers and professors signed an open letter calling for more critical scrutiny of AI in education. “AI use demonstrably hinders student learning and undermines critical thinking”, they wrote.
What do you think of such a ban?
“I think it’s a completely nonsensical suggestion. What do you think students are doing at home? Of course they’re going to use AI when writing their thesis. If students really are getting worse at thinking for themselves – which seems likely to me – then a ban won’t help. We need to find better ways to use the tool.”
“We also don’t just give kids a calculator without teaching them to do sums first. At the same time, people today are indeed worse at mental arithmetic than a hundred years ago. Maybe it’s the same here. We don’t want students to become structurally dumber because of AI, but we shouldn’t block the technology entirely. That’s not even possible.”
But education programmes don’t have a grip on it yet.
“We’re in a unique period. Everyone is struggling with it – especially in the humanities, where a lot of writing is involved.” After all, AI programmes are particularly good at producing text.
So what is a diploma still worth, then?
“Maybe a few students will slip through the net and get a diploma when they shouldn’t. Universities are understandably concerned about that. Should we make students write texts by hand? Put them in a booth? Perhaps at the beginning of the programme. Later on, you could allow more AI use.”
And the thesis?
“A thesis shouldn’t be entirely written by AI. But during the defence it quickly becomes clear whether the student is truly the ‘owner’ of the work. And as a supervisor you speak to the student throughout the process. You’ll notice whether they really have a grip on the topic. But outsourcing part of the thinking shouldn’t be forbidden. That’s why I thought it was an interesting – and maybe slightly provocative – experiment.”
“Writing forces you to think carefully, but it’s not equally meaningful for everyone. I had a student who was interested in street litter. She often went cycling and took photos. She learned more from that than from the writing process.”
Could the thesis eventually disappear, if it no longer shows what a student can do?
“Education is a conservative world. Degree programmes often want everything to stay the same. That’s why theses won’t disappear anytime soon. But something has to change – though what, exactly? My advice would be to talk to students and ask them how AI can be used wisely. They know everything about it, they use it daily. I think there are a lot of hidden solutions waiting to be uncovered.”
No oversight on AI in higher education – yet
Ensuring the quality of education isn’t just the responsibility of the degree programmes themselves. There are also legal requirements and guidelines. But these are lagging behind developments: AI does not yet feature in them.
Degree programmes at universities and universities of applied sciences are reviewed every six years by accreditation body NVAO. This quality watchdog is not yet concerned with AI in education.
During reviews, a panel of experts assesses various aspects of the programme, based on fixed standards. “AI is not mentioned in these”, a spokesperson says. Panel members are not trained in assessing AI-related education.
Education Inspectorate
The Education Inspectorate oversees the system as a whole (not individual programmes). In higher education, inspectors assess issues like social safety, equal opportunities and the quality assurance system.
The inspectorate is also currently examining ‘digital resilience and AI in education’. The guiding question: to what extent do university boards ensure digital resilience, and what opportunities and risks do they see for AI in education? But this report isn’t yet finished.
Academic society KNAW
University students are expected to learn how to conduct research with integrity – but what role should AI play in this? Science itself doesn’t yet have a clear answer. The Dutch Code of Conduct for Scientific Integrity currently offers no guidelines.
The Royal Netherlands Academy of Arts and Sciences (KNAW) is currently involved in revising the code. The KNAW says the committee ‘has been tasked with examining how AI should be included in the code’, but there is ‘not much to say about that yet’.