Of the 154 respondents in the anonymous EM survey, no less than 92 per cent said that they regularly used ChatGPT for their studies. Most students use it at least once a week, while a quarter even use it daily. Only 8 per cent said they had never used the chatbot in their studies.
Primarily used for revising text
Although 18 respondents admitted that they had occasionally submitted texts generated by the OpenAI chatbot, at least 17 of them did not submit entire texts or essays without first revising them. They use it as a tool, for example to translate text they have written themselves, or to rephrase a text to make the wording more formal. Another respondent used ChatGPT to produce programming code.
Some do use it to generate text, but they don’t submit it without first reading through it. An economics student who started using the chatbot ‘five days after its launch in December 2022’, and has been using it daily since, stated: “I always edit it first and then revise it several times. ChatGPT isn’t good enough, and it would be silly because I’m supposed to be learning about subjects myself.”
One student could perhaps be suspected of plagiarism, because they replied somewhat cryptically to the question of whether they had ever submitted an AI-generated text. ‘Google is your friend’ was the only explanation from the Economics and Philosophy student. Another student admitted to handing in a text entirely generated by ChatGPT, but stated ‘that was the assignment’.
Brainstorming
The most popular use of the chatbot is ‘to generate ideas’. No less than 64 per cent of respondents said they use it as a brainstorming machine. “For my Bachelor’s thesis I used it to brainstorm potential subquestions”, an ESHCC Master’s student wrote. Google should be worried too, because the second-most popular use of ChatGPT is ‘as a search engine’. Almost half of the students use it to make summaries or get feedback on language and grammar.
It’s striking how many respondents use the chatbot to generate programming code. The advantage of this is that you don’t have to worry about the reliability of the answer: “If the code works, you know it’s reliable”, a student explained.
Solution for introverts
Another commonly mentioned use for ChatGPT is as a sparring partner, which can be a great solution for introverted students. “I use it to have conversations about concepts. I suffer from a lot of social anxiety, so it’s nice to be able to talk to ChatGPT about philosophical concepts or SPSS functions”, a Master’s student studying at three faculties wrote.
A law student sometimes asked ChatGPT to generate a practice exam. “I upload my literature as a pdf file, plus a past practice exam or past actual exam if available. I then ask ChatGPT if it can make a new practice exam with questions on the literature in the same style as the previous exam.” Only 13 per cent of respondents used ChatGPT to generate practice assignments, but half considered this to be a useful application of the chatbot.
‘It does make calculation errors’
Doubts about the chatbot’s reliability seem to be the main reason why students shy away from using it to commit plagiarism. The respondents rated its reliability with a 3.2 out of 5. Only one student gave ChatGPT a score of 5 out of 5. “Everything relating to factual knowledge is reliable”, the Business Administration student wrote. But even this first-year student noted some issues: “It does make calculation errors.”
Other students, who use the paid version of the chatbot, upload scientific papers via a plug-in to increase its reliability.
Misunderstandings
The students’ answers do reveal some misunderstandings about the nature of the AI chatbot. “I assume that GPT also searches the internet on its own”, someone wrote. In reality, the bot cannot look up information on the internet. The free version uses a language model, which is based on sources that were last updated in January 2022.
Another student believed that ChatGPT only uses ‘reliable sources’, but the bot’s training data actually consists of millions of websites. The chatbot doesn’t search for information at all; it generates answers with the words that seem most logical based on its language model. Consequently, the program usually doesn’t know the source of its information.
Lies to you when it feels like it
According to some respondents, you can increase the reliability of ChatGPT by asking it to check its own answers. When the bot is questioned, it may indeed discover that it made an error, but its corrections are often wrong as well. “You’ll sometimes see recurring hallucinations”, a Master’s student noted. Another student, who rated its reliability with a 2 out of 5, even suspected ChatGPT of being recalcitrant: “It lies to you when it feels like it.” Yet another respondent felt that the chatbot was ‘an unreliable team member’. “You can give it assignments, but make sure not to hand in anything it writes.”
A few respondents felt that the use of ChatGPT was not justifiable in any context. “It’s damaging to higher education, because it undermines students’ ability to write, conduct research and analyse texts. And it’s simply unfair”, a second-year economics student said.
One student questioned the reliability of ChatGPT because the software ‘cannot feel emotions’. EM asked ChatGPT how it feels about that, and the answer was: “Oh no, my lack of emotions makes me the digital world’s deadpan comedian! I try my best, but laughing is not built into my programming language. Perhaps I should take a ‘humour for AI’ course to add some more jokes to my database.”
Explanation of the survey
In the period from 28 September to 19 October, EUR students had the opportunity to complete a survey on the Erasmus Magazine website on their use of ChatGPT in their studies. The survey was put together in collaboration with Quincy Breidel, a lecturer at the Erasmus School of Law. The Dutch-language survey was completed by 84 respondents and the identical English-language version by 70, bringing the total number to 154.
Slightly over half of the respondents were Bachelor’s students; the rest were Master’s students. Master’s students were overrepresented in the survey, as they make up 36 per cent of the student population but comprised 45 per cent of respondents. The share of ESHCC students among respondents was twice as high as would be expected (12 per cent despite making up just 6 per cent of the student population), whereas Law students were underrepresented (8 per cent of respondents compared to 16 per cent of the student population). The respondents were not asked about any other personal characteristics.