Normally, at least four people are involved in every article EM publishes: a journalist who writes the story; a photographer or illustrator who creates the image for it under the supervision of an art director; and a senior editor who oversees the whole process and reviews the article before it is published.

For the piece ‘Erasmus X: Innovation in Education’, artificial intelligence was assigned the roles of journalist and illustrator. The imaging program Midjourney created the illustration, while ChatGPT 3.5 was selected to write the article. This version of the chatbot is free, making it the most accessible to the average person.


AI in action

To start, I uploaded the transcript of the interview (also made using an AI application, Amberscript) to ChatGPT and instructed the bot to turn it into a journalistic article. The resulting text was eleven paragraphs long, without a single quote, and so dull that it did not invite reading.

I entered a second prompt: add quotes from Abel to the text. Then, when I gave it a third instruction asking the chatbot to use less formal language, it generated sentences like: ‘…something they call ChatGPT, a totally awesome technology that might just rock your school’s world’. While it’s amusing that ChatGPT thinks of itself as awesome, that language is too informal for a journalistic article. I also asked ChatGPT to come up with some sub-headings, but rather than sub-headings, the bot added a chapter structure like you would see in an academic thesis. In the end, I disregarded the results of the last two prompts.
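
(We worked in the free web interface of ChatGPT 3.5, but the same back-and-forth can be reproduced programmatically. Below is a minimal sketch using OpenAI’s Python library and the gpt-3.5-turbo model; the file name and prompt wording are illustrative stand-ins, not our exact instructions.)

from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# Load the transcript (a hypothetical file name for the Amberscript export)
with open("interview_transcript.txt", encoding="utf-8") as f:
    transcript = f.read()

messages = [
    {"role": "user",
     "content": "Turn this interview transcript into a journalistic article:\n\n" + transcript},
]

# First prompt: generate the draft article
draft = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
article = draft.choices[0].message.content

# Second prompt: keep the conversation history so the model revises its own draft
messages += [
    {"role": "assistant", "content": article},
    {"role": "user", "content": "Add quotes from Abel to the text."},
]
revision = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
print(revision.choices[0].message.content)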



Failure to recognise mistakes

During the creative process with ChatGPT, I was reminded of a song title, an old hit by the rock band the Scorpions: What you give, you get back. Whatever you enter is precisely what you’ll see in the result. ChatGPT is not able to place the input in context or to correct mistakes on its own. Spelling errors are one example of this. Because the transcription program, Amberscript, didn’t recognise the word ‘ChatGPT’, the text includes things like ‘Chai BT’, ‘Chai tea’, ‘Czech pity’ and ‘Chachapiti’. I left those errors untouched in the text I uploaded, hoping that the bot would realise – based on the context – what the word should have been and correct it accordingly. Unfortunately, that was not the case. ChatGPT also adopted other errors, like splitting Dutch compound words in two (for example: innovatie lab instead of innovatielab).

Problematic

The first thing I notice is that ChatGPT tries to imitate human behaviour. This is evident in sentences such as: “Vanessa Abel, an enthusiastic representative of Erasmus X, talks about their strategic and refreshing approach…”

The chatbot has added a sprinkling of adjectives to the text: words like ‘enthusiastic’, ‘exciting’ and ‘refreshing’. The fact that these are its own inventions – the words do not appear in the original transcription – is problematic. What’s more, with these words ChatGPT inserts baseless value judgements into the text, without explanation or justification. As a journalist, I consciously strive to avoid such words and to allow my readers to form their own opinions.

No critical questions

During the interview, Abel talked about a variety of ErasmusX projects, but we also addressed her organisation’s role in education: how is ErasmusX making a relevant contribution to educational innovation? How are the technological experiments ErasmusX conducts useful, and what value do they add? What concrete objectives does the organisation have and how does it measure its impact? We additionally discussed the collaboration between ErasmusX and other departments at the university, like Erasmus Digitalisation & Information Service and the Community for Learning & Innovation.

ChatGPT omitted this portion of the interview entirely. Instead, the bot chose to list the projects and skip the critical questions. I suspect that ChatGPT only summed up the ErasmusX projects because they were the obvious overall theme of the interview. In other words, the bot lacks the journalistic skills needed to select information and critically weigh the facts before deciding how to approach a topic.



Lack of depth

ChatGPT devoted exactly two paragraphs to each project. The structure is identical as well: first exposition, then a lengthy quote. Every piece of information in the text seems to be equally important, making it difficult to interpret the story. This shows that ChatGPT is incapable of ‘zooming in’ on important information and omitting irrelevant details.

It also lumps facts and opinions together throughout the story, so that the reader has difficulty telling them apart. Normally, opinions would be placed in quotation marks to create a clear distinction between the journalist’s account and the opinions of the person being interviewed. In this article, however, the aforementioned made-up value judgements are presented as facts, Abel’s opinions are not always placed in quotation marks, and factual statements sometimes are. To give an example: the quote ‘The Ace Yourself app is intended to help young students prepare for academic life’ would be fine without quotation marks.

News value and specific examples

The journalistic value of the article is also unclear because ChatGPT has a hard time deciding what’s important. A journalist is better at that: what’s new, what do readers need to know and where’s the urgency in the story? Personally, I would zoom in on the role ErasmusX plays and its impact on education, or else start with the most recent project (urgency). I would also mention one or two projects, choosing the ones most likely to interest students (relevance). I’d put background information at the end and then close with a quote from Abel about how important continuous experimentation and innovation are to the educational and scientific communities.

I would also give specific examples – something that’s completely absent in the text by ChatGPT. Take for example the HefHouse project, in which students work with neighbourhood residents to find solutions to social issues. This project includes a buddy programme that pairs students with newcomers to the Netherlands (so-called status holders). A description of this programme and a typical example of a buddy pair would make the story more appealing and more relatable to the reader.

Style

Senior editor Tim Ficheroux is shocked by the poor quality of the article. “I find the text to be fairly clichéd, vague and suggestive. It offers a shallow summary and at times, reads almost like an advertisement. The language errors aren’t that bad; in the Dutch version, most of the mistakes you see are common ones like compound words that have been split in two”, he explains. “I also noticed that, although the entire piece is written in one cohesive style, that style is a little overblown. Words like ‘reveal’, for instance, are not appropriate for the content of the piece.”

“You can see that same overblown style in the clichéd adjectives. Apart from the fact that ChatGPT invents opinions like ‘fascinating’, ‘refreshing’ or ‘exciting’ on its own, those words also create a clear frame. It is interesting to see that ChatGPT does not produce a neutral text, even though it pretends to”, he says.

Human illustration vs. Midjourney

And now for the illustration. Whereas with ChatGPT you can simply toss in the whole transcript, Midjourney only accepts a brief prompt. In response to the prompt ‘students getting to know each other in a VR learning experience in digital art style’, the program spits out four images. I pick the most appealing one and ask the image bot to provide the illustration in a horizontal format.
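
(In Midjourney’s Discord interface, that exchange comes down to a single command. The line below shows the prompt quoted above; the --ar parameter is one way to request a horizontal aspect ratio, though the exact settings here are an illustrative guess rather than a record of what we typed.)

/imagine prompt: students getting to know each other in a VR learning experience in digital art style --ar 3:2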

Illustrator Femke Legué created an illustration for this article based on what ChatGPT wrote. Her instructions, like Midjourney’s prompt, are often no more than a summary of an article. “From that, I distil the main points and then think about what I can do with them”, she says, explaining her work process. “What do you typically see in illustrations about this subject? Am I putting my own spin on the clichés, or am I seeking out a whole new perspective instead?”

After that, Legué’s process consists mainly of making lots of sketches and trying out different things. Once she knows what she wants to draw, it takes her around two hours to create an illustration. “But after that, I usually put the drawing aside and let it sit overnight. Then, the next day, I can come back to it with fresh eyes.”

We did a side-by-side comparison of Legué’s illustration and Midjourney’s result. “I think the AI illustration is really nice in terms of colour and composition”, Legué notes. “But all it has done is create a literal interpretation of the prompt.”

Legué’s illustration, on the other hand, is about the interpretation and personalisation of artificial intelligence. “Artificial intelligence is an abstract concept, so in this illustration, I tried to give AI a face”, explains Legué. “I also played with colour and used it to differentiate between the figures. And I included ErasmusX’s VR project in the image as well. It is more or less what AI looks like in a classroom.”

AI is not ready to take over journalistic work

“AI is not exactly creative”, Legué notes. As a result, she is not worried about it taking her job. “I think my individual experience and ideas ensure that my work remains unique. It seems to me that AI knows nothing about culture, clichés or the connections between concepts, like between colours and emotions.”

The same goes for the text. While artificial intelligence can produce a reasonably successful summary in terms of content, it is not yet capable of writing a decent piece of journalism. For now, I’ll have to let go of my dream of letting ChatGPT do my work while I binge-watch my favourite show on the big screen in our conference room, bowl of microwave popcorn in hand.

The cost of AI

ChatGPT: The version we used, 3.5, is free. Subscriptions to ChatGPT 4.0 are available for 20 dollars a month.

Amberscript: 15 euros for a one-hour recording.

Midjourney: A subscription costs 10 dollars a month.