This has never been done before anywhere in the world: asking researchers from every discipline the same questions, from pharmacy to literature and from sociology to art history. What is going well within their discipline, what are the dubious practices and which factors play a role?
Everyone in the academic world sees research integrity as important, especially after the prominent scandal involving Tilburg Professor of Social Psychology Diederik Stapel, whose years of academic misconduct came to light in 2011. Increasing attention is also being paid to the grey area in which researchers sometimes cut corners or make their results appear more meaningful than they actually are.
ZonMw, an organisation that funds research, mainly in the medical field, has launched a programme to promote good research and has earmarked €3.8 million for it. The ‘National Survey on Research Integrity’ is one of its four pillars.
But after two and a half years of preparation, the survey is only being supported by five universities and three UMCs. These are the universities of Nijmegen, Tilburg, Maastricht and Amsterdam (two universities) and the academic hospitals in – yes – Nijmegen, Maastricht and Amsterdam.
Lack of cooperation
All the other institutions have decided not to support the survey: they are not providing lists of e-mail addresses, let alone encouraging their staff to participate. But the organisers of the project are still doing everything they can to get through to people, says researcher Gowri Gopalakrishna.
How urgent is this study?
“Focusing on ethical research seems more important than ever during the coronavirus crisis. We are seeing a flood of research, and many articles are being published as ‘preprints’. Those are articles that have not yet gone through the peer-review process as they normally would before being published in a scientific journal. So there are fewer controls on quality. That makes it even more important that the authors themselves work as carefully as possible.”
But even peer review does not mean that every error is spotted.
“That’s true. Dubious research sometimes still makes it into the leading scientific journals.”
Is there a recent example of that?
“This spring, there was a study in The Lancet about a malaria drug which, unfortunately, is not an effective treatment for Covid-19. The underlying data turned out not to exist at all. Fortunately, this did not affect the treatment of patients, because other research showed that the drug was ineffective.”
Do problems like this happen outside the field of medicine?
“That’s what we’d like to find out. We hear all kinds of stories, but the aim of this study is to find out exactly what is going on, and in which disciplines. It would show us where we need to focus. The results of this study wouldn’t be the end of a process, but the beginning: the survey would give us the hard data we need to start a dialogue.”
Can you cover all disciplines with one questionnaire?
“Well, you can’t capture everything in one questionnaire, of course, and some terms may raise questions. But this is the first step. It won’t take researchers any more than 15 to 25 minutes to complete the questionnaire. We started with a shorter questionnaire, but it turned out not to be nuanced enough.”
Most universities and UMCs are not actively taking part in the survey. What effect will that have?
“Researchers can still take part, but there is nobody to point out how important the survey is. We’ve noticed that our emails are often not even opened. The researchers might think it’s spam, or not recognise the name of the research agency, Kantar. Or they might lose interest when they see the ZonMw logo, because they don’t usually have anything to do with us. So none of that helps.”
Are the institutions being deliberately obstructive?
“I don’t believe so, but this is a problem that affects all of us, so I’m disappointed that there is apparently no collective will among the institutions to tackle it together. We’ll need to work together if we want to understand the issue of scientific integrity in all its complexity. We hope that researchers will still take part, even if their own institution is not actively supporting the survey.”
A broader palette
The HOP approached one general university and one technical university to ask why they are not supporting the study. Both referred to the forthcoming evaluation of the Code of Conduct for Research Integrity, which was introduced in 2018.
“That evaluation will highlight a broader range of issues and is expected to provide better information on how to improve policy,” replied Utrecht University. “Because our researchers will need to take part in that evaluation, after careful consideration we have decided not to participate actively in the NSRI. We regularly need to decide whether or not to send surveys to our researchers.”
TU Eindhoven also referred to the evaluation of the new code of conduct: “We expect that evaluation to provide more information on possible action in a wider range of areas. But of course, individual researchers are still free to take part in the NSRI survey. We look forward to the results with interest.”
Erasmus University had other reasons for not cooperating, says a spokesperson for the Executive Board. The decision was taken because a survey had already been conducted among all EUR staff following the introduction of the new national Code of Conduct for Research Integrity in 2018. In addition, the National Survey on Research Integrity provides ‘general answers and answers per scientific discipline’, but ‘no data per university’.
“The Executive Board did not want to put an extra burden on staff by carrying out another survey, but instead wanted to act on the results of its own research.” Those results were used, for example, to revise the university’s general integrity code, to develop a dilemma game on scientific integrity and to professionalise the ethical review committees and data management at the EUR.