Around the world, long-standing scientific institutions are turning their keen and penetrating gaze inward. With society facing problems such as climate change that urgently require scientific solutions, research funders are seeking to deliver innovations quickly.
But how do funders speed up innovation? Do research grants need to be better designed? Is the increasing average age of scientists an issue? How should scientific publishing be structured? What’s the best way to organise research teams?
When confronted with tricky questions, scientists resort to the scientific method. Research directors are no different. With the curiosity of the best scientists, funders are “turning science on itself”. They are coming up with answers using data collection, statistical analysis and experimentation – the standard scientific toolkit.
Using scientific tools to improve funding systems isn’t a new idea. In the mid-1990s, rigorous experimentation transformed international development strategies. In 2019, the three economists responsible were awarded the Nobel Prize in Economics. Abhijit Banerjee, Esther Duflo and Michael Kremer were recognised for their “new experiment-based approach” to obtaining reliable answers about the best ways to fight global poverty.
This strategy was completely different from conventional approaches. Traditionally, economists were like detectives looking for clues. Researchers would gather information – teacher qualifications, textbook availability, school meal provision – and see whether these inputs were associated with good grades and test scores. They tried to establish correlations.
Kremer and his team tackled more specific questions. Instead of asking, “What input drives good grades?”, they designed field experiments to answer the question, “Do educational materials improve student outcomes?” Some schools were randomly chosen to receive additional resources, while others just got the usual support. Outcomes were measured and the research team was able to draw direct conclusions relevant to education policy.
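To make the design concrete, here is a minimal sketch in Python of that kind of randomised comparison. Everything in it – the school names, the random assignment and the test scores – is invented for illustration and is not drawn from the original studies.

```python
import random
import statistics

random.seed(0)

# Hypothetical list of participating schools (illustrative only).
schools = [f"school_{i}" for i in range(100)]

# Step 1: randomly assign half the schools to receive the extra
# materials (treatment); the rest keep the usual support (control).
random.shuffle(schools)
treatment, control = schools[:50], schools[50:]

# Step 2: measure outcomes. The test scores here are invented;
# in a real field experiment they would be collected on the ground.
scores = {school: random.gauss(60, 10) for school in schools}

# Step 3: compare the groups. Because assignment was random, the
# difference in average scores estimates the causal effect of the
# materials rather than a mere correlation.
effect = (statistics.mean(scores[s] for s in treatment)
          - statistics.mean(scores[s] for s in control))
print(f"Estimated effect of extra materials: {effect:+.2f} points")
```

The crucial step is the random assignment: because no one chooses which schools get the materials, any systematic difference between the two groups can be attributed to the materials themselves.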
The results were startling. Whereas conventional studies had suggested student materials improved test scores, the randomised experiments showed no significant effect. Subsequent experiments indicated that grouping students by ability rather than age was much more effective at improving test scores. Since those results were published, the “teaching at the right level” policy has “improved learning opportunities for over 60 million students in India and Africa”.
Readers may note that this approach is the same as randomised clinical trials – the bedrock of pharmaceutical research. Developed to produce results in which we can have statistical confidence, the experimental design allows researchers to draw causal conclusions. If the drug is given, the disease is cured; if input X, then output Y. Kremer and his fellow Nobel laureates took an imaginative step – they brought these experiments out of the lab. Now, research funders are turning their agencies into labs in the same way.
Like the development economists, research funders are beginning to run experiments on how they fund projects. For example, the US National Institutes of Health (NIH) is piloting a grant designed to reduce the administrative burden on top scientists, who spend nearly half of their time on administrative paperwork and maintenance rather than on science.
A pilot scheme, the Maximising Investigators’ Research Award (MIRA R35), aims to “increase the efficiency of… funding by providing investigators with greater stability and flexibility”. The goal is to enhance the productivity of science and increase the chances of significant breakthroughs.
MIRA also tackles another NIH problem: the average age of principal investigators rose from 39 in 1980 to 51 in 2008. With 78 per cent of Nobel Prize winners between 1980 and 2010 conducting their research before the age of 51, the rising average age suggests that groundbreaking work is being pushed out. A “MIRA for Early Stage Investigators” pilot aims to ensure the model benefits young, ambitious researchers.
Efforts to understand efficient resource allocation are at the heart of metascience. The “science of science” can reveal the best strategy – whether to fund people or projects, use teams or individuals, or have rigorous applications or fast grants. Peer review, whereby scientists assess each other’s proposals and review each other’s results, traditionally made these decisions. It, too, is being improved with metascience.
Peer review, while valuable, has some significant flaws. It was primarily designed to distinguish between good and bad scientific proposals. Scientific publishing uses it to triage between high and low-quality results. At this, it performs very well.
However, it runs into difficulty when trying to make fine-grained distinctions. Most grant applications fall into three categories: very good (gets funded), very bad (gets rejected) and spectacularly average. Ranking proposals in the same category is challenging with peer review. The signal is lost in the noise. In these cases, reviewers’ biases – potentially based on gender, age, institution and race – might kick in and influence the final decision.
Metascience research conducted by Mike Lauer, deputy director for extramural research at the NIH, quantified the weaknesses of peer review. His analysis found that peer review can put proposals into broad categories (good or bad) but has no power to rank them in order of merit. Indeed, any other method would be just as effective. Industry panels, lotteries and positive discrimination could all work just as well.
Not only has metascience diagnosed this issue, but it has also developed potential solutions. In 2018, the Swiss National Science Foundation (SNSF) experimented with drawing lots to make funding decisions. In rare cases where funding applications received the same score, the SNSF used a lottery procedure, piloted in its postdoctoral mobility programme. Following evaluation, the Swiss National Research Council allowed evaluators to draw lots, as required, across all funding schemes. As of 2021, randomisation decided all tiebreaker cases. Prof Matthias Egger, president of the Research Council, said, “In such cases, drawing lots is the fairest solution because it is blind and rules out bias.”
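A rough sketch of how such a tie-break could be implemented is below; the proposals, scores and budget are hypothetical, and the SNSF's actual procedure is more elaborate.

```python
import random

# Hypothetical peer-review scores for a funding call (illustrative only).
scores = {
    "proposal_A": 5.0,
    "proposal_B": 4.2,
    "proposal_C": 4.2,  # tied with proposal_B
    "proposal_D": 3.1,
}
budget_slots = 2  # only two proposals can be funded

# Rank by score, breaking exact ties by drawing lots, so the choice
# between equally rated proposals is blind and free of reviewer bias.
ranked = sorted(scores, key=lambda p: (-scores[p], random.random()))
print("Funded:", ranked[:budget_slots])
```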
Peer review may also favour incremental research over potentially transformative but risky ideas. This risk-aversion occurs because peer review panels tend to be polarised on the merit of big ideas. Proposals often need to win majority support before securing funding. Bold plans with transformative potential but a low chance of success usually only have one, insufficient, champion on a funding panel.
Once again, metascientists are trialling solutions. In partnership with the Institute for Progress, the US National Science Foundation is investigating “golden tickets” as a way to ensure high-risk, high-reward proposals survive peer review. Under this system, each peer reviewer can override a majority decision once in a funding cycle. According to Nature, “the golden-ticket model would, in theory, mean that unorthodox, high-risk proposals that don’t necessarily have the unanimous backing of all referees end up being funded”. Like the international development experiments of Kremer et al, pilots and trials will determine whether this strategy works.
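In code, the mechanism as described might look something like this minimal sketch; the proposals, reviewers and votes are invented, and a real panel would apply far more nuance.

```python
# Invented example of a golden-ticket rule: a proposal is funded if it
# wins a majority of the panel, or if a reviewer spends their single
# per-cycle override ("golden ticket") on it.

votes = {
    # proposal: {reviewer: supports?}
    "incremental_study": {"r1": True, "r2": True, "r3": True},
    "risky_big_idea":    {"r1": False, "r2": False, "r3": True},
}

# Each reviewer may back at most one proposal with their golden ticket.
golden_tickets = {"r3": "risky_big_idea"}

for proposal, ballots in votes.items():
    majority = sum(ballots.values()) > len(ballots) / 2
    ticketed = proposal in golden_tickets.values()
    status = "funded" if (majority or ticketed) else "rejected"
    print(f"{proposal}: {status}")
```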
Running these experiments need not be costly. Involving just two dozen researchers in a yearly pilot would, over a five-year experiment, accumulate more than 100 data points – enough to offer robust insights. One of the considerable advantages of conducting these pilots is gaining a deeper understanding of the entire research system, enabling more effective and targeted improvements.
“There’s a palpable surge in levels of ambition and investment in metascience from governments, funding agencies and foundations across an array of countries,” says James Wilsdon, executive director of the Research on Research Institute (RoRI). “And we see swelling ranks of researchers engaging from almost every discipline, applying novel methods they have honed in their home fields to analyse and improve the effectiveness of the research systems and cultures they work in.”
RoRI’s Experimental Research Funder’s Handbook is a practical guide for metascience experiments. It collates information on trials from around the world. Lessons in the design and implementation of experiments with funding processes are explored. A new metascience unit in the UK government, funded by £10 million, will be putting it to use across the British innovation ecosystem. Another Nature article asks when other countries will follow suit. “The launch of Research Ireland presents a clear opportunity for Ireland to position itself at the vanguard of this global movement for metascience,” Wilsdon says.
These funders and governments recognise that by improving the efficiency of scientific research, metascience accelerates progress in solving society’s most pressing challenges. Metascience is paramount for countries that have staked their future prosperity on being knowledge economies. Ireland is a prime example.
The future of science is not just about what we research but how we research. The potential rewards – for individual countries and global progress – are too great to ignore.
Luke Fehily is director of innovation policy at Progress Ireland, an independent think tank on a mission to connect Ireland to data-driven policy solutions worldwide. He previously worked as an open innovation engineer at Johnson & Johnson