New Zealand spends less money on research, relative to its size, than three-quarters of the countries in the OECD. The government is considering expanding public funding to narrow this gap, but very little is known about the efficacy of existing funding mechanisms. Motu recently released a statistical analysis of the effect of public funding given to business firms for R&D. In this paper, we examine the effectiveness of public funding of basic scientific research.
The Marsden Fund is the premier funding mechanism for blue-skies research in New Zealand. In 2014, $56 million was awarded to 101 research projects chosen from among 1222 applications from researchers at universities, Crown Research Institutes and independent research organizations. This funding mechanism is similar to those in other countries, such as the European Research Council.
This research measures the effect of funding receipt from the New Zealand Marsden Fund using a unique dataset of funded and unfunded proposals that includes the evaluation scores assigned to all proposals. This allows us to control statistically for potential bias driven by the Fund’s efforts to fund projects that are expected to be successful, and also to measure the efficacy of the selection process itself. We find that Marsden Funding does increase the scientific output of the funded researchers, but that there is no evidence that the final selection process is able to meaningfully predict the likely success of different proposals.
Background to the Marsden Fund
The Marsden Fund is funded by the Government, with selection and administration delegated to the Royal Society of New Zealand. Proposal review is carried out by assessment panels of between five and ten members that cover broad areas such as ‘Physical Sciences and Engineering’ and ‘Biomedical Sciences’.
There are two types of Marsden Fund grant:
• Standard – runs for up to three years with a maximum budget of $300,000 per year.
• Fast-Start – restricted to applicants within seven years of their PhD award, with a maximum budget of $100,000 per year.
The application process has two stages. Each initial one-page proposal is reviewed by a subset of the appropriate panel and given a preliminary score. At this point, panels reject 71-84% of the proposals. In the second stage, longer proposals are submitted and sent to external (typically international) anonymous referees for review. Applicants are given the chance to respond to referee comments before the panel scores and ranks the proposals.

Methodology – Team Success

This analysis is based on 1,263 Marsden proposals from the second-round reviews between 2003 and 2008. Overall, 41% of the second-round proposals were funded. Around 25% of the proposals were Fast-Start, and slightly more than half of these were funded. The average researcher on these teams made six proposals and received 1.2 grants between 2000 and 2012.
Figure 1: Researcher interaction with the Marsden Fund, 2000-2012, New Zealand-based researchers who submitted at least one full proposal, 2003-2008
We identified all the publications and all citations received by those publications from 1995 to 2012, for all of the named researchers on both the successful and unsuccessful proposals. We did not attempt to identify specific publications tied to the research funded, so our results should be interpreted as the overall effect of the Fund on researchers’ scientific output.
Results – Team Success
Our analysis looks at the effect of receiving funding on subsequent scientific output, after controlling for what would be predicted by the researchers’ previous record. In other words, we compare the output trajectory of funded researchers to that of unfunded researchers, and look for an acceleration upon receipt of funding. The statistical model also includes the quality ranking assigned to the proposal by the Marsden panel. This controls for “selection bias” that might otherwise make us think that the funding is effective simply because the funded proposals were inherently likely to result in more output whether they received funding or not.

Depending on the exact statistical model employed, we find that Marsden funding is associated with an increase of 6-15% in publications and 22-26% in citations, relative to what would have been predicted based on the researchers’ previous performance. The larger effect for citations implies that funding increases both the number of papers published and the average impact or significance of those publications.
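The logic of this identification strategy can be illustrated with a minimal regression sketch. This is not the authors’ actual specification; the data below are synthetic, and all variable names (prior output, panel rank, funding indicator) are illustrative assumptions. The point is that including the panel’s quality rank alongside prior output lets an ordinary least squares regression recover the funding effect even when funding decisions are correlated with expected success:

```python
import numpy as np

# Synthetic sketch of the team-level regression: post-proposal output
# regressed on a funding indicator, controlling for the team's prior
# output and the panel's quality rank. All numbers are invented.
rng = np.random.default_rng(0)
n = 1263                                         # number of second-round proposals
prior_output = rng.normal(0, 1, n)               # standardised prior publication record
panel_rank = prior_output + rng.normal(0, 1, n)  # rank correlates with track record
funded = (panel_rank + rng.normal(0, 1, n) > 0).astype(float)  # selection on rank

true_funding_effect = 0.10                       # e.g. a ~10% boost in (log) output
post_output = (0.5 * prior_output
               + true_funding_effect * funded
               + rng.normal(0, 0.5, n))

# OLS with an intercept: post_output ~ funded + prior_output + panel_rank.
# Conditioning on panel_rank absorbs the bias from funding "good" proposals.
X = np.column_stack([np.ones(n), funded, prior_output, panel_rank])
beta, *_ = np.linalg.lstsq(X, post_output, rcond=None)
print(f"estimated funding effect: {beta[1]:.3f}")
```

Dropping `panel_rank` and `prior_output` from `X` in this sketch would inflate the estimated funding coefficient, which is exactly the selection bias the study’s design guards against.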
Fast-Start teams are associated with around 16% greater research output, consistent with these younger investigators being on a steeper upward output trajectory than other researchers. Since winning Fast-Start proposals are awarded about one-third as much money as winning Standard proposals, the results suggest that they get a bigger boost per budget dollar.
A researcher’s participation in a Marsden proposal can range from zero FTE to full-time. However, we cannot distinguish the differential impact of funding based on the different funding levels of the team members. (Zero-FTE researchers are often overseas researchers whose true role in the research is unclear; excluding these researchers does not materially affect the results.)
We attempted to look at each disciplinary pool separately, but unfortunately at that point the sample sizes are too small to yield statistically meaningful results.
While we included the panel evaluation rank to control for selection bias, we found that a good ranking is not associated with better subsequent performance. In fact, after controlling for previous performance, the ranking is negatively associated with success. It is as if the panel rankings place so much weight on “track record” that they actually over-estimate the likely future performance of those researchers with strong past performance.
Methodology – Individual Success
As an alternative window on this process, we identified the approximately 1500 New Zealand-based researchers named on these proposals and examined their annual publication and citation records from 1996 to 2012. Not surprisingly, many of these researchers participated in multiple proposals over that period and some of them received multiple grants:
• 90% of these researchers submitted two or more preliminary proposals,
• 60% submitted two or more full proposals, and
• 30% received two or more contracts from 2000-2012.
In our regression analysis of these data, we examine the extent to which the publications (and citations to those publications) produced in a given year are affected by the number of Marsden grants received in the previous five years. We restrict observations to those individuals who submitted at least one second round proposal in the preceding five years. The analysis is for the period 2004-2012.
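The key regressor in this individual-level analysis, the number of grants received in the previous five years, is a rolling-window count over each researcher’s grant history. A minimal sketch of that construction, using an invented grant history and hypothetical researcher names:

```python
import numpy as np

# Illustrative sketch of building the "grants in the previous five years"
# regressor for each researcher-year. The grant histories are invented.
years = np.arange(2000, 2013)  # observation years 2000-2012
grant_years = {"researcher_A": [2003, 2007], "researcher_B": [2005]}

def grants_in_prior_five_years(grants, year):
    """Number of grants received in the window [year-5, year-1]."""
    return sum(1 for g in grants if year - 5 <= g <= year - 1)

window_counts = {
    name: [grants_in_prior_five_years(g, y) for y in years]
    for name, g in grant_years.items()
}
# researcher_A received grants in 2003 and 2007, so in 2008 the five-year
# window 2003-2007 contains both grants; in most other years it contains one.
print(window_counts["researcher_A"])
```

In the study’s setup, this count would then enter a regression of annual publications (and citations) on the window count, restricted to researcher-years with at least one second-round proposal in the preceding five years.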
Results – Individual Success
The impact of receiving a funded contract is estimated as an approximate 3-5% increase in publications and a 5-8% increase in annual citations for each of the subsequent five years, relative to what otherwise would have occurred.
Not publishing in the previous five years greatly increases the probability of not publishing in a given year, and the stronger a researcher’s average performance over the past five years, the less likely they are to go unpublished in a given year.
As above, we find that a second-stage ‘Fast-Start’ applicant, irrespective of funding, is on a steeper publication growth trajectory than ‘Standard’ applicants, but the incremental effect of receiving ‘Fast-Start’ funding over ‘Standard’ funding is statistically zero.
Overall, we find that funding is associated with a significant increase in researchers’ scientific output and the apparent impact of that output as measured by subsequent citation. Because the true connection of any single researcher to a Marsden proposal on which they are listed is uncertain, we believe that the project team results of a 6-12% increase in publications and a 13-30% increase in citation-weighted papers are likely the best summary measure of the programme effect.
It is important to emphasize that what is captured here is a general impact on the publication/citation success of the researchers. It seems likely that Marsden funding shifts researchers’ focus to some extent towards the subject of the grant, so that the funding impact on research outputs directly related to the proposal would likely be greater than those estimated here, but our empirical framework does not allow us to measure that. We also cannot determine the extent to which the increase comes from direct use of the Marsden money versus indirect impact of Marsden success on researcher opportunities and resources.
While our initial intention was to include panel rank in the analysis to control for selectivity bias, we find no robust evidence of selection based on likely research success in the Marsden second round. We have tested many different versions of how that selection might operate, including trying both panel scores and raw referee scores, testing for an effect with or without conditioning on prior performance, and testing for a variety of non-linearities in the selection effect. There really seems to be nothing there. Given the significant researcher and RSNZ time and resources that are devoted to second-round selection, this suggests a potentially large misallocation of resources.
Publications and citations are, of course, only proxies for research output. However, we find it hard to describe a plausible conception of the programme’s goals that, if successful, would not produce research that would be expected to be highly cited.
While the study should be interpreted with these caveats in mind, its results suggest three important policy implications:
• The public expenditure on the Marsden Fund is effective in increasing scientific outputs.
• The fact that panel rankings are not predictive of subsequent success implies that if the unfunded projects could have been funded, the benefit of that funding would have been as great as it was for the projects actually funded. This means there is no reason to expect diminishing returns if Marsden funding were increased.
• The significant resources devoted to the second-round evaluation could be reduced without degrading the quality of selection decision-making. More generally, our analysis demonstrates the benefit of retaining and utilizing information on both successful and unsuccessful grant proposals. This basic strategy for identifying the treatment effect in the presence of potential selection bias is powerful in concept but very rarely applied in practice.