External Peer Review
Understanding the process of how a funding proposal is adjudicated will help you to craft a stronger proposal – one that is aligned with what the particular funding agency is looking for.
There are some fundamental principles about peer review processes, but each funding agency will also have its own specific criteria and processes. The resources on this page may help you think about the fundamental principles in new ways and will also direct you to agency-specific information.
Tip: Taking a quick scan through a wide range of agencies’ documentation on peer review can help you to develop a broader understanding of the perspectives that your readers may be coming from.
The 'Three Rights' Principle
To secure funding, you need to submit the right proposal for the right program at the right agency.
Reviewers are interested in identifying proposals that are closely aligned with the funding agency’s interests and with the goals of the specific funding program. If a reviewer thinks that your proposal is very interesting and very important, but not well aligned with the goals of the program you’ve applied to, chances are they will enthusiastically endorse the work… and recommend that you resubmit it to a different agency or program.
Action: Step outside your own perspective and take the perspective of the funding agency. In your proposal, focus on showing how your efforts will meet their needs, if they choose to fund you.
The 'If It Isn’t in the Proposal…' Principle
Reviewers decide based on what you include in the proposal, not what you meant to include.
Funding agencies rarely ask for more information or clarification after they receive your application. Even where their process allows such follow-up, they will only pursue it if what you initially provided is complete enough for them to understand its significance.
Action: Put extra effort into making your proposal as complete a stand-alone document as possible. Define your key terms (even if “everyone” knows them, they may not use them the way you do), provide the necessary background information, and convince them that you have already worked out the details of what you plan to do from the moment you receive the funding through to the project wrap-up.
The 'Goldilocks' Principle
Too much or too little emphasis on any part of the proposal will hurt your chances; ensure every part is “just right”.
Most funding agencies place strict limits on application length, and it can be difficult to decide how much emphasis to place on each component of the proposal. If you could easily fill 30 pages with a review of the relevant literature, how do you condense this for a program that has a 6-page limit for the whole proposal? It can be tempting to gloss over the components that seem less important to you so that there will be more space for other requirements. However, agencies are increasingly using checklist, rather than holistic, approaches to evaluating proposals. As a result, devoting too much or too little space to any one component can sink an otherwise strong proposal.
Action: Use the instructions, evaluation criteria, and (if available) relative weightings for each section to help you judge how much emphasis will be “just right” for each required component of the application.
The 'See Your Proposal as Others See It' Principle
This could also be called the Outside Perspective Principle, as it emphasizes the importance of the reviewer’s beliefs and values and the fact that these may be very different from yours.
Not everyone looks at the world the way you do or shares the same beliefs about what is most important. In fact, most of the people making decisions about your proposal will have chosen to devote their research careers to different questions and topics that they may believe are more important than the questions you have chosen. It’s important to consider how/why a reviewer could miss the importance of your chosen field and to guide them to see your perspective.
Action: Be both explicit and honest about how the world will be a better place if your proposed research goes ahead. A “better world” may include anything from richer understanding of a particular theory to a cure for cancer – it’s up to you to make the case for the realistic contribution that your research will make. Keep in mind, though, that satisfying your personal curiosity and helping you do additional research that would make you happy are generally not considered improvements to the overall state of the world!
Canadian Tri-Agency Resources
SSHRC's Peer Review Process
Some SSHRC programs make use of external reviewers (experts in a specific discipline who are asked to review one or a small number of proposals in that discipline and to provide written feedback); some use only the reviews of members of the adjudication committee. In most cases, committee members assigned to review (“read”) a proposal will be in the room when the proposal is discussed, but, in the case of some interdisciplinary proposals, one of the readers may be from a different committee and may not be present for the discussion.
SSHRC now asks reviewers to use criteria that are very closely aligned to the “evaluation criteria” listed in the instructions for each competition. In fact, they essentially create a checklist using the bullet points from the listed criteria and ask reviewers to rate proposals as Excellent, Very Good, Good, or Not Fundable on each point. For proposal writers, this is helpful, as it means you know exactly how your proposal will be evaluated. The next step is for you to take advantage of this knowledge by explicitly cross-checking your drafts against such a checklist and asking colleagues to also give you checklist-related feedback.
NSERC's Peer Review Process
NSERC makes its Peer Review Manual (the instructions given to those who serve as peer reviewers) publicly available. While some of the information relates to details such as timelines, there is also information that is very useful in helping proposal writers to take the perspective of reviewers as they write.
NSERC typically makes use of external reviewers (referees). They have a dynamic committee structure such that the people sitting around the table to discuss proposal A may not be exactly the same group as is present for discussion of proposal B. NSERC’s intent is to match the most appropriate reviewers to each proposal.
CIHR's Peer Review Process
Overview on CIHR’s Peer Review Process: http://www.cihr-irsc.gc.ca/e/37790.html
CIHR Peer Review Manual for Grant Applications: http://www.cihr-irsc.gc.ca/e/4656.html
CIHR Policies and Procedures on Peer Review: http://www.cihr-irsc.gc.ca/e/39414.html
Depending on the program, CIHR may or may not make use of external reviewers. However, it provides detailed information on its peer review processes on its website. Note that over the next several years, there will be significant changes to CIHR’s funding programs and review processes, which may affect some of the information in the links above.
Other Resources on Peer Review
Canadian Funding Agencies
Grand Challenges in Global Health: http://www.grandchallenges.org/ABOUT/Pages/Overview.aspx
Canada Research Chairs Program: http://www.chairs-chaires.gc.ca/program-programme/nomination-mise_en_candidature-eng.aspx#review
US Governmental Funding Agencies
Institute of Education Sciences (USA) Peer Review Information: http://ies.ed.gov/director/sro/peer_review/application_review.asp
National Institutes of Health (USA) Peer Review Information: http://grants.nih.gov/grants/peer_review_process.htm
Center for Scientific Review (USA) information on the National Institutes of Health Peer Review Process: http://cms.csr.nih.gov/resourcesforapplicants/insidethenihgrantreviewprocessvideo.htm
European Funding Agencies
European Science Foundation: http://www.esf.org/fileadmin/FlipBooks/Peer_Review/peer_review.html
European Research Council: http://erc.europa.eu/evaluation-panels
UK Arts and Humanities Research Council: http://www.ahrc.ac.uk/Pages/Home.aspx
UK Economic and Social Research Council: http://www.esrc.ac.uk/funding-and-guidance/guidance/peer-rapport/index.aspx
UK Engineering and Physical Sciences Research Council: http://www.epsrc.ac.uk/Pages/default.aspx
Funding from Other Regions
Australian Research Council: http://www.arc.gov.au/general/assessment_process.htm and http://www.arc.gov.au/general/peer_consultation.htm
Global Summit on Merit Review: http://www.globalresearchcouncil.org and www.nsf.gov/news/newsmedia/globalsummit/gs_principles.pdf
Michael Smith Foundation for Health Research (MSFHR): http://www.msfhr.org/funding
Research and Commentary on the Peer Review Process for Research Funding
The References Cited section of CIHR’s Design Discussion Document (2012) contains many references related to research and commentary on different approaches to peer review (http://www.cihr-irsc.gc.ca/e/documents/design_discussion_doc-en.pdf). Here are some examples:
Association of American Universities. (2011). Understanding Peer Review of Scientific Research: Association of American Universities. (http://www.aau.edu/publications/reports.aspx?id=6900)
Blair, B., Cline, G. R., & Bowen, W. R. (2007). NSF-Style Peer Review for Teaching Undergraduate Grant-Writing. American Biology Teacher, 69(1), 34-37. (http://www.bioone.org/toc/ambt/69/1)
Demicheli, V. & Di Pietrantonj, C. (2008). Peer review for improving the quality of grant applications. The Cochrane Library, 2, 1-15.
Frodeman, R., & Briggle, A. (2012). The Dedisciplining of Peer Review. Minerva: A Review of Science, Learning and Policy, 50(1), 3-19.
Hansson, F., & Monsted, M. (2012). Changing the Peer Review or Changing the Peers--Recent Development in Assessment of Large Research Collaborations. Higher Education Policy, 25(3), 361-379. (Abstract: http://www.palgrave-journals.com/hep/journal/v25/n3/abs/hep201217a.html)
Ioannidis, J. P. A. (2011). [Comment]. More time for research: Fund people not projects. Nature, 477, 529-531. (http://www.nature.com/nature/journal/v477/n7366/full/477529a.html)
Graves, N., Barnett, A. G., & Clarke, P. (2011). Funding grant proposals for scientific research: Retrospective analysis of scores by members of grant review panel. British Medical Journal, 343, 1-8. (http://www.bmj.com/content/343/bmj.d4797)
Obrecht, M., Tibelius, K., & D’Aloisio, G. (2007). Examining the value added by committee discussion in the review of applications for research awards. Research Evaluation, 16(2), 79-91. (www.rev.oxfordjournals.org/content/16/2/79.full.pdf)
Vener, K. J., Feuer, E. J., & Gorelic, L. (1993). A statistical model validating triage for the peer review process: keeping the competitive applications in the review pipeline. Federation of American Societies for Experimental Biology (FASEB) Journal, 7, 1312-1319. (www.fasebj.org/content/7/14/1312.full.pdf+html)
Cole, S., Cole, J. R., & Simon, G. A. (1981). Chance and consensus in peer review. Science, New Series, 214(4523), 881-886. (www.columbia.edu/cu/univprof/.../1981Chance_and_Consensus.pdf)
Mayo, N. E., Brophy, J, Goldberg, M. S., Klein, M. B., Miller, S., Platt, R. W., & Ritchie, J. (2006). Peering at peer review revealed high degree of chance associated with funding of grant applications. Journal of Clinical Epidemiology, 59, 842-848. (Abstract: http://www.ncbi.nlm.nih.gov/pubmed/16828678)
Cicchetti, D. V. (1991). The reliability of peer review for manuscript and grant submissions: A cross-disciplinary investigation. Behavioural and Brain Sciences, 14, 119-186. (Abstract: http://journals.cambridge.org/action/displayAbstract?fromPage=online&aid=7250552)
Marsh, H. W., Jayasinghe, U. W., & Bond, N. W. (2008). Improving the peer-review process for grant applications: Reliability, validity, bias and generalizability. American Psychologist, 63(3), 160-168. (http://psycnet.apa.org/journals/amp/63/3/) [N.B. This study was conducted using education and psychology grant proposals.]
Roberts, T. J., & Shambrook, J. (2012). Academic Excellence: A Commentary and Reflections on the Inherent Value of Peer Review. Journal of Research Administration, 43(1), 33-38. (http://www.srainternational.org/sra03/template/tntbjour.cfm?id=2424)
van Arensbergen, P., & van den Besselaar, P. (2012). The Selection of Scientific Talent in the Allocation of Research Grants. Higher Education Policy, 25(3), 381-405. (Abstract: http://www.palgrave-journals.com/hep/journal/v25/n3/abs/hep201215a.html)