Casualties of NSERC peer review.

Two courageous colleagues have spoken up about the results of the recent Discovery Grant review from NSERC, which was based on new evaluation criteria. Both had previously been funded, and it is hard to imagine how these results are sensible. I think sharing stories like this is very important because otherwise a) researchers can become very discouraged, and b) systemic problems with the review system may go unnoticed for a long time. As Dr. Nancy Reid (University of Toronto) stated, individuals in this situation should not “hide in the office feeling you’ve done something wrong”. I am planning to share some of my own experiences with NSERC peer review soon. But in the meantime, you can read the plea from Dr. France Dufresne (Université du Québec à Rimouski) here, and Dr. Reid has graciously allowed me to repost her message below.

As news about the results of this year’s discovery grant competition trickles out, the list of alarming stories seems to grow. Here is my story.

My DG grant was cut this year from $48,000 to $25,000. Grants in statistics tend to be low, and our GSC (14) is perpetually short of money. My grant of $48k was the largest among those re-applying this year, and among the top 5 or so in the country. In the last few years, most of the top grants have been cut to ease the pressure on junior people who need to move up through the system. So I expected to see a decrease in my grant.

But, I didn’t expect to be knee-capped. This is just $1k short of the maximum cut allowed (50%).

My first reactions were deeply personal: I assumed that my standing, my research, and my training of HQP had been found wanting, I assumed that this was probably a correct judgment, and I started asking myself whether it was time to think about retiring. I asked myself why a group of my peers felt that it was necessary to deliver such a blow, even if I was judged to be slipping. I was embarrassed to tell even my closest friends what had happened.

But wait! The referees’ reports are quite positive! I looked at my proposal again: it was not bad! I had a long list of HQP! I had won a national and an international award! In fact, my proposal, my record of publication, training and achievement are very similar to past competitions.

This year, as we all know, NSERC instituted a completely different mechanism for determining grant amounts. They will of course be monitoring the situation, and assessing what worked and what didn’t. On their web page we read:

The emphasis on quality assessment under the new common rating system has achieved the desired objectives. It preserves continuity of funding for the most productive researchers who maintain a strong record of contributions to research and training. It also permits a more rapid ramp up of funding for applicants with superior accomplishments and research plans, no matter their history in the system.

Sounds admirable. But is it working?

We won’t know unless we share our stories. And NSERC won’t know unless it hears from us. Don’t hide in your office feeling that you’ve done something wrong. If there is any doubt in your mind about the adequacy of the review process and the outcome for your Discovery Grant, submit an appeal. And share your story.

I suppose it should be pointed out that the writing, reviewing, and denial of the proposals from these two productive researchers cost taxpayers about $80,000 according to one recent analysis. Again, we should ask whether the peer review system is working as well as it should, or indeed whether it should be abandoned in favour of baseline grants for all qualified researchers in Canada.

If you have a similar experience to share, I encourage you to visit Don’t Leave Canada Behind and post your story.

