Anomalous results of the 2011 NSERC Discovery Grants competition in mathematics have provoked a loss of confidence in the NSERC peer review system. To avoid a substantial loss of Canada’s scientific talent, which has been enhanced through the Canada Research Chairs program and other spectacular hiring over the past ten years, scientific policymakers need to fix the broken peer review system quickly. Without an effective peer review process setting the strategy for research investment, Canada will miss out on the rewards of the past decade’s recruitment of scientific talent.

What is happening in other sciences? Anecdotal reports from the following sources suggest the anomalies are not restricted to the math department at Toronto:

- Toronto: MATH, CHM, EEB, PHY, STA, Engineering
- UBC: CS, MATH
- Queen’s: MATH

What happened in other disciplines? I’d like to know, but NSERC won’t reveal the 2011 data until after the federal election. I’d like to hear from other scientific disciplines about their confidence in the recently transformed NSERC peer review system.

In 2007, NSERC commissioned a review by an international committee culminating in this report. (Please find my annotated version here and a marked up version of the NSERC Management response to the 2007 International Review Committee Report.) This report made recommendations leading to fundamental changes in the peer review process for all disciplines starting in 2009. The implementation of these changes (involving the so-called *conference model* and *binning* system) and other forces have provoked a loss of confidence in the peer review process at NSERC among mathematicians at Toronto, and across Canada.

## Toronto Math Results are Anomalous

The results (names omitted) of the 2011 NSERC Discovery Grants Competition for the Department of Mathematics at the University of Toronto are anomalous:

- Professor A: 29 to 18
- Professor B: 40 to 15
- Professor C: 42 to 30 to 42 to 18
- Professor D: 26 to 18
- Professor E: 40 to 40
- Professor F: 38 to 47
- Professor G: 0 to 0
- Professor H: 15 to 13

(The numbers are the annual research grant amounts, in thousands of dollars, for the past 5 years followed by the new amount for the next 5 years. For a description of how mathematicians use these funds, go here.)
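The scale of these changes can be tabulated with a short script. This is an illustrative sketch only, assuming the figures above are annual amounts in thousands of dollars; Professor G (0 to 0) is omitted because a percentage change from zero is undefined:

```python
# Percentage change in annual Discovery Grant level for each anonymized
# professor listed above: (previous 5-year level, new 5-year level) in $K/y.
grants = {
    "A": (29, 18),
    "B": (40, 15),
    "C": (42, 18),  # final figures of the 42 -> 30 -> 42 -> 18 sequence
    "D": (26, 18),
    "E": (40, 40),
    "F": (38, 47),
    "H": (15, 13),
}

for prof, (old, new) in sorted(grants.items()):
    change = 100.0 * (new - old) / old
    print(f"Professor {prof}: {old} -> {new}  ({change:+.1f}%)")
```

For example, Professor C’s 42 to 18 works out to a 57.1% reduction, matching the “57% cut” described below.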

### About Professor C.

Consider the case of *CMS award winning* Professor C. In 2010, this researcher’s grant was cut from 42 down to 30. After an appeal, the grant was reinstated for one year at 42. In this year’s competition, one year after the appeal, NSERC drops it to 18, a 57% cut. Meanwhile, Professor C’s frequent collaborator (each had more than 50% overlap of their research with the other during 2006–2011), Professor K., received 45, staying at 100% of the previous level in this competition. Will the real opinion of NSERC on Professor C’s research please stand up? Professor C’s story is quite similar to Don Fraser’s personal account. (Within the conference model, I understand that a different group of *only* 5 experts may have reviewed the proposals of Professors C and K. This may account for the inconsistency, but it reveals aspects of larger issues that need to be fixed.)

### About Professor B.

Imagine running a successful research operation (successful: former students winning awards, 13 major publications during 2006–2011) for the past five years like Professor B, on $40K/y. Students are in the pipeline; postdoc candidates have been scouted; Professor B has new ideas. NSERC rewards this person with a drop from 40 down to 15, a 62.5% cut. This researcher is confused by the outcome: “What did I change? What should I have done differently?”

### About Professor G.

This person is a (perhaps *the*) world-leading expert in a substantial research area. It seems that this person, despite spectacular research success, is deemed unworthy of a Discovery Grant because they don’t produce enough students. It is as though Canada owns a sports car but won’t put tires on it.

## Secondary Effects Scenarios

Mathematicians of international calibre, with a steady stream of research output and surrounded by young researchers, have had their grants slashed by roughly 60% in the 2011 NSERC Discovery Grants Competition.

- Now, imagine you are an assistant professor in Canada. You might have nice support right now, like a Sloan or an ERA. You are building a research group, spending money on HQP, scouting talent. But your funding has a finite time horizon, and the Discovery Grants look unstable and unpredictable. So, would you leave Canada if you could? **Fix NSERC or the young talent will leave Canada.**
- Now, imagine you are a recently recruited Canada Research Chair: if you just concluded that your junior faculty member might be wise to leave Canada, how do you see your department in 10 years? Would you leave Canada if you could? **Fix NSERC or the CRCs will leave Canada.**

The system is broken and needs to be fixed.

There are (at least) two main problems:

- Math in Canada is treated unfairly compared to other disciplines
- The Peer Review system is broken

## Math in Canada is treated unfairly

The main problem with mathematics funding in Canada is that the amount invested is too low. I’ve written about this before. Consider the 2009 data on NSERC Discovery Grants (2010 is similar; 2011 is not yet available) across the disciplines:

The average math and stats grant is $20K/y, while the **average over all disciplines** is $41K/y. Why can the average scientist in Canada expect more than double what a Canadian mathematician can expect? Keep in mind that Discovery Grants are primarily used to fund research personnel, not expensive labs.

David Wehlau’s data (posted and discussed here) reveal the trend: over the past twenty years, mathematics’ share of total Discovery Grants funding has declined from nearly 4% to 2%. Math has received less and less funding compared to other disciplines. This subtle reallocation needs to be abruptly reversed.

**The unfairness then multiplies.** Consider the following points drawn from the 2007 international review report:

Other disciplines benefit more from industrially targeted NSERC programs and other sources than pure mathematicians do. NSERC views Discovery Grants as *grant-in-aid*: a precursor grant leading to other sources of funds. The international review committee reports that this is not the case for mathematics, AND mathematicians receive on average less than half the funds received by other scientists. This is an implicit funding reallocation away from mathematics toward other disciplines, and it is unfair.

## Broken Peer Review System

The outcome of the 2011 competition, and consistent reports (like Don Fraser’s) from the past two years, have provoked a loss of confidence in the peer review system at NSERC. To rebuild trust and avoid the departure of talented scientists, the mathematics community of Canada needs to know what happened in 2011. We need to understand why the peer review system produced the 2011 funding allocations.

There has been considerable chatter in the mathematics community about the 2011 competition. However, **we need people with official roles to speak officially at this time**. In addition to the forthcoming data from NSERC, I hope that Section 1508, the Mathematics and Statistics Evaluation Group, will explain the 2011 evaluation process and actively participate in discussions leading to an improved system that regains the confidence of mathematicians and statisticians working in Canada. What happened? How can we fix it?


I reported Professor B’s numbers incorrectly as 45 to 18. The correct numbers are 40 to 15. Correction is made in the post above.


To us, holding NSERC funding does not simply mean having the financial support necessary to conduct high-quality research; its scale should also truly represent a researcher’s standing in their own discipline and community. The funding scheme shouldn’t change dramatically within a short period, not even across two consecutive years. There should be an allocation scheme that is stable for a reasonable period, say 5 years, so that most people are evaluated under the same or a similar framework regardless of when they apply. The outcome of this year sends only this message: it doesn’t matter HOW you do research; it matters WHEN you apply for your grant. Your fate is not determined by how good you are; it is determined only by how lucky you are.

Lack of sufficient budget for the math/stats community has always been a main issue; but if NSERC can’t resolve this problem, they should at least not manufacture additional chaos for the community. They should at least do the basic job of ensuring that the relative rankings between research profile and dollar amount are fair; otherwise, top researchers are doubly hurt: insufficient (or even no) funding, and no objective measure or recognition of their research profiles.

Nancy Reid recently shared with me a letter by Heinz Bauschke at UBC Okanagan. Bauschke’s letter resonates with remarks I heard today from Joe Repka, attributed to unnamed sources at Lakehead University. Basically, the message is that smaller institutions are unfairly disadvantaged by the HQP requirements in the current proposal-to-bin algorithm imposed upon the evaluation group by NSERC. Bauschke’s letter makes these points very eloquently and provides a different perspective on the main issues the math community in Canada faces:

1. There is insufficient funding for mathematical research activity in the system.

2. “The evaluation process needs an infusion of common sense,” to quote Heinz Bauschke.

Mathematics in Canada will not thrive if only the bigger universities like Toronto, UBC, and McGill get the funding needed for research. We need a nation full of colleges and universities transforming high school kids into scientists, engineers, and mathematicians.


My understanding is that the real problems appearing over the last couple of years in the DG program have relatively little to do with the recommendations of the international review of the program, and much more to do with the federal government’s desire to reduce the success rate of the program, despite the review stating that “the [formerly] relatively high success rate of DGP applications is not incompatible with, and in fact encourages, a high degree of research excellence across a broad range of fields.”



The main problem with NSERC is the weakness of the grant committee member selection process. The panel leaders look weak and so do the members in both math and CS. This issue seems to be endemic in all Canadian grant agencies, including NSERC, CIHR, etc. Arbitrary people are making arbitrary decisions.

I believe the problems are more ramified. An evaluation group comprised of Gauss, Euler, Hilbert, Poincaré …would have produced anomalous outcomes as well using the conference model, the bins, and the constraints imposed by NSERC. The system fails to turn the expertise of the panel, no matter how strong, into a fair research investment strategy.

Except for a couple of names, the members of the panel do not ring any bells.

Purely formally, in mathematics, AFAIK: not a single FRSC. Not a single (former) ICM speaker. Some are professors at third- and fourth-rate American universities (why do we need them? why is there nobody from MIT, Harvard, Princeton, …?).

And these people are supposed to decide the grants of mathematicians of much higher standing? While I agree that the peer review system is universally broken (from my experience as a referee for many top-notch journals), this is not even peer review, as “peer” means “equal.”

I disagree, Jim, with your statement about an evaluation group comprised of Gauss, Euler, Hilbert, Poincaré, … or even of leading Canadian mathematicians, as those are much broader.

I remember when I was asked to evaluate NSF grants, I was asked whether the application was excellent, very good, good, satisfactory, fair, or poor. NSERC instead uses exceptional, outstanding, very strong, strong, moderate, or insufficient. Catch 1: it sounds complimentary to be evaluated as “outstanding,” but it is only the second tier. Catch 2: referees are not asked these questions; these marks are exclusively in the hands of NSERC panelists.

Also: there is a discrepancy between the lists of mathematical specialties in Forms 180 and 101, where the former is modern and much better reflects today’s realities, and the latter is a legacy of the Math/Applied Math committees. I noticed this discrepancy almost two years ago and notified NSERC; I was promised it would be fixed in August 2009 (it was not), then in 2010 (it was not either). Incompetence or malice? If a mathematical specialty is not mentioned in Form 101, then it is treated as less important and may never be represented by a member of the panel.

Victor, thanks for your comment. We are faced with a system with flaws on many fronts. My comment above was not a suggestion that the panel should not be filled with eminent researchers. Instead, I argued that the structure of the system fails to convert any panel’s expertise into good funding decisions. I like the explicit suggestions you made:

1. The Math/Stats Evaluation Group should be strengthened, e.g. chairs should be full professors, some FRSC members, leading researchers from outside Canada.

2. The labels used by the EG to place proposals into bins should be explicitly shared with external referees commenting on the quality of the proposal.

3. The labels used should be reevaluated: “Exceptional” and “Outstanding” are two words whose ordering is not so clear in English but which have significantly different impacts on funding level.

4. Forms 180 and 101 should be examined for discrepancies and improved.

These suggestions should also be considered by the LRP: http://longrangeplan.ca/

Well done, Viktor. I have tried to fight the system with a 1-4 score. Happy that you are still optimistic. Lev

The problem is worse and deeper than what you are showing here.

What are the odds of a mathematics department split cleanly into three distinct groups: one third spoiling for any fight, one third sitting on any fence and one third doing their best to do anything decent? What are the odds of the same department being split in a 50-50 vote, with precision, in a presumed democracy? What are the odds of the same math department flirting with an attrition rate in female grad students of a little under … you guessed it … one third of each year’s incoming.

What are the odds of a 7 billion CND drug lord whose mind is mathematically incompetent and whose “orders” seem mathematically educated complaining about two monkeys on his back? What are the odds of an entire city “voluntarily” spending from a little over 50% on housing in the mid-1990s to 92% on housing by 2011? Omit telling me anyone imagines an entire city is incapable of knowing better. What are the odds of the same drug lord demonstrating communication incompetence in 2000 and seeming to know how to communicate in 2011, and also seeming to be learning disabled?

So, yeah, I think we mathematicians may have a bit of a problem inside our mathematical community … just a guess? I think our government clued into this fact during the 1990s since October 1994 is when I saw and confirmed the NSERC-driven trend in funding cuts you are reporting now.

Roll with me here: allow this question … is it possible we have a psychopath mathematician in Canada, and if so, how do we know whom? How do we know whether to reject the question? We could let government funding deteriorate forever, that could be the reasonable thing to do.

What are the odds that Jonathan Borwein, Peter Borwein and myself one day, on a lark, implemented our correction to two of Aristotle’s errors in physical reality and on paper … seemingly on a whim in 1996? Jon and Peter have otherwise expressed little interest in logic and Aristotle, generally speaking.

What are the odds that Canada was recently removed from the UN Security Council?

What part of : the mathematician who corrected the definition of _if_ seems outside the field of mathematics in Canada : is supposed to make sense?

What part of : the Vancouver Sun seeming confused about what constitutes democratic, public commenting online : is supposed to make sense?

Jonathan Borwein’s genius includes constructing organizations which achieve excellent results by design. Jon Borwein’s organizational success pre-1993 is in Dalhousie University. Jon Borwein’s organizational success post-2003 is in Dalhousie University and in the University of Newcastle. I was in the Center for Experimental and Constructive Mathematics during 1995 and I remember Jon Borwein designing and constructing the Pacific Institute for the Mathematical Sciences (PIMS) and Mathematics of Information Technology and Complex Systems (MITACS). Jon and Peter like working together very much, which is rare for brothers, so why is Jon Borwein in Australia?

Why are Vancouver’s housing prices by far the worst in Canada?

Look at what happened in America. When the economy adjusted, the housing bubble popped.

What type of human mind would it take to make all this seem like a coincidence? Vancouver has the resources, and intelligence, to be able to feed and shelter itself. Why the fashion trend in Vancouver women from the mid-1990s to 2011, showing more skin and being less assertive?

Gee, gosh, golly: what is in Vancouver that seems so different from the rest of Canada, that has the United Nations and NSERC seeming to pay attention?

How much funding do Canadian mathematicians want to lose before starting to pay attention?

This is real. I play chess in public. Who likes poker? I like chess.

Prokhorov/Overington

Student of Ivan Matveevich Vinogradov, Jonathan Borwein, Peter Borwein, Robert Ansell, Bob Pare, Richard Nowakowski, Robert Dawson :: top third of contestants in the Putnam competition in 1993 :: what are the odds that I would be asking these questions in public instead of doing something else?

How loud, how much money gone, how many hit by the economy, before Canadian mathematicians pay attention?