Fifteen[1] faculty members from the Department of Mathematics at the University of Toronto submitted proposals to the 2012 NSERC Discovery Grants competition. Of these, one was a first-time applicant (En), two (Ga, Ia) applied after a successful appeal of the 2011 results, and one (Cd) was an appellant whose appeal was denied but who could reapply because the 2011 award was for zero dollars. The first table below shows the 2012 results (in thousands of dollars per year) along with the 2010 and 2011 award amounts for those researchers. The second table shows similar data for Toronto mathematicians in the 2011 competition, including the amounts for researchers (1d, 2d, 3d, 4d) whose appeals of 2011 results were rejected. The average for 2012 awards to Toronto mathematicians was 153% of the average for 2011.

Average Grant Amount (Toronto Math)

  • 2006: \$27k/y
  • 2007: \$26.3k/y
  • 2008: \$26.5k/y
  • 2009: \$25k/y
  • 2010: \$25.3k/y
  • 2011: \$19.3k/y
  • 2012: \$29.5k/y
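
As a quick check, the 153% figure quoted above is just the ratio of the 2012 and 2011 averages listed here (both in \$k/y):

$$\frac{29.5}{19.3} \approx 1.53 = 153\%.$$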

Instability Visualized

Viewing the results from the perspective of researchers in these competitions reveals instability in the Discovery Grants evaluation and appeals processes:

  • Imagine the experience of researcher 2d. This person had five years at \$42k/y, was cut to \$30k/y in 2009, and successfully appealed that outcome. The result of the appeal was a one-year reinstatement of the previous grant level at \$42k/y and permission to reapply to the 2011 competition. In 2011, this researcher's grant was cut again, to \$18k/y, so this person filed another appeal. The 2011 appeal was rejected.
  • Contrast the experience of Professors 4d and Ga. Both launched their Canadian research careers and entered the Discovery Grants competition for the first time in 2011. Ga's appeal of the \$13k/y result from 2011 was successful, and the 2012 competition led to a new result of \$30k/y for the next five years. 4d's 2011 appeal was denied, so this researcher is locked in for five years at \$11k/y. Which of these researchers is likely to have better HQP numbers at renewal time five years from now?
  • The experience of 2011 appellant Ia is also a bit strange. After a long run of celebrated research funded at the \$40k/y level, this researcher's funding was dropped in 2011 to \$15k/y. That outcome was successfully appealed, and the 2012 outcome was \$35k/y.
  • The 2011 appeals by 1d, 2d, 3d, and 4d were all turned down so these researchers are locked in at relatively low funding levels for the next five years.
  • NSERC deviated from standard policy (a “pilot program”) in its handling of the 2011 appeals of Toronto mathematicians. Toronto’s appeals were evaluated by multiple appeals advisers while appeals from other universities were evaluated by one. There is evidence[2] showing that Toronto appeals were denied even when one of the appeals advisers advocated for granting the appeal.

Consistent Results on Appeals Cases Show 2011 Was Anomalous

Proposals by three Toronto researchers were evaluated in both the 2011 and 2012 competitions. The outcomes for these proposals allow a comparison[3] of the two competitions: the accuracy of the merit evaluations by the 2011 and 2012 Evaluation Groups and of the bin-to-funding assignments by the Executive Committee and NSERC staff. Here is the data, including the percentage adjustment from 2011 to 2012:

These three cases provide further evidence, consistent with the message in the public statement signed by over 300 Canadian researchers, that the 2011 evaluations were anomalous. Despite the consensus opinion from the Canadian math/stats community, a public message from a majority of the 2011 Evaluation Group, and advice from top administrators that the 2011 anomalies required an altered appeals process, NSERC chose not to reevaluate the scientific merit of proposals when considering whether to grant or deny an appeal. The grounds for a successful appeal required evidence of administrative errors; evidence of error in merit evaluation was not considered germane.

The Math-NSERC Liaison Committee is collecting data from department chairs about the 2012 competition for Section 1508. It may turn out that 2012 will be viewed as more consistent with expectations and as a more accurate evaluation than 2011. This would be encouraging. However, the unsuccessful 2011 Toronto “pilot program” appellants (researchers 1d, 2d, 3d, 4d) face the next five years with inadequate funding to support their research programs.


Footnotes:

  1. The names of the faculty members are suppressed. The 2012 applicants will be referenced with codes A, B, C, …, H; 2011 applicants (without successful appeal) will be referenced using 1, 2, …, 7. The appended lowercase letters indicate whether the researcher had an appeal granted (a), an appeal denied (d), or was a new applicant (n).
  2. This evidence is contained in the reports of the appeals advisers provided to the appellants by NSERC and also in documents obtained by some of the appellants through formal requests under the Access to Information Act.
  3. It would be useful to know the data (number, success rate, basis for granting) for Discovery Grant appeals submitted to NSERC over the past few years. As far as I can tell, NSERC does not provide this data.

 


The 2011 Winter CMS Meeting took place this past weekend. On Sunday (2011-12-11), there was a CMS Town Hall meeting. The discussion was led by a panel consisting of Long Range Plan Committee Chair Nancy Reid, CMS President Jacques Hurtubise, and CMS Director Johan Rudnick (who was partially obscured by a poinsettia). Attendees at the town hall meeting were fed a free lunch.

Long Range Plan Update

Nancy gave a status update on the long range plan. She reported that the committee met in October and is in the process of writing. They hope to have a draft version of the report available in late February, with a target of final publication in June 2012. Nancy also reported that she and Math-NSERC Liaison Committee Chair Walter Craig had sent a letter of recommendations to NSERC for this year’s Discovery Grants competition. The letter is posted here and also on the LRP web space.

Nancy reported that the overall federal funding envelope for math/stats through NSERC (Discovery Grants, Institutes, …) is around \$21M/y. To function effectively, the Institutes require additional funds from the provinces and other partners. To avoid the departures of talented mathematicians from Canada that were forecast last year, more funding is needed. I asked if the LRP report would contrast the circumstances faced by Canada’s financially threatened math/stats community with the recent \$100M (\$50M federal, \$50M Ontario) gift to the Perimeter Institute. Nancy replied that the LRP report will only address funds granted through NSERC and will not comment on grants given through other sources.

Persistent Concerns about NSERC and Discovery Grants

The open discussion revealed that there remain serious concerns within the Canadian mathematical community about NSERC and the Discovery Grants program. Nancy reported that consultations with NSERC have “not always been easy.” Brett Stevens expressed the view that the weight given to the training of highly qualified personnel in evaluating the merit of proposals was problematic. Nancy relayed that these issues had been raised with Isabelle Blain. Based on those conversations, Nancy predicted that no substantial review of the new system would take place until 2014.

According to Nancy, NSERC staff claims that no discipline other than mathematics has complained about the new peer review system. I relayed that there have been reports by computer scientists and physicists of troubles with the outcomes of their recent competitions. NSERC might not be aware of the troubles in other disciplines but that doesn’t mean they don’t exist. Through the Liaison Committee, the LRP process, the Institutes, meetings of chairs, trans-Canada research collaborations, and meetings of the CMS and SSC, the math/stats community in Canada has strong communications channels. These channels allowed us to see the anomalies in the 2011 outcome through a national lens. I wonder if other disciplines would see problems with the conference model if they too had a wide enough vantage point on the outcome of their recent Discovery Grants competitions.

There was some discussion about how NSERC has moved funds away from the Discovery Grants program into a potpourri of programs supporting commercialization and academic-industrial partnerships. As mentioned in the 2007 report of the International Review panel (which led to the new peer review system), pure mathematics is unfairly punished when basic research funds are redirected toward short-term commercialization goals.

Eddie Campbell isolated an issue that our community must confront. Should NSERC fund many smaller grants or fewer, larger grants? There is a fixed amount of money in the federal budget for mathematics. How should those funds be spread out? Jacques mentioned that “spreading peanut butter” is a frequent metaphor in discussions around this issue.

Transparency

I highlighted Minister Tony Clement’s call for federal government transparency. With this background, I asked Nancy if the LRP could arrange for the release of NSERC President Suzanne Fortier’s slides from her public presentation at the Summer 2011 CMS meeting. Nancy replied that the LRP has the slides but has been instructed not to circulate them. Nancy also reported that the LRP had requested data regarding the appeals numbers and success rate. She said the success rate was running around 25%, but the LRP had not yet received the requested data. In light of Mr. Clement’s call for federal government transparency, I wonder why the data presented publicly by NSERC’s President to the Canadian Mathematical Society and requested by the LRP remains concealed from public view.

Immigration Policy Concerns

David Pike expressed frustration that Canada’s immigration rules prevent international graduate students from seeking permanent residency. I didn’t quite understand the details. David reported that the policy is seriously affecting his finishing PhD student, who wishes to stay in Canada but will probably be forced to leave. Canada and the provinces invest heavily in the training of international graduate students. David reported that current immigration policy prevents Canada and the provinces from benefiting from this investment, since these highly qualified graduates are often required to leave after earning their PhD.

The CMS town hall meeting was a great opportunity for discussion among members of the Canadian math/stats community. I look forward to the LRP report and am grateful for all the hard work that Nancy Reid and the committee have done.

 


Anomalous results of the 2011 NSERC Discovery Grants competition in mathematics have provoked a loss of confidence in the NSERC peer review system. To avoid a substantial loss of Canada’s scientific talent, which has been enhanced through the Canada Research Chairs program and other spectacular hiring over the past ten years, scientific policymakers need to quickly fix the broken peer review system. Without an effective peer review process setting the strategy for research investment, Canada will miss out on the rewards of the past decade’s recruitment of scientific talent.

What is happening in other sciences? Anecdotal reports from the following sources suggest the anomalies are not restricted to the math department at Toronto:

  • Toronto: MATH, CHM, EEB, PHY, STA, Engineering
  • UBC: CS, MATH
  • Queens: MATH

What happened in other disciplines? I’d like to know, but NSERC won’t reveal the 2011 data until after the federal election. I’d like to hear from other scientific disciplines about their confidence in the recently transformed NSERC peer review system.

In 2007, NSERC commissioned a review by an international committee, culminating in this report. (Please find my annotated version here and a marked-up version of the NSERC Management response to the 2007 International Review Committee Report.) This report made recommendations leading to fundamental changes in the peer review process for all disciplines starting in 2009. The implementation of these changes (involving the so-called conference model and binning system) and other forces have provoked a loss of confidence in the peer review process at NSERC among mathematicians at Toronto and across Canada.

Toronto Math Results are Anomalous

The results (names omitted) of the 2011 NSERC Discovery Grants Competition for the Department of Mathematics at the University of Toronto are anomalous:

  • Professor A. \$29k/y to \$18k/y
  • Professor B. \$40k/y to \$15k/y
  • Professor C. \$42k/y to \$30k/y to \$42k/y to \$18k/y
  • Professor D. \$26k/y to \$18k/y
  • Professor E. \$40k/y to \$40k/y
  • Professor F. \$38k/y to \$47k/y
  • Professor G. \$0/y to \$0/y
  • Professor H. \$15k/y to \$13k/y

(The numbers show the annual research grant amount for the past 5 years and the new amount for the next 5 years. For a description of how mathematicians use these funds, go here.)

About Professor C.

Consider the case of CMS award-winning Professor C. In 2010, this researcher’s grant was cut from \$42k/y down to \$30k/y. After an appeal, the grant was reinstated for one year at \$42k/y. In this year’s competition, one year after the appeal, NSERC dropped it to \$18k/y, a 57% cut. Meanwhile, Professor C’s frequent collaborator (each had more than 50% overlap of their research with the other during 2006-2011), Professor K., received \$45k/y, staying at 100% of the previous level in this competition. Will the real opinion of NSERC on Professor C’s research please stand up? Professor C’s story is quite similar to Don Fraser’s personal account. (Within the conference model, I understand that a different group of only 5 experts may have reviewed the proposals of Professors C and K. This may account for the inconsistency, but it points to larger issues that need to be fixed.)
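
For reference, the 57% figure is simply the relative size of the drop from \$42k/y to \$18k/y:

$$\frac{42-18}{42} = \frac{24}{42} \approx 0.57 = 57\%.$$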

About Professor B.

Imagine running a successful research operation for the past five years, like Professor B, on \$45k/y: former students winning awards, 13 major publications in 2006-2011. Students are in the pipeline; postdoc candidates have been scouted; Professor B has new ideas. NSERC rewards this person with a drop from \$45k/y down to \$18k/y, a 60% cut. This researcher is confused by the outcome: “What did I change? What should I have done differently?”
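
Similarly, the 60% cut follows from the amounts quoted in this paragraph:

$$\frac{45-18}{45} = \frac{27}{45} = 0.60 = 60\%.$$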

About Professor G.

This person is a (perhaps the) world-leading expert on a substantial research area. It seems this person, despite spectacular research success, is unworthy of a Discovery Grant because they don’t produce enough students. It is as though Canada has a sports car and refuses to put tires on it.

Secondary Effects Scenarios

Mathematicians of international calibre, with a steady stream of research output and surrounded by young researchers, have had their grants slashed by nearly 60% in the 2011 NSERC Discovery Grants competition.

  • Now, imagine you are an assistant professor in Canada. You might have nice support right now, like a Sloan or an ERA. You are building a research group, spending money on HQP, scouting talent. But your funding has a finite time horizon and the Discovery Grants look unstable, unpredictable. So, would you leave Canada if you could? Fix NSERC or the young talent will leave Canada.
  • Now, imagine you are a recently recruited Canada Research Chair: if you just concluded that your junior faculty member might be wise to leave Canada, how do you see your department in 10 years? Would you leave Canada if you could? Fix NSERC or the CRCs will leave Canada.

The system is broken and needs to be fixed.

There are (at least) two main problems:

  • Math in Canada is treated unfairly compared to other disciplines
  • The Peer Review system is broken

Math in Canada is treated unfairly

The main problem with mathematics funding in Canada is that the amount invested is too low. I’ve written about this before. Consider the 2009 data for NSERC Discovery Grants (2010 is similar; 2011 is not available) across the disciplines:

2009 NSERC Data across Disciplines

The average math and stats grant is \$20k/y while the average over all disciplines is \$41k/y. Why is it that the average scientist in Canada can expect more than double what a Canadian mathematician can? Keep in mind that Discovery Grants are primarily used to fund research personnel, not expensive labs.
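
Making the “more than double” comparison explicit, the ratio of the two averages is

$$\frac{41}{20} = 2.05.$$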

David Wehlau’s data (posted and discussed here) reveals the trend: over the past twenty years, mathematics’ share of total Discovery Grants funding has declined from nearly 4% down to 2%. Math has received less and less funding compared to other disciplines. This subtle reallocation needs to be abruptly reversed.

The unfairness then multiplies. Consider the following snippet from the 2007 international review report:
[Snippet from the 2007 International Review report, “MathDGPUnfair”]

Other disciplines benefit more from industrially targeted NSERC programs and other sources than pure mathematicians do. NSERC views Discovery Grants as grant-in-aid: a precursor grant leading to other sources of funds. The International Review Committee reports that this is not the case for mathematics, and that mathematicians receive on average less than half the funds received by other scientists. This is an implicit reallocation of funding away from mathematics toward other disciplines, and it is unfair.

Broken Peer Review System

The outcome of the 2011 competition, and consistent reports (like Don Fraser’s) from the past two years, have provoked a loss of confidence in the peer review system at NSERC. To rebuild trust and avoid the departure of talented scientists, the mathematics community of Canada needs to know what happened in 2011. We need to understand why the peer review system produced the 2011 funding allocations.

There has been considerable chatter in the mathematics community about the 2011 competition. However, we need people with official roles to speak officially at this time. In addition to the forthcoming data from NSERC, I hope that Section 1508, the Mathematics and Statistics Evaluation Committee, will explain the 2011 evaluation process and actively participate in discussions leading to an improved system that regains the confidence of mathematicians and statisticians working in Canada. What happened? How can we fix it?

 
