The Case of the Boiling Frogs: Provincial Indifference to Declining Education Outcomes

Summary:
Citation: John Richards. 2025. "The Case of the Boiling Frogs: Provincial Indifference to Declining Education Outcomes." Commentary ###. Toronto: C.D. Howe Institute.
URL: https://cdhowe.org/publication/the-case-of-the-boiling-frogs-provincial-indifference-to-declining-education-outcomes/
Published Date: March 25, 2025
  • In terms of national scores in reading, mathematics, and science among sampled students at age 15, Canada continues to rank among the top ten countries in the 2022 PISA survey.
  •  At first glance, one might conclude all is well. However, Canada’s national trends have consistently declined since initial benchmarking of the three subjects in the early 2000s. The largest subject decline is in mathematics.
  • The four large-population provinces – Quebec in particular – have composite mathematics scores well above the OECD average. Five of six small-population provinces score statistically at the OECD average. Newfoundland and Labrador scores below the OECD average.
  •  Like the proverbial frogs slowly boiling in water, Canadian provinces are at risk of ignoring their problems until too late, as happened in Sweden a decade ago – and is currently taking place in many US states.
  • The Council of Ministers of Education, Canada (CMEC) could facilitate a coordinated response among provinces to address weaknesses identified in PISA. For example, CMEC could promote Quebec’s superior mathematics practices. Another example: provinces could pay attention to BC’s superior high-school completion rates in provincial schools.


The author thanks James Fleming, Tingting Zhang, Parisa Mahboubi, Rosalie Wyonch, Annie Kidder, Andrew Sharpe, and anonymous reviewers for comments on an earlier draft. The author retains responsibility for any errors and the views expressed.

Introduction

A determinant of a country’s economic and social future is its “human capital” – the ability of the next generation to contribute to the workforce and participate as citizens. Commonly used proxies for future “human capital” are the average number of years that students spend in school, or the proportion of students of a specified age possessing an education certification (such as primary and secondary school completion, technical trade certificate, and university degree).

The Achilles’ heel of such proxies is the unrealistic assumption of homogeneity across school quality and exam grading. An alternative is to measure human capital directly, by surveying students’ ability to answer relevant questions on core subjects. By the 1990s, Canadian and US provincial/state systems had adopted randomized assessments as a useful tool to evaluate school systems at various levels of the K–12 cycle.[1] All US states and Canadian provinces now undertake regional assessments at several grades. Obviously, national and sub-national assessments do not allow for international comparison.

During the 1990s, major international agencies, including the Organisation for Economic Co-operation and Development (OECD), determined to measure “soft assets” such as student learning outcomes in core subjects – in addition to measuring investment in public and private capital. In 2000, the OECD launched the triennial Program for International Student Assessment (PISA), an ambitious assessment of the ability of secondary school students, age 15, to read (in the dominant regional language) and to answer expected secondary-level mathematics and science questions. By the latest round, in 2022, PISA had probably become the world’s best-known internationally comparable assessment of K–12 school systems.[2]

[1] An early innovation in randomized assessments is the US National Assessment of Educational Progress (NAEP). The civil rights movement and “Great Society” programs of the 1960s attempted to eliminate – or at least dramatically reduce – ethnic gaps in income, employment, and education. Continuously since the early 1970s, the US federal department of education has undertaken assessments of children’s learning at ages 9 and 13. At present, the NAEP disaggregates by state and city, and by seven ethnic categories: Asian, African American, Hispanic, American Indian and Alaska Native, white, and two mixed groups.

[2] Admittedly, while primary and secondary school learning contributes to a country’s future prosperity, there are important non-school contributions to learning. The PISA index of economic, social and cultural status (ESCS) is a simple measure of such non-school factors.

For each round, the OECD releases vast statistical evidence on students’ performance, public versus private schools, students’ attitudes, etc. Initially, each of the three subject samples was benchmarked and normalized, with 500 as the average score and 100 as the standard deviation. Reading was benchmarked in the first round (in 2000), mathematics the second (in 2003) and science the third (in 2006). PISA rotates the priority subject. For the 2022 round, it was mathematics; interestingly, the results show the three highest mathematics scores were in East Asia.
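The benchmarking described above – each subject scaled so the benchmark-round distribution has a mean of 500 and a standard deviation of 100 – can be sketched in a few lines. The raw values below are hypothetical placeholders; actual PISA scaling rests on item-response models, so this shows only the final rescaling step.

```python
# Final rescaling step of PISA-style benchmarking: shift and stretch raw
# proficiency estimates so the benchmark sample has mean 500 and SD 100.
# The raw values are hypothetical placeholders, not actual PISA data.
from statistics import mean, stdev

raw = [1.2, -0.4, 0.0, 0.8, -1.6, 0.3]  # hypothetical proficiency estimates

m, s = mean(raw), stdev(raw)
scaled = [500 + 100 * (x - m) / s for x in raw]

print([round(v) for v in scaled])
```

On this rescaled metric, a country mean below 500 in a later round (as with Canada’s 2022 mathematics score) signals performance below the benchmark-round average.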

Relative to most other OECD countries, Canada has fared well. In the 2022 round, Canada’s composite outcomes in each of the three subjects rank among the top ten countries: reading (6th), mathematics (6th), science (5th).[3] However, ranking in the top ten should not reassure us that all is well in Canada’s K–12 provincial school systems.

[3] This ranking excludes PISA surveys of partial regions of a country. The highest mathematics scores are Singapore (575), Japan (536), Korea (527), Estonia (510), Switzerland (508), Canada (497), Netherlands (493), and Ireland (492). The next four countries (Belgium, Denmark, UK, and Poland) are tied (489) (OECD 2023a, Table 1.1).

The following sections raise and discuss three warnings. The first is obvious: the more-or-less continuous decline in Canadian composite scores in all three subjects since the benchmark rounds. The second is the faster decline in composite scores among the six small-population provinces relative to the four large-population provinces. The third is mathematics in particular, the subject with the largest Canadian decline.

Three Warnings

Decline in Canadian PISA results

Canada’s subject scores, like those of many OECD countries, have declined from the benchmark year to 2022 across all three subjects (Figure 1). Before the latest round in 2022, Canada’s scores in all three subjects were above 500. The mathematics score fell below the 500-benchmark level in 2022.

In mathematics, from 2018 to 2022, Singapore, Japan, and Korea realized small, statistically insignificant increases in composite scores. However, from 2018 to 2022 the average OECD mathematics score declined by 15 points, and the reading score by 10 points. The OECD science decline was statistically insignificant (OECD 2023a). Interpreting the 2018-2022 declines is ambiguous. One relevant factor is the length of COVID-induced school closures. Provinces varied with respect to weeks of school closure during COVID (March 2020 to May 2021) – the longer the closures, the larger the provincial declines (Bennett 2023). But other factors are present. Over the decade 2012-2022, Canada experienced statistically significant declines in all three subjects.[4]

[4] The correlation between provincial weeks of school closure during COVID and provincial 2018-2022 mathematics score changes is negative; the correlation coefficient is -0.37.

Expansion of PISA gaps between large- and small-population provinces

The second warning is that, weighted by population, the declines from the benchmark scores to 2022 have been larger among the six small-population provinces than among the four large-population provinces;[5] hence the gaps between the large and small provinces are widening. Weighted by 2022 provincial populations, the declines among the four large-population provinces in PISA scores, from benchmark year to 2022, are reading (-26), mathematics (-34), and science (-19). The comparable declines among the six small-population provinces are reading (-38), mathematics (-50), and science (-27). A decline of 20 points on the assessment scale is equivalent to the loss of one year of school (Jakubowski et al. 2024). The small provinces have thus experienced a decline in mathematics performance equivalent to losing 2.5 years of schooling since 2003 – nearly one year greater than the decline observed in the large provinces over the same period.

[5] The 2022 weighted average composite scores among the large provinces are 501 for mathematics, 511 for reading, and 518 for science. The 2022 weighted averages among the small provinces are, respectively, 468, 483, and 491.
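The schooling-equivalent arithmetic in this paragraph – roughly 20 PISA points per year of school (Jakubowski et al. 2024) – applied to the weighted declines quoted above:

```python
# Convert benchmark-to-2022 PISA declines into approximate years of
# schooling lost, at ~20 points per school year (Jakubowski et al. 2024).
POINTS_PER_SCHOOL_YEAR = 20

declines = {  # point declines quoted in the text
    "large provinces": {"reading": 26, "mathematics": 34, "science": 19},
    "small provinces": {"reading": 38, "mathematics": 50, "science": 27},
}

for group, subjects in declines.items():
    for subject, points in subjects.items():
        years = points / POINTS_PER_SCHOOL_YEAR
        print(f"{group}, {subject}: -{points} points ~ {years:.1f} school years")
```

The small provinces’ 50-point mathematics decline works out to 2.5 school years, 0.8 years more than the large provinces’ 1.7 – the “nearly one year” gap noted in the text.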

Figures 2a–2c illustrate provincial average 2022 scores in the three subjects. The dot colours indicate whether (at a 5 percent significance level) the provincial averages are above the OECD average, at the OECD average, or below it. There are two points to highlight in the 2022 PISA round:

  • In all subjects, there is no overlap between the scores of the four large- and six small-population provinces. In all subjects, the large provinces score above the OECD average. The small-population provinces’ 18 subject scores are mixed: six are statistically above the OECD average, 11 at the average, and one statistically below the average.
  • There is no overlap in socio-economic status between the small- and large-population provinces. The Economic, Social and Cultural Status (ESCS) index is a self-defined measure of factors beyond school quality (see Box 1) – a well-intentioned exercise in estimating family and community effects. Each of the four large-population provinces has a higher ESCS score than any of the six small-population provinces.[6] The trend lines illustrate the relationship between average provincial subject scores and provincial ESCS scores.[7] The high R2 in all three figures suggests that socio-economic and cultural factors play a significant role in explaining the differences in provincial performance.

[6] Some reviewers asked whether the differences in ESCS index scores between the large- and small-population provinces merely reflect disparities in average income levels. While per capita provincial GDP is relevant in explaining provincial ESCS scores, notable exceptions challenge this assumption. For instance, Saskatchewan has the second-highest per capita GDP but ranks second lowest in ESCS score. This example highlights the complexity of factors influencing ESCS and mathematics performance, beyond income alone. A more detailed discussion is provided in the following section.

[7] The R2 statistic measures the proportion of the variance of the provincial scores captured by the trendlines in provincial ESCS scores; its range is 0 to 100 percent. The three R2 statistics are approximately 0.7, meaning the ESCS trendlines capture about 70 percent of the variance among provincial scores. While the trendlines show high R2 values, a multivariate regression might reduce the estimated contribution of ESCS.

Mathematics decline

The third warning is the decline in mathematics scores. Among the three subjects, mathematics has experienced the largest decline in Canadian results from the benchmark year, 2003, to 2022. It is the only subject in which a provincial score has fallen below the OECD average.

Table 1 shows Canadian and provincial 2022 composite PISA scores for the three subjects. The table disaggregates the changes: benchmark year to 2018 and 2018 to 2022.

From 2003 to 2018, eight provinces experienced statistically significant declines in mathematics scores. Quebec and Prince Edward Island are the outliers, with statistically insignificant declines (-4 and -14, respectively). From 2018 to 2022, only Alberta, British Columbia, and Prince Edward Island experienced statistically insignificant declines.[8]

[8] Prince Edward Island realized a statistically insignificant 2003-2018 decline (-14) and, likewise, a statistically insignificant 2018-2022 decline relative to the OECD average.

Despite a significant 2018-2022 decline, much of which may be due to school closures, Quebec’s 2022 math score is by far the highest. Quebec’s mathematics deviation from its trendline projection in Figure 2a is the largest positive deviation from projection among all three subjects in all ten provinces. (Alberta also realized large positive deviations from projection in reading and science.) Given its standout mathematics status, the obvious question is, what is Quebec doing that other provinces should adopt? A rationale for provincial PISA disaggregation is to prod provinces with relatively low scores to learn from provinces with relatively high scores.

What’s to Be Done?

Provoke a “shock”

One potential policy initiative is for a provincial government or a parents’ association to provoke a “shock” due to declines in regional PISA scores. At present, public attention to Canada’s PISA results is relegated to the inside pages of newspapers. Sweden is a case of ignoring modest round-to-round declines until a 16-point decline, from 2009 to 2012, provoked a “shock” (see Box 2).

OECD countries varied in the extent to which they closed schools during the pandemic. For the first time, the aggregate 2022 PISA scores in all three subjects experienced substantial declines from the 2018 scores. The overall OECD 2018-2022 mathematics and reading declines were, respectively, 15 and 10 points. Sweden’s mathematics and reading scores each declined by roughly 20 points, mathematics falling from 502 to 482. Three large European countries – France, Germany, and Poland – also experienced 2018-2022 declines larger than 20 points. These declines have induced tremors in European media, but much of the decline can be attributed to school closures during the COVID pandemic.

Defining declines, sufficient to warrant promotion of a “shock,” is open to debate. The most justifiable Canadian “shock” would be in Newfoundland and Labrador, the sole province significantly below the OECD average.

Pay more attention to the ESCS impact on school outcomes

The ESCS index is a useful summary of non-school factors that contribute to learning outcomes. Canada is among the most egalitarian of OECD countries as measured by the ESCS index – the lower the proportion of national variance of PISA results attributable to national ESCS scores, the more equitable the country (OECD 2023a, Table 1.4). However, the combination of economic, social, and cultural factors into a single index limits interpretation of its relevance.

The definition of the index does not refer explicitly to ethnic communities, but some ethnic communities display high correlation with parents’ education, occupation, and possessions. Certain cultural conventions among ethnic communities may enhance student performance (viz. Asian indicator regressors in Figure 3). Certain conventions may do the opposite, for example, fundamentalist Muslim teachers disobeying secular education intentions of Quebec school policy (Bordeleau 2024). The most vexing cultural dilemma for K–12 public schools is simultaneously to promote Indigenous cultures and adequate learning of core subjects. The CMEC (2023) analysis of the 2022 PISA survey briefly introduces Indigenous issues; it does not discuss Indigenous learning outcomes.

Note that provincial ESCS index values are not a good proxy for average per capita provincial income. For instance, Saskatchewan, which has the second-highest per capita GDP, ranks second lowest in ESCS score. Similarly, Manitoba, the province with the lowest ESCS score, and Quebec, the large province with the lowest ESCS score, have similar per capita GDP levels but dramatically different mathematics performance. Newfoundland and Labrador, with a per capita GDP close to the national average, holds the third-lowest ESCS score, further illustrating the disconnect between ESCS index values and income levels.[9]

[9] National and provincial per capita GDP, 2023: Canada ($73,000), Alberta ($97,000), British Columbia ($74,000), Manitoba ($63,000), New Brunswick ($57,000), Newfoundland and Labrador ($72,000), Nova Scotia ($56,000), Ontario ($72,000), Prince Edward Island ($57,000), Quebec ($65,000), Saskatchewan ($91,000).

In explaining the lowest provincial ESCS scores, in Manitoba and Saskatchewan, presumably low earnings among Indigenous students’ families and a history of Indigenous mistrust of public schools are highly relevant. In these two provinces, the Indigenous proportion of K–12 cohorts (ages 5-19) is about 30 percent.[10] Improving ESCS in these two provinces poses more complex “cultural” dilemmas than in provinces with much lower proportions of Indigenous students. In all provinces, better core K–12 academic skills are required for Indigenous students to achieve high-school certification rates similar to those of non-Indigenous students. Without high-school certification, young Indigenous adults are unlikely to realize post-secondary training and good employment, either on- or off-reserve. In several monographs, Harvey McCue (2016) has written about the dilemma.[11]

[10] Across provinces, the Indigenous proportions of K–12 cohorts (ages 5-19) vary dramatically (Statistics Canada census calculations, in Richards and Mahboubi 2018). In Manitoba and Saskatchewan, the proportion is approximately 30 percent; in Alberta, British Columbia, and the Atlantic region, 8-10 percent; in Ontario and Quebec, below 5 percent.

[11] Harvey McCue (Waubageshig) is a prominent and accomplished First Nation professional educator. For example, he played a crucial role in establishing the James Bay Cree education system. In 2021, he received the Order of Canada. I acknowledge that he and I were two of four authors of a report designing a federal project to legislate the professionalization of reserve school administration; it was jointly supported by Shawn Atleo, head of the Assembly of First Nations, and Chuck Strahl, minister of Aboriginal Affairs.

He has summarized the issues:

One of the critical reasons for the lack of success in First Nation [K–12] education is the disconnect between the curricula in First Nation schools (which is the provincial curricula, unfortunately) and the reality facing First Nation youth in their communities. The disconnect is severe enough in my opinion that a majority of kids simply give up on their education because of it. Of course, other factors come into play – a revolving door of unprepared teachers, principals for whom accountability is pretty well non-existent, and parents who for historical reasons see little value in what’s offered in the classrooms.

Defeating the disconnect would place a premium on getting an education, give cause to parents to encourage their kids to go to school, and provide kids with purposeful learning that they see as important and valuable. The [curriculum] should increase the applied education content … to at least 50% of the overall curriculum. This approach worked for most Canadians during the first half of the 20th century and I see many good reasons for re-implementing this approach for First Nation schools. (McCue in Richards 2023.)

Based on the 2021 census, the national proportion of First Nation young adults (ages 20-24) living on-reserve with at least high-school completion is only 56 percent; off-reserve, the rate is 78 percent. The comparable non-Indigenous young-adult certification rate is 93 percent.[12]

[12] The majority of First Nation students live off-reserve and attend provincial schools. For young adults identifying as First Nation (ages 20-24), 2021 census data on high-school certification by identity and province are available in Richards (2023). The BC on- and off-reserve high-school completion rates are, respectively, 74 percent and 83 percent. BC results are not ideal, but they are the best among Canadian provinces (Richards 2023).

Education certification is not necessarily a good measure of learning outcomes. For a quarter century, British Columbia (2023) has undertaken annual, detailed learning-outcome assessments of Indigenous and non-Indigenous students in provincial public schools at various grades, and has published detailed results disaggregated at the school-district level, thereby introducing empirical evidence into comparisons of Indigenous and non-Indigenous student learning.[13] BC has accompanied assessments with various incentives to encourage informal collaboration with Indigenous and non-Indigenous local leaders interested in education outcomes (see Richards et al. 2008; Anderson and Richards 2016). It is impossible to isolate the contribution of activities linked to assessments. Nonetheless, the province with the best Indigenous high-school certification is the province that has most persistently assessed and published Indigenous learning outcomes.

[13] Using EQAO data, Gallagher-Mackay et al. (2023) have published a first public report containing assessment data comparing First Nation and “all” students, disaggregated across six provincial regions.

Analyze Quebec’s superior mathematics scores

Anna Stokke (2015, 2023) has criticized “discovery-based” mathematics instruction, a teaching strategy that invites students to discover solutions to problems and discourages direct teaching by instructors. It is an inefficient strategy, she argues:

Math is cumulative and requires much practice to solidify new concepts. Students who do not receive effective instruction or get sufficient practice can easily fall behind and it can be difficult to get caught up.

Ministries of education must ensure that advice given to teachers about teaching math and instructional resources are aligned with the science of learning. This includes explicitly teaching students, incorporating ample practice, and using other evidence-informed techniques. Professional development providers who play down the need for explicit instruction, devalue student practice, or fail to provide solid evidence for the effectiveness of their programs should be avoided. (Stokke 2023.)

Reid and Reid (2017) studied in detail the training of future Ontario secondary mathematics teachers. Their conclusion is consistent with Stokke’s. Bennett (2018, 2023) concludes that Quebec’s superior mathematics scores are due, in part, to mathematics training for teachers that is more rigorous than in other provinces. Teacher training is obviously not the only relevant factor. Bennett also credits a more rigorous mathematics curriculum, which implies less focus on child-centered approaches (i.e., less “discovery math”) and more focus on pedagogical approaches rooted in cognitive science. He acknowledges a potential dilemma: a more rigorous mathematics curriculum may be part of the explanation for Quebec’s high-school completion rate being lower than the national average.

Bennett’s explanation of Quebec’s outlier status – it ranks 4th among PISA 2022 national mathematics scores – is consistent with the OECD evidence that a strong association exists between national PISA mathematics scores and imputed national mathematics competence among secondary level teachers.

On a national basis, Eric Hanushek and colleagues (2018) analyzed the impact of teacher competence on national PISA mathematics scores. Teacher competence is measured not by degrees; it is derived from the OECD Programme for the International Assessment of Adult Competencies (PIAAC).[14] Figure 3 illustrates the most recent PISA (OECD 2023a) and PIAAC (OECD 2023b) evidence among 30 countries. As did Hanushek and colleagues, the analysis here regresses national PISA mathematics scores on national PIAAC mean scores for the national sub-sample with STEM degrees. The regression illustrated below includes identification regressors for the three East and Southeast Asian countries, an acknowledgment of the Asian cultural emphasis on formal education.[15]

[14] The PIAAC assesses the level of mathematics (and literacy) competencies among national samples. The PIAAC also provides mean national scores among adults with tertiary-level STEM degrees. Hanushek and colleagues made the reasonable assumptions that secondary-level mathematics teachers in OECD countries have STEM degrees, and that the national PIAAC mean of numeracy competence among those with tertiary-level STEM training is a reasonable proxy for national teachers’ mathematics competence.

[15] The trend line shown is a univariate regression of national PISA mathematics scores on mean national PIAAC scores (Asian countries excepted) using the x4 regressor: y = 311.68 + 0.55x4; adjusted R2 is 0.31, n = 27. The multivariate regression adds identification regressors for Japan, Korea, and Singapore. The (underspecified) regression equation is y = 311.68*** + 40.39**x1 + 62.36***x2 + 97.77***x3 + 0.55***x4, where y is the national PISA mathematics score, x1 Japan, x2 Korea, x3 Singapore, and x4 the mean national PIAAC numeracy score among the sample with tertiary STEM training. Adjusted R2 is 0.70, n = 30. The t-statistic legend is ** for 5 percent significance and *** for 1 percent significance.
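The footnote’s specification – national PISA mathematics scores regressed on the PIAAC STEM-graduate numeracy mean plus indicator regressors for Japan, Korea, and Singapore – can be sketched as an ordinary least-squares fit. All country values below are simulated placeholders, not the actual 30-country PISA/PIAAC data.

```python
# OLS sketch of the footnote's specification: PISA math score on PIAAC
# STEM numeracy mean plus Japan/Korea/Singapore indicator regressors.
# Country values are simulated placeholders, not actual PISA/PIAAC data.
import numpy as np

rng = np.random.default_rng(0)
n = 30
piaac = rng.uniform(260, 320, size=n)    # hypothetical PIAAC STEM means
dummies = np.zeros((n, 3))               # columns: Japan, Korea, Singapore
dummies[0, 0] = dummies[1, 1] = dummies[2, 2] = 1.0

# Simulated outcome, loosely echoing the reported coefficients.
pisa = 312 + 0.55 * piaac + dummies @ np.array([40.0, 62.0, 98.0])
pisa += rng.normal(0, 10, size=n)

X = np.column_stack([np.ones(n), dummies, piaac])  # intercept, dummies, slope
beta, *_ = np.linalg.lstsq(X, pisa, rcond=None)
print("intercept, Japan, Korea, Singapore, PIAAC slope:", np.round(beta, 2))
```

Each indicator regressor simply shifts the fitted line for one country, which is how the specification accommodates the three Asian outliers without distorting the PIAAC slope estimated from the other 27 countries.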

Among the 27 non-Asian countries, the trendline illustrates a good “fit.” (If country A’s PIAAC tertiary STEM mean is two points higher than country B’s, country A’s projected PISA score is approximately one point higher than country B’s.) Canada’s PIAAC STEM score is close to the sample median; its PISA score is somewhat above the trendline projection. Unfortunately, PIAAC does not disaggregate by provincial jurisdictions.

Conclusion

Canada’s subject declines from 2018 to 2022 are similar to the overall OECD declines. Our national school system continues to rank among the top ten countries in all three subjects. At first glance, one might conclude all is well. However, Canada’s national trends have shown consistent declines since the initial benchmarking of the three subjects in the early 2000s.

There is a risk, as seen in Sweden, in ignoring modest but continuous decline. Newfoundland and Labrador’s significant 2018-2022 mathematics decline – twice the comparable average OECD decline – may serve as a wake-up call, provoking a provincial “shock” and generating public insistence on improvements.[16] The Council of Ministers of Education, Canada could facilitate a coordinated response among provinces to address weaknesses identified in PISA, promote best practices, and ensure that these insights lead to tangible policy actions aimed at improving student outcomes nationwide.

[16] Alberta is an example of a minor Canadian “shock.” In the first round of PISA, Alberta’s scores were very high. Over the following decade, Alberta’s scores – in mathematics in particular – declined to the national average. Ultimately, parents demonstrated in front of the Alberta legislature, demanding better mathematics results. Alberta’s results have since improved.

Second, the most dramatic “cultural” problem in discussing provincial ESCS scores is the unsatisfactory K–12 outcomes of Indigenous students. British Columbia, the province most attuned to publishing Indigenous versus non-Indigenous assessment outcomes, has achieved the highest rates of Indigenous high-school certification. Provinces with lower Indigenous high-school certification could benefit by adopting BC’s tradition of incentivizing district-level initiatives to improve outcomes.

Third, there is a need for increased mathematics study for secondary level mathematics teachers, as emphasized by Bennett (2018, 2023). Other provinces should take lessons from Quebec’s teacher training.

Canada’s learning loss during the COVID pandemic, combined with previous declines from the initial benchmark scores, should be sufficient catalyst for improving Canadian K–12 education outcomes. Canada’s future human capital depends on it.

References

Anderson, B., and J. Richards. 2016. Students in Jeopardy: An Agenda for Improving Results in Band-Operated Schools. Commentary 444. Toronto: C.D. Howe Institute.

Bennett, P. 2018. “What can be learned from Quebec’s math prowess?” Policy Options. Institute for Research on Public Policy.

_______. 2023. “Pandemic Fallout: Learning Loss, Collateral Damage, and Recovery in Canada’s Schools.” Cardus. Accessed 20241116 at https://www.cardus.ca/research/education/reports/pandemic-fallout/

Bordeleau, S. 2024. “Québec organise des enquêtes dans trois écoles de Montréal.” Radio-Canada. Accessed 20241228 at https://ici.radio-canada.ca/nouvelle/2126104/violation-laicite-quebec-enquete-education

British Columbia. 2023. Aboriginal Report: How Are We Doing, 2022/23? Accessed 20240923 at
https://www.fnesc.ca/how-are-we-doing-report/

Council of Ministers of Education, Canada (CMEC). 2023. Measuring Up: Canadian Results of the OECD PISA 2022 Study.

Gallagher-Mackay, K., M. Yau, C. Corso, N. Debassige and N. Snow. 2023. “Chiefs of Ontario Systemic Gaps in Education Project Report #1: Student Outcomes in Provincially Funded Schools.” Toronto: Chiefs of Ontario.

Gomendio, M., and J. West. 2023. Dire Straits – Education Reform: Ideology, Vested Interests, and Evidence. Open Book Publishers.

Haeck, C., and P. Lefebvre. 2021. “Trends in cognitive skill inequalities by socio-economic status across Canada.” Canadian Public Policy, pp.88-116. March.

Hanushek, E., M. Piopiunik and S. Wiederhold. 2018. “The Value of Smarter Teachers: International Evidence on Teacher Cognitive Skills and Student Performance.” Working paper 20727. National Bureau of Economic Research.

Jakubowski, Maciej, Tomasz Gajderowicz and Harry Patrinos. 2024. “COVID-19, School Closures, and Student Learning Outcomes: New Global Evidence from PISA.” World Bank Group. Available at https://documents1.worldbank.org/curated/en/099932301112496929/pdf/IDU16cf7d0801f2091478b1934914b47c3ab4027.pdf

McCue, H. 2016. First Nations elementary-secondary education: a national dilemma. C.D. Howe Institute. Available at https://cdhowe.org/wp-content/uploads/2025/02/Verbatim_Waubageshing_McCue.2016.pdf

_______. 2023. Personal exchange in Richards, 2023.

National Assessment of Educational Progress (NAEP). 2024. Accessed 20240722 at https://nces.ed.gov/nationsreportcard/about/

Organisation for Economic Co-operation and Development (OECD). 2023a. PISA 2022 Results: The State of Learning and Equity in Education. Volume I.

_______. 2024. Do Adults Have the Skills They Need to Thrive in a Changing World? Survey of Adult Skills 2023.

Reid, M., and S. Reid. 2017. “Learning to be a Math Teacher: What Knowledge is Essential?” International Electronic Journal of Elementary Education 9(4): 851-872.

Richards, J., J. Hove and K. Afolabi. 2008. Understanding the Aboriginal / non-Aboriginal Gap in Student Performance: Lessons from British Columbia. Commentary 276. Toronto: C.D. Howe Institute.

Richards, J., and P. Mahboubi. 2018. “Measuring Student Outcomes: The Case for Identifying Indigenous Students in Canada’s PISA Sample.” E-Brief 272. Toronto: C.D. Howe Institute.

Richards, J. 2020. Student Performance in PISA 2018: Nettlesome Questions for Canada. Commentary 576. Toronto: C.D. Howe Institute.

_______. 2023. “Low Indigenous Employment and ‘Deaths of Despair.’” Johnson-Shoyama Graduate School of Public Policy working paper. Accessed at https://www.schoolofpublicpolicy.sk.ca/research-ideas/publications-and-policy-insight/policy-brief/low-indigenous-employment-and-deaths-of-despair-in-the-canadian-prairies.php

Stokke, A. 2015. What to do about Canada's Declining Math Scores. Commentary 427. Toronto: C.D. Howe Institute.

_______. 2023. “Reversing the decline in Canada’s math scores.” Intelligence Memo. Toronto: C.D. Howe Institute.
