Economic Commentary

COVID-19 and Education: A Survey of the Research

This Commentary reviews evidence on three areas of concern related to the COVID-19 pandemic and education in the United States for which research currently exists. First, the evidence suggests that the spread of the COVID-19 virus at K–12 schools has been low, although it may have spread through colleges at a higher rate. Second, while anecdotal evidence suggests that school closures have reduced labor force participation, the research evidence thus far finds little support for this view. Third, the limited research evidence does, however, suggest that the COVID-19 pandemic is negatively affecting students’ academic performance.

The views authors express in Economic Commentary are theirs and not necessarily those of the Federal Reserve Bank of Cleveland or the Board of Governors of the Federal Reserve System. The series editor is Tasia Hane. This paper and its data are subject to revision; please visit clevelandfed.org for updates.

In March 2020, millions of students, educators, and other school staff throughout the United States had their lives disrupted by the COVID-19 pandemic and associated mitigation policies. Millions of families have had to adapt unexpectedly to remote or hybrid learning environments. Some schools have since returned to in-person instruction, whereas others remain in a remote or hybrid mode. In addition to changes over time and differences across schools, at some schools the mode of instruction differs based on the decisions of parents or the grade of the student.

The disruption to education caused by the COVID-19 pandemic is unique in recent history, and research on its impacts will likely continue for many years. But some evidence has already emerged regarding the pandemic’s impact. This Economic Commentary presents existing evidence on three areas of concern related to the COVID-19 pandemic and education: the spread of the COVID-19 virus through in-person school settings, the impact of K–12 school closures on labor force participation, and the effects of virtual schooling on student outcomes. I discuss these three topics together here because they are among the few topics related to the pandemic’s impact on education for which research currently exists.

The Spread of the COVID-19 Virus through In-Person School Settings

As venues that bring large groups of people together to spend a substantial amount of time in each other’s company, schools are places where the COVID-19 virus could conceivably spread readily. But is this actually happening?

Abouk and Heydari (forthcoming) and Courtemanche et al. (2020a, 2020b) study the effects of school closures on COVID-19 case counts at the beginning of the pandemic by matching data on the timing of school closures and data on case counts at either the state or county level. Neither study finds a relationship between closures and case counts. Goldhaber et al. (2020) find that schools’ choices about instructional modality (in-person, remote, or a hybrid of the two) had little relationship with COVID-19 cases in the surrounding community in Michigan and Washington state in fall 2020, although they do find that in-person schooling was associated with a higher incidence of COVID-19 in areas where there was a high pre-existing infection rate. A nationwide study by Harris, Ziedan, and Hassig (2021) finds little relationship between school reopenings and COVID-19-related hospitalizations for most counties, but they find mixed and inconclusive evidence for counties with a high pre-existing hospitalization rate.

Although caution is warranted, commentators such as Henderson and Sullivan (2020) and Oster (2020) have argued that the benefits of in-person schooling coupled with a relatively low spread of the virus at K–12 schools in the United States call for reopening more schools for in-person instruction.

Research studying other countries finds similar results to the research on the United States. For example, Isphording, Lipfert, and Pestel (2020) and von Bismarck-Osten, Borusyak, and Schonberg (2020) do not find that school reopenings are associated with an increase in COVID-19 cases in Germany, but Vlachos, Hertegard, and Svaleryd (2020) find evidence that in-person schooling is associated with a higher infection rate in Sweden.

The evidence thus far on the spread of the COVID-19 virus at institutions of higher education differs from the evidence at the K–12 level. Mangrum and Niekamp (2020) study the spread of COVID-19 infection by college students using variation in the timing of spring breaks across colleges. Students with early spring breaks may have traveled somewhere and then brought the virus back with them, whereas later spring breaks were effectively canceled. Mangrum and Niekamp conduct their analysis at the county level and define an early-spring-break county to be one in which at least 25 percent of the college students enrolled there had a spring break that ended before March 9. The authors use smartphone geolocation data from SafeGraph Social Distancing Metrics to show that places with early spring breaks did indeed have a large number of people leave the area and then return, whereas places with later spring breaks had people leave and not return. The authors find that infection growth rates in early-spring-break counties were 2.1 percentage points higher than those in late-break counties in the first week after students returned home and 3.6 percentage points higher in the second week. Given the incubation period of the disease, these increases suggest that returning college students spread the virus to others in their local area. In line with Mangrum and Niekamp’s findings, Andersen et al. (2020) find that college reopenings in fall 2020 were associated with an increase in COVID-19 cases at the county level. The difference between college students and K–12 students may be partly due to physiological differences between people of different ages that affect how susceptible they are to the virus, but it might also be because college students are less likely to take precautions and are spreading the virus at events outside of the classroom.
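To see the mechanics of this comparison, the following is a minimal sketch of the kind of county-level calculation involved. It is not the authors’ code: the data frame, column names, and numbers are all hypothetical, and the actual study estimates these gaps in a regression framework with controls rather than as the raw differences in means computed here.

```python
import pandas as pd

# Hypothetical county-week panel; columns and values are illustrative only.
# 'week_since_return' counts weeks relative to when early-break students
# returned home; 'early_break' flags counties meeting the 25 percent
# early-spring-break definition described above.
panel = pd.DataFrame({
    "county":            ["A", "A", "B", "B", "C", "C", "D", "D"],
    "early_break":       [True, True, True, True, False, False, False, False],
    "week_since_return": [1, 2, 1, 2, 1, 2, 1, 2],
    "case_growth_pct":   [5.0, 7.2, 4.8, 6.9, 2.7, 3.4, 2.9, 3.2],
})

# Average weekly case growth by group and week after students returned.
means = (panel
         .groupby(["week_since_return", "early_break"])["case_growth_pct"]
         .mean()
         .unstack("early_break"))

# Early-break minus late-break gap in growth rates, week by week.
# Mangrum and Niekamp report gaps of roughly 2.1 and 3.6 percentage
# points in the first and second weeks, estimated with controls.
gap = means[True] - means[False]
print(gap)
```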

The Impact of K–12 School Closures on Labor Force Participation

If parents leave the labor force to stay home with children whose schools have switched to remote or hybrid instruction, one outcome may be that the labor force participation rate will fall. Not only may this have deleterious consequences for the economy, but, as Bayham and Fenichel (2020) point out, it may also make it more difficult to treat patients and control the spread of the COVID-19 virus if the parents who drop out of the labor force to stay at home are healthcare workers. While journalistic accounts support the view that labor force participation has fallen because of at-home schooling (Guilford 2020; Guilford and Chaney Cambon 2020), much of the academic research on school closures and labor force participation thus far runs counter to this conventional wisdom.

Barkowski, McLaughlin, and Dai (2020) use data from the Current Population Survey (CPS) to estimate the effects of the COVID-19 pandemic on three labor market outcomes: being employed, being at work (which differs from being employed because it excludes, for example, people who are sick or on vacation), and hours worked. They use three separate research designs to study how these outcomes changed over time for a treatment group relative to a control group. The research designs were specified in advance in a pre-analysis plan in order to guard against specification searching, the practice of estimating a variety of statistical models until a “correct” one is found; results from such a search can mislead readers if the researcher does not account for the search when reporting them.

The first research design compares people who have one or more children under age 13 (treated) to those who have no children under age 13 (control). The second research design limits the sample to people who have one or more children under age 13; it compares those whose oldest child is not aged 13 to 21 (treated) to those whose oldest child is aged 13 to 21 (control). The third research design limits the sample to people who have children under age 13 but whose oldest child is not aged 13 to 21 (i.e., the treated group from the second research design); it compares people who have a parent (the child or children’s grandparent) living with the family (treated) to those who do not (control).
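To make the nesting of these designs concrete, here is a minimal sketch of how the three treatment indicators might be constructed from household-level records. It is purely illustrative: the data frame and column names (age_youngest_child, age_oldest_child, grandparent_in_home) are hypothetical stand-ins, not the CPS variables the authors actually use.

```python
import pandas as pd

# Hypothetical household-level records; column names are illustrative only.
cps = pd.DataFrame({
    "age_youngest_child":  [5, 8, 3, 16, 10],
    "age_oldest_child":    [5, 15, 9, 16, 20],
    "grandparent_in_home": [False, True, False, False, True],
})

# Design 1: any child under age 13 (treated) vs. no child under 13 (control).
cps["treated_d1"] = cps["age_youngest_child"] < 13

# Design 2: restrict to households with a child under 13; treated if the
# oldest child is NOT aged 13 to 21, control if the oldest child is 13 to 21.
d2 = cps[cps["age_youngest_child"] < 13].copy()
d2["treated_d2"] = ~d2["age_oldest_child"].between(13, 21)

# Design 3: restrict to design 2's treated group; treated if a grandparent
# lives with the family, control otherwise.
d3 = d2[d2["treated_d2"]].copy()
d3["treated_d3"] = d3["grandparent_in_home"]
```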

The results of Barkowski, McLaughlin, and Dai (2020) vary based on which outcome they consider (being employed, being at work, or hours worked) and which of the three research designs they use. Insofar as they find an effect on labor market outcomes, it is a positive one: depending on the research design, the treated group is about 1 to 2 percentage points more likely to be employed and also to be at work. The same general qualitative results hold when splitting the sample by gender.

One potential explanation for the results in Barkowski, McLaughlin, and Dai (2020) is that many people are able to work from home. It may even be possible that some people who would not work for pay under normal circumstances were drawn into the labor force because of the ability to work remotely. Another potential explanation is that some parents were able to find a friend, family member, or someone else to look after their children, allowing them to join the labor force.

Other research comes to similar conclusions, finding no strong evidence that school closures reduced labor force participation. Lozano-Rojas et al. (2020) use variation in school closures across states and over time to estimate how unemployment insurance (UI) claims respond to school closures. Although the authors find evidence that school closures are associated with higher UI claims in some specifications, the results become statistically insignificant and, in some cases, change signs when estimating alternative specifications. Furthermore, using exact dates of school closures and high-frequency Google search data, Kong and Prinz (2020) find that school closures are not associated with searches for the phrase “file for unemployment,” a proxy for unemployment filings that is available at a much higher frequency than official government employment data.
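The logic of this design can be illustrated with a simple event-time alignment: index each state’s daily search intensity to days relative to its own school closure date and compare the windows just before and just after. The sketch below is hypothetical; the data, closure dates, and search_index column are made up, and the authors’ actual analysis is a regression rather than this raw comparison.

```python
import pandas as pd

# Hypothetical state-day panel of search intensity for "file for
# unemployment"; values and column names are made up for illustration.
df = pd.DataFrame({
    "state":        ["OH"] * 4 + ["PA"] * 4,
    "date":         pd.to_datetime(["2020-03-12", "2020-03-13",
                                    "2020-03-16", "2020-03-17"] * 2),
    "search_index": [10, 12, 11, 13, 9, 8, 10, 9],
})

# Each state's school closure date (illustrative, not the actual dates).
closure = {"OH": pd.Timestamp("2020-03-14"), "PA": pd.Timestamp("2020-03-16")}

# Event time: days relative to the state's own closure date.
df["event_day"] = (df["date"] - df["state"].map(closure)).dt.days

# Compare mean search intensity in short windows before and after closure.
# Kong and Prinz (2020) find no systematic closure-driven difference.
pre = df.loc[df["event_day"].between(-7, -1), "search_index"].mean()
post = df.loc[df["event_day"].between(0, 6), "search_index"].mean()
print(f"pre-closure mean: {pre:.1f}, post-closure mean: {post:.1f}")
```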

Heggeness (2020) uses CPS data to estimate the effects of school closures on labor market outcomes, like Barkowski, McLaughlin, and Dai (2020), but she relies on variation in school closures over time and across states, like Lozano-Rojas et al. (2020). Heggeness finds some evidence that school closures are associated with an increase in not working the previous week, but school closures also appear to be associated with an increase in hours worked. Part of the explanation for the increase in hours may be that some workers put in additional hours to keep up with increased demand for their products or services; another part may be that workers became less productive at home and thus needed more hours than usual to accomplish the same amount of work. Additionally, workers may be devoting less time to commuting and spending some of that saved time working.

It is worth noting that all the research on school closures and labor force participation discussed here studies the impacts of the initial school closures in spring 2020. It is possible that the effects in fall 2020 or spring 2021 might differ. On the one hand, having more time to adapt and find alternative child care arrangements may make it easier for parents to work for pay even though their children’s schools do not meet in person. On the other hand, alternative child care arrangements may be more difficult to maintain as time goes on, making it more difficult for parents to work for pay. Additionally, the nature of virtual schooling and the propensity for schools to meet online have changed and will likely continue to do so. Research on school closures and labor force participation using data from fall 2020 and spring 2021 that relies on variation in the timing of schools returning to in-person learning would be very valuable.

The Effects of Virtual Schooling on Student Outcomes

It will take time to fully understand the effects of the pandemic and school closures on economic and educational outcomes, but the early evidence is not very encouraging. For example, Bacher-Hicks, Goodman, and Mulhern (2021) find that school-related Google searches rose more at the beginning of the pandemic in wealthier areas than in less wealthy areas. Insofar as these internet searches indicate effort put forth by parents or students to substitute for lost in-person instruction, the implication is that the pandemic and associated school closures may lead to greater educational inequality.

Aucejo et al. (2020) survey students at Arizona State University and find that the COVID-19 pandemic caused 13 percent of students to delay graduation, 11 percent to withdraw from a class, and 12 percent to change their major. Respondents also expected lower grade point averages, a lower probability of finding a job, a lower reservation wage (the lowest wage at which one would accept a job), and lower earnings at age 35 because of the pandemic. Although the authors acknowledge that they are estimating “subjective treatment effects” based on respondents’ perceptions rather than the actual effects of the pandemic, they point out that what matters for understanding people’s choices is what they perceive the situation to be, not what it actually is. As they note, “If students (rightly or wrongly) perceive a negative treatment effect of COVID-19 on the returns to a college degree, this belief will have an impact on their future human capital decisions (such as continuing with their education, choice of major, etc.).”

The current situation is unique, but online education is not completely new. We may thus be able to take away lessons from earlier experiences with online education. At least three randomized controlled trials have studied online courses in higher education. Alpert, Couch, and Harmon (2016) randomly assign students at an unnamed large public university in the Northeast to a live, online, or blended principles of microeconomics course. Figlio, Rush, and Yin (2013) randomly assign students to live versus online lectures in an introductory microeconomics course at an unnamed large and selective university. Finally, Bowen et al. (2014) randomly assign students at six public universities to a hybrid (machine-guided online instruction with one hour of face-to-face instruction per week) or traditional statistics course.

The results of these three studies are not very encouraging for online education. Although Alpert, Couch, and Harmon (2016) find that students in the blended course do about as well on the final exam as those in the live course, they also find that students in the online version perform substantially worse. Figlio, Rush, and Yin (2013) find that students have lower test scores with the online lectures relative to the live ones. Bowen et al. (2014), however, do find that students have similar test scores in the hybrid course and traditional course.

There are also several observational studies of online courses in higher education. Xu and Jaggars (2011, 2013, 2014) find strong negative effects of online courses on course grades and course completion at community colleges in Virginia and Washington state. Bettinger et al. (2017) find negative effects of online courses on grades and on enrolling the next semester or next year at a large unnamed for-profit university, while Hart, Friedmann, and Hill (2018) find negative effects on grades and course completion in California community colleges. Studying online education in the current pandemic at community colleges in Virginia, Bird, Castleman, and Lohner (2020) find that courses beginning in person and then moving online in spring 2020 resulted in a 6.7 percentage point lower completion rate. To be sure, we cannot completely rule out the possibility that the results of these observational studies are driven by other differences between students who choose to take online courses and those who do not. However, it is worth emphasizing that they all find results that are generally consistent with the randomized controlled trials, which are not subject to this type of selection bias.

There is less evidence on online education at the elementary or secondary level, probably because, before the pandemic, virtual K–12 education was much rarer than virtual college education. However, Bueno (2020) studies virtual schools in Georgia. She finds that, controlling for lagged test scores, students in grades 4–8 attending such schools have lower scores on statewide standardized tests in English language arts, mathematics, science, and social studies than students who attend traditional schools. Hart et al. (2019) study online education in Florida and find mixed results. They find that students taking a course online are more likely to pass the course, although they acknowledge that this might reflect differing grading standards rather than increased learning. For students taking a course for the first time, online students are less likely to take and pass follow-up courses and are less likely to be on track to graduate from high school, although both of these results reverse for students repeating a course. Furthermore, Ahn and McEachin (2017) find that students at online charter schools in Ohio perform worse on statewide standardized tests than students at traditional charter or public schools, while Heissel (2016) finds that students in North Carolina who take Algebra I online perform worse on statewide standardized tests than those who take the course in person. This evidence on online education at the K–12 level is consonant with the evidence from higher education in suggesting that, at least so far, online education may not be a good substitute for in-person education.

Conclusion

This Commentary reviews evidence on three areas of concern related to the COVID-19 pandemic and education in the United States for which research currently exists. First, the evidence suggests that the spread of the COVID-19 virus at K–12 schools has been low, although it may have spread through colleges at a higher rate. Second, while anecdotal evidence suggests that school closures have reduced labor force participation, the research evidence thus far finds little support for this view. Third, the limited research evidence does, however, suggest that the COVID-19 pandemic is negatively affecting students’ academic performance.

Suggested Citation

Hinrichs, Peter L. 2021. “COVID-19 and Education: A Survey of the Research.” Federal Reserve Bank of Cleveland, Economic Commentary 2021-04. https://doi.org/10.26509/frbc-ec-202104

This work by Federal Reserve Bank of Cleveland is licensed under Creative Commons Attribution-NonCommercial 4.0 International