Missed Tests Cloud View of Pandemic’s Student Impact

Focus #21 • December 2021

With parents and policymakers alike concerned about the pandemic’s effect on student learning, recently released data from statewide testing last spring could have illuminated specific challenges and opportunities. The data appear to confirm that student learning suffered, but a surge in students missing the exams – particularly in the largest districts and among less privileged groups – undercuts the data’s value and means education leaders must look elsewhere for additional insights into student needs.

As was widely expected, statewide test scores released last month showed an overall decrease in student proficiency from the 2018-19 to 2020-21 school years amid the disruptions of the pandemic. The actual dip may be even larger than what these scores suggest, however, because of a sharp rise in students who did not take the exams at all. Particularly concerning is the decrease in participation among marginalized student groups.

With these exam results missing, users of the data will need to proceed with caution. Many districts and communities will need to seek additional information to appropriately target student supports and maximize the impact of federal pandemic aid. The decrease in participation also raises concerns that future years of testing may be affected.

Disruption to annual testing

Each spring, third through eighth graders take the statewide Forward exam in math and English Language Arts (ELA) to assess their knowledge and skills. Eleventh graders take the statewide ACT exam to assess college readiness. The U.S. Department of Education and Wisconsin Legislature suspended 2019-20 test requirements due to COVID-19, making the 2020-21 data the first statewide source of information about these areas since the pandemic began.

The 2020-21 results come with asterisks, however, due to markedly decreased participation rates from previous years. The state Department of Public Instruction (DPI) reports that 13.3% of eligible public school students did not take the Forward ELA exam in 2020-21, while 12.9% of public school 11th graders did not take the statewide composite ACT test. These non-participation rates dwarf those of previous years: only 1.5% of students did not take the Forward ELA exam in 2018-19 (see Figure 1), and 5.2% of 11th graders did not take the statewide composite ACT in the same year.

Both the ACT and the Forward exam must be administered in person, which created challenges for schools operating remotely. In fact, two-thirds (66.6%) of public school Forward exam non-participants attended urban districts, which were much more likely to operate remotely last year.

In interviews, district and school leaders noted that parents of children learning virtually often did not feel safe sending them into school buildings for testing. In addition, although DPI provided one-time flexibilities to help districts safely administer the tests, COVID guidance from some public health officials discouraged in-person gatherings. This combination of factors led the Madison Metropolitan School District (MMSD), for example, to offer a streamlined opt-out process for families. A leader in another school system observed that lowered attendance on testing days at his school was consistent with lowered attendance throughout the 2020-21 school year. Finally, compared to other concerns around the pandemic, standardized testing appears to have been a lesser priority for at least some administrators, educators, and families.

Disproportionate decreases in testing

Not all student groups experienced the same changes in test participation. Students of color and students from low-income households saw particularly large participation decreases, compromising the value of the data for statewide analyses and for districts such as Milwaukee and Madison with large shares of underserved students.

As Figure 2 illustrates, test non-participation for both students of color and white students increased from 2018-19 to 2020-21 but to sharply varying degrees. From 2018-19 to 2020-21, non-participation rates on the Forward ELA exam increased by 5.9 percentage points for white students but by 20.4 points for Hispanic students and 38.2 points for Black students. In total, 7.2% of white students, 22.6% of Hispanic students, and 40.2% of Black students in the state’s public schools missed the exam in 2020-21.

Analyzing economic backgrounds reveals a similar if less extreme story, as seen in Figure 3. The non-participation rate for students from higher-income households increased by 6.5 percentage points, while the rate for students from low-income households increased by 18.9 points. In all, 20.6% of students from low-income households did not take the Forward ELA exam in 2020-21 compared to 7.8% of other students.

Milwaukee Public Schools (MPS) and MMSD, the largest school districts in the state, accounted for the most non-test takers in 2020-21, including many from underserved student groups. Combined, the two districts serve 75.6% of the state’s Black students who did not take the Forward ELA exam, 61.0% of non-testing Hispanic students, and 18.1% of non-testing white students. These two districts also accounted for 57.6% of students from low-income households who did not test and 25.8% of non-test takers who were not economically disadvantaged. Students in both districts learned remotely for the majority of the 2020-21 school year, which district officials cited as the primary explanation for the low participation rates.

State non-participation rates for the Forward ELA exam were also higher for other disadvantaged students in public schools. For example, 28.1% of students experiencing homelessness did not participate, an increase of 23.7 percentage points over 2018-19; 21.2% of English Learners did not participate, an 18-point increase; and 19.5% of students with disabilities did not participate, a 15.5-point increase.

Non-participation numbers are even more stark for the ACT. Those data show 47.6% of Black 11th graders, 30.3% of 11th graders with disabilities, 26.8% of 11th grade English Learners, and 24.9% of 11th graders from low-income households missing from the composite ACT data. Forty-four percent of 11th graders experiencing homelessness did not test on the ACT in ELA. The ACT testing window occurs earlier in the year than the Forward exam, meaning that more students were still learning virtually during the 2020-21 testing. In some districts, high school students were also the last age group to return to school buildings.

Implications for interpreting and acting upon test results

The underrepresentation of the state’s most vulnerable students means that the statewide results should be interpreted with great caution. For example, 2020-21 Forward exam results show proficiency rates among test-taking public school students down from 2018-19 by 2.6 percentage points in ELA and by 5.2 percentage points in math. These decreases are likely understated, however, given Wisconsin’s longstanding achievement gaps affecting the groups most likely to have missed the exams.
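
To see why selective non-participation tends to understate a decline, consider a stylized calculation using purely illustrative numbers, not actual DPI figures: if the students who skipped the exam would have scored proficient at a lower rate than those who tested, the true statewide rate sits below the reported one.

```python
# Illustrative only: hypothetical numbers, not actual Wisconsin DPI results.
# Shows how missing lower-proficiency students inflates the reported rate.

tested_share = 0.867          # e.g., 86.7% of eligible students tested
reported_proficiency = 0.40   # proficiency rate among students who tested

# Assumption: non-testers would have been proficient at a lower rate,
# plausible given that non-participation was concentrated among groups
# with historically lower scores.
assumed_nontester_proficiency = 0.25

all_student_rate = (tested_share * reported_proficiency
                    + (1 - tested_share) * assumed_nontester_proficiency)

print(f"Reported proficiency (testers only): {reported_proficiency:.1%}")
print(f"Estimated proficiency (all students): {all_student_rate:.1%}")
# Under these assumptions, the all-student rate is about 2 points lower,
# so a year-over-year decline measured on testers alone is understated.
```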

Individual districts may still find value in their own data, depending on their test participation rate, change in rate from the previous testing year, and the composition of test takers. DPI recommends caution when comparing 2020-21 test results across years or schools if test participation is below 95%. Almost three quarters (72%) of districts met this 95% test participation threshold for the Forward ELA exam, and more than half (59.6%) met it for the composite ACT exam. The Forward ELA test participation rate decreased by less than two percentage points (or even increased) in 52.9% of districts from 2018-19 to 2020-21, while ACT participation declined by less than two percentage points or increased in 50.4% of districts. In these communities, test results have greater validity, but those using the data should still consider whether the students who missed the test differ in meaningful ways from those who took it, and to a greater degree than in the past.
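
For a district weighing the validity of its own results, one practical check is to compare each subgroup’s 2020-21 participation rate against both DPI’s 95% guideline and that group’s prior-year rate. The sketch below uses placeholder counts an analyst would replace with local enrollment and testing data; it is not drawn from any DPI data file.

```python
# Minimal sketch with placeholder counts, not DPI data: flag subgroups whose
# participation fell below 95% or dropped sharply from the prior year.

enrolled_2021 = {"Black": 1200, "Hispanic": 900, "White": 2400, "Low-income": 2000}
tested_2021 = {"Black": 720, "Hispanic": 760, "White": 2370, "Low-income": 1590}
participation_2019 = {"Black": 0.96, "Hispanic": 0.97, "White": 0.99, "Low-income": 0.96}

for group, n in enrolled_2021.items():
    rate = tested_2021[group] / n
    change = rate - participation_2019[group]
    caution = "  <- interpret with caution" if rate < 0.95 or change < -0.02 else ""
    print(f"{group:12s} participation {rate:6.1%}  vs. 2018-19 {change:+6.1%}{caution}")

# Large drops for particular groups signal that year-over-year proficiency
# comparisons for those groups (and district-wide) warrant extra care.
```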

Education officials with concerns about their testing results face the challenge of gathering other reliable data to inform academic recovery efforts. The lack of solid data may make it more difficult to effectively and equitably target federal pandemic aid to support students. Assessments like standardized tests can provide insight into subject area, grade level, or demographic disparities, each of which invites unique interventions.

Standardized tests are not the only possible source of such data, but their absence puts more pressure on other assessments to produce actionable data in the pandemic context. Leaders at MPS report relying on universal screening assessments to diagnose student needs. They also encourage schools to draw on a range of local quantitative and qualitative data like student grades, attendance, classroom observations, and in-class tests to inform student supports and interventions. MMSD uses screeners as well, in addition to interim assessments. School and district leaders can then use these data when making instructional decisions.

Long-term impact

The low participation rates may well be a one-year blip, since students have largely returned to in-person learning so far in the 2021-22 school year. MPS currently expects all students to test this spring, though the district still plans to follow state guidelines for families to opt students out of assessments. MMSD anticipates some families will opt out but is preparing resources to encourage participation. The course of the pandemic will likely play a role in district and family decisions. If large-scale non-participation continues into subsequent years, policymakers may raise the alarm over the long-term loss of data representative of all students.

Regardless of future years’ participation rates, the impact of this year’s anomaly will linger, since the state’s district and school report cards draw upon multiple years of testing data when calculating scores for academic achievement and growth. This multi-year approach smooths out the 2020-21 data irregularities but also means they will be used in state report cards for two more years. Caution in data interpretation will be in order throughout this timespan at least.

In the meantime, despite their limitations, the current test data still reinforce concerns that the pandemic has harmed student learning, especially for those who were already underserved. The results underscore the need for education leaders to act with clarity and urgency to target their federal K-12 pandemic aid toward short-term recovery and long-term gains for students.