
Why Are High School NAEP Scores Not Increasing?

Mark Schneider, former U.S. Commissioner of Education Statistics, has written an incisive analysis (below) of the static trend in NAEP test scores despite dramatic increases in academic course taking in recent decades. It used to be a common research view that students' scores would increase if they took more ELA and math courses, but no more! However, be sure to read Schneider's section on mitigating circumstances near the end. This post originally appeared on the American Enterprise Institute website.

Math in American High Schools: The Delusion of Rigor

By Mark Schneider

The evidence on the failure of American high schools to educate and graduate their students is widespread. The release of the latest National Assessment of Educational Progress (NAEP) long-term trends (LTT) assessment data in April adds another data point to this sad compendium.[1] In this Outlook, I focus on trends in high school math, an area of critical national need, and one that has been a focus of national policy for decades. I present data that show a disconnect between the rigor of the math education that high schools claim to be delivering and the quality of the math education that students are actually receiving as measured by assessment data.

The 2008 NAEP LTT data reinforce other studies showing that U.S. high schools are failing to teach their students math skills. Administered every few years since the 1970s, NAEP's LTT is one of the longest and most consistent data series we have on what American students know and what they can do in math. LTT is age-based and assesses the skills and knowledge of nine-, thirteen-, and seventeen-year-old students.

I focus here on the data for seventeen-year-olds, since these students are close to the end of their high school careers. Figure 1 shows that despite thirty years of effort to improve math instruction, our high schools have failed to improve their students’ math skills. In 1978, the average seventeen-year-old scored 300 on a 500-point scale. About fifteen years later, in 1992, the score was 306. And in 2008? Once again, 306.

In contrast to most of NAEP’s assessments, LTT does not have labels like basic, proficient, and advanced. Rather, NAEP describes the types of mathematical tasks a student can perform at five different cut points, starting at a score of 150 and increasing by fifty-point intervals to 350. At 300, students have demonstrated “moderately complex procedures and reasoning,” including computation with decimals, simple fractions, and commonly encountered percents. In addition, they can measure lengths and find averages–not exactly rocket science.

So what percentage of American seventeen-year-olds shows these moderately complex skills? In 1978, 52 percent; in 1992, 59 percent. And in 2008? Again, 59 percent. As a reminder, NAEP stands for the National Assessment of Educational Progress–not much progress to report in high school level math despite decades of effort.

Moreover, the LTT data show that the students whom high schools were failing to teach were arriving with much better skills than their predecessors. Since 1978, thirteen-year-olds improved by seventeen points on the LTT math assessment and nine-year-olds by twenty-four points. And since 1994, scores of thirteen-year-olds increased by seven points and nine-year-olds by twelve points.

The depressing LTT data are matched by equally gloomy evidence from international assessments. Consider how our fifteen-year-old students (mostly tenth graders) stack up against their peers in other advanced industrial countries, as captured by the Organisation for Economic Co-operation and Development’s (OECD) Programme for International Student Assessment (PISA).[2]

As figure 2 shows, in 2003 the U.S. average was below the OECD average, and by 2006, the gap was even larger. Moreover, in 2006, across the OECD, the top 10 percent of students who took the assessment scored 615 or above. In the United States, the cut point for our top 10 percent was only 593–our best students do not match the performance of the best across the OECD. If, as President Barack Obama said in his speech to a joint session of Congress in February 2009, “countries that out-teach us today will out-compete us tomorrow,” our high schools are failing to build the foundation for tomorrow’s economy.

The Delusion of Rigor

Data about the failure of American high schools have been discussed for some time. But a number of indicators show that high schools can claim to be providing more rigorous instruction to a growing number of their students.

Here I rely on data from two different NAEP studies to show this disconnect: LTT and the series of NAEP high school transcript studies going back to 1990. Let us begin with the transcript studies.[3]

Figure 3, based on the most recent transcript study, shows that America’s high school students are taking more math courses. Since 1990, as documented in the top line of that figure, the average number of math Carnegie credits[4] a high school graduate completes has increased from 3.2 credits to 3.8–close to a 20 percent increase.

The LTT data give more detail about this increase in course credits. Figure 4 shows just how radically the composition of the math courses high school students take has shifted since the 1970s. In 1978, fully 20 percent of American high school students stopped with general math or prealgebra, and 17 percent stopped at Algebra I. Fewer than 10 percent reached precalculus or calculus as their final course.

The situation has changed dramatically. In 2008, only 3 percent of American high school students stopped at general math, and 7 percent stopped at Algebra I. In contrast, close to 20 percent now get through precalculus or calculus. While in 1978, only about one-third of American high school students completed Algebra II, now over half do.

This is a remarkable change in the course-taking patterns of American high school students. Combining the transcript data and the LTT data, we see that our high school students are completing a greater number of more difficult math courses.

Returning to figure 3, the bottom line shows that not only are high school students taking harder math courses, but high schools are also judging that they are doing better–since 1990, their average math GPA has increased by 18 percent from less than 2.2 to over 2.6.

Yet, increasing rigor in course taking and increasing grades do not square with the data from LTT or PISA showing that high school students have not improved their math knowledge or skills. The final piece of evidence showing the disconnect between the apparent increase in rigor and the lack of learning again comes from the LTT data.[5] Figure 5 shows the average NAEP score of students at each level of math-course completion. Students who stopped at Algebra I, geometry, and Algebra II all scored lower on NAEP in 2008 than the students enrolled in the same courses in 1978. The only bright spot is that students completing calculus now do about as well as their peers from thirty years ago.

Mitigating Circumstances

There are at least four mitigating factors that should be considered as we look at this disconnect. First, there is some debate about the validity of NAEP assessments for older students revolving around their motivation to take these exams seriously. While nine- and thirteen-year-old students take most exams seriously, jaded seventeen-year-old students may be bored with tests and, the argument goes, will not take a low-stakes test such as NAEP seriously. The National Center for Education Statistics (NCES) has conducted studies on this point and, while there is evidence that older students may be less motivated to take the exams compared to younger students, there is no consistent evidence on the impact of motivation on scores. There is also no evidence whatsoever that motivation has become a more serious issue in the last few years, during which time scores have remained flat.

Second, there is evidence of what is known as Simpson’s paradox. LTT reports scores for three major racial and ethnic groups–whites, blacks, and Hispanics–going back to 1973. In each group, we find increases in scores, but there has been no overall increase. Here is the paradox: how can all three of the major student groups increase their performance while the overall scores remain flat? The answer is in the declining proportion of higher-performing white students in the overall student population coupled with the rapid increase in the lower-performing Hispanic population. (The proportion of the K-12 population that is white declined from 78 percent in 1973 to about 57 percent in 2006 while the Hispanic proportion increased from 6 percent to around 20 percent. The black proportion has remained flat at around 15 percent throughout this period.) Thus, one could argue that had the American population not changed, the overall scores of our seventeen-year-old students would have increased. But the American population did change, and it changed throughout the K-12 system. Despite these changes, which affected schools at all levels, the LTT scores for nine- and thirteen-year-old students did increase, but our high schools failed to meet the challenge of America’s changing demography.
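The arithmetic behind the paradox is easy to see with a purely hypothetical sketch (the group shares and scores below are invented for illustration only and are not NAEP figures): if a higher-scoring group shrinks as a share of test takers while a lower-scoring group grows, every group's average can rise even as the weighted overall average stays flat or falls.

# Hypothetical illustration of Simpson's paradox: both groups improve,
# but a shift in population shares keeps the overall average from rising.
# Shares and scores are invented for illustration; they are not NAEP data.
def overall(groups):
    # Population-weighted average: sum of (share of test takers) * (group score)
    return sum(share * score for share, score in groups.values())

earlier = {"higher-scoring group": (0.80, 310), "lower-scoring group": (0.20, 260)}
later   = {"higher-scoring group": (0.60, 316), "lower-scoring group": (0.40, 270)}

print(overall(earlier))  # 0.80*310 + 0.20*260 = 300.0
print(overall(later))    # 0.60*316 + 0.40*270 = 297.6

In this toy example, each group gains several points, yet the overall average slips from 300.0 to 297.6 because the weights changed; this is the same mechanism that can hold overall seventeen-year-old scores flat while every subgroup improves.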

Third, there is what I will call the “tsunami” excuse–the forces pushing high schools to engage in an illusion of rigor were overwhelming. If policymakers decide that a mark of a successful high school career is completion of Algebra II, then schools enroll more students into a course called Algebra II. But not all math courses are equal–and it is easier to rebrand courses and still teach low-level math than it is to increase the rigor of math instruction.

Finally, we must note the role of the states. States have the authority to determine curriculum and set standards to which individual schools should conform. Both NCES and the Thomas B. Fordham Foundation have demonstrated how states set “proficiency standards” that are often below NAEP’s basic cut point.[6] Secretary of Education Arne Duncan has called this “lying to children and their parents because states have dumbed down their standards.”[7] In short, even as a wide swath of policymakers have demanded more rigor from high schools, most states have not stepped up to keep their high schools on track in delivering a more rigorous math curriculum that works.

More Math Courses, More Rigor, and No More Success

Here then is the evidence: high school students are taking more math courses with titles that imply more rigor. More of our high school students are getting through Algebra II and calculus, while fewer and fewer of them are stopping at general math and Algebra I. And transcript data show that even as they take more difficult courses, they are earning higher grades. All of these indicators reflect what schools claim to be delivering.

Objective indicators, however, show that the learning being delivered is faulty: LTT data show that, overall, while the math skills of elementary and middle school students entering high schools have improved, what American high school students know and what they can do in math have barely changed over the course of thirty years and not at all over the last fifteen. And when we step outside the United States to compare our high school students to students in other advanced industrial countries that are our peers and our competitors, the picture is also grim.

A delusion of rigor? As attorneys say: res ipsa loquitur–the thing speaks for itself.

Mark Schneider (mark.schneider@aei.org) is a visiting scholar at AEI and a vice president at American Institutes for Research.

Notes

1. U.S. Department of Education, National Center for Education Statistics (NCES), The Nation’s Report Card: Long-term Trend 2008 (Washington, DC: Department of Education, April 2009), available at http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2009479 (accessed September 23, 2009).

2. For more information, see U.S. Department of Education, NCES, “Program for International Student Assessment (PISA),” available at http://nces.ed.gov/Surveys/PISA (accessed September 23, 2009).

3. These data come from the 2005 report. See U.S. Department of Education, NCES, “NAEP High School Transcript Study,” available at http://nces.ed.gov/nationsreportcard/hsts (accessed September 23, 2009).

4. Each Carnegie credit represents 120 hours of classroom instruction.

5. These data were generated using the National Assessment of Educational Progress (NAEP) Data Explorer. See U.S. Department of Education, NCES, “NAEP Data Explorer,” available at http://nces.ed.gov/nationsreportcard/naepdata (accessed September 23, 2009).

6. U.S. Department of Education, NCES, Mapping 2005 State Proficiency Standards onto the NAEP Scales (Washington, DC: Department of Education, June 2007), available at http://nces.ed.gov/nationsreportcard/pubs/studies/2007482.asp (accessed September 23, 2009); and John Cronin, Michael Dahlin, Deborah Adkins, and G. Gage Kingsbury, The Proficiency Illusion (Washington, DC: Thomas B. Fordham Institute, October 2007), available at www.edexcellence.net/detail/news.cfm?news_id=376 (accessed September 23, 2009).

7. Maria Glod, “46 States, D.C. Plan to Draft Common Education Standards,” Washington Post, June 1, 2009.