Imagine you’re a parent of a seven-year-old who has just come home from school with her end-of-year report card. The report card provides marks for only two subjects, and for children in grade levels different from hers. Furthermore, there's nothing on the report card to indicate how well these children have been progressing throughout the year. There are no teacher comments, like "great participation in class" or "needs to turn in homework on time." And to top it off, the report gives a far harsher assessment of academic performance than reports you've gotten from other sources.
That's just the sort of "report card" that was handed to America yesterday in the form of the National Assessment of Educational Progress. And while the NAEP is all well and good for what it is -- a biennial norm-referenced, diagnostic assessment of fourth and eighth graders in math and reading -- the results of the NAEP invariably get distorted into all kinds of completely unfounded "conclusions" about the state of America's public education system.
"Nation's Report Card" Is Not A Report Card
First off, let's be clear on what the NAEP results that we got yesterday actually entail. As Diane Ravitch explains, there are two different versions of NAEP: 1) the Main NAEP, which we got yesterday, given every other year in grades 4 and 8 to measure national and state achievement in reading and math based on guidelines that change from time to time; and 2) the Long-Term Trend NAEP given less frequently at ages 9, 13, and 17 to test reading and math on guidelines that have been tested since the early 1970s. (There are also occasional NAEPs given in other subjects.) So in other words, be very wary of anyone claiming to identify "long term trends" based on the Main NAEP. This week's release was not the "long term" assessment.
Second, let's keep in mind the NAEP's limits in measuring "achievement." NAEP reports results in terms of the percentage of students attaining Advanced, Proficient, Basic, and Below Basic levels. What's usually reported by the media is the "proficient and above" figure. After all, don't we want all children to be "proficient"? But what does that really mean? Proficiency as defined by NAEP is actually quite high -- much higher, in fact, than what most states require, and higher than the standards used by other nations such as Sweden and Singapore.
Third, despite its name, NAEP doesn't really show "progress." Because NAEP is a norm-referenced test, its purpose is comparison -- to see how many children fall above or below a "cut score." Repeated administrations of NAEP provide periodic points of comparison of the percentages of students falling above and below the cut score, but does tracking that variance really show "progress"? Statisticians and researchers worth their salt would say no.
Finally, let's remember that NAEP proficiency levels defined the targets that all states were to aim for under the No Child Left Behind legislation. That policy has now been mostly scrapped, or at least significantly changed, because its proficiency goals were widely called "unrealistic."
Does this mean that NAEP is useless? Of course not. As a diagnostic tool it certainly has its place. But as the National Center on Fair and Open Testing (FairTest) has concluded, "NAEP is better than many state tests but is still far from the 'gold standard' its proponents claim for it."
NAEP Results: "As Modest As It Gets"
So what to make of this year's results? Not much, according to most reports, which tend to echo Secretary of Education Arne Duncan's statement that "modest increases in NAEP scores are reason for concern as much as optimism." Or, then again, maybe neither?
Writing at the blog of the Albert Shanker Institute, Matt Di Carlo sums up yesterday's NAEP results this way:
NAEP results indicated a “significant increase” in fourth and eighth grade math and eighth grade reading, but in all three cases, the increase was as modest as it gets – just one scale score point, roughly a month of “learning.” Certainly, this change warrants attention, but it may not square with most people’s definition of “significant” (and it may also reflect differences in the students taking the test).
So there you have it.
Nevertheless, the modest results yielded by NAEP don't stop Beltway-based journalists and edu-pundits from projecting their favorite policy idea onto the high-profile Rorschach test that NAEP has become.
As Di Carlo predicted in a different post, "People on all 'sides' will interpret the results favorably no matter how they turn out."
Sure enough, as soon as NAEP results were made public, the Washington Post's Valerie Strauss -- no fan of the education "reform" movement -- declared on her blog that the modest showing was yet more proof that test-based reforms were having no effect on improving schools. She writes, "Someone should be printing up a T-shirt about now that says: 'My nation spent billions on testing and all I got was a 1-point gain.'"
Conversely, school reform enthusiasts at the Education Trust used NAEP's modest showing as a pretext for declaring that school reforms are "not moving fast enough."
Meanwhile, In The Real World
All this back and forth about "what NAEP results really mean" would be just as entertaining as your favorite daytime soap opera if it weren't for the fact that significantly more consequential problems are crashing down on the nation's schools.
While edu-pundits get a sugar high from statistical tables and charts, Americans on a strictly meat-and-potatoes diet are getting a very different view of the world. While Beltway policy wonks argue over minuscule data points, families are coping with the immediate reality of school cuts that are robbing their children of real learning opportunities.
During the run-up to the NAEP carnival, parents and teachers in Cleveland had far more important concerns about budget cuts denying their students opportunities to ride the bus, play sports, participate in school plays, and go to summer school. Parents in Pasco County, Florida wondered about the consequences for their kids now that art instruction has become rare and PE classes have been cut to once a week. Elsewhere in Florida, thousands of middle and high school students fail to grasp the importance of NAEP as they walk hazardous routes to school because there's not enough money to pay for bus services.
One wonders how all those who fancy themselves to be such clear-minded analysts of "real data" from NAEP feel about a new report from California indicating that "only about one in 10" elementary school students in the state get adequate instruction in science.
This kind of information from the reality community is often dismissed as being "anecdotal." But it's not "anecdotal" when it's your kid. Furthermore, when anecdotes become this frequent, it's called a trend.
Another popular notion among the political class is to parry concerns about students' everyday lives in school with arguments about the need to shift the focus from "inputs" -- what kids actually do during the school day -- to "outputs" such as test scores. But we know that the inputs of a well-rounded curriculum that includes art, music, PE, and science undoubtedly have real consequences in our children's lives. So why do we confine our attention to outputs, such as NAEP, that completely ignore that?
What all the fetishization of NAEP results amounts to is so much scrutiny of the dipstick while the wheels are falling off the car. Can you say, "Out of touch?"