12/15/2005

S&P Levels the Playing Field -- And Lowers Expectations

I'm not sure that I like what many states will do with S&P's new poverty-adjusted version of the 2005 NAEP results, which I fear will be to say some version of "Our kids are doing well relative to other poor kids," or "We're doing as well as can be expected, given our student demographics." The Stateline overview of the report (with links to the actual thing) is here. Yuk.

While I don't think it's really happened, lowering expectations for poor and minority kids is exactly what some educators feared would result from the disaggregation requirements of NCLB. But NCLB makes clear that the expectations are the same for all subgroups.

The S&P analysis doesn't, and as a result could be used for expectations-lowering in the 11 states (Florida, Kansas, Kentucky, Massachusetts, Minnesota, Montana, North Carolina, New York, Oregon, South Carolina and Texas) that "do better" on S&P's poverty-adjusted ratings.

6 Comments:

Anonymous Anonymous said...

Alexander:

We appreciate your interest in the risk-adjusted NAEP analysis that we recently published, and understand your concerns about how such information may be misused. We tried to deal with this squarely in the paper by reminding the reader that statistical expectations are quite different from policy expectations; beating statistical expectations should not be seen as "good enough." It is this concern -- that "leveling the playing field" will "lower expectations" -- that makes some standards supporters uncomfortable with new techniques like value-added performance measurement as well.

The reason that we insist on doing this kind of analysis is that we firmly believe in the value of performance benchmarking as a tool to foster improvements over time. Without adjusting for student poverty, low-performing states have little practical guidance for where to turn to find strategies to improve their NAEP results. As is implied in the paper, the idea that lower-performing (but high-poverty) states like Mississippi and New Mexico should be expected to learn from higher-performing (but low-poverty) states like Connecticut and New Hampshire seems dubious, given that their performance differences might be due more to demographic differences than differences in educational strategies.
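To make the idea concrete, here is a minimal sketch (deliberately simplified, not our published methodology, and using made-up numbers) of what adjusting for poverty can look like: regress each state's score on a poverty proxy and read each state's residual as its performance relative to statistical expectation.

```python
# A minimal sketch only -- NOT S&P's published methodology, and the data are invented.
# Idea: fit a line relating a poverty proxy to scores, then treat each state's
# residual (actual minus predicted) as performance relative to statistical expectation.
import numpy as np

# Hypothetical illustrative data.
states = ["State A", "State B", "State C", "State D", "State E"]
scores = np.array([235.0, 241.0, 228.0, 248.0, 232.0])      # NAEP scale scores
pct_low_income = np.array([55.0, 40.0, 62.0, 28.0, 50.0])   # % low-income students

# Ordinary least-squares line: expected_score = intercept + slope * pct_low_income
slope, intercept = np.polyfit(pct_low_income, scores, deg=1)
expected = intercept + slope * pct_low_income

# Positive residual = beating the statistical expectation; negative = falling short.
# A statistical expectation is not a policy expectation: beating it is not "good enough."
for name, actual, exp in zip(states, scores, expected):
    print(f"{name}: actual {actual:.0f}, expected {exp:.0f}, difference {actual - exp:+.1f}")
```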

In fact, we believe it is a false dichotomy to choose between holding all states, districts, and schools to one standard and considering performance on a risk-adjusted basis; there is value in doing both. The former reminds everyone of the end goal and the size of the remaining challenge to get there, while the latter provides insight into the obstacles that stand in the way, as well as examples of those that are managing to overcome them. In this sense, putting states on a "level playing field" can help cut through the excuses typically offered in response to uniform standards -- "this isn't fair; our kids are different" or "this is the best that can be expected, given the types of kids we have." Adjusting for these challenges uses data to answer those excuses by demonstrating what is currently being achieved. It is the equivalent of establishing a baseline and using the exceptions to facilitate a positive change in the baseline.

The analysis that we do is intended to start conversations by making clear where lower-performing states, districts, and schools can go to find promising practices to replicate as they seek to improve. We encourage you to be vigilant and to help us guard against this kind of analysis instead being used to stop conversations, by treating our findings as conclusions (rankings or ratings) that judge states, districts, or schools to be doing a "good job," where a "good job" translates into a "good enough job" or, worse, the "best that can be expected."

However, we do not think the answer is to shy away from this kind of analysis because of the risks. The problem with sticking to a single representation of performance, in which every school, district, or state is compared to a common goal regardless of its challenges, is that it serves only to exhort better performance ("you must do better"); it does not provide much practical information about how to get from current levels of performance, unacceptable though they may be, to desired levels. The usual response is to offer case-study examples of the exceptions, but this is not a systematic approach, and it often allows defenders of the status quo to emphasize the factors that make these entities "exceptional," as in different from others and therefore not a source of replicable practices.

In summary, we wish to measure the relationship between poverty and performance with the explicit goal that this way of thinking can help facilitate the reduction and elimination of this relationship over time, on the grounds that the best way to eliminate the problem is to first understand the problem as it exists.

Thank you again for raising the issues, and for helping us to ensure that this kind of analysis is used correctly to foster improvements over time.

Paul Gazzerro
Director of Analytics, Standard & Poor's School Evaluation Services

3:14 PM  
Blogger Alexander Russo said...

Thanks for these thoughtful comments, Paul, and I agree that there is some use to risk-adjusted analysis, especially when it comes to encouraging states with similar demographics but different performance levels to look hard at what they're doing.

However, I remain concerned that your intended message (and the distinction you make between statistical and policy expectations) won't come through. It's a slippery slope.

If, for example, you were to do the same analysis on the urban NAEP scores, I can't help but believe that some districts would use the results for self-vindication rather than self-improvement. I've already heard chatter to that effect, and it seems a shame.

2:12 AM  
Anonymous Anonymous said...

The other benefit of this type of SES-adjusted analysis is that it shows that most high-performing schools are merely coasting on their high SES, especially when you look at the disaggregated NCLB data showing that these school districts do as poorly with low-SES kids as the failing school districts do.

10:55 AM  
Blogger Alexander Russo said...

True enough: many higher-performing schools don't have many low-SES students and/or don't do well with the low-SES kids they have, a fact not revealed until NCLB came along. And the S&P report does highlight that states without many low-SES kids don't face as many challenges as others that do.

However, unless I've missed something, the S&P report doesn't show disaggregated performance levels the way NCLB does for subgroups. So states don't actually know how they're doing with various subgroups on the NAEP. And, my main point, they may take the results as an excuse rather than a prompt for needed improvement.

11:03 AM  
Anonymous Anonymous said...

That's right, and it's why I pointed to "this type of SES-adjusted analysis" rather than S&P's actual analysis. S&P provides a valuable service, but then fails to do meaningful analyses or to make the data readily available in tabular form for users who want to analyze it for themselves. Let's also not forget that S&P has offered the "demographically similar schools" comparison option since its inception, which provides schools with the same type of cover you've noted.

9:36 AM  
Blogger Alexander Russo said...

Here's another issue surrounding the S&P analysis and how it will be used:

"Undoubtedly, the Texas Education Agency will tout Texas' performance on the Standard and Poor's NAEP Report Card which has just been announced today. However, Standard and Poor has dug a little deeper than most and has put an asterisk beside those states which have chosen to exclude large percentages of students' test scores from the mix. Of course, excluding large percentages of students' test scores would..."

"Response to Standard & Poor's NAEP Report Card"
Monday, December 19, 2005
http://www.educationnews.org/Response_to_Standard_Poors_NAEP_Report_Card.htm

9:13 AM  
