America doesn’t have a good relationship with
color-coded systems, as previous experiments have shown, but apparently that isn’t stopping the California State Board of Education from introducing school report cards that also look like fodder for late-night TV talk shows. The board is in the middle of revamping the state’s school accountability system. With a September deadline looming, it has released draft templates of charts intended to replace the single-number Academic Performance Index, which was largely based on student performance on annual standardized tests. If a sharply worded editorial in
The Los Angeles Times is any indication, the board needs to go back to the drawing board. While the board is to be commended for trying to rate schools with a variety of measures beyond test scores, its efforts confuse more than they illuminate. The headlines of articles written about the new report cards sound alarm bells. To wit,
“How to Decipher the State’s Proposed School and District Report Cards,” or
“‘Get to Green’: California Wants to Grade School Performance With Colors Instead of a Single Number.” If a veteran education reporter like John Fensterwald of EdSource has to tell people how to “decipher” publicly viewable data—data that is supposed to be clear and accessible to anyone—then the data’s purpose is lost. Ryan Smith, executive director of Education Trust-West, said in a conversation with me that the report cards suffer from too much information. “While I understand the need for multiple measures, and I agree it makes a lot of sense to have multiple measures, we can’t data dump 30 to 40 indicators at the feet of parents and communities,” he said. And what a dump it is, as the Times editorial outlines:
There are nine different categories for measuring schools, with only one of those being how its students scored on the standards tests. Others include “basics” (such as having adequate textbooks and facilities) and “implementation of academic standards.” Each category is ranked by how high a priority it is for that particular school. And each category has two colored boxes. And there are six possible colors for each box. We’ve got this much: Green is good. Red is bad. Yellow is somewhere in between. Just like traffic lights. It’s hard to ascertain what the rest mean, but there’s a separate chart showing all the colors that is supposed to give you an idea of what they stand for. But it doesn’t, really.
Got that? For all the detail the report cards provide (not to mention they
burst with colors not out of place in a game of Twister), parents still can’t get a succinct answer to their most pressing question: Is this a good school?
In attempting to evaluate schools with more nuance, the board has overcorrected to the point of opacity. Categories, numbers, and colors blur together until the data cease to mean anything:
One reason the charts are so over-complicated is that they’re being larded with too many factors that don’t reveal how well students are learning. Among the elements being judged: parent involvement, suspension rates, graduation rates and the like. These may be important aspects of school life, but they are a means to an end—better-educated students. If a school doesn’t suspend any students, but also doesn’t improve academic outcomes, why should it get credit for hollow achievements?
The cards are yet more evidence of
California’s unwillingness to make education data transparent, obtainable and easy to understand for all. Despite three years of work, much remains to be done, says Smith. “We still have to figure out performance standards. We need to be clearer about what triggers supports and interventions when we see one or more subgroups are not performing. Those things are still unclear.” The board has about two months to fix the report cards and
establish an accountability system that has been missing for three years. This mishmash of boxes, numbers and colors doesn’t bode well for California’s schools.