The short version of the change in their program is that they are moving from a 14-point scale to a 150-point scale. In the past, schools that earned 13 of 14 points received accreditation. Now they must earn at least 105 of 150 points to be accredited. Districts receiving between 75 and 104 points will have provisional accreditation. As a further incentive, districts that receive more than 90% of the points will be "accredited with distinction."
Why the change? The claim is that the new system will "intensify pressure on low-performing school districts to improve, while exposing even the best schools to new scrutiny from parents and the public." In practice, districts will now be able to compete more because there will be a wide range of scores in the accredited column. DESE plugged three years' worth of data into the new system to give districts an idea of how they would rate. The examples given in the Post-Dispatch were Mehlville, Parkway, Pattonville and Rockwood. All four had perfect scores under the old rating, but under the new scale their scores would be 97.9, 98.2, 87.5 and 96.8 percent, respectively, based on this latest performance data.
Everyone keeps saying competition among the schools is good. The new system begins to look a little like Olympic scoring where bragging rights can depend on the tenth of a point earned in some small aspect of the grading. Lower performing schools have many more areas they can work on to up their scores which, in theory, should help them.
Unfortunately for lower performing schools, like the St. Louis City schools, the reality is that the new system will probably strip away their newly acquired provisional accreditation. One reason, which the comments in the Dispatch seem to support, is that some of the new categories to be rated contain bias against districts like St. Louis Public Schools.
Take the new way they will calculate attendance rate. Under the old system, schools reported the percentage of students in school each day. The new system requires districts to calculate what percentage of students are in school 90% of the time, to focus attention on the kids who are missing the most school. To increase their score, districts will have to find a way to get those kids to come to school.
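The difference between the two metrics can be sketched with a small calculation. This is an illustration only: the attendance figures below are made up, and DESE's exact formulas may differ; the point is that the same roster can score well on average daily attendance while scoring much lower on the share of students present 90% of the time.

```python
# Illustrative data: each value is the fraction of school days a student attended.
# (Hypothetical numbers, not from any real district.)
attendance = [0.98, 0.95, 0.92, 0.85, 0.70, 0.99, 0.91, 0.60]

# Old metric: average daily attendance across all students.
avg_daily = sum(attendance) / len(attendance)

# New metric: share of students in school at least 90% of the time.
share_90 = sum(1 for a in attendance if a >= 0.90) / len(attendance)

print(f"Average daily attendance:          {avg_daily:.1%}")   # 86.2%
print(f"Students attending 90%+ of days:   {share_90:.1%}")    # 62.5%
```

The new metric is dragged down by the handful of chronically absent students, which is exactly why districts serving families with attendance barriers stand to lose the most points under it.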
Suburban school districts, whose parents typically have more education and are more tuned in to their children's schooling, have very good attendance rates no matter which way they are calculated. Urban district families face a host of problems that affect whether or not children attend school. Parents typically have less education, and many pay less attention to their children's education as a result. One Ferguson-Florissant parent wrote, "I have witnessed a very competent teacher who after she had a terse encounter with a parent throw her arms up and exclaim publicly for all nearby to hear, 'How can we expect kids to learn when these are the parents in our school?' "
Children's home life can be chaotic, making getting to school challenging. Health care, for now, may be lacking, causing more illness. These and other factors endemic to urban life affect how many kids attend school and how often. I find it a little scary that anyone wants to give schools the power to change these things; that seems well beyond the current authority of a school district. I would also expect this part of the accreditation process to be a constant point of contention for urban districts, which will rightly note that they are penalized more for it than their suburban counterparts.
The new system was a bit of a shock for administrators. It set the bar higher. But with Missouri ranking 41st in the nation in Ed Week's 2013 Quality Counts report, change would appear necessary. Setting our sights on a higher rating from Ed Week means hitting the categories noted in the chart below. Ed Week, funded by the Gates Foundation, is in essence telling states which things they need to focus on.
|Source: Ed Week Quality Counts 2013|
The other shift in focus of the new system is onto individual students and their achievement, making sure they leave the district college- or career-ready, rather than onto the overall results for the district. Yet ironically, what seems to be missing in DESE's and Ed Week's rating systems is the role of the student. Teachers are accountable, administrators are accountable, state legislatures are accountable, but the student is merely raw material to be transformed by the system into whatever the state needs. It's hard to see how scores in lower performing districts are going to change when the consumers of what they offer have no accountability and no incentive to change their behavior.