This article first appeared in the St. Louis Beacon: Anyone who wants to play what Normandy’s school superintendent calls the “MSIP game” had better make sure they know the rules.
Since his district’s annual performance review score was revealed last month to be 11.1 percent – lowest in the state – Ty McNichols and other Normandy administrators have been poring over the numbers, trying to determine the best way to rise out of unaccredited territory by achieving a score of at least 50 percent.
The easy answer, of course, is for the district to improve in the areas that MSIP5, the latest version of the Missouri School Improvement Plan, measures: academic achievement, college and career readiness, attendance and graduation rate. Top-flight performance in such basic categories is what every teacher, principal and superintendent aims for.
But the complicated nature of MSIP5 can mean that simply doing better may not be enough to score any points, let alone enough to raise your percentage score to an acceptable level. The new evaluation process measures not only how well you do but how far you’ve come since the last time your district was evaluated.
Measuring the progress of a district and the growth of individual students can be a valuable tool in tracking performance, says Kathleen Sullivan Brown, an education professor at the University of Missouri-St. Louis. But the failure to improve enough can leave the impression that no learning is going on at all, and a system that is too difficult to understand doesn’t help the public figure out how well the schools in their neighborhood are doing.
“I like certain things about it,” Brown told the Beacon during a detailed look at MSIP5 and its first round of results. “I like the fact that you do get credit for growth, but I’m surprised that so many districts didn’t. I like the fact that it is more comparable, because that’s what people want.
“But it is an arcane system they have created. The problem is it has these benchmarks. If you don’t meet these benchmarks as a group, you don’t get any points. That seems unfair. It’s just not a very straightforward system. Whenever I see systems like that, I wonder why aren’t we being more transparent about this. Why do we make it so difficult for people to understand what this means? It seems to me we sometimes unnecessarily confuse matters and obfuscate.”
The Beacon asked Brown to go through the system in general and the Normandy numbers in particular to point out where opportunities lie and where the new system could improve to make it more fair and more accessible.
Asked for a good way to explain some of the problems, particularly in areas where the MSIP5 report showed a stark row of zeroes, Brown came up with this analogy from the world of business:
“If you’re a car salesman, and your team doesn’t meet its quota at the end of the month, you don’t get any points toward a bonus at the end of the year. It’s kind of like that.”
Ready for a plunge into the depths of MSIP5? Here we go.
ELA, MPI, FAY and MSIP
At a school board meeting last month, a Normandy administrator presented breakdowns of the district’s MSIP5 numbers that had been released a week earlier. Included in her slide presentation was an explanation of a graphic – something that at times needed straight translation, or at least a slow, careful, step-by-step explication of exactly what the figures mean and, perhaps more importantly, what Normandy could do to improve at its next evaluation.
Here is what the explanation said:
“In the area of academic achievement, no points were awarded for status as all Missouri Performance Index or MPI scores were at the floor level. MPI scores below 300 are considered at the floor. MPI scores are calculated by awarding 5 points for each advanced score, 4 points for each proficient score, 3 points for each basic score and 1 point for each below basic score. The total is divided by the number of students tested and multiplied by 100. This results in a score between 100 and 500.
“The awarding of three points for students at the basic level is an important change as the focus is not just on students being proficient and advanced but recognizes the work that districts are doing to move students out of the basic level and towards proficiency. In ELA, six growth points were awarded. Growth points are awarded based on the growth of individual students between two points in time. In mathematics the maximum number of progress points were earned. Progress points are awarded based on the percentage of increase in MPI from year to year.
“Additionally the green denotes that in mathematics the level of points earned falls within the accredited range. One change from MSIP IV to MSIP 5 is that only the scores of students in attendance for a full academic year or FAY are used in the calculation.”
Got that?
If not, here is what it means, according to Brown and to the explanation from a 102-page comprehensive guide to MSIP5 available on the website of Missouri’s Department of Elementary and Secondary Education (that’s DESE, for those of you playing acronym bingo).
You can also consult an earlier Beacon graphic by presentation editor Brent Jones breaking down the MSIP5 accreditation scorecard.
Know your abbreviations
MSIP5: the latest version of the Missouri School Improvement Plan, which measures academic achievement, college and career readiness, attendance and graduation rate
DESE: Department of Elementary and Secondary Education
FAY: Full Academic Year
ELA: English language arts
MAP: Missouri Assessment Program
MPI: MAP Performance Index. “The index approach calculates the movement of students throughout all MAP achievement levels.”
EOC: End of course
The MPI, or the MAP Performance Index, is designed to boil down to a single number how well a district, a school or a group of students is progressing toward Missouri’s goal of being in the top 10 states in education performance by the year 2020.
MAP scores are ranked based on where they fall on a four-part scale: advanced, proficient, basic or below basic. Points are awarded for each step on the scale: 5 for advanced, 4 for proficient, 3 for basic and 1 for below basic.
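To see how the pieces of the “status” calculation fit together, here is a rough sketch in Python. The point values, the division by students tested and the 300-point “floor” come from the district’s explanation quoted above; the student counts are made up purely for illustration.

```python
# Point values for each MAP achievement level, as described in the
# district's presentation (note that below basic earns 1 point, not 2).
POINTS = {"advanced": 5, "proficient": 4, "basic": 3, "below basic": 1}

def map_performance_index(counts):
    """Return the MPI for a group of full-academic-year (FAY) students.

    counts maps each achievement level to the number of students scoring
    at that level. The result falls between 100 and 500.
    """
    total_points = sum(POINTS[level] * n for level, n in counts.items())
    students_tested = sum(counts.values())
    return total_points / students_tested * 100

# Hypothetical building: 10 advanced, 40 proficient, 100 basic, 50 below basic.
example = {"advanced": 10, "proficient": 40, "basic": 100, "below basic": 50}
print(map_performance_index(example))  # 280.0 -- below the 300 "floor," so no status points
```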
Students’ growth from one year to the next is also figured into the calculation, which is one of the major changes in MSIP5 from earlier versions of the state’s evaluation process. It is designed to reward districts that have helped students move up, particularly those climbing up to the basic level from the bottom rung.
Growth points are awarded on the basis of upward movement by individual students; progress points are awarded on the basis of upward movement by a district overall.
So, as the presentation at the board meeting showed, because the district’s MPI scores in English, science and social studies dropped over the past three years, and its score in math was only slightly better, it received no points in those categories – a big blow when it came to adding up its final score, which was only 15.5 out of a possible 140.
Academics count for 70 of those points, or half the total. With no points for academic achievement or graduation rate, Normandy gained its only points for college and career readiness (8) and attendance (7.5).
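The arithmetic behind the headline number is simple enough to check. The point totals below come from the district’s annual performance report; the formula, points earned divided by points possible, is an assumption, but it matches the reported 11.1 percent.

```python
# Normandy's points earned in each MSIP5 area, per the annual performance report.
points_earned = {
    "academic achievement": 0.0,
    "graduation rate": 0.0,
    "college and career readiness": 8.0,
    "attendance": 7.5,
}
points_possible = 140  # academics alone account for 70 of these points

apr_score = sum(points_earned.values()) / points_possible * 100
print(f"{apr_score:.1f} percent")  # 11.1 percent -- the lowest in the state
```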
At the board meeting, McNichols explained that a close analysis of Normandy’s numbers showed that small improvements in a number of areas could reap a significant increase in the number of points the district would earn.
His goal, he told the board, was a 3 percent increase in the academic areas that are tested, to move Normandy students out of the lowest category and into the area where the district would be awarded points for academic achievement. He also cited ways the district could get more points in the other areas that are counted, such as attendance, graduation rate and college and career readiness.
“We’ve been knocking at the door, but we haven’t crossed the threshold,” is how he explained the district’s academic achievement at a public forum sponsored by the Show-Me Institute earlier this month.
He also said that the improvements he hopes to see in test scores from students who have remained in the district, rather than transferring, may not be easy to attain because of the economic hit Normandy will take in paying tuition and transportation for the students who chose to leave under the newly upheld state transfer law.
McNichols said the district stands to lose $15 million and could be bankrupt by March. It is studying ways to cut its budget, most likely by eliminating staff, because salaries are a school district’s biggest expense. But a reduction in the number of teachers isn’t particularly compatible with the goal of academic improvement, he said.
“We can’t do less and expect more,” McNichols said.
Data, information and judgment
The emphasis that McNichols put on teachers and the effect they have on student success could also be used by DESE in the way it presents MSIP5 data to the public, Brown told the Beacon.
Too often, she said, the numbers included in a district’s annual performance report are overwhelming, in a numbing, eyes-glaze-over way that tends to leave those who aren’t schooled in what they mean out in the cold.
“It’s turning kids into points,” Brown said. “I like that they moved away from just 14 indicators and are focusing more on real core concerns, but now they’re setting kids up to generate these point values, then saying not enough of you got it, so you don’t get any points. That’s a horrible message to send to kids.”
That may not be the intent of state education officials, Brown said, but it frequently is the result. The answer, she added, is to find a way to make the annual data dump understandable yet meaningful, simpler but not simplistic – a task that she admits is not easy.
“There’s data, there’s information and there’s judgment,” she said. “Just because you have a whole bunch of data doesn’t necessarily mean that you understand that problem. You need to have the right data. You need to have the right solutions.
“Forget what it means to DESE. What does it mean to the kids? There is too much emphasis on the district. The emphasis has to be on the students. To get them from here to here, what needs to happen?”
Making the scores more meaningful, Brown said, would require drafting a system that is more relevant to the daily experience of students and their families.
“They say whether you’re on track to get somewhere by 2020,” she said. “But if you’re a second grader or a third grader, that means nothing to you if you’re not learning to read. We’ve got to start thinking about 2013 and 2014.
“Policies should be useful, something that is transparent and tells people, if you have a question, I’ll try to provide some data that give you an answer. If all you are doing is muddying the waters, that is not good policy, not doing what an education policy is supposed to do: help us understand and make things better.”
She held up the front page of the Post-Dispatch on the day the annual performance report tallies were made public, with a bold headline reading “THE GRADES ARE IN,” to highlight another point: The final numbers may help make facile comparisons of one district to another, but whatever nuance can be gained from diving more deeply into the statistics is often lost because the system is too difficult to follow.
Brown likes the emphasis on growth, and she likes the fact that the numbers are available down to the level of individual school buildings. And she assumes that after this first year’s experience, fine-tuning will be done to the system that evolved from many meetings of educators throughout the state.
But, she added, striking the balance between a system that is meaningful and one that is easy for the public to understand will not be easy.
“That’s the problem with policies,” she said. “You take away all of the judgment when you try to put in some kind of algorithm.”
One measure that is already available and has been shown to be a good barometer of school success, Brown said, is the ACT score. But not everyone takes the exam because not everyone is heading for college. She thinks that should change.
“I think some of the best data we have in Missouri is the ACT score,” she said. “It’s a stable, steady test result that has been out there for years. It would make sense if we made all students take the ACT, like they do in Illinois, and we pay for it. We would have a single, credible test score for everybody.”
As McNichols told his school board last month, taking the ACT could also give kids a focus on college they might not have had before. In districts like Normandy, Brown said, that emphasis could be a real plus when so many other factors come into play.
“Doing urban education is a very difficult challenge,” she said. “You can’t take your eye off the ball in any one place. You have to watch the teachers, you have to watch the students, you have to keep track of what’s happening with parents. You’ve got to have an environment that’s positive.
“These systems tend to want to point us to ‘If you just fix this over here, everything else will fall into place.’ But the history of this kind of thing is if you focus on one thing, something else slides.”