The Performance Management Framework is part of The Strategic Plan. It's intended to create additional transparency, further the goal of everyone being accountable, and basically... make stuff better. Individual schools and the district as a whole would receive clear, annual evaluations in the form of "report cards," allowing everyone to see how they're doing. High-performing schools making progress with students could earn additional autonomy, and we could all have a look at what they're doing that helps their students succeed. Schools with poor performance and struggling student progress would receive additional assistance and supervision from the central office. The MAP test would allow everyone to gauge how efforts are working, and make adjustments accordingly, at multiple points during the year. All participating schools would receive additional funding based on their performance needs.
This all sounds fine in theory. Don't you want struggling schools to get additional help so their students do better? Shouldn't schools that are doing well by their students get some additional decision-making freedom? Do you hate babies, puppies, mothers and apple pie? Hint: You should only be answering no to the last question.
The idea behind Performance Management makes sense. The transparency it would give to everyone - to staff at schools, to the central office, to the community - would be a good thing. The increased ability to see which efforts do and don't work, and where the sticking points are, could be great. And for those of us who only clean our proverbial houses (ahem. Not that I know anything about that in the literal sense) when we think someone might be coming over, it would hopefully promote increased accountability.
The SE Initiative, a district intervention in 3 schools (Aki Kurose, Rainier Beach HS and Cleveland HS), was the initial pilot for Performance Management.
All three schools have struggled with academics. Aki and Rainier Beach have been bleeding enrollment over the course of the last decade. The basic goal of the SE Initiative was to make things better at all three schools. Yes, the official wording was significantly more pompous than "make things better." The district explicitly stated that the SE Initiative would "inform the development of a comprehensive school performance framework." (p11, and yes, a long way of saying "it's a pilot")
So, did the SE Initiative work? On numerous fronts, no.
Enrollment is up a little bit at Aki and Rainier Beach. So that worked... a little. Even so, Aki and Rainier Beach are both woefully under capacity. I will briefly note that the glittery magic of functional capacity analysis reduced the capacity of under-enrolled schools like Aki and Rainier Beach, but conveeeeeeniently found that over-crowded schools like View Ridge and Schmitz Park had puh-lenty of room. Right. Anyway.
Academically? The program failed. Academics went up and down a little bit, but never hit the "met goal" mark (which meant coming within 10 points of the goal).
And transparency/accountability? Failed. The best document I have that reviews the overall performance of the SE Initiative is one I obtained through super-sleuthing (okay. Someone sent it to me). And it pretty much... sucks. Take a look at it. Three years, and the best analysis they can provide is "progress made (y/n)" and "goal achieved (y/n)"?
No line item for total annual project cost, much less a line item breakdown of how the money was spent at each school, or whether the spending moved the student achievement needle. No note of tactics. Just whether there was progress or if the target was met. And I'll say again: the target was considered "met" if they came within 10 points of it. That's much like saying I got an "A+" because I came within 10 points of it. Which would be a... "B+". Still. Whatever. Those are the metrics they settled on. Maybe that's okay.
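If you want to see just how low a bar that metric sets, here's a minimal Python sketch of the "met goal" logic as I read it - a target counts as "met" if the result comes within 10 points of it. The function names and the tolerance reading are mine, not the district's:

```python
# Hypothetical sketch of the "within 10 points counts as met" metric
# described above. Names are my own, not the district's.

def goal_met(actual: float, target: float, tolerance: float = 10.0) -> bool:
    """A target counts as 'met' if the result falls short by no more
    than `tolerance` points (my reading of the pilot's metric)."""
    return actual >= target - tolerance

def report_card_line(actual: float, target: float) -> str:
    """Reduce a year of work to the pilot's single yes/no column."""
    return f"goal achieved (y/n): {'y' if goal_met(actual, target) else 'n'}"

# A 87 against a target of 97 - a "B+" against an "A+" - still counts:
print(report_card_line(87, 97))  # goal achieved (y/n): y
print(report_card_line(80, 97))  # goal achieved (y/n): n
```

Four lines of logic; that's the entire analytical depth of the report card.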
How-ev-uh. Given the significant lack of relevant information, at some point, you'd think someone would say, "the analysis of this pilot sucks." And since the SE Initiative was intended to "inform" a larger, rigorous performance framework, who in hell would say "Looks great! Let's... expand this!" ?
Unfortunately, SPS's board and management.
In 2009-10, Performance Management expanded to add 9 more schools, bringing the total number of schools piloting the "framework" to 12. They were:
1. Dearborn Park Elementary
2. Maple Elementary
3. Olympic Hills Elementary
4. Roxhill Elementary
5. Schmitz Park Elementary
6. Stevens Elementary
7. Aki Kurose Middle School
8. McClure Middle School
9. Mercer Middle School
10. Ballard High School
11. Cleveland High School
12. Rainier Beach High School
Spending? Dunno. Well, publicly, dunno. The FAQ helpfully says that the Alliance for Education helped "support" the effort of creating the scorecards (dudes. For realsies? You needed a freaking grant to think up the scorecards?) with a grant of... oh, wait. They don't say how much it cost. However, a handy-dandy public records request indicated that the 2009-10 spending for Performance Management was $2.8 millllllllion ($1.5m from baseline and $1.3m from the Gates Foundation... which must have been passed through the Alliance for Education and then on to the district? Maybe. I dunno.)
Overall Goals? Dunno. Well... presumably to make things better for students? Let's go with that. We'll call "Make Stuff Better for Kids" a goal. In district-speak it would probably be more like "delivering on our commitment of ensuring a high quality school for every student so that all students graduate from high school prepared for College!, Career! and Life!"
Tactics Implemented to Achieve Goals? Dunno.
Any Tactics Changed to Optimize Results? Dunno.
To be fair(ish), it's possible that all of this will be reported on in detail in the Quarterly Strategic Plan Update. It's also possible that I will get my very own fairy godmother who will put the bedazzled-skull McQueens under my pillow the very same night that rigorous detail on Performance Management progress to date is publicly given. I don't know with absolute certainty whether either of these things will or won't happen. If you don't like obscenity, skip to the next paragraph, because I can't say this without swearing: I can sure as fuck guess what's likely, and sadly, it doesn't include a pair of kick-ass McQueens under my goddamn pillow.
So what do I know? I know that I can provide metrics (and a crappy chart!) for what I don't know, and then measure and benchmark the lack of information across multiple areas. Wooooooooo! Accountability, here we come!
Given the sketchy details and assessment for the SE Initiative, which involved 3 schools instead of 12, I'm not going to hold my breath that we're going to get more detail than I've provided in the above chart.
So. There seems to be a leeeeeeettle misunderstanding, by SPS management, of what a "pilot" is. In most places, a pilot is a small-scale project used to test the design/feasibility of a larger-scale project. Design issues in the pilot are often remedied if the project is moved forward to a larger scale. Often, a pilot provides quantitative proof of the viability of moving forward on a larger scale. This is how most people and groups think of pilot projects.
For SPS? We just keep expanding, baby. Changing course because of "data" and "results" is for sissies. And speaking of expanding...
For 2010-11, Performance Management will be rolled out in an additional 25 schools, bringing the total schools participating to 37. $5.1 milllllllllllion additional dollars will be allocated to participating schools, with grants ranging from $440,250 (Cleveland, with a 2009-10 FRL of 66.8%) to $24,000 (Brighton, 2009-10 FRL of 80.1%).
Given the financial shortfall in the operating budget, and the belt-tightening going on in schools across the district, you kinda have to ask: where's this five point one milllllllion dollars coming from?
As it happens, I know the answer to that, from page 10 of this presentation. It comes from the following funds: Title I ($2m), LAP ($0.5m) and FRL ($2.1m).
But, kids, that's going to have to be for the Crappy! Chart! Thursday! Fun with Performance Management History: The Sequel! That's right. There's (maybe. If I feel like it) going to be the first ever Crappy! Chart! Sequel!
It's historic. It's thrilling. It's sunny out.