School Performance Scores finally out?

Since the Louisiana Department of Education finally released its School Performance Scores (SPS) yesterday, I'm inspired to pass along some thoughts on how screwed-up the methodology is. Along the way I assign letter grades to several of the component indices, using the same scale as the composite SPS (below 50 F, 50-69.9 D, 70-84.9 C, 85-99.9 B, 100-150 A).
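
For readers who want to apply that scale themselves, here it is as a function; a minimal Python sketch with the cutoffs exactly as listed above:

    def letter_grade(index):
        """Map a 0-150 index value to the state's letter-grade bands."""
        if index < 50:
            return "F"
        elif index < 70:
            return "D"
        elif index < 85:
            return "C"
        elif index < 100:
            return "B"
        else:
            return "A"  # 100-150

    print(letter_grade(99.9))   # B
    print(letter_grade(100.0))  # A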

1) Timing:  The scores released October 24, 2013, depend on tests taken primarily before the end of the school year in May, perhaps with some summer retests thrown in.  The graduation data are a year older still.  How incompetent must the Department of Education be if it cannot release data before school starts again in the fall?  I'm stunned that legislators and citizens tolerate such a lag between actions and so-called accountability.

2) Missing schools:  1315 schools in Louisiana had SPS scores and grades assigned, and all but 5 of those used the same site code and school name as reported on the February 2013 enrollment count.  The five exceptions are in Lincoln Parish, where the Howard School (which appears to be at the Methodist Children's Home) and four university-affiliated lab schools (three at Grambling and one at Louisiana Tech, if I'm not mistaken) were removed from Lincoln Parish's district grouping and placed in their own groups.  That's odd, but even more disturbing is that 79 schools were not given a school performance score.  Many of those are low-enrollment or alternative schools; others have closed.  However, they all had students.  One, East St. John High School (site code 048001), had 1306 students.  Yet there's no school grade reported.  See https://sites.google.com/site/drjamescfinney/home/files/ for a spreadsheet showing the missing schools.

3) FERPA nonsense:  In their omnipresent effort to keep useful data away from the public, the folks at the Department of Education routinely use the federal government as an excuse to obfuscate:  “The Louisiana Department of Education has modified and/or suppressed data reported to protect the privacy of students in compliance with the Family Educational Rights and Privacy Act (FERPA) codified at 20 U.S.C. 1232g. The strategies used to protect privacy vary and may include rounding or other techniques but do not substantially affect the general usefulness of the data. . .”  Baloney!  It’s hard to do any kind of mathematical operation on the number “>145” because it’s not a number.  Is anyone’s privacy really being protected by not letting us know if the credit accumulation/dropout index is exactly 150 or some slightly smaller number?  Hardly.  But “muddying the narrative” with FERPA as an excuse makes it very difficult to check on the quality of the calculations used by the state.  So much for accountability.

4) Combination schools:  Bulletin 111, which supposedly defines how this SPS "accountability" system works, appears to put twice the weight on high school students as on elementary and middle school students when calculating a score for a school with students in both grade ranges.  Unfortunately, the public is not privy to the exact number of units on each side to see whether, for example, the K-8 component consists of only one student, most of the students, or some other ratio.  There might be a legitimate privacy concern if there are only a few students in either group, but we could at least be given the two component SPS scores (K-8 and 9-12) from which the composite scores were calculated.
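
Here, as a sketch, is my reading of that weighting in Python. The unit counts in the example are hypothetical; the whole point is that the state does not publish the actual numbers:

    # My reading of Bulletin 111: high school units count double when the
    # two component scores are combined. The unit counts are hypothetical,
    # since the state does not release them.
    def combination_sps(k8_sps, hs_sps, k8_units, hs_units):
        weighted = k8_units * k8_sps + 2 * hs_units * hs_sps
        return weighted / (k8_units + 2 * hs_units)

    # Identical component scores, very different composites depending on
    # the (undisclosed) mix of students:
    print(combination_sps(90.0, 60.0, k8_units=400, hs_units=50))  # 84.0
    print(combination_sps(90.0, 60.0, k8_units=50, hs_units=400))  # about 61.8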

5) Overall grade distribution:  Of the 1315 schools given a grade, 106 got Fs, 264 got Ds, 374 got Cs, 384 got Bs and 187 got As.  What does this mean?  Absolutely nothing.

6) K-8 assessment index distributions: Of 1177 K-8 assessment indices, there were 160 Fs, 299 Ds, 315 Cs, 265 Bs, and 100 As, with 38 not reported due to low enrollments.  So there would have been many more potential takeover targets before bonus points were thrown in.  Where available, the dropout/credit accumulation index discussed below would also generally have moved the scores up.  The hideous feature of the K-8 assessment index (one it shares with the high school EOC and ACT indices) is that a student either earns the school an A (100 points for Basic, 125 for Mastery, and 150 for Advanced) or an F (0 points for either Approaching Basic or Unsatisfactory).  And the difference between Approaching Basic and Basic is one multiple-choice test question.
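
To make the cliff concrete, here is a minimal Python sketch using the per-student point values just described (the EOC and ACT indices share the same all-or-nothing structure):

    # Per-student points as described above: proficiency earns an A-range
    # value, anything below proficiency earns zero.
    POINTS = {
        "Advanced": 150,
        "Mastery": 125,
        "Basic": 100,
        "Approaching Basic": 0,
        "Unsatisfactory": 0,
    }

    def assessment_index(levels):
        """Average the per-student points over all tested students."""
        return sum(POINTS[level] for level in levels) / len(levels)

    # One multiple-choice question separates Approaching Basic from Basic,
    # so one answer swings a student's contribution from 0 (an F) to 100 (an A):
    print(assessment_index(["Approaching Basic"] * 10))             # 0.0 (F)
    print(assessment_index(["Basic"] * 9 + ["Approaching Basic"]))  # 90.0 (B)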

7) Dropout/credit accumulation index:  This component is ridiculous for at least two reasons.  First, it appears to judge a school containing eighth grade based on what the 2011-12 eighth-grade class did at their high school in 2012-13.  Note that the high school doesn't get judged directly on the credits those students accumulated.  Second, almost everyone got an A.  Of 448 schools given a dropout/credit-accumulation index, only 7 got index values below 50, the F range.  Another 4 were Ds, 10 were Cs, and 9 were Bs.  The remaining 418 schools with eighth grade got index values in the A range, 100 and above.  A student contributes 100 points to the index if, by the end of ninth grade, they have at least five high school credits.  Is this raising the bar?  The pattern seems to raise one bar too far, and then drop another to compensate.
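
Since the state won't show its work, here is mine, a small Python sketch. The 100-point threshold is the only per-student rule described above; what a student below the threshold contributes, and how index values above 100 arise, are not public, so the zero below is purely a placeholder assumption:

    # The one per-student rule that has been described: at least five high
    # school credits by the end of ninth grade earns the school 100 points.
    # The value below the threshold is a placeholder, not a published rule.
    def credit_points(credits_by_end_of_ninth):
        return 100 if credits_by_end_of_ninth >= 5 else 0  # placeholder

    # Checking the distribution above: the four lower bands account for
    # only 30 of the 448 schools, leaving 418 in the A range.
    print(448 - (7 + 4 + 10 + 9))  # 418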

8) The End-of-Course Assessment Index is reported, at least in theory, for 352 high schools, although 16 apparently had fewer than 10 students and hence no data reported.  Of the remaining 336, the index values were in the F range for 87, D for 103, C for 85, B for 39, and A for 22.  Yep.  John White's magic formula says 87 schools flunked the End-of-Course Assessment Index and only 22 got an A.  These data have the same problem as the elementary assessment index data: a student either contributes 100 points or more (worth an A) or zero points (the lowest possible F score).  Again, the difference is one multiple-choice problem.  There is no difference in SPS terms between getting every answer wrong and getting the highest possible "non-proficient" score.  But one more correct answer turns an F into an A, and enough additional correct answers can push the score to a higher A.  And the state won't even share a histogram of how many students got each of the possible scores.

9) The ACT Index is even more perverse in its distribution than the End-of-Course Assessment Index.  Of 321 high schools, 12 had insufficient numbers to report data.  Of the remaining 309, the distribution was 104 F, 105 D, 61 C, 24 B, and 15 A.  Note that two schools with identical average ACT scores can have wildly different ACT index values.  For example, a school with an average ACT of 17 could get an ACT index as low as 0 (if every student scored 17) or as high as 94.1 (if 16 students got 18s and the remaining one got a zero).  In reality the variability would not be that extreme, but a system in which a 17 is worthless in SPS terms while an 18 is valuable just seems bizarre.
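
The two extremes are easy to reconstruct. The per-student values below (100 points for a composite of 18, zero for anything lower, more for higher scores up to 150) are inferred from the example, not copied from a published table:

    # The ACT index averages per-student points. Assume, consistent with
    # the example above, that a composite of 18 earns 100 points and
    # anything below 18 earns 0.
    def act_index(points_per_student):
        return sum(points_per_student) / len(points_per_student)

    # 17 students who all score 17: average ACT = 17, index = 0.
    print(act_index([0] * 17))                    # 0.0

    # 16 students score 18 (100 points each) and one scores 0:
    # average ACT = 16 * 18 / 17, about 17 after rounding, index = 94.1.
    print(round(act_index([100] * 16 + [0]), 1))  # 94.1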

10) Graduation Index:  Of 306 high schools with reported graduation index values, 10 were in the F range, 20 D, 85 C, 125 B, and 66 A.  So maybe this is supposed to compensate for the opposite skew in the End-of-Course and ACT data?  It's also worth noting that there is a one-year lag in the graduation index data, on top of the compression of four years' worth of history into one reporting interval.  Thus a student who dropped out October 24, 2008 (five years to the day before the scores were released) would finally count (as a zero) in the 2013 School Performance Score.  And a high school whose entire senior class dropped out last year could still have an excellent graduation index because of the one-year delay.  Go figure.

11) Graduation Rate:  The same problems mentioned for the Graduation Index recur with the Graduation Rate.  The data are all one year old, and a dropout from five years ago affects the current score.  Also, the conversion from graduation rate as a percentage to the Graduation Rate Index is generous: there were only 10 Fs, 15 Ds, 32 Cs, 53 Bs, and 196 As.  All it takes to get an A is for 3/4 of the freshmen to last through graduation.  So much for high standards.

12) Bonus Points (High School):  John White has suggested that poor scores on the ACT or End-of-Course indices would be compensated for by the bonus points.  In fact, that was his testimony before a legislative committee.  The reality?  Of 138 high schools with no K-8 component, exactly 7 were given bonus points.  They were:

  • Caddo Parish Magnet High School (already an A before the 4.3 bonus points)
  • Live Oak High School (Livingston Parish, already a low A before the 3.7 bonus points)
  • Edna Karr High School (Orleans Parish, a B school with or without the 3.7 bonus points)
  • International High School of New Orleans (charter school, a D school with or without the 4.8 bonus points)
  • O. Perry Walker Senior High School (Recovery School District charter, 9.7 bonus points changed a C to a B)
  • Sarah Towles Reed Senior High School (Recovery School District, 10 bonus points changed an F to a D)
  • G.W. Carver High School (Recovery School District, a D with or without the 10 bonus points)

On the other hand, of 963 K-8 schools with no high school component, more than half (498) got bonus points.  Of those, 218 got the maximum 10 bonus points.  Of the 214 combination schools (at least some K-8 students and some high school students), 123 got some bonus points, and 55 got all 10.  Unfortunately, there is no easy way to see whether those points were earned mostly by high school students or by elementary students.  And there's no provision for letting the public know on which tests the non-proficient students demonstrated enough improvement to earn bonus points.

Lucky 13)  But wait, there's more!  It's easy to get lost in the details of how weird the mathematics are, but that misses the point.  The state is making career-ending and neighborhood-disrupting decisions based on an ever-changing, arbitrary, opaque process.  Never has it been made clear what a score of A, B, C, D or F should truly represent in terms of school quality.  And even if that were clearly defined, it likely wouldn't be measured by a set of once-per-year tests.  And even if the annual tests truly measured either student achievement or instructional quality, Louisiana's accountability system ignores what happens at the beginning of a student's educational career (since the first test that counts is at the end of third grade) and at the end (since End-of-Course tests and the ACT are normally completed in the first three years of high school).  It's ridiculous that scores aren't released before the end of summer, and it's asinine to judge the quality of a school in 2012-13 by the accomplishments of the Class of 2012.

I’m not sure this pseudo-accountability system is worth fixing.  But releasing more data and releasing it earlier would help while we wait for the Legislature to come to its senses and demand an accountability system that improves education.  The current system serves the interests of those who would destroy public education, not those who would benefit from its improvement. 


6 Responses to School Performance Scores finally out?

  1. RampartStreet says:

    RE: your point #3, “FERPA Nonsense.” An expressed concern for respecting the requirements imposed by FERPA may serve as a convenient excuse for obfuscation by the State Dept. of Education, as you ably point out, but those same requirements seem not to have mattered a whit when John White entered into his MOU with Michael Bloomberg’s InBloom. What a curious duck is that Mr. White.

  2. Pingback: School Performance Scores finally out? | Methodical, Musical Mathematician’s Musings | Crazy Crawfish's Blog

  3. Herb Bassett says:

    Nice post. I can explain why so few high schools earned bonus points. To be eligible, a high school had to administer the preliminary ACT series (PLAN, EXPLORE) in or before spring 2012. LDOE did not start providing information about the new system until late spring 2012, after White was officially made State Superintendent. Before that, most schools did not administer those tests to enough (or all) students. Note that the schools positioned to receive the bonus points were mostly White's former schools (Recovery School District) and schools near his influence. Ask LDOE how many schools administered the necessary ACT pretests in the 2011-2012 school year.

  4. Outraged Educator says:

    There are three public high schools in St. Martin Parish: Cecilia (CHS), Breaux Bridge, and St. Martinville (SMSH). Cecilia dropped from a B to a C, and St. Martinville went up to a B. Let's look at the test scores. On the EOC, CHS had 23% score Excellent versus 11% at SMSH; CHS had 37% score Good versus 32% at SMSH; and 13% at CHS failed the EOC versus 21% at SMSH. Yet SMSH is a B school and CHS is a C school. CHS had an ACT index of 64 and SMSH had 38; CHS is 26 points higher, but SMSH is the B school.

    • Outraged Educator says:

      What made the difference? SMSH earned 10 bonus points from its super-subgroup growth. Although CHS had better scores in this category, CHS did not have 10 students in the super-subgroup, so it did not qualify for the bonus points. In other words, CHS was punished because its feeder school did not fail enough students. This is absurd!

  5. liznola716 says:

    I find all of this very difficult to really grasp, even as I intuit that what's being done is flagrantly fraudulent, designed to bolster the RSD/charters while condemning public schools generally . . . as a prequel to chartering the entirety of LA's public schools. What I'm trying to figure out now is how the state will contain the angry mobilization sure to follow once the charter system hits middle-class white teachers and their families. Re the fraudulent data: I wish there were a way for you truly brilliant mathematical minds to create a document that a lay person can understand. I've got advanced degrees from the Ivy League but am not a numbers person, nor am I really familiar with how all these tests figure into the numbers that calculate the score. We need a document that clearly conveys in lay language the fraud of ed "reform." If the Uptown residents whose children are in private schools and spared all of this torture could understand this issue in terms of the massive shakedown of the public treasury that "reform" is, we might start to wake some folks up. Dispelling their reverence for Leslie Jacobs, their crusading neighbor, is crucial. She must be exposed for the pathological liar and thief she is.
