I will protect your pensions. Nothing about your pension is going to change when I am governor. - Chris Christie, "An Open Letter to the Teachers of NJ," October 2009

Tuesday, August 15, 2017

Chris Cerf Is Better Than You -- Just Ask Him

Chris Cerf is, of course, the State Superintendent of the Newark Public Schools. He holds this position not because the citizens of Newark chose him, nor because he was appointed by a democratically elected school board, nor because he was chosen by the democratically elected mayor, nor because he had an outstanding track record as a chief officer in another public school district (he never ran a district before Newark*).

No, Cerf only holds his office because Chris Christie -- the man for whom Cerf worked for three years as Education Commissioner -- put him there.

And why did Christie choose Cerf? We can only guess, but as this interview from the past Sunday makes plain, Cerf does share his boss's love of taking cheap shots at those who disagree with him. Before we get to that, however, let's see what spin Cerf is serving these days about charters:
Q. What do you think of the idea that charters siphon off resources from public schools?

A. It presumes charters are not public schools. But they are. They are publicly funded, open to all, tuition-free, accountable to democratically-elected authorities, and in New Jersey, not for-profit.
 
What nonsense. Yes, charter schools are publicly funded. But Newark Prep -- which was approved by Cerf when he was Commissioner but closed down last year due to poor performance -- was managed for years by K12 Inc., a for-profit corporation. And Camden Community Charter School -- also approved by Cerf, also closed due to poor performance this past year -- was also, according to the Star-Ledger, managed by a for-profit corporation.

But what's really galling here is Cerf's insistence that charters are overseen by "democratically-elected authorities." Because not one charter school in Newark was required to gain approval from the local school board, or the mayor, or the city council. All they needed was a nod from the Education Commissioner, who serves at the pleasure of the governor.

Keep in mind that Chris Christie lost in Newark by a landslide in the 2013 election: 6,443 to 29,039. The notion, then, that Newark's charter sector is "accountable to democratically-elected authorities" is a sick joke. But when it comes to charters, Cerf lives in an alternate universe:
The only difference is that a charter is managed by a board of citizens, as opposed to a local school board. Secondly, charters receive only 90 percent of the per-pupil operating cost of a traditional public school, which means that for every student who leaves for a charter, the district retains 10 percent to cover its fixed expenses.

So it's only a negative if the district cannot manage its own expenses to keep its fixed and legacy costs to a number equal to or less than 10 percent. [emphasis mine]
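To see exactly what Cerf is claiming, his arithmetic can be sketched in a few lines. This is purely illustrative: the 90 percent pass-through comes from his own quote, but the per-pupil cost and enrollment numbers below are hypothetical.

```python
# Sketch of Cerf's 90/10 claim. The 90% pass-through rate is from his quote;
# the per-pupil cost and the number of departing students are hypothetical.
PER_PUPIL_COST = 20_000   # hypothetical per-pupil operating cost ($)
PASS_THROUGH = 0.90       # share that follows each student to a charter

def dollars_retained(students_leaving: int) -> float:
    """Dollars the district keeps, per Cerf's framing."""
    return students_leaving * PER_PUPIL_COST * (1 - PASS_THROUGH)

# If 1,000 students leave, the district keeps $2 million -- a wash only if
# the fixed and legacy costs tied to those students are 10% or less.
print(round(dollars_retained(1_000)))  # 2000000
```

The whole argument, in other words, turns on the assumption that a district can shed 90 percent of its costs the moment a student walks out the door.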
Cerf's hypocrisy here would be funny if the consequences weren't so serious. Because it wasn't even two years ago that Chris Cerf was complaining to the State BOE that charters had an unfair fiscal advantage, and that advantage was hurting his district:
Much of the budget pressure has come from payments the district is required to make to the city’s charter schools. Cerf, a cheerleader for charter schools as state commissioner, yesterday acknowledged that some funding stop-gap is needed to help the district.
He said the mandatory funding for charter schools year to year is “disproportionately hurting the district schools,” adding, “We can’t just turn the other way and let that happen.” [emphasis mine]
Oh, come on, Chris -- can't you keep your fixed and legacy costs in line?

As Cerf undoubtedly knows, New Jersey charter schools have been "held harmless" in their revenues for three years running. This means charters haven't had to take the financial hit that district schools, like Newark's, have had to take because Christie has refused to fully fund the School Funding Reform Act -- the state's own law that ensures districts get adequate and equitable funding.

For every year Christie has been in office -- including the three years Cerf was his Education Commissioner -- Christie has refused to follow the state's own law and provide New Jersey's schools with the funding they need to fulfill the mandates of the state constitution. In Newark, for FY2017 alone, this meant Cerf's district was $181 million below its adequacy target, or $3,615 per pupil.
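Those two figures are consistent with each other, as a quick check shows (the enrollment number below is implied by the two figures in the text; it is not stated in the source):

```python
# Consistency check on the FY2017 adequacy-gap figures cited above.
total_gap = 181_000_000   # dollars below adequacy target (from the text)
per_pupil_gap = 3_615     # dollars per pupil (from the text)

implied_enrollment = total_gap / per_pupil_gap
print(round(implied_enrollment))  # about 50,000 students
```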

Cumulatively, NPS is $611 million in the hole since the beginning of Christie's reign. It could have been worse: back in 2011, the state's Supreme Court ordered Christie to restore half-a-billion dollars in state aid to the neediest districts. But that didn't stop Cerf, working on Christie's behalf, from running up and down the state, making the case that New Jersey's schools didn't need the funding the state's own law said they needed.

I contend that there is no one individual over the last eight years -- and yes, I do include Christie -- who has done more to undermine the state's own law on school funding than Chris Cerf. It was Cerf, after all, who authored the 2012 Education Funding Report, an incoherent, rambling mess that tried to make the case that while money matters in education, the schools serving the most disadvantaged students were clearly suffering because they had too much.

As Bruce Baker noted in real time, once you got past the reformy pablum served up in the report, the actual policy of Christie and Cerf was to cut aid to districts serving larger proportions of at-risk students, while simultaneously giving more aid to districts enrolling proportionally fewer at-risk students.

Cerf's report made the case that the "weights" for at-risk students in SFRA were too high, even as the empirical evidence showed they were, in fact, very likely too low. He tried to argue that eliminating tenure was absolutely critical for student success, even though there is no evidence whatsoever that tenure impedes student achievement (in fact, there's some evidence that strong collective bargaining improves staff effectiveness). Cerf also argued for changing the way students are counted in the aid formula, a clear attempt to decrease funding going to districts enrolling more at-risk students.

But the Funding Report was only the beginning of Chris Cerf's assault on equitable and adequate school funding. In 2013, Cerf stood in front of the NJEA and presented a brazenly deceptive graph in an attempt to downplay the effects of poverty on student outcomes.


Here is the full set of data that Cerf's slide omitted:


Still amazes me every time I see it.

There was also Cerf's twisting of national test scores to make New Jersey's schools appear to be failing their students; Matt DiCarlo did a great job at the time deconstructing Cerf's arguments. And there was the December 2012 report that bizarrely tried to rationalize the Christie aid cuts; again, Bruce Baker pointed out how absurd its arguments were. Plus there was the proposed policy of underfunding the districts Cerf's DOE classified as "Priority" more than ones classified as "Reward."

All of this went on while New Jersey, under Chris Christie, retreated from funding equity -- even as the record clearly showed the state had made significant gains during the period of meaningful funding reform.

Again: it's not hard to make the case that Chris Cerf was as responsible as anyone for giving cover to the underfunding of New Jersey's schools under Chris Christie -- including, notably, the Newark Public Schools. And yet, despite this track record, Cerf has the stones to make the case that he cares more for the wee children than anyone who dares to disagree with his reformy views:
I look with astonishment at groups like Save Our Schools, highly represented by white wealthy suburbanites that have made it their mission to undermine the opportunity of poor African-American students to have access to quality education. 
Many don't honor their own principles by sending their children to private schools or living in leafy green suburbs. I ask myself whether the strength of their argument would be affected if the focus of charters did not include suburbs like Princeton. 
Mostly, I'm aghast at seemingly progressive individuals who so deeply misunderstand the profound injury their position will cause families of vastly more limited means. [emphasis mine]
Let's start with the low-hanging fruit: Chris, when you were Education Commissioner, did you ever walk into the governor's office and fault him for sending his children to highly-resourced private schools?



Yes, there are people in Save Our Schools who live in the suburbs. Yes, there are people who question charter school proliferation who send their kids to private schools. The difference, Chris, is that none of them, so far as I know, have ever tried to undermine the state's own law regarding full and fair funding of public schools.

For three years, Chris, you did everything within your power to keep schools serving large numbers of at-risk students from getting the funds the state's own law says they should get -- all while your boss gave gobs of tax breaks to wealthy, connected special interests. You went out of your way to justify Christie's underfunding of SFRA with some of the most fallacious nonsense imaginable, costing Newark hundreds of millions of dollars for their schools.

And still, despite your own past, you publicly question others' motives:
I stay up at night wondering how otherwise good-hearted people could say they want to impose a moratorium or somehow stop the addition of charter schools, when so many children are choosing them and being successfully launched into adulthood on the basis of that choice. 
Unfortunately, I think the answer is a raw political one. By statute, traditional public schools are unionized and charter public schools could be unionized, but are not required to be. 
The interest group in the state that spends by far the most money in Trenton, the teacher's union, would be economically hurt by the increased number of charter schools because it would lose dues-paying members. I think this is a purely economic argument. They do not want to lose members. All their talk about creaming and hedge funds is just so much propaganda that they have focus grouped and determined works in the public debate.

Is it true? Is the charge that Newark's charters don't enroll the same sorts of students as NPS "just so much propaganda"?

Let's go to the data, straight from Cerf's former office, the NJDOE (and including the three years when he ran it):



Year after year, NPS enrolls proportionally more special education students than the Newark charter sector.



The special education students the charters do take tend to have lower-cost disabilities compared to the classified students enrolled at NPS. (The lower-cost disabilities are speech impairments (SPL) and specific learning disabilities (SLD); see here for more information.)


Newark's charter sector enrolls very few students who are Limited English Proficient. 


Yes, the free and reduced-price lunch eligibility rates between the district and the charters look the same. But keep in mind that NPS has been expanding its free lunch program** for some time, which means the incentive for families to fill out applications has waned. The charters, on the other hand, have a great incentive for their families to fill out the application, as their funding increases based on the number of FRPL students they enroll.

Whether there is, in fact, a difference in the socio-economic status of charter students compared to district students is a serious question. Cerf says there's "zero factual evidence" to support a charge of creaming. I say: What have you done, Chris, to explore the question, either as Commissioner or State Superintendent?

Who did you contract with to look into why charters don't enroll as many LEP students as their host districts? Who did you assign from your staff to write a report on the disparity in special education rates between the two sectors?

As you know, Chris, Julia Sass Rubin and I wrote the report, using your department's data, that explored these questions. But why did we have to do it? Why did I have to be the one to use your department's data to point out that New Jersey's charter schools spend far more on administrative costs and far less on support services than the districts that host them?

Why did I have to be the one who pointed out that charter staffs have far less experience than district staffs, and consequently make less money? The charters you laud are basically free-riding on district salaries negotiated by the unions you casually denigrate. Did this ever concern you? Did you ever even think about it? Did you ever hire people for your senior staff who had the capacity to study this stuff?

Chris Cerf has a nasty habit of questioning other people's motives. You would have thought maybe a few years actually running a school district would give him a little bit of humility, and a little bit of space to reflect on his own culpability in underfunding New Jersey schools.

Sadly, Cerf appears set in his ways. He'd rather take his pot shots than own up to the terrible legacy he left as Commissioner. Luckily for Newark, he won't be around much longer. Hopefully, he'll soon return to the private sector, where his record speaks for itself.

Accountability begins at home.


ADDING: Remember this? Good times...

ADDING MORE: As Darcie Cimarusti reminds us, Atlantic City Community Charter is also managed by a for-profit CMO. And, according to the Star-Ledger, Central Jersey Arts Charter School contracts with a for-profit manager.

Pesky facts...

* Maybe now that Newark is regaining control of its schools, it can look around for a superintendent who has actually run a district -- unlike the last two appointed by Chris Christie.

** This is a good thing. It's one of the few things I give Cerf and Cami Anderson credit for.

Saturday, August 5, 2017

CREDO Charter School Studies' "X Days Of Learning" Conversion: Still Unvalidated

This past week, the Center for Research on Education Outcomes (CREDO) at Stanford University released yet another report in a series on the effects of charter schools on test scores -- this time focusing on Texas.

Almost immediately, members of the local media trumpeted the results as "proof" that charter schools are realizing meaningful gains in student outcomes:
For the first time, Texas charter schools have outperformed traditional public schools in reading and closed the gap in math, researchers at Stanford University have found.
Students at Texas charter schools, on average, received the equivalent of 17 additional days of learning per year in reading and virtually the same level of education in math when compared to traditional public schools, according to a study released Wednesday by the Center for Research on Education Outcomes, or CREDO.
Rather than looking at raw standardized test scores, CREDO researchers quantify the impact of a school by looking at student improvement on the tests relative to other schools. The researchers then translate those results into an equivalent number of "days of learning" gained or lost in a 180-day school year.
The center's staff produced similar analyses in 2013 and 2015, finding Texas charter schools had a negative impact on reading and math performance.
"The most recent results are positive in two ways. Not only do they show a positive shift over time, but the values themselves are both positive for the first time," the researchers wrote.
CREDO's studies of charter school performance are widely respected in education circles. The center compares students from charter and traditional public schools by matching them based on demographic characteristics -- race, economic status, geography and English proficiency, among others -- and comparing their growth on standardized tests. Scores from 2011-12 to 2014-15 were analyzed for the most recent report. [emphasis mine]
That's from the Houston Chronicle, which published just one paragraph suggesting the CREDO studies might have credible critics:
Skeptics of CREDO's study typically offer three main criticisms of the research: it focuses exclusively on standardized test results, incentivizing schools that "teach to the test"; it ignores other advantages of traditional public schools, such as better access to extracurricular activities; and it doesn't account for the fact that charter school students are more likely to have strong, positive parental influence on their education.
Sorry, but that's, at best, an incomplete description of the serious limitations of these studies -- limitations I'll get into below.

Here is how the CREDO Texas study reports its findings:



Stanley Pogrow published a paper earlier this year that didn't get much attention, and that's too bad. Because he quite rightly points out that it's much more credible to describe results like the ones reported here as "small" than as substantial. 0.03 standard deviations is tiny: plug it in here and you'll see it translates into moving from the 50th to the 51st percentile (the most generous possible interpretation when converting to percentiles).
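The conversion Pogrow has in mind is just the standard normal CDF (the "Cohen's U3" interpretation of an effect size). A minimal sketch using only the Python standard library; the online calculator linked above performs the same computation:

```python
import math

def effect_to_percentile(d: float) -> float:
    """Percentile rank of the average 'treated' student against a
    50th-percentile comparison student, assuming normally distributed
    scores (Cohen's U3)."""
    # Standard normal CDF via the error function -- no scipy required.
    return 100 * 0.5 * (1 + math.erf(d / math.sqrt(2)))

print(round(effect_to_percentile(0.03), 1))  # about 51.2
```

A 0.03 SD effect, in other words, moves the average student from the 50th to roughly the 51st percentile.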

I have been working on something more formal than a blog post to delve into this issue. I've decided to publish an excerpt now because, frankly, I am tired of seeing "days of learning" conversions reported in the press and in research -- both peer-reviewed and not -- as if there were no debate about their validity.

The fact is that many people who know what they are talking about have a problem with how CREDO and others use "days of learning," and it's well past time that the researchers who make this conversion justify it. 
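For reference, the conversion at issue is a simple linear rescaling: CREDO's tables anchor an effect of 0.25 SD to 180 days of learning (one school year), and every other effect size scales proportionally. A sketch of that implied formula (the anchor values are from the CREDO tables; the formula itself is my inference, since CREDO does not publish one):

```python
# Implied "days of learning" conversion, inferred from the anchor point in
# the CREDO tables: an effect of 0.25 SD is presented as 180 days (one year).
SD_PER_YEAR = 0.25
DAYS_PER_YEAR = 180

def effect_to_days(effect_sd: float) -> float:
    """Linearly rescale an effect size (in SD units) into 'days of learning'."""
    return effect_sd / SD_PER_YEAR * DAYS_PER_YEAR

# Under this rescaling, the "17 additional days" reported for Texas reading
# corresponds to an effect of only about 0.024 SD.
print(round(effect_to_days(0.024)))  # 17
```

Notice how the rescaling makes a tiny effect sound substantial: 0.024 SD becomes "17 days of learning."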

The excerpt below refers to what the eminent psychometrician Michael T. Kane terms a "validity argument." To quote Kane: "Public claims require public justification." I sincerely hope I can spark a meaningful conversation here and get the CREDO team to adequately and publicly justify their use of "days of learning." As of now, their validity argument is cursory at best -- and that's just not good enough.

I have added some bolding to the excerpt below to highlight key points.

* * *

Avoiding the Validity Argument: A Case Study

As an illustration of the problem of avoiding the validity argument in education policy, I turn to an ongoing series of influential studies of charter school effects. Produced by The Center for Research on Education Outcomes at Stanford University, the so-called CREDO reports have driven a great deal of discussion about the efficacy of charter school proliferation. The studies have been cited often in the media, where the effects they find are reported as “days of learning.”[1] Both the National Charter School Study (Raymond et al., 2013) and the Urban Charter School Study Report on 41 Regions (CREDO, 2015) include tables that translate the effect sizes found in the study into “days of learning.” Since an effect size of 0.25 SD is translated into 180 days, the clear implication is that an effect of this size moves a student ahead a grade level (a typical school year being 180 days long). Yet neither study explains the rationale behind the tables; instead, they cite two different sources, each authored by economist Eric Hanushek, as the source for the translations.

The 2015 study (p. 5) cites a paper published in Education Next (Hanushek, Peterson & Woessmann, 2012) that asserts: “On most measures of student performance, student growth is typically about 1 full std. dev. on standardized tests between 4th and 8th grade, or about 25 percent of a std. dev. from one grade to the next.” (p. 3-4) No citation, however, is given to back up this claim: it is simply stated as a received truth. 

The 2013 study (p. 13) cites a chapter by Hanushek in the Handbook of the Economics of Education (Hanushek & Rivkin, 2006), in which the author cites his own earlier work:
“Hanushek (1992) shows that teachers near the top of the quality distribution can get an entire year’s worth of additional learning out of their students compared to those near the bottom. That is, a good teacher will get a gain of 1.5 grade level equivalents while a bad teacher will get 0.5 year for a single academic year.” (p. 1068)
No other references are made within the chapter as to how student gains could be presented as years or fractions of a year’s worth of learning.

The 1992 citation is to an investigation by Hanushek of the correlation between birth order and student achievement, and between family size and student achievement. The test scores used to measure achievement come from the “Iowa Reading Comprehension and Vocabulary Tests.” (p. 88) The Iowa Assessments: Forms E and F Research and Development Guide (2015), which traces the development of the Iowa Assessments back to the 1970s, states:
“To describe the developmental continuum or learning progression in a particular achievement domain, students in several different grade levels must answer the same questions in that domain. Because of the range of item difficulty in the scaling tests, special Directions for Administration were prepared to explain to students that they would be answering some very easy questions and other very difficult questions.” (p. 55-56)
In other words: to have test scores reported in a way that allows for comparisons across grade levels (or, by extension, fractions of a grade level), the Iowa Assessments deliberately place the same questions across those grade levels. There is no indication, however, that all, or any, of the statewide tests used in the CREDO studies have this property.[2]

Harris (2007) describes the process of creating a common score scale for different levels of an assessment as vertical scaling. She notes: “Different decisions can lead to different vertical scales, which in turn can lead to different reported scores and different decisions.” (p. 233) In her discussion of data collection, Harris emphasizes that several designs can be used to facilitate a vertical scale, such as a scaling test, common items, or single group to scaling test. (p. 241) 

In all of these methods, however, there must be some form of overlapping: at least some students in concurrent grades must have at least some common items on their tests. And yet students in different grades still take tests that differ in form and content; Patz (2007) describes the process of merging their results into a common scale as linking (p. 6). He notes, however, that there is a price to be paid for linking: “In particular, since vertical links provide for weaker comparability than equating, the strength of the validity of interpretations that rest on the vertical links between test forms is weaker.” (p. 16)

So even if the CREDO studies used assessments that were vertically scaled, the authors would have to acknowledge that the validity of their effect sizes was at least somewhat compromised compared to effect sizes derived from other assessments. In this case, however, the point is moot: it appears that many of the assessments used by CREDO are not vertically scaled[3], which is a minimal requirement for making the case that effect sizes can be translated into fractions of a year’s worth of learning. The authors are, therefore, presenting their results in a metric that has not been validated and could be misleading.

I use this small but important example to illustrate a larger point: when influential education policy research neglects to validate the use of assessments, it may lead stakeholders to conclusions that cannot be justified. In the case of the CREDO reports, avoiding a validity argument for presenting effect sizes in “days of learning” has led to media reports on the effects of charter schools and policy decisions regarding charter proliferation that are based on conclusions that have not been validated. That is not to say these decisions are necessarily harmful; rather, that they are based on a reporting of the effects of charter schools that avoided having to make an argument for the validity of using test scores.


[2] Nor is there any indication the national and international tests Hanushek cites in his 2006 paper, such as the National Assessment of Educational Progress, share questions across grade levels. In fact, Patz (2007) notes: “NAEP attempted to vertically link forms for grades 4, 8, and 12, but abandoned the approach as the comparisons of students separated by 4 or 8 years were found to be ‘largely meaningless’ (Haertel, 1991).” (p.17)

[3] Some statewide assessments are vertically scaled, including the Smarter Balanced assessments; see: https://portal.smarterbalanced.org/library/en/2014-15-technical-report.pdf

References

Center for Research on Education Outcomes (CREDO) (2015). Urban Charter School Study Report on 41 Regions. Palo Alto, CA: Center for Research on Education Outcomes (CREDO), Stanford University. Retrieved from: http://urbancharters.stanford.edu/summary.php

Dunbar, S., & Welch, C. (2015). Iowa Assessments: Forms E and F Research and Development Guide. Iowa City, IA: University of Iowa. Retrieved from: https://itp.education.uiowa.edu/ia/documents/Research-Guide-Form-E-F.pdf

Hanushek, E. A. (1992). The trade-off between child quantity and quality. Journal of Political Economy, 100(1), 84-117.

Hanushek, E. A., & Rivkin, S. G. (2006). Teacher quality. Handbook of the Economics of Education, 2, 1051-1078.

Hanushek, E. A., Peterson, P. E., & Woessmann, L. (2012). Achievement growth: International and US state trends in student performance. PEPG Report No. 12-03. Program on Education Policy and Governance, Harvard University.

Harris, D. J. (2007). Practical issues in vertical scaling. In Linking and aligning scores and scales (pp. 233-251). New York: Springer.

Kane, M. (2013). Validating the interpretations and uses of test scores. Journal of Educational Measurement, 50(1), 1-73.

Raymond, M. E., Woodworth, J. L., Cremata, E., Davis, D., Dickey, K., Lawyer, K., & Negassi, Y. (2013). National Charter School Study 2013. Palo Alto, CA: Center for Research on Education Outcomes (CREDO), Stanford University. Retrieved from: http://credo.stanford.edu/research-reports.html