Monday, November 4, 2013

Private School Performance?

[Infographic: see the graph at the top of this post]
In speaking with friends recently, we started puzzling over a question we all share: how do we know what the benefits of a private school are over a public school? Clearly, many families see the benefit of substantial expenditures on private education, and surely there must be some return on that investment. In essence, the consensus in the group was that the marketing of such schools leads parents to believe that students who attend them would be expected -- test scores and grades in high school being equal -- to outperform public school students when the two groups arrive together at college.

In general, I have always thought that there is no way to determine the truth of that proposition -- that there is no useful data concerning performance in the first year of college. Grades issued by different institutions are generally not comparable, even when they are available. Instead, we tend to retask predictive measures of college success, such as the SAT, as a proxy for that data -- but such measures are always second best to the grades themselves.

But it looks like there's a way to answer the question after all. The California State University system accepts students from virtually every high school in California (1,590 different ones in the last 18 years). While CSU is not the most elite part of California's Education Master Plan (that would be the UC, which accepts only the top 12.5% of high school students), CSU generally accepts the top third of California's students. These students are not typically those who have accumulated many advanced placement units, and they tend to have test scores within one standard deviation of the norm; but the students who arrive at CSU from private and public schools are, at least as a matter of general consensus, believed to have similar if not the same test scores and grades. Thus, the performance of such students is perhaps the best example of head-to-head, apples-to-apples performance of public and private school students, after arriving at college, in California (if not in America).

I was surprised to discover last week that CSU makes those grades available on the web, sorted by high school and by year. It was a relatively straightforward task to script cURL to pull CSU's data for 31 high schools in Marin, Napa, and Sonoma counties from 1996 to 2012 (527 requests, which took the bash script less than 60 seconds), and I have zipped those files and made them available here, for anyone interested in examining the raw data.
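For anyone curious about the mechanics, the scrape can be sketched in a few lines. This is a minimal Python sketch, not the actual script: the endpoint URL and query parameter names below (`BASE_URL`, `school`, `year`) are placeholders I've made up, since the post doesn't reproduce them -- but the shape of the loop, one request per school per year, is what produces the 527 requests.

```python
from urllib.parse import urlencode

# Hypothetical endpoint -- the real CSU reporting URL and its query
# parameters are not shown in the post; these names are placeholders.
BASE_URL = "https://example.edu/csu-report"

def build_requests(school_codes, first_year=1996, last_year=2012):
    """One request per school per report year: 31 schools x 17 years
    (1996-2012 inclusive) = 527 requests, as described in the post."""
    urls = []
    for code in school_codes:
        for year in range(first_year, last_year + 1):
            urls.append(BASE_URL + "?" + urlencode({"school": code, "year": year}))
    return urls

# 31 placeholder codes stand in for the Marin/Napa/Sonoma schools.
urls = build_requests([f"S{i:02d}" for i in range(31)])
print(len(urls))  # 527
```

Each URL would then be fetched and saved to a file, which is all cURL was doing in the bash version.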

At the top of this post is a graph of the average freshman year college GPAs of students who have graduated from Sonoma Valley High School, Justin Siena, Sonoma Academy, Cardinal Newman, St. Vincent's, and Marin Catholic in that time period. While in 1996 every private school's college freshmen performed better than those from Sonoma Valley, the reverse has been true since 2005.  Sonoma Valley's students outperform every one of the private schools, and by a significant amount.

I've excluded Marin Academy because there isn't enough of its data to graph. Marin Academy sent six students to CSU in 2012, but in every prior year it fell below the reporting threshold (5 students) -- so there's only one year's worth of data. It's a good result for them, though: in 2012, Marin Academy posted a 3.22, and its difference from Sonoma Valley (.17 of a grade point, 3.05 versus 3.22) is significantly smaller than the difference between Sonoma Valley and the next private school on the list (a .27 difference, between Sonoma Valley at 3.05 and Cardinal Newman at 2.78).

[Image: Elizabeth Warren]
This strongly suggests that those pursuing private education, with the exception of the very expensive Marin Academy (tuition $37,430 yearly), may be subject to the phenomenon uncovered by Elizabeth Warren in her early research. In "The Two-Income Trap," Warren and her co-author (and daughter!) Amelia Tyagi pointed out that middle-class families drive themselves into bankruptcy buying homes they cannot afford in order to live in neighborhoods with better schools. As Warren and Tyagi argued, the actual "benefits" such parents obtained for their children were slim at best, and more likely than not illusory.

Yet the problem in Sonoma may be even worse. Many parents are sending their children to private schools in the belief that they are obtaining an academic advantage. This is not to discount other reasons for sending children to, for instance, religious schools -- educating one's children in deep religious convictions, shared by an organized group and intimately related to daily living, is a right protected by the First Amendment. However, to the extent that parents are financially straining themselves to obtain a perceived academic advantage, they should know that the evidence shows no increase in their children's later academic success, and instead suggests that the opposite may in fact be true ...


  1. A few thoughts:

    (a) Generally speaking, students at private high schools consider the CSU schools as less desirable than Cal schools, private colleges, and other major institutions out of state. As a result the students from private schools who end up at CSU schools are more likely to be from the bottom half of their high school classes.

    (b) The most significant value of a private school is the environment. Among the student body, the question is not whether you are going to college, but where. Similarly, among the student body, being studious is a virtue.

    So, perhaps at CSUs the students from private schools do not outperform their classmates from public schools. But had those same students attended public schools, they very well may never have made it to CSUs in the first place ...

  2. Tom, regarding the first point, the data doesn’t back up your arguments. Take Cardinal Newman for example. The Average SAT Math at CN between 2005 and 2010 was 557, the average SAT Verbal was 537, and the average ACT was 23.3. However, for the students admitted to CSU from CN, the average SAT Math in that time period was 572, the SAT Verbal was 538, and the average ACT was 24.5.

    These are not the bottom 50%. These are students who are a bit (but not a lot) above the average.

    Remember, the reason the CSU data is powerful is that we're concerned about teaching ability, not raw performance from students. To put it in academic terms, the g factor among the high-end students at both private and public schools is excluded by the CSU data. And that very high end probably should be ignored when assessing the quality of the schools -- it's no accident that advertisements for the schools will highlight, say, a single student attending Harvard or Stanford, focusing the parent on anecdotal evidence. And anecdotal evidence, of course, really has no place in a data-driven discussion.

    Regarding the second point, that environment is the issue, or that private schools get kids in to CSU who wouldn’t be there otherwise, the data suggests the exact opposite. The 2012 data is a good example.

    The average Sonoma Academy kid coming in to CSU in 2012 had an SAT-M of 593, SAT-V of 590, ACT of 26, and a High School GPA of 3.29.

    The average CN kid coming in to CSU in 2012 had an SAT-M of 560, an SAT-V of 556, an ACT of 24, and a High School GPA of 3.51.

    The average Sonoma Valley High kid coming in to CSU in 2012 had an SAT-M of 542, an SAT-V of 499, an ACT of 22, and a High School GPA of 3.17.

    Guess what the results were?

    Sonoma Valley 3.05.
    Cardinal Newman 2.78.
    Sonoma Academy 2.61.

    The kids coming out of the private schools have (slightly) higher test scores and (much) higher GPAs than their public school competitors. And the public school kids do significantly better at CSU. I can speculate about the environment issues that lead to that outcome, but in the fine tradition of our law professors, Tom, the answer to that question is probably one best left as an exercise for the reader …

  3. Hey John, I'm compulsively drawn to this sort of data analysis exercise, and I've pondered this graph for a while today. I attempted to view the data you collected, but the format's not readily usable. I assume your graph is based on some slick table data. Do you have an Excel or CSV file?

    I think your graph could benefit from some more public school data. I'm also curious about how much smaller the pool of students going to CSU from private school is compared to public. You mentioned that Marin Academy only sent 6 students one year.

    Now I'd love to speculate wildly. My recurring thought is that we can only see the first year. I wonder what sort of trends we'd see in subsequent years. Would it bounce? Flatten? Is what you're observing a measure of student resilience or some other intangible? Also, we're discussing a difference of about 0.3-0.4 points. We can observe a difference this small in the aggregate, and it may be important to the institution, but would it even be perceptible on an individual transcript? It still serves the original argument that private high school does not have a meaningful effect on performance.

  4. Eric, second set of questions first. Most of the research I've run across indicates that, after the freshman year of college, the student's performance that first year becomes, by far, the strongest predictor of future college performance.

    Regarding the apparently small differences, on Facebook, this came up as well -- Palo Alto High, perhaps the best high school in the State, comes in at a 3.28 in 2012, and Richmond High, which has as many challenges facing it as any school in California, comes in at a 2.44. So, the difference between the extremes looks like it's about .8; .3 or .4 starts looking pretty big.

    Over the time series in question (1996 to 2012) the best performing schools out of the 31 I pulled were Marin Academy, El Molino, Maria Carrillo, Analy, Casa Grande, Santa Rosa High, and San Marin, in that order (that's including public and private).

    The explanation for this could very well be resilience, but I am being pretty cautious in speculating about what's causing the differences, just noting them and leaving it to the reader to pursue it further.

    When it comes to particular students, any metric is limited in assessing ability -- in evaluating merit, for lack of a better defined term. The question I approach this with is really whether test score data demonstrates that there is a quality problem with our public schools in the first instance -- and the "exceeds expectations" feature of public school student performance at CSU is cause for skepticism regarding those claims.

    Regarding the data itself, I have, well, a lot of data at this point. I'm thinking I'll put it on Google's Public Data Explorer (the DSPL XML schema is ... hairy ...), but I should have that done shortly, and it will allow people to compare pretty much any set of high schools in the state without programming knowledge.

    Maybe I'll just get that done and forward you a link? But if you're in a rush, here's a CSV of the 31 schools -- a caution: the data is noisy, and you'll note I use a moving (rolling) average to spot trends; I'd recommend that for anyone else using it as well. Please also note there's a great deal more data available than what's on this sheet -- this sheet does not include dropout data, the total number of students attending from each school, incoming GPAs and SAT scores, etc.
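    To make the smoothing concrete, here is a minimal sketch of the kind of trailing moving average I mean. The GPA numbers below are made up for illustration, not taken from the CSU files:

```python
def rolling_mean(values, window=3):
    """Trailing moving average: emits None until the window fills,
    so early years aren't smoothed against data that isn't there."""
    out = []
    for i in range(len(values)):
        if i + 1 < window:
            out.append(None)
        else:
            chunk = values[i - window + 1 : i + 1]
            out.append(round(sum(chunk) / window, 2))
    return out

# Hypothetical freshman-GPA series for one school (illustration only):
gpas = [2.90, 3.10, 2.95, 3.20, 3.05]
print(rolling_mean(gpas))  # [None, None, 2.98, 3.08, 3.07]
```

    A three-year window is enough to damp single-year swings in the small private-school cohorts without hiding a real trend.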

    1. Wondering how this argument would hold up for UC and top-tier university comparisons? As the parent of a private school student and also a public school educator, I am conflicted by this issue.

  5. Hi John,

    I really appreciate the thought and effort you have put into this. It is an important discussion. And I agree that while anecdotal information has great value, at the end of the day hard, empirical data, even with its limitations, is the best measurement tool we have at hand. I was wondering about the sample size for each population. Do you think that is a limiting factor in your analysis?


    1. Merrill -- thanks for the comment. One of the things to remember is, what are we looking at? As far as the CSU population is concerned, we're not sampling -- we're reviewing all of the data for every student from these schools -- there's no box model involved. So on that particular question, the concept of sampling isn't applicable -- it's more like an election return than an opinion poll.

      It looks like pretty much all the schools send roughly 10% of each graduating class to CSU, ±3%, regardless of school size. Further, at the private schools I've focused on, the students heading to CSU are typically above the school's average -- not by a lot, about .2 or .3 of a standard deviation -- but they are above average. Knowing that our typical subject is on the right side of the normal curve is very helpful -- there just can't be a huge number of students going to "elite schools," for example, performing at a higher level, unless the school is Lake Wobegon High or something. Further, the students at the far right end of the normal curve are ones for whom individual g factors probably explain more about performance than institutional characteristics do, which is what I think is important.

      Merrill, as lawyers, when we look at arguments, and we see one side has data (really, evidence) and the other doesn't, we have a couple of rules (standards) that we apply to sort the problem out. Who has the burden of persuasion in the first place?  Is the question one where we're looking for a preponderance of the evidence? Are we looking for evidence beyond a reasonable doubt? Are we looking for clear and convincing evidence?

      Parents are spending very significant sums on private education, and oftentimes their major, if not only, goal is to obtain an academic advantage for their kids. If that's true, I'd think the burden of persuasion is on the private schools, and that the standard to apply is probably clear and convincing. I can see how the standard would shift to preponderance if cultural and/or religious factors are at issue, but I think the burden would still remain on the private schools.

      Objectively, I think we have to note that the preponderance of the evidence favors the public schools here, just because there isn't data on the other side of the argument. I leave it, again, as an exercise for the reader to determine whether the evidence in favor of public schools has started to reach the clear and convincing standard.