Head Start Impact Study Final Report
US Department of HHS
January 2010
This study uses random assignment to answer a specific research question – what is the causal impact of one year of Head Start (2002-2003) on key child outcomes? Drawing on a nationally representative sample of 5,000 children, the analysis examines outcomes for 3-year-olds and 4-year-olds receiving one year of the program. It measures results at various intervals (during preschool, kindergarten, and through the end of first grade) in the domains of cognitive development, social-emotional development, health status, and parenting practices.
Researchers found that a year of programming had several positive impacts on school readiness measures one year later, but by the end of first grade, most of these impacts had disappeared. A few remained intact: for the 4-year-old cohort, small impacts on vocabulary scores, receipt of dental care, and health insurance coverage; for the 3-year-old cohort, impacts on oral comprehension, closer relationships with parents, and less authoritarian parenting styles. The rest of the 420-page report points overwhelmingly to an unfortunate trend: early learning impacts often fade out after a few years.
Despite its gold-star methodology, readers may be left with pragmatic questions about how to improve Head Start programming, whether sustained improvements to child-parent relationships and health outcomes fulfill the goals of the program, and whether (and how) to allocate scarce funds to early learning if it doesn’t significantly improve children’s academic readiness.
In Ohio, policymakers may wonder about the efficacy of not just the federal Head Start program, but the state Early Learning Initiative (which replaced the state Head Start program under Gov. Taft). In light of the less-than-inspiring findings on Head Start, Ohio can be proud that it replaced the Head Start Performance Standards with its own Early Learning Content Standards (arguably better, as they are aligned with K-12 content standards). However, this makes Gov. Strickland’s decision to slash the Early Learning Initiative all the more troublesome.
Read the full report here.
CALDER at the Urban Institute
Damon Clark, Paco Martorell, & Jonah Rockoff
December 2009
This working paper from CALDER (Center for Analysis of Longitudinal Data in Education Research) uses data from New York City to examine the relationship between principal characteristics and school performance, particularly whether past principal experience has an impact on the performance of the schools in which they are placed.
The authors find that schools led by experienced principals have stronger school-wide performance than those led by principals without such experience, especially on math outcomes and student absences. Among inexperienced principals, the only ones whose schools improved were those who had previously served as assistant principal in that same school. It seems intuitive that previous experience as a principal or assistant principal would contribute to better school performance, but less obvious is the finding that principals’ graduation from highly selective universities and prior (pre-principal) work experience have little correlation with student outcomes.
The paper also makes a credible case that retaining principals over a longer period of time will aid school performance. The fewer times a school switches principals, the less time (and money) it will spend getting new principals adjusted to their new roles.
More generally, the paper suggests that “characteristics that can be directly observed on a resume … are probably less important than characteristics that cannot, such as leadership skills and determination.” This finding is all the more interesting in light of a recent article in The Atlantic that described Teach for America’s efforts to pin down “squishier” concepts of teacher quality, such as “grit” and “perseverance.”
Because the findings are based on data from NYC schools and principals, the results cannot be immediately generalized to Ohio. But the report’s findings on principal quality parallel much of what we know about teacher quality – academic or formal training may not matter as much as immediate experience in schools, and characteristics such as “leadership” that are traditionally harder to measure. Ohio leaders would do well to read this paper (and others from CALDER measuring principal quality), especially in light of the fact that a large number of principals in the Buckeye State are nearing retirement and the state must think strategically on how to recruit new talent into the profession. Read the full report online here.
Deloitte LLP
November 2009
This national report assessed, from the perspectives of students, teachers, and parents, the purpose of high school. To some of us, the answer appears manifest: to prepare students for post-secondary education and successful careers. However, the results from this survey portray a culture that believes otherwise.
The Deloitte 2009 Education Survey Overview highlights the blinding case of myopia that has pervaded our school system and its potential impact on low-income high school students. For example, when teachers in the study were asked to define their primary mission as a teacher, 38 percent responded “help students master the subject you teach”; a scant 9 percent replied “prepare students for success in college.” In contrast, when students and parents were surveyed, both overwhelmingly identified the most important purpose of high school as “getting prepared for college.”
For low-income students, a college education is becoming one of the only ways to escape the cycle of insolvency all too familiar to their families and friends. It is essential for teachers to help these students believe that a demographic characteristic need not predict their fate.
The report also noted the difference between a student’s desire to go to college and his or her ability to actually complete college-level work; 70 percent of students said they “definitely” wanted to attend college, but less than a quarter actually felt “very prepared.” These sentiments were echoed by their parents: 89 percent thought it “very important” that their child attend college, yet only 29 percent felt that their children were “very prepared” to handle the course work.
In Ohio, similar patterns have emerged. Students clearly want to attend college but lack the preparation to do so. In a 2005 survey by the Ohio Board of Regents, only 24 percent of Ohio’s first-time college freshmen had taken a complete core of college-preparatory classes; 41 percent took at least one remedial course during their first year. Compound the problem with scant funding and a lack of assistance for students traversing complex admissions processes, and it becomes clear why low-income students nationally and in Ohio are underrepresented on college campuses.
Although the survey’s sample comprised only a drop in our education system’s bucket – 400 teachers, counselors, and administrators, as well as 400 parents and 601 high school students in total – the implications are significant. The value of a college diploma in today’s society is at an all time high, and so must be the belief that, regardless of a student’s socioeconomic status, the central role of high school is to prepare students for college. Read the report here.
School districts and STEM schools should be able to assign online work to students to make up for calamity days, according to legislation introduced earlier this month in the Ohio House of Representatives.
Under House Bill 407, school boards at Ohio district schools and STEM schools could voluntarily propose a plan to the Ohio Department of Education detailing how they wish to assign such work (the legislation does not include charter schools because they do not have calamity days). The legislation has received some bipartisan support from sponsors Rep. James J. Zehringer (R-Fort Recovery) and Rep. Mark Okey (D-Carrollton), as well as two Democratic and eight Republican cosponsors.
Some of Ohio’s current efforts to provide alternative, online sources of education have worked well and others haven’t. Plans submitted under this legislation would likely be no exception. Yet the potential for online education clearly exists. Ohio has launched a pilot distance-learning program statewide, and 25 percent of Ohio’s charter school students attend e-schools.
Still, it is unclear how much preparation schools will be expected to undertake if they apply.
Will they need daily, online lesson plans prepared for every course by the August 1 proposal deadline? Would it be enough for teachers to simply send their students an e-mail message asking them to read certain pages from a book? Is the online work supposed to replace the material missed during the calamity day, or does it provide a pathway for teachers to bump their schedule up a day so the missed work can be made up in class?
These questions lead to a central one: how will the Ohio Department of Education evaluate the proposals? Without more guidance on how these plans will be implemented, it is difficult to say whether they will be effective or a waste of precious time and effort.
Used properly, this legislation can demonstrate how, once again, the Internet can overcome impediments (such as fickle Ohio weather) to education. But busy work in the name of using technology and saving schools from a calendar day in class should be avoided.
The Ohio School Funding Advisory Council had its second meeting last week. Some observers have questioned the makeup of the 28-member panel, a group charged with crafting education spending recommendations by December 2010 but that is stacked with folks who may have a “vested interest” in seeing larger education budgets come to fruition.
Despite this, several council members voiced concerns about the fiscal realities facing Ohio in the biennium, as well as the efficacy of the “evidence-based model” (EBM) in other states. With the first of the EBM’s mandates ready to be unleashed on school districts in July 2010, that’s certainly a fair question.
Deborah Delisle, state superintendent of public instruction, pointed out that analyzing the EBM elsewhere won’t give us the answers we’re looking for, because in other EBM states, “there has not been fidelity to the [evidence-based] program.” This may be true, but it raises an obvious question: if school districts elsewhere haven’t been faithful to the mandates enshrined in the EBM (reduced teacher-student ratios; all-day kindergarten; mandatory staffing of counselors and “school wellness coordinators,” etc.), what does Ohio plan to do differently to enforce spending requirements that districts are already unhappy with? (Recall the push-back against the all-day kindergarten requirement, the first of Ohio’s EBM mandates to be phased in.)
Understanding the “evidence” behind this massive – and expensive – undertaking is important. Rep. Steve Dyer encouraged council members to have faith in the “scientific evidence” of the model (Fordham remains skeptical of the science, as does prominent economist Eric Hanushek, among others).
As for financial realities? “Yes, there will be economic realities, but that’s something we have to worry about in a couple of years,” said Dyer. He also stated his belief that “the magic of the EBM” lies in its transparency; because the EBM identifies the cost of education strategies “proven” to work, taxpayers know where their dollars are going. (It’s true the EBM costs out the price for various reform strategies, but taxpayers and individual school districts may not like being instructed on how to spend their money. Transparency doesn’t diminish the desire for autonomy.)
A question posed by council members as to whether the EBM has worked in other states that have adopted it (Arizona, Arkansas, Kentucky, Washington, Wisconsin, Wyoming) is one that all Ohioans should be asking. If the Buckeye State is going to commit to increased spending on education and impose prescriptive requirements on districts, shouldn’t we look to other states that have already committed to the EBM to see if its promises (improving student achievement by an extraordinary three to six standard deviations!) hold up? As one council member put it at the meeting, “If someone could convince me that by adding [these mandates] we would improve student achievement, then we’ll find the money somehow.”
But the academic results in other states that have adopted the EBM (as measured by NAEP, the “nation’s report card”) are not as stellar as one might hope. Admittedly, there is no way to determine whether the EBM had a direct causal impact on student achievement scores (for better or for worse) in these states, and the graphs don’t attempt to show the precise date that the EBM was implemented in each state (assuming it was gradually phased in, as Ohio plans to do). Still, it is telling that achievement patterns in EBM states over the last decade – in both tested grades and subjects – look remarkably similar to the national average. In fourth- and eighth-grade math, most EBM states saw small gains in proficiency of a few percentage points, but so did non-EBM states. In reading, the findings are grim. Despite ten years’ worth of reforms, proficiency rates in reading in EBM states are alarmingly flat. A handful of states even saw a decrease in test scores.
Improving student achievement statewide is no easy task. Members of the School Funding Advisory Council may be tempted to buy into the myth that increased spending will translate into student achievement gains, but the experience of other EBM states should serve as a cautionary tale. Spending more on inputs without a clear link to improving student outcomes can only guarantee one thing: more spending. The mandates embedded in the EBM haven’t lived up to their promises in six other states, and we are naïve to think that the EBM will be more effective in Ohio. To quote an old adage, if something sounds too good to be true, it probably is.
Congratulations, Ohio. The state continues its slow climb up the Education Week achievement ladder, showing that improvements put in place over the last decade are creating a strong educational infrastructure. See here for the report (subscription needed).
But that’s all. Unfortunately, what Ohio’s fifth-place, B-minus finish (Maryland was first at B-plus) really shows is that adults in the state are better off than most students. The Buckeye State received good marks for its accountability program and was okay on equity in financing, for example, but when it comes to actual student learning we aren’t doing so hot. It’s like getting all excited about how grand a brand-new school building looks and forgetting that the important thing is what’s going on inside.
Still, perception is important. The Quality Counts survey is considered a big deal and gets lots of press coverage. State Superintendent Deborah Delisle gushed her relief when the annual results were issued earlier this month. “This report confirms what the members of Ohio’s educational community have known for several years – Ohio has a strong system that is viewed as a national leader,” she said. Delisle went on to praise administrators, teachers, policy makers, and students.
Unfortunately, students didn’t have much to do with it. Academic performance is actually a drag on our state’s ranking.
And that’s the rub for taxpayers. They are no doubt pleased that the state’s education system seems to be getting better, but they also have a right to be confused. Ohio has the fifth-best education system in the nation? At a B-minus, according to Quality Counts, better than average?
Then why is everyone complaining so much about the schools? And why do students who received Bs in high school math need to take remedial algebra as first-year college students?
Or, why are many urban school districts never rated higher than a D on the state’s rating system – never mind that too many schools in those districts are rated F?
And then there are scores on national tests. Ohio’s fourth graders placed ninth in math in 2000 and 11th in 2009 on the National Assessment of Educational Progress (NAEP). That’s the wrong direction for a school system that’s supposed to be improving. Eighth graders placed eighth in math in 2000 and 24th in 2009. Ouch. In reading, fourth graders placed 14th in 2002 and 11th in 2007 on the NAEP. Eighth graders placed ninth in reading in 2002 and 11th in 2007.
The fact is Ohio’s student achievement has never been fifth-best in any of these measures and, in just these brief examples, we’ve slipped backward in three of four.
Finally, why did we go through all the angst last year of fighting over new plans to “fix” the education system if it’s as good as Delisle says it is? Quality Counts evaluates what is currently in place, not the changes approved in Gov. Strickland’s education agenda that have yet to be phased in.
Here’s how Quality Counts rated Ohio, followed by Ohio’s national ranking in each category: Chance for Success, C+ (tied for 24); Standards, Assessment & Accountability, A (3); Teaching Profession, C+ (14); Finance, C+ (18); Transitions/Alignment, B- (10); K-12 Achievement, C- (14). The first five are all about what adults do in the education system.
And, actually, much of the Chance for Success category has little to do directly with schools. It is more aligned with the economic well-being of students’ families, such as whether parents are employed, their level of schooling, and family income. These factors are important for an individual student’s educational success, but they have nothing to do with what the public school system is doing.
In a piece online and to be published in the upcoming Education Next, Stanford University’s Margaret Raymond argues that the Chance for Success rating doesn’t even show the likelihood of student success. “Instead, they provide statistics that divert attention away from the things that actually do matter, such as high-quality teaching, a good range of school options, and success in early elementary schools,” she said.
“Until the measures that are incorporated into the Quality Counts ratings are more clearly tied to education outcomes, we are likely to see continued shifts in rankings that bear little resemblance to actual changes in education quality,” she wrote in a blog posting earlier this month.
Even the Standards, Assessment & Accountability area, which produced Ohio’s best showing, doesn’t indicate whether Ohio schools actually have adequate assessment and accountability systems, only that the state has a good plan on paper. In fact, most schools don’t have plans. (For a closer look at these categories, see former state board of education member Colleen Grady’s analysis here.)
But, clearly, a reasonable person might ask: what good is Ohio ranking third in assessment and accountability when actual student achievement – on average – across the state was so low, at C-minus? In fact, there is a huge disconnect between most of what Quality Counts measures and the bottom line of what schools are in business to do, which is to educate kids.
Flush with success, Governor Strickland touted the fifth-place finish in yesterday’s State of the State address, and the Ohio Department of Education played up the ranking in its application for federal Race to the Top funds. Ohio wants to be first in the rankings within four years and we might just make it – especially by Education Week’s metrics, where students don’t count for much.