Cash incentives to curb chronic absenteeism are worth a try
Last year, 27 percent of Ohio students were chronically absent, meaning they missed more than 10 percent of the school year for any reason. In districts such as Columbus and Cleveland, more than half of students were chronically absent. That’s an enormous problem, as absenteeism negatively impacts student achievement.
In December, Ohio House lawmakers Bill Seitz and Dani Isaacsohn proposed a rather creative idea with House Bill 348, legislation that would provide cash incentives to kindergarteners and ninth graders for regular school attendance. Yet after news coverage of the bill appeared in mid-January, a few Columbus Dispatch readers belittled the idea in letters to the editor. The Youngstown Vindicator editorial board also scoffed, calling the idea “ridiculous.”
Skeptical reactions are perhaps understandable, as the public might reasonably expect students to attend school without the need for incentives. But I think the House proposal deserves further consideration. The nutshell reasoning is this: First, education scholars have studied the impacts of financial incentives, and the results show promise, especially when the incentives are thoughtfully designed. Second, the legislative proposal itself—a modest pilot project—has commendable features that generally follow the research and could even yield further insight into the effectiveness of incentives. The remainder of this piece takes a closer look at the research and then turns to the specifics of House Bill 348.
Research on financial incentives in K–12 education
Offering “carrots” to motivate certain behaviors is a commonly used tactic. Parents offer their toddlers a treat to learn basic tasks like going to the bathroom. Employers reward workers with year-end bonuses for a job well done. Decades ago, I participated in Pizza Hut’s Book It program to earn free pizza for reading books. Whether these types of motivators actually change behavior and outcomes is routinely discussed and debated, and researchers have conducted experiments in education settings that explore whether cash incentives for students might be worth a shot.
Roland Fryer of Harvard University has conducted the most extensive analyses of cash incentives, carrying out pilots in Chicago, Dallas, Houston, New York City, and Washington, D.C.[1] Based on random assignment, some students in these locales were eligible to receive a reward, while others were not—a method that provides strong evidence about the impact of the incentive. He varied the incentives by city in an effort to find which structures might work best.
The experiments in Dallas, Houston, and Washington, D.C., yielded the best results. In Dallas, for instance, English-speaking students who participated in a book-reading incentive program made impressive gains equivalent to roughly two additional months of learning on standardized exams. However, in places where incentives were more directly tied to achievement outcomes—Chicago and New York City—results were not significant.
From his extensive body of work, Fryer concludes that the design of a cash-incentive program matters immensely. Specifically, tying incentives to educational “inputs” is generally more effective than linking them directly to an academic outcome. Students tend to have more control over inputs and a clearer understanding of how to earn the incentive. Reading a book and passing a short quiz on it, as in Dallas, is more straightforward than asking students to figure out how to improve their test scores. And he found that students’ responses to input-based incentives ultimately produced better academic results.
Ohio’s cash-incentive proposal for attendance
Under the recently proposed legislation, school districts could apply to the state to have up to two of their schools participate in the cash-incentive program. Such schools must have high chronic absenteeism rates. From there, the Department of Education and Workforce would randomly assign kindergartners and ninth graders attending selected schools to “treatment” or “control” groups (i.e., eligible for the incentive or not). Students could earn roughly $500 to $600 per school year when they achieve an attendance rate of 90 percent or higher.[2] The overall price tag for the pilot is very modest—$250,000 in year one and $500,000 in year two—likely making the program possible at just a small handful of Ohio schools.
As proposed, the program has several strengths. First and foremost, it addresses an urgent issue that requires policy attention. Although there are efforts afoot to encourage better attendance, none are quite as direct as what HB 348 proposes. Kindergarten and ninth grade are sensible places to start, too, as absenteeism is higher in those grades. And because the program focuses on a straightforward “input” such as attendance—something that families and students can control—research suggests it has a greater likelihood of boosting attendance and, perhaps, achievement as well. As an added benefit, the program’s use of random assignment could yield rigorous experimental evidence that helps inform whether initiatives like this should be scaled (or not) in future years.
As with any policy proposal, there are also some tweaks lawmakers should consider that would make the bill even stronger. Here are three possibilities.
First, raise the bar for attendance. As proposed, students only need a 90 percent attendance rate to earn the incentive. That essentially means just avoiding chronic absenteeism, which is defined as missing more than 10 percent of the school year. To receive the cash incentive, students should have to attend school at least 95 percent of the time.
Second, scrap the more problematic incentive based on high school graduation. In addition to the attendance incentive pilot, House Bill 348 includes a companion piece intended to incentivize high school graduation. While Ohio still has too many dropouts, graduation rates have actually increased in recent years. Moreover, a modest financial incentive is highly unlikely to motivate students who are far behind in meeting graduation requirements, as achieving that goal would take significant time; those students certainly need help, but cash alone won’t provide it. Even for students right on the cusp of meeting graduation requirements, a financial incentive is unlikely to nudge them further along, as the promise of a diploma is an incentive in and of itself.[3]
Third, add language that would explicitly require an independent evaluation of the pilot program. Although the bill wisely structures the pilot in a way that is amenable to research, it doesn’t actually call for a study of its impact. That’s a missed opportunity, as policymakers could use such research to shape future attendance efforts.
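Because the pilot relies on random assignment, the core of such an evaluation could be quite simple: compare average attendance between incentive-eligible and control students. The sketch below illustrates that logic with entirely hypothetical numbers; it is not drawn from the bill or from any actual data.

```python
# Minimal sketch of evaluating a randomized attendance-incentive pilot:
# estimate the treatment effect as the difference in mean attendance
# between the incentive-eligible group and the control group.
# All attendance rates below are hypothetical, for illustration only.
treatment = [0.96, 0.91, 0.88, 0.97, 0.93]  # incentive-eligible students
control = [0.90, 0.85, 0.92, 0.83, 0.89]    # control students

effect = sum(treatment) / len(treatment) - sum(control) / len(control)
print(f"Estimated effect on attendance rate: {effect:.3f}")
```

A real study would of course use far larger samples and test whether the difference is statistically distinguishable from zero, but random assignment is what makes even this simple comparison a credible estimate of the incentive’s impact.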
* * *
Poor attendance is a real issue for schools in Ohio and across the nation. It not only hurts students who miss learning opportunities, but also creates challenges for educators, who must expend even more effort to remediate chronically absent students. Initial skepticism around House Bill 348 is understandable, given society’s expectation that children should be in school. But absenteeism has hit crisis levels, even with that expectation in place. To comprehensively address the issue, schools need a toolkit of carrots, prods, and supports. If passed, and if it proves effective, HB 348 would be another tool in that kit.
[1] Other notable experimental studies on cash incentives in K–12 education include Eric Bettinger’s study from Coshocton, Ohio, which found positive impacts on math test scores (but not reading), and Steven Levitt, John List, and Sally Sadoff’s work in Chicago-area districts, which generally finds short-term impacts of incentives, provided they are awarded promptly (not on a delay).
[2] The bill proposes to experiment with three different payment methods, each contingent on a 90 percent or higher attendance rate during the relevant period: $25 paid biweekly, $150 paid quarterly, or $500 per year. For kindergartners, the incentive funding would be directed to their parents. For ninth graders, it would be payable to the student and parents jointly.
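For a rough sense of how those three schedules compare, here is a back-of-the-envelope calculation. The 36-week school year is an assumption for illustration (the bill does not specify one); the totals land in the neighborhood of the roughly $500-to-$600 annual range described above.

```python
# Approximate annual totals for the three proposed payment schedules,
# assuming a 36-week school year (illustrative assumption, not from the bill).
weeks = 36

biweekly_total = 25 * (weeks // 2)  # $25 every two weeks -> 18 payments
quarterly_total = 150 * 4           # $150 per quarter -> 4 payments
annual_total = 500                  # a single $500 payment

print(biweekly_total, quarterly_total, annual_total)  # 450 600 500
```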
[3] In addition to paying a “base” incentive of $250 for graduation, the bill awards higher cash incentives based on high-school graduates’ GPAs. Unfortunately, that could create a perverse incentive for teachers to further inflate course grades. In Fryer’s work from Chicago, he found that directly incentivizing course grades did not yield achievement gains on exams.
Last year, Ohio lawmakers used the state budget bill to enshrine into law some important early literacy reforms focused on the science of reading. If implemented effectively, these reforms could improve reading achievement in schools across the state. But one of the keys to effective implementation—ensuring that both current and future teachers are well trained in scientifically based reading instruction—is a lot easier said than done.
That’s especially true in Ohio. According to an NCTQ analysis published last year by Fordham, just nine of twenty-six teacher preparation programs in the state provided adequate coverage of all five components of reading science (phonics, phonemic awareness, fluency, vocabulary, and comprehension). Even worse, more than half promoted practices that are contrary to research-based methods, like three-cueing (i.e., teaching kids to guess at words).
Fortunately, last year’s budget bill put Ohio on the path toward fixing that. In fact, a second analysis by NCTQ—this one published just last week—indicates that not only is the Buckeye State on the right path, it ranks “significantly above the national average” in five key policy actions. These actions, which were identified by NCTQ as instrumental to ensuring that a state’s teacher workforce can implement and sustain the science of reading over time, were used to evaluate all fifty states and the District of Columbia. Twelve states earned NCTQ’s highest overall rating of “strong,” and Ohio was one of them. Let’s take a look at why.
1. Setting specific, detailed reading standards for teacher preparation
For teacher preparation programs, standards offer clarity around which skills and knowledge should be taught in depth, and what teacher candidates must demonstrate mastery of before they graduate. For states, they provide explicit criteria that can be used to give feedback, either through audits or renewal reviews. And for districts, standards can offer assurance during the hiring process that candidates have been well trained.
NCTQ identified three indicators within this category. The first is a no-brainer, and asks whether a state has specific and detailed standards for elementary preparation programs that cover all five core reading components. The next two are less obvious, but no less important: Do the state’s standards incorporate how to teach struggling readers (including those with dyslexia), as well as English learners? Thanks in large part to its recent legislative changes, Ohio can say yes on all three fronts.
2. Reviewing preparation programs to ensure they teach the science of reading
NCTQ’s indicators in this area include conducting reviews to hold programs accountable for implementing the science of reading, maintaining full state control over which programs are approved and renewed, using multiple sources of evidence to evaluate science of reading implementation, and utilizing literacy experts in program reviews.
Ohio meets expectations in three of the four indicators. It falls short on the last, as the Department of Higher Education—the entity responsible for conducting Ohio’s newly established audits of preparation programs—is not required to include reading specialists or experts in its reviews (though it may do so in practice). Despite this shortcoming, it’s important to recognize just how big a deal it is that Ohio will soon audit preparation programs to ensure they’re training teacher candidates in reading science. That’s a level of accountability that didn’t exist prior to last year’s legislation, and it will play a crucial role going forward in ensuring lasting change—provided Ohio lawmakers fully fund the effort (more on this later). Also worthy of note is the fact that audits will focus solely on the science of reading—an important distinction, as a more general audit could be too shallow to make a difference. In addition, the department will be required to publicly release summaries of its findings, and all programs will be reviewed every four years.
3. Adopting a strong elementary reading licensure test
This area examines whether states have a reading licensure test designed to ensure that new teachers understand the science of reading. NCTQ also asks if elementary teacher candidates are required to pass that test and whether states publish pass-rate data (especially first-time pass rates). Taken together, these three indicators can help verify that teacher candidates truly understand scientifically based reading instruction and identify programs that do an excellent (or lackluster) job of preparing candidates to succeed. Ohio currently checks the first two boxes—the state has a reading licensure test, and candidates are required to pass it—and partially checks the third, as it publishes some pass-rate data. Thanks to last year’s legislation, though, the state will soon be publishing first-time pass-rate data, as well. That change will bring Ohio up to three out of three.
4. Requiring districts to select a high-quality reading curriculum
This is arguably the most important policy action, as using high-quality curricula is a cost-effective reform that can boost student outcomes. Perhaps because of its importance, there are six indicators. The four indicators Ohio met include whether the state requires districts to use core curriculum materials from an identified list, provides guidance or evaluation tools to help districts select supplemental materials to support struggling readers, provides guidance and tools for English learners, and has allocated resources to help districts purchase new curricula.
Ohio fell short because, although districts are required to report the curricula they are using, the state does not publish that information on its website or require districts to publish it on theirs. These shortcomings, however, weren’t enough to prevent NCTQ from spotlighting Ohio’s forthcoming list of state-approved high-quality curricula, as well as its allocation of $64 million to help districts purchase materials.
5. Providing professional learning and ongoing support to teachers
This category examines whether states require current elementary teachers to be trained in scientifically based reading instruction, as well as whether they allocate resources to districts to support implementation and professional learning. Ohio can check off both boxes, as these requirements were key parts of recent legislation. Specifically, the budget allocated up to $6 million in FY 2024 and $12 million in FY 2025 to pay for literacy coaches in public schools with the lowest rates of proficiency. The Department of Education and Workforce is also partnering with Keys to Literacy, an NCTQ-recommended professional development provider, to create a course in the science of reading for current teachers. Stipends of $1,200 are available for most teachers who complete the course.
* * *
Despite the rave reviews, NCTQ offers several solid recommendations for Ohio to make its policies even stronger, including requiring literacy experts to be part of program reviews and publishing the reading curricula used by each district on the state’s website. But those aren’t the only issues that need to be addressed.
For starters, it’s crucial that the Department of Higher Education conduct thorough, high-quality reviews of teacher preparation programs annually. But early indications are that lawmakers didn’t appropriate nearly enough money for them to do so. If the department can’t afford to ensure these audits are done well, it will be impossible to determine if teacher candidates are being properly trained. When the budget cycle restarts next year, lawmakers should prioritize funding teacher preparation audits.
In addition, there have already been attempts via Senate Bill 168 to water down professional development expectations for teachers regarding students with dyslexia. The Buckeye State has done admirable work to improve how it serves students with dyslexia, and it would be a shame to slide backward so soon. Lawmakers should resist these efforts.
Finally, it’s imperative for Ohio to stay the course. That might seem like a meaningless recommendation given how new these provisions are, but lawmakers have an unfortunate history of failing to follow through on important policy reforms. Ohio leaders deserve plenty of kudos for their efforts last year, but it’s a new year now—and there’s plenty of work to be done.
Although it’s a brand-new year, many Ohio students are still caught in the education riptide of the pandemic era. Achievement in reading and math has improved from the horrific lows of 2020–21, but students remain behind. In math, for example, just 53 percent of students met grade-level proficiency standards last year, down from 61 percent in 2018–19. Achievement gaps remain unacceptably wide, and chronic absenteeism is pervasive. To turn the tide, schools and students need all hands on deck. But many of the most important hands—those that belong to parents and families—may not fully realize the challenges facing their children.
Consider a 2023 report published by Learning Heroes, a nonprofit group that seeks to equip parents to support their child’s education. They found that nearly nine in ten parents nationwide believe their children are performing at or above grade level in reading and math, even though fewer than half are actually performing at grade level. This disconnect is largely a result of parents relying on report card grades as their primary source of information. The vast majority say that their kids are earning mostly A’s or B’s, and “understandably equate a good grade with grade-level achievement.”
But report cards measure much more than just achievement—things like class participation or group work might factor in, too—and grade inflation is rampant. As a result, they note, “relying on report cards in isolation could prevent parents from initiating crucial interventions.” Without a complete picture of student progress, it’s impossible for families to hold schools accountable for serving their children well or know when it’s time to make a change.
Fortunately, last year’s landmark state budget, House Bill 33, included two important provisions that could help Ohio parents get a better handle on their child’s achievement and empower them to make decisions accordingly. Both provisions must be implemented by districts beginning this year. Given that the state’s testing windows begin as early as March 25, state and district officials are hopefully already discussing how to implement these changes (and if they’re not, they should be). With that in mind, let’s take a closer look at each.
Faster delivery of a child’s state test scores to parents
After students complete state exams, the Department of Education and Workforce (DEW) assembles test score reports that offer detailed information about how students performed in reading, math, and other subjects compared to state standards. DEW sends these reports to districts, which then distribute them to parents.
The vast majority of students finish testing by early May. Ideally, families would receive their score reports within a month or so—not just for the sake of an efficient and timely turnaround, but also because score reports can help parents decide when to get struggling students extra help (like tutoring over the summer) or when to make an even bigger change (like switching schools in the fall). Unfortunately, Ohio families haven’t historically been guaranteed a quick turnaround for state test results. Most have waited months to receive their child’s score report, leaving them little to no time to take advantage of the summer months or plan for the fall.
Thanks to HB 33, that’s about to change. Beginning this year, public and private schools will be required to provide score reports to families by June 30. This new deadline could require a great deal of administrative work for districts, especially those that have historically distributed reports via the postal service or students’ backpacks when they return in the fall, rather than through the more efficient means of email or a secure online portal (districts and schools have discretion in how test scores are reported to parents). It will also require the state to move more quickly, and to ensure that tests are still scored accurately despite the tighter turnaround. Both will be tall tasks. But when it comes to ensuring that parents have up-to-date and accurate data so they can make informed decisions, timeliness is necessary.
Intervention and parental engagement for struggling readers beyond third grade
Since 2012, Ohio’s Third Grade Reading Guarantee has required schools to administer diagnostic reading assessments to students in grades K–3, identify those who are off track, notify their parents, and create improvement plans. The original version of the Guarantee also required schools to retain students who, based on state assessments or state-approved alternative exams, didn’t meet reading standards by the end of third grade. Schools were required to provide these retained students with intensive reading interventions, such as summer school or tutoring.
Last year, after much complaining from school groups and despite strong evidence that it benefits students, lawmakers functionally ditched the retention requirement. Going forward, schools will be able to promote a student to fourth grade regardless of their reading level if their parent or guardian requests promotion after consulting with the student’s reading teacher and building principal.
In a perfect world, parents of children who do not meet third grade reading standards would receive all the information they need to make an informed decision between retention and promotion. That includes an accurate debrief on the extensive research demonstrating that retention benefits struggling readers, acknowledgment that social promotion might harm them in the long run, and a clear and specific outline of how the district plans to provide intensive intervention. However, given that the driving force behind eliminating retention was largely district administrators and teachers unions, the reality is that most consultations will result in a recommendation for parents to request promotion.
When that happens, districts will be off the hook for retention. But they won’t be off the hook for updating parents about their child’s progress and providing intensive intervention until the student reads at grade level. Under the previous policy, districts only had to provide intensive reading intervention for as long as a student was retained, which couldn’t be more than two years. Now, they’ll be responsible for providing intervention for as long as it takes for students to meet grade-level standards—even if that doesn’t happen until middle or high school—and keeping parents informed of their progress. This should give families additional opportunities not only to obtain accurate information from schools, but to hold them accountable for catching their kids up.
* * *
In schools across the state, thousands of students are still struggling to break free from the academic undertow of the pandemic. To help them catch up, educators and families will have to work together. But that’s only possible if parents have accurate, timely, and honest information about student progress and achievement. A quicker turnaround on state score reports and a new opportunity for parents to hold their local school accountable for their child’s progress are steps in the right direction.
A new research report examines the confluence of career and technical education (CTE) and the academic trajectory of high school students with learning disabilities (SWLD). The impetus for such an investigation is the existence of two separate “leaky pipelines”: large numbers of students with disabilities who fail to transition successfully to postsecondary education and employment, and a projected dearth of new recruits for numerous STEM jobs. The research team, led by Ohio State University’s Jay Plasman, was looking for an approach that could seal both leaks—hoping to find that high school CTE courses propel SWLDs into further STEM education, where employers could find a previously overlooked group of prospective recruits, getting both pipelines flowing more strongly. The study’s findings are inconclusive, but point in positive directions.
Primary data come from the High School Longitudinal Study (HSLS), which followed a nationally representative cohort of students who entered ninth grade in 2009 through high school and into postsecondary education and early career, ending in 2019. The robust dataset includes full course-taking histories (including grades and credits earned), as well as detailed demographic information. The researchers narrowed their sample down to 870 HSLS students whose parents responded affirmatively when asked if they had ever been told by a doctor or other professional that their child had a specific learning disability. This identification method—used because the HSLS administrative data on IEPs are incomplete—has been employed successfully in prior research.
Course-taking data were narrowed to those classes with an engineering or technology focus (called E-CTE here and comprising such things as computer-aided design, coding, and laboratory research). Prior research suggests that the hands-on and practical instructional aspects of such courses align with recommended learning strategies for SWLDs and thus support such students’ educational pursuits. (A direct connection between such courses and related postsecondary education and career pathways also underpins this research.)
The researchers ran several different analytical models to determine how E-CTE course-taking in high school links to two specific college preparation outcomes—math SAT scores and dual credit course participation—and to two specific college transition outcomes—FAFSA completion and application to college. No further outcomes, such as college degree or employment, were studied. Their favored model is a robust school fixed effects analysis, which compares SWLDs who participated in E-CTE to SWLDs who took fewer such courses or none at all.
The findings link E-CTE course-taking in high school to positive and statistically significant outcomes across the board. For each E-CTE credit earned, SWLDs could be expected to score about 74 points higher on their math SAT assessment, and had a 15 percent higher probability of participation in dual credit courses. As to college transition outcomes, each E-CTE credit was associated with a 17 percent increase in FAFSA completion among SWLDs and a 13 percent higher probability of completing at least one college application. Among the limitations noted by Plasman and his team: the age of the data, a lack of clarity on the courses’ curricula and skills taught, and not knowing how readily available E-CTE courses are to SWLDs in every school—especially those with more severe learning disabilities. This is a promising start, but not much more than that.
Plasman and his team suggest a number of ways that the positive impacts thus far indicated—that students with learning disabilities can do well and go further with E-CTE type courses—can be disseminated to educators, parents, policymakers, and STEM employers. They also suggest ways in which the research can be extended with newer and longer-term data. (The Holy Grail would be some solid information on college or credential completion and employment data.) Much has changed in the CTE world since 2019—including a rise in short-term credentialing programs and apprenticeships. The potential for students with learning disabilities to benefit even further from these changes is high.
SOURCE: Jay S. Plasman, Filiz Oskay, and Michael Gottfried, “Transitioning to Success: The Link between E-CTE and College Preparation for Students with Learning Disabilities in the United States,” Education Sciences (January 2024).
Equitably funding education in America means providing more resources to students who need additional support. Youngsters from economically disadvantaged families are typically among those determined to need additional help, and they have traditionally been identified by asking families to self-report their financial status and having a third party verify their eligibility for services such as free or reduced-price lunch. But is this really the right method for identifying family need? It has been criticized as intrusive to families and burdensome to schools, while other methods regularly raise concerns about their accuracy. New research out of Oregon compares various methods for identifying students in need, looking for the most accurate.
A group of researchers led by Stanford University’s Michelle Spiegel uses student-level administrative records from a sample of 1,060 public schools in the Beaver State from 2009–10 through 2016–17. These records are linked with U.S. Census Bureau data, Oregon Supplemental Nutrition Assistance Program (SNAP) enrollment, and IRS data to construct two novel benchmark measures of student economic disadvantage—one based solely on IRS data, the other on a combination of SNAP and IRS information. The constructed measures are then aggregated to the school-by-year level and compared, over time and across schools, against five other measures actively used to report poverty levels in Oregon schools. These include student-specific measures, like free or reduced-price lunch (FRPL) and SNAP eligibility (without the addition of IRS data), and community-centric measures, like the Urban Institute’s Model Estimates of Poverty in Schools (MEPS) and the Census Bureau’s household income-to-poverty ratio.
Each measure includes various income cutoff levels, and thus they predictably yield different percentages of students defined as economically disadvantaged. However, the trend lines over successive school years do tend to correlate reasonably well. In other words, if one measure indicates an improvement in economic circumstances (fewer disadvantaged students) from year to year, most of the others indicate similarly.
There is, however, one exception: the Oregon Department of Education’s Economic Disadvantage rate, a widely used measure that includes schools taking advantage of the Community Eligibility Provision (CEP), which allows them to designate all students as economically disadvantaged. That measure correlated least with the others, typically overestimated the number of economically disadvantaged students, and showed a relatively stable percentage from year to year that failed to reflect the economic changes clearly visible in all the others. More on the implications of that in a moment.
Further comparison of trend data indicates that SNAP-based direct certification measures correlate most directly with the researchers’ own method, whose IRS-sourced data are not readily available to school leaders and policymakers. Thus, SNAP-based direct certification is likely the best and most accurate existing means by which to determine true economic disadvantage rates in schools, both over time and across schools.
CEP, which launched in the 2014–15 school year, likely achieved its aims of expanding access to nutritious food, reducing paperwork, and eliminating school lunch debt. But it also began skewing the Oregon poverty data almost immediately. The same pattern appeared in Fordham’s own analysis in our home state of Ohio, which went on to show that levels of other aid for economically disadvantaged students were likely impacted by the designation of 100 percent of students in a growing number of districts as eligible for free lunches.
Spiegel and her team don’t go that far in their analysis, but the implications of their findings are the same: To the extent that other financial supports for low-income students are allotted on a per pupil basis, it is inevitable that a poverty measure that includes all students without regard to actual need will sweep in families of far higher means, risking a misdirection of finite funding. In short, although community eligibility is a commendable school meals initiative, mounting evidence shows that its impact on school poverty measures has negative implications for equitable education funding writ large.
SOURCE: Michelle Spiegel et al., “Measuring School Economic Disadvantage,” Educational Evaluation and Policy Analysis (January 2024).