Why college entrance exams matter
Since the spring of 2017, all Ohio eleventh graders have been required to take either the ACT or the SAT at the state’s expense. Research shows that universal testing policies like this are beneficial for underserved students. Nevertheless, some lawmakers and educators have recently argued that the ACT and SAT shouldn’t be required for every student. They believe that because not everyone plans to go to college, the state shouldn’t require all kids to take a college entrance exam. And they’re not alone. Some colleges and universities have transitioned to “test-optional” admissions.
The problem with this shift toward a more “holistic” view of student readiness is that it assumes everyone has the same opportunities—and the same awareness of those opportunities. But that’s not true. In fact, students from low-income families, students of color, and students who attend chronically low-performing schools often rely on universal services and assessments to level the playing field between them and their White, affluent, or more privileged peers.
Consider a few of the benchmarks that test-optional colleges and universities use in place of college-readiness exams. GPA, class rank, school involvement (such as sports and other extracurriculars), and community involvement (such as volunteer hours) all seem like objective and fair measures at first glance. But a closer look reveals that many of these measures rely heavily on the school a student attends and not the actual student.
I saw this play out all the time when I was teaching in a high-poverty, chronically low-performing high school. I knew plenty of exceptionally bright students whose 4.0 GPAs and top-ten class ranks didn’t mean as much to admissions officers as the same GPAs and class ranks earned by students who attended the affluent, high-performing schools across town. It’s hard to acknowledge that attending a low-performing assigned school can put kids at a disadvantage. We tell students that if they work hard and get good grades, then colleges will notice. But there’s a reason that affluent parents pay obnoxiously high tuition prices for prestigious private schools or sky-high mortgage payments for a house in a certain suburb. The reputation of the school a student attends—or “program rigor,” as it’s sometimes referred to in admissions literature—matters, and it’s completely out of the control of thousands of students.
The activities that fall under the “school involvement” umbrella are completely dependent on schools, too, and opportunity gaps are enormous. For example, at the high school I attended as a student, kids could choose between thirty sports and dozens of other clubs and activities. But at the high-poverty urban high school where I taught, extracurriculars were extremely limited: The only sports that were consistently available were football, boys’ and girls’ basketball, and cheerleading; marching band was the only option for musically inclined students; and students with academically based interests—those who would’ve liked to write for the school newspaper, learn how to code, or join the debate team—had few options. The school didn’t offer consistent volunteer opportunities, either. And because many of my students lacked reliable transportation, worked part-time jobs to help pay the bills, or kept an eye on their younger brothers and sisters while their parents were at work, going out into the community to seek out volunteer or leadership opportunities wasn’t an option.
The upshot? A “holistic” admissions process isn’t as much of an advantage for underserved students as anti-testing folks think. When your school sets you at a disadvantage through no fault of your own, you need objective, third-party measures to prove that you’re just as ready for college coursework as your peers. Obviously, we should be working to close the opportunity gap and ensure that all kids have access to the same opportunities. But focusing on that while simultaneously removing a state requirement for all students to take the same test and be held to the same standards is both illogical and myopic.
Of course, the folks attempting to change Ohio’s universal testing policy will argue that making the policy optional still gives students a free opportunity to take a college entrance exam. They couch this argument in terms of choice and say that families should have the right to choose whether or not their students take an assessment.
But this line of reasoning incorrectly assumes that all families have the same informational access. They don’t. A 2017 study of six graduating classes in Michigan found that for every ten low-income students who sat for a college entrance exam and were deemed college ready, there were another five who also would have been identified as college ready had they taken the assessment. Once Michigan’s universal-testing policy kicked in, the ACT test-taking rates of low-income students more than doubled, from 35 to 85 percent. That’s a massive jump, and it should raise several questions. How many students didn’t take a test prior to the mandatory policy just because they assumed they weren’t college material? How many kids missed out on higher education and all it has to offer because they didn’t realize they had an opportunity right in front of them? And how many Ohio students will end up in the same boat if state lawmakers succeed in changing our mandatory policy?
The bottom line is that changing the state’s college-admissions-exam policy won’t harm students who are affluent, well connected, or attend high-performing schools. But it can and likely will harm underserved students with fewer opportunities. Data—both quantitative and anecdotal—indicate that a mandatory policy gives underserved students a boost and opens doors that may have been closed otherwise. If lawmakers are interested in crafting education policy that’s in the best interest of all kids, then they need to maintain the state’s current mandatory policy.
After months of debate, state lawmakers continue to mull significant changes to Ohio’s school report card system. Two vastly different proposals to overhaul the report card framework have emerged (House Bill 200 and Senate Bill 145). We at Fordham, along with several other education groups, have thrown our support behind SB 145. That proposal makes responsible course corrections while maintaining a strong focus on all students’ academic outcomes and adopting an intuitive five-star rating system. At the same time, we have also voiced serious concerns about the House legislation. That proposal would bury critical achievement and college-and-career-readiness data and hide the ball from parents and the public through the use of technocratic jargon.
In testimony in front of the Senate Education Committee last week, the education establishment tried to defend its support of House Bill 200 and attempted to address some of the criticisms of that legislation. Speaking on behalf of various public school groups, Kevin Miller of the Buckeye Association of School Administrators took issue with Fordham’s characterization of HB 200’s rating system as “actively misleading.” He also sought to explain the exclusion of an overall, or “summative,” school rating in the House plan.
With the summer legislative recess coming soon, there may not be another hearing on school report cards this spring to address their claims in committee. As such, we offer this response to Miller’s arguments.
Why the HB 200 descriptive ratings are misleading
The House plan moves Ohio away from the widely understood but much-maligned (at least by education groups) A–F rating system to a descriptive labelling system. The six proposed ratings are as follows: significantly exceeds expectations, exceeds expectations, meets expectations, making substantial progress toward expectations, making moderate progress toward expectations, and in need of support.
In earlier Senate testimony, my colleague Chad Aldis noted that the “making moderate progress toward expectations” descriptor—equivalent to a D rating—would be “actively misleading.” He is absolutely right. To illustrate the problem more concretely, consider some hypothetical data for the performance index (PI)—a measure of pupil achievement—as well as the graduation rate. Under HB 200, schools could actually register worse results compared to a year prior and receive a rating that indicates they are making “substantial” or “moderate progress” on that measure. Take the case of the (fictional) Washington Elementary, a school that receives a PI rating of “exceeds expectations” in 2021. Even if its PI score drops by ten points between 2021 and 2022, it could still receive a “making moderate progress” rating under HB 200. The same phenomenon would occur with the graduation component, where schools could see a decline in graduation rates yet receive a rating indicating progress over time.
In sum, a school could be regressing—with fewer students meeting state expectations—but the rating labels would tell the public that it’s improving. How is that not actively and knowingly misleading to Ohioans?
Table 1: Illustration of how poorly crafted labels can mislead the public
The administrator associations didn’t actually respond to the concerns about misleading labels or offer other ideas that would better communicate performance. Rather, they pointed to Massachusetts’s report card, which uses this terminology. Indeed, the Bay State does use language similar to the House plan, including “moderate progress toward targets.” But there’s an important difference. As shown below, in Massachusetts’s report card, schools are ranked from 1–99 using an “accountability percentile” that shows where the school stands along the spectrum of statewide performance. Moreover, when a school is low performing, there is even a clear notice to the public that the school is near the bottom statewide. Neither of these features is in the House plan.
Source: Massachusetts Department of Elementary and Secondary Education.
It’s always possible that the administrator associations missed this part of the Massachusetts report card. But one suspects that they may have cherry-picked elements of the report card they liked and omitted those they didn’t (ranking schools has been routinely decried by the establishment). Whatever the case may be, the bottom line is that it’s dishonest to suggest that all the House plan does is simply mirror the Massachusetts report card.
Why overall ratings matter
Another source of disagreement between the House and Senate bills is the overall rating, which summarizes performance across the various dimensions of the report cards. The House scraps this rating, while the Senate continues Ohio’s longstanding policy—and Massachusetts’s too!—by maintaining an overall mark. In his testimony for the administrator groups, Miller argued that the absence of an overall rating would “dilute the significance of the component ratings,” which he deemed “more important” than the overall rating. He is of course right that the component ratings are critical pieces of the report cards. They help users—particularly school officials who are apt to understand each measure—take stock of the strengths and weaknesses of individual schools.
That said, we must recognize that most Ohio parents and citizens don’t work in education. This audience—arguably the primary audience of the report card—deserves a user-friendly summary that doesn’t require them to comprehend the intricacies of each component. In surveys commissioned by Ohio Excels, about two in three Ohio parents favor a summary rating. That’s no surprise considering the widespread use of overall marks in other contexts that give users a straightforward, bottom-line summary. In schools, students receive cumulative grade point averages that capture their performance across various subjects. In business, companies’ creditworthiness is summarized through bond ratings. Hospitals and preschools receive star ratings, much like those proposed in SB 145 for Ohio schools and used in other states’ report cards. Again, these star-rating systems combine various indicators so that the public gets a general sense of the quality of services. While the component ratings are important, the general public likely sees far more value in an overall mark.
* * *
The House and Senate report card bills would put Ohio on very different paths when it comes to school accountability. The House version of report card “reform” is far more accommodative to the public school system, as it softens measures and uses euphemisms that paper over low performance—and even actively mislead the public. This type of system isn’t fair to property taxpayers who deserve an honest evaluation of their local schools when they vote on school levies. Worse, it’s unfair to Ohio parents who are simply searching for a great school that works for their kids.
The Senate legislation is truer to core principles of school accountability. Its approach includes a fair, evenhanded assessment of school performance that is then communicated to the public through a transparent rating system. With any luck, the General Assembly will discount the faltering arguments of the school establishment, and instead look towards a solution that puts students, parents, and citizens first.
The state budget has long been the primary vehicle for instituting sweeping education policy changes in Ohio. Amidst all the hustle and bustle and debate, it’s easy to forget that standalone bills are still being proposed. In fact, given the high stakes, it may even seem wise to focus solely on the budget.
But ignoring standalone bills is a mistake. Consider the recently proposed Senate Bill 166. Its focus is on career-technical education (CTE), and it proposes a number of changes. One particularly promising one is that it calls on the Governor’s Office of Workforce Transformation, the Ohio Departments of Education and Higher Education, and JobsOhio to create a program to offer incentives aimed at encouraging businesses to provide work-based learning (WBL) experiences to students enrolled in CTE programs. To qualify for these incentives, businesses would need to ensure that their WBL experiences align with state law, the state’s WBL framework, and labor laws pertaining to minors. Given that Ohio has been focused on improving and expanding WBL efforts in recent years, this incentive program could be a helpful addition to the state’s CTE landscape.
But SB 166 also makes a couple of additional changes related to driver’s licenses (yes, driver’s licenses). First, it permits students who complete a driver education course at their school district to receive either half a unit of elective credit toward graduation or credit in the form of an approved industry-recognized credential (IRC). Second, it requires the state’s IRC committee to update the list of credentials and licenses that can be used to earn a high school diploma to include a driver’s license obtained through one of these courses.
In his sponsor testimony, Senator Reineke explained that these provisions were included to address a current problem in the WBL sphere. Specifically, he noted that businesses are “facing an issue where students engaging in current work-based learning programs are unable to actually get to work.” Problems like this aren’t new. Transportation has been a sticking point in education for decades, and thousands of students each year miss out on worthwhile opportunities simply because they lack transportation options. Incentivizing students to earn a driver’s license by making it an IRC and tying it to graduation could expand access to WBL opportunities.
But there are also some unintended consequences, as a driver’s license doesn’t fit the IRC mold. An industry-recognized credential is supposed to do exactly what its name suggests—indicate that the person who earned it has the knowledge and skills needed to be successful in a certain industry. A non-commercial driver’s license, however, isn’t tied to a specific industry. In fact, it isn’t specific to any of the thirteen career field categories listed by the Ohio Department of Education.
But don’t just take my word for it. In 2014, the State Board of Education adopted a framework for determining which IRCs should qualify students for high school graduation. The application that’s required as part of this process asks applicants to identify which career field the credential applies to; list three entry-level jobs that are associated with it and five job skills that students learn as a result of earning it; and provide the name, number of openings, and estimated yearly income of jobs tied to the credential. A generic driver’s license can’t answer any of these questions satisfactorily, which should raise some red flags about whether the state should label it as an IRC.
Of course, it’s true that there are plenty of jobs that require employees to know how to drive. But the full-time, well-paying jobs that fit this description require a Commercial Driver’s License, not a generic one. Jobs that require only a state-issued driver’s license and no additional training and skills are plentiful, especially thanks to the gig economy. But they are also famously unreliable, low-paying, and stressful. Industry-recognized credentials are supposed to put students on a pathway toward steady, well-paying jobs. A driver’s license alone can’t do that.
SB 166 is a standalone, well-intentioned effort to strengthen CTE in Ohio. Its provisions incentivizing businesses to establish WBL experiences are a step in the right direction, and allowing students to earn credit toward graduation by earning a driver’s license could help solve a persistent problem. But equating a driver’s license to an industry-recognized credential could have unintended consequences that diminish the value of credentials overall. Here’s hoping that as SB 166 is debated in committee, the bill language is clarified to address this issue.
Back in 2014, Ohio lawmakers overhauled the state’s dual-enrollment program that gives students opportunities to take advanced courses through two- or four-year colleges. Public, private, and homeschool students in grades 7–12 can participate in the program, now known as College Credit Plus (CCP). The courses may be delivered on-campus, online, or at a high school (taught either by college faculty or qualified teachers). Because students earn both high school and college credit for passing these courses, the program offers a head start in earning post-secondary credentials.
The revamped dual-enrollment program has become an increasingly popular option with students. In CCP’s inaugural year (2015–16), just shy of 55,000 students took at least one dual-enrollment course. In just four years, that number rose to almost 77,000 students. The most recent state data show that 25 percent of the class of 2019 received at least three credits through dual enrollment. State leaders have been largely supportive of CCP, touting its potential to make college more affordable by reducing the time needed to earn degrees. A couple of months ago, Lieutenant Governor Jon Husted highlighted CCP in an editorial that focused on state initiatives to upskill the workforce.
Dual enrollment is a valuable option for students and it supports Ohio’s efforts to ensure young people are ready for college and great careers. Yet at the same time, policymakers continue to wrestle with a few important policy details, three of which are reflected in the House-passed version of this year’s state budget bill (HB 110).
Eligibility standards: The original CCP legislation did not include specific eligibility guidelines. Any student could participate, regardless of their academic readiness, so long as a college or university admitted them. That changed in 2017 when the legislature added a provision that now requires students to either (1) achieve a remediation-free score on a college entrance exam or (2) come within one standard error (SE) of it plus have a 3.0 GPA or letter of recommendation.
Seemingly uncomfortable with these higher standards, lawmakers have proposed via HB 110 to give the chancellor of higher education authority to determine eligibility criteria. Should this pass the Senate, the question will then focus on what happens to the eligibility bar. No one knows for sure which way it’ll go, but in recent testimony current chancellor Randy Gardner suggested an easing of standards to broaden access to CCP, saying “We still have standards, standards are important, but we want to provide as much access as possible to allow as many students as possible to take advantage of this [CCP].”
If indeed this is the direction the state goes, it opens some new questions. While participation numbers might rise, will less prepared students excel in these courses? What types of supports might they need? Will colleges need to water down CCP courses or slow their pace? (Remember, there are already concerns about quality control in dual enrollment.) In the end, there may be benefits to opening CCP to students who aren’t truly college ready. But we should also recognize the challenges that may lie ahead to ensure that all participants benefit from the option.
Approved courses: Since its inception, CCP rules have prohibited students from taking remedial or religious courses. Nevertheless, concerns were almost immediately raised about some of the courses that students could take (think Zumba or Pilates). Wisely, policymakers clamped down and instituted some new rules governing approved courses. Despite that move, controversy erupted earlier this year regarding a course that is alleged to have included graphic material. In response, legislators included provisions in HB 110 addressing this concern by requiring, among a few other things, students and parents to sign a permission slip, and colleges and state agencies to include disclaimers that course material may include “mature subject matter.” While the bill doesn’t include new course prohibitions, this episode illustrates the possibility of additional scrutiny about the types of courses that middle and high school students take. Will legislators seek more aggressive actions the next time controversy arises over course content? Will questionable courses dampen political support for the program? All things to keep an eye on.
Program effectiveness: Although CCP is still relatively new, there have been questions about the cost-effectiveness of the program. Back in 2018, lawmakers passed provisions that required state education agencies to look into this issue, and they have called for another report in HB 110. It’s hard to know what’s driving the most recent request. Legislators may simply want updated information. Or perhaps they didn’t find all the answers they were looking for in the prior report, which was a somewhat superficial review yet concluded that the program “has been a transformative addition to the high school experience and a game-changer for Ohio students.”
It’s definitely appropriate to continue asking questions about whether CCP—or any government program—is fully meeting its goals. While it may have been too early for state agencies to do a rigorous evaluation of CCP for the previous study, the time is ripe for a more comprehensive analysis of program effectiveness. Does CCP boost participants’ knowledge and skills relative to similar students who do not participate? How does it compare to the Advanced Placement or International Baccalaureate programs? Are all types of delivery models equally effective (e.g., on campus versus high school versus online)? How many students earn enough credit through CCP to skip a semester or two of college, where the real savings happen? A rigorous study that examines questions such as these—whether undertaken by state agencies or independent researchers—would help inform policymakers as they continue to work to strengthen pathways to college and career.
* * *
Dual enrollment holds tremendous potential to challenge students academically, to allow young people to take courses that aren’t offered by local schools, and to give students an inside track to college degrees. In terms of program numbers, CCP has gotten off to a great start. But to maximize the potential of this option, state leaders will need to continue steering the program in the right direction.
As post-pandemic life cautiously starts to take shape here in America, uncertainty abounds. Will our systems and processes and activities eagerly snap back to their 2019 forms? Or will our lives in 2021 and beyond take on new contours influenced by what we have learned, for good and ill, during the challenges forced upon us by 2020? A new report from the Afterschool Alliance that addresses students’ summer experiences might provide a clue.
“America After 3pm” is a periodic, nationally representative survey of randomly selected parents of school-aged children. Pre-pandemic, this survey was conducted with nearly 30,000 households between January and mid-March 2020, providing baseline data on students’ summer experiences in 2019, as it had done in years past. These experiences are broadly defined to include camps, classes, vacations, work, and more.
Five additional data sources (two smaller parental surveys and three questionnaires of summer and afterschool program providers across the country) were collected between August 2020 and mid-March 2021.
In 2019, 47 percent of families surveyed reported that at least one of their children participated in a summer program, continuing the upward trend seen in 2008 (25 percent) and 2013 (33 percent). Most common was a non-STEM specialty program (such as visual arts, sports, or drama), followed by voluntary summer programming run by schools or districts. STEM programs, jobs and internships, and mandatory or voluntary summer school brought up the rear. Overall, 95 percent of parents reported satisfaction with their child’s summer experience. While less than half of parents in 2019 said they preferred summer experiences to be “different” than the school year, three in four of those surveyed said that it was important that those experiences “helped keep their child from losing academic ground.” More physical activity, life skills, and access to the outdoors led the list of differences preferred, but differences varied between low-income and higher-income parents and between different racial and ethnic groups.
Forty-two percent of higher-income families reported that their children did not participate in summer programming due to “other family activities,” such as in-home supervision or vacation travel, while 35 percent of low-income families reported the same. But parents also reported that nearly one in three children not in a structured program during the 2019 summer would have been enrolled if one were available to them. “Availability” is not strictly defined, and that does raise some questions. It would be interesting to know, for example, how many parents forwent an available but undesirable opportunity (say, a baseball bootcamp) because a truly desired opportunity (say, a poetry camp) was not on offer near them. Using this broad definition, the report determines that up to 13.9 million children nationwide were unable to access a desired opportunity. Expense was the most commonly cited barrier to participation, with the average cost of activities estimated at $758 to $900—or approximately $200 per week—based on type and length of program. Transportation to and from programs and simple lack of awareness of available opportunities were also cited as barriers, but future surveys would do well to define “availability” more precisely.
During the pandemic summer of 2020, participation in summer activities predictably declined. Thirty-four percent of families surveyed reported that at least one of their children participated in a structured summer experience. Thirty-seven percent reported that programming was fully virtual, 36 percent reported that it was fully in-person, and 26 percent reported a hybrid model. Participation in in-person programs was affected by social distancing requirements; 40 percent of in-person program providers reported running a waitlist in 2020. Satisfaction dipped a little but remained high, with nine in ten parents reporting that they were happy with their child’s summer program overall. Interestingly, the average cost for summer programming in 2020 was reported at about $120 per week, down more than $80 from the previous year. This was likely due to a combination of the reduced cost of conducting virtual versus in-person programs and the availability of Covid relief funding to assist schools, organizations, and families. Forty-eight percent of parents reported that their child’s 2020 summer experience came at no cost to them. Schools provided about 30 percent of summer programming, community-based organizations provided about 23 percent, and cities and towns provided 21 percent. Unmet demand for summer programs remained high once again, with 57 percent of families who reported not having a child in a summer program saying they would have liked to have enrolled their child in one if they could have, although the caveats around the definition of “availability” remain.
What might all this mean for the summer that is now upon us? That’s far from clear, but what we can see so far is troubling. Seventy-nine percent of program providers surveyed in February and March 2021 said they would be providing structured activities this summer, but 36 percent of those said that they are most concerned about their ability to meet the demand from families. This despite the fact that the American Rescue Plan provides more than $1 billion for summer enrichment activities and allows state and local education agencies to target billions of additional dollars to summer learning programs to help students recover from the pandemic. Not to mention the expanded and refundable child tax credit, which parents could choose to use on enrichment activities like summer camp. School districts are gearing up, but even those with robust programs are shying away from calling it “summer school” and from mandating attendance for even those students with the clearest needs. Despite the fact that many of the barriers to access reported in previous years—cost, transportation, awareness—might be swept away in the tsunami of money available, and despite the need for more and better programming following more than a year of disrupted learning, the survey data seem to point to more of the same: sizeable numbers of students missing out on summer opportunities.
Even though the desire to return to pre-pandemic norms may be strong, we must be clear-eyed about whether those norms were good enough to begin with. The lessons of 2020, despite its obstacles, could help inform improvement so that a better version of our systems and activities can go forward. How this summer’s programming for students fares could be a harbinger of what more is to come. Here’s hoping for detailed data and robust analysis thereof.
SOURCE: Afterschool Alliance, “Time for a Game-Changing Summer, With Opportunity and Growth for All of America’s Youth” (May 2021).
NOTE: On June 3, 2021, the Ohio Senate’s Finance Committee heard testimony on House Bill 110, the state budget bill. Fordham’s Vice President for Ohio Policy presented proponent testimony on a number of education provisions in the bill. These are his written remarks.
My name is Chad Aldis, and I am the Vice President for Ohio Policy at the Thomas B. Fordham Institute. The Fordham Institute is an education-focused nonprofit that conducts research, analysis, and policy advocacy with offices in Columbus, Dayton, and Washington, D.C. Our Dayton office, through the affiliated Thomas B. Fordham Foundation, is also a charter school sponsor.
As we all know, a strong K-12 education system lays the foundation for the lifelong success of Ohio’s 1.8 million students. Recognizing the critical role of education, legislators for decades have worked hard to create policies that unlock great opportunities for all Ohio students and ensure that students are well-prepared for life after high school.
In their versions of the budget, Governor DeWine and House legislators continued the state’s long tradition of prioritizing K-12 education. We commend them for their work, and we’re encouraged that the Senate has built on several of their proposals. Specifically, we’re pleased with the following provisions:
In addition to these promising moves, we also applaud the Senate for championing policy changes that will empower Ohio families with quality educational options. We strongly support the following provisions in the substitute bill:
In terms of the much-discussed overall funding formula, we are encouraged to see the Senate put forward a proposal that, much like the House plan, would transition Ohio back towards a formula-based system after it was suspended for the 2020 and 2021 fiscal years. Broadly speaking, we wish to highlight two strengths of the Senate’s approach to school funding:
Last, we wish to flag a few issues in the substitute bill that require attention in the coming weeks. They are as follows:
All Ohio students deserve a great education that puts them on a solid pathway to rewarding careers and lifelong success. The Senate has crafted an education budget that will help to put Ohio on the path to a world-class education system.
Thank you for the opportunity to testify.