Teachers, the Common Core, and the freedom to teach
Common Core standards mean freedom to many teachers. Here's why.
Common Core watchers out there have probably heard this one before: “All the teachers I know hate the Common Core.”
There are undoubtedly some teachers who dislike the Common Core, but recent polls suggest that most teachers support the new standards. During my three years of teaching (which I completed a month ago), most of my colleagues and I liked the Common Core. One reason we supported the new standards was that they gave us more freedom. Detractors claim that standards tell teachers how to teach. But I taught the Common Core after teaching Tennessee’s previous state standards, and while the Common Core did set expectations for what my students should know and be able to do by the end of the year (just as the previous standards did), it allowed me to decide what and how to teach.
Let’s consider, for example, the first literature standard for ninth graders (the grade I taught), which states, “Cite strong and thorough textual evidence to support analysis of what the text says explicitly as well as inferences drawn from the text.” Most would agree that using evidence to support the analysis of a text is crucial. Students ought to know how to cite evidence instead of simply writing about their opinions and feelings.
That’s all the standard says, though. Nothing more, nothing less.
The standard didn’t tell me when in the year to teach the skill. I could spend as much or as little time on it as I wanted, making that determination based on my knowledge of what my students needed. The standard didn’t tell me how to assess whether students had mastered the concept. I could require students to demonstrate mastery in writing, in oral presentations, on multiple-choice tests, or through project-based learning. Nor did the standard restrict me to teaching it with fictional texts. In fact, the same standard showed up in the Informational Text, Writing, and Speaking and Listening sections of the English language arts standards. Since my students had to understand how to cite evidence in various ways (verbally, in writing, and so on), the Common Core allowed me to teach it in various ways.
Furthermore, although the Common Core recommended age-appropriate texts I could use, it said nothing about texts that I was required to use. I could use Romeo and Juliet, “The Cask of Amontillado,” “The Love Song of J. Alfred Prufrock,” and “Letter from Birmingham Jail”—and I did. I could teach as broadly, narrowly, or deeply as I deemed necessary for my students. The possibilities were endless.
And that’s all for just one standard.
If teachers are feeling constrained by the Common Core, the fault surely lies at the feet of those choosing curricula and materials. If a district purchases textbooks that limit teachers to using the standards in prescribed ways, those restrictions could easily hamstring teachers, especially those who need maximum flexibility to properly engage and instruct diverse learners in a classroom. But the standards are not causing the limitation.
Districts would do well to consider training teachers to unpack the Common Core standards and then letting those same teachers write the curriculum. Who better to interpret what students need than the people who teach them and are held accountable for their achievement? This approach puts control where it should be: in the hands of teachers. In fact, a recent Fordham publication reported that districts using homegrown materials enjoy more buy-in and ownership from teachers, and ownership is vital for effective implementation.
Don’t get me wrong: writing curriculum instead of using something off the shelf (likely with a big “Common Core aligned” sticker) is a lot of work. Believe me, I’ve done it. But relying on curriculum written by someone else threatens the very classroom freedoms that the Common Core provides. Premade materials tell teachers which texts to teach and in what order, and they even come with ready-made assessments. That might make the job easier, but it requires sacrificing control and accepting that whoever wrote the curriculum didn’t write it specifically for your students.
The Common Core emphasizes what kids should learn (that’s the standards) and empowers teachers to focus on the “how.” School districts would be wise to properly train and trust their staff, those best suited to make judgments about how to implement the new standards.
NOTE: The Thomas B. Fordham Institute occasionally publishes guest commentaries on its blogs. The views expressed by guest authors do not necessarily reflect those of Fordham.
Earlier this year, two articles published in the Columbus Dispatch claimed that students using vouchers to attend private schools in Ohio perform worse than their peers attending public schools. The focus of the March 8 article and the subsequent March 16 editorial was on extending the third grade reading guarantee to students using vouchers (a measure eventually signed into law). In an effort to bolster this argument, the article referenced data suggesting that 36 percent of third-grade voucher students would be retained compared to only 34 percent of public school students. Other articles in the Cincinnati Enquirer and the Canton Repository made similar comparisons that negatively portrayed the performance of students using an EdChoice Scholarship. However, Test Comparison Summary data released this week by the Ohio Department of Education shows a very different picture of how voucher students are performing. The key is using the right comparison group.
The data used in the articles referenced above incorrectly lumped together the results of all public school students in the state, including those at many affluent public schools, and then compared those results with the results of voucher students. However, these scholarships are not available to all students. Students are eligible for a traditional EdChoice Scholarship only if they attended, or otherwise would be assigned to, a “low-performing” public school. Many such schools are located in Ohio’s less-affluent urban areas. Accordingly, the most accurate comparison is between the test results of students receiving EdChoice vouchers and the results of students attending the public schools to which the scholarship students would otherwise be assigned.
In contrast to the widely circulated newspaper reports, data published by the Ohio Department of Education shows that students receiving vouchers performed considerably better on the reading assessment than comparable district students. Consider Columbus. In 2013, voucher students in grades three to eight outperformed their district peers in math and reading at every grade level on the OAAs and OGTs (see table below). For instance, on the OGT, 90 percent of voucher students were proficient in reading compared to 68 percent of public school students. On the math portion of the OGT, 83 percent of voucher students were proficient compared to 55 percent of their public school peers. Similarly impressive scores, with some exceptions at certain grade levels, were posted in Cincinnati, Cleveland, Dayton, and Toledo.
Source: Ohio Department of Education
These statistics should not surprise anyone. A study by Greg Forster of the Friedman Foundation for Educational Choice found that eleven of the twelve gold-standard studies of the academic impact of vouchers found positive gains. (The twelfth found no net impact either way.)
These new data are not a perfect picture of academic performance or the “effectiveness” of Ohio’s private schools that take voucher students. (In-depth research using student-level data over time would be valuable.) They do not account for students’ performance (which is often below grade level) before they transferred to private schools, nor do they track individual students as they progress over time. What they do, however, is give us a more accurate snapshot of current academic performance relative to the district schools the students might have attended, rather than comparing voucher students to all students statewide.
The department’s recent release of city-level data is the best currently available data on the academic performance of EdChoice students. It is a far better reference point than comparing voucher students to affluent district schools to which they have no access. How is it fair, for instance, to compare an inner-city Columbus voucher student to an upper-middle-class student in Worthington? It’s like comparing a Pell Grant student at Columbus State to an affluent student attending Harvard. Context matters.
While parents choose schools for their children based on many different factors, academic results are an important gauge of whether students are benefiting from their schools of choice. Reasonable comparisons are essential, too. The state’s data suggest that Ohio’s main voucher program is helping disadvantaged students succeed academically. However, true success comes only when we focus on what works best for the unique needs of every child and family.
Yitz Frank is Ohio Director of Agudath Israel of America.
The Friedman Foundation for Educational Choice recently released results from its latest public-opinion survey. The national survey of 1,007 adults examined their views concerning the state of American education, with a particular focus on school choice, the Common Core, and standardized testing. The survey shows that most Americans—58 percent of those surveyed—tend to think that K–12 education has “gotten off on the wrong track.” Interestingly, respondents who are white, higher-income, rural, and older tended to express the least satisfaction with K–12 education. High percentages of respondents support various school-choice reforms. Big takeaways include the following: Charter schools and vouchers are supported broadly across racial, income, and political-party segments. Overall, 61 percent say they favor charter schools, while only 26 percent say they oppose them. Similarly, 63 percent say they support school vouchers, with only 33 percent opposing them. When it comes to accountability for test results, 62 percent of those surveyed say that teachers should be held accountable, but fewer thought principals should be (50 percent), and just 40 percent thought state officials should be. Finally, half of the respondents expressed support for the Common Core. What the public thinks matters—and in this new survey, the results pose an interesting (if unintended) question: If choice programs have so much public support, why are they so politically controversial?
Source: Paul DiPerna, 2014 Schooling In America Survey (Indianapolis, IN: The Friedman Foundation for Educational Choice, June 2014).
The Education Tax Policy Institute in Columbus released a new report that says the tax burden in Ohio has shifted significantly since the early 1990s, from businesses onto farmers and homeowners, to the detriment of school districts and local governments. Much hay is being made over this report by the usual suspects, including the alphabet soup of education groups (BASA, OASBO, and OSBA) that commissioned it, and it has garnered a fair amount of media coverage.
While this report is interesting and describes changes to the state’s property-tax policy over the years, it doesn’t offer much in the way of takeaways. The shift in the property-tax burden over time is likely born of necessity, as Ohio works to ensure that its business-tax structure is competitive with that of other states. The implication, though, is that the shift has somehow harmed education funding. Fortunately, this doesn’t seem to be true, as school expenditures have increased 22 percent (in real dollars) since 2000.
Teach For America (TFA) is one of the nation’s largest alternative routes into the teaching profession. In the 2013–14 school year, 11,000 corps members reached more than 750,000 students in high-need classrooms around the country, including nearly 150 TFA members in the Cleveland and Cincinnati-Dayton areas. Yet even with TFA’s growing scale, its teachers are a proverbial drop in the bucket compared to the country’s teaching force of approximately 3 million. This raises the question of how best to allocate these young, enthusiastic teachers. Should corps members be dispersed widely across a district’s schools, or should they be “clustered” in targeted schools? Would having a high density of TFA members in a few high-need schools provide learning benefits even for students with non-TFA teachers (“spillover” effects)? This new study analyzes the impact of clustering TFA members in Miami-Dade County Public Schools (M-DCPS), using district-level data from 2008–09 to 2012–13. TFA altered its placement strategy in M-DCPS in 2009–10 and began to cluster members in a smaller number of turnaround schools. For example, among middle schools with a TFA member, TFA members made up an average of 18 percent of the teaching staff in 2012–13, compared to just 4 percent in 2008–09. The researchers, however, found that the higher density of TFA members in the targeted schools yielded no significant “spillover” benefits—as measured by test-score gains—for students with non-TFA teachers. That said, the study replicates the finding that TFA teachers, in math at least, contribute significant value-added gains relative to the average schoolteacher: the analysts estimate that Miami’s TFA members contributed an impressive three additional months of learning in math (the gains in reading were statistically insignificant). It appears, then, that the effectiveness of TFA members doesn’t necessarily rub off on other teachers—even when many members work in a single school. Nonetheless, with or without “spillover” effects, this study makes it increasingly clear that TFA members are doing a heck of a job teaching math.
Source: Michael Hansen, Ben Backes, Victoria Brady, and Zeyu Xu, Examining Spillover Effects from Teach For America Corps Members in Miami-Dade County Public Schools (Washington, D.C.: National Center for Analysis of Longitudinal Data in Education Research, June 2014).
In recent years, Ohio’s businesses have lamented the challenge of hiring highly skilled employees. Surprisingly, this has occurred even as 7 percent of able-bodied Ohioans have been unemployed. Some have argued that the crux of the problem is a mismatch between the needs of employers and the skills of job-seeking workers. A new study from Jonathan Rothwell of the Brookings Institution sheds light on the difficulty that employers face when hiring for jobs that require skills in science, technology, engineering, and math (STEM) fields. Using a database compiled by Burning Glass, a job-analytics company, Rothwell examines 1.1 million job postings from 52,000 companies during the first quarter of 2013. The study approximates the relative demand for STEM versus non-STEM jobs by comparing how long job vacancies remain posted; a job posted for an extended period of time is considered hard to fill (i.e., “in demand”).[1] As expected, Rothwell finds that STEM-related jobs were posted for longer periods than non-STEM jobs: STEM jobs were advertised for an average of thirty-nine days, compared to thirty-three days for non-STEM jobs. The longer posting periods for STEM jobs were consistent across all education levels—from STEM jobs that required at least a graduate degree to “blue-collar” STEM jobs that required less than a college degree. For Ohioans, the study also includes a useful interactive webpage that slices the data for the state’s six metropolitan areas (Akron, Cincinnati, Cleveland, Columbus, Dayton, and Toledo). The study provides hard evidence that workers with STEM skills have a relative advantage over those without them. If one of the end goals of K–12 education is to prepare students for in-demand jobs, Ohio’s policies should focus on upgrading the quality of STEM education.
Source: Jonathan Rothwell, Still Searching: Job Vacancies and STEM Skills (Washington, D.C.: Brookings Institution, July 2014).
[1] The analyst notes that vacancy duration isn’t a perfect measure of demand, since lower-skilled job openings, like retail sales, are sometimes posted continuously. The inclusion of these “always open” non-STEM jobs suggests that the analysis actually understates the difficulty of hiring for STEM positions.