So many of our debates about paying for higher education hinge on conflicting views of what’s the taxpayer’s responsibility and what’s the recipient’s (or their family’s). That’s true of President Biden’s much-contested loan-forgiveness plan, as well as any number of proposals to make community college free. That’s true of student-aid debates going back to Lyndon Johnson’s time (and arguably before). These days, it’s also true of preschooling, and it arises in different form when we fight over vouchers, tax credits, ESAs, and such. Is it society’s responsibility to pay for private schooling, or is it the family’s?
Education, according to every economist I’ve known or read, is both a public good and a private good. Which is to say, getting people educated is good for society, makes it more prosperous, cultured, secure, creative, perhaps even more humane. Keeping kids in school is good for society, too, as it keeps them out of trouble, advances the general welfare, and boosts the GDP by freeing more adults to get out of the house, earn money, pay taxes, and keep the wheels of industry and commerce turning.
But it’s a private good, too. Getting educated boosts one’s own life prospects, skills, job options, future earnings, and capacity to function as an informed citizen. For the most part, the more of it one gets, the more of those things it does, which is why high school graduates earn more than dropouts and college graduates earn more than those who never went. Get enough education and you may become a neurosurgeon or software engineer. But it’s not just a quantity thing. Graduating from a prestigious school or college, or winning an external accolade like a National Merit scholarship or a Rhodes Scholarship, is apt to do you even more private good.
If education is both a public good and a private good, how much of it should the public—i.e., the taxpayer, whether local, state, or national—pay for, and how much should be financed by the individuals who will benefit from it (or by their doting parents)? And does equity in this realm mean treating everyone the same, or working out sliding scales and income-contingent schemes that come closer to equalizing opportunity? Such questions carry huge moral and political implications, as well as budgetary ones.
Even before the mid-nineteenth-century invention of what we know as “public education,” many towns and local communities pitched in to cover the costs of educating at least some of the children who lived in them (most often white boys). But as the industrial revolution spread, states began to make some schooling universal and committed themselves to pay for it (up to a point). They also added clauses to their constitutions that obligated them to educate their residents—also up to a point. The point varied from state to state, as did the constitutional phrasing, but the language generally resembled Ohio’s clause, which commits the state to providing a “thorough and efficient system of common schools throughout the State.”
Alongside these self-imposed obligations to educate the public, states started to enact “compulsory attendance” laws, requiring children to attend those schools, again up to a point. Such mandates also varied by state, but every state had one on its books by the end of World War I, generally demanding attendance through the elementary grades.
By and large, society’s changing human-capital needs have driven the expansion of taxpayer-supported education. When most work was manual—pushing a plow, driving a harvester, screwing nuts onto bolts on an assembly line, taking dictation—there wasn’t much need for secondary schooling, which was then largely the province of prosperous elites headed for university. The wealthy have always been able to buy more—and higher-status—education for themselves and their children. That’s part of how they stay wealthy! As the economy modernized, however, more people needed more skills, and the tax-financed parts of education grew apace. In 1910, just 9 percent of Americans had a high school diploma; by 1940, the share had increased to 50 percent.
Yet even as it became open to all and free to the consumer, high schooling remained optional—and even today there’s no requirement that kids complete it. Compulsory attendance laws are written in terms of students’ age, not how much education they get under their belts. The ages varied by state—and still do. Here’s where things stood in 2009, according to the Education Commission of the States:
Minimum compulsory age:
- Age 5: 8 states and the District of Columbia, Puerto Rico, and the Virgin Islands
- Age 6: 24 states and American Samoa
- Age 7: 16 states
- Age 8: 2 states
Maximum compulsory age:
- Age 16: 23 states and the Virgin Islands
- Age 17: 8 states
- Age 18: 19 states and the District of Columbia, American Samoa, and Puerto Rico
At the lower end, observe that a third of the states (the sixteen that start at age seven plus the two that wait until eight) don’t even require six-year-olds to attend school, let alone five-year-olds. And most states (the thirty-one that stop at sixteen or seventeen) don’t require attendance through the conventional age of high school graduation.
Insofar as education is optional, however, how much of it is society’s responsibility to pay for? Compulsory attendance definitely implies access to free public schooling, though such laws can also be satisfied by attending private school or by homeschooling.[1] Yet arguments rage over how much additional education the taxpayer should pay for, whether universal or not, mandatory or not.
Taxpayer-funded pre-K is contentious, in part because some families don’t want it for their kids and many have made private arrangements for the version of it that they want (and can afford). Why create a windfall for those who don’t need or desire it? Nowhere is preschool compulsory.
Post-secondary education is where most of the fuss arises today. Though college has grown ever more expensive for decades, the “returns” on most forms of it remain substantial, and lots of politicians are urging that it be more heavily subsidized by taxpayers.
The United States has long had a mixed system when it comes to paying for higher education. Some of it is almost entirely private, but most takes place in “public” institutions. There, however, attendance is entirely voluntary and seldom actually free, save in the relative handful of states that cover tuition at their community colleges and (rarely) at four-year colleges for at least some of their residents.
The general pattern is that “public” colleges and universities are partially subsidized by their states (and sometimes local communities), but their students share in the cost of attendance. And even where tuition is fully subsidized, there are fees to be paid, as well as room and board, parking, books, and more.
Superimposed on all this are myriad “student financial aid” programs—grants, jobs, and increasingly loans—designed, as the phrase implies, to assist tuition-payers with the cost of obtaining higher education. Almost all such aid is calibrated to a student’s (or family’s) ability to pay. This “Robin Hood” form of higher-ed financing has been around a long time, but today it’s battered by several forces.
Demographic and economic changes have created big vacancies in lots of colleges, sharpening their appetite for more students. Contemporary equity concerns argue for redoubled efforts to open opportunity and remove obstacles for poor and minority students, yet states have been stingy with appropriations for higher education. Meanwhile, the “college for all” push has led many more young people than in earlier eras to embark on post-secondary schooling, often borrowing substantially to do so. The rising cost of higher education has led to more and more student debt, including on the part of millions who didn’t finish college and therefore didn’t reap the income boost that it usually provides. Hence the pleas for debt forgiveness, and the politicians’ incentive to respond to them.
The issues in K–12 schooling are parallel but different. In quantitative terms, every young American now has a “right” to free public schooling—there’s no tradition of cost-sharing with the consumer—but in most cases no tax-subsidized right to the version of it that they want or that may serve them best. So we argue, often bitterly, over just what is society’s responsibility and where to draw the line beyond which families must satisfy their “private” requirements and preferences with out-of-pocket investments of their own, whether that means moving their residence to a different place, getting up early to enter a school-assignment lottery, digging into their pockets for tuition, or educating their kids at home. Pushing to keep that line as tight as possible are not only budget considerations and the organized interests of traditional public schooling, but also the ideology of “common” schooling, something we don’t much contend with at the postsecondary level. (Look back at that phrase in the Ohio constitution.)
At the end of this meditation, you may well expect me to advance a grand synthesis, sweeping conclusion, or great compromise. I have none to offer. The fact is that, while Americans “believe” in education and in the right to obtain as much of it as one wants, we don’t really agree on who should pay for how much of it, particularly at the “optional” stages. Nor do we agree on whether society’s obligation is to furnish a single uniform version or to foster options, choices, and personal preferences. These disputes are destined to continue.
[1] It should be noted, too, that enforcement of those laws is spotty at best. The “truant officer” of yesteryear is a rare creature today, chronic absenteeism is widespread, and basically nobody gets punished for violating a compulsory attendance law.