The most painful part of writing The Case Against Education was calculating the return to education. I spent fifteen months working on the spreadsheets. I came up with the baseline case, did scores of “variations on a theme,” noticed a small mistake or blind alley, then started over. Several programmer friends advised me to learn a new programming language like Python to do everything automatically, but I’m 98% sure that would have taken even longer – and introduced numerous additional errors into the results. I did plenty of programming in my youth, and I know my limitations.
I took quality control very seriously. About half a dozen friends gave up whole days of their lives to sit next to me while I gave them a guided tour of the reasoning behind my number-crunching. Four years before the book’s publication, I publicly released the spreadsheets, and asked the world to “embarrass me now” by finding errors in my work. If memory serves, one EconLog reader did find a minor mistake. When the book finally came out, I published final versions of all the spreadsheets underlying the book’s return to education calculations. A one-to-one correspondence between what’s in the book and what I shared with the world. Full transparency.
Now guess what? Since the 2018 publication of The Case Against Education, precisely zero people have emailed me about those spreadsheets. The book enjoyed massive media attention. My results were ultra-contrarian: my preferred estimate of the Social Return to Education is negative for almost every demographic. I loudly used these results to call for massive cuts in education spending. Yet since the book’s publication, no one has bothered to challenge my math. Not publicly. Not privately. No one cared about my spreadsheets.
The upshot is that I probably could have saved a year of my life. I could have glossed over dozens of thorny issues. Taxes. Transfers. The effect of education on longevity. The effect of education on quality of life. The effect of education on crime. How unpleasant school is compared to work. Instead of reading multiple literatures to extract plausible parameters, I could have just eyeballed a number and stipulated it for every tangential issue. Who would have called me on it?
Don’t get me wrong; The Case Against Education drew plenty of criticism. Almost none of it, however, was quantitative. Some critics appealed to common sense: “Education can’t be anywhere near as wasteful as Caplan claims.” Some critics called me a philistine: “Education isn’t about making money; it’s about becoming a whole person.” Never mind that I wrote a whole chapter against this misinterpretation. A few critics bizarrely claimed that one recent paper had refuted my entire enterprise. But as far as I recall, zero critics ever checked my math.
The most novel feature of my return to education calculations was that I tried to count everything that matters. I started with the countless papers that take the standard return estimates and tweak them with one novel complication apiece. Then I merged all the tweaks that seemed convincing to me to get final policy-relevant numbers. If you wanted to use everything researchers know to craft optimal policy, that is precisely what you would do.
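To make the structure concrete, here is a minimal sketch of that merge in Python. Every parameter value below is a made-up placeholder, not a number from the book, and treating each complication as a simple additive adjustment is a simplification of the book's full present-value calculations.

```python
# Sketch of "merging tweaks" onto a standard return estimate.
# All numbers are hypothetical placeholders, NOT the book's actual parameters.

# Start from a standard private return to a year of education (e.g. 10%).
baseline_return = 0.10

# Each "tweak" is one complication from the literature, expressed here as an
# additive adjustment in percentage points of annual return.
tweaks = {
    "taxes": -0.015,             # part of the earnings premium goes to taxes
    "transfers": -0.005,         # education shifts transfer receipts
    "longevity": +0.004,         # more-educated people live longer
    "crime": +0.006,             # education may reduce crime
    "school_disamenity": -0.020  # school is less pleasant than work
}

# Merge only the tweaks you find convincing into a policy-relevant number.
convincing = {"taxes", "transfers", "crime", "school_disamenity"}
social_return = baseline_return + sum(
    adj for name, adj in tweaks.items() if name in convincing
)

print(f"Adjusted social return: {social_return:.1%}")  # 6.6% in this toy case
```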
In the end, however, I discovered that the true intellectual problem was not lack of supply, but lack of demand. Education researchers don’t tweak standard return calculations to get the world closer to the truth. They tweak standard return calculations to get another publication – then move on with their lives. If the world handed out attention and tenure for synthesizing everything we know about the return to education, someone else would have done it long ago.
It’s hard to avoid a disheartening conclusion: Quantitative social science is barely relevant in the real world – and almost every social scientist covertly agrees. The complex math that researchers use is disposable. You deploy it to get a publication, then move on with your career. When it comes time to give policy advice, the math is AWOL. If you’re lucky, researchers default to common sense. Otherwise, they go with their ideology and status-quo bias, using the latest prestigious papers as fig leaves. Empirical social science teaches us far more about the world than pure theory. Yet in practice, even empirical researchers barely care what empirical social science really has to teach.
You'll understand as a writer: if you blame the audience when your heavy work is lightly received, you'll only continue to be lightly received (like a true academic).
At a glance:
- the files and tabs are too dispersed for easy comparison
- there is no reference table or comments explaining the headers
- constants are hard-coded inside the formulas (e.g. -9500 for taxable income); this doesn't just add work for the user, it means the model can't be updated as financial years change (see the sketch after this comment)
I commend your rigour; there's usually not much behind the curtain. But as you've found, few will bother to reverse-engineer a mammoth spreadsheet before they're engaged by the claims.
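To illustrate the commenter's point about hard-coded constants, here is a minimal sketch; the -9500 figure and the names are hypothetical. In spreadsheet terms, the fix corresponds to keeping each constant in a single named cell that every formula references, rather than typing the number into each formula.

```python
# The hard-coded-constant problem, using the commenter's example.
# All figures are hypothetical.

# Fragile: the deduction is buried inside the formula, so updating it for a
# new financial year means hunting through every formula that uses it.
def taxable_income_hardcoded(gross_income):
    return gross_income - 9500

# Robust: name the constant once, so a single edit updates the whole model.
STANDARD_DEDUCTION = 9500  # hypothetical figure for one financial year

def taxable_income(gross_income, deduction=STANDARD_DEDUCTION):
    return gross_income - deduction

print(taxable_income(50000))                   # uses the named default
print(taxable_income(50000, deduction=12000))  # easy to rerun for another year
```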
Idea: you worked with someone to turn your writing into digestible comic-book form. Why not work with someone to turn your spreadsheet results into an infographic?
I mean, the point of those months was finding the errors so that readers wouldn't, right? And there are returns to being the sort of person who does the math before staking too much on a position.