The Ethics of Giving in the Very Long Term
Glenn weighs in on effective altruism and longtermism
Today we’ve got the second installment of our new feature in which the TGS team presents me with a story, idea, video, or topic and documents my spontaneous reaction to it. In this episode, Mark Sussman, the editor of this newsletter, brings to my attention a Twitter thread by Émile P. Torres about “effective altruism” and its controversial offshoot, “longtermism.” Mark presents a summary of both of these ideas, but you may have read about them in the news over the past week or so. (A good starting point is Gideon Lewis-Kraus’s profile of effective altruism’s most influential proponent, the philosopher William MacAskill.) That’s because one of the major figures in longtermist circles, the (perhaps former) cryptocurrency billionaire Sam Bankman-Fried, has been at the center of a scandal that resulted in the bankruptcy of his cryptocurrency exchange, FTX.
We recorded this episode weeks before the implosion of FTX, but Mark refers to Bankman-Fried’s attempt to parlay his wealth into political influence, and both he and I evince skepticism about the prospects of Bankman-Fried maintaining his purported ethical position while dealing in vast sums of money. That’s not all we talk about, though. Neither Mark nor Nikita nor I is an expert on the topic, and you’ll hear us thinking through some of the possible implications of effective altruism and longtermism in real time. What, if anything, do we owe humans living in the distant future? Is effective altruism just a way for Silicon Valley elites to salve their guilty consciences? Can “presentism” get you canceled?
This post was previously available only to subscribers, but I’ve taken it out from behind the paywall. However, I encourage everyone to subscribe. Besides features like this one, subscriber benefits include early access to the ad-free version of the weekly podcast episode, the opportunity to ask questions for the exclusive monthly Q&A episode John McWhorter and I record, ticket pre-sales for live events, and full access to the archives and all new content.
The work we do here at the newsletter and podcast would be impossible without your generous support, so many thanks to those of you who currently subscribe. If you’re not yet a subscriber, please consider becoming one.
I wasn't quite sure what to make of this, though it was not without interest. Whether the difference between Scenarios A and B is greater than that between B and C doesn't seem to me like it matters much. In fact, I was reminded of Rousseau's "he who suffers least is happiest, but ever more sorrow than joy, that is the lot of us all," which makes a strong case for Scenario C. And would not animals living and dying in factory farms be better off if their ancestors had perished with the dodo birds?
I liked Dr. Loury's comment about people in 1850 and whether people in 2022 mattered to them, or should have mattered to them. Lately I've been thinking about this but in the opposite direction. How much of the present should be devoted to people living in 2150 or 2300? Is there some magic future day when humanity needs to experience peak happiness and well-being? May 12, 2273? Why not Nov 21, 2022? Rather than sacrifice now for the future, why not let the future sacrifice for now? That's not a suggestion from me -- my time on the planet is of no importance -- it's a philosophical question.
Honestly, these days I'm busy thinking about sacrifices some citizens are being asked to make in 2022 for the good of 2032 (to pick a random close year) when I don't think they will help 2032 at all. In fact, they will likely make 2032 worse. (Since we are on a more theoretical plane, it would be inappropriate for me to mention progressive DAs and their trendy "less is more" ideology.)
I am a professional forecaster and econometrician. One thing I know for sure is you cannot predict more than a few years out at best (think of how Covid changed everything). Sacrificing the current for the future is a fool's game. We don't know what the future holds.