I believe it's spelled Colbourn.
Greg has bought out two hotels in Blackpool, Lancashire for philanthropic use - they are called CEEALAR and Pause House (https://forum.effectivealtruism.org/posts/fhiXNB4RS3h5BpQLF/pause-house-blackpool), and they're dedicated for use by people trying to stop existential risks.
He's definitely someone who puts his money where his mouth is, and I'm glad to see he's made this bet!
How does someone having bought an asset that can be sold later at a likely profit demonstrate that they think that the end of the world is coming? Mind you, I think he probably does think it, but this is not a demonstration of it.
Tell me you haven't been to Blackpool without saying you haven't been to Blackpool. :)
It doesn't show that he thinks the end of the world is coming per se, but it shows that he's willing to put money into something exceptionally unprofitable for a cause he believes in.
I’m with Bryan on this—if the world is ending, the least we can do is price it correctly.
These end-of-the-world bets always remind me of Balaji confidently lighting ~$1M on fire in public during his Bitcoin hyperinflation call. There’s something oddly comforting about watching very smart people discover that markets are less impressed by vibes than they expected.
At this point, Eliezer and Balaji feel like useful reminders of an alternate timeline where both ended up as mid-level paper pushers in some sprawling bureaucracy—arguing in memos no one reads—if not for the cosmic luck of being in Silicon Valley at exactly the right moment with exactly the right priors.
Instead, we get front-row seats to them making apocalyptic proclamations and occasionally converting large sums of money into… content.
Caplan’s framing is refreshingly grounded. If you really believe the end is near, you don’t write long blog posts—you lever up, spend aggressively, and stop worrying about 2030 repayment terms.
Until then, I’m happy to take the other side of the bet—and keep my apocalypse hedging strictly limited to snacks and a decent internet connection.
I don’t understand why they’re not betting you much more money than this. It seems like an easy way to increase your enjoyment of the end times! As you note, massive borrowing also makes sense.
Sure you do. None of these guys (subconsciously, I think) really buy what they’re selling. They don’t believe it will happen, the same way most other doomers of various stripes (climate, fall of the West, etc.) don’t really believe it will happen.
The only people really putting their money where their mouths are would be prepper types: guns, canned food, bunkers.
I do think the psychology of this is really interesting; there’s a lot of doublethink. Bryan’s bet only makes sense for him because he knows this. As noted in the other comment, if they really believed, it would be a bad bet for Bryan, because they would spend themselves into a hole and never be able to repay. Bryan knows they don’t really believe, but they want to be seen as believing, so they pay a nominal amount to keep their credibility in an arena that they know will eventually puncture that credibility entirely.
In the AI doomers' defense - and I'm not an AI doomer at all, just for the record - they don't believe simple tools like guns, canned food, and bunkers will be any bulwark against the AI. Now, they could team up with Musk and get the expedition to colonize Mars going; they might think they'd be safe there. That would be a strongly revealed preference.
You are right but I mean that as an analogy. I think the comparable money-where-mouth-is actions for AI doomers would be terrorism (cyberwarfare or actual bombs, attempt at EMP, whatever).
Yudkowsky has loosely gestured at this but backed off immediately. But if he were a true believer, if he thought the human race was under imminent threat of getting paperclipped, bombs would be the right move. I’m pretty sure they believe AGI to be a god, so escape to Mars would be pointless; it’ll simulate them in hell even if it doesn’t leave the planet, right?
Likewise for the guys who believe, I dunno, that the economy is going to collapse into anarchy because there’s no gold standard or whatever: buying canned corn and 5.56 in bulk is the obvious rational move, and if you don’t move to a remote farm but instead stay in your comfy suburb with your 401(k) contributions still going, you obviously do not really believe.
Bryan was the one to make the bet for a token amount, not Greg: https://x.com/gcolbourn/status/2036471089732211060
Greg very generously gave you 1:1 odds when you offered 10:1 odds and you still were only willing to bet $250? Why? https://x.com/gcolbourn/status/2036471089732211060
1) opportunity cost
2) uncertainty of pessimist’s ability to pay
This is a great bet for Bryan, but those two linked problems are the main issue from his end.
I'd bet more against Greg.
I understand the risks of AI, but betting we'll all be dead in 4 years is wild!
Stupid people making stupid bets. Get back to work, Caplan!
The optimist-prepays mechanism sounds good till you run into the consideration you bring up at the end, which is that a true pessimist has every incentive to blow through the prepayment, as well as any reserves they have to pay the bet. You might as well go to a Gamblers Anonymous meeting and bet degenerate sports bettors that they can’t go bust in the next five years. Good luck collecting when you “win”. All this makes me wonder: usually when you’re betting, I assume you’re trying to find people who sincerely believe the bet has a different expected yield than you do. When you bet an AI doomer, what you’re really betting is that their actual views differ from their publicly stated views. If their views are sincere, then they know full well they’ll never repay you, even if you end up being right about the thing you’re betting on.
If you want an actual bet with them, you should secure their obligation. The contract should be something like mortgaging their house with a high rate that’s due as a balloon in 10 years or whatever. You can record that against their property which would prevent them from selling or further encumbering their house in the intervening years.
They might use up all their money, but they won't use up all their human capital, which they can use to earn more money to pay the debt.
If they do what’s “rational” given their views, the pessimists will almost certainly be insolvent, meaning they’ll want to declare bankruptcy and discharge all that debt.
See my comment to Perry. I think you’re right, but the answer is that their views are obviously not fully rational: sincere, but not committed in the way a fully rational person commits to a belief they fully hold.
"I’ve never met an AI doomer who claimed to be massively borrowing in order to maximally enjoy their final years. But if doomers took their pessimistic view seriously, they obviously would so borrow."
This is a very odd framing. Why would it be obvious that they should so borrow?
If I was confident that the world would end in or before 2030, trying to figure out how to prevent that would be my top priority. Hedonic maximization wouldn't be particularly relevant. Borrowing heavily comes around the same time as heroin and suicide in terms of "nothing I do can plausibly have any positive impact on the world anymore", and that is not what most of the people you call doomers profess to think.
You already lost. SPY has gone from $250 in late 2017 to $700 today. At best, you get $500 in 2030.
You need to charge interest and adjust for inflation.
I would be tempted to take the other side of your bet just to invest the float.
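The opportunity-cost point can be put in numbers. Taking the commenter's figures at face value ($250 in SPY in late 2017, worth ~$700 today), and assuming "today" is roughly eight years on (my assumption, not stated in the thread), the forgone annualized return works out to:

```python
# Implied annualized return forgone by holding the bet stake instead of SPY.
# Prices are the commenter's; the ~8-year holding span is an assumption.
growth = 700 / 250          # $250 in late 2017 -> ~$700 today
years_held = 8              # assumed span, "late 2017 to today"

cagr = growth ** (1 / years_held) - 1
print(f"forgone return ≈ {cagr:.1%} per year")
```

Under those assumptions the forgone return is roughly 13–14% per year, compounding, versus a fixed $500 payout in 2030 — which is the commenter's point.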
Not quite. Bryan lost the first bet (in the sense of making a poor loan at much lower interest than he could have gotten elsewhere), but this second bet is essentially a mini-loan at ~21% interest. While Greg's investments may very well outperform this, it is objectively a high interest rate, and Greg undoubtedly can get lower-interest loans elsewhere.
Who is it that's born every minute? 🤔 You still have almost two million chances to make similar bets (there are approximately 1,944,625 minutes remaining until January 1, 2030).
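For what it's worth, the two figures in this subthread are mutually consistent. Using the quoted minutes-remaining count as the loan horizon, and treating the bet as $250 received now against $500 repaid on January 1, 2030 (amounts as discussed upthread), the implied annualized rate comes out close to the ~21% mentioned above:

```python
# Cross-check: derive the loan horizon from the thread's own
# "1,944,625 minutes until January 1, 2030" figure, then compute the
# annualized rate implied by doubling $250 into $500 over that span.
MINUTES_LEFT = 1_944_625
MINUTES_PER_YEAR = 60 * 24 * 365.25

years = MINUTES_LEFT / MINUTES_PER_YEAR           # ~3.7 years to maturity
annual_rate = (500 / 250) ** (1 / years) - 1      # rate that doubles money

print(f"{years:.2f} years to maturity")
print(f"implied annualized rate ≈ {annual_rate:.1%}")
```

This lands at roughly 21% per year, so the "mini-loan at ~21% interest" framing checks out against the quoted time horizon.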