Famed futurist Eliezer Yudkowsky fears the imminent end of the world at the hands of Unfriendly Artificial Intelligence. I find this worry fanciful. Many people in Eliezer’s position would just dismiss my head-in-the-sandedness, but he’s also genuinely impressed with my perfect public betting record. To bridge the gap and advance our knowledge, we’ve agreed to the following bet, written in the first person by Eliezer. Since I’ve just PayPaled him the money, the bet is now officially on.
Short bet:
– Bryan Caplan pays Eliezer $100 now, in exchange for $200 CPI-adjusted from Eliezer if the world has not been ended by nonaligned AI before 12:00am GMT on January 1st, 2030.
Details:
– $100 USD is due to Eliezer Yudkowsky before February 1st, 2017 for the bet to become effective.
– In the event the CPI is retired or modified or it’s gone totally bogus under the Trump administration, we’ll use a mutually agreeable inflation index or toss it to a mutually agreeable third party; the general notion is that you should be paid back twice what you bet now without anything making that amount ludicrously small or large.
– If there are still biological humans running around on the surface of the Earth, the world will not be said to have been ended.
– Any circumstance under which the vast bulk of humanity’s cosmic resource endowment is being diverted to events of little humane value due to AGI not under human control, and in which there are no longer biological humans running around on the surface of the Earth, shall be considered to count as the world being ended by nonaligned AGI.
– If there is any ambiguity over whether the world has been ended by nonaligned AGI, considerations relating to the disposition of the bulk of humanity’s potential astronomical resource endowment shall dominate a mutually agreeable third-party judgment, since the cosmic endowment is what I actually care about and its diversion is what I am attempting to avert using your bet-winning skills. Regardless, if there are still non-uploaded humans running around the Earth’s surface, you shall be said to have unambiguously won the bet (I think this is what you predict and care about).
– You win the bet if the world has been ended by AGI under specific human control, by some human who specifically wanted to end it in a specific way and successfully did so. You do not win if somebody who thought it was a great idea just built an AGI and turned it loose (this will not be deemed ‘aligned’, and would not surprise me).
– If it sure looks like we’re all still running around on the surface of the Earth and nothing AGI-ish is definitely known to have happened, the world shall be deemed non-ended for bet settlement purposes, irrespective of simulation arguments or the possibility of an AGI deceiving us in this regard.
– The bet is payable to whosoever has the most credible claim to being your heirs and assigns in the event that anything unfortunate should happen to you. Whosoever has primary claim to being my own heir shall inherit this responsibility from me if they have inherited more than $200 of value from me.
– Your announcement of the bet shall mention that Eliezer strongly prefers that the world not be destroyed and is trying to exploit Bryan’s amazing bet-winning abilities to this end. Aside from that, these details do not need to be publicly recounted in any particular regard, and just form part of the semiformal understanding between us (they may of course be recounted any time either of us wishes).
Notice: The bet is structured so that Eliezer still gets a marginal benefit ($100 now) even if he’s right about the end of the world. I, similarly, get a somewhat larger marginal benefit ($200 inflation-adjusted in 2030) if he’s wrong. In my mind, this is primarily a bet that annualized real interest rates stay below 5.5%. After all, at 5.5%, I could turn $100 today into $200 inflation-adjusted by 2030 without betting. I think it’s highly unlikely real rates will get that high, though I still think that’s vastly more likely than Eliezer’s doomsday scenario.
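(A quick sanity check on that 5.5% figure, assuming the bet runs roughly the 13 years from early 2017 to the start of 2030: doubling $100 in real terms over 13 years requires a compound annual real rate $r$ satisfying

\[
100 \times (1+r)^{13} = 200 \quad\Longrightarrow\quad r = 2^{1/13} - 1 \approx 0.0548 \approx 5.5\%,
\]

so any sustained real rate below about 5.5% means the $200 payoff beats simply investing the $100.)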
I would have been happy to bet Eliezer at the same odds for all-cause end-of-the-world. After all, if the world ends, I won’t be able to collect my winnings no matter what caused it. But a bet’s a bet!