This is further evidence that he was thinking of this in terms of all-or-nothing coin flips. In case you haven't seen it:
COWEN: Okay, but let’s say there’s a game: 51 percent, you double the Earth out somewhere else; 49 percent, it all disappears. Would you play that game? And would you keep on playing that, double or nothing?
BANKMAN-FRIED: With one caveat. Let me give the caveat first, just to be a party pooper, which is, I’m assuming these are noninteracting universes. Is that right? Because to the extent they’re in the same universe, then maybe duplicating doesn’t actually double the value because maybe they would have colonized the other one anyway, eventually.
COWEN: But holding all that constant, you’re actually getting two Earths, but you’re risking a 49 percent chance of it all disappearing.
BANKMAN-FRIED: Again, I feel compelled to say caveats here, like, “How do you really know that’s what’s happening?” Blah, blah, blah, whatever. But that aside, take the pure hypothetical.
COWEN: Then you keep on playing the game. So, what’s the chance we’re left with anything? Don’t I just St. Petersburg paradox you into nonexistence?
BANKMAN-FRIED: Well, not necessarily. Maybe you St. Petersburg paradox into an enormously valuable existence. That’s the other option.
(See the first paragraph of https://www.lesswrong.com/posts/BZ6XaCwN4QGgH9CxF/the-kelly-... for the outline of an explanation of why his reasoning was not applicable to most real-life scenarios. This is pretty subtle; >95% of the explanations on the Internet I've seen over the past week as to why SBF is "obviously wrong" don't actually work.)
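Since the linked post is about the Kelly criterion, here is a minimal sketch of the standard Kelly calculation applied to the flip in the interview (the function name is my own, not from the post): for an even-money, double-or-nothing bet won with probability p, Kelly says to stake the fraction 2p − 1 of your bankroll.

```python
def kelly_fraction(p: float) -> float:
    """Optimal bankroll fraction for an even-money (double-or-nothing)
    bet won with probability p, per the Kelly criterion: f* = 2p - 1."""
    return 2 * p - 1

# For SBF's 51/49 flip, Kelly says to risk 2% of the bankroll, not 100%.
print(round(kelly_fraction(0.51), 4))  # 0.02
```

The subtlety the linked post gets at is that Kelly maximizes the growth rate of log-wealth, not expected wealth, so "Kelly says bet 2%" is not by itself a refutation of expected-value reasoning.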
Maybe I'm speaking too soon, but he does seem ready to accept that he lost this flip. It would obviously be insane to trust him with anything big at this point, but I'm optimistic that he won't fight to the bitter end.
Not quite. The question is defined in a way that the EV of each bet is positive, even though repeating it gives you an arbitrarily high chance of (moral) bankruptcy.
It's not, though; it's just guaranteed failure. The only stable state is losing everything, and you will reach that stable state in about 2 coin flips on average. If it's always good to flip, you will eventually (and quickly!) reach nothing.
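The "about 2 coin flips" figure is the mean of a geometric distribution: with a 49% chance of ruin per flip, the expected number of flips before losing everything is 1/0.49 ≈ 2.04. A quick simulation (parameters are my own for illustration) bears this out:

```python
import random

def flips_until_ruin(p_win: float = 0.51, cap: int = 10_000) -> int:
    """Bet everything on a p_win coin flip until the losing branch hits
    (or until cap flips, which is astronomically unlikely to be reached)."""
    flips = 0
    while flips < cap:
        flips += 1
        if random.random() >= p_win:  # the 49% branch: it all disappears
            return flips
    return cap

random.seed(0)
trials = [flips_until_ruin() for _ in range(100_000)]
print(sum(trials) / len(trials))  # close to 1/0.49 ≈ 2.04
```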
I'm actually somewhat impressed by someone who articulated a stupid logic for how to live life and then actually followed through on it.
SBF talks about money like a prop trader who thinks that blowing your account is just a thing that happens to everyone in their lifetime, crossed with an extreme Rationalist (of the self-described "Rationalist" community) who thinks everything can be reduced to choosing between Column A and Column B based on which column has more utils.
Quite a lot of Rationalists aren't Utilitarians, for various reasons.
Most of the ones I hang out with seem to agree with my characterisation of Utilitarianism as what happens when philosophers discover basic arithmetic and then just stop there. Most recently:
"""I remember in secondary school, with nobody to teach me more than basic trigonometry and algebra, I spent 6 months figuring out what I later learned were the two ways 3D rendering can be done: ray tracing, and turning points in 3D space into points in 2D (screen) space and drawing those as 2D primitives.
Utilitarianism feels like even less than that, to me. It's the foundation of algebra upon which more can be built, saying that utility is a thing that can be combined, but it doesn't really say how to combine utility, and all the weird things that happen in extreme hypotheticals are because it's naively summing the potential utility of unbounded agents."""
Is he wrong? I suspect that even if he does jail time for fraud etc. then he will come out of it famous, and with plenty of deep-pocketed investors lined up to ride another wave. What are the actual consequences to the big guys?
I feel like the next question is, "the tortoise lies there, its belly baking in the hot sun, beating its legs, trying to turn itself over, but it cannot do so without your help. You are not helping. Why?"
The whole point of the paradox is that if you keep playing, it's impossible to end with anything but nothing. Sam laid out no exit point, so it's not clear how he arrives at the conclusion that he can end up with an enormously valuable existence.
This is not true. The probability of ending up with nothing is one only in the limit. After any finite number of tries it is close to one, but not one. The EV keeps growing exponentially, as the rewards grow even faster than the probability of ruin.
It is extremely risky and ill-advised, but as SBF said, it’s possible to end up in a very (very!) good position. The crux is that infinity doesn’t exist. You will place the bet at most a finite number of times.
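This trade-off can be put in numbers. After n all-in 51/49 flips, the survival probability is 0.51^n and the surviving payoff is 2^n Earths, so the expected value is 1.02^n: it grows without bound even as survival becomes vanishingly unlikely.

```python
# Sketch of the point above: the EV of repeated all-in 51/49 flips keeps
# growing, but it is carried entirely by ever-less-likely surviving worlds.
for n in (1, 10, 50, 100):
    survive = 0.51 ** n        # probability any Earths remain after n flips
    ev = (2 * 0.51) ** n       # payoff 2**n times probability 0.51**n
    print(f"n={n:3d}  P(survive)={survive:.2e}  EV={ev:.2f}")
```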
https://conversationswithtyler.com/episodes/sam-bankman-frie...