The INTO THE IMPOSSIBLE Podcast #226 Is It Possible to Live on Mars or are we INSANE? (ft. Adam Becker)
Adam Becker 00:00:00 - 00:00:29
They don't see that as the problem. They think, oh, this is the only example of advanced life in the entire universe. And so therefore, it's really, really important that we get off of this planet in order to secure a future out in space. They say it's more important than anything else. It's more important than preserving the environment here. It's more important than democracy. It's more important than civil rights. They also overlook that space is a pretty bad place for humans to live.
Adam Becker 00:00:29 - 00:00:31
This is the best place we've got.
Brian Keating 00:00:31 - 00:01:04
We're going to talk a lot about what I call the eschatology, the end times fantasies of tech billionaires, some of whom have been on the show and I've talked to in the past, including people like Peter Thiel and Marc Andreessen. Let's open with something that's really been a hot topic lately, which is the potential for AI Boom or AI Doom. In this book, you come down relatively hard on the billionaires who are not only shaping our economy through their inventions and technology, but who also have a desire for control, manipulation, power, and perhaps even pose, as you call it, a threat to democracy.
Adam Becker 00:01:04 - 00:01:05
Yeah.
Brian Keating 00:01:05 - 00:01:16
What is going on with AI? How do you see it now? And what do you really see as the number one danger the audience would be, you know, keen to pay attention to, if not panic about? Let us know when we should start panicking.
Adam Becker 00:01:16 - 00:02:38
Sure, yeah. So, like you said, there are the AI boomers, a lot of these billionaires, Andreessen specifically is a great example, who think that really we just need to make AI go as fast as possible, scale up as big as possible as quickly as possible, and it will solve all of our problems for us. And then on the other side, you have these AI doomers, people like this guy Eliezer Yudkowsky, who I think we're probably going to talk about a bit more as we go on, and also the effective altruists, people from the same camp as, like, Sam Bankman-Fried, or if we want to pick a currently active tech billionaire, Dustin Moskovitz, who believe that AI in all likelihood is going to lead to, you know, total extinction of humanity and the end of the world. I think that the evidence for either of these claims is pretty bad because, in a lot of ways, they're two sides of the same coin. And the reason I say that, even though their claims sound as far apart as possible, is that they're both predicated on this idea that AI is going to inevitably and in pretty short order become incredibly powerful and superhuman, not just in its abilities but in its intelligence, and just be more powerful and more intelligent and more capable than all of the rest of humanity and human civilization combined.
Brian Keating 00:02:38 - 00:02:39
Super intelligence.
Adam Becker 00:02:39 - 00:03:19
Right, exactly. And then that will lead to it being able to do whatever it wants with like godlike powers of creation and destruction. I think the arguments in favor of that happening are pretty bad and the arguments against it are pretty good. So I think both the boomers and the doomers are sort of living in the same fantasy land and they're both wrong. And that the biggest dangers we have around AI are the sort of normal dangers that we have around technology. Right. That it'll be used to further concentrate wealth and power into the hands of a few without democratic accountability. That it will take existing problems in society and make them worse, existing biases and make them worse.
Adam Becker 00:03:19 - 00:03:24
And we're already seeing all of that happen. So I think it's just going to be more of the same. Faster and faster.
Brian Keating 00:03:24 - 00:03:43
Yeah, more of everything, but faster and faster. Guest Nick Bostrom has this famous thought experiment called the paperclip problem. You talk about it, and I was very, very impressed by the way this is described in the book. Because you are an astrophysicist. Your first book was not even about AI or astrophysics; it was about quantum mechanics. We'll get to that later. Stay tuned.
Brian Keating 00:03:43 - 00:04:12
Those are breadcrumbs for later. But tell us, what are the kind of delusional, or perhaps physics-uninformed, hot takes of these billionaires, ranging from Elon Musk, whom I spoke to very briefly on the podcast a year ago, to people like Nick Bostrom, who don't seem to believe in the so-called limits to growth and things that have been around for a very long time? So what are they missing that we should be getting, and should it lead us to be either less concerned about doom or more optimistic about boom?
