Datasets:
Languages:
English
Multilinguality:
monolingual
Size Categories:
n<1K
Language Creators:
found
Source Datasets:
original
Tags:
karpathy,whisper,openai
WEBVTT | |
00:00.000 --> 00:02.760 | |
The following is a conversation with Sean Carroll. | |
00:02.760 --> 00:04.920 | |
He's a theoretical physicist at Caltech, | |
00:04.920 --> 00:08.800 | |
specializing in quantum mechanics, gravity, and cosmology. | |
00:08.800 --> 00:11.640 | |
He's the author of several popular books, | |
00:11.640 --> 00:15.360 | |
one on the Arrow of Time called From Eternity to Here, | |
00:15.360 --> 00:17.840 | |
one on the Higgs Boson called The Particle | |
00:17.840 --> 00:19.160 | |
at the End of the Universe, | |
00:19.160 --> 00:22.560 | |
and one on Science and Philosophy called The Big Picture, | |
00:22.560 --> 00:26.360 | |
on the origins of life, meaning, and the universe itself. | |
00:26.360 --> 00:28.720 | |
He has an upcoming book on quantum mechanics | |
00:28.720 --> 00:32.760 | |
that you can preorder now called Something Deeply Hidden. | |
00:32.760 --> 00:36.040 | |
He writes one of my favorite blogs on his website, | |
00:36.040 --> 00:37.960 | |
preposterousuniverse.com. | |
00:37.960 --> 00:40.440 | |
I recommend clicking on the greatest hits link | |
00:40.440 --> 00:44.400 | |
that lists accessible, interesting posts on the Arrow of Time, | |
00:44.400 --> 00:47.600 | |
dark matter, dark energy, the Big Bang, general relativity, | |
00:47.600 --> 00:49.560 | |
string theory, quantum mechanics, | |
00:49.560 --> 00:53.160 | |
and the big meta questions about the philosophy of science, | |
00:53.160 --> 00:57.600 | |
God, ethics, politics, academia, and much, much more. | |
00:57.600 --> 01:00.280 | |
Finally, and perhaps most famously, | |
01:00.280 --> 01:03.640 | |
he's the host of a podcast called Mindscape | |
01:03.640 --> 01:06.920 | |
that you should subscribe to and support on Patreon. | |
01:06.920 --> 01:08.800 | |
Along with the Joe Rogan Experience, | |
01:08.800 --> 01:10.480 | |
Sam Harris's Making Sense, | |
01:10.480 --> 01:13.080 | |
and Dan Carlin's Hardcore History. | |
01:13.080 --> 01:15.840 | |
Sean's Mindscape podcast is one of my favorite ways | |
01:15.840 --> 01:18.800 | |
to learn new ideas or explore different perspectives | |
01:18.800 --> 01:22.120 | |
and ideas that I thought I understood. | |
01:22.120 --> 01:25.880 | |
It was truly an honor to meet and spend a couple hours | |
01:25.880 --> 01:27.200 | |
with Sean. | |
01:27.200 --> 01:30.480 | |
It's a bit heartbreaking to say that for the first time ever, | |
01:30.480 --> 01:32.760 | |
the audio recorder for this podcast died | |
01:32.760 --> 01:34.880 | |
in the middle of our conversation. | |
01:34.880 --> 01:36.280 | |
There are technical reasons for this, | |
01:36.280 --> 01:38.360 | |
having to do with phantom power | |
01:38.360 --> 01:41.040 | |
that I now understand and will avoid. | |
01:41.040 --> 01:44.200 | |
It took me one hour to notice and fix the problem. | |
01:44.200 --> 01:48.280 | |
So, much like the universe's 68% dark energy, | |
01:48.280 --> 01:51.280 | |
roughly the same amount from this conversation was lost, | |
01:51.280 --> 01:54.160 | |
except in the memories of the two people involved | |
01:54.160 --> 01:56.280 | |
and in my notes. | |
01:56.280 --> 01:59.920 | |
I'm sure we'll talk again and continue this conversation | |
01:59.920 --> 02:02.440 | |
on this podcast or on Sean's. | |
02:02.440 --> 02:05.320 | |
And of course, I look forward to it. | |
02:05.320 --> 02:07.840 | |
This is the Artificial Intelligence podcast. | |
02:07.840 --> 02:09.960 | |
If you enjoy it, subscribe on YouTube, | |
02:09.960 --> 02:12.520 | |
iTunes, support on Patreon, | |
02:12.520 --> 02:16.680 | |
or simply connect with me on Twitter at Lex Fridman. | |
02:16.680 --> 02:21.360 | |
And now, here's my conversation with Sean Carroll. | |
02:21.360 --> 02:23.520 | |
What do you think is more interesting and impactful? | |
02:23.520 --> 02:25.840 | |
Understanding how the universe works | |
02:25.840 --> 02:26.880 | |
at a fundamental level | |
02:26.880 --> 02:29.200 | |
or understanding how the human mind works? | |
02:29.200 --> 02:32.440 | |
You know, of course this is a crazy meaningless | |
02:32.440 --> 02:33.960 | |
unanswerable question in some sense, | |
02:33.960 --> 02:35.160 | |
because they're both very interesting | |
02:35.160 --> 02:37.520 | |
and there's no absolute scale of interestingness | |
02:37.520 --> 02:39.160 | |
that we can rate them on. | |
02:39.160 --> 02:41.160 | |
There's a glib answer that says the human brain | |
02:41.160 --> 02:43.080 | |
is part of the universe, right? | |
02:43.080 --> 02:44.400 | |
And therefore, understanding the universe | |
02:44.400 --> 02:47.000 | |
is more fundamental than understanding the human brain. | |
02:47.000 --> 02:49.600 | |
But do you really believe that once we understand | |
02:49.600 --> 02:51.520 | |
the fundamental way the universe works | |
02:51.520 --> 02:52.680 | |
at the particle level, | |
02:52.680 --> 02:55.800 | |
the forces, we would be able to understand how the mind works? | |
02:55.800 --> 02:56.640 | |
No, certainly not. | |
02:56.640 --> 02:58.760 | |
We cannot understand how ice cream works | |
02:58.760 --> 03:01.040 | |
just from understanding how particles work, right? | |
03:01.040 --> 03:02.760 | |
So I'm a big believer in emergence. | |
03:02.760 --> 03:05.320 | |
I'm a big believer that there are different ways | |
03:05.320 --> 03:06.640 | |
of talking about the world | |
03:07.880 --> 03:11.200 | |
beyond just the most fundamental microscopic one. | |
03:11.200 --> 03:13.880 | |
You know, when we talk about tables and chairs | |
03:13.880 --> 03:15.120 | |
and planets and people, | |
03:15.120 --> 03:16.400 | |
we're not talking the language | |
03:16.400 --> 03:18.360 | |
of particle physics and cosmology. | |
03:18.360 --> 03:20.880 | |
So, but understanding the universe, | |
03:20.880 --> 03:24.040 | |
you didn't say just at the most fundamental level, right? | |
03:24.040 --> 03:28.200 | |
So understanding the universe at all levels is part of that. | |
03:28.200 --> 03:29.960 | |
I do think, you know, to be a little bit more fair | |
03:29.960 --> 03:33.960 | |
to the question, there probably are general principles | |
03:33.960 --> 03:38.520 | |
of complexity, biology, information processing, | |
03:38.520 --> 03:41.840 | |
memory, knowledge, creativity | |
03:41.840 --> 03:45.600 | |
that go beyond just the human brain, right? | |
03:45.600 --> 03:47.800 | |
And maybe one could count understanding those | |
03:47.800 --> 03:49.120 | |
as part of understanding the universe. | |
03:49.120 --> 03:53.040 | |
The human brain, as far as we know, is the most complex thing | |
03:53.040 --> 03:54.320 | |
in the universe. | |
03:54.320 --> 03:57.440 | |
So it's certainly absurd to think | |
03:57.440 --> 03:58.880 | |
that by understanding the fundamental laws | |
03:58.880 --> 04:00.400 | |
of particle physics, | |
04:00.400 --> 04:02.880 | |
you get any direct insight on how the brain works. | |
04:02.880 --> 04:04.360 | |
But then there's this step | |
04:04.360 --> 04:06.840 | |
from the fundamentals of particle physics | |
04:06.840 --> 04:08.680 | |
to information processing, | |
04:08.680 --> 04:10.840 | |
which a lot of physicists and philosophers | |
04:10.840 --> 04:12.520 | |
maybe a little bit carelessly take | |
04:12.520 --> 04:14.680 | |
when they talk about artificial intelligence. | |
04:14.680 --> 04:18.080 | |
Do you think of the universe | |
04:18.080 --> 04:21.360 | |
as a kind of a computational device? | |
04:21.360 --> 04:22.200 | |
No. | |
04:22.200 --> 04:24.200 | |
Like, the honest answer there is no. | |
04:24.200 --> 04:27.600 | |
There's a sense in which the universe processes information | |
04:27.600 --> 04:29.200 | |
clearly. | |
04:29.200 --> 04:32.720 | |
There's a sense in which the universe is like a computer, | |
04:32.720 --> 04:33.920 | |
clearly. | |
04:33.920 --> 04:36.560 | |
But in some sense, I think, | |
04:36.560 --> 04:38.560 | |
I tried to say this once on my blog | |
04:38.560 --> 04:39.400 | |
and no one agreed with me, | |
04:39.400 --> 04:42.400 | |
but the universe is more like a computation | |
04:42.400 --> 04:45.080 | |
than a computer because the universe happens once. | |
04:45.080 --> 04:46.960 | |
A computer is a general purpose machine, right? | |
04:46.960 --> 04:48.680 | |
You can ask it different questions, | |
04:48.680 --> 04:50.120 | |
even a pocket calculator, right? | |
04:50.120 --> 04:52.960 | |
And it's set up to answer certain kinds of questions. | |
04:52.960 --> 04:54.320 | |
The universe isn't that. | |
04:54.320 --> 04:57.360 | |
So information processing happens in the universe, | |
04:57.360 --> 04:59.120 | |
but it's not what the universe is. | |
04:59.120 --> 05:01.560 | |
And I know your MIT colleague, Seth Lloyd, | |
05:01.560 --> 05:03.840 | |
feels very differently about this, right? | |
05:03.840 --> 05:07.240 | |
Well, you're thinking of the universe as a closed system. | |
05:07.240 --> 05:08.080 | |
I am. | |
05:08.080 --> 05:11.760 | |
So what makes a computer more like a PC, | |
05:13.000 --> 05:14.560 | |
like a computing machine, | |
05:14.560 --> 05:17.720 | |
is that there's a human that comes up to it | |
05:17.720 --> 05:19.120 | |
and moves the mouse around, | |
05:19.120 --> 05:21.680 | |
and gives it input. | |
05:21.680 --> 05:26.320 | |
And that's why you're saying it's just a computation, | |
05:26.320 --> 05:29.280 | |
a deterministic thing that's just unrolling. | |
05:29.280 --> 05:32.240 | |
But the immense complexity of it | |
05:32.240 --> 05:34.440 | |
is nevertheless like processing. | |
05:34.440 --> 05:39.440 | |
There's a state and it changes with rules. | |
05:40.160 --> 05:41.680 | |
And there's a sense for a lot of people | |
05:41.680 --> 05:45.400 | |
that if the brain operates, the human brain operates | |
05:45.400 --> 05:46.520 | |
within that world, | |
05:46.520 --> 05:49.400 | |
then it's simply just a small subset of that. | |
05:49.400 --> 05:52.560 | |
And so there's no reason we can't build | |
05:52.560 --> 05:55.600 | |
arbitrarily great intelligences. | |
05:55.600 --> 05:56.440 | |
Yeah. | |
05:56.440 --> 05:58.720 | |
Do you think of intelligence in this way? | |
05:58.720 --> 05:59.640 | |
Intelligence is tricky. | |
05:59.640 --> 06:01.720 | |
I don't have a definition of it offhand. | |
06:01.720 --> 06:04.640 | |
So I remember this panel discussion | |
06:04.640 --> 06:06.240 | |
that I saw on YouTube, I wasn't there, | |
06:06.240 --> 06:07.720 | |
but Seth Lloyd was on the panel. | |
06:07.720 --> 06:10.560 | |
And so was Martin Rees, the famous astrophysicist. | |
06:10.560 --> 06:13.800 | |
And Seth gave his shtick for why the universe is a computer | |
06:13.800 --> 06:14.840 | |
and explained this. | |
06:14.840 --> 06:19.360 | |
And Martin Rees said, so what is not a computer? | |
06:19.360 --> 06:22.000 | |
And Seth is like, oh, that's a good question. | |
06:22.000 --> 06:22.840 | |
I'm not sure. | |
06:22.840 --> 06:24.960 | |
Because if you have a sufficiently broad definition | |
06:24.960 --> 06:28.360 | |
of what a computer is, then everything is, right? | |
06:28.360 --> 06:32.160 | |
And similarly, the analogy gains force | |
06:32.160 --> 06:34.560 | |
when it excludes some things. | |
06:34.560 --> 06:38.640 | |
Is the moon going around the earth performing a computation? | |
06:38.640 --> 06:41.320 | |
I can come up with definitions in which the answer is yes, | |
06:41.320 --> 06:43.840 | |
but it's not a very useful computation. | |
06:43.840 --> 06:46.120 | |
I think that it's absolutely helpful | |
06:46.120 --> 06:49.600 | |
to think about the universe in certain situations, | |
06:49.600 --> 06:53.080 | |
certain contexts, as an information processing device. | |
06:53.080 --> 06:54.840 | |
I'm even guilty of writing a paper | |
06:54.840 --> 06:56.960 | |
called Quantum Circuit Cosmology, where | |
06:56.960 --> 06:59.280 | |
we modeled the whole universe as a quantum circuit. | |
06:59.280 --> 07:00.320 | |
As a circuit. | |
07:00.320 --> 07:01.440 | |
As a circuit, yeah. | |
07:01.440 --> 07:02.840 | |
With qubits kind of thing. | |
07:02.840 --> 07:05.000 | |
With qubits, basically, right. | |
07:05.000 --> 07:07.400 | |
So the qubits become more and more entangled. | |
07:07.400 --> 07:09.640 | |
So do we want to digress a little bit? | |
07:09.640 --> 07:11.000 | |
Because this is kind of fun. | |
07:11.000 --> 07:13.680 | |
So here's a mystery about the universe | |
07:13.680 --> 07:16.840 | |
that is so deep and profound that nobody talks about it. | |
07:16.840 --> 07:19.040 | |
Space expands, right? | |
07:19.040 --> 07:21.880 | |
And we talk about, in a certain region of space, | |
07:21.880 --> 07:23.560 | |
a certain number of degrees of freedom, | |
07:23.560 --> 07:25.480 | |
a certain number of ways that the quantum fields | |
07:25.480 --> 07:28.800 | |
and the particles in that region can arrange themselves. | |
07:28.800 --> 07:32.200 | |
That number of degrees of freedom in a region of space | |
07:32.200 --> 07:33.800 | |
is arguably finite. | |
07:33.800 --> 07:36.640 | |
We actually don't know how many there are, | |
07:36.640 --> 07:39.440 | |
but there's a very good argument that says it's a finite number. | |
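The "very good argument" alluded to here is presumably a holographic-style entropy bound: the entropy in a region is capped by the area of its boundary in Planck units, so the effective number of quantum degrees of freedom is finite. As a sketch, with $A$ the boundary area and $\ell_p$ the Planck length:

$$
S_{\max} = \frac{A}{4\,\ell_p^2}, \qquad \dim \mathcal{H} \lesssim e^{S_{\max}} < \infty .
$$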
07:39.440 --> 07:44.920 | |
So as the universe expands and space gets bigger, | |
07:44.920 --> 07:46.560 | |
are there more degrees of freedom? | |
07:46.560 --> 07:48.520 | |
If it's an infinite number, it doesn't really matter. | |
07:48.520 --> 07:50.000 | |
Infinity times 2 is still infinity. | |
07:50.000 --> 07:53.160 | |
But if it's a finite number, then there's more space, | |
07:53.160 --> 07:54.480 | |
so there's more degrees of freedom. | |
07:54.480 --> 07:55.760 | |
So where did they come from? | |
07:55.760 --> 07:58.000 | |
That would mean the universe is not a closed system. | |
07:58.000 --> 08:01.520 | |
There's more degrees of freedom popping into existence. | |
08:01.520 --> 08:05.320 | |
So what we suggested was that there are more degrees of freedom. | |
08:05.320 --> 08:07.960 | |
And it's not that they're not there to start, | |
08:07.960 --> 08:10.880 | |
but they're not entangled to start. | |
08:10.880 --> 08:12.800 | |
So the universe that you and I know of, | |
08:12.800 --> 08:15.440 | |
the three dimensions around us that we see, | |
08:15.440 --> 08:18.080 | |
we said those are the entangled degrees of freedom | |
08:18.080 --> 08:19.640 | |
making up space time. | |
08:19.640 --> 08:22.640 | |
As the universe expands, there are a whole bunch of qubits | |
08:22.640 --> 08:26.840 | |
in their zero state that become entangled | |
08:26.840 --> 08:28.720 | |
with the rest of space time through the action | |
08:28.720 --> 08:31.200 | |
of these quantum circuits. | |
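As a minimal numerical sketch of that picture (an illustration under my own assumptions, not the model from the paper), here is one fresh qubit in its zero state becoming maximally entangled with an existing degree of freedom:

```python
# Toy illustration: a fresh qubit in |0> gets entangled with an existing
# degree of freedom (a stand-in for "the rest of spacetime") via a CNOT.
import numpy as np

ket0 = np.array([1.0, 0.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2)    # existing, already-active qubit

state = np.kron(plus, ket0)                 # joint state |+> (x) |0>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
state = CNOT @ state                        # Bell state (|00> + |11>)/sqrt(2)

# Entanglement entropy of the fresh qubit: trace out the other one.
rho = np.outer(state, state).reshape(2, 2, 2, 2)
rho_fresh = np.einsum('aiaj->ij', rho)      # partial trace over qubit 1
evals = np.linalg.eigvalsh(rho_fresh)
print(-sum(p * np.log2(p) for p in evals if p > 1e-12))  # 1.0 bit: maximal
```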
08:31.200 --> 08:37.080 | |
So what does it mean that there's now more degrees of freedom | |
08:37.080 --> 08:39.280 | |
as they become more entangled? | |
08:39.280 --> 08:40.280 | |
Yeah. | |
08:40.280 --> 08:41.640 | |
As the universe expands. | |
08:41.640 --> 08:41.960 | |
That's right. | |
08:41.960 --> 08:43.280 | |
So there's more and more degrees of freedom | |
08:43.280 --> 08:47.320 | |
that are entangled, that are playing the role of part | |
08:47.320 --> 08:49.600 | |
of the entangled space time structure. | |
08:49.600 --> 08:53.320 | |
So the underlying philosophy is that space time itself | |
08:53.320 --> 08:55.600 | |
arises from the entanglement of some fundamental quantum | |
08:55.600 --> 08:57.680 | |
degrees of freedom. | |
08:57.680 --> 08:58.280 | |
Wow. | |
08:58.280 --> 08:59.780 | |
OK. | |
08:59.780 --> 09:05.200 | |
At which point is most of the entanglement happening? | |
09:05.200 --> 09:07.400 | |
Are we talking about close to the Big Bang? | |
09:07.400 --> 09:11.840 | |
Are we talking about throughout the time of the life of the | |
09:11.840 --> 09:12.340 | |
universe? | |
09:12.340 --> 09:12.840 | |
Yeah. | |
09:12.840 --> 09:15.080 | |
So the idea is that at the Big Bang, | |
09:15.080 --> 09:17.760 | |
almost all the degrees of freedom that the universe could | |
09:17.760 --> 09:22.400 | |
have were there, but they were unentangled with anything else. | |
09:22.400 --> 09:23.840 | |
And that's a reflection of the fact | |
09:23.840 --> 09:25.560 | |
that the Big Bang had a low entropy. | |
09:25.560 --> 09:28.080 | |
It was a very simple, very small place. | |
09:28.080 --> 09:31.360 | |
And as space expands, more and more degrees of freedom | |
09:31.360 --> 09:34.240 | |
become entangled with the rest of the world. | |
09:34.240 --> 09:35.960 | |
Well, I have to ask, Sean Carroll, | |
09:35.960 --> 09:38.160 | |
what do you think of the thought experiment from Nick | |
09:38.160 --> 09:41.560 | |
Bostrom that we're living in a simulation? | |
09:41.560 --> 09:44.880 | |
So I think let me contextualize that a little bit more. | |
09:44.880 --> 09:48.320 | |
I think people don't actually take this thought experiment seriously. | |
09:48.320 --> 09:50.360 | |
I think it's quite interesting. | |
09:50.360 --> 09:52.880 | |
It's not very useful, but it's quite interesting. | |
09:52.880 --> 09:55.440 | |
From the perspective of AI, a lot of the learning | |
09:55.440 --> 09:59.280 | |
that can be done usually happens in simulation, | |
09:59.280 --> 10:01.440 | |
artificial examples. | |
10:01.440 --> 10:03.040 | |
And so it's a constructive question | |
10:03.040 --> 10:09.360 | |
to ask how difficult is our real world to simulate, | |
10:09.360 --> 10:12.400 | |
which is kind of a dual part of, if we're | |
10:12.400 --> 10:16.400 | |
living in a simulation and somebody built that simulation, | |
10:16.400 --> 10:18.840 | |
if you were to try to do it yourself, how hard would it be? | |
10:18.840 --> 10:21.080 | |
So obviously, we could be living in a simulation. | |
10:21.080 --> 10:22.960 | |
If you just want the physical possibility, | |
10:22.960 --> 10:25.360 | |
then I completely agree that it's physically possible. | |
10:25.360 --> 10:27.360 | |
I don't think that we actually are. | |
10:27.360 --> 10:31.880 | |
So take this one piece of data into consideration. | |
10:31.880 --> 10:35.080 | |
We live in a big universe. | |
10:35.080 --> 10:38.480 | |
There's two trillion galaxies in our observable universe | |
10:38.480 --> 10:41.640 | |
with 200 billion stars in each galaxy, et cetera. | |
10:41.640 --> 10:44.920 | |
It would seem to be a waste of resources | |
10:44.920 --> 10:47.600 | |
to have a universe that big going on just to do a simulation. | |
10:47.600 --> 10:50.120 | |
So in other words, I want to be a good Bayesian. | |
10:50.120 --> 10:54.920 | |
I want to ask, under this hypothesis, what do I expect to see? | |
10:54.920 --> 10:56.080 | |
So the first thing I would say is I | |
10:56.080 --> 11:00.280 | |
wouldn't expect to see a universe that was that big. | |
11:00.280 --> 11:02.560 | |
The second thing is I wouldn't expect the resolution | |
11:02.560 --> 11:05.000 | |
of the universe to be as good as it is. | |
11:05.000 --> 11:08.960 | |
So it's always possible that if our superhuman simulators only | |
11:08.960 --> 11:10.840 | |
have finite resources that they don't render | |
11:10.840 --> 11:14.360 | |
the entire universe, that the part that is out there, | |
11:14.360 --> 11:17.040 | |
the two trillion galaxies, isn't actually | |
11:17.040 --> 11:19.600 | |
being simulated fully. | |
11:19.600 --> 11:22.720 | |
But then the obvious extrapolation of that | |
11:22.720 --> 11:25.640 | |
is that only I am being simulated fully. | |
11:25.640 --> 11:29.240 | |
The rest of you are just nonplayer characters. | |
11:29.240 --> 11:30.520 | |
I'm the only thing that is real. | |
11:30.520 --> 11:32.720 | |
The rest of you are just chatbots. | |
11:32.720 --> 11:34.320 | |
Beyond this wall, I see the wall, | |
11:34.320 --> 11:37.360 | |
but there is literally nothing on the other side of the wall. | |
11:37.360 --> 11:39.000 | |
That is sort of the Bayesian prediction. | |
11:39.000 --> 11:40.400 | |
That's what it would be like to do | |
11:40.400 --> 11:42.240 | |
an efficient simulation of me. | |
11:42.240 --> 11:45.760 | |
So none of that seems quite realistic. | |
11:45.760 --> 11:50.880 | |
I don't see, I hear the argument that it's just possible | |
11:50.880 --> 11:53.280 | |
and easy to simulate lots of things. | |
11:53.280 --> 11:57.280 | |
I don't see any evidence from what we know about our universe | |
11:57.280 --> 11:59.280 | |
that we look like a simulated universe. | |
11:59.280 --> 12:01.120 | |
Now, maybe you can say, well, we don't know what it would | |
12:01.120 --> 12:03.000 | |
look like, but that's just abandoning | |
12:03.000 --> 12:04.520 | |
your Bayesian responsibilities. | |
12:04.520 --> 12:07.680 | |
Like your job is to say, under this theory, | |
12:07.680 --> 12:09.480 | |
here's what you would expect to see. | |
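A toy version of that Bayesian bookkeeping, with entirely made-up likelihoods just to show the shape of the argument:

```python
# Toy Bayes update for "we observe a huge, high-resolution universe".
# All numbers are illustrative assumptions, not measured quantities.
prior_sim = 0.5                  # agnostic prior on the simulation hypothesis
p_obs_given_sim = 0.01           # a vast, fine-grained cosmos is wasteful to simulate
p_obs_given_real = 0.5           # unsurprising if the universe simply exists

posterior_sim = (p_obs_given_sim * prior_sim) / (
    p_obs_given_sim * prior_sim + p_obs_given_real * (1 - prior_sim))
print(f"{posterior_sim:.3f}")    # ~0.020: the observation weighs against simulation
```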
12:09.480 --> 12:11.680 | |
Yeah, so certainly if you think about a simulation | |
12:11.680 --> 12:16.680 | |
as a thing that's like a video game where only a small subset | |
12:16.680 --> 12:22.880 | |
is being rendered, but say all the laws of physics, | |
12:22.880 --> 12:26.560 | |
the entire closed system of the quote unquote universe, | |
12:26.560 --> 12:27.800 | |
it had a creator. | |
12:27.800 --> 12:29.640 | |
Yeah, it's always possible. | |
12:29.640 --> 12:32.280 | |
So that's not useful to think about | |
12:32.280 --> 12:34.040 | |
when you're thinking about physics. | |
12:34.040 --> 12:38.080 | |
The way Nick Bostrom phrases it, if it's possible | |
12:38.080 --> 12:40.520 | |
to simulate a universe, eventually we'll do it. | |
12:40.520 --> 12:42.720 | |
Right. | |
12:42.720 --> 12:45.560 | |
You can use that, by the way, for a lot of things. | |
12:45.560 --> 12:49.840 | |
But I guess the question is, how hard is it | |
12:49.840 --> 12:52.320 | |
to create a universe? | |
12:52.320 --> 12:53.800 | |
I wrote a little blog post about this, | |
12:53.800 --> 12:55.440 | |
and maybe I'm missing something. | |
12:55.440 --> 12:57.680 | |
But there's an argument that says not only | |
12:57.680 --> 13:00.480 | |
that it might be possible to simulate a universe, | |
13:00.480 --> 13:05.400 | |
but probably, if you imagine that you actually | |
13:05.400 --> 13:07.320 | |
attribute consciousness and agency | |
13:07.320 --> 13:09.920 | |
to the little things that we're simulating, | |
13:09.920 --> 13:12.400 | |
to our little artificial beings, there's probably | |
13:12.400 --> 13:15.000 | |
a lot more of them than there are ordinary organic beings | |
13:15.000 --> 13:17.400 | |
in the universe, or there will be in the future. | |
13:17.400 --> 13:19.600 | |
So there's an argument that not only is being a simulation | |
13:19.600 --> 13:23.520 | |
possible, it's probable, because in the space | |
13:23.520 --> 13:25.480 | |
of all living consciousnesses, most of them | |
13:25.480 --> 13:26.600 | |
are being simulated. | |
13:26.600 --> 13:28.840 | |
Most of them are not at the top level. | |
13:28.840 --> 13:30.520 | |
I think that argument must be wrong, | |
13:30.520 --> 13:34.080 | |
because it follows from that argument that if we're simulated, | |
13:34.080 --> 13:36.880 | |
we can also simulate other things. | |
13:36.880 --> 13:38.800 | |
Well, but if we can simulate other things, | |
13:38.800 --> 13:41.800 | |
they can simulate other things. | |
13:41.800 --> 13:44.280 | |
If we give them enough power and resolution, | |
13:44.280 --> 13:46.000 | |
and ultimately, we'll reach a bottom, | |
13:46.000 --> 13:47.800 | |
because the laws of physics in our universe | |
13:47.800 --> 13:51.120 | |
have a bottom, we're made of atoms and so forth. | |
13:51.120 --> 13:55.080 | |
So there will be the cheapest possible simulations. | |
13:55.080 --> 13:57.680 | |
And if you believe the original argument, | |
13:57.680 --> 13:59.920 | |
you should conclude that we should be in the cheapest | |
13:59.920 --> 14:02.560 | |
possible simulation, because that's where most people are. | |
14:02.560 --> 14:03.640 | |
But we don't look like that. | |
14:03.640 --> 14:06.880 | |
It doesn't look at all like we're at the edge of resolution, | |
14:06.880 --> 14:09.960 | |
that we're 16-bit things. | |
14:09.960 --> 14:12.840 | |
It seems much easier to make much lower level things | |
14:12.840 --> 14:14.160 | |
than we are. | |
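A cartoon of that regress, under assumed branching and fidelity numbers:

```python
# Toy sketch: if every simulated world runs its own, cheaper simulations,
# worlds multiply with depth while fidelity falls. Numbers are assumptions.
branching = 3        # simulations spawned per world
fidelity_drop = 10   # factor by which resolution falls per level

worlds, fidelity = 1, 1.0
for depth in range(1, 6):
    worlds *= branching
    fidelity /= fidelity_drop
    print(f"depth {depth}: {worlds} worlds at fidelity {fidelity:g}")
# Most simulated observers sit at the deepest, coarsest level, so a typical
# simulated observer should see a low-resolution world -- unlike ours.
```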
14:14.160 --> 14:18.200 | |
So, and also, I question the whole approach | |
14:18.200 --> 14:19.840 | |
to the anthropic principle that says | |
14:19.840 --> 14:22.320 | |
we are typical observers in the universe. | |
14:22.320 --> 14:23.640 | |
I think that that's not actually, | |
14:23.640 --> 14:27.320 | |
I think that there's a lot of selection that we can do | |
14:27.320 --> 14:30.120 | |
that we're typical within things we already know, | |
14:30.120 --> 14:32.240 | |
but not typical within all the universe. | |
14:32.240 --> 14:35.760 | |
So do you think there is intelligent life, | |
14:35.760 --> 14:37.800 | |
however you would like to define intelligent life | |
14:37.800 --> 14:39.920 | |
out there in the universe? | |
14:39.920 --> 14:44.640 | |
My guess is that there is not intelligent life | |
14:44.640 --> 14:46.840 | |
in the observable universe other than us. | |
14:48.320 --> 14:52.480 | |
Simply on the basis of the fact that the likely number | |
14:52.480 --> 14:56.320 | |
of other intelligent species in the observable universe, | |
14:56.320 --> 15:00.280 | |
there's two likely numbers, zero or billions. | |
15:01.480 --> 15:02.560 | |
And if there had been billions, | |
15:02.560 --> 15:04.000 | |
you would have noticed already. | |
15:05.040 --> 15:07.320 | |
For there to be literally like a small number, | |
15:07.320 --> 15:12.320 | |
like Star Trek, there's a dozen intelligent civilizations | |
15:12.440 --> 15:15.040 | |
in our galaxy, but not a billion. | |
15:16.240 --> 15:18.480 | |
That's weird, that's sort of bizarre to me. | |
15:18.480 --> 15:21.000 | |
It's easy for me to imagine that there are zero others | |
15:21.000 --> 15:22.600 | |
because there's just a big bottleneck | |
15:22.600 --> 15:24.960 | |
to making multicellular life | |
15:24.960 --> 15:27.040 | |
or technological life or whatever. | |
15:27.040 --> 15:28.560 | |
It's very hard for me to imagine | |
15:28.560 --> 15:30.160 | |
that there's a whole bunch out there | |
15:30.160 --> 15:32.280 | |
that have somehow remained hidden from us. | |
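A Drake-style toy calculation, with assumed factors, showing why "zero or billions" are the natural outcomes:

```python
# Toy Drake-style estimate. The expected count is a product of factors,
# and the abiogenesis probability alone is uncertain by orders of magnitude.
n_stars = 2e11                   # stars in a galaxy like ours (order of magnitude)
f_habitable = 0.1                # assumed fraction of stars with a habitable planet
f_tech_given_life = 0.01         # assumed odds life becomes technological

for f_life in (1e-20, 1e-10, 1e-2):      # the bottleneck: chance life starts at all
    expected = n_stars * f_habitable * f_life * f_tech_given_life
    print(f"P(abiogenesis) = {f_life:g}: expect ~{expected:g} civilizations")
# ~2e-12, ~0.02, or ~2e+06: small shifts in one unknown swing the answer
# between "alone" and "everywhere"; landing near a dozen needs fine-tuning.
```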
15:32.280 --> 15:34.880 | |
The question I'd like to ask is, | |
15:34.880 --> 15:37.240 | |
what would intelligent life look like? | |
15:37.240 --> 15:41.120 | |
What I mean by that question and where it's going is, | |
15:41.120 --> 15:45.120 | |
what if intelligent life is just fundamentally, | |
15:45.120 --> 15:49.120 | |
in some very big ways, different than the one | |
15:49.120 --> 15:51.480 | |
that has arisen on Earth. | |
15:51.480 --> 15:53.880 | |
That there's all kinds of intelligent life | |
15:53.880 --> 15:57.560 | |
that operates at different scales, both spatial and temporal. | |
15:57.560 --> 15:59.280 | |
That's a great possibility | |
15:59.280 --> 16:00.800 | |
because I think we should be humble | |
16:00.800 --> 16:02.640 | |
about what intelligence is, what life is. | |
16:02.640 --> 16:04.040 | |
We don't even agree on what life is, | |
16:04.040 --> 16:06.040 | |
much less what intelligent life is, right? | |
16:06.040 --> 16:08.200 | |
So that's an argument for humility, | |
16:08.200 --> 16:10.080 | |
saying there could be intelligent life | |
16:10.080 --> 16:12.800 | |
of a very different character, right? | |
16:12.800 --> 16:17.240 | |
You could imagine that dolphins are intelligent | |
16:17.240 --> 16:19.760 | |
but never invent space travel | |
16:19.760 --> 16:20.760 | |
because they live in the ocean | |
16:20.760 --> 16:22.760 | |
and they don't have thumbs, right? | |
16:22.760 --> 16:25.840 | |
So they never invent technology, they never invent smelting. | |
16:26.840 --> 16:31.200 | |
Maybe the universe is full of intelligent species | |
16:31.200 --> 16:33.200 | |
that just don't make technology, right? | |
16:33.200 --> 16:35.440 | |
That's compatible with the data, I think. | |
16:35.440 --> 16:38.560 | |
And I think maybe what you're pointing at | |
16:38.560 --> 16:42.560 | |
is even more out there versions of intelligence, | |
16:42.560 --> 16:46.240 | |
you know, intelligence in interstellar molecular clouds | |
16:46.240 --> 16:48.160 | |
or on the surface of a neutron star | |
16:48.160 --> 16:50.360 | |
or in between the galaxies in giant things | |
16:50.360 --> 16:52.840 | |
where the equivalent of a heartbeat is 100 million years. | |
16:54.840 --> 16:56.760 | |
On the one hand, yes, | |
16:56.760 --> 16:58.560 | |
we should be very open minded about those things. | |
16:58.560 --> 17:03.560 | |
On the other hand, we, all of us, share the same laws of physics. | |
17:03.560 --> 17:07.040 | |
There might be something about the laws of physics | |
17:07.040 --> 17:08.560 | |
even though we don't currently know exactly | |
17:08.560 --> 17:12.560 | |
what that thing would be that makes meters | |
17:12.560 --> 17:16.560 | |
and years the right length and time scales | |
17:16.560 --> 17:19.560 | |
for intelligent life. Maybe not. | |
17:19.560 --> 17:22.560 | |
But we're made of atoms, atoms have a certain size, | |
17:22.560 --> 17:25.560 | |
we orbit stars, our stars have a certain lifetime. | |
17:25.560 --> 17:28.560 | |
It's not impossible to me that there's a sweet spot | |
17:28.560 --> 17:30.560 | |
for intelligent life that we find ourselves in. | |
17:30.560 --> 17:33.560 | |
So I'm open-minded either way. I'm open-minded either being humble | |
17:33.560 --> 17:35.560 | |
and there's all sorts of different kinds of life | |
17:35.560 --> 17:37.560 | |
or no, there's a reason we just don't know it yet | |
17:37.560 --> 17:40.560 | |
why life like ours is the kind of life that's out there. | |
17:40.560 --> 17:43.560 | |
Yeah, I'm of two minds too, but I often wonder | |
17:43.560 --> 17:48.560 | |
if our brains are just designed, quite obviously, | |
17:48.560 --> 17:53.560 | |
to operate and see the world on these time scales. | |
17:53.560 --> 17:57.560 | |
And we're almost blind and the tools we've created | |
17:57.560 --> 18:01.560 | |
for detecting things are blind to the kind of observation | |
18:01.560 --> 18:04.560 | |
needed to see intelligent life at other scales. | |
18:04.560 --> 18:06.560 | |
Well, I'm totally open to that, | |
18:06.560 --> 18:08.560 | |
but so here's another argument I would make. | |
18:08.560 --> 18:10.560 | |
We have looked for intelligent life, | |
18:10.560 --> 18:13.560 | |
but we've looked for it in the dumbest way we can | |
18:13.560 --> 18:15.560 | |
by turning radio telescopes to the sky. | |
18:15.560 --> 18:20.560 | |
And why in the world would a super advanced civilization | |
18:20.560 --> 18:23.560 | |
randomly beam out radio signals wastefully | |
18:23.560 --> 18:25.560 | |
in all directions into the universe? | |
18:25.560 --> 18:28.560 | |
It just doesn't make any sense, especially because | |
18:28.560 --> 18:30.560 | |
in order to think that you would actually contact | |
18:30.560 --> 18:33.560 | |
another civilization, you would have to do it forever. | |
18:33.560 --> 18:35.560 | |
You have to keep doing it for millions of years. | |
18:35.560 --> 18:37.560 | |
That sounds like a waste of resources. | |
18:37.560 --> 18:42.560 | |
If you thought that there were other solar systems | |
18:42.560 --> 18:45.560 | |
with planets around them where maybe intelligent life | |
18:45.560 --> 18:48.560 | |
didn't yet exist, but might someday, | |
18:48.560 --> 18:51.560 | |
you wouldn't try to talk to it with radio waves. | |
18:51.560 --> 18:53.560 | |
You would send a spacecraft out there | |
18:53.560 --> 18:55.560 | |
and you would park it around there. | |
18:55.560 --> 18:57.560 | |
And it would be like, from our point of view, | |
18:57.560 --> 19:00.560 | |
it would be like 2001 where there was a monolith. | |
19:00.560 --> 19:02.560 | |
There could be an artifact. | |
19:02.560 --> 19:04.560 | |
In fact, the other way works also, right? | |
19:04.560 --> 19:07.560 | |
There could be artifacts in our solar system | |
19:07.560 --> 19:11.560 | |
that have been put there by other technologically advanced | |
19:11.560 --> 19:14.560 | |
civilizations, and that's how we will eventually contact them. | |
19:14.560 --> 19:16.560 | |
We just haven't explored the solar system well enough yet | |
19:16.560 --> 19:18.560 | |
to find them. | |
19:18.560 --> 19:20.560 | |
The reason why we don't think about that is because | |
19:20.560 --> 19:21.560 | |
we're young and impatient, right? | |
19:21.560 --> 19:23.560 | |
It's like it would take more than my lifetime | |
19:23.560 --> 19:25.560 | |
to actually send something to another star system | |
19:25.560 --> 19:27.560 | |
and wait for it and then come back. | |
19:27.560 --> 19:30.560 | |
But if we start thinking on hundreds of thousands of years | |
19:30.560 --> 19:32.560 | |
or a million year time scales, | |
19:32.560 --> 19:34.560 | |
that's clearly the right thing to do. | |
19:34.560 --> 19:38.560 | |
Are you excited by the thing that Elon Musk is doing with SpaceX | |
19:38.560 --> 19:41.560 | |
in general, but the idea of space exploration, | |
19:41.560 --> 19:45.560 | |
even though your species is young and impatient? | |
19:45.560 --> 19:50.560 | |
No, I do think that space travel is crucially important, long term. | |
19:50.560 --> 19:52.560 | |
Even to other star systems. | |
19:52.560 --> 19:57.560 | |
And I think that many people overestimate the difficulty | |
19:57.560 --> 20:00.560 | |
because they say, look, if you travel at 1% of the speed of light | |
20:00.560 --> 20:03.560 | |
to another star system, we'll be dead before we get there, right? | |
20:03.560 --> 20:05.560 | |
And I think that it's much easier. | |
20:05.560 --> 20:07.560 | |
And therefore, when they write their science fiction stories, | |
20:07.560 --> 20:09.560 | |
they imagine we'd go faster than the speed of light | |
20:09.560 --> 20:11.560 | |
because otherwise they're too impatient, right? | |
20:11.560 --> 20:13.560 | |
We're not going to go faster than the speed of light, | |
20:13.560 --> 20:15.560 | |
but we could easily imagine that the human lifespan | |
20:15.560 --> 20:17.560 | |
gets extended to thousands of years. | |
20:17.560 --> 20:19.560 | |
And once you do that, then the stars are much closer. | |
20:19.560 --> 20:20.560 | |
Effectively, right? | |
20:20.560 --> 20:22.560 | |
What's a 100-year trip, right? | |
20:22.560 --> 20:26.560 | |
So I think that that's going to be the future, the far future, | |
20:26.560 --> 20:29.560 | |
not my lifetime once again, but baby steps. | |
20:29.560 --> 20:31.560 | |
Unless your lifetime gets extended. | |
20:31.560 --> 20:33.560 | |
Well, it's a race against time, right? | |
20:33.560 --> 20:37.560 | |
A friend of mine who actually thinks about these things said, | |
20:37.560 --> 20:39.560 | |
you know, you and I are going to die, | |
20:39.560 --> 20:42.560 | |
but I don't know about our grandchildren. | |
20:42.560 --> 20:45.560 | |
I don't know, predicting the future is hard, | |
20:45.560 --> 20:47.560 | |
but that's the least plausible scenario. | |
20:47.560 --> 20:51.560 | |
And so, yeah, no, I think that as we discussed earlier, | |
20:51.560 --> 20:56.560 | |
there are threats to the earth, known and unknown, right? | |
20:56.560 --> 21:02.560 | |
Having spread humanity and biology elsewhere | |
21:02.560 --> 21:04.560 | |
is a really important long-term goal. | |
21:04.560 --> 21:08.560 | |
What kind of questions can science not currently answer, | |
21:08.560 --> 21:11.560 | |
but might soon? | |
21:11.560 --> 21:14.560 | |
When you think about the problems and the mysteries before us, | |
21:14.560 --> 21:17.560 | |
that may be within reach of science? | |
21:17.560 --> 21:19.560 | |
I think an obvious one is the origin of life. | |
21:19.560 --> 21:21.560 | |
We don't know how that happened. | |
21:21.560 --> 21:24.560 | |
There's a difficulty in knowing how it happened historically, | |
21:24.560 --> 21:26.560 | |
actually, you know, literally on earth, | |
21:26.560 --> 21:29.560 | |
but starting life from nonlife | |
21:29.560 --> 21:32.560 | |
is something I kind of think we're close to, right? | |
21:32.560 --> 21:33.560 | |
You really think so? | |
21:33.560 --> 21:35.560 | |
Like, how difficult is it to start life? | |
21:35.560 --> 21:36.560 | |
I do. | |
21:36.560 --> 21:40.560 | |
Well, I've talked to people, including on the podcast, about this. | |
21:40.560 --> 21:42.560 | |
You know, life requires three things. | |
21:42.560 --> 21:44.560 | |
Life as we know it. | |
21:44.560 --> 21:46.560 | |
There's a difference between life, who knows what it is, | |
21:46.560 --> 21:47.560 | |
and life as we know it, | |
21:47.560 --> 21:50.560 | |
which we can talk about with some intelligence. | |
21:50.560 --> 21:53.560 | |
Life as we know it requires compartmentalization. | |
21:53.560 --> 21:56.560 | |
You need a little membrane around your cell. | |
21:56.560 --> 21:58.560 | |
Metabolism, you need to take in food and eat it | |
21:58.560 --> 22:00.560 | |
and let that make you do things. | |
22:00.560 --> 22:02.560 | |
And then replication. | |
22:02.560 --> 22:04.560 | |
You need to have some information about who you are, | |
22:04.560 --> 22:07.560 | |
that you pass down to future generations. | |
22:07.560 --> 22:11.560 | |
In the lab, compartmentalization seems pretty easy, | |
22:11.560 --> 22:13.560 | |
not hard to make lipid bilayers | |
22:13.560 --> 22:16.560 | |
that form little cellular walls pretty easily. | |
22:16.560 --> 22:19.560 | |
Metabolism and replication are hard, | |
22:19.560 --> 22:21.560 | |
but replication we're close to. | |
22:21.560 --> 22:25.560 | |
People have made RNA-like molecules in the lab that... | |
22:25.560 --> 22:28.560 | |
I think the state of the art is | |
22:28.560 --> 22:31.560 | |
they're not able to make one molecule that reproduces itself, | |
22:31.560 --> 22:34.560 | |
but they're able to make two molecules that reproduce each other. | |
22:34.560 --> 22:37.560 | |
So that's okay. That's pretty close. | |
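A toy dynamical sketch of that cross-catalytic setup (illustrative rate equations and constants, not the actual lab chemistry):

```python
# Two RNA-like species, x and y, each catalyzing the other's replication,
# with first-order decay and a shared, finite resource pool (capacity K).
k, d, K, dt = 1.0, 0.02, 1.0, 0.01    # catalysis, decay, capacity, Euler step
x, y = 0.15, 0.15                     # initial concentrations (above threshold)

for _ in range(5000):
    growth = k * x * y * (1.0 - (x + y) / K)   # mutual catalysis, limited food
    x += (growth - d * x) * dt
    y += (growth - d * y) * dt

print(round(x, 3), round(y, 3))       # both settle near ~0.48: replication
# took off; start below ~0.02 instead and decay wins, nothing replicates.
```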
22:37.560 --> 22:40.560 | |
Metabolism is harder, believe it or not, | |
22:40.560 --> 22:42.560 | |
even though it's sort of the most obvious thing, | |
22:42.560 --> 22:44.560 | |
but you want some sort of controlled metabolism | |
22:44.560 --> 22:48.560 | |
and the actual cellular machinery in our bodies is quite complicated. | |
22:48.560 --> 22:51.560 | |
It's hard to see it just popping into existence all by itself. | |
22:51.560 --> 22:53.560 | |
It probably took a while. | |
22:53.560 --> 22:55.560 | |
But we're making progress. | |
22:55.560 --> 22:58.560 | |
In fact, I don't think we're spending nearly enough money on it. | |
22:58.560 --> 23:01.560 | |
If I were the NSF, I would flood this area with money | |
23:01.560 --> 23:04.560 | |
because it would change our view of the world | |
23:04.560 --> 23:06.560 | |
if we could actually make life in the lab | |
23:06.560 --> 23:09.560 | |
and understand how it was made originally here on Earth. | |
23:09.560 --> 23:11.560 | |
I'm sure it would have some ripple effects | |
23:11.560 --> 23:13.560 | |
that help cure diseases and so on. | |
23:13.560 --> 23:15.560 | |
That's right. | |
23:15.560 --> 23:18.560 | |
Synthetic biology is a wonderful big frontier where we're making cells. | |
23:18.560 --> 23:21.560 | |
Right now, the best way to do that | |
23:21.560 --> 23:23.560 | |
is to borrow heavily from existing biology. | |
23:23.560 --> 23:26.560 | |
Craig Venter several years ago created an artificial cell, | |
23:26.560 --> 23:28.560 | |
but all he did was... | |
23:28.560 --> 23:30.560 | |
not all he did, it was a tremendous accomplishment, | |
23:30.560 --> 23:33.560 | |
but all he did was take out the DNA from a cell | |
23:33.560 --> 23:36.560 | |
and put in entirely new DNA and let it boot up and go. | |
23:36.560 --> 23:43.560 | |
What about the leap to creating intelligent life on Earth? | |
23:43.560 --> 23:45.560 | |
However, again, we define intelligence, of course, | |
23:45.560 --> 23:49.560 | |
but let's just even say homo sapiens, | |
23:49.560 --> 23:54.560 | |
the modern intelligence in our human brain. | |
23:54.560 --> 23:58.560 | |
Do you have a sense of what's involved in that leap | |
23:58.560 --> 24:00.560 | |
and how big of a leap that is? | |
24:00.560 --> 24:02.560 | |
So AI would count in this? | |
24:02.560 --> 24:04.560 | |
Or do you really want life? | |
24:04.560 --> 24:06.560 | |
AI would count in some sense. | |
24:06.560 --> 24:08.560 | |
AI would count, I think. | |
24:08.560 --> 24:10.560 | |
Of course, AI would count. | |
24:10.560 --> 24:12.560 | |
Well, let's say artificial consciousness. | |
24:12.560 --> 24:14.560 | |
I do not think we are on the threshold | |
24:14.560 --> 24:16.560 | |
of creating artificial consciousness. | |
24:16.560 --> 24:18.560 | |
I think it's possible. | |
24:18.560 --> 24:20.560 | |
I'm not, again, very educated about how close we are, | |
24:20.560 --> 24:22.560 | |
but my impression is not that we're really close | |
24:22.560 --> 24:24.560 | |
because we understand how little we understand | |
24:24.560 --> 24:26.560 | |
of consciousness and what it is. | |
24:26.560 --> 24:28.560 | |
So if we don't have any idea what it is, | |
24:28.560 --> 24:30.560 | |
it's hard to imagine we're on the threshold | |
24:30.560 --> 24:32.560 | |
of making it ourselves. | |
24:32.560 --> 24:34.560 | |
But it's doable, it's possible. | |
24:34.560 --> 24:36.560 | |
I don't see any obstacles in principle, | |
24:36.560 --> 24:38.560 | |
so yeah, I would hold out some interest | |
24:38.560 --> 24:40.560 | |
in that happening eventually. | |
24:40.560 --> 24:42.560 | |
I think, in general, with consciousness, | |
24:42.560 --> 24:44.560 | |
we would just be surprised | |
24:44.560 --> 24:46.560 | |
how easy consciousness is | |
24:46.560 --> 24:48.560 | |
once we create intelligence. | |
24:48.560 --> 24:50.560 | |
I think consciousness is a thing | |
24:50.560 --> 24:54.560 | |
that's just something we all fake. | |
24:54.560 --> 24:56.560 | |
Well, good. | |
24:56.560 --> 24:58.560 | |
No, actually, I like this idea that, in fact, | |
24:58.560 --> 25:00.560 | |
consciousness is way less mysterious than we think | |
25:00.560 --> 25:02.560 | |
because we're all at every time, | |
25:02.560 --> 25:04.560 | |
at every moment, less conscious than we think we are. | |
25:04.560 --> 25:06.560 | |
We can fool things. | |
25:06.560 --> 25:08.560 | |
And I think that plus the idea that you | |
25:08.560 --> 25:10.560 | |
not only have artificial intelligence systems, | |
25:10.560 --> 25:12.560 | |
but you put them in a body, | |
25:12.560 --> 25:14.560 | |
give them a robot body, | |
25:14.560 --> 25:18.560 | |
that will help the faking a lot. | |
25:18.560 --> 25:20.560 | |
Yeah, I think creating consciousness, | |
25:20.560 --> 25:22.560 | |
artificial consciousness, | |
25:22.560 --> 25:24.560 | |
is as simple | |
25:24.560 --> 25:26.560 | |
as asking a Roomba | |
25:26.560 --> 25:28.560 | |
to say, I'm conscious | |
25:28.560 --> 25:32.560 | |
and refusing to be talked out of it. | |
25:32.560 --> 25:34.560 | |
It could be. | |
25:34.560 --> 25:36.560 | |
I mean, I'm almost being silly, | |
25:36.560 --> 25:38.560 | |
but that's what we do. | |
25:38.560 --> 25:40.560 | |
That's what we do with each other. | |
25:40.560 --> 25:44.560 | |
Consciousness is also a social construct, | |
25:44.560 --> 25:46.560 | |
and a lot of our ideas of intelligence | |
25:46.560 --> 25:48.560 | |
are social constructs, | |
25:48.560 --> 25:50.560 | |
and so reaching that bar involves | |
25:50.560 --> 25:52.560 | |
something that's beyond, | |
25:52.560 --> 25:54.560 | |
that doesn't necessarily involve | |
25:54.560 --> 25:56.560 | |
the fundamental understanding | |
25:56.560 --> 25:58.560 | |
of how you go from | |
25:58.560 --> 26:00.560 | |
electrons to neurons | |
26:00.560 --> 26:02.560 | |
to cognition. | |
26:02.560 --> 26:04.560 | |
No, actually, I think that is an extremely good point, | |
26:04.560 --> 26:06.560 | |
and in fact, | |
26:06.560 --> 26:08.560 | |
what it suggests is, | |
26:08.560 --> 26:10.560 | |
so yeah, you referred to Kate Darling, | |
26:10.560 --> 26:12.560 | |
who I had on the podcast, | |
26:12.560 --> 26:14.560 | |
and who does these experiments with | |
26:14.560 --> 26:16.560 | |
very simple robots, | |
26:16.560 --> 26:18.560 | |
but they look like animals, | |
26:18.560 --> 26:20.560 | |
and they can look like they're experiencing pain, | |
26:20.560 --> 26:22.560 | |
and we human beings react | |
26:22.560 --> 26:24.560 | |
very negatively to these little robots | |
26:24.560 --> 26:26.560 | |
looking like they're experiencing pain, | |
26:26.560 --> 26:28.560 | |
and what you want to say is, | |
26:28.560 --> 26:30.560 | |
yeah, but they're just robots. | |
26:30.560 --> 26:32.560 | |
It's not really pain. | |
26:32.560 --> 26:34.560 | |
It's just some electrons going around, | |
26:34.560 --> 26:36.560 | |
but then you realize you and I | |
26:36.560 --> 26:38.560 | |
are just electrons going around, | |
26:38.560 --> 26:40.560 | |
and that's what pain is also. | |
26:40.560 --> 26:42.560 | |
What I would have an easy time imagining | |
26:42.560 --> 26:44.560 | |
is that there is a spectrum | |
26:44.560 --> 26:46.560 | |
between these simple little robots | |
26:46.560 --> 26:48.560 | |
that Kate works with | |
26:48.560 --> 26:50.560 | |
and a human being, | |
26:50.560 --> 26:52.560 | |
where there are things that, | |
26:52.560 --> 26:54.560 | |
like a human Turing test level thing | |
26:54.560 --> 26:56.560 | |
are not conscious, | |
26:56.560 --> 26:58.560 | |
but nevertheless walk and talk | |
26:58.560 --> 27:00.560 | |
like they're conscious, | |
27:00.560 --> 27:02.560 | |
and it could be that the future is, | |
27:02.560 --> 27:04.560 | |
I mean, Siri is close, right? | |
27:04.560 --> 27:06.560 | |
And so it might be the future | |
27:06.560 --> 27:08.560 | |
has a lot more agents like that, | |
27:08.560 --> 27:10.560 | |
and in fact, rather than someday going, | |
27:10.560 --> 27:12.560 | |
aha, we have consciousness, | |
27:12.560 --> 27:14.560 | |
we'll just creep up on it | |
27:14.560 --> 27:16.560 | |
with more and more accurate reflections | |
27:16.560 --> 27:18.560 | |
of what we expect. | |
27:18.560 --> 27:20.560 | |
And in the future, maybe the present, | |
27:20.560 --> 27:22.560 | |
and you're basically assuming | |
27:22.560 --> 27:24.560 | |
that I'm human. | |
27:24.560 --> 27:26.560 | |
I'd give it a high probability. | |
27:26.560 --> 27:28.560 | |
At this time, because the, | |
27:28.560 --> 27:30.560 | |
but in the future, | |
27:30.560 --> 27:32.560 | |
there might be question marks around that, right? | |
27:32.560 --> 27:34.560 | |
Yeah, no, absolutely. | |
27:34.560 --> 27:36.560 | |
Certainly videos are almost to the point | |
27:36.560 --> 27:38.560 | |
where you shouldn't trust them already. | |
27:38.560 --> 27:40.560 | |
Photos you can't trust, right? | |
27:40.560 --> 27:42.560 | |
Videos are easier to trust, | |
27:42.560 --> 27:44.560 | |
but we're getting worse. | |
27:44.560 --> 27:46.560 | |
We're getting better at faking them, right? | |
27:46.560 --> 27:48.560 | |
Yeah, so physical, embodied people, | |
27:48.560 --> 27:50.560 | |
what's so hard about faking that? | |
27:50.560 --> 27:52.560 | |
This is very depressing, | |
27:52.560 --> 27:54.560 | |
this conversation we're having right now. | |
27:54.560 --> 27:56.560 | |
To me, it's exciting. | |
27:56.560 --> 27:58.560 | |
You're doing it, so it's exciting to you, | |
27:58.560 --> 28:00.560 | |
but it's a sobering thought. | |
28:00.560 --> 28:02.560 | |
We're very bad at imagining | |
28:02.560 --> 28:04.560 | |
what the next 50 years are going to be like | |
28:04.560 --> 28:06.560 | |
when we're in the middle of a phase transition | |
28:06.560 --> 28:08.560 | |
as we are right now. | |
28:08.560 --> 28:10.560 | |
Yeah, and in general, | |
28:10.560 --> 28:12.560 | |
I'm not blind to all the threats. | |
28:12.560 --> 28:14.560 | |
I am excited by the power of technology | |
28:14.560 --> 28:16.560 | |
to solve these problems | |
28:16.560 --> 28:18.560 | |
as they evolve. | |
28:18.560 --> 28:20.560 | |
I'm not as optimistic about the world | |
28:20.560 --> 28:22.560 | |
as Steven Pinker is, | |
28:22.560 --> 28:24.560 | |
but in everything I've seen, | |
28:24.560 --> 28:26.560 | |
all the brilliant people in the world | |
28:26.560 --> 28:28.560 | |
that I've met are good people. | |
28:28.560 --> 28:30.560 | |
So the army of the good | |
28:30.560 --> 28:32.560 | |
in terms of the development of technology is large. | |
28:32.560 --> 28:34.560 | |
Okay, you're way more | |
28:34.560 --> 28:36.560 | |
optimistic than I am. | |
28:36.560 --> 28:38.560 | |
I think that goodness and badness | |
28:38.560 --> 28:40.560 | |
are equally distributed among intelligent | |
28:40.560 --> 28:42.560 | |
and unintelligent people. | |
28:42.560 --> 28:44.560 | |
I don't see much of a correlation there. | |
28:44.560 --> 28:46.560 | |
Interesting. | |
28:46.560 --> 28:48.560 | |
Neither of us have proof. | |
28:48.560 --> 28:50.560 | |
Yeah, exactly. Again, opinions are free, right? | |
28:50.560 --> 28:52.560 | |
Nor definitions of good and evil. | |
28:52.560 --> 28:54.560 | |
Without definitions | |
28:54.560 --> 28:56.560 | |
or without data, | |
28:56.560 --> 28:58.560 | |
opinions. | |
28:58.560 --> 29:00.560 | |
So what kind of questions can science not | |
29:00.560 --> 29:02.560 | |
currently answer | |
29:02.560 --> 29:04.560 | |
and may never be able to answer in your view? | |
29:04.560 --> 29:06.560 | |
Well, the obvious one is what is good and bad. | |
29:06.560 --> 29:08.560 | |
What is right and wrong? | |
29:08.560 --> 29:10.560 | |
I think that there are questions that science tells us | |
29:10.560 --> 29:12.560 | |
what happens, what the world is, | |
29:12.560 --> 29:14.560 | |
but it doesn't say what the world should do | |
29:14.560 --> 29:16.560 | |
or what we should do because we're part of the world. | |
29:16.560 --> 29:18.560 | |
But we are part of the world | |
29:18.560 --> 29:20.560 | |
and we have the ability to feel like | |
29:20.560 --> 29:22.560 | |
something's right, something's wrong. | |
29:22.560 --> 29:24.560 | |
And to make a very long story | |
29:24.560 --> 29:26.560 | |
very short, I think that the idea | |
29:26.560 --> 29:28.560 | |
of moral philosophy is | |
29:28.560 --> 29:30.560 | |
systematizing our intuitions of what is right | |
29:30.560 --> 29:32.560 | |
and what is wrong. | |
29:32.560 --> 29:34.560 | |
And science might be able to predict ahead of time | |
29:34.560 --> 29:36.560 | |
what we will do, | |
29:36.560 --> 29:38.560 | |
but it won't ever be able to judge | |
29:38.560 --> 29:40.560 | |
whether we should have done it or not. | |
29:40.560 --> 29:42.560 | |
You know, you're kind of unique in terms of scientists. | |
29:42.560 --> 29:44.560 | |
It doesn't | |
29:44.560 --> 29:46.560 | |
have to do with podcasts, but | |
29:46.560 --> 29:48.560 | |
even just reaching out, I think you refer to it | |
29:48.560 --> 29:50.560 | |
as sort of doing interdisciplinary science. | |
29:50.560 --> 29:52.560 | |
So you reach out | |
29:52.560 --> 29:54.560 | |
and talk to people | |
29:54.560 --> 29:56.560 | |
that are outside of your discipline, | |
29:56.560 --> 29:58.560 | |
which I always | |
29:58.560 --> 30:00.560 | |
hoped that's what science was for. | |
30:00.560 --> 30:02.560 | |
In fact, I was a little disillusioned | |
30:02.560 --> 30:04.560 | |
when I realized that academia | |
30:04.560 --> 30:06.560 | |
is very siloed. | |
30:06.560 --> 30:08.560 | |
Yeah. | |
30:08.560 --> 30:10.560 | |
The question is | |
30:10.560 --> 30:12.560 | |
how, | |
30:12.560 --> 30:14.560 | |
at your own level, how do you prepare for these conversations? | |
30:14.560 --> 30:16.560 | |
How do you think about these conversations? | |
30:16.560 --> 30:18.560 | |
How do you open your mind enough | |
30:18.560 --> 30:20.560 | |
to have these conversations? | |
30:20.560 --> 30:22.560 | |
And it may be a little bit broader. | |
30:22.560 --> 30:24.560 | |
How can you advise other scientists | |
30:24.560 --> 30:26.560 | |
to have these kinds of conversations? | |
30:26.560 --> 30:28.560 | |
Not at the podcast. | |
30:28.560 --> 30:30.560 | |
The fact that you're doing a podcast is awesome. | |
30:30.560 --> 30:32.560 | |
Other people get to hear them. | |
30:32.560 --> 30:34.560 | |
But it's also good to have it without mics in general. | |
30:34.560 --> 30:36.560 | |
It's a good question, but a tough one | |
30:36.560 --> 30:38.560 | |
to answer. I think about | |
30:38.560 --> 30:40.560 | |
a guy I know is a personal trainer | |
30:40.560 --> 30:42.560 | |
and he was asked on a podcast | |
30:42.560 --> 30:44.560 | |
how do we psych ourselves up | |
30:44.560 --> 30:46.560 | |
to do a workout? How do we make | |
30:46.560 --> 30:48.560 | |
that discipline to go and work out? | |
30:48.560 --> 30:50.560 | |
And he's like, why are you asking me? | |
30:50.560 --> 30:52.560 | |
I can't stop working out. | |
30:52.560 --> 30:54.560 | |
I don't need to psych myself up. | |
30:54.560 --> 30:56.560 | |
Likewise, you asked me | |
30:56.560 --> 30:58.560 | |
how do you get to have | |
30:58.560 --> 31:00.560 | |
interdisciplinary conversations and all sorts of different things | |
31:00.560 --> 31:02.560 | |
with all sorts of different people? | |
31:02.560 --> 31:04.560 | |
That's what makes me go. | |
31:04.560 --> 31:06.560 | |
I couldn't stop | |
31:06.560 --> 31:08.560 | |
doing that. I did that long before | |
31:08.560 --> 31:10.560 | |
any of them were recorded. In fact, | |
31:10.560 --> 31:12.560 | |
a lot of the motivation for starting recording it | |
31:12.560 --> 31:14.560 | |
was making sure I would read all these books | |
31:14.560 --> 31:16.560 | |
that I had purchased. All these books | |
31:16.560 --> 31:18.560 | |
I wanted to read. Not enough time to read them. | |
31:18.560 --> 31:20.560 | |
And now, if I have the motivation | |
31:20.560 --> 31:22.560 | |
because I'm going to interview Pat | |
31:22.560 --> 31:24.560 | |
Churchland, I'm going to finally read her | |
31:24.560 --> 31:26.560 | |
book. | |
31:26.560 --> 31:28.560 | |
And | |
31:28.560 --> 31:30.560 | |
it's absolutely true that academia is | |
31:30.560 --> 31:32.560 | |
extraordinarily siloed. We don't talk to people. | |
31:32.560 --> 31:34.560 | |
We rarely do. | |
31:34.560 --> 31:36.560 | |
And in fact, when we do, it's punished. | |
31:36.560 --> 31:38.560 | |
The people who do it successfully | |
31:38.560 --> 31:40.560 | |
generally first became | |
31:40.560 --> 31:42.560 | |
very successful within their little siloed discipline. | |
31:42.560 --> 31:44.560 | |
And only then | |
31:44.560 --> 31:46.560 | |
did they start expanding out. | |
31:46.560 --> 31:48.560 | |
If you're a young person, I have graduate students | |
31:48.560 --> 31:50.560 | |
and I try to be very, very | |
31:50.560 --> 31:52.560 | |
candid with them about this. | |
31:52.560 --> 31:54.560 | |
That | |
31:54.560 --> 31:56.560 | |
most graduate students do not become faculty members. | |
31:56.560 --> 31:58.560 | |
It's a tough road. | |
31:58.560 --> 32:00.560 | |
And so | |
32:00.560 --> 32:02.560 | |
you live the life you want to live | |
32:02.560 --> 32:04.560 | |
but do it with your eyes open | |
32:04.560 --> 32:06.560 | |
about what it does to your job chances. | |
32:06.560 --> 32:08.560 | |
And the more | |
32:08.560 --> 32:10.560 | |
broad you are and the less | |
32:10.560 --> 32:12.560 | |
time you spend hyper | |
32:12.560 --> 32:14.560 | |
specializing in your field, the lower | |
32:14.560 --> 32:16.560 | |
your job chances are. That's just an academic | |
32:16.560 --> 32:18.560 | |
reality. It's terrible. I don't like it. | |
32:18.560 --> 32:20.560 | |
But it's a reality. | |
32:20.560 --> 32:22.560 | |
And for some people | |
32:22.560 --> 32:24.560 | |
that's fine. Like there's plenty of people | |
32:24.560 --> 32:26.560 | |
who are wonderful scientists who have zero | |
32:26.560 --> 32:28.560 | |
interest in branching out and talking | |
32:28.560 --> 32:30.560 | |
about things to anyone outside their field. | |
32:30.560 --> 32:32.560 | |
But | |
32:32.560 --> 32:34.560 | |
it is disillusioning to me | |
32:34.560 --> 32:36.560 | |
some of the romantic notion | |
32:36.560 --> 32:38.560 | |
I had of the intellectual academic life | |
32:38.560 --> 32:40.560 | |
is belied by the reality | |
32:40.560 --> 32:42.560 | |
of it. The idea that we should | |
32:42.560 --> 32:44.560 | |
reach out beyond our discipline | |
32:44.560 --> 32:46.560 | |
and that is a positive good | |
32:46.560 --> 32:48.560 | |
is just so | |
32:48.560 --> 32:50.560 | |
rare in | |
32:50.560 --> 32:52.560 | |
universities that it may as well | |
32:52.560 --> 32:54.560 | |
not exist at all. But | |
32:54.560 --> 32:56.560 | |
that said, even though you're saying | |
32:56.560 --> 32:58.560 | |
you're doing it like the personal trainer | |
32:58.560 --> 33:00.560 | |
because you just can't help it, you're also | |
33:00.560 --> 33:02.560 | |
an inspiration to others. | |
33:02.560 --> 33:04.560 | |
Like I could speak for myself. | |
33:04.560 --> 33:06.560 | |
You know, | |
33:06.560 --> 33:08.560 | |
I also have a career I'm thinking about | |
33:08.560 --> 33:10.560 | |
right? And without | |
33:10.560 --> 33:12.560 | |
your podcast, I may | |
33:12.560 --> 33:14.560 | |
not have been doing this at all. | |
33:14.560 --> 33:16.560 | |
Right. So it | |
33:16.560 --> 33:18.560 | |
makes me realize that these kinds | |
33:18.560 --> 33:20.560 | |
of conversations are kind of what science is about. | |
33:20.560 --> 33:22.560 | |
In many | |
33:22.560 --> 33:24.560 | |
ways. The reason we write papers, | |
33:24.560 --> 33:26.560 | |
this exchange of ideas, | |
33:26.560 --> 33:28.560 | |
is much harder to do | |
33:28.560 --> 33:30.560 | |
through disciplinary papers, I would say. | |
33:30.560 --> 33:32.560 | |
Yeah. Right. | |
33:32.560 --> 33:34.560 | |
And conversations are easier. | |
33:34.560 --> 33:36.560 | |
So conversations are the beginning, | |
33:36.560 --> 33:38.560 | |
and in the field of AI | |
33:38.560 --> 33:40.560 | |
it's | |
33:40.560 --> 33:42.560 | |
obvious that we should think outside | |
33:42.560 --> 33:44.560 | |
of pure | |
33:44.560 --> 33:46.560 | |
computer vision competitions and in particular | |
33:46.560 --> 33:48.560 | |
data sets. We should think about the broader | |
33:48.560 --> 33:50.560 | |
impact of how this can be | |
33:50.560 --> 33:52.560 | |
you know, reaching | |
33:52.560 --> 33:54.560 | |
out to physics, to psychology | |
33:54.560 --> 33:56.560 | |
to neuroscience | |
33:56.560 --> 33:58.560 | |
and having these conversations. | |
33:58.560 --> 34:00.560 | |
So you're an inspiration | |
34:00.560 --> 34:02.560 | |
and so. Well, thank you very much. | |
34:02.560 --> 34:04.560 | |
You never know how the world | |
34:04.560 --> 34:06.560 | |
changes. I mean | |
34:06.560 --> 34:08.560 | |
the fact that this stuff is out there | |
34:08.560 --> 34:10.560 | |
and I've had | |
34:10.560 --> 34:12.560 | |
a huge number of people come up to me, | |
34:12.560 --> 34:14.560 | |
grad students, really loving the | |
34:14.560 --> 34:16.560 | |
podcast, inspired by it, and | |
34:16.560 --> 34:18.560 | |
there will probably be | |
34:18.560 --> 34:20.560 | |
ripple effects when they become faculty | |
34:20.560 --> 34:22.560 | |
and so on. So we can end | |
34:22.560 --> 34:24.560 | |
on a balance between pessimism | |
34:24.560 --> 34:26.560 | |
and optimism. And Sean, thank you so much | |
34:26.560 --> 34:28.560 | |
for talking. It was awesome. No, Lex, thank you very | |
34:28.560 --> 34:52.560 | |
much for this conversation. It was great. | |