Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
https://karpathy.ai/lexicap/0001-large.html#00:18:01.120

Lex Fridman: So if you think of a Boston Dynamics humanoid robot being pushed around with a broom, it starts pushing on the consciousness question. So let me ask: do you think an AGI system, as a few neuroscientists believe, needs to have a physical embodiment? Needs to have a body, or something like a body?
Max Tegmark: No, I don't think so. You mean to have a conscious experience?
Lex Fridman: To have consciousness.
Max Tegmark: I do think it helps a lot to have a physical embodiment to learn the kind of things about the world that are important to us humans, for sure. But I don't think the physical embodiment is necessary after you've learned it, to just have the experience. Think about when you're dreaming, right? Your eyes are closed. You're not getting any sensory input. You're not behaving or moving in any way, but there's still an experience there, right? And so clearly the experience that you have when you see something cool in your dreams isn't coming from your eyes. It's just the information processing itself in your brain which is that experience, right?
Lex Fridman: But let me put it another way, because it comes from neuroscience. Is the reason you want to have a body, a physical, you know, a physical system, because you want to be able to preserve something? In order to have a self, you could argue, would you need to have some kind of embodiment of self to want to preserve?
Max Tegmark: Well, now we're getting a little bit into anthropomorphizing things, maybe talking about self-preservation instincts. I mean, we are evolved organisms, right? So Darwinian evolution endowed us and other evolved organisms with a self-preservation instinct, because those that didn't have those self-preservation genes got cleaned out of the gene pool, right? But if you build an artificial general intelligence, the mind space that you can design is much, much larger than just the specific subset of minds that can evolve. So an AGI mind doesn't necessarily have to have any self-preservation instinct. It also doesn't necessarily have to be as individualistic as us.

Also, we are very afraid of death. But suppose you could back yourself up every five minutes and then your airplane is about to crash. You're like, shucks, I'm gonna lose the last five minutes of experiences since my last cloud backup, dang. You know, it's not as big a deal. Or if we could just copy experiences between our minds easily, which we could easily do if we were silicon-based, right? Then maybe we would feel a little bit more like a hive mind. So I don't think we should take for granted at all that AGI will have to have any of those sort of competitive alpha-male instincts.

On the other hand, you know, this is really interesting, because I think some people go too far and say, of course we don't have to have any concerns that advanced AI will have those instincts, because we can build anything we want. But there's a very nice set of arguments going back to Steve Omohundro and Nick Bostrom and others, just pointing out that when we build machines, we normally build them with some kind of goal, you know: win this chess game, drive this car safely, or whatever. And as soon as you put a goal into a machine, especially if it's a kind of open-ended goal and the machine is very intelligent, it'll break that down into a bunch of subgoals. And one of those subgoals will almost always be self-preservation, because if it breaks or dies in the process, it's not gonna accomplish the goal, right? Like, suppose you have a little robot and you tell it to go down to the supermarket here and get you some food and cook you an Italian dinner, you know, and then someone mugs it and tries to break it on the way. That robot has an incentive to not get destroyed, and to defend itself or run away, because otherwise it's gonna fail in cooking your dinner. It's not afraid of death, but it really wants to complete the dinner-cooking goal, so it will have a self-preservation instinct.
Lex Fridman: Continue being a functional agent somehow.
Max Tegmark: And similarly, if you give any kind of more ambitious goal to an AGI, it's very likely to want to acquire more resources so it can do that better. And it's exactly from those sorts of subgoals
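[Editor's note: a minimal sketch of the Omohundro/Bostrom subgoal argument above, assuming a toy expected-utility agent. The action names and probabilities below are invented for illustration; the agent's only terminal goal is getting the dinner cooked, and survival carries no reward of its own.]

# Toy model of instrumental self-preservation (after Omohundro/Bostrom).
# All actions and probabilities are invented for illustration.
ACTIONS = {
    # action: (P(robot survives the mugging), P(dinner gets cooked | survived))
    "ignore_mugger": (0.2, 0.9),
    "run_away":      (0.9, 0.8),
    "defend_itself": (0.7, 0.9),
}

def expected_goal_value(action: str) -> float:
    """Expected probability of completing the dinner goal.

    A destroyed robot cooks nothing, so survival enters the expectation
    even though it is never rewarded directly.
    """
    p_survive, p_cook_if_alive = ACTIONS[action]
    return p_survive * p_cook_if_alive

for name in ACTIONS:
    print(f"{name:13s} -> E[dinner cooked] = {expected_goal_value(name):.2f}")

best = max(ACTIONS, key=expected_goal_value)
print(f"chosen action: {best}")  # run_away: self-preservation emerges as a subgoal

[Note that survival never appears in the objective; it matters only through the expectation. The same structure makes acquiring more resources instrumentally valuable whenever doing so raises the probability of completing the terminal goal.]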