Datasets:
Languages:
English
Multilinguality:
monolingual
Size Categories:
n<1K
Language Creators:
found
Source Datasets:
original
Tags:
karpathy,whisper,openai
WEBVTT | |
00:00.000 --> 00:03.160 | |
The following is a conversation with Vijay Kumar. | |
00:03.160 --> 00:05.800 | |
He's one of the top roboticists in the world, | |
00:05.800 --> 00:08.760 | |
a professor at the University of Pennsylvania, | |
00:08.760 --> 00:10.680 | |
the Dean of Penn Engineering, | |
00:10.680 --> 00:12.880 | |
former director of the GRASP Lab, | |
00:12.880 --> 00:15.320 | |
or the General Robotics, Automation, Sensing | |
00:15.320 --> 00:17.560 | |
and Perception Laboratory at Penn, | |
00:17.560 --> 00:22.560 | |
that was established back in 1979, that's 40 years ago. | |
00:22.600 --> 00:24.720 | |
Vijay is perhaps best known | |
00:24.720 --> 00:28.520 | |
for his work in multi-robot systems, robot swarms, | |
00:28.520 --> 00:30.880 | |
and micro aerial vehicles. | |
00:30.880 --> 00:34.040 | |
Robots that elegantly cooperate in flight | |
00:34.040 --> 00:36.200 | |
under all the uncertainty and challenges | |
00:36.200 --> 00:38.760 | |
that the real world conditions present. | |
00:38.760 --> 00:41.960 | |
This is the Artificial Intelligence Podcast. | |
00:41.960 --> 00:44.320 | |
If you enjoy it, subscribe on YouTube, | |
00:44.320 --> 00:46.080 | |
give it five stars on iTunes, | |
00:46.080 --> 00:47.560 | |
support it on Patreon, | |
00:47.560 --> 00:49.480 | |
or simply connect with me on Twitter | |
00:49.480 --> 00:53.280 | |
at Lex Fridman, spelled F-R-I-D-M-A-N. | |
00:53.280 --> 00:57.560 | |
And now here's my conversation with Vijay Kumar. | |
00:58.680 --> 01:01.080 | |
What is the first robot you've ever built | |
01:01.080 --> 01:02.840 | |
or were a part of building? | |
01:02.840 --> 01:04.760 | |
Way back when I was in graduate school, | |
01:04.760 --> 01:06.760 | |
I was part of a fairly big project | |
01:06.760 --> 01:11.760 | |
that involved building a very large hexapod. | |
01:12.000 --> 01:17.000 | |
This weighed close to 7,000 pounds. | |
01:17.520 --> 01:21.600 | |
And it was powered by hydraulic actuation, | |
01:21.600 --> 01:26.600 | |
or was actuated by hydraulics with 18 motors, | |
01:27.760 --> 01:32.760 | |
hydraulic motors, each controlled by an Intel 8085 processor | |
01:34.200 --> 01:36.680 | |
and an 8086 coprocessor. | |
01:38.160 --> 01:43.160 | |
And so imagine this huge monster that had 18 joints, | |
01:44.840 --> 01:47.000 | |
each controlled by an independent computer. | |
01:47.000 --> 01:48.560 | |
And there was a 19th computer | |
01:48.560 --> 01:50.160 | |
that actually did the coordination | |
01:50.160 --> 01:52.360 | |
between these 18 joints. | |
01:52.360 --> 01:53.760 | |
So as part of this project, | |
01:53.760 --> 01:57.960 | |
my thesis work was, | |
01:57.960 --> 02:01.080 | |
how do you coordinate the 18 legs? | |
02:02.120 --> 02:06.360 | |
And in particular, the pressures in the hydraulic cylinders | |
02:06.360 --> 02:09.240 | |
to get efficient locomotion. | |
02:09.240 --> 02:11.680 | |
It sounds like a giant mess. | |
02:11.680 --> 02:14.480 | |
So how difficult is it to make all the motors communicate? | |
02:14.480 --> 02:16.880 | |
Presumably you have to send signals | |
02:16.880 --> 02:18.720 | |
hundreds of times a second, or at least... | |
02:18.720 --> 02:22.800 | |
This was not my work, but the folks who worked on this | |
02:22.800 --> 02:24.200 | |
wrote what I believe to be | |
02:24.200 --> 02:26.640 | |
the first multiprocessor operating system. | |
02:26.640 --> 02:27.960 | |
This was in the 80s. | |
02:29.080 --> 02:32.240 | |
And you had to make sure that obviously messages | |
02:32.240 --> 02:34.640 | |
got across from one joint to another. | |
02:34.640 --> 02:37.960 | |
You have to remember the clock speeds on those computers | |
02:37.960 --> 02:39.640 | |
were about half a megahertz. | |
02:39.640 --> 02:40.480 | |
Right. | |
02:40.480 --> 02:42.200 | |
So the 80s. | |
02:42.200 --> 02:45.320 | |
So not to romanticize the notion, | |
02:45.320 --> 02:47.960 | |
but how did it make you feel to make, | |
02:47.960 --> 02:49.680 | |
to see that robot move? | |
02:51.040 --> 02:52.240 | |
It was amazing. | |
02:52.240 --> 02:54.160 | |
In hindsight, it looks like, well, | |
02:54.160 --> 02:57.240 | |
we built the thing which really should have been much smaller. | |
02:57.240 --> 02:59.080 | |
And of course, today's robots are much smaller. | |
02:59.080 --> 03:02.200 | |
You look at, you know, Boston Dynamics, | |
03:02.200 --> 03:04.720 | |
or Ghost Robotics, which spun off from Penn. | |
03:06.000 --> 03:08.640 | |
But back then, you were stuck | |
03:08.640 --> 03:11.120 | |
with the substrate you had, the compute you had, | |
03:11.120 --> 03:13.640 | |
so things were unnecessarily big. | |
03:13.640 --> 03:18.000 | |
But at the same time, and this is just human psychology, | |
03:18.000 --> 03:20.380 | |
somehow bigger means grander. | |
03:21.280 --> 03:23.600 | |
You know, people never have the same appreciation | |
03:23.600 --> 03:26.320 | |
for nanotechnology or nano devices | |
03:26.320 --> 03:30.120 | |
as they do for the space shuttle or the Boeing 747. | |
03:30.120 --> 03:32.720 | |
Yeah, you've actually done quite a good job | |
03:32.720 --> 03:35.960 | |
at illustrating that small is beautiful | |
03:35.960 --> 03:37.680 | |
in terms of robotics. | |
03:37.680 --> 03:42.520 | |
So, on that topic, what is the most beautiful | |
03:42.520 --> 03:46.200 | |
or elegant robot motion that you've ever seen? | |
03:46.200 --> 03:47.880 | |
Not to pick favorites or whatever, | |
03:47.880 --> 03:51.000 | |
but something that just inspires you that you remember. | |
03:51.000 --> 03:54.000 | |
Well, I think the thing that I'm most proud of | |
03:54.000 --> 03:57.200 | |
that my students have done is really think about | |
03:57.200 --> 04:00.360 | |
small UAVs that can maneuver in constrained spaces | |
04:00.360 --> 04:03.640 | |
and in particular, their ability to coordinate | |
04:03.640 --> 04:06.760 | |
with each other and form three dimensional patterns. | |
04:06.760 --> 04:08.920 | |
So once you can do that, | |
04:08.920 --> 04:13.920 | |
you can essentially create 3D objects in the sky | |
04:17.000 --> 04:19.800 | |
and you can deform these objects on the fly. | |
04:19.800 --> 04:23.520 | |
So in some sense, your toolbox of what you can create | |
04:23.520 --> 04:25.300 | |
has suddenly got enhanced. | |
04:27.400 --> 04:29.920 | |
And before that, we did the two dimensional version of this. | |
04:29.920 --> 04:33.760 | |
So we had ground robots forming patterns and so on. | |
04:33.760 --> 04:37.080 | |
So that was not as impressive, that was not as beautiful. | |
04:37.080 --> 04:40.480 | |
But if you do it in 3D, suspend it in midair | |
04:40.480 --> 04:43.640 | |
and you've got to go back to 2011 when we did this. | |
04:43.640 --> 04:45.960 | |
Now it's actually pretty standard to do these things | |
04:45.960 --> 04:49.800 | |
eight years later, but back then it was a big accomplishment. | |
04:49.800 --> 04:52.440 | |
So the distributed cooperation | |
04:52.440 --> 04:55.640 | |
is where beauty emerges in your eyes? | |
04:55.640 --> 04:57.960 | |
Well, I think beauty to an engineer is very different | |
04:57.960 --> 05:01.520 | |
from beauty to someone who's looking at robots | |
05:01.520 --> 05:03.400 | |
from the outside, if you will. | |
05:03.400 --> 05:07.920 | |
But what I meant there, so before we said that grand | |
05:07.920 --> 05:10.480 | |
is associated with size. | |
05:10.480 --> 05:13.640 | |
And another way of thinking about this | |
05:13.640 --> 05:16.480 | |
is just the physical shape and the idea | |
05:16.480 --> 05:18.320 | |
that you can create physical shapes in midair | |
05:18.320 --> 05:21.520 | |
and have them deform, that's beautiful. | |
05:21.520 --> 05:23.000 | |
But the individual components, | |
05:23.000 --> 05:24.840 | |
the agility is beautiful too, right? | |
05:24.840 --> 05:25.680 | |
That is true too. | |
05:25.680 --> 05:28.400 | |
So then how quickly can you actually manipulate | |
05:28.400 --> 05:29.560 | |
these three dimensional shapes | |
05:29.560 --> 05:31.200 | |
and the individual components? | |
05:31.200 --> 05:32.200 | |
Yes, you're right. | |
05:32.200 --> 05:36.760 | |
Oh, by the way, you said UAV, unmanned aerial vehicle. | |
05:36.760 --> 05:41.760 | |
What's a good term for drones, UAVs, quadcopters? | |
05:41.840 --> 05:44.520 | |
Is there a term that's being standardized? | |
05:44.520 --> 05:45.440 | |
I don't know if there is. | |
05:45.440 --> 05:47.880 | |
Everybody wants to use the word drones. | |
05:47.880 --> 05:49.760 | |
And I've often said that drones, to me, | |
05:49.760 --> 05:51.000 | |
is a pejorative word. | |
05:51.000 --> 05:53.960 | |
It signifies something that's dumb, | |
05:53.960 --> 05:56.320 | |
preprogrammed, that does one little thing, | |
05:56.320 --> 05:58.600 | |
and robots are anything but drones. | |
05:58.600 --> 06:00.680 | |
So I actually don't like that word, | |
06:00.680 --> 06:02.960 | |
but that's what everybody uses. | |
06:02.960 --> 06:04.880 | |
You could call it unpiloted. | |
06:04.880 --> 06:05.800 | |
Unpiloted. | |
06:05.800 --> 06:08.080 | |
But even unpiloted could be radio controlled, | |
06:08.080 --> 06:10.480 | |
could be remotely controlled in many different ways. | |
06:11.560 --> 06:12.680 | |
And I think the right word | |
06:12.680 --> 06:15.040 | |
is thinking about it as an aerial robot. | |
06:15.040 --> 06:19.080 | |
You also say agile, autonomous aerial robot, right? | |
06:19.080 --> 06:20.600 | |
Yeah, so agility is an attribute, | |
06:20.600 --> 06:22.200 | |
but they don't have to be. | |
06:23.080 --> 06:24.800 | |
So what biological system, | |
06:24.800 --> 06:26.880 | |
because you've also drawn a lot of inspiration | |
06:26.880 --> 06:28.640 | |
from them, I've seen bees and ants | |
06:28.640 --> 06:32.400 | |
that you've talked about, what living creatures | |
06:32.400 --> 06:35.280 | |
have you found to be most inspiring | |
06:35.280 --> 06:38.560 | |
as an engineer, instructive in your work in robotics? | |
06:38.560 --> 06:43.480 | |
To me, so ants are really quite incredible creatures, right? | |
06:43.480 --> 06:47.920 | |
So you, I mean, the individuals arguably are very simple | |
06:47.920 --> 06:52.400 | |
in how they're built, and yet they're incredibly resilient | |
06:52.400 --> 06:54.000 | |
as a population. | |
06:54.000 --> 06:56.800 | |
And as individuals, they're incredibly robust. | |
06:56.800 --> 07:01.800 | |
So, if you take an ant with six legs, you remove one leg, | |
07:02.080 --> 07:04.160 | |
it still works just fine. | |
07:04.160 --> 07:05.800 | |
And it moves along, | |
07:05.800 --> 07:08.800 | |
and I don't know that it even realizes it's lost a leg. | |
07:09.800 --> 07:13.480 | |
So that's the robustness at the individual ant level. | |
07:13.480 --> 07:15.400 | |
But then you look at this instinct | |
07:15.400 --> 07:17.760 | |
for self preservation of the colonies, | |
07:17.760 --> 07:20.440 | |
and they adapt in so many amazing ways, | |
07:20.440 --> 07:25.440 | |
you know, transcending gaps by just chaining themselves | |
07:27.680 --> 07:30.400 | |
together when you have a flood, | |
07:30.400 --> 07:33.160 | |
being able to recruit other teammates | |
07:33.160 --> 07:35.200 | |
to carry big morsels of food, | |
07:36.560 --> 07:38.240 | |
and then going out in different directions, | |
07:38.240 --> 07:39.560 | |
looking for food, | |
07:39.560 --> 07:43.040 | |
and then being able to demonstrate consensus, | |
07:43.880 --> 07:47.240 | |
even though they don't communicate directly | |
07:47.240 --> 07:50.400 | |
with each other the way we communicate with each other, | |
07:50.400 --> 07:53.840 | |
in some sense, they also know how to do democracy, | |
07:53.840 --> 07:55.480 | |
probably better than what we do. | |
07:55.480 --> 07:59.000 | |
Yeah, somehow, even democracy is emergent. | |
07:59.000 --> 08:00.640 | |
It seems like all of the phenomena | |
08:00.640 --> 08:02.480 | |
that we see is all emergent. | |
08:02.480 --> 08:05.600 | |
It seems like there's no centralized communicator. | |
08:05.600 --> 08:09.840 | |
There is, so I think a lot is made about that word, emergent, | |
08:09.840 --> 08:11.560 | |
and it means lots of things to different people, | |
08:11.560 --> 08:12.600 | |
but you're absolutely right. | |
08:12.600 --> 08:17.600 | |
I think as an engineer, you think about what element, | |
08:17.600 --> 08:22.600 | |
elemental behaviors, what primitives you could synthesize | |
08:22.720 --> 08:26.640 | |
so that the whole looks incredibly powerful, | |
08:26.640 --> 08:27.920 | |
incredibly synergistic, | |
08:27.920 --> 08:30.960 | |
the whole definitely being greater than the sum of the parts, | |
08:30.960 --> 08:33.760 | |
and ants are living proof of that. | |
08:33.760 --> 08:36.280 | |
So when you see these beautiful swarms | |
08:36.280 --> 08:38.800 | |
where there's biological systems of robots, | |
08:39.920 --> 08:41.560 | |
do you sometimes think of them | |
08:41.560 --> 08:45.920 | |
as a single individual living intelligent organism? | |
08:45.920 --> 08:49.400 | |
So it's the same as thinking of our human civilization | |
08:49.400 --> 08:52.920 | |
as one organism, or do you still, as an engineer, | |
08:52.920 --> 08:54.560 | |
think about the individual components | |
08:54.560 --> 08:55.840 | |
and all the engineering that went into | |
08:55.840 --> 08:57.280 | |
the individual components? | |
08:57.280 --> 08:58.600 | |
Well, that's very interesting. | |
08:58.600 --> 09:01.440 | |
So again, philosophically, as engineers, | |
09:01.440 --> 09:06.440 | |
what we want to do is to go beyond the individual components, | |
09:06.800 --> 09:10.240 | |
the individual units, and think about it as a unit, | |
09:10.240 --> 09:11.480 | |
as a cohesive unit, | |
09:11.480 --> 09:14.240 | |
without worrying about the individual components. | |
09:14.240 --> 09:19.240 | |
If you start obsessing about the individual building blocks | |
09:19.680 --> 09:24.680 | |
and what they do, you inevitably will find it hard | |
09:26.400 --> 09:27.920 | |
to scale up. | |
09:27.920 --> 09:28.960 | |
Just mathematically, | |
09:28.960 --> 09:31.560 | |
just think about individual things you want to model, | |
09:31.560 --> 09:34.000 | |
and if you want to have 10 of those, | |
09:34.000 --> 09:36.400 | |
then you essentially are taking Cartesian products | |
09:36.400 --> 09:39.280 | |
of 10 things, and that makes it really complicated. | |
09:36.400 --> 09:39.280 | |
Then to do any kind of synthesis or design | |
09:39.280 --> 09:41.800 | |
in that high-dimensional space is really hard. | |
09:44.160 --> 09:45.840 | |
So the right way to do this | |
09:45.840 --> 09:49.080 | |
is to think about the individuals in a clever way | |
09:49.080 --> 09:51.160 | |
so that at the higher level, | |
09:51.160 --> 09:53.400 | |
when you look at lots and lots of them, | |
09:53.400 --> 09:55.320 | |
abstractly, you can think of them | |
09:55.320 --> 09:57.120 | |
in some low dimensional space. | |
09:57.120 --> 09:58.680 | |
So what does that involve? | |
09:58.680 --> 10:02.160 | |
For the individual, you have to try to make | |
10:02.160 --> 10:05.160 | |
the way they see the world as local as possible, | |
10:05.160 --> 10:06.440 | |
and the other thing, | |
10:06.440 --> 10:09.560 | |
do you just have to make them robust to collisions? | |
10:09.560 --> 10:10.880 | |
Like you said, with the ants, | |
10:10.880 --> 10:15.320 | |
if something fails, the whole swarm doesn't fail. | |
10:15.320 --> 10:17.760 | |
Right, I think as engineers, we do this. | |
10:17.760 --> 10:18.840 | |
I mean, you know, think about, | |
10:18.840 --> 10:21.280 | |
we build planes or we build iPhones, | |
10:22.240 --> 10:26.280 | |
and we know that by taking individual components, | |
10:26.280 --> 10:27.600 | |
well engineered components, | |
10:27.600 --> 10:30.080 | |
with well specified interfaces | |
10:30.080 --> 10:31.680 | |
that behave in a predictable way, | |
10:31.680 --> 10:33.560 | |
you can build complex systems. | |
10:34.400 --> 10:36.840 | |
So that's ingrained, I would claim, | |
10:36.840 --> 10:39.400 | |
in most engineers' thinking, | |
10:39.400 --> 10:41.600 | |
and it's true for computer scientists as well. | |
10:41.600 --> 10:44.760 | |
I think what's different here is that you want | |
10:44.760 --> 10:49.480 | |
the individuals to be robust in some sense, | |
10:49.480 --> 10:52.000 | |
as we do in these other settings, | |
10:52.000 --> 10:54.480 | |
but you also want some degree of resiliency | |
10:54.480 --> 10:56.320 | |
for the population. | |
10:56.320 --> 10:58.720 | |
And so you really want them to be able | |
10:58.720 --> 11:03.720 | |
to reestablish communication with their neighbors. | |
11:03.840 --> 11:07.320 | |
You want them to rethink their strategy | |
11:07.320 --> 11:09.040 | |
for group behavior. | |
11:09.040 --> 11:11.000 | |
You want them to reorganize. | |
11:12.440 --> 11:16.120 | |
And that's where I think a lot of the challenges lie. | |
11:16.120 --> 11:18.400 | |
So just at a high level, | |
11:18.400 --> 11:21.120 | |
what does it take for a bunch of, | |
11:22.400 --> 11:23.560 | |
what should we call them, | |
11:23.560 --> 11:26.920 | |
flying robots to create a formation? | |
11:26.920 --> 11:30.400 | |
Just for people who are not familiar with robotics | |
11:30.400 --> 11:33.000 | |
in general, how much information is needed? | |
11:33.000 --> 11:36.040 | |
How do you even make it happen | |
11:36.040 --> 11:39.720 | |
without a centralized controller? | |
11:39.720 --> 11:41.320 | |
So I mean, there are a couple of different ways | |
11:41.320 --> 11:43.400 | |
of looking at this. | |
11:43.400 --> 11:48.400 | |
If you are a purist, you think of it as a way | |
11:50.040 --> 11:52.160 | |
of recreating what nature does. | |
11:53.800 --> 11:58.680 | |
So nature forms groups for several reasons, | |
11:58.680 --> 12:02.200 | |
but mostly it's because of this instinct | |
12:02.200 --> 12:07.200 | |
that organisms have of preserving their colonies, | |
12:07.280 --> 12:11.200 | |
their population, which means what? | |
12:11.200 --> 12:14.640 | |
You need shelter, you need food, you need to procreate, | |
12:14.640 --> 12:16.480 | |
and that's basically it. | |
12:16.480 --> 12:20.120 | |
So the kinds of interactions you see are all organic. | |
12:20.120 --> 12:21.320 | |
They're all local. | |
12:22.320 --> 12:25.760 | |
And the only information that they share, | |
12:25.760 --> 12:27.800 | |
and mostly it's indirectly, | |
12:27.800 --> 12:31.040 | |
is to again preserve the herd or the flock | |
12:31.040 --> 12:36.040 | |
or the swarm and either by looking for new sources of food | |
12:39.400 --> 12:41.240 | |
or looking for new shelters, right? | |
12:42.960 --> 12:47.200 | |
As engineers, when we build swarms, we have a mission. | |
12:48.280 --> 12:50.760 | |
And when you think of a mission, | |
12:52.080 --> 12:54.360 | |
and it involves mobility, | |
12:54.360 --> 12:56.840 | |
most often it's described in some kind | |
12:56.840 --> 12:58.800 | |
of a global coordinate system. | |
12:58.800 --> 13:03.080 | |
As a human, as an operator, as a commander, | |
13:03.080 --> 13:07.120 | |
or as a collaborator, I have my coordinate system | |
13:07.120 --> 13:10.160 | |
and I want the robots to be consistent with that. | |
13:11.120 --> 13:14.720 | |
So I might think of it slightly differently. | |
13:14.720 --> 13:18.960 | |
I might want the robots to recognize that coordinate system, | |
13:18.960 --> 13:21.360 | |
which means not only do they have to think locally | |
13:21.360 --> 13:23.160 | |
in terms of who their immediate neighbors are, | |
13:23.160 --> 13:24.640 | |
but they have to be cognizant | |
13:24.640 --> 13:28.320 | |
of what the global environment looks like. | |
13:28.320 --> 13:31.080 | |
So if I go, if I say surround this building | |
13:31.080 --> 13:33.280 | |
and protect this from intruders, | |
13:33.280 --> 13:35.160 | |
well, they're immediately in a building | |
13:35.160 --> 13:36.520 | |
centered coordinate system | |
13:36.520 --> 13:38.720 | |
and I have to tell them where the building is. | |
13:38.720 --> 13:40.080 | |
And they're globally collaborating | |
13:40.080 --> 13:41.360 | |
on the map of that building. | |
13:41.360 --> 13:44.240 | |
They're maintaining some kind of global, | |
13:44.240 --> 13:45.560 | |
not just in the frame of the building, | |
13:45.560 --> 13:49.040 | |
but there's information that's ultimately being built up | |
13:49.040 --> 13:53.320 | |
explicitly as opposed to kind of implicitly, | |
13:53.320 --> 13:54.400 | |
like nature might. | |
13:54.400 --> 13:55.240 | |
Correct, correct. | |
13:55.240 --> 13:57.720 | |
So in some sense, nature is very, very sophisticated, | |
13:57.720 --> 14:00.480 | |
but the tasks that nature solves | |
14:00.480 --> 14:03.040 | |
or needs to solve are very different | |
14:03.040 --> 14:05.160 | |
from the kind of engineered tasks, | |
14:05.160 --> 14:09.800 | |
artificial tasks that we are forced to address. | |
14:09.800 --> 14:12.560 | |
And again, there's nothing preventing us | |
14:12.560 --> 14:15.200 | |
from solving these other problems, | |
14:15.200 --> 14:16.640 | |
but ultimately it's about impact. | |
14:16.640 --> 14:19.400 | |
You want these swarms to do something useful. | |
14:19.400 --> 14:24.400 | |
And so you're kind of driven into this very unnatural, | |
14:24.400 --> 14:27.840 | |
if you will, unnatural meaning, not like how nature does, | |
14:27.840 --> 14:29.000 | |
setting. | |
14:29.000 --> 14:31.720 | |
And it's probably a little bit more expensive | |
14:31.720 --> 14:33.560 | |
to do it the way nature does, | |
14:33.560 --> 14:38.560 | |
because nature is less sensitive to the loss of the individual | |
14:39.280 --> 14:42.080 | |
and cost wise in robotics, | |
14:42.080 --> 14:45.280 | |
I think you're more sensitive to losing individuals. | |
14:45.280 --> 14:48.800 | |
I think that's true, although if you look at the price | |
14:48.800 --> 14:51.320 | |
to performance ratio of robotic components, | |
14:51.320 --> 14:53.640 | |
it's coming down dramatically. | |
14:53.640 --> 14:54.480 | |
I'm interested. | |
14:54.480 --> 14:56.040 | |
Right, it continues to come down. | |
14:56.040 --> 14:58.920 | |
So I think we're asymptotically approaching the point | |
14:58.920 --> 14:59.960 | |
where we would get, yeah, | |
14:59.960 --> 15:05.080 | |
the cost of individuals would really become insignificant. | |
15:05.080 --> 15:07.600 | |
So let's step back at a high level view, | |
15:07.600 --> 15:11.680 | |
the impossible question of what kind of, | |
15:11.680 --> 15:14.400 | |
as an overview, what kind of autonomous flying vehicles | |
15:14.400 --> 15:16.200 | |
are there in general? | |
15:16.200 --> 15:19.720 | |
I think the ones that receive a lot of notoriety | |
15:19.720 --> 15:22.560 | |
are obviously the military vehicles. | |
15:22.560 --> 15:26.280 | |
Military vehicles are controlled by a base station, | |
15:26.280 --> 15:29.640 | |
but have a lot of human supervision, | |
15:29.640 --> 15:31.800 | |
but have limited autonomy, | |
15:31.800 --> 15:34.760 | |
which is the ability to go from point A to point B, | |
15:34.760 --> 15:38.320 | |
and even the more sophisticated vehicles | |
15:38.320 --> 15:41.760 | |
can do autonomous takeoff and landing. | |
15:41.760 --> 15:44.400 | |
And those usually have wings and they're heavy? | |
15:44.400 --> 15:45.360 | |
Usually there are wings, | |
15:45.360 --> 15:47.440 | |
but there's nothing preventing us from doing this | |
15:47.440 --> 15:49.000 | |
for helicopters as well. | |
15:49.000 --> 15:53.440 | |
There are many military organizations that have | |
15:53.440 --> 15:56.560 | |
autonomous helicopters in the same vein. | |
15:56.560 --> 16:00.080 | |
And by the way, you look at autopilots and airplanes, | |
16:00.080 --> 16:02.800 | |
and it's actually very similar. | |
16:02.800 --> 16:07.160 | |
In fact, one interesting question we can ask is, | |
16:07.160 --> 16:12.120 | |
if you look at all the air safety violations, | |
16:12.120 --> 16:14.080 | |
all the crashes that occurred, | |
16:14.080 --> 16:16.880 | |
would they have happened if the plane | |
16:16.880 --> 16:20.200 | |
were truly autonomous, and I think you'll find | |
16:20.200 --> 16:21.960 | |
that in many of the cases, | |
16:21.960 --> 16:24.600 | |
because of pilot error, we made silly decisions. | |
16:24.600 --> 16:26.920 | |
And so in some sense, even in air traffic, | |
16:26.920 --> 16:29.760 | |
commercial air traffic, there's a lot of applications, | |
16:29.760 --> 16:33.920 | |
although we only see autonomy being enabled | |
16:33.920 --> 16:38.920 | |
at very high altitudes, when the plane is on autopilot. | |
16:41.160 --> 16:42.520 | |
There's still a role for the human, | |
16:42.520 --> 16:47.520 | |
and that kind of autonomy is, you're kind of implying, | |
16:47.640 --> 16:48.680 | |
I don't know what the right word is, | |
16:48.680 --> 16:52.600 | |
but it's a little dumber than it could be. | |
16:53.480 --> 16:55.720 | |
Right, so in the lab, of course, | |
16:55.720 --> 16:59.200 | |
we can afford to be a lot more aggressive. | |
16:59.200 --> 17:04.200 | |
And the question we try to ask is, | |
17:04.600 --> 17:09.600 | |
can we make robots that will be able to make decisions | |
17:09.600 --> 17:13.680 | |
without any kind of external infrastructure? | |
17:13.680 --> 17:14.880 | |
So what does that mean? | |
17:14.880 --> 17:16.960 | |
So the most common piece of infrastructure | |
17:16.960 --> 17:19.640 | |
that airplanes use today is GPS. | |
17:20.560 --> 17:25.160 | |
GPS is also the most brittle form of information. | |
17:26.680 --> 17:30.480 | |
If you've driven in a city and tried to use GPS navigation, | |
17:30.480 --> 17:32.760 | |
tall buildings, you immediately lose GPS. | |
17:33.720 --> 17:36.320 | |
And so that's not a very sophisticated way | |
17:36.320 --> 17:37.880 | |
of building autonomy. | |
17:37.880 --> 17:39.560 | |
I think the second piece of infrastructure | |
17:39.560 --> 17:41.920 | |
that I rely on is communications. | |
17:41.920 --> 17:46.200 | |
Again, it's very easy to jam communications. | |
17:47.400 --> 17:49.680 | |
In fact, if you use Wi-Fi, | |
17:49.680 --> 17:51.880 | |
you know that Wi-Fi signals drop out, | |
17:51.880 --> 17:53.560 | |
cell signals drop out. | |
17:53.560 --> 17:56.840 | |
So to rely on something like that is not good. | |
17:58.600 --> 18:01.240 | |
The third form of infrastructure we use, | |
18:01.240 --> 18:02.960 | |
and I hate to call it infrastructure, | |
18:02.960 --> 18:06.400 | |
but it is that in the sense of robots: it's people. | |
18:06.400 --> 18:08.760 | |
So you could rely on somebody to pilot you. | |
18:08.760 --> 18:09.960 | |
Right. | |
18:09.960 --> 18:11.600 | |
And so the question you wanna ask is | |
18:11.600 --> 18:13.400 | |
if there are no pilots, | |
18:13.400 --> 18:16.200 | |
if there's no communications with any base station, | |
18:16.200 --> 18:18.720 | |
if there's no knowledge of position, | |
18:18.720 --> 18:21.640 | |
and if there's no a priori map, | |
18:21.640 --> 18:24.880 | |
a priori knowledge of what the environment looks like, | |
18:24.880 --> 18:28.280 | |
a priori model of what might happen in the future. | |
18:28.280 --> 18:29.560 | |
Can robots navigate? | |
18:29.560 --> 18:31.440 | |
So that is true autonomy. | |
18:31.440 --> 18:33.240 | |
So that's true autonomy. | |
18:33.240 --> 18:35.040 | |
And we're talking about, you mentioned | |
18:35.040 --> 18:36.880 | |
like military applications and drones. | |
18:36.880 --> 18:38.280 | |
Okay, so what else is there? | |
18:38.280 --> 18:43.280 | |
You talk about agile autonomous flying robots, aerial robots. | |
18:43.480 --> 18:46.320 | |
So that's a different kind of, it's not winged, | |
18:46.320 --> 18:48.120 | |
it's not big, at least it's small. | |
18:48.120 --> 18:50.800 | |
So I use the word agility mostly, | |
18:50.800 --> 18:53.480 | |
or at least we're motivated to do agile robots, | |
18:53.480 --> 18:57.960 | |
mostly because robots can operate | |
18:57.960 --> 19:01.120 | |
and should be operating in constrained environments. | |
19:02.120 --> 19:06.960 | |
And if you want to operate the way a Global Hawk operates, | |
19:06.960 --> 19:09.120 | |
I mean, the kinds of conditions in which you operate | |
19:09.120 --> 19:10.760 | |
are very, very restrictive. | |
19:11.760 --> 19:13.720 | |
If you wanna go inside a building, | |
19:13.720 --> 19:15.600 | |
for example, for search and rescue, | |
19:15.600 --> 19:18.120 | |
or to locate an active shooter, | |
19:18.120 --> 19:22.120 | |
or you wanna navigate under the canopy in an orchard | |
19:22.120 --> 19:23.880 | |
to look at health of plants, | |
19:23.880 --> 19:28.880 | |
or to count fruits to measure the tree trunks. | |
19:31.240 --> 19:33.240 | |
These are things we do, by the way. | |
19:33.240 --> 19:35.920 | |
Yeah, some cool agriculture stuff you've shown in the past, | |
19:35.920 --> 19:36.760 | |
it's really awesome. | |
19:36.760 --> 19:40.400 | |
So in those kinds of settings, you do need that agility. | |
19:40.400 --> 19:42.560 | |
Agility does not necessarily mean | |
19:42.560 --> 19:45.440 | |
you break records for the 100-meter dash. | |
19:45.440 --> 19:48.040 | |
What it really means is you see the unexpected | |
19:48.040 --> 19:51.520 | |
and you're able to maneuver in a safe way, | |
19:51.520 --> 19:55.440 | |
and in a way that gets you the most information | |
19:55.440 --> 19:57.720 | |
about the thing you're trying to do. | |
19:57.720 --> 20:00.520 | |
By the way, you may be the only person | |
20:00.520 --> 20:04.280 | |
who in a TED talk has used a math equation, | |
20:04.280 --> 20:07.720 | |
which is amazing, people should go see one of your TED talks. | |
20:07.720 --> 20:08.840 | |
Actually, it's very interesting | |
20:08.840 --> 20:13.560 | |
because the TED curator, Chris Anderson, told me, | |
20:13.560 --> 20:15.400 | |
you can't show math. | |
20:15.400 --> 20:18.240 | |
And I thought about it, but that's who I am. | |
20:18.240 --> 20:20.800 | |
I mean, that's our work. | |
20:20.800 --> 20:25.800 | |
And so I felt compelled to give the audience a taste | |
20:25.800 --> 20:27.680 | |
for at least some math. | |
20:27.680 --> 20:32.680 | |
So on that point, simply, what does it take | |
20:32.680 --> 20:37.120 | |
to make a thing with four motors fly, a quadcopter, | |
20:37.120 --> 20:40.400 | |
one of these little flying robots? | |
20:41.560 --> 20:43.800 | |
How hard is it to make it fly? | |
20:43.800 --> 20:46.360 | |
How do you coordinate the four motors? | |
20:47.360 --> 20:52.360 | |
How do you convert those motors into actual movement? | |
20:52.400 --> 20:54.600 | |
So this is an interesting question. | |
20:54.600 --> 20:57.840 | |
We've been trying to do this since 2000. | |
20:57.840 --> 21:00.360 | |
It is a commentary on the sensors | |
21:00.360 --> 21:01.880 | |
that were available back then, | |
21:01.880 --> 21:04.320 | |
and the computers that were available back then. | |
21:05.640 --> 21:08.080 | |
And a number of things happened | |
21:08.080 --> 21:10.320 | |
between 2000 and 2007. | |
21:11.640 --> 21:15.560 | |
One is the advances in computing, which is, | |
21:15.560 --> 21:16.840 | |
so we all know about Moore's Law, | |
21:16.840 --> 21:19.760 | |
but I think 2007 was a tipping point, | |
21:19.760 --> 21:22.800 | |
the year of the iPhone, the year of the cloud. | |
21:22.800 --> 21:24.720 | |
Lots of things happened in 2007. | |
21:25.680 --> 21:27.640 | |
But going back even further, | |
21:27.640 --> 21:31.440 | |
inertial measurement units as a sensor really matured. | |
21:31.440 --> 21:33.040 | |
Again, lots of reasons for that. | |
21:34.000 --> 21:35.480 | |
Certainly there's a lot of federal funding, | |
21:35.480 --> 21:37.440 | |
particularly DARPA in the US, | |
21:38.360 --> 21:42.840 | |
but they didn't anticipate this boom in IMUs. | |
21:43.800 --> 21:46.080 | |
But if you look subsequently, | |
21:46.080 --> 21:49.000 | |
what happened is that every car manufacturer | |
21:49.000 --> 21:50.120 | |
had to put an airbag in, | |
21:50.120 --> 21:52.720 | |
which meant you had to have an accelerometer on board. | |
21:52.720 --> 21:54.080 | |
And so that drove down the price | |
21:54.080 --> 21:56.280 | |
to performance ratio of the sensors. | |
21:56.280 --> 21:57.960 | |
I should know this, that's very interesting. | |
21:57.960 --> 21:59.480 | |
It's very interesting, the connection there. | |
21:59.480 --> 22:03.160 | |
And that's why it's very hard to predict the outcomes of research. | |
22:04.920 --> 22:07.760 | |
And again, the federal government spent a ton of money | |
22:07.760 --> 22:12.360 | |
on things that they thought were useful for resonators, | |
22:12.360 --> 22:16.920 | |
but it ended up enabling these small UAVs, which is great, | |
22:16.920 --> 22:18.600 | |
because I could have never raised that much money | |
22:18.600 --> 22:20.800 | |
and sold this project, | |
22:20.800 --> 22:22.280 | |
hey, we want to build these small UAVs. | |
22:22.280 --> 22:25.520 | |
Can you actually fund the development of low cost IMUs? | |
22:25.520 --> 22:27.720 | |
So why do you need an IMU on a UAV? | |
22:27.720 --> 22:30.440 | |
So I'll come back to that, | |
22:30.440 --> 22:33.400 | |
but so in 2007, 2008, we were able to build these. | |
22:33.400 --> 22:35.280 | |
And then the question you're asking was a good one, | |
22:35.280 --> 22:40.280 | |
how do you coordinate the motors to develop this? | |
22:40.320 --> 22:43.920 | |
But over the last 10 years, everything is commoditized. | |
22:43.920 --> 22:47.920 | |
A high school kid today can pick up a Raspberry Pi kit | |
22:49.520 --> 22:50.600 | |
and build this, | |
22:50.600 --> 22:53.240 | |
all the low-level functionality is automated. | |
22:53.240 --> 22:58.240 | |
But basically at some level, you have to drive the motors | |
22:59.160 --> 23:03.660 | |
at the right RPMs, the right velocity, | |
23:04.560 --> 23:07.480 | |
in order to generate the right amount of thrust | |
23:07.480 --> 23:09.960 | |
in order to position it and orient it | |
23:09.960 --> 23:12.840 | |
in a way that you need to in order to fly. | |
23:13.800 --> 23:16.680 | |
The feedback that you get is from onboard sensors | |
23:16.680 --> 23:18.400 | |
and the IMU is an important part of it. | |
23:18.400 --> 23:23.400 | |
The IMU tells you what the acceleration is | |
23:23.840 --> 23:26.400 | |
as well as what the angular velocity is. | |
23:26.400 --> 23:29.200 | |
And those are important pieces of information. | |
23:30.440 --> 23:34.200 | |
In addition to that, you need some kind of local position | |
23:34.200 --> 23:36.480 | |
or velocity information. | |
23:37.440 --> 23:39.320 | |
For example, when we walk, | |
23:39.320 --> 23:41.520 | |
we implicitly have this information | |
23:41.520 --> 23:45.800 | |
because we kind of know what our stride length is. | |
23:45.800 --> 23:50.800 | |
We're also looking at images flying past our retina, | |
23:51.440 --> 23:54.240 | |
if you will, and so we can estimate velocity. | |
23:54.240 --> 23:56.280 | |
We also have accelerometers in our head | |
23:56.280 --> 23:59.120 | |
and we're able to integrate all these pieces of information | |
23:59.120 --> 24:02.320 | |
to determine where we are as we walk. | |
24:02.320 --> 24:04.320 | |
And so robots have to do something very similar. | |
24:04.320 --> 24:08.160 | |
You need an IMU, you need some kind of a camera | |
24:08.160 --> 24:11.600 | |
or other sensor that's measuring velocity. | |
24:11.600 --> 24:15.800 | |
And then you need some kind of a global reference frame | |
24:15.800 --> 24:19.480 | |
if you really want to think about doing something | |
24:19.480 --> 24:21.280 | |
in a world coordinate system. | |
24:21.280 --> 24:23.680 | |
And so how do you estimate your position | |
24:23.680 --> 24:25.160 | |
with respect to that global reference frame? | |
24:25.160 --> 24:26.560 | |
That's important as well. | |
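The estimation pipeline he describes, integrating IMU readings and correcting them with separate velocity and position measurements, can be illustrated with a toy dead-reckoning sketch. The numbers are hypothetical and noise-free; a real system fuses these sources (for example in a Kalman filter) precisely because pure integration drifts:

```python
# Dead-reckon position from accelerometer samples by double integration.
# Toy 1-D, noise-free example: real IMUs drift, which is why the camera
# and global reference frame mentioned above are needed as corrections.
def integrate_imu(accels, dt, v0=0.0, x0=0.0):
    v, x = v0, x0
    for a in accels:
        v += a * dt   # acceleration -> velocity
        x += v * dt   # velocity -> position
    return v, x

# Constant 1 m/s^2 for 1 second, sampled at 100 Hz.
v, x = integrate_imu([1.0] * 100, dt=0.01)
```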
24:26.560 --> 24:29.520 | |
So coordinating the RPMs of the four motors | |
24:29.520 --> 24:32.680 | |
is what allows you to, first of all, fly and hover | |
24:32.680 --> 24:35.640 | |
and then you can change the orientation | |
24:35.640 --> 24:37.640 | |
and the velocity and so on. | |
24:37.640 --> 24:38.480 | |
Exactly, exactly. | |
24:38.480 --> 24:40.360 | |
So there's a bunch of degrees of freedom | |
24:40.360 --> 24:42.240 | |
or there's six degrees of freedom | |
24:42.240 --> 24:44.960 | |
but you only have four inputs, the four motors. | |
24:44.960 --> 24:49.960 | |
And it turns out to be a remarkably versatile configuration. | |
24:50.960 --> 24:53.120 | |
You think at first, well, I only have four motors, | |
24:53.120 --> 24:55.040 | |
how do I go sideways? | |
24:55.040 --> 24:56.360 | |
But it's not too hard to say, well, | |
24:56.360 --> 24:59.200 | |
if I tilt myself, I can go sideways. | |
24:59.200 --> 25:01.200 | |
And then you have four motors pointing up, | |
25:01.200 --> 25:05.400 | |
how do I rotate in place about a vertical axis? | |
25:05.400 --> 25:07.840 | |
Well, you rotate them at different speeds | |
25:07.840 --> 25:09.760 | |
and that generates reaction moments | |
25:09.760 --> 25:11.560 | |
and that allows you to turn. | |
25:11.560 --> 25:13.400 | |
So it's actually a pretty, | |
25:13.400 --> 25:17.040 | |
it's an optimal configuration from an engineering standpoint. | |
25:17.960 --> 25:22.960 | |
It's very simple, very cleverly done and very versatile. | |
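The coordination just described, four rotor speeds mapping to one collective thrust and three body moments, can be sketched as a mixer. The coefficients `kf`, `km`, and `L` below are illustrative placeholders, not values from the conversation:

```python
# Map four rotor speeds (rad/s) to collective thrust and body moments.
# kf, km and L are illustrative placeholders, not measured values.
kf = 1e-5  # thrust coefficient: N per (rad/s)^2
km = 1e-7  # yaw-drag coefficient: N*m per (rad/s)^2
L = 0.1    # arm length in meters ("+" configuration)

def mix(w0, w1, w2, w3):
    """Rotors 0/2 spin one way, 1/3 the other: differential speeds give
    roll and pitch moments, net rotor drag gives the yaw moment."""
    f = [kf * w * w for w in (w0, w1, w2, w3)]  # per-rotor thrusts
    thrust = sum(f)                 # total upward force
    roll = L * (f[3] - f[1])        # moment about body x
    pitch = L * (f[0] - f[2])       # moment about body y
    yaw = km * (w0**2 - w1**2 + w2**2 - w3**2)
    return thrust, roll, pitch, yaw

# Equal speeds: pure thrust, zero moments -> hover when thrust = weight.
t, r, p, y = mix(400.0, 400.0, 400.0, 400.0)
# Speeding rotor 3 up and rotor 1 down adds a roll moment (tilt -> go sideways).
_, r2, _, _ = mix(400.0, 390.0, 400.0, 410.0)
```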
25:23.800 --> 25:26.520 | |
So if you could step back to a time, | |
25:27.320 --> 25:30.120 | |
so I've always known flying robots as, | |
25:31.120 --> 25:35.840 | |
to me it was natural that the quadcopters should fly. | |
25:35.840 --> 25:38.040 | |
But when you first started working with it, | |
25:38.040 --> 25:42.040 | |
how surprised are you that you can make, | |
25:42.040 --> 25:45.560 | |
do so much with the four motors? | |
25:45.560 --> 25:47.640 | |
How surprising is that you can make this thing fly, | |
25:47.640 --> 25:49.800 | |
first of all, that you can make it hover, | |
25:49.800 --> 25:52.040 | |
then you can add control to it? | |
25:52.920 --> 25:55.800 | |
Firstly, the four motor configuration | |
25:55.800 --> 26:00.120 | |
is not ours; it has at least a hundred year history. | |
26:01.080 --> 26:02.480 | |
And various people, | |
26:02.480 --> 26:06.280 | |
various people tried to get quadrotors to fly | |
26:06.280 --> 26:08.120 | |
without much success. | |
26:09.240 --> 26:11.560 | |
As I said, we've been working on this since 2000. | |
26:11.560 --> 26:13.360 | |
Our first designs were, | |
26:13.360 --> 26:15.160 | |
well, this is way too complicated. | |
26:15.160 --> 26:19.160 | |
Why don't we try to build an omnidirectional flying robot? | |
26:19.160 --> 26:22.760 | |
So our early designs, we had eight rotors. | |
26:22.760 --> 26:26.080 | |
And so these eight rotors were arranged uniformly | |
26:27.520 --> 26:28.880 | |
on a sphere, if you will. | |
26:28.880 --> 26:31.360 | |
So you can imagine a symmetric configuration | |
26:31.360 --> 26:34.160 | |
and so you should be able to fly anywhere. | |
26:34.160 --> 26:36.280 | |
But the real challenge we had is the strength | |
26:36.280 --> 26:37.880 | |
to weight ratio is not enough, | |
26:37.880 --> 26:41.240 | |
and of course we didn't have the sensors and so on. | |
26:41.240 --> 26:43.840 | |
So everybody knew, or at least the people | |
26:43.840 --> 26:45.680 | |
who worked with rotor crafts knew, | |
26:45.680 --> 26:47.320 | |
four rotors would get it done. | |
26:48.280 --> 26:50.200 | |
So that was not our idea. | |
26:50.200 --> 26:53.480 | |
But it took a while before we could actually do | |
26:53.480 --> 26:56.520 | |
the onboard sensing and the computation | |
26:56.520 --> 27:00.400 | |
that was needed for the kinds of agile maneuvering | |
27:00.400 --> 27:03.800 | |
that we wanted to do in our little aerial robots. | |
27:03.800 --> 27:08.320 | |
And that only happened between 2007 and 2009 in our lab. | |
27:08.320 --> 27:10.680 | |
Yeah, and you have to send the signal | |
27:10.680 --> 27:13.200 | |
maybe a hundred times a second. | |
27:13.200 --> 27:15.400 | |
So for the compute there, everything | |
27:15.400 --> 27:16.720 | |
has to come down in price. | |
27:16.720 --> 27:21.720 | |
And what are the steps of getting from point A to point B? | |
27:22.320 --> 27:25.840 | |
So we just talked about like local control, | |
27:25.840 --> 27:30.840 | |
but for all the kind of cool dancing in the air | |
27:30.840 --> 27:34.480 | |
that I've seen you show, how do you make it happen? | |
27:34.480 --> 27:38.840 | |
Make a trajectory, first of all, okay, | |
27:38.840 --> 27:41.600 | |
figure out a trajectory, so plan a trajectory, | |
27:41.600 --> 27:44.320 | |
and then how do you make that trajectory happen? | |
27:44.320 --> 27:47.320 | |
I think planning is a very fundamental problem in robotics. | |
27:47.320 --> 27:50.120 | |
I think 10 years ago it was an esoteric thing, | |
27:50.120 --> 27:52.360 | |
but today with self driving cars, | |
27:52.360 --> 27:55.160 | |
everybody can understand this basic idea | |
27:55.160 --> 27:57.320 | |
that a car sees a whole bunch of things | |
27:57.320 --> 27:59.720 | |
and it has to keep a lane or maybe make a right turn | |
27:59.720 --> 28:02.160 | |
or switch lanes, it has to plan a trajectory, | |
28:02.160 --> 28:04.320 | |
it has to be safe, it has to be efficient. | |
28:04.320 --> 28:06.120 | |
So everybody's familiar with that. | |
28:06.120 --> 28:07.400 | |
That's kind of the first step | |
28:07.400 --> 28:12.400 | |
that you have to think about when you say autonomy. | |
28:14.320 --> 28:18.600 | |
And so for us, it's about finding smooth motions, | |
28:18.600 --> 28:20.760 | |
motions that are safe. | |
28:20.760 --> 28:22.360 | |
So we think about these two things. | |
28:22.360 --> 28:24.160 | |
One is optimality, one is safety. | |
28:24.160 --> 28:26.720 | |
Clearly you cannot compromise safety. | |
28:26.720 --> 28:30.160 | |
So you're looking for safe, optimal motions. | |
28:30.160 --> 28:33.160 | |
The other thing you have to think about | |
28:33.160 --> 28:37.360 | |
is can you actually compute a reasonable trajectory | |
28:37.360 --> 28:41.560 | |
in a small amount of time, because you have a time budget. | |
28:41.560 --> 28:44.360 | |
So the optimal becomes suboptimal, | |
28:44.360 --> 28:49.360 | |
but in our lab we focus on synthesizing smooth trajectories | |
28:50.560 --> 28:52.360 | |
that satisfy all the constraints. | |
28:52.360 --> 28:57.200 | |
In other words, trajectories that don't violate any safety constraints | |
28:57.200 --> 29:02.200 | |
and are as efficient as possible. | |
29:02.200 --> 29:04.600 | |
And when I say efficient, it could mean | |
29:04.600 --> 29:07.760 | |
I want to get from point A to point B as quickly as possible | |
29:07.760 --> 29:11.200 | |
or I want to get to it as gracefully as possible | |
29:11.200 --> 29:15.360 | |
or I want to consume as little energy as possible. | |
29:15.360 --> 29:17.600 | |
But always staying within the safety constraints. | |
29:17.600 --> 29:22.440 | |
But yes, always finding a safe trajectory. | |
29:22.440 --> 29:24.440 | |
So there's a lot of excitement and progress | |
29:24.440 --> 29:26.440 | |
in the field of machine learning. | |
29:26.440 --> 29:31.440 | |
And reinforcement learning and the neural network variant | |
29:31.440 --> 29:33.440 | |
of that with deep reinforcement learning. | |
29:33.440 --> 29:37.440 | |
Do you see a role of machine learning in... | |
29:37.440 --> 29:40.040 | |
So a lot of the success with flying robots | |
29:40.040 --> 29:41.840 | |
did not rely on machine learning, | |
29:41.840 --> 29:44.440 | |
except for maybe a little bit of the perception | |
29:44.440 --> 29:46.040 | |
on the computer vision side. | |
29:46.040 --> 29:48.040 | |
On the control side and the planning, | |
29:48.040 --> 29:50.040 | |
do you see there's a role in the future | |
29:50.040 --> 29:51.040 | |
for machine learning? | |
29:51.040 --> 29:53.040 | |
So let me disagree a little bit with you. | |
29:53.040 --> 29:56.040 | |
I think we perhaps never explicitly | |
29:56.040 --> 29:59.040 | |
called out learning in my work, but even this very simple idea | |
29:59.040 --> 30:04.040 | |
of being able to fly through a constrained space. | |
30:04.040 --> 30:07.040 | |
The first time you try it, you'll invariably... | |
30:07.040 --> 30:10.040 | |
You might get it wrong if the task is challenging. | |
30:10.040 --> 30:14.040 | |
And the reason is to get it perfectly right, | |
30:14.040 --> 30:17.040 | |
you have to model everything in the environment. | |
30:17.040 --> 30:22.040 | |
And flying is notoriously hard to model. | |
30:22.040 --> 30:28.040 | |
There are aerodynamic effects that we constantly discover, | |
30:28.040 --> 30:31.040 | |
even just before I was talking to you, | |
30:31.040 --> 30:37.040 | |
I was talking to a student about how blades flap when they fly. | |
30:37.040 --> 30:43.040 | |
And that ends up changing how a rotorcraft | |
30:43.040 --> 30:46.040 | |
is accelerated in the angular direction. | |
30:46.040 --> 30:48.040 | |
Does it use like microflaps or something? | |
30:48.040 --> 30:49.040 | |
It's not microflaps. | |
30:49.040 --> 30:52.040 | |
We assume that each blade is rigid, | |
30:52.040 --> 30:54.040 | |
but actually it flaps a little bit. | |
30:54.040 --> 30:55.040 | |
It bends. | |
30:55.040 --> 30:56.040 | |
Interesting, yeah. | |
30:56.040 --> 30:58.040 | |
And so the models rely on the fact, | |
30:58.040 --> 31:01.040 | |
on the assumption that they're actually rigid. | |
31:01.040 --> 31:02.040 | |
But that's not true. | |
31:02.040 --> 31:04.040 | |
If you're flying really quickly, | |
31:04.040 --> 31:07.040 | |
these effects become significant. | |
31:07.040 --> 31:09.040 | |
If you're flying close to the ground, | |
31:09.040 --> 31:12.040 | |
you get pushed off by the ground. | |
31:12.040 --> 31:15.040 | |
Something which every pilot knows when he tries to land | |
31:15.040 --> 31:19.040 | |
or she tries to land, this is called a ground effect. | |
31:19.040 --> 31:21.040 | |
Something very few pilots think about | |
31:21.040 --> 31:23.040 | |
is what happens when you go close to a ceiling, | |
31:23.040 --> 31:25.040 | |
or you get sucked into a ceiling. | |
31:25.040 --> 31:29.040 | |
There are very few aircraft that fly close to any kind of ceiling. | |
31:29.040 --> 31:33.040 | |
Likewise, when you go close to a wall, | |
31:33.040 --> 31:36.040 | |
there are these wall effects. | |
31:36.040 --> 31:39.040 | |
And if you've gone on a train and you pass another train | |
31:39.040 --> 31:41.040 | |
that's traveling the opposite direction, | |
31:41.040 --> 31:43.040 | |
you can feel the buffeting. | |
31:43.040 --> 31:46.040 | |
And so these kinds of microclimates | |
31:46.040 --> 31:48.040 | |
affect our UAVs significantly. | |
31:48.040 --> 31:51.040 | |
And they're impossible to model, essentially. | |
31:51.040 --> 31:53.040 | |
I wouldn't say they're impossible to model, | |
31:53.040 --> 31:55.040 | |
but the level of sophistication you would need | |
31:55.040 --> 32:00.040 | |
in the model and the software would be tremendous. | |
32:00.040 --> 32:03.040 | |
Plus, to get everything right would be awfully tedious. | |
32:03.040 --> 32:05.040 | |
So the way we do this is over time, | |
32:05.040 --> 32:10.040 | |
we figure out how to adapt to these conditions. | |
32:10.040 --> 32:13.040 | |
So early on, we use the form of learning | |
32:13.040 --> 32:15.040 | |
that we call iterative learning. | |
32:15.040 --> 32:18.040 | |
So this idea, if you want to perform a task, | |
32:18.040 --> 32:23.040 | |
there are a few things that you need to change | |
32:23.040 --> 32:25.040 | |
and iterate over a few parameters | |
32:25.040 --> 32:29.040 | |
that over time you can figure out. | |
32:29.040 --> 32:34.040 | |
So I could call it policy gradient reinforcement learning, | |
32:34.040 --> 32:36.040 | |
but actually it was just iterative learning. | |
32:36.040 --> 32:38.040 | |
And so this was there way back. | |
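The iterative learning idea can be sketched with a toy example: run the task, measure the tracking error, and fold a fraction of it back into the next trial's command. The plant gain and disturbance below are made up for illustration:

```python
# Toy iterative learning: repeat the same task, and add a fraction of
# the last trial's error to the feedforward command for the next trial.
def run_trial(u):
    gain, disturbance = 0.8, -0.3   # unknown to the controller
    return gain * u + disturbance

def ilc(reference, trials=20, learning_rate=0.5):
    u = 0.0
    for _ in range(trials):
        y = run_trial(u)
        u += learning_rate * (reference - y)  # correct the next trial
    return run_trial(u)

# After 20 repetitions the output converges to the reference.
y_final = ilc(reference=1.0)
```

Each iteration shrinks the error by a constant factor (here 0.6), so a handful of trials is enough; no model of the gain or disturbance was needed.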
32:38.040 --> 32:40.040 | |
I think what's interesting is, | |
32:40.040 --> 32:43.040 | |
if you look at autonomous vehicles today, | |
32:43.040 --> 32:46.040 | |
learning occurs, could occur in two pieces. | |
32:46.040 --> 32:48.040 | |
One is perception, understanding the world. | |
32:48.040 --> 32:51.040 | |
Second is action, taking actions. | |
32:51.040 --> 32:54.040 | |
Everything that I've seen that is successful | |
32:54.040 --> 32:56.040 | |
is on the perception side of things. | |
32:56.040 --> 32:59.040 | |
So in computer vision, we've made amazing strides | |
32:59.040 --> 33:00.040 | |
in the last 10 years. | |
33:00.040 --> 33:03.040 | |
So recognizing objects, actually detecting objects, | |
33:03.040 --> 33:08.040 | |
classifying them and tagging them in some sense, | |
33:08.040 --> 33:11.040 | |
annotating them, this is all done through machine learning. | |
33:11.040 --> 33:13.040 | |
On the action side, on the other hand, | |
33:13.040 --> 33:17.040 | |
I don't know of any examples where there are fielded systems | |
33:17.040 --> 33:21.040 | |
where we actually learn the right behavior. | |
33:21.040 --> 33:23.040 | |
Outside of a single demonstration that is successful. | |
33:23.040 --> 33:25.040 | |
In the laboratory, this is the Holy Grail. | |
33:25.040 --> 33:27.040 | |
Can you do end to end learning? | |
33:27.040 --> 33:31.040 | |
Can you go from pixels to motor currents? | |
33:31.040 --> 33:33.040 | |
This is really, really hard. | |
33:33.040 --> 33:36.040 | |
And I think if you go forward, | |
33:36.040 --> 33:38.040 | |
the right way to think about these things | |
33:38.040 --> 33:43.040 | |
is data driven approaches, learning based approaches, | |
33:43.040 --> 33:46.040 | |
in concert with model based approaches, | |
33:46.040 --> 33:48.040 | |
which is the traditional way of doing things. | |
33:48.040 --> 33:50.040 | |
So I think there's a piece, | |
33:50.040 --> 33:52.040 | |
there's a role for each of these methodologies. | |
33:52.040 --> 33:55.040 | |
So what do you think, just jumping out on topics, | |
33:55.040 --> 33:57.040 | |
since you mentioned autonomous vehicles, | |
33:57.040 --> 33:59.040 | |
what do you think are the limits on the perception side? | |
33:59.040 --> 34:02.040 | |
So I've talked to Elon Musk, | |
34:02.040 --> 34:04.040 | |
and there on the perception side, | |
34:04.040 --> 34:07.040 | |
they're using primarily computer vision | |
34:07.040 --> 34:09.040 | |
to perceive the environment. | |
34:09.040 --> 34:13.040 | |
In your work with, because you work with the real world a lot, | |
34:13.040 --> 34:15.040 | |
and the physical world, | |
34:15.040 --> 34:17.040 | |
what are the limits of computer vision? | |
34:17.040 --> 34:20.040 | |
Do you think you can solve autonomous vehicles, | |
34:20.040 --> 34:22.040 | |
focusing on the perception side, | |
34:22.040 --> 34:25.040 | |
focusing on vision alone and machine learning? | |
34:25.040 --> 34:29.040 | |
So we also have a spin off company, Exyn Technologies, | |
34:29.040 --> 34:32.040 | |
that works underground in mines. | |
34:32.040 --> 34:35.040 | |
So you go into mines, they're dark. | |
34:35.040 --> 34:37.040 | |
They're dirty. | |
34:37.040 --> 34:39.040 | |
You fly in a dirty area, | |
34:39.040 --> 34:42.040 | |
there's stuff you kick up by the propellers, | |
34:42.040 --> 34:44.040 | |
the downwash kicks up dust. | |
34:44.040 --> 34:48.040 | |
I challenge you to get a computer vision algorithm to work there. | |
34:48.040 --> 34:53.040 | |
So we use LIDARS in that setting. | |
34:53.040 --> 34:57.040 | |
Indoors, and even outdoors when we fly through fields, | |
34:57.040 --> 34:59.040 | |
I think there's a lot of potential | |
34:59.040 --> 35:03.040 | |
for just solving the problem using computer vision alone. | |
35:03.040 --> 35:05.040 | |
But I think the bigger question is, | |
35:05.040 --> 35:08.040 | |
can you actually solve, | |
35:08.040 --> 35:11.040 | |
or can you actually identify all the corner cases | |
35:11.040 --> 35:14.040 | |
using a single sensing modality | |
35:14.040 --> 35:16.040 | |
and using learning alone? | |
35:16.040 --> 35:18.040 | |
What's your intuition there? | |
35:18.040 --> 35:20.040 | |
So look, if you have a corner case | |
35:20.040 --> 35:22.040 | |
and your algorithm doesn't work, | |
35:22.040 --> 35:25.040 | |
your instinct is to go get data about the corner case | |
35:25.040 --> 35:29.040 | |
and patch it up, learn how to deal with that corner case. | |
35:29.040 --> 35:32.040 | |
But at some point, | |
35:32.040 --> 35:36.040 | |
this is going to saturate, this approach is not viable. | |
35:36.040 --> 35:39.040 | |
So today, computer vision algorithms | |
35:39.040 --> 35:43.040 | |
can detect objects 90% of the time, | |
35:43.040 --> 35:45.040 | |
classify them 90% of the time. | |
35:45.040 --> 35:49.040 | |
Cats on the internet probably can do 95%, I don't know. | |
35:49.040 --> 35:54.040 | |
But to get from 90% to 99%, you need a lot more data. | |
35:54.040 --> 35:56.040 | |
And then I tell you, well, that's not enough | |
35:56.040 --> 35:58.040 | |
because I have a safety critical application | |
35:58.040 --> 36:01.040 | |
that wants to go from 99% to 99.9%, | |
36:01.040 --> 36:03.040 | |
well, that's even more data. | |
36:03.040 --> 36:07.040 | |
So I think if you look at | |
36:07.040 --> 36:11.040 | |
the accuracy you want on the x axis | |
36:11.040 --> 36:15.040 | |
and look at the amount of data on the y axis, | |
36:15.040 --> 36:18.040 | |
I believe that curve is an exponential curve. | |
36:18.040 --> 36:21.040 | |
Wow, okay, it's even hard if it's linear. | |
36:21.040 --> 36:24.040 | |
It's hard if it's linear, totally, but I think it's exponential. | |
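His claim can be illustrated with a toy model in which each additional "nine" of accuracy multiplies the data requirement by a constant factor. Both the base sample count and the factor of 10 are hypothetical, chosen only to show the shape of the curve:

```python
import math

# Toy model of the claim: if each extra "nine" of accuracy costs a
# constant multiplicative factor in data, the curve is exponential.
def data_needed(accuracy, base_samples=1e4, factor=10.0):
    nines = -math.log10(1.0 - accuracy)  # 0.9 -> 1, 0.99 -> 2, 0.999 -> 3
    return base_samples * factor ** (nines - 1)

d90 = data_needed(0.90)
d99 = data_needed(0.99)
d999 = data_needed(0.999)
```

Under this model, going from 90% to 99% costs as much additional data as the entire effort so far, and each further nine repeats that multiplication.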
36:24.040 --> 36:26.040 | |
And the other thing you have to think about | |
36:26.040 --> 36:31.040 | |
is that this process is a very, very power hungry process | |
36:31.040 --> 36:34.040 | |
to run data farms or servers. | |
36:34.040 --> 36:36.040 | |
Power, do you mean literally power? | |
36:36.040 --> 36:38.040 | |
Literally power, literally power. | |
36:38.040 --> 36:43.040 | |
So in 2014, five years ago, and I don't have more recent data, | |
36:43.040 --> 36:50.040 | |
2% of US electricity consumption was from data farms. | |
36:50.040 --> 36:54.040 | |
So we think about this as an information science | |
36:54.040 --> 36:56.040 | |
and information processing problem. | |
36:56.040 --> 36:59.040 | |
Actually, it is an energy processing problem. | |
36:59.040 --> 37:02.040 | |
And so unless we've figured out better ways of doing this, | |
37:02.040 --> 37:04.040 | |
I don't think this is viable. | |
37:04.040 --> 37:08.040 | |
So talking about driving, which is a safety critical application | |
37:08.040 --> 37:11.040 | |
and some aspect of the flight is safety critical, | |
37:11.040 --> 37:14.040 | |
maybe philosophical question, maybe an engineering one. | |
37:14.040 --> 37:16.040 | |
What problem do you think is harder to solve? | |
37:16.040 --> 37:19.040 | |
Autonomous driving or autonomous flight? | |
37:19.040 --> 37:21.040 | |
That's a really interesting question. | |
37:21.040 --> 37:26.040 | |
I think autonomous flight has several advantages | |
37:26.040 --> 37:30.040 | |
that autonomous driving doesn't have. | |
37:30.040 --> 37:33.040 | |
So look, if I want to go from point A to point B, | |
37:33.040 --> 37:35.040 | |
I have a very, very safe trajectory. | |
37:35.040 --> 37:38.040 | |
Go vertically up to a maximum altitude, | |
37:38.040 --> 37:41.040 | |
fly horizontally to just about the destination | |
37:41.040 --> 37:43.040 | |
and then come down vertically. | |
37:43.040 --> 37:46.040 | |
This is preprogrammed. | |
37:46.040 --> 37:49.040 | |
The equivalent of that is very hard to find | |
37:49.040 --> 37:53.040 | |
in a self driving car world because you're on the ground, | |
37:53.040 --> 37:55.040 | |
you're on a two dimensional surface, | |
37:55.040 --> 37:58.040 | |
and the trajectories on the two dimensional surface | |
37:58.040 --> 38:01.040 | |
are more likely to encounter obstacles. | |
38:01.040 --> 38:03.040 | |
I mean this in an intuitive sense, | |
38:03.040 --> 38:05.040 | |
but mathematically true, that's... | |
38:05.040 --> 38:08.040 | |
Mathematically as well, that's true. | |
38:08.040 --> 38:11.040 | |
There's another option in the 2D space of platooning, | |
38:11.040 --> 38:13.040 | |
or because there's so many obstacles, | |
38:13.040 --> 38:15.040 | |
you can connect with those obstacles | |
38:15.040 --> 38:16.040 | |
and all these kinds of problems. | |
38:16.040 --> 38:18.040 | |
But those exist in the three dimensional space as well. | |
38:18.040 --> 38:19.040 | |
So they do. | |
38:19.040 --> 38:23.040 | |
So the question also implies how difficult are obstacles | |
38:23.040 --> 38:25.040 | |
in the three dimensional space in flight? | |
38:25.040 --> 38:27.040 | |
So that's the downside. | |
38:27.040 --> 38:29.040 | |
I think in three dimensional space, | |
38:29.040 --> 38:31.040 | |
you're modeling three dimensional world, | |
38:31.040 --> 38:33.040 | |
not just because you want to avoid it, | |
38:33.040 --> 38:35.040 | |
but you want to reason about it | |
38:35.040 --> 38:37.040 | |
and you want to work in that three dimensional environment. | |
38:37.040 --> 38:39.040 | |
And that's significantly harder. | |
38:39.040 --> 38:41.040 | |
So that's one disadvantage. | |
38:41.040 --> 38:43.040 | |
I think the second disadvantage is of course, | |
38:43.040 --> 38:45.040 | |
anytime you fly, you have to put up | |
38:45.040 --> 38:49.040 | |
with the peculiarities of aerodynamics | |
38:49.040 --> 38:51.040 | |
in complicated environments. | |
38:51.040 --> 38:52.040 | |
How do you negotiate that? | |
38:52.040 --> 38:54.040 | |
So that's always a problem. | |
38:54.040 --> 38:57.040 | |
Do you see a time in the future where there is... | |
38:57.040 --> 39:00.040 | |
You mentioned there's agriculture applications. | |
39:00.040 --> 39:03.040 | |
So there's a lot of applications of flying robots. | |
39:03.040 --> 39:07.040 | |
But do you see a time in the future where there is tens of thousands | |
39:07.040 --> 39:10.040 | |
or maybe hundreds of thousands of delivery drones | |
39:10.040 --> 39:14.040 | |
that fill the sky, a delivery of flying robots? | |
39:14.040 --> 39:18.040 | |
I think there's a lot of potential for the last mile delivery. | |
39:18.040 --> 39:21.040 | |
And so in crowded cities, | |
39:21.040 --> 39:24.040 | |
I don't know if you go to a place like Hong Kong, | |
39:24.040 --> 39:27.040 | |
just crossing the river can take half an hour. | |
39:27.040 --> 39:32.040 | |
And while a drone can just do it in five minutes at most. | |
39:32.040 --> 39:38.040 | |
I think you look at delivery of supplies to remote villages. | |
39:38.040 --> 39:41.040 | |
I work with a nonprofit called WeRobotics. | |
39:41.040 --> 39:43.040 | |
So they work in the Peruvian Amazon, | |
39:43.040 --> 39:47.040 | |
where the only highways are rivers. | |
39:47.040 --> 39:49.040 | |
And to get from point A to point B | |
39:49.040 --> 39:52.040 | |
may take five hours. | |
39:52.040 --> 39:56.040 | |
While with a drone, you can get there in 30 minutes. | |
39:56.040 --> 39:59.040 | |
So just delivering drugs, | |
39:59.040 --> 40:04.040 | |
retrieving samples for testing vaccines. | |
40:04.040 --> 40:06.040 | |
I think there's huge potential here. | |
40:06.040 --> 40:09.040 | |
So I think the challenges are not technological. | |
40:09.040 --> 40:12.040 | |
The challenge is economic. | |
40:12.040 --> 40:16.040 | |
The one thing I'll tell you that nobody thinks about | |
40:16.040 --> 40:21.040 | |
is the fact that we've not made huge strides in battery technology. | |
40:21.040 --> 40:22.040 | |
Yes, it's true. | |
40:22.040 --> 40:24.040 | |
Batteries are becoming less expensive | |
40:24.040 --> 40:27.040 | |
because we have these mega factories that are coming up. | |
40:27.040 --> 40:29.040 | |
But they're all based on lithium based technologies. | |
40:29.040 --> 40:34.040 | |
And if you look at the energy density and the power density, | |
40:34.040 --> 40:39.040 | |
those are two fundamentally limiting numbers. | |
40:39.040 --> 40:41.040 | |
So power density is important because for a UAV | |
40:41.040 --> 40:43.040 | |
to take off vertically into the air, | |
40:43.040 --> 40:47.040 | |
which most drones do, they don't have a runway, | |
40:47.040 --> 40:52.040 | |
you consume roughly 200 watts per kilo at the small size. | |
40:52.040 --> 40:54.040 | |
That's a lot. | |
40:54.040 --> 40:58.040 | |
In contrast, the human brain consumes less than 80 watts, | |
40:58.040 --> 41:00.040 | |
the whole of the human brain. | |
41:00.040 --> 41:04.040 | |
So just imagine just lifting yourself into the air | |
41:04.040 --> 41:08.040 | |
is like two or three light bulbs, which makes no sense to me. | |
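The arithmetic behind that comparison, using the roughly 200 watts per kilo he quotes for vertical takeoff at small scale (larger vehicles do somewhat better, so the human-lifting figure is only indicative):

```python
# Back-of-the-envelope hover power from the ~200 W per kilo figure
# quoted for small rotorcraft; purely illustrative scaling.
def hover_power_watts(mass_kg, specific_power_w_per_kg=200.0):
    return mass_kg * specific_power_w_per_kg

drone = hover_power_watts(0.5)    # a 500 g quadrotor: about 100 W
person = hover_power_watts(75.0)  # lifting a 75 kg human: about 15 kW
```

At 15 kW just to stay airborne, battery energy and power density become the binding constraints he describes.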
41:08.040 --> 41:12.040 | |
Yeah, so you're going to have to at scale solve the energy problem | |
41:12.040 --> 41:19.040 | |
then charging the batteries, storing the energy and so on. | |
41:19.040 --> 41:21.040 | |
And then the storage is the second problem. | |
41:21.040 --> 41:23.040 | |
But storage limits the range. | |
41:23.040 --> 41:30.040 | |
But you have to remember that you have to burn a lot of it | |
41:30.040 --> 41:32.040 | |
for a given time. | |
41:32.040 --> 41:33.040 | |
So the burning is another problem. | |
41:33.040 --> 41:35.040 | |
Which is a power question. | |
41:35.040 --> 41:36.040 | |
Yes. | |
41:36.040 --> 41:39.040 | |
And do you think just your intuition, | |
41:39.040 --> 41:45.040 | |
there are breakthroughs in batteries on the horizon? | |
41:45.040 --> 41:47.040 | |
How hard is that problem? | |
41:47.040 --> 41:52.040 | |
Look, there are a lot of companies that are promising flying cars, | |
41:52.040 --> 42:00.040 | |
that are autonomous, and that are clean. | |
42:00.040 --> 42:02.040 | |
I think they're over promising. | |
42:02.040 --> 42:05.040 | |
The autonomy piece is doable. | |
42:05.040 --> 42:08.040 | |
The clean piece, I don't think so. | |
42:08.040 --> 42:12.040 | |
There's another company that I work with called Jetoptera. | |
42:12.040 --> 42:16.040 | |
They make small jet engines. | |
42:16.040 --> 42:20.040 | |
And they can get up to 50 miles an hour very easily and lift 50 kilos. | |
42:20.040 --> 42:22.040 | |
But they're jet engines. | |
42:22.040 --> 42:24.040 | |
They're efficient. | |
42:24.040 --> 42:26.040 | |
They're a little louder than electric vehicles. | |
42:26.040 --> 42:29.040 | |
But they can build flying cars. | |
42:29.040 --> 42:33.040 | |
So your sense is that there's a lot of pieces that have come together. | |
42:33.040 --> 42:39.040 | |
So on this crazy question, if you look at companies like Kitty Hawk, | |
42:39.040 --> 42:45.040 | |
working on electric, so the clean piece that they're pushing through. | |
42:45.040 --> 42:52.040 | |
It's a crazy dream, but you work with flight a lot. | |
42:52.040 --> 42:58.040 | |
You've mentioned before that manned flights or carrying a human body | |
42:58.040 --> 43:01.040 | |
is very difficult to do. | |
43:01.040 --> 43:04.040 | |
So how crazy is flying cars? | |
43:04.040 --> 43:11.040 | |
Do you think there will be a day when we have vertical takeoff and landing vehicles | |
43:11.040 --> 43:17.040 | |
that are sufficiently affordable that we're going to see a huge amount of them? | |
43:17.040 --> 43:21.040 | |
And they would look like something like we dream of when we think about flying cars. | |
43:21.040 --> 43:23.040 | |
Yeah, like the Jetsons. | |
43:23.040 --> 43:26.040 | |
So look, there are a lot of smart people working on this. | |
43:26.040 --> 43:32.040 | |
And you never say something is not possible when you're people like Sebastian Thrun working on it. | |
43:32.040 --> 43:35.040 | |
So I totally think it's viable. | |
43:35.040 --> 43:38.040 | |
I question, again, the electric piece. | |
43:38.040 --> 43:40.040 | |
The electric piece, yeah. | |
43:40.040 --> 43:42.040 | |
For short distances, you can do it. | |
43:42.040 --> 43:46.040 | |
And there's no reason to suggest that these all just have to be rotorcraft. | |
43:46.040 --> 43:50.040 | |
You take off vertically, but then you morph into forward flight. | |
43:50.040 --> 43:52.040 | |
I think there are a lot of interesting designs. | |
43:52.040 --> 43:56.040 | |
The question to me is, are these economically viable? | |
43:56.040 --> 44:02.040 | |
And if you agree to do this with fossil fuels, it instantly immediately becomes viable. | |
44:02.040 --> 44:04.040 | |
That's a real challenge. | |
44:04.040 --> 44:09.040 | |
Do you think it's possible for robots and humans to collaborate successfully on tasks? | |
44:09.040 --> 44:18.040 | |
So a lot of robotics folks that I talk to and work with say humans just add a giant mess to the picture. | |
44:18.040 --> 44:22.040 | |
So it's best to remove them from consideration when solving specific tasks. | |
44:22.040 --> 44:24.040 | |
It's very difficult to model. | |
44:24.040 --> 44:26.040 | |
They're just a source of uncertainty. | |
44:26.040 --> 44:36.040 | |
In your work with these agile flying robots, do you think there's a role for collaboration with humans? | |
44:36.040 --> 44:43.040 | |
Is it best to model tasks in a way that doesn't have a human in the picture? | |
44:43.040 --> 44:48.040 | |
I don't think we should ever think about robots without humans in the picture. | |
44:48.040 --> 44:54.040 | |
Ultimately, robots are there because we want them to solve problems for humans. | |
44:54.040 --> 44:58.040 | |
But there's no general solution to this problem. | |
44:58.040 --> 45:02.040 | |
I think if you look at human interaction and how humans interact with robots, | |
45:02.040 --> 45:06.040 | |
you know, we think of these in sort of three different ways. | |
45:06.040 --> 45:09.040 | |
One is the human commanding the robot. | |
45:09.040 --> 45:13.040 | |
The second is the human collaborating with the robot. | |
45:13.040 --> 45:19.040 | |
So for example, we work on how a robot can actually pick up things with a human and carry things. | |
45:19.040 --> 45:21.040 | |
That's like true collaboration. | |
45:21.040 --> 45:26.040 | |
And third, we think about humans as bystanders, self driving cars. | |
45:26.040 --> 45:33.040 | |
What's the human's role and how do self driving cars acknowledge the presence of humans? | |
45:33.040 --> 45:36.040 | |
So I think all of these things are different scenarios. | |
45:36.040 --> 45:39.040 | |
It depends on what kind of humans, what kind of tasks. | |
45:39.040 --> 45:45.040 | |
And I think it's very difficult to say that there's a general theory that we all have for this. | |
45:45.040 --> 45:52.040 | |
But at the same time, it's also silly to say that we should think about robots independent of humans. | |
45:52.040 --> 45:59.040 | |
So to me, human robot interaction is almost a mandatory aspect of everything we do. | |
45:59.040 --> 46:00.040 | |
Yes. | |
46:00.040 --> 46:05.040 | |
But to what degree? So, your thoughts: if we jump to autonomous vehicles, for example, | |
46:05.040 --> 46:10.040 | |
there's a big debate between what's called level two and level four. | |
46:10.040 --> 46:13.040 | |
So semi autonomous and autonomous vehicles. | |
46:13.040 --> 46:19.040 | |
And sort of the Tesla approach currently at least has a lot of collaboration between human and machine. | |
46:19.040 --> 46:24.040 | |
So the human is supposed to actively supervise the operation of the robot. | |
46:24.040 --> 46:33.040 | |
Part of the safety definition of how safe a robot is in that case is how effective is the human in monitoring it. | |
46:33.040 --> 46:43.040 | |
Do you think that's ultimately not a good approach in sort of having a human in the picture, | |
46:43.040 --> 46:51.040 | |
not as a bystander or part of the infrastructure, but really as part of what's required to make the system safe? | |
46:51.040 --> 46:53.040 | |
This is harder than it sounds. | |
46:53.040 --> 47:01.040 | |
I think, you know, I'm sure you've driven before on highways and so on, | |
47:01.040 --> 47:10.040 | |
it's really very hard to relinquish controls to a machine and then take over when needed. | |
47:10.040 --> 47:18.040 | |
So I think Tesla's approach is interesting because it allows you to periodically establish some kind of contact with the car. | |
47:18.040 --> 47:24.040 | |
Toyota, on the other hand, is thinking about shared autonomy or collaborative autonomy as a paradigm. | |
47:24.040 --> 47:31.040 | |
If I may argue, these are very, very simple ways of human robot collaboration because the task is pretty boring. | |
47:31.040 --> 47:34.040 | |
You sit in a vehicle, you go from point A to point B. | |
47:34.040 --> 47:42.040 | |
I think the more interesting thing to me is, for example, search and rescue: I've got human first responders, robot first responders. | |
47:42.040 --> 47:44.040 | |
I got to do something. | |
47:44.040 --> 47:45.040 | |
It's important. | |
47:45.040 --> 47:47.040 | |
I have to do it in two minutes. | |
47:47.040 --> 47:48.040 | |
The building is burning. | |
47:48.040 --> 47:50.040 | |
There's been an explosion. | |
47:50.040 --> 47:51.040 | |
It's collapsed. | |
47:51.040 --> 47:52.040 | |
How do I do it? | |
47:52.040 --> 47:58.040 | |
I think to me, those are the interesting things where it's very, very unstructured and what's the role of the human? | |
47:58.040 --> 47:59.040 | |
What's the role of the robot? | |
47:59.040 --> 48:02.040 | |
Clearly, there's lots of interesting challenges. | |
48:02.040 --> 48:05.040 | |
As a field, I think we're going to make a lot of progress in this area. | |
48:05.040 --> 48:07.040 | |
Yeah, it's an exciting form of collaboration. | |
48:07.040 --> 48:08.040 | |
You're right. | |
48:08.040 --> 48:15.040 | |
In the autonomous driving, the main enemy is just boredom of the human as opposed to the rescue operations. | |
48:15.040 --> 48:23.040 | |
It's literally life and death and the collaboration enables the effective completion of the mission. | |
48:23.040 --> 48:24.040 | |
So it's exciting. | |
48:24.040 --> 48:27.040 | |
Well, in some sense, we're also doing this. | |
48:27.040 --> 48:37.040 | |
You think about the human driving a car and almost invariably the human is trying to estimate the state of the car, the state of the environment, and so on. | |
48:37.040 --> 48:40.040 | |
But what if the car were to estimate the state of the human? | |
48:40.040 --> 48:47.040 | |
So for example, I'm sure you have a smartphone and the smartphone tries to figure out what you're doing and send you reminders. | |
48:47.040 --> 48:53.040 | |
And oftentimes telling you to drive to a certain place, although you have no intention of going there, because it thinks that that's where you should be. | |
48:53.040 --> 48:59.040 | |
Because of some Gmail calendar entry or something like that. | |
48:59.040 --> 49:02.040 | |
And it's trying to constantly figure out who you are, what you're doing. | |
49:02.040 --> 49:06.040 | |
If a car were to do that, maybe that would make the driver safer. | |
49:06.040 --> 49:14.040 | |
Because the car is trying to figure out whether the driver is paying attention, looking at his or her eyes, looking at saccadic movements. | |
49:14.040 --> 49:16.040 | |
So I think the potential is there. | |
49:16.040 --> 49:21.040 | |
But from the reverse side, it's not robot modeling, but it's human modeling. | |
49:21.040 --> 49:23.040 | |
It's more in the human, right? | |
49:23.040 --> 49:30.040 | |
And I think the robots can do a very good job of modeling humans if you really think about the framework that you have. | |
49:30.040 --> 49:39.040 | |
A human sitting in a cockpit, surrounded by sensors that, in addition to staring outside, are also staring at him. | |
49:39.040 --> 49:41.040 | |
I think there's a real synergy there. | |
49:41.040 --> 49:48.040 | |
Yeah, I love that problem because it's the new 21st century form of psychology actually, AI enabled psychology. | |
49:48.040 --> 49:54.040 | |
A lot of people have sci fi inspired fears of walking robots like those from Boston Dynamics. | |
49:54.040 --> 49:59.040 | |
If you just look at shows on Netflix and so on, or flying robots like those you work with. | |
49:59.040 --> 50:03.040 | |
How would you, how do you think about those fears? | |
50:03.040 --> 50:05.040 | |
How would you alleviate those fears? | |
50:05.040 --> 50:09.040 | |
Do you have inklings, echoes of those same concerns? | |
50:09.040 --> 50:23.040 | |
Any time we develop a technology meant to have a positive impact in the world, there's always a worry that somebody could subvert those technologies and use them in an adversarial setting. | |
50:23.040 --> 50:25.040 | |
And robotics is no exception, right? | |
50:25.040 --> 50:29.040 | |
So I think it's very easy to weaponize robots. | |
50:29.040 --> 50:31.040 | |
I think we talk about swarms. | |
50:31.040 --> 50:38.040 | |
One thing I worry a lot about is, for us to get swarms to work and do something reliably, it's really hard. | |
50:38.040 --> 50:44.040 | |
But suppose I have this challenge of trying to destroy something. | |
50:44.040 --> 50:49.040 | |
And I have a swarm of robots where only one out of the swarm needs to get to its destination. | |
50:49.040 --> 50:53.040 | |
So that suddenly becomes a lot more doable. | |
50:53.040 --> 51:00.040 | |
And so I worry about this general idea of using autonomy with lots and lots of agents. | |
51:00.040 --> 51:04.040 | |
I mean, having said that, look, a lot of this technology is not very mature. | |
51:04.040 --> 51:12.040 | |
My favorite saying is that if somebody had to develop this technology, wouldn't you rather the good guys do it? | |
51:12.040 --> 51:21.040 | |
So the good guys have a good understanding of the technology so they can figure out how this technology is being used in a bad way or could be used in a bad way and try to defend against it? | |
51:21.040 --> 51:23.040 | |
So we think a lot about that. | |
51:23.040 --> 51:28.040 | |
So we're doing research on how to defend against swarms, for example. | |
51:28.040 --> 51:29.040 | |
That's interesting. | |
51:29.040 --> 51:36.040 | |
There is, in fact, a report by the National Academies on counter UAS technologies. | |
51:36.040 --> 51:38.040 | |
This is a real threat. | |
51:38.040 --> 51:47.040 | |
But we're also thinking about how to defend against this and knowing how swarms work, knowing how autonomy works is, I think, very important. | |
51:47.040 --> 51:49.040 | |
So it's not just politicians? | |
51:49.040 --> 51:51.040 | |
You think engineers have a role in this discussion? | |
51:51.040 --> 51:52.040 | |
Absolutely. | |
51:52.040 --> 51:59.040 | |
I think the days where politicians can be agnostic to technology are gone. | |
51:59.040 --> 52:05.040 | |
I think every politician needs to be literate in technology. | |
52:05.040 --> 52:09.040 | |
And I often say technology is the new liberal art. | |
52:09.040 --> 52:18.040 | |
Understanding how technology will change your life, I think, is important and every human being needs to understand that. | |
52:18.040 --> 52:22.040 | |
And maybe we can elect some engineers to office as well on the other side. | |
52:22.040 --> 52:25.040 | |
What are the biggest open problems in robotics, in your view? | |
52:25.040 --> 52:27.040 | |
You said we're in the early days in some sense. | |
52:27.040 --> 52:30.040 | |
What are the problems we would like to solve in robotics? | |
52:30.040 --> 52:32.040 | |
I think there are lots of problems, right? | |
52:32.040 --> 52:36.040 | |
But I would phrase it in the following way. | |
52:36.040 --> 52:46.040 | |
If you look at the robots we're building, they're still very much tailored towards doing specific tasks in specific settings. | |
52:46.040 --> 52:59.040 | |
I think the question of how do you get them to operate in much broader settings where things can change in unstructured environments is up in the air. | |
52:59.040 --> 53:02.040 | |
So think of the self driving cars. | |
53:02.040 --> 53:05.040 | |
Today, we can build a self driving car in a parking lot. | |
53:05.040 --> 53:09.040 | |
We can do level five autonomy in a parking lot. | |
53:09.040 --> 53:17.040 | |
But can you do a level five autonomy in the streets of Napoli in Italy or Mumbai in India? | |
53:17.040 --> 53:18.040 | |
No. | |
53:18.040 --> 53:27.040 | |
So in some sense, when we think about robotics, we have to think about where they're functioning, what kind of environment, what kind of a task. | |
53:27.040 --> 53:32.040 | |
We have no understanding of how to put both those things together. | |
53:32.040 --> 53:36.040 | |
So we're in the very early days of applying it to the physical world. | |
53:36.040 --> 53:39.040 | |
And I was just in Naples, actually. | |
53:39.040 --> 53:46.040 | |
And there's levels of difficulty and complexity depending on which area you're applying it to. | |
53:46.040 --> 53:47.040 | |
I think so. | |
53:47.040 --> 53:51.040 | |
And we don't have a systematic way of understanding that. | |
53:51.040 --> 54:00.040 | |
Everybody says just because a computer can now beat a human at any board game, we suddenly know something about intelligence. | |
54:00.040 --> 54:01.040 | |
That's not true. | |
54:01.040 --> 54:04.040 | |
A computer board game is very, very structured. | |
54:04.040 --> 54:11.040 | |
It is the equivalent of working in a Henry Ford factory where things, parts come, you assemble, move on. | |
54:11.040 --> 54:14.040 | |
It's a very, very, very structured setting. | |
54:14.040 --> 54:15.040 | |
That's the easiest thing. | |
54:15.040 --> 54:18.040 | |
And we know how to do that. | |
54:18.040 --> 54:23.040 | |
So you've done a lot of incredible work at the University of Pennsylvania GRASP Lab. | |
54:23.040 --> 54:26.040 | |
You're now Dean of Engineering at UPenn. | |
54:26.040 --> 54:34.040 | |
What advice do you have for a new bright eyed undergrad interested in robotics or AI or engineering? | |
54:34.040 --> 54:37.040 | |
Well, I think there's really three things. | |
54:37.040 --> 54:45.040 | |
One is you have to get used to the idea that the world will not be the same in five years or four years whenever you graduate, right? | |
54:45.040 --> 54:46.040 | |
Which is really hard to do. | |
54:46.040 --> 54:53.040 | |
So this thing about predicting the future, every one of us needs to be trying to predict the future always. | |
54:53.040 --> 55:01.040 | |
Not because you'll be any good at it, but by thinking about it, I think you sharpen your senses and you become smarter. | |
55:01.040 --> 55:02.040 | |
So that's number one. | |
55:02.040 --> 55:09.040 | |
Number two, and it's a corollary of the first piece, which is you really don't know what's going to be important. | |
55:09.040 --> 55:15.040 | |
So this idea that I'm going to specialize in something which will allow me to go in a particular direction. | |
55:15.040 --> 55:22.040 | |
It may be interesting, but it's important also to have this breadth so you have this jumping off point. | |
55:22.040 --> 55:25.040 | |
I think the third thing, and this is where I think Penn excels. | |
55:25.040 --> 55:30.040 | |
I mean, we teach engineering, but it's always in the context of the liberal arts. | |
55:30.040 --> 55:32.040 | |
It's always in the context of society. | |
55:32.040 --> 55:35.040 | |
As engineers, we cannot afford to lose sight of that. | |
55:35.040 --> 55:37.040 | |
So I think that's important. | |
55:37.040 --> 55:43.040 | |
But I think one thing that people underestimate when they do robotics is the importance of mathematical foundations, | |
55:43.040 --> 55:47.040 | |
the importance of representations. | |
55:47.040 --> 55:56.040 | |
Not everything can just be solved by looking for ROS packages on the Internet or finding a deep neural network that works. | |
55:56.040 --> 56:06.040 | |
I think the representation question is key, even to machine learning, where if you ever hope to achieve or get to explainable AI, | |
56:06.040 --> 56:09.040 | |
somehow there need to be representations that you can understand. | |
56:09.040 --> 56:16.040 | |
So if you want to do robotics, you should also do mathematics, and you said liberal arts, a little literature. | |
56:16.040 --> 56:19.040 | |
If you want to build a robot, you should be reading Dostoyevsky. | |
56:19.040 --> 56:20.040 | |
I agree with that. | |
56:20.040 --> 56:21.040 | |
Very good. | |
56:21.040 --> 56:24.040 | |
So Vijay, thank you so much for talking today. It was an honor. | |
56:24.040 --> 56:47.040 | |
Thank you. It was just a very exciting conversation. Thank you. | |