WEBVTT

00:00.000 --> 00:02.800
 The following is a conversation with Eric Schmidt.

00:02.800 --> 00:07.640
 He was the CEO of Google for ten years and chairman for six more, guiding the company

00:07.640 --> 00:13.080
 through an incredible period of growth and a series of world changing innovations.

00:13.080 --> 00:19.280
 He is one of the most impactful leaders of the internet era and a powerful voice

00:19.280 --> 00:22.400
 for the promise of technology in our society.

00:22.400 --> 00:27.760
 It was truly an honor to speak with him as part of the MIT course on artificial general

00:27.760 --> 00:32.040
 intelligence and the Artificial Intelligence podcast.

00:32.040 --> 00:37.120
 And now here's my conversation with Eric Schmidt.

00:37.120 --> 00:40.840
 What was the first moment when you fell in love with technology?

00:40.840 --> 00:47.080
 I grew up in the 1960s as a boy where every boy wanted to be an astronaut and part of

00:47.080 --> 00:49.000
 the space program.

00:49.000 --> 00:54.400
 So like everyone else of my age, we would go out to the cow pasture behind my house,

00:54.400 --> 00:58.640
 which was literally a cow pasture, and we would shoot model rockets off.

00:58.640 --> 01:00.200
 And that I think is the beginning.

01:00.200 --> 01:05.680
 And of course, generationally, today, it would be video games and all the amazing things

01:05.680 --> 01:09.280
 that you can do online with computers.

01:09.280 --> 01:15.080
 There's a transformative, inspiring aspect of science and math that maybe rockets would

01:15.080 --> 01:17.560
 instill in individuals.

01:17.560 --> 01:21.720
 You've mentioned yesterday that eighth grade math is where the journey through Mathematical

01:21.720 --> 01:24.520
 University diverges for many people.

01:24.520 --> 01:27.080
 It's this fork in the roadway.

01:27.080 --> 01:31.160
 There's a professor of math at Berkeley, Edward Frenkel.

01:31.160 --> 01:32.720
 I'm not sure if you're familiar with him.

01:32.720 --> 01:33.720
 I am.

01:33.720 --> 01:35.400
 He has written this amazing book.

01:35.400 --> 01:41.960
 I recommend it to everybody; it's called Love and Math, two of my favorite words.

01:41.960 --> 01:49.680
 He says that if painting was taught like math, then students would be asked to paint a fence,

01:49.680 --> 01:54.520
 which is his analogy of essentially how math is taught, and you never get a chance to discover

01:54.520 --> 01:59.400
 the beauty of the art of painting or the beauty of the art of math.

01:59.400 --> 02:05.240
 So how, when, and where did you discover that beauty?

02:05.240 --> 02:12.040
 I think what happens with people like myself is that you're math-enabled pretty early, and

02:12.040 --> 02:16.640
 all of a sudden you discover that you can use that to discover new insights.

02:16.640 --> 02:22.120
 The great scientists will all tell a story, the men and women who are fantastic today,

02:22.120 --> 02:25.560
 that somewhere when they were in high school or in college, they discovered that they could

02:25.560 --> 02:30.760
 discover something themselves, and that sense of building something, of having an impact

02:30.760 --> 02:35.520
 that you own drives knowledge acquisition and learning.

02:35.520 --> 02:41.000
 In my case, it was programming, and the notion that I could build things that had not existed

02:41.000 --> 02:46.560
 that I had built, that it had my name on it, and this was before open source, but you could

02:46.560 --> 02:49.160
 think of it as open source contributions.

02:49.160 --> 02:53.760
 So today, if I were a 16 or 17 year old boy, I'm sure that I would aspire as a computer

02:53.760 --> 02:59.000
 scientist to make a contribution like the open source heroes of the world today.

02:59.000 --> 03:03.760
 That would be what would be driving me, and I'd be trying and learning and making mistakes,

03:03.760 --> 03:06.720
 and so forth, in the ways that it works.

03:06.720 --> 03:12.360
 The repository that GitHub represents and that open source libraries represent is an

03:12.360 --> 03:17.840
 enormous bank of knowledge of all of the people who are doing that, and one of the lessons

03:17.840 --> 03:22.240
 that I learned at Google was that the world is a very big place, and there's an awful

03:22.240 --> 03:26.360
 lot of smart people, and an awful lot of them are underutilized.

03:26.360 --> 03:32.240
 So here's an opportunity, for example, building parts of programs, building new ideas to contribute

03:32.240 --> 03:36.640
 to the greater good of society.

03:36.640 --> 03:41.000
 So in that moment in the 70s, the inspiring moment where there was nothing, and then you

03:41.000 --> 03:44.800
 created something through programming, that magical moment.

03:44.800 --> 03:50.360
 So in 1975, I think you created a program called Lex, which I especially like because

03:50.360 --> 03:51.560
 my name is Lex.

03:51.560 --> 03:52.560
 So thank you.

03:52.560 --> 03:58.240
 Thank you for creating a brand that established a reputation that's long-lasting, reliable,

03:58.240 --> 04:01.240
 and has a big impact on the world, and is still used today.

04:01.240 --> 04:03.000
 So thank you for that.

04:03.000 --> 04:11.880
 But more seriously, in that time, in the 70s, as an engineer, personal computers were being

04:11.880 --> 04:12.880
 born.

04:12.880 --> 04:17.800
 Do you think you'd be able to predict the 80s, 90s, and the aughts of where computers

04:17.800 --> 04:18.800
 would go?

04:18.800 --> 04:23.160
 I'm sure I could not and would not have gotten it right.

04:23.160 --> 04:27.960
 I was the beneficiary of the great work of many, many people who saw it clearer than I

04:27.960 --> 04:29.160
 did.

04:29.160 --> 04:34.760
 With Lex, I worked with a fellow named Michael Lesk, who was my supervisor, and he essentially

04:34.760 --> 04:39.400
 helped me architect and deliver a system that's still in use today.

04:39.400 --> 04:43.800
 After that, I worked at Xerox Palo Alto Research Center, where the Alto was invented, and the

04:43.800 --> 04:50.600
 Alto is the predecessor of the modern personal computer, or Macintosh, and so forth.

04:50.600 --> 04:55.360
 And the Altos were very rare, and I had to drive an hour from Berkeley to go use them,

04:55.360 --> 05:01.160
 but I made a point of skipping classes and doing whatever it took to have access to this

05:01.160 --> 05:02.480
 extraordinary achievement.

05:02.480 --> 05:05.000
 I knew that they were consequential.

05:05.000 --> 05:08.240
 What I did not understand was scaling.

05:08.240 --> 05:12.960
 I did not understand what would happen when you had 100 million as opposed to 100.

05:12.960 --> 05:17.360
 And so since then, having learned the benefit of scale, I always look for things

05:17.360 --> 05:23.160
 which are going to scale to platforms, so mobile phones, Android, all those things.

05:23.160 --> 05:27.560
 There are many, many people in the world.

05:27.560 --> 05:28.560
 People really have needs.

05:28.560 --> 05:32.600
 They really will use these platforms, and you can build big businesses on top of them.

05:32.600 --> 05:33.600
 So it's interesting.

05:33.600 --> 05:36.880
 So when you see a piece of technology, now you think, what will this technology look

05:36.880 --> 05:39.080
 like when it's in the hands of a billion people?

05:39.080 --> 05:40.160
 That's right.

05:40.160 --> 05:46.520
 So an example would be that the market is so competitive now that if you can't figure

05:46.520 --> 05:51.880
 out a way for something to have a million users or a billion users, it probably is not

05:51.880 --> 05:57.280
 going to be successful because something else will become the general platform, and your

05:57.280 --> 06:04.360
 idea will become a lost idea or a specialized service with relatively few users.

06:04.360 --> 06:06.000
 So it's a path to generality.

06:06.000 --> 06:07.720
 It's a path to general platform use.

06:07.720 --> 06:09.400
 It's a path to broad applicability.

06:09.400 --> 06:15.000
 Now, there are plenty of good businesses that are tiny, so luxury goods, for example.

06:15.000 --> 06:20.360
 But if you want to have an impact at scale, you have to look for things which are of

06:20.360 --> 06:25.200
 common value, common pricing, common distribution, and solve common problems, the problems that

06:25.200 --> 06:26.200
 everyone has.

06:26.200 --> 06:30.440
 And by the way, people have lots of problems, information, medicine, health, education,

06:30.440 --> 06:31.440
 and so forth.

06:31.440 --> 06:32.440
 Work on those problems.

06:32.440 --> 06:40.240
 Like you said, you're a big fan of the middle class because there's so many of them by definition.

06:40.240 --> 06:46.600
 So any product, anything that has a huge impact and improves their lives, is a great

06:46.600 --> 06:48.960
 business decision and it's just good for society.

06:48.960 --> 06:53.520
 And there's nothing wrong with starting off in the high end as long as you have a plan

06:53.520 --> 06:55.520
 to get to the middle class.

06:55.520 --> 06:59.280
 There's nothing wrong with starting with a specialized market in order to learn and to

06:59.280 --> 07:01.080
 build and to fund things.

07:01.080 --> 07:04.520
 So you start in a luxury market to build a general purpose market.

07:04.520 --> 07:09.640
 But if you define yourself as only a narrow market, someone else can come along with a

07:09.640 --> 07:14.320
 general purpose market that can push you to the corner, can restrict the scale of operation,

07:14.320 --> 07:17.320
 can force you to be a lesser impact than you might be.

07:17.320 --> 07:22.800
 So it's very important to think in terms of broad businesses and broad impact, even if

07:22.800 --> 07:26.360
 you start in a little corner somewhere.

07:26.360 --> 07:33.200
 So as you look to the 70s, but also in the decades to come, and you saw computers, did

07:33.200 --> 07:40.240
 you see them as tools or was there a little element of another entity?

07:40.240 --> 07:46.240
 I remember a quote saying AI began with our dream to create the gods.

07:46.240 --> 07:51.520
 Is there a feeling when you wrote that program that you were creating another entity, giving

07:51.520 --> 07:52.800
 life to something?

07:52.800 --> 07:58.880
 I wish I could say otherwise, but I simply found the technology platforms so exciting.

07:58.880 --> 08:00.400
 That's what I was focused on.

08:00.400 --> 08:04.640
 I think the majority of the people that I've worked with, and there are a few exceptions,

08:04.640 --> 08:09.960
 Steve Jobs being an example, really saw this as a great technological play.

08:09.960 --> 08:15.520
 I think relatively few of the technical people understood the scale of its impact.

08:15.520 --> 08:19.680
 So I used NCP, which is a predecessor to TCP/IP.

08:19.680 --> 08:21.240
 It just made sense to connect things.

08:21.240 --> 08:26.240
 We didn't think of it in terms of the internet, and then companies, and then Facebook, and

08:26.240 --> 08:29.200
 then Twitter, and then politics, and so forth.

08:29.200 --> 08:30.800
 We never did that build.

08:30.800 --> 08:32.920
 We didn't have that vision.

08:32.920 --> 08:38.200
 And I think most people, it's a rare person who can see compounding at scale.

08:38.200 --> 08:41.520
 Most people can't see it; if you ask people to predict the future, they'll give

08:41.520 --> 08:44.080
 you an answer of six to nine months or 12 months.

08:44.080 --> 08:47.560
 Because that's about as far as people can imagine.

08:47.560 --> 08:51.020
 But there's an old saying, which actually was attributed to a professor at MIT a long

08:51.020 --> 08:58.120
 time ago, that we overestimate what can be done in one year, and we underestimate what

08:58.120 --> 09:00.280
 can be done in a decade.

09:00.280 --> 09:05.560
 And there's a great deal of evidence that these core platforms in hardware and software

09:05.560 --> 09:07.800
 take a decade.

09:07.800 --> 09:09.600
 So think about self driving cars.

09:09.600 --> 09:12.160
 Self driving cars were thought about in the 90s.

09:12.160 --> 09:17.160
 There were projects around them; the first DARPA Grand Challenge was roughly 2004.

09:17.160 --> 09:19.760
 So that's roughly 15 years ago.

09:19.760 --> 09:25.400
 And today we have self driving cars operating in a city in Arizona, right, so 15 years.

09:25.400 --> 09:31.720
 And we still have a ways to go before they're more generally available.

09:31.720 --> 09:33.840
 So you've spoken about the importance.

09:33.840 --> 09:37.080
 You just talked about predicting into the future.

09:37.080 --> 09:41.640
 You've spoken about the importance of thinking five years ahead and having a plan for those

09:41.640 --> 09:42.640
 five years.

09:42.640 --> 09:47.840
 And the way to say it is that almost everybody has a one year plan.

09:47.840 --> 09:50.960
 Almost no one has a proper five year plan.

09:50.960 --> 09:55.160
 And the key thing to having a five year plan is having a model for what's going to happen

09:55.160 --> 09:57.000
 with the underlying platforms.

09:57.000 --> 09:59.840
 So here's an example.

09:59.840 --> 10:05.120
 Moore's Law, as we know it, the thing that powered improvements in CPUs, has largely

10:05.120 --> 10:10.400
 halted in its traditional shrinking mechanism, because the costs have just gotten so high.

10:10.400 --> 10:12.200
 It's getting harder and harder.

10:12.200 --> 10:16.600
 But there's plenty of algorithmic improvements and specialized hardware improvements.

10:16.600 --> 10:21.240
 So you need to understand the nature of those improvements and where they'll go in order

10:21.240 --> 10:24.360
 to understand how it will change the platform.

10:24.360 --> 10:28.000
 In the area of network connectivity, what are the gains that are going to be possible

10:28.000 --> 10:29.480
 in wireless?

10:29.480 --> 10:35.720
 It looks like there's an enormous expansion of wireless connectivity at many different

10:35.720 --> 10:36.960
 bands, right?

10:36.960 --> 10:40.520
 Historically, I've always thought that we were primarily

10:40.520 --> 10:45.040
 going to be using fiber, but now it looks like we're going to be using fiber plus very

10:45.040 --> 10:51.560
 powerful high bandwidth sort of short distance connectivity to bridge the last mile, right?

10:51.560 --> 10:53.100
 That's an amazing achievement.

10:53.100 --> 10:56.880
 If you know that, then you're going to build your systems differently.

10:56.880 --> 10:59.800
 By the way, those networks have different latency properties, right?

10:59.800 --> 11:05.040
 Because they're more symmetric, the algorithms feel faster for that reason.

11:05.040 --> 11:09.920
 And so when you think about it, whether it's fiber or just technologies in general.

11:09.920 --> 11:15.920
 So there's this Barbara Wootton poem or quote that I really like.

11:15.920 --> 11:20.400
 It's from the champions of the impossible rather than the slaves of the possible that

11:20.400 --> 11:23.280
 evolution draws its creative force.

11:23.280 --> 11:27.840
 So in predicting the next five years, I'd like to talk about the impossible and the

11:27.840 --> 11:28.840
 possible.

11:28.840 --> 11:34.720
 Well, and again, one of the great things about humanity is that we produce dreamers, right?

11:34.720 --> 11:37.760
 We literally have people who have a vision and a dream.

11:37.760 --> 11:43.400
 They are, if you will, disagreeable in the sense that they disagree

11:43.400 --> 11:46.240
 with what the sort of zeitgeist is.

11:46.240 --> 11:48.040
 They say there is another way.

11:48.040 --> 11:49.040
 They have a belief.

11:49.040 --> 11:50.320
 They have a vision.

11:50.320 --> 11:56.560
 If you look at science, science is always marked by such people who went against

11:56.560 --> 12:01.360
 some conventional wisdom, collected the knowledge at the time and assembled it in a way that

12:01.360 --> 12:03.760
 produced a powerful platform.

12:03.760 --> 12:11.120
 And you've been amazingly honest, in an inspiring way, about things you've been wrong

12:11.120 --> 12:14.800
 about predicting and you've obviously been right about a lot of things.

12:14.800 --> 12:23.880
 But in this kind of tension, how do you balance as a company in predicting the next five years,

12:23.880 --> 12:26.520
 the impossible, planning for the impossible.

12:26.520 --> 12:32.720
 So listening to those crazy dreamers, letting them run away and make the

12:32.720 --> 12:38.760
 impossible real, make it happen, versus, you know, the way programmers often think,

12:38.760 --> 12:44.800
 slowing things down and saying, well, this is the rational, this is the possible,

12:44.800 --> 12:49.160
 the pragmatic. The dreamer versus the pragmatist.

12:49.160 --> 12:56.680
 So it's helpful to have a model which encourages a predictable revenue stream as well as the

12:56.680 --> 12:58.720
 ability to do new things.

12:58.720 --> 13:03.120
 So in Google's case, we're big enough and well enough managed and so forth that we have

13:03.120 --> 13:06.600
 a pretty good sense of what our revenue will be for the next year or two, at least for

13:06.600 --> 13:07.960
 a while.

13:07.960 --> 13:13.720
 And so we have enough cash generation that we can make bets.

13:13.720 --> 13:18.760
 And indeed, Google has become Alphabet, so the corporation is organized around these

13:18.760 --> 13:19.760
 bets.

13:19.760 --> 13:25.560
 And these bets are in areas of fundamental importance to the world, whether it's

13:25.560 --> 13:31.920
 digital intelligence, medical technology, self driving cars, connectivity through balloons,

13:31.920 --> 13:33.440
 on and on and on.

13:33.440 --> 13:36.080
 And there's more coming and more coming.

13:36.080 --> 13:41.480
 So one way you could express this is that the current business is successful enough

13:41.480 --> 13:44.720
 that we have the luxury of making bets.

13:44.720 --> 13:48.960
 And another way you could say it is that we have the wisdom of being able to see

13:48.960 --> 13:53.920
 that a corporate structure needs to be created to enhance the likelihood of the success of

13:53.920 --> 13:55.320
 those bets.

13:55.320 --> 13:59.760
 So we essentially turned ourselves into a conglomerate of bets.

13:59.760 --> 14:04.360
 And then there's this underlying corporation, Google, which is itself innovative.

14:04.360 --> 14:08.160
 So in order to pull this off, you have to have a bunch of belief systems.

14:08.160 --> 14:12.080
 And one of them is that you have to have bottom-up and top-down. The bottom-up

14:12.080 --> 14:13.600
 we call 20% time.

14:13.600 --> 14:17.040
 And the idea is that people can spend 20% of the time on whatever they want.

14:17.040 --> 14:21.960
 And the top down is that our founders in particular have a keen eye on technology and they're

14:21.960 --> 14:24.000
 reviewing things constantly.

14:24.000 --> 14:27.520
 So an example would be, they'll hear about an idea or I'll hear about something

14:27.520 --> 14:28.880
 and it sounds interesting.

14:28.880 --> 14:34.920
 Let's go visit them and then let's begin to assemble the pieces to see if that's possible.

14:34.920 --> 14:39.920
 And if you do this long enough, you get pretty good at predicting what's likely to work.

14:39.920 --> 14:42.120
 So that's a beautiful balance that's been struck.

14:42.120 --> 14:44.560
 Is this something that applies at all scales?

14:44.560 --> 14:53.960
 So it seems that Sergey, again, 15 years ago, came up with a concept that

14:53.960 --> 14:59.040
 said 10% of the budget should be on things that are unrelated.

14:59.040 --> 15:05.040
 It was called 70/20/10: 70% of our time on core business, 20% on adjacent business,

15:05.040 --> 15:06.920
 and 10% on other.

15:06.920 --> 15:11.200
 And he proved mathematically, of course, he's a brilliant mathematician, that you needed

15:11.200 --> 15:18.800
 that 10%, right, to make the sum of the growth work, and it turns out he was right.
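
 As a rough illustration of that math (a toy sketch with assumed, illustrative growth rates, not Sergey's actual derivation or Google's figures): in a sum of compounding terms, the fastest-growing term eventually dominates, no matter how small its starting share.

     # Toy 70/20/10 portfolio; the annual growth rates per bucket are hypothetical.
     buckets = {"core": 0.70, "adjacent": 0.20, "other": 0.10}
     growth = {"core": 0.05, "adjacent": 0.15, "other": 0.40}

     for year in range(10):  # compound each bucket independently for a decade
         for name in buckets:
             buckets[name] *= 1 + growth[name]

     total = sum(buckets.values())
     print({name: round(value / total, 2) for name, value in buckets.items()})
     # -> {'core': 0.24, 'adjacent': 0.17, 'other': 0.6}
     # The 10% "other" bucket, compounding fastest, now dominates the sum.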

15:18.800 --> 15:24.320
 So getting into the world of artificial intelligence, you've talked quite extensively and

15:24.320 --> 15:32.160
 effectively about the impact in the near term, the positive impact of artificial intelligence,

15:32.160 --> 15:38.720
 whether it's machine learning, especially in medical applications and education,

15:38.720 --> 15:44.160
 and just making information more accessible. In the AI community, there is a kind

15:44.160 --> 15:49.520
 of debate, so there's this shroud of uncertainty as we face this new world with artificial

15:49.520 --> 15:50.800
 intelligence in it.

15:50.800 --> 15:57.560
 And there are some people, like Elon Musk, you've disagreed with, at least on the degree of emphasis

15:57.560 --> 16:00.800
 he places on the existential threat of AI.

16:00.800 --> 16:07.120
 So I've spoken with Stuart Russell, Max Tegmark, who share Elon Musk's view, and Yoshua Bengio,

16:07.120 --> 16:09.240
 Steven Pinker, who do not.

16:09.240 --> 16:13.320
 And so there's a lot of very smart people who are thinking about

16:13.320 --> 16:17.280
 this stuff, disagreeing, which is really healthy, of course.

16:17.280 --> 16:22.800
 So what do you think is the healthiest way for the AI community to, and really for the

16:22.800 --> 16:30.880
 general public to think about AI and the concern of the technology being mismanaged

16:30.880 --> 16:33.000
 in some kind of way.

16:33.000 --> 16:37.560
 So the source of education for the general public has been killer robot movies.

16:37.560 --> 16:38.760
 Right.

16:38.760 --> 16:44.840
 And Terminator, et cetera, and the one thing I can assure you we're not building are those

16:44.840 --> 16:46.200
 kinds of solutions.

16:46.200 --> 16:51.240
 Furthermore, if they were to show up, someone would notice and unplug them, right?

16:51.240 --> 16:57.760
 So as exciting as those movies are, and they're great movies, were the killer robots to start,

16:57.760 --> 17:00.520
 we would find a way to stop them, right?

17:00.520 --> 17:04.320
 So I'm not concerned about that.

17:04.320 --> 17:08.680
 And much of this has to do with the timeframe of conversation.

17:08.680 --> 17:16.120
 So you can imagine a situation a hundred years from now, when the human brain is fully understood,

17:16.120 --> 17:19.880
 and the next generation and next generation of brilliant MIT scientists have figured

17:19.880 --> 17:25.960
 all this out, we're going to have a large number of ethics questions, right, around science

17:25.960 --> 17:29.760
 and thinking and robots and computers and so forth and so on.

17:29.760 --> 17:32.360
 So it depends on the question of the timeframe.

17:32.360 --> 17:37.280
 In the next five to 10 years, we're not facing those questions.

17:37.280 --> 17:42.000
 What we're facing in the next five to 10 years is how do we spread this disruptive technology

17:42.000 --> 17:46.520
 as broadly as possible to gain the maximum benefit of it?

17:46.520 --> 17:51.880
 The primary benefit should be in healthcare and in education, healthcare because it's

17:51.880 --> 17:52.880
 obvious.

17:52.880 --> 17:55.840
 We're all the same, even though we somehow believe we're not.

17:55.840 --> 18:00.400
 As a medical matter, the fact that we have big data about our health will save lives,

18:00.400 --> 18:05.520
 allow us to deal with skin cancer and other cancers, ophthalmological problems.

18:05.520 --> 18:10.080
 There's people working on psychological diseases and so forth using these techniques.

18:10.080 --> 18:11.680
 I could go on and on.

18:11.680 --> 18:15.840
 The promise of AI in medicine is extraordinary.

18:15.840 --> 18:20.360
 There are many, many companies and startups and funds and solutions and we will all live

18:20.360 --> 18:22.120
 much better for that.

18:22.120 --> 18:25.680
 The same argument in education.

18:25.680 --> 18:31.760
 Can you imagine that for each generation of child and even adult, you have a tutor educator

18:31.760 --> 18:37.320
 that's AI based, that's not a human but is properly trained, that helps you get smarter,

18:37.320 --> 18:41.440
 helps you address your language difficulties or your math difficulties or what have you.

18:41.440 --> 18:43.400
 Why don't we focus on those two?

18:43.400 --> 18:49.240
 The gains societally of making humans smarter and healthier are enormous.

18:49.240 --> 18:54.000
 Those translate for decades and decades, and we'll all benefit from them.

18:54.000 --> 18:58.560
 There are people who are working on AI safety, which is the issue that you're describing.

18:58.560 --> 19:02.920
 There are conversations in the community about, should there be such problems, what should

19:02.920 --> 19:04.360
 the rules be like?

19:04.360 --> 19:09.360
 Google, for example, has announced its policies with respect to AI safety, which I certainly

19:09.360 --> 19:14.320
 support and I think most everybody would support and they make sense.

19:14.320 --> 19:19.760
 It helps guide the research but the killer robots are not arriving this year and they're

19:19.760 --> 19:23.840
 not even being built.

19:23.840 --> 19:31.040
 On that line of thinking, you said the timescale, in this topic or other topics, have you found

19:31.040 --> 19:37.920
 it useful, on the business side or the intellectual side, to think beyond 5, 10 years, to think

19:37.920 --> 19:39.480
 50 years out?

19:39.480 --> 19:42.000
 Has it ever been useful or productive?

19:42.000 --> 19:49.040
 In our industry, there are essentially no examples of 50 year predictions that have been correct.

19:49.040 --> 19:54.320
 Let's review AI, which was largely invented here at MIT and a couple of other universities

19:54.320 --> 19:57.840
 in 1956, 1957, 1958.

19:57.840 --> 20:01.800
 The original claims were a decade or two.

20:01.800 --> 20:08.040
 When I was a PhD student, I studied AI a bit, and while I was looking at it, it entered a period

20:08.040 --> 20:13.880
 which is known as AI winter, which went on for about 30 years, which is a whole generation

20:13.880 --> 20:18.800
 of scientists and a whole group of people who didn't make a lot of progress because the

20:18.800 --> 20:22.160
 algorithms had not improved and the computers had not improved.

20:22.160 --> 20:26.160
 It took some brilliant mathematicians, starting with a fellow named Geoff Hinton in Toronto

20:26.160 --> 20:33.120
 and in Montreal, who basically invented this deep learning model which empowers us today.

20:33.120 --> 20:40.400
 The seminal work there was 20 years ago and in the last 10 years, it's become popularized.

20:40.400 --> 20:43.520
 Think about the time frames for that level of discovery.

20:43.520 --> 20:46.080
 It's very hard to predict.

20:46.080 --> 20:50.240
 Many people think that we'll be flying around in the equivalent of flying cars.

20:50.240 --> 20:51.240
 Who knows?

20:51.240 --> 20:56.680
 My own view, if I want to go out on a limb, is to say that we know a couple of things

20:56.680 --> 20:58.000
 about 50 years from now.

20:58.000 --> 21:00.480
 We know that there'll be more people alive.

21:00.480 --> 21:04.000
 We know that we'll have to have platforms that are more sustainable because the earth

21:04.000 --> 21:09.440
 is limited in the ways we all know and that the kind of platforms that are going to get

21:09.440 --> 21:13.000
 built will be consistent with the principles that I've described.

21:13.000 --> 21:17.560
 They will be much more empowering of individuals, they'll be much more sensitive to the ecology

21:17.560 --> 21:20.440
 because they have to be, they just have to be.

21:20.440 --> 21:24.160
 I also think that humans are going to be a great deal smarter and I think they're going

21:24.160 --> 21:28.320
 to be a lot smarter because of the tools that I've discussed with you and of course people

21:28.320 --> 21:29.320
 will live longer.

21:29.320 --> 21:32.080
 Life extension is continuing apace.

21:32.080 --> 21:36.840
 A baby born today has a reasonable chance of living to 100, which is pretty exciting.

21:36.840 --> 21:40.760
 It's well past the 21st century, so we better take care of them.

21:40.760 --> 21:46.160
 You mentioned an interesting statistic that some very large percentage, 60 to 70%, of people may

21:46.160 --> 21:48.360
 live in cities.

21:48.360 --> 21:53.880
 Today more than half the world lives in cities and one of the great stories of humanity in

21:53.880 --> 21:57.560
 the last 20 years has been the rural to urban migration.

21:57.560 --> 22:02.720
 This has occurred in the United States, it's occurred in Europe, it's occurring in Asia

22:02.720 --> 22:04.760
 and it's occurring in Africa.

22:04.760 --> 22:09.280
 When people move to cities, the cities get more crowded, but believe it or not their health

22:09.280 --> 22:15.560
 gets better, their productivity gets better, their IQ and educational capabilities improve,

22:15.560 --> 22:19.840
 so it's good news that people are moving to cities, but we have to make them livable

22:19.840 --> 22:22.800
 and safe.

22:22.800 --> 22:28.240
 So, first of all, you are one, but you've also worked with some of the greatest leaders

22:28.240 --> 22:30.180
 in the history of tech.

22:30.180 --> 22:37.080
 What insights do you draw from the difference in leadership styles of yourself, Steve Jobs,

22:37.080 --> 22:45.320
 Elon Musk, Larry Page, now the new CEO, Sundar Pichai, and others, from the, I would say, calm

22:45.320 --> 22:49.600
 sages to the mad geniuses?

22:49.600 --> 22:53.880
 One of the things that I learned as a young executive is that there's no single formula

22:53.880 --> 22:56.200
 for leadership.

22:56.200 --> 23:00.080
 They try to teach one, but that's not how it really works.

23:00.080 --> 23:04.360
 There are people who just understand what they need to do and they need to do it quickly.

23:04.360 --> 23:06.800
 Those people are often entrepreneurs.

23:06.800 --> 23:09.080
 They just know and they move fast.

23:09.080 --> 23:13.400
 There are other people who are systems thinkers and planners, that's more who I am, somewhat

23:13.400 --> 23:18.760
 more conservative, more thorough in execution, a little bit more risk averse.

23:18.760 --> 23:24.120
 There's also people who are sort of slightly insane, right, in the sense that they are

23:24.120 --> 23:29.040
 emphatic and charismatic and they feel it and they drive it and so forth.

23:29.040 --> 23:31.440
 There's no single formula to success.

23:31.440 --> 23:35.320
 There is one thing that unifies all of the people that you named, which is very high

23:35.320 --> 23:41.240
 intelligence. At the end of the day, the thing that characterizes all of them is that they

23:41.240 --> 23:46.360
 saw the world quicker, faster, they processed information faster, they didn't necessarily

23:46.360 --> 23:50.160
 make the right decisions all the time, but they were on top of it.

23:50.160 --> 23:54.600
 The other thing that's interesting about all those people is they all started young.

23:54.600 --> 23:58.560
 Think about Steve Jobs starting Apple roughly at 18 or 19.

23:58.560 --> 24:01.840
 Think about Bill Gates starting at roughly 20, 21.

24:01.840 --> 24:07.040
 Think about Mark Zuckerberg, maybe a better example, at 19, 20.

24:07.040 --> 24:13.720
 By the time they were 30, they had 10 years of experience

24:13.720 --> 24:19.920
 of dealing with people and products and shipments and the press and business and so forth.

24:19.920 --> 24:24.480
 It's incredible how much experience they had compared to the rest of us who were busy getting

24:24.480 --> 24:25.480
 our PhDs.

24:25.480 --> 24:26.480
 Yes, exactly.

24:26.480 --> 24:32.760
 We should celebrate these people because they've just had more life experience and that helps

24:32.760 --> 24:34.520
 inform the judgment.

24:34.520 --> 24:41.360
 At the end of the day, when you're at the top of these organizations, all the easy questions

24:41.360 --> 24:43.680
 have been dealt with.

24:43.680 --> 24:45.840
 How should we design the buildings?

24:45.840 --> 24:48.400
 Where should we put the colors on our product?

24:48.400 --> 24:51.440
 What should the box look like?

24:51.440 --> 24:55.520
 The problems, that's why it's so interesting to be in these rooms, the problems that they

24:55.520 --> 25:00.200
 face in terms of the way they operate, the way they deal with their employees, their

25:00.200 --> 25:04.160
 customers, their innovation are profoundly challenging.

25:04.160 --> 25:09.360
 Each of the companies is demonstrably different culturally.

25:09.360 --> 25:11.800
 They are not, in fact, cut from the same cloth.

25:11.800 --> 25:16.680
 They behave differently based on input, their internal cultures are different, their compensation

25:16.680 --> 25:24.920
 schemes are different, their values are different, so there's proof that diversity works.

25:24.920 --> 25:33.440
 So when faced with a tough decision, in need of advice, it's been said that the best thing

25:33.440 --> 25:39.840
 one can do is to find the best person in the world who can give that advice and find a

25:39.840 --> 25:44.880
 way to be in a room with them, one on one and ask.

25:44.880 --> 25:51.920
 So here we are, and let me ask in a long winded way, I wrote this down, in 1998 there were

25:51.920 --> 26:01.960
 many good search engines: Lycos, Excite, AltaVista, Infoseek, Ask Jeeves maybe, Yahoo even.

26:01.960 --> 26:07.040
 So Google stepped in and disrupted everything, they disrupted the nature of search, the nature

26:07.040 --> 26:12.040
 of our access to information, the way we discover new knowledge.

26:12.040 --> 26:19.120
 So now it's 2018, actually 20 years later, there are many good personal AI assistants,

26:19.120 --> 26:22.360
 including of course the best from Google.

26:22.360 --> 26:28.720
 So you've spoken in medical and education, the impact of such an AI assistant could bring.

26:28.720 --> 26:34.920
 So we arrive at this question, so it's a personal one for me, but I hope my situation represents

26:34.920 --> 26:41.200
 that of many other, as we said, dreamers and crazy engineers.

26:41.200 --> 26:46.680
 So my whole life, I've dreamed of creating such an AI assistant.

26:46.680 --> 26:50.800
 Every step I've taken has been towards that goal. Now I'm a research scientist in human

26:50.800 --> 26:58.920
 centered AI here at MIT, so the next step for me as I sit here, facing my passion, is to

26:58.920 --> 27:04.880
 do what Larry and Sergey did in 1998, this simple startup.

27:04.880 --> 27:10.640
 And so here's my simple question, given the low odds of success, the timing and luck required,

27:10.640 --> 27:14.280
 the countless other factors that can't be controlled or predicted, which is all the

27:14.280 --> 27:16.560
 things that Larry and Sergey faced.

27:16.560 --> 27:23.080
 Is there some calculation, some strategy to follow in this step, or do you simply follow

27:23.080 --> 27:26.560
 the passion just because there's no other choice?

27:26.560 --> 27:32.880
 I think the people who are in universities are always trying to study the extraordinarily

27:32.880 --> 27:37.360
 chaotic nature of innovation and entrepreneurship.

27:37.360 --> 27:42.880
 My answer is that they didn't have that conversation, they just did it.

27:42.880 --> 27:48.840
 They sensed a moment when, in the case of Google, there was all of this data that needed

27:48.840 --> 27:53.940
 to be organized and they had a better algorithm, they had invented a better way.

27:53.940 --> 28:01.040
 So today with human-centered AI, which is your area of research, there must be new approaches.

28:01.040 --> 28:07.320
 It's such a big field, there must be new approaches, different from what we and others are doing.

28:07.320 --> 28:12.320
 There must be startups to fund, there must be research projects to try, there must be

28:12.320 --> 28:15.200
 graduate students to work on new approaches.

28:15.200 --> 28:19.120
 Here at MIT, there are people who are looking at learning from the standpoint of looking

28:19.120 --> 28:23.840
 at child learning, how do children learn starting at age one?

28:23.840 --> 28:25.560
 And the work is fantastic.

28:25.560 --> 28:30.120
 Those approaches are different from the approach that most people are taking.

28:30.120 --> 28:33.980
 Perhaps that's a bet that you should make, or perhaps there's another one.

28:33.980 --> 28:40.200
 But at the end of the day, the successful entrepreneurs are not as crazy as they sound.

28:40.200 --> 28:43.200
 They see an opportunity based on what's happened.

28:43.200 --> 28:45.400
 Let's use Uber as an example.

28:45.400 --> 28:49.840
 As Travis tells the story, he and his cofounder were sitting in Paris and they had this idea

28:49.840 --> 28:52.160
 because they couldn't get a cab.

28:52.160 --> 28:56.800
 And they said, we have smartphones and the rest is history.

28:56.800 --> 29:04.040
 So what's the equivalent of that Travis Eiffel Tower, where-is-a-cab moment that you could,

29:04.040 --> 29:08.800
 as an entrepreneur, take advantage of, whether it's in Human Senate AI or something else?

29:08.800 --> 29:11.480
 That's the next great startup.

29:11.480 --> 29:13.760
 And the psychology of that moment.

29:13.760 --> 29:20.120
 So when Sergey and Larry talk about it, and I've listened to a few interviews, it's very nonchalant.

29:20.120 --> 29:25.280
 Well, here's the very fascinating web data.

29:25.280 --> 29:29.080
 And here's an algorithm we have for, you know, we just kind of want to play around with that

29:29.080 --> 29:30.080
 data.

29:30.080 --> 29:32.520
 And it seems like that's a really nice way to organize this data.

29:32.520 --> 29:38.000
 Well, I should say what happened, remember, is that they were graduate students at Stanford

29:38.000 --> 29:41.320
 and they thought this is interesting, so they built a search engine and they kept it in

29:41.320 --> 29:43.400
 their room.

29:43.400 --> 29:47.520
 And they had to get power from the room next door because they were using too much power

29:47.520 --> 29:48.520
 in the room.

29:48.520 --> 29:51.640
 So they ran an extension cord over.

29:51.640 --> 29:55.360
 And then they went and they found a house and they had Google World headquarters of

29:55.360 --> 29:57.600
 five people to start the company.

29:57.600 --> 30:02.560
 And they raised $100,000 from Andy Bechtolsheim, who was the Sun founder, to do this, and David

30:02.560 --> 30:04.520
 Cheriton and a few others.

30:04.520 --> 30:11.960
 The point is their beginnings were very simple, but they were based on a powerful insight.

30:11.960 --> 30:14.320
 That is a replicable model for any startup.

30:14.320 --> 30:16.520
 It has to be a powerful insight.

30:16.520 --> 30:17.680
 The beginnings are simple.

30:17.680 --> 30:19.960
 And there has to be an innovation.

30:19.960 --> 30:24.280
 In Larry and Sergey's case, it was PageRank, which was a brilliant idea, one of the most

30:24.280 --> 30:26.880
 cited papers in the world today.
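
 For readers who haven't seen it, here is a minimal sketch of the idea behind PageRank, in the standard power-iteration formulation (an illustrative toy, not Google's production system): a page's rank is the stationary probability that a random surfer, who mostly follows links and occasionally jumps to a random page, lands on it.

     import numpy as np

     def pagerank(adj, damping=0.85, tol=1e-9):
         # adj[i, j] = 1 if page j links to page i; we column-normalize so each
         # column becomes the out-link probability distribution of one page.
         n = adj.shape[0]
         out = adj.sum(axis=0)
         out[out == 0] = 1.0  # crude guard for dangling pages with no out-links
         M = adj / out
         rank = np.full(n, 1.0 / n)
         while True:  # power iteration: follow a link with prob. damping, else jump
             new_rank = (1 - damping) / n + damping * (M @ rank)
             if np.abs(new_rank - rank).sum() < tol:
                 return new_rank
             rank = new_rank

     # Tiny three-page web: page 0 links to 1 and 2, page 1 links to 2, page 2 links to 0.
     links = np.array([[0, 0, 1],
                       [1, 0, 0],
                       [1, 1, 0]], dtype=float)
     print(pagerank(links))  # page 2 ranks highest; both other pages link to it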

30:26.880 --> 30:29.880
 What's the next one?

30:29.880 --> 30:37.280
 So you're one of, if I may say, the richest people in the world, and yet it seems that money

30:37.280 --> 30:43.200
 is simply a side effect of your passions and not an inherent goal.

30:43.200 --> 30:48.360
 But you're a fascinating person to ask.

30:48.360 --> 30:55.080
 So much of our society, at the individual level, at the company level, and as nations, is

30:55.080 --> 30:58.920
 driven by the desire for wealth.

30:58.920 --> 31:01.280
 What do you think about this drive?

31:01.280 --> 31:07.000
 And what have you learned about, if I may romanticize the notion, the meaning of life,

31:07.000 --> 31:10.520
 having achieved success on so many dimensions?

31:10.520 --> 31:16.960
 There have been many studies of human happiness and above some threshold, which is typically

31:16.960 --> 31:23.600
 relatively low for this conversation, there's no difference in happiness based on money.

31:23.600 --> 31:30.120
 The happiness is correlated with meaning and purpose, a sense of family, a sense of impact.

31:30.120 --> 31:34.440
 So if you organize your life, assuming you have enough to get around and have a nice

31:34.440 --> 31:40.400
 home and so forth, you'll be far happier if you figure out what you care about and work

31:40.400 --> 31:41.800
 on that.

31:41.800 --> 31:44.120
 It's often being in service to others.

31:44.120 --> 31:47.840
 There's a great deal of evidence that people are happiest when they're serving others

31:47.840 --> 31:49.640
 and not themselves.

31:49.640 --> 31:57.480
 This goes directly against the sort of press-induced excitement about powerful and wealthy

31:57.480 --> 32:01.840
 leaders of one kind or another, and indeed, these are consequential people.

32:01.840 --> 32:06.720
 But if you are in a situation where you've been very fortunate as I have, you also have

32:06.720 --> 32:12.160
 to take that as a responsibility and you have to basically work both to educate others and

32:12.160 --> 32:16.760
 give them that opportunity, but also use that wealth to advance human society.

32:16.760 --> 32:20.440
 In my case, I'm particularly interested in using the tools of artificial intelligence

32:20.440 --> 32:22.800
 and machine learning to make society better.

32:22.800 --> 32:24.000
 I've mentioned education.

32:24.000 --> 32:29.040
 I've mentioned inequality and middle class and things like this, all of which are a passion

32:29.040 --> 32:30.160
 of mine.

32:30.160 --> 32:31.920
 It doesn't matter what you do.

32:31.920 --> 32:36.560
 It matters that you believe in it, that it's important to you and that your life will be

32:36.560 --> 32:40.480
 far more satisfying if you spend your life doing that.

32:40.480 --> 32:45.320
 I think there's no better place to end than a discussion of the meaning of life.

32:45.320 --> 32:46.320
 Eric, thank you so much.

32:46.320 --> 32:47.320
 Thank you very much.

32:47.320 --> 33:16.320
 Thank you.