WEBVTT

00:00.000 --> 00:02.960
 The following is a conversation with Kai-Fu Lee.

00:02.960 --> 00:06.520
 He's the chairman and CEO of Sinovation Ventures

00:06.520 --> 00:10.560
 that manages a $2 billion dual currency investment fund

00:10.560 --> 00:13.160
 with a focus on developing the next generation

00:13.160 --> 00:15.440
 of Chinese high tech companies.

00:15.440 --> 00:17.840
 He's the former president of Google China

00:17.840 --> 00:20.880
 and the founder of what is now called Microsoft Research

00:20.880 --> 00:24.160
 Asia, an institute that trained many

00:24.160 --> 00:26.520
 of the artificial intelligence leaders in China,

00:26.520 --> 00:32.080
 including CTOs or AI execs at Baidu, Tencent, Alibaba,

00:32.080 --> 00:34.840
 Lenovo, and Huawei.

00:34.840 --> 00:38.520
 He was named one of the 100 most influential people

00:38.520 --> 00:40.680
 in the world by Time Magazine.

00:40.680 --> 00:43.880
 He's the author of seven bestselling books in Chinese

00:43.880 --> 00:47.080
 and most recently, the New York Times bestseller called

00:47.080 --> 00:50.600
 AI Superpowers: China, Silicon Valley,

00:50.600 --> 00:52.760
 and the New World Order.

00:52.760 --> 00:57.200
 He has unparalleled experience in working across major tech

00:57.200 --> 01:00.120
 companies and governments on applications of AI.

01:00.120 --> 01:02.440
 And so he has a unique perspective

01:02.440 --> 01:05.080
 on global innovation in the future of AI

01:05.080 --> 01:09.000
 that I think is important to listen to and think about.

01:09.000 --> 01:11.960
 This is the Artificial Intelligence Podcast.

01:11.960 --> 01:15.240
 If you enjoy it, subscribe on YouTube and iTunes,

01:15.240 --> 01:18.880
 support it on Patreon, or simply connect with me on Twitter

01:18.880 --> 01:21.040
 at Lex Fridman.

01:21.040 --> 01:26.120
 And now, here's my conversation with Kai-Fu Lee.

01:26.120 --> 01:29.440
 I immigrated from Russia to the US when I was 13.

01:29.440 --> 01:32.480
 You immigrated to US at about the same age.

01:32.480 --> 01:35.920
 The Russian people, the American people, the Chinese people,

01:35.920 --> 01:39.440
 each have a certain soul, a spirit,

01:39.440 --> 01:42.080
 that permeates throughout the generations.

01:42.080 --> 01:45.120
 So maybe it's a little bit of a poetic question,

01:45.120 --> 01:49.240
 but could you describe your sense of what

01:49.240 --> 01:52.080
 defines the Chinese soul?

01:52.080 --> 01:56.160
 I think the Chinese soul of people today, right,

01:56.160 --> 02:02.000
 we're talking about people who have had centuries of burden

02:02.000 --> 02:05.240
 because of the poverty that the country has gone through

02:05.240 --> 02:10.560
 and suddenly shined with hope of prosperity

02:10.560 --> 02:13.440
 in the past 40 years as China opened up

02:13.440 --> 02:16.440
 and embraced market economy.

02:16.440 --> 02:20.200
 And undoubtedly, there are two sets of pressures

02:20.200 --> 02:24.160
 on the people, that of the tradition,

02:24.160 --> 02:28.040
 that of facing difficult situations,

02:28.040 --> 02:31.160
 and that of hope of wanting to be the first

02:31.160 --> 02:33.840
 to become successful and wealthy,

02:33.840 --> 02:38.360
 so that it's a very strong hunger and strong desire

02:38.360 --> 02:41.160
 and strong work ethic that drives China forward.

02:41.160 --> 02:43.960
 And are there roots to not just this generation,

02:43.960 --> 02:47.880
 but before, that's deeper than just

02:47.880 --> 02:50.080
 the new economic developments?

02:50.080 --> 02:52.520
 Is there something that's unique to China

02:52.520 --> 02:54.960
 that you could speak to that's in the people?

02:54.960 --> 02:56.000
 Yeah.

02:56.000 --> 03:00.280
 Well, the Chinese tradition is about excellence,

03:00.280 --> 03:02.680
 dedication, and results.

03:02.680 --> 03:07.240
 And the Chinese exams and study subjects in schools

03:07.240 --> 03:11.080
 have traditionally started from memorizing 10,000 characters,

03:11.080 --> 03:13.600
 not an easy task to start with.

03:13.600 --> 03:17.640
 And further by memorizing historic philosophers,

03:17.640 --> 03:19.000
 literature, poetry.

03:19.000 --> 03:22.480
 So it really is probably the strongest rote

03:22.480 --> 03:26.920
 learning mechanism created to make sure people had good memory

03:26.920 --> 03:30.080
 and remembered things extremely well.

03:30.080 --> 03:33.720
 That, I think, at the same time suppresses

03:33.720 --> 03:37.360
 the breakthrough innovation.

03:37.360 --> 03:42.520
 And it also enhances the speed of execution to get results.

03:42.520 --> 03:47.400
 And that, I think, characterizes the historic basis of China.

03:47.400 --> 03:49.160
 That's interesting, because there's echoes of that

03:49.160 --> 03:52.080
 in Russian education as well as rote memorization.

03:52.080 --> 03:53.800
 So you memorize a lot of poetry.

03:53.800 --> 03:59.240
 I mean, there's just an emphasis on perfection in all forms

03:59.240 --> 04:02.240
 that's not conducive to perhaps what you're speaking to,

04:02.240 --> 04:03.640
 which is creativity.

04:03.640 --> 04:05.640
 But you think that kind of education

04:05.640 --> 04:09.040
 holds back the innovative spirit that you

04:09.040 --> 04:10.960
 might see in the United States?

04:10.960 --> 04:14.840
 Well, it holds back the breakthrough innovative spirit

04:14.840 --> 04:16.480
 that we see in the United States.

04:16.480 --> 04:21.880
 But it does not hold back the valuable execution oriented,

04:21.880 --> 04:26.320
 result oriented value creating engines, which we see China

04:26.320 --> 04:27.960
 being very successful at.

04:27.960 --> 04:32.320
 So is there a difference between a Chinese AI engineer

04:32.320 --> 04:35.600
 today and an American AI engineer perhaps rooted

04:35.600 --> 04:38.320
 in the culture that we just talked about or the education

04:38.320 --> 04:41.160
 or the very soul of the people or no?

04:41.160 --> 04:43.720
 And what would your advice be to each

04:43.720 --> 04:45.520
 if there's a difference?

04:45.520 --> 04:47.120
 Well, there's a lot that's similar,

04:47.120 --> 04:51.240
 because AI is about mastering sciences,

04:51.240 --> 04:54.880
 about using known technologies and trying new things.

04:54.880 --> 04:59.760
 But it's also about picking from many parts of possible networks

04:59.760 --> 05:02.920
 to use and different types of parameters to tune.

05:02.920 --> 05:05.280
 And that part is somewhat rote.

05:05.280 --> 05:09.040
 And it is also, as anyone who's built AI products,

05:09.040 --> 05:12.680
 can tell you a lot about cleansing the data.

05:12.680 --> 05:15.200
 Because AI runs better with more data.

05:15.200 --> 05:20.160
 And data is generally unstructured, errorful,

05:20.160 --> 05:22.360
 and unclean.

05:22.360 --> 05:26.280
 And the effort to clean the data is immense.

05:26.280 --> 05:32.280
 So I think the better part of the American AI engineering

05:32.280 --> 05:36.840
 process is to try new things, to do things people haven't done

05:36.840 --> 05:41.840
 before, and to use technology to solve most, if not all,

05:41.840 --> 05:43.480
 problems.

05:43.480 --> 05:47.160
 So to make the algorithm work despite not so great data,

05:47.160 --> 05:50.680
 find error tolerant ways to deal with the data.

05:50.680 --> 05:55.960
 The Chinese way would be to basically enumerate,

05:55.960 --> 05:58.560
 to the fullest extent, all the possible ways

05:58.560 --> 06:01.000
 by a lot of machines, try lots of different ways

06:01.000 --> 06:05.320
 to get it to work, and spend a lot of resources and money

06:05.320 --> 06:07.720
 and time cleaning up data.

06:07.720 --> 06:11.880
 That means the AI engineer may be writing data cleansing

06:11.880 --> 06:15.600
 algorithms, working with thousands of people

06:15.600 --> 06:19.160
 who label or correct or do things with the data.

06:19.160 --> 06:21.920
 That is the incredible hard work that

06:21.920 --> 06:24.040
 might lead to better results.

06:24.040 --> 06:28.240
 So the Chinese engineer would rely on and ask for more and more

06:28.240 --> 06:31.120
 data and find ways to cleanse them and make them work

06:31.120 --> 06:34.200
 in the system, and probably less time thinking

06:34.200 --> 06:39.320
 about new algorithms that can overcome data or other issues.

06:39.320 --> 06:40.560
 So where's your intuition?

06:40.560 --> 06:43.160
 What do you think the biggest impact the next 10 years

06:43.160 --> 06:43.920
 lies?

06:43.920 --> 06:47.120
 Is it in some breakthrough algorithms?

06:47.120 --> 06:53.920
 Or is it in just this at scale rigor, a rigorous approach

06:53.920 --> 06:57.120
 to data, cleaning data, organizing data

06:57.120 --> 06:58.440
 onto the same algorithms?

06:58.440 --> 07:02.600
 What do you think the big impact in the applied world is?

07:02.600 --> 07:04.560
 Well, if you're really in the company

07:04.560 --> 07:08.400
 and you have to deliver results, using known techniques

07:08.400 --> 07:12.240
 and enhancing data seems like the more expedient approach

07:12.240 --> 07:15.640
 that's very low risk and likely to generate

07:15.640 --> 07:17.200
 better and better results.

07:17.200 --> 07:20.520
 And that's why the Chinese approach has done quite well.

07:20.520 --> 07:24.240
 Now, there are a lot of more challenging startups

07:24.240 --> 07:28.440
 and problems, such as autonomous vehicles,

07:28.440 --> 07:32.560
 medical diagnosis, that existing algorithms probably

07:32.560 --> 07:34.240
 won't solve.

07:34.240 --> 07:38.680
 And that would make the Chinese approach more challenged

07:38.680 --> 07:43.720
 and give the more breakthrough-innovation approach more

07:43.720 --> 07:45.440
 of an edge on those kinds of problems.

07:45.440 --> 07:47.040
 So let me talk to that a little more.

07:47.040 --> 07:50.960
 So my intuition, personally, is that data

07:50.960 --> 07:53.680
 can take us extremely far.

07:53.680 --> 07:56.480
 So you brought up autonomous vehicles and medical diagnosis.

07:56.480 --> 08:00.080
 So your intuition is that huge amounts of data

08:00.080 --> 08:04.000
 might not be able to completely help us solve that problem.

08:04.000 --> 08:04.600
 Right.

08:04.600 --> 08:08.080
 So breaking that down further, autonomous vehicle,

08:08.080 --> 08:10.080
 I think huge amounts of data probably

08:10.080 --> 08:13.360
 will solve trucks driving on highways, which

08:13.360 --> 08:15.640
 will deliver significant value.

08:15.640 --> 08:19.320
 And China will probably lead in that.

08:19.320 --> 08:24.880
 And full L5 autonomy is likely to require new technologies

08:24.880 --> 08:26.320
 we don't yet know.

08:26.320 --> 08:30.320
 And that might require academia and great industrial research,

08:30.320 --> 08:32.480
 both innovating and working together.

08:32.480 --> 08:35.360
 And in that case, US has an advantage.

08:35.360 --> 08:37.040
 So the interesting question there is,

08:37.040 --> 08:39.280
 I don't know if you're familiar on the autonomous vehicle

08:39.280 --> 08:43.480
 space and the developments with Tesla and Elon Musk,

08:43.480 --> 08:49.400
 where they are, in fact, going full steam ahead

08:49.400 --> 08:53.480
 into this mysterious, complex world of full autonomy, L5,

08:53.480 --> 08:55.080
 L4, L5.

08:55.080 --> 08:58.800
 And they're trying to solve that purely with data.

08:58.800 --> 09:00.800
 So the same kind of thing that you're saying

09:00.800 --> 09:03.200
 is just for highway, which is what a lot of people

09:03.200 --> 09:07.200
 share your intuition, they're trying to solve with data.

09:07.200 --> 09:09.320
 It's just to linger on that moment further.

09:09.320 --> 09:13.600
 Do you think it's possible for them to achieve success

09:13.600 --> 09:17.040
 with simply just a huge amount of this training

09:17.040 --> 09:20.440
 on edge cases, on difficult cases in urban environments,

09:20.440 --> 09:22.840
 not just highway and so on?

09:22.840 --> 09:24.480
 I think it'll be very hard.

09:24.480 --> 09:27.680
 One could characterize Tesla's approach as kind

09:27.680 --> 09:31.600
 of a Chinese strength approach, gather all the data you can,

09:31.600 --> 09:34.000
 and hope that will overcome the problems.

09:34.000 --> 09:38.480
 But in autonomous driving, clearly a lot of the decisions

09:38.480 --> 09:41.480
 aren't merely solved by aggregating data

09:41.480 --> 09:43.520
 and having feedback loop.

09:43.520 --> 09:48.040
 There are things that are more akin to human thinking.

09:48.040 --> 09:51.680
 And how would those be integrated and built?

09:51.680 --> 09:54.000
 There has not yet been a lot of success

09:54.000 --> 09:57.200
 integrating human intelligence or, you know,

09:57.200 --> 09:58.800
 so-called expert systems, if you will,

09:58.800 --> 10:02.960
 even though that's a taboo word in machine learning.

10:02.960 --> 10:05.600
 And the integration of the two types of thinking

10:05.600 --> 10:07.840
 hasn't yet been demonstrated.

10:07.840 --> 10:09.600
 And the question is, how much can you

10:09.600 --> 10:12.440
 push a purely machine learning approach?

10:12.440 --> 10:15.480
 And of course, Tesla also has an additional constraint

10:15.480 --> 10:18.520
 that they don't have all the sensors.

10:18.520 --> 10:21.120
 I know that they think it's foolish to use LIDARS,

10:21.120 --> 10:25.920
 but that's clearly one less very valuable and reliable

10:25.920 --> 10:29.200
 source of input that they're foregoing, which

10:29.200 --> 10:32.440
 may also have consequences.

10:32.440 --> 10:33.840
 I think the advantage, of course,

10:33.840 --> 10:37.040
 is capturing data that no one has ever seen before.

10:37.040 --> 10:41.040
 And in some cases, such as computer vision and speech

10:41.040 --> 10:44.800
 recognition, I have seen Chinese companies accumulate data

10:44.800 --> 10:47.320
 that's not seen anywhere in the Western world,

10:47.320 --> 10:50.200
 and they have delivered superior results.

10:50.200 --> 10:53.720
 But then speech recognition and object recognition

10:53.720 --> 10:57.080
 are relatively suitable problems for deep learning

10:57.080 --> 11:02.440
 and don't have the potential need for the human intelligence

11:02.440 --> 11:04.440
 analytical planning elements.

11:04.440 --> 11:06.400
 And the same on the speech recognition side,

11:06.400 --> 11:09.440
 your intuition that speech recognition and the machine

11:09.440 --> 11:11.440
 learning approaches to speech recognition

11:11.440 --> 11:14.600
 won't take us to a conversational system that

11:14.600 --> 11:19.160
 can pass the Turing test, which is maybe akin to what

11:19.160 --> 11:20.040
 driving is.

11:20.040 --> 11:25.120
 So it needs to have something more than just simple

11:25.120 --> 11:27.480
 language understanding, simple language generation.

11:27.480 --> 11:32.000
 Roughly right, I would say that based on purely machine

11:32.000 --> 11:35.160
 learning approaches, it's hard to imagine.

11:35.160 --> 11:40.520
 It could lead to a full conversational experience

11:40.520 --> 11:44.600
 across arbitrary domains, which is akin to L5.

11:44.600 --> 11:46.920
 I'm a little hesitant to use the word Turing test,

11:46.920 --> 11:50.280
 because the original definition was probably too easy.

11:50.280 --> 11:52.320
 We can probably do that.

11:52.320 --> 11:55.280
 The spirit of the Turing test is what I was referring to.

11:55.280 --> 11:56.520
 Of course.

11:56.520 --> 11:59.400
 So you've had major leadership research positions

11:59.400 --> 12:01.640
 at Apple, Microsoft, Google.

12:01.640 --> 12:06.320
 So continuing on the discussion of America, Russia, Chinese soul

12:06.320 --> 12:10.520
 and culture and so on, what is the culture of Silicon

12:10.520 --> 12:16.400
 Valley in contrast to China and maybe US broadly?

12:16.400 --> 12:19.920
 And what is the unique culture of each of these three

12:19.920 --> 12:22.040
 major companies, in your view?

12:22.040 --> 12:25.120
 I think in aggregate, Silicon Valley companies,

12:25.120 --> 12:27.200
 we could probably include Microsoft in that,

12:27.200 --> 12:29.120
 even though they're not in the Valley,

12:29.120 --> 12:33.960
 is really dream big and have visionary goals

12:33.960 --> 12:37.920
 and believe that technology will conquer all

12:37.920 --> 12:42.240
 and also the self-confidence and the self-entitlement

12:42.240 --> 12:45.440
 that whatever they produce, the whole world should use

12:45.440 --> 12:47.240
 and must use.

12:47.240 --> 12:54.080
 And those are historically important, I think.

12:54.080 --> 12:59.120
 Steve Jobs's famous quote that he doesn't do focus groups.

12:59.120 --> 13:02.360
 He looks in the mirror and asks the person in the mirror,

13:02.360 --> 13:03.520
 what do you want?

13:03.520 --> 13:07.000
 And that really is an inspirational comment

13:07.000 --> 13:10.480
 that says the great company shouldn't just ask users

13:10.480 --> 13:13.240
 what they want, but develop something

13:13.240 --> 13:16.200
 that users will know they want when they see it,

13:16.200 --> 13:18.960
 but they could never come up with themselves.

13:18.960 --> 13:23.880
 I think that is probably the most exhilarating description

13:23.880 --> 13:26.560
 of what the essence of Silicon Valley is,

13:26.560 --> 13:31.840
 that this brilliant idea could cause you to build something

13:31.840 --> 13:35.520
 that couldn't come out of the focus groups or A/B tests.

13:35.520 --> 13:38.040
 And iPhone would be an example of that.

13:38.040 --> 13:40.560
 No one in the age of BlackBerry would write down

13:40.560 --> 13:43.720
 they want an iPhone or multi-touch. A browser

13:43.720 --> 13:44.800
 might be another example.

13:44.800 --> 13:47.520
 No one would say they want that in the days of FTP,

13:47.520 --> 13:49.440
 but once they see it, they want it.

13:49.440 --> 13:55.680
 So I think that is what Silicon Valley is best at.

13:55.680 --> 13:58.920
 But it also came with a lot of success.

13:58.920 --> 14:01.960
 These products became global platforms,

14:01.960 --> 14:05.080
 and there were basically no competitors anywhere.

14:05.080 --> 14:08.400
 And that has also led to a belief

14:08.400 --> 14:13.240
 that these are the only things that one should do,

14:13.240 --> 14:17.960
 that companies should not tread on other companies territory,

14:17.960 --> 14:24.040
 so that a Groupon and a Yelp and an OpenTable

14:24.040 --> 14:26.240
 and a Grubhub would each feel,

14:26.240 --> 14:28.520
 okay, I'm not going to do the other companies' business

14:28.520 --> 14:33.280
 because that would not be the pride of innovating

14:33.280 --> 14:36.920
 what each of these four companies have innovated.

14:36.920 --> 14:42.720
 But I think the Chinese approach is do whatever it takes to win.

14:42.720 --> 14:45.000
 And it's a winner take all market.

14:45.000 --> 14:47.200
 And in fact, in the internet space,

14:47.200 --> 14:50.840
 the market leader will get predominantly all the value

14:50.840 --> 14:53.320
 extracted out of the system.

14:53.320 --> 14:59.600
 And the system isn't just defined as one narrow category,

14:59.600 --> 15:01.360
 but gets broader and broader.

15:01.360 --> 15:07.960
 So it's amazing ambition for success and domination

15:07.960 --> 15:11.760
 of increasingly larger product categories

15:11.760 --> 15:15.080
 leading to clear market winner status

15:15.080 --> 15:19.120
 and the opportunity to extract tremendous value.

15:19.120 --> 15:25.840
 And that develops a practical, result oriented,

15:25.840 --> 15:31.520
 ultra ambitious winner take all gladiatorial mentality.

15:31.520 --> 15:37.400
 And if what it takes is to build what the competitors built,

15:37.400 --> 15:41.920
 essentially a copycat, that can be done without infringing laws.

15:41.920 --> 15:46.280
 If what it takes is to satisfy a foreign country's need

15:46.280 --> 15:48.480
 by forking the code base and building something

15:48.480 --> 15:51.440
 that looks really ugly and different, they'll do it.

15:51.440 --> 15:56.280
 So it's contrasted very sharply with the Silicon Valley approach.

15:56.280 --> 16:00.080
 And I think the flexibility and the speed and execution

16:00.080 --> 16:01.960
 has helped the Chinese approach.

16:01.960 --> 16:05.040
 And I think the Silicon Valley approach

16:05.040 --> 16:10.280
 is potentially challenged if every Chinese entrepreneur is

16:10.280 --> 16:13.200
 learning from the whole world, US and China,

16:13.200 --> 16:16.280
 and the American entrepreneurs only look internally

16:16.280 --> 16:19.600
 and write off China as a copycat.

16:19.600 --> 16:22.880
 And the second part of your question about the three

16:22.880 --> 16:23.520
 companies.

16:23.520 --> 16:26.000
 The unique elements of the three companies, perhaps.

16:26.000 --> 16:26.840
 Yeah.

16:26.840 --> 16:33.080
 I think Apple represents wow the user, please the user,

16:33.080 --> 16:38.520
 and the essence of design and brand,

16:38.520 --> 16:44.080
 and it's the one company and perhaps the only tech company

16:44.080 --> 16:49.920
 that draws people with a strong, serious desire

16:49.920 --> 16:53.560
 for the product and the willingness to pay a premium

16:53.560 --> 16:57.160
 because of the halo effect of the brand, which

16:57.160 --> 17:00.960
 came from the attention to detail and great respect

17:00.960 --> 17:03.360
 for user needs.

17:03.360 --> 17:09.200
 Microsoft represents a platform approach

17:09.200 --> 17:14.280
 that builds giant products that become very strong moats

17:14.280 --> 17:17.680
 that others can't do because it's

17:17.680 --> 17:21.480
 well architected at the bottom level

17:21.480 --> 17:26.640
 and the work is efficiently delegated to individuals

17:26.640 --> 17:30.360
 and then the whole product is built

17:30.360 --> 17:33.560
 by adding small parts that sum together.

17:33.560 --> 17:37.760
 So it's probably the most effective high tech assembly

17:37.760 --> 17:40.480
 line that builds a very difficult product

17:40.480 --> 17:44.800
 that the whole process of doing that

17:44.800 --> 17:50.800
 is kind of a differentiation and something competitors

17:50.800 --> 17:52.480
 can't easily repeat.

17:52.480 --> 17:54.800
 Are there elements of the Chinese approach

17:54.800 --> 17:59.280
 in the way Microsoft went about assembling those little pieces

17:59.280 --> 18:03.920
 and essentially dominating the market for a long time?

18:03.920 --> 18:05.640
 Or do you see those as distinct?

18:05.640 --> 18:08.240
 I think there are elements that are the same.

18:08.240 --> 18:10.440
 I think the three American companies

18:10.440 --> 18:13.880
 that had or have Chinese characteristics,

18:13.880 --> 18:16.080
 and obviously as well as American characteristics,

18:16.080 --> 18:20.400
 are Microsoft, Facebook, and Amazon.

18:20.400 --> 18:21.720
 Yes, that's right, Amazon.

18:21.720 --> 18:25.560
 Because these are companies that will tenaciously

18:25.560 --> 18:31.320
 go after adjacent markets, build up strong product offerings,

18:31.320 --> 18:38.200
 and find ways to extract greater value from a sphere that's

18:38.200 --> 18:39.960
 ever increasing.

18:39.960 --> 18:43.520
 And they understand the value of the platforms.

18:43.520 --> 18:45.600
 So that's the similarity.

18:45.600 --> 18:53.760
 And then with Google, I think it's a genuinely value oriented

18:53.760 --> 18:56.960
 company that does have a heart and soul

18:56.960 --> 18:59.760
 and that wants to do great things for the world

18:59.760 --> 19:06.040
 by connecting information and that has also

19:06.040 --> 19:13.280
 very strong technology genes and wants to use technology

19:13.280 --> 19:19.080
 and has found out of the box ways to use technology

19:19.080 --> 19:23.680
 to deliver incredible value to the end user.

19:23.680 --> 19:25.240
 We can look at Google, for example.

19:25.240 --> 19:28.040
 You mentioned heart and soul.

19:28.040 --> 19:31.840
 There seems to be an element where Google

19:31.840 --> 19:34.840
 is after making the world better.

19:34.840 --> 19:36.520
 There's a more positive view.

19:36.520 --> 19:38.960
 I mean, they used to have the slogan, don't be evil.

19:38.960 --> 19:43.120
 And Facebook a little bit more has a negative tint to it,

19:43.120 --> 19:46.000
 at least in the perception of privacy and so on.

19:46.000 --> 19:51.280
 Do you have a sense of how these different companies can

19:51.280 --> 19:53.400
 achieve, because you've talked about how much

19:53.400 --> 19:55.600
 we can make the world better in all these kinds of ways

19:55.600 --> 19:59.360
 with AI, what is it about a company that can make,

19:59.360 --> 20:03.200
 give it a heart and soul, gain the trust of the public,

20:03.200 --> 20:08.000
 and just actually just not be evil and do good for the world?

20:08.000 --> 20:09.000
 It's really hard.

20:09.000 --> 20:13.120
 And I think Google has struggled with that.

20:13.120 --> 20:15.160
 First, the don't be evil

20:15.160 --> 20:18.880
 mantra is very dangerous, because every employee's

20:18.880 --> 20:20.800
 definition of evil is different.

20:20.800 --> 20:23.800
 And that has led to some difficult employee situations

20:23.800 --> 20:25.240
 for them.

20:25.240 --> 20:29.520
 So I don't necessarily think that's a good value statement.

20:29.520 --> 20:31.840
 But just watching the kinds of things

20:31.840 --> 20:36.440
 Google or its parent company Alphabet does in new areas

20:36.440 --> 20:40.440
 like health care, like eradicating mosquitoes,

20:40.440 --> 20:42.360
 things that are really not in the business

20:42.360 --> 20:45.040
 of an internet tech company, I think

20:45.040 --> 20:47.200
 that shows that there is a heart and soul

20:47.200 --> 20:53.920
 and desire to do good and willingness to put in the resources

20:53.920 --> 20:58.280
 to do something when they see it's good, they will pursue it.

20:58.280 --> 21:00.640
 That doesn't necessarily mean it has

21:00.640 --> 21:02.520
 all the trust of the users.

21:02.520 --> 21:06.400
 I realize while most people would view Facebook

21:06.400 --> 21:09.760
 as the primary target of their recent unhappiness

21:09.760 --> 21:12.720
 about Silicon Valley companies, many would put Google

21:12.720 --> 21:14.080
 in that category.

21:14.080 --> 21:16.800
 And some have named Google's business practices

21:16.800 --> 21:19.840
 as predatory also.

21:19.840 --> 21:24.240
 So it's kind of difficult to have the two parts of a body.

21:24.240 --> 21:28.080
 The brain wants to do what it's supposed to do for a shareholder,

21:28.080 --> 21:29.280
 maximize profit.

21:29.280 --> 21:30.880
 And then the heart and soul wants

21:30.880 --> 21:36.120
 to do good things that may run against what the brain wants to do.

21:36.120 --> 21:40.320
 So in this complex balancing that these companies have to do,

21:40.320 --> 21:44.520
 you've mentioned that you're concerned about a future where

21:44.520 --> 21:47.360
 too few companies like Google, Facebook, Amazon

21:47.360 --> 21:51.560
 are controlling our data or are controlling too much

21:51.560 --> 21:53.360
 of our digital lives.

21:53.360 --> 21:55.400
 Can you elaborate on this concern?

21:55.400 --> 21:58.640
 Perhaps do you have a better way forward?

21:58.640 --> 22:05.000
 I think I'm hardly the most vocal complainer of this.

22:05.000 --> 22:07.280
 There are a lot louder complainers out there.

22:07.280 --> 22:11.840
 I do observe that having a lot of data

22:11.840 --> 22:16.120
 does perpetuate their strength and limits

22:16.120 --> 22:19.400
 competition in many spaces.

22:19.400 --> 22:24.200
 But I also believe AI is much broader than the internet space.

22:24.200 --> 22:26.280
 So the entrepreneurial opportunities

22:26.280 --> 22:30.480
 still exist in using AI to empower

22:30.480 --> 22:34.160
 financial, retail, manufacturing, education,

22:34.160 --> 22:35.480
 applications.

22:35.480 --> 22:39.800
 So I don't think it's quite a case of full monopolistic dominance

22:39.800 --> 22:43.960
 that totally stifles innovation.

22:43.960 --> 22:46.400
 But I do believe in their areas of strength

22:46.400 --> 22:49.760
 it's hard to dislodge them.

22:49.760 --> 22:53.280
 I don't know if I have a good solution.

22:53.280 --> 22:57.160
 Probably the best solution is let the entrepreneurial VC

22:57.160 --> 23:00.840
 ecosystem work well and find all the places that

23:00.840 --> 23:04.200
 can create the next Google, the next Facebook.

23:04.200 --> 23:08.560
 So there will always be increasing number of challengers.

23:08.560 --> 23:11.360
 In some sense, that has happened a little bit.

23:11.360 --> 23:15.760
 You see Uber, Airbnb having emerged despite the strength

23:15.760 --> 23:19.040
 of the big three.

23:19.040 --> 23:22.400
 And I think China as an environment

23:22.400 --> 23:25.280
 may be more interesting for the emergence.

23:25.280 --> 23:28.920
 Because if you look at companies between, let's say,

23:28.920 --> 23:36.320
 $50 and $300 billion, China has emerged more of such companies

23:36.320 --> 23:39.880
 than the US in the last three to four years.

23:39.880 --> 23:42.120
 Because of the larger marketplace,

23:42.120 --> 23:47.000
 because of the more fearless nature of the entrepreneurs.

23:47.000 --> 23:50.840
 And the Chinese giants are just as powerful as American ones.

23:50.840 --> 23:52.920
 Tencent and Alibaba are very strong.

23:52.920 --> 23:57.040
 But ByteDance has emerged, worth $75 billion.

23:57.040 --> 24:00.120
 And Ant Financial, while it's Alibaba affiliated,

24:00.120 --> 24:03.920
 it's nevertheless independent and worth $150 billion.

24:03.920 --> 24:08.280
 And so I do think if we start to extend

24:08.280 --> 24:12.640
 to traditional businesses, we will see very valuable companies.

24:12.640 --> 24:18.120
 So it's probably not the case that in five or 10 years,

24:18.120 --> 24:20.920
 we'll still see the whole world with these five companies

24:20.920 --> 24:22.680
 having such dominance.

24:22.680 --> 24:26.040
 So you've mentioned a couple of times

24:26.040 --> 24:27.840
 this fascinating world of entrepreneurship

24:27.840 --> 24:31.080
 in China of the fearless nature of the entrepreneurs.

24:31.080 --> 24:32.640
 So can you maybe talk a little bit

24:32.640 --> 24:35.520
 about what it takes to be an entrepreneur in China?

24:35.520 --> 24:38.240
 What are the strategies that are undertaken?

24:38.240 --> 24:41.120
 What are the ways that you achieve success?

24:41.120 --> 24:43.960
 What is the dynamic of VC funding,

24:43.960 --> 24:46.480
 of the way the government helps companies, and so on?

24:46.480 --> 24:49.520
 What are the interesting aspects here

24:49.520 --> 24:52.880
 that are different from the Silicon Valley world

24:52.880 --> 24:55.240
 of entrepreneurship?

24:55.240 --> 24:58.080
 Well, many of the listeners probably

24:58.080 --> 25:03.000
 still would brand Chinese entrepreneurs as copycats.

25:03.000 --> 25:06.120
 And no doubt, 10 years ago, that would not

25:06.120 --> 25:09.080
 be an inaccurate description.

25:09.080 --> 25:12.320
 Back 10 years ago, an entrepreneur probably

25:12.320 --> 25:14.840
 could not get funding if he or she could not

25:14.840 --> 25:20.400
 describe what product he or she is copying from the US.

25:20.400 --> 25:23.520
 The first question is, who has proven this business model,

25:23.520 --> 25:27.200
 which is a nice way of asking, who are you copying?

25:27.200 --> 25:29.520
 And that reason is understandable,

25:29.520 --> 25:34.840
 because China had a much lower internet penetration

25:34.840 --> 25:40.920
 and didn't have enough indigenous experience

25:40.920 --> 25:43.200
 to build innovative products.

25:43.200 --> 25:47.600
 And secondly, internet was emerging.

25:47.600 --> 25:49.800
 Lean startup was the way to do things,

25:49.800 --> 25:52.920
 building a first minimum viable product,

25:52.920 --> 25:55.320
 and then expanding was the right way to go.

25:55.320 --> 25:59.480
 And the American successes have given a shortcut

25:59.480 --> 26:02.840
 that if you build your minimum viable product based

26:02.840 --> 26:05.040
 on an American product, it's guaranteed

26:05.040 --> 26:06.720
 to be a decent starting point.

26:06.720 --> 26:08.400
 Then you tweak it afterwards.

26:08.400 --> 26:11.720
 So as long as there is no IP infringement, which,

26:11.720 --> 26:15.080
 as far as I know, there hasn't been in the mobile and AI

26:15.080 --> 26:19.360
 spaces, that's a much better shortcut.

26:19.360 --> 26:23.720
 And I think Silicon Valley would view that as still not

26:23.720 --> 26:29.200
 very honorable, because that's not your own idea to start with.

26:29.200 --> 26:32.600
 But you can't really, at the same time,

26:32.600 --> 26:35.160
 believe every idea must be your own

26:35.160 --> 26:38.120
 and believe in the lean startup methodology,

26:38.120 --> 26:41.880
 because lean startup is intended to try many, many things

26:41.880 --> 26:44.240
 and then converge on what works.

26:44.240 --> 26:46.720
 And it's meant to be iterated and changed.

26:46.720 --> 26:51.240
 So finding a decent starting point without legal violations,

26:51.240 --> 26:55.520
 there should be nothing morally dishonorable about that.

26:55.520 --> 26:57.080
 So just a quick pause on that.

26:57.080 --> 27:01.920
 It's fascinating. Why is that not honorable, right?

27:01.920 --> 27:04.680
 It's exactly as you formulated.

27:04.680 --> 27:08.040
 It seems like a perfect start for business

27:08.040 --> 27:12.440
 is to take a look at Amazon and say, OK,

27:12.440 --> 27:14.560
 we'll do exactly what Amazon is doing.

27:14.560 --> 27:16.800
 Let's start there in this particular market.

27:16.800 --> 27:20.520
 And then let's out innovate them from that starting point.

27:20.520 --> 27:22.200
 Yes. Come up with new ways.

27:22.200 --> 27:26.520
 I mean, is it wrong to be, except the word copycat just

27:26.520 --> 27:28.800
 sounds bad, but is it wrong to be a copycat?

27:28.800 --> 27:31.640
 It just seems like a smart strategy.

27:31.640 --> 27:35.800
 But yes, it doesn't have a heroic nature to it

27:35.800 --> 27:42.280
 that Steve Jobs, Elon Musk sort of have in

27:42.280 --> 27:43.880
 coming up with something completely new.

27:43.880 --> 27:45.480
 Yeah, I like the way you describe it.

27:45.480 --> 27:50.440
 It's a nonheroic, acceptable way to start the company.

27:50.440 --> 27:52.840
 And maybe more expedient.

27:52.840 --> 27:58.920
 So that's, I think, a baggage for Silicon Valley,

27:58.920 --> 28:01.320
 that if it doesn't let go, then it

28:01.320 --> 28:05.160
 may limit the ultimate ceiling of the company.

28:05.160 --> 28:07.200
 Take Snapchat as an example.

28:07.200 --> 28:09.840
 I think Evan's brilliant.

28:09.840 --> 28:11.480
 He built a great product.

28:11.480 --> 28:14.160
 But he's very proud that he wants

28:14.160 --> 28:16.800
 to build his own features, not copy others.

28:16.800 --> 28:21.000
 While Facebook was more willing to copy his features,

28:21.000 --> 28:23.440
 and you see what happens in the competition.

28:23.440 --> 28:27.440
 So I think putting that handcuff on the company

28:27.440 --> 28:31.560
 would limit its ability to reach the maximum potential.

28:31.560 --> 28:33.800
 So back to the Chinese environment,

28:33.800 --> 28:38.400
 copying was merely a way to learn from the American masters.

28:38.400 --> 28:43.480
 Just like when we learn to play piano or to paint,

28:43.480 --> 28:44.560
 you start by copying.

28:44.560 --> 28:46.160
 You don't start by innovating when

28:46.160 --> 28:48.200
 you don't have the basic skill sets.

28:48.200 --> 28:51.040
 So very amazingly, the Chinese entrepreneurs

28:51.040 --> 28:56.160
 about six years ago started to branch off

28:56.160 --> 28:59.520
 with these lean startups built on American ideas

28:59.520 --> 29:02.280
 to build better products than American products.

29:02.280 --> 29:04.960
 But they did start from the American idea.

29:04.960 --> 29:08.600
 And today, WeChat is better than WhatsApp.

29:08.600 --> 29:10.520
 Weibo is better than Twitter.

29:10.520 --> 29:12.920
 Zhihu is better than Quora and so on.

29:12.920 --> 29:17.000
 So that, I think, is Chinese entrepreneurs

29:17.000 --> 29:18.480
 going to step two.

29:18.480 --> 29:21.760
 And then step three is once these entrepreneurs have

29:21.760 --> 29:23.720
 done one or two of these companies,

29:23.720 --> 29:27.400
 they now look at the Chinese market and the opportunities

29:27.400 --> 29:30.600
 and come up with ideas that didn't exist elsewhere.

29:30.600 --> 29:36.320
 So products like Ant Financial, which includes Alipay,

29:36.320 --> 29:42.080
 which is mobile payments, and also the financial products

29:42.080 --> 29:48.560
 for loans built on that, and also in education, VIPKid,

29:48.560 --> 29:54.880
 and in social video, social network, TikTok,

29:54.880 --> 29:58.640
 and in social eCommerce, Pinduoduo,

29:58.640 --> 30:01.720
 and then in bike sharing, Mobike.

30:01.720 --> 30:05.640
 These are all Chinese innovative products

30:05.640 --> 30:08.720
 that now are being copied elsewhere.

30:08.720 --> 30:13.040
 So an additional interesting observation

30:13.040 --> 30:16.000
 is some of these products are built on unique Chinese

30:16.000 --> 30:19.360
 demographics, which may not work in the US,

30:19.360 --> 30:23.160
 but may work very well in Southeast Asia, Africa,

30:23.160 --> 30:27.840
 and other developing worlds that are a few years behind China.

30:27.840 --> 30:31.040
 And a few of these products maybe are universal

30:31.040 --> 30:33.760
 and are getting traction even in the United States,

30:33.760 --> 30:35.360
 such as TikTok.

30:35.360 --> 30:42.080
 So this whole ecosystem is supported by VCs

30:42.080 --> 30:44.920
 as a virtuous cycle, because a large market

30:44.920 --> 30:49.400
 with innovative entrepreneurs will draw a lot of money

30:49.400 --> 30:51.560
 and then invest in these companies.

30:51.560 --> 30:54.480
 As the market gets larger and larger,

30:54.480 --> 30:58.400
 the China market is easily three, four times larger than the US.

30:58.400 --> 31:01.120
 They will create greater value and greater returns

31:01.120 --> 31:05.400
 for the VCs, thereby raising even more money.

31:05.400 --> 31:10.000
 So at Sinovation Ventures, our first fund was $15 million.

31:10.000 --> 31:12.040
 Our last fund was $500 million.

31:12.040 --> 31:16.520
 So it reflects the valuation of the companies

31:16.520 --> 31:19.840
 and us going multi-stage and things like that.

31:19.840 --> 31:23.840
 It also has government support, but not

31:23.840 --> 31:26.080
 in the way most Americans would think of it.

31:26.080 --> 31:29.520
 The government actually leaves the entrepreneurial space

31:29.520 --> 31:33.200
 to private enterprise, so it's self-regulating.

31:33.200 --> 31:36.200
 And the government would build infrastructure

31:36.200 --> 31:39.320
 around it to make it work better.

31:39.320 --> 31:41.960
 For example, the mass entrepreneurship and mass innovation

31:41.960 --> 31:44.880
 plan built 8,000 incubators.

31:44.880 --> 31:48.360
 So the pipeline to the VCs is very strong.

31:48.360 --> 31:49.680
 For autonomous vehicles,

31:49.680 --> 31:53.280
 the Chinese government is building smart highways

31:53.280 --> 31:56.680
 with sensors, smart cities that separate pedestrians

31:56.680 --> 32:01.560
 from cars, which may initially allow an inferior autonomous

32:01.560 --> 32:05.760
 vehicle company to launch a car

32:05.760 --> 32:11.520
 with lower casualties, because the roads or the city is smart.

32:11.520 --> 32:13.800
 And the Chinese government at local levels

32:13.800 --> 32:17.360
 would have these guiding funds acting as LPs,

32:17.360 --> 32:19.400
 passive LPs to funds.

32:19.400 --> 32:23.240
 And when the fund makes money, part of the money made

32:23.240 --> 32:27.280
 is given back to the GPs and potentially other LPs

32:27.280 --> 32:31.960
 to increase everybody's return at the expense

32:31.960 --> 32:33.680
 of the government's return.

32:33.680 --> 32:36.360
 So that's an interesting incentive

32:36.360 --> 32:41.640
 that entrusts the task of choosing entrepreneurs to VCs

32:41.640 --> 32:43.800
 who are better at it than the government

32:43.800 --> 32:46.680
 by letting some of the profits move that way.
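
 The mechanism described above can be made concrete with a small numerical sketch. This is purely illustrative: the fund size, multiple, ownership split, giveback rate, and carry below are hypothetical numbers, not figures from this conversation.

```python
# Hypothetical sketch of a guiding-fund return reallocation.
# All numbers (fund size, multiple, splits, giveback rate) are made up.

def guiding_fund_payouts(fund_size, gross_multiple, gov_share,
                         giveback_rate, carry=0.20):
    """Split fund profit when the government LP gives back part of its
    profit to the GP and the private LPs."""
    profit = fund_size * (gross_multiple - 1.0)

    gp = profit * carry                    # GP's standard carried interest
    lp = profit - gp                       # remaining profit for all LPs
    gov = lp * gov_share                   # government LP's pro-rata share
    private = lp * (1.0 - gov_share)       # private LPs' pro-rata share

    giveback = gov * giveback_rate         # government forgoes part of its return
    gov -= giveback
    gp += giveback * 0.5                   # assumed 50/50 split of the giveback
    private += giveback * 0.5

    return {"GP": gp, "government LP": gov, "private LPs": private}

print(guiding_fund_payouts(fund_size=100e6, gross_multiple=3.0,
                           gov_share=0.3, giveback_rate=0.5))
```

 Run with the numbers above, the government LP ends up with less than its pro-rata share of the profit while the GP and the private LPs end up with more, which is the incentive being described.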

32:46.680 --> 32:48.720
 So this is really fascinating, right?

32:48.720 --> 32:51.800
 So I look at the Russian government as a case study

32:51.800 --> 32:54.480
 where, let me put it this way, there

32:54.480 --> 32:58.520
 is no such government driven, large scale

32:58.520 --> 33:00.840
 support of entrepreneurship.

33:00.840 --> 33:04.000
 And probably the same is true in the United States.

33:04.000 --> 33:07.640
 But the entrepreneurs themselves kind of find a way.

33:07.640 --> 33:11.680
 So maybe in a form of advice or explanation,

33:11.680 --> 33:15.560
 how did the Chinese government arrive to be this way,

33:15.560 --> 33:17.680
 so supportive on entrepreneurship,

33:17.680 --> 33:21.520
 to be in this particular way so forward thinking

33:21.520 --> 33:23.120
 at such a large scale?

33:23.120 --> 33:28.280
 And also perhaps, how can we copy it in other countries?

33:28.280 --> 33:29.800
 How can we encourage other governments,

33:29.800 --> 33:31.600
 like even the United States government,

33:31.600 --> 33:33.760
 to support infrastructure for autonomous vehicles

33:33.760 --> 33:36.040
 in that same kind of way, perhaps?

33:36.040 --> 33:36.680
 Yes.

33:36.680 --> 33:44.440
 So these techniques are the result of several key things,

33:44.440 --> 33:46.480
 some of which may be learnable, some of which

33:46.480 --> 33:48.440
 may be very hard.

33:48.440 --> 33:51.080
 One is just trial and error and watching

33:51.080 --> 33:52.960
 what everyone else is doing.

33:52.960 --> 33:54.960
 I think it's important to be humble and not

33:54.960 --> 33:56.920
 feel like you know all the answers.

33:56.920 --> 33:59.480
 The guiding funds idea came from Singapore,

33:59.480 --> 34:01.440
 which came from Israel.

34:01.440 --> 34:06.080
 And China made a few tweaks and adapted it,

34:06.080 --> 34:09.600
 because the Chinese cities and government officials kind

34:09.600 --> 34:11.320
 of compete with each other.

34:11.320 --> 34:14.640
 Because they all want to make their city more successful,

34:14.640 --> 34:20.280
 so they can get to the next level in their political career.

34:20.280 --> 34:22.320
 And it's somewhat competitive.

34:22.320 --> 34:25.200
 So the central government made it a bit of a competition.

34:25.200 --> 34:26.840
 Everybody has a budget.

34:26.840 --> 34:29.840
 They can put it on AI, or they can put it on bio,

34:29.840 --> 34:32.200
 or they can put it on energy.

34:32.200 --> 34:35.040
 And then whoever gets the results, the city shines,

34:35.040 --> 34:38.000
 the people are better off, the mayor gets a promotion.

34:38.000 --> 34:41.680
 So it's kind of almost like an entrepreneurial

34:41.680 --> 34:44.840
 environment for local governments

34:44.840 --> 34:47.480
 to see who can do a better job.

34:47.480 --> 34:52.440
 And also, many of them tried different experiments.

34:52.440 --> 34:58.440
 Some have given awards to very smart researchers,

34:58.440 --> 35:00.840
 just give them money and hope they'll start a company.

35:00.840 --> 35:05.840
 Some have given money to academic research labs,

35:05.840 --> 35:08.440
 maybe government research labs, to see

35:08.440 --> 35:11.920
 if they can spin off some companies from the science

35:11.920 --> 35:14.040
 lab or something like that.

35:14.040 --> 35:17.080
 Some have tried to recruit overseas Chinese

35:17.080 --> 35:18.960
 to come back and start companies.

35:18.960 --> 35:20.960
 And they've had mixed results.

35:20.960 --> 35:23.400
 The one that worked the best was the guiding funds.

35:23.400 --> 35:25.840
 So it's almost like a lean startup idea

35:25.840 --> 35:29.160
 where people try different things, and what works sticks,

35:29.160 --> 35:30.600
 and everybody copies.

35:30.600 --> 35:32.880
 So now every city has a guiding fund.

35:32.880 --> 35:35.680
 So that's how that came about.

35:35.680 --> 35:40.400
 The autonomous vehicle and the massive spending

35:40.400 --> 35:46.080
 on highways and smart cities, that's a Chinese way.

35:46.080 --> 35:49.480
 It's about building infrastructure to facilitate.

35:49.480 --> 35:52.840
 It's a clear division of the government's responsibility

35:52.840 --> 35:55.400
 from the market.

35:55.400 --> 36:00.560
 The market should do everything in a private, free way.

36:00.560 --> 36:02.920
 But there are things the market can't afford to do,

36:02.920 --> 36:04.520
 like infrastructure.

36:04.520 --> 36:08.000
 So the government always appropriates

36:08.000 --> 36:12.000
 large amounts of money for infrastructure building.

36:12.000 --> 36:16.880
 This happens with not only autonomous vehicle and AI,

36:16.880 --> 36:20.840
 but happened with the 3G and 4G.

36:20.840 --> 36:25.320
 You'll find that the Chinese wireless reception

36:25.320 --> 36:28.760
 is better than in the US, because of massive spending that

36:28.760 --> 36:30.720
 tries to cover the whole country.

36:30.720 --> 36:34.360
 Whereas in the US, it may be a little spotty.

36:34.360 --> 36:36.160
 It's government driven, because I think

36:36.160 --> 36:44.120
 they view the coverage of cell access and 3G, 4G access

36:44.120 --> 36:47.080
 to be a governmental infrastructure spending,

36:47.080 --> 36:49.880
 as opposed to capitalistic.

36:49.880 --> 36:52.160
 So of course, the state-owned enterprises

36:52.160 --> 36:55.000
 are also publicly traded, but they also

36:55.000 --> 36:57.720
 carry a government responsibility

36:57.720 --> 37:00.240
 to deliver infrastructure to all.

37:00.240 --> 37:01.880
 So it's a different way of thinking

37:01.880 --> 37:05.400
 that may be very hard to inject into Western countries

37:05.400 --> 37:09.280
 to say starting tomorrow, bandwidth infrastructure

37:09.280 --> 37:13.840
 and highways are going to be governmental spending

37:13.840 --> 37:16.240
 with some characteristics.

37:16.240 --> 37:18.240
 What's your sense, and sorry to interrupt,

37:18.240 --> 37:21.680
 but because it's such a fascinating point,

37:21.680 --> 37:25.600
 do you think on the autonomous vehicle space

37:25.600 --> 37:30.120
 it's possible to solve the problem of full autonomy

37:30.120 --> 37:34.040
 without significant investment in infrastructure?

37:34.040 --> 37:36.400
 Well, that's really hard to speculate.

37:36.400 --> 37:38.960
 I think it's not a yes, no question,

37:38.960 --> 37:41.920
 but how long does it take question?

37:41.920 --> 37:45.120
 15 years, 30 years, 45 years.

37:45.120 --> 37:48.960
 Clearly with infrastructure augmentation,

37:48.960 --> 37:52.320
 whether it's roads, the city, or whole-city planning,

37:52.320 --> 37:56.440
 building a new city, I'm sure that will accelerate

37:56.440 --> 37:59.040
 the day of the L5.

37:59.040 --> 38:01.520
 I'm not knowledgeable enough, and it's

38:01.520 --> 38:03.920
 hard to predict even when we're knowledgeable,

38:03.920 --> 38:07.120
 because a lot of it is speculative.

38:07.120 --> 38:09.800
 But in the US, I don't think people

38:09.800 --> 38:13.240
 would consider building a new city the size of Chicago

38:13.240 --> 38:15.920
 to make it the AI slash autonomous city.

38:15.920 --> 38:18.840
 There are smaller ones being built, I'm aware of that.

38:18.840 --> 38:21.280
 But is infrastructure spend really

38:21.280 --> 38:23.720
 impossible for the US or Western countries?

38:23.720 --> 38:25.680
 I don't think so.

38:25.680 --> 38:28.920
 The US highway system was built.

38:28.920 --> 38:31.960
 Was that during President Eisenhower or Kennedy?

38:31.960 --> 38:33.160
 Eisenhower, yeah.

38:33.160 --> 38:38.960
 So maybe historians can study how President Eisenhower

38:38.960 --> 38:42.960
 got the resources to build this massive infrastructure that

38:42.960 --> 38:47.560
 surely gave the US a tremendous amount of prosperity

38:47.560 --> 38:50.800
 over the next decade, if not century.

38:50.800 --> 38:53.240
 If I may comment on that, then, it

38:53.240 --> 38:54.880
 takes us to artificial intelligence

38:54.880 --> 38:58.080
 a little bit, because in order to build infrastructure,

38:58.080 --> 39:00.520
 it creates a lot of jobs.

39:00.520 --> 39:02.840
 So I'd actually be interested in what you

39:02.840 --> 39:06.120
 would say. You're talking in your book about all kinds

39:06.120 --> 39:08.960
 of jobs that could and could not be automated.

39:08.960 --> 39:12.000
 I wonder if building infrastructure

39:12.000 --> 39:15.720
 is one of the jobs that would not be easily automated,

39:15.720 --> 39:18.160
 something you can think about, because I think you've mentioned

39:18.160 --> 39:21.160
 somewhere in a talk that there

39:21.160 --> 39:24.280
 might be, as jobs are being automated,

39:24.280 --> 39:28.160
 a role for government to create jobs that can't be automated.

39:28.160 --> 39:31.040
 Yes, I think that's a possibility.

39:31.040 --> 39:34.280
 Back in the last financial crisis,

39:34.280 --> 39:40.320
 China put in a lot of money to basically give the economy

39:40.320 --> 39:45.520
 a boost, and a lot of it went into infrastructure building.

39:45.520 --> 39:49.920
 And I think that's a legitimate way, at the government level,

39:49.920 --> 39:55.680
 to deal with the employment issues as well as build out

39:55.680 --> 39:58.960
 the infrastructure, as long as the infrastructures are truly

39:58.960 --> 40:03.160
 needed, and as long as there is an employment problem, which

40:03.160 --> 40:04.960
 we don't know.

40:04.960 --> 40:07.920
 So maybe taking a little step back,

40:07.920 --> 40:12.840
 you've been a leader and a researcher in AI

40:12.840 --> 40:16.200
 for several decades, at least 30 years,

40:16.200 --> 40:21.040
 so how has AI changed in the West and the East

40:21.040 --> 40:23.120
 as you've observed, as you've been deep in it

40:23.120 --> 40:25.120
 over the past 30 years?

40:25.120 --> 40:28.520
 Well, AI began as the pursuit of understanding

40:28.520 --> 40:34.160
 human intelligence, and the term itself represents that.

40:34.160 --> 40:37.680
 But it kind of drifted into the one subarea that

40:37.680 --> 40:40.880
 worked extremely well, which is machine intelligence.

40:40.880 --> 40:45.080
 And that's actually more using pattern recognition techniques

40:45.080 --> 40:51.280
 to basically do incredibly well on a limited domain,

40:51.280 --> 40:54.840
 large amounts of data, but relatively simple kinds

40:54.840 --> 40:58.720
 of planning tasks, and not very creative.

40:58.720 --> 41:02.480
 So we didn't end up building human intelligence.

41:02.480 --> 41:04.760
 We built a different machine that

41:04.760 --> 41:08.040
 was a lot better than us on some problems,

41:08.040 --> 41:11.840
 but nowhere close to us on other problems.

41:11.840 --> 41:14.200
 So today, I think a lot of people still

41:14.200 --> 41:18.080
 misunderstand when we say artificial intelligence

41:18.080 --> 41:20.720
 and what various products can do.

41:20.720 --> 41:24.160
 People still think it's about replicating human intelligence.

41:24.160 --> 41:26.160
 But the products out there really

41:26.160 --> 41:31.680
 are closer to having invented the internet or the spreadsheet

41:31.680 --> 41:35.360
 or the database and getting broader adoption.

41:35.360 --> 41:38.400
 And speaking further to the fears, near term fears

41:38.400 --> 41:41.240
 that people have about AI, so you're commenting

41:41.240 --> 41:45.680
 on the general intelligence that people

41:45.680 --> 41:48.040
 in the popular culture from sci fi movies

41:48.040 --> 41:50.920
 have a sense of, but there are practical fears

41:50.920 --> 41:54.800
 about AI, the kind of narrow AI that you're talking about

41:54.800 --> 41:57.280
 of automating particular kinds of jobs,

41:57.280 --> 41:59.400
 and you talk about them in the book.

41:59.400 --> 42:01.520
 So what are the kinds of jobs in your view

42:01.520 --> 42:04.840
 that you see in the next five, 10 years beginning

42:04.840 --> 42:09.240
 to be automated by AI systems and algorithms?

42:09.240 --> 42:13.000
 Yes, this is also maybe a little bit counterintuitive

42:13.000 --> 42:15.440
 because it's the routine jobs that

42:15.440 --> 42:18.360
 will be displaced the soonest.

42:18.360 --> 42:23.120
 And they may not be displaced entirely, maybe 50%, 80%

42:23.120 --> 42:26.320
 of a job, but when the workload drops by that much,

42:26.320 --> 42:28.760
 employment will come down.

42:28.760 --> 42:31.520
 And also another part of misunderstanding

42:31.520 --> 42:35.720
 is most people think of AI replacing routine jobs,

42:35.720 --> 42:38.760
 then they think of the assembly line, the workers.

42:38.760 --> 42:40.960
 Well, that will have some effects,

42:40.960 --> 42:44.600
 but it's actually the routine white collar workers that are

42:44.600 --> 42:49.280
 easiest to replace because to replace a white collar worker,

42:49.280 --> 42:50.720
 you just need software.

42:50.720 --> 42:53.120
 To replace a blue collar worker,

42:53.120 --> 42:57.200
 you need robotics, mechanical excellence,

42:57.200 --> 43:01.880
 and the ability to deal with dexterity,

43:01.880 --> 43:05.640
 and maybe even unknown environments, very, very difficult.

43:05.640 --> 43:11.200
 So if we were to categorize the most dangerous white collar

43:11.200 --> 43:15.600
 jobs, they would be things like back office,

43:15.600 --> 43:20.800
 people who copy and paste and deal with simple computer

43:20.800 --> 43:25.560
 programs and data, and maybe paper and OCR,

43:25.560 --> 43:29.000
 and they don't make strategic decisions,

43:29.000 --> 43:32.040
 they basically facilitate the process.

43:32.040 --> 43:34.680
 Where these software and paper systems don't work well together,

43:34.680 --> 43:40.520
 so you have people dealing with new employee orientation,

43:40.520 --> 43:45.400
 searching for past lawsuits and financial documents,

43:45.400 --> 43:49.800
 and doing reference checks, so basic searching and management

43:49.800 --> 43:52.800
 of data is the most in danger of being lost.

43:52.800 --> 43:56.440
 In addition to the white collar repetitive work,

43:56.440 --> 43:59.360
 a lot of simple interaction work can also

43:59.360 --> 44:02.840
 be taken care of, such as telesales, telemarketing,

44:02.840 --> 44:07.280
 customer service, as well as many physical jobs

44:07.280 --> 44:09.880
 that are in the same location and don't

44:09.880 --> 44:12.240
 require a high degree of dexterity,

44:12.240 --> 44:17.840
 so fruit picking, dishwashing, assembly line, inspection,

44:17.840 --> 44:20.360
 are jobs in that category.

44:20.360 --> 44:25.440
 So altogether, back office is a big part,

44:25.440 --> 44:29.840
 and the other, the blue collar may be smaller initially,

44:29.840 --> 44:32.560
 but over time, AI will get better.

44:32.560 --> 44:36.880
 And when we start to get, over the next 15, 20 years,

44:36.880 --> 44:39.120
 the ability to actually have the dexterity

44:39.120 --> 44:42.600
 of doing assembly line, that's a huge chunk of jobs.

44:42.600 --> 44:44.760
 And when autonomous vehicles start

44:44.760 --> 44:47.400
 to work initially starting with truck drivers,

44:47.400 --> 44:49.640
 but eventually to all drivers, that's

44:49.640 --> 44:52.040
 another huge group of workers.

44:52.040 --> 44:55.560
 So I see modest numbers in the next five years,

44:55.560 --> 44:58.080
 but increasing rapidly after that.

44:58.080 --> 45:01.240
 On the worry of the jobs that are in danger

45:01.240 --> 45:04.320
 and the gradual loss of jobs, I'm not

45:04.320 --> 45:06.680
 sure if you're familiar with Andrew Yang.

45:06.680 --> 45:07.800
 Yes, I am.

45:07.800 --> 45:10.560
 So there's a candidate for president of the United States

45:10.560 --> 45:14.960
 whose platform is based, in part,

45:14.960 --> 45:17.680
 around job loss due to automation,

45:17.680 --> 45:21.120
 and also, in addition, the need, perhaps,

45:21.120 --> 45:26.120
 of universal basic income to support folks who

45:26.120 --> 45:28.560
 lose their jobs due to automation and so on,

45:28.560 --> 45:31.960
 and in general, support people under a complex,

45:31.960 --> 45:34.320
 unstable job market.

45:34.320 --> 45:36.720
 So what are your thoughts about his concerns,

45:36.720 --> 45:40.000
 him as a candidate, his ideas in general?

45:40.000 --> 45:44.600
 I think his thinking is generally in the right direction,

45:44.600 --> 45:48.440
 but his approach as a presidential candidate

45:48.440 --> 45:52.240
 may be a little bit ahead of its time.

45:52.240 --> 45:56.080
 I think the displacements will happen,

45:56.080 --> 45:58.280
 but will they happen soon enough for people

45:58.280 --> 46:00.480
 to agree to vote for him?

46:00.480 --> 46:03.760
 The unemployment numbers are not very high yet.

46:03.760 --> 46:07.600
 And I think he and I have the same challenge.

46:07.600 --> 46:11.520
 If I want to theoretically convince people this is an issue

46:11.520 --> 46:13.880
 and he wants to become the president,

46:13.880 --> 46:17.760
 people have to see how this can be the case when

46:17.760 --> 46:19.680
 unemployment numbers are low.

46:19.680 --> 46:21.360
 So that is the challenge.

46:21.360 --> 46:27.360
 And I think I do agree with him on the displacement issue.

46:27.360 --> 46:32.280
 On universal basic income, at a very vanilla level,

46:32.280 --> 46:36.800
 I don't agree with it, because I think the main issue

46:36.800 --> 46:38.320
 is retraining.

46:38.320 --> 46:43.200
 So people need to be incented not by just giving a monthly

46:43.200 --> 46:47.160
 $2,000 check or $1,000 check to do whatever they want,

46:47.160 --> 46:50.920
 because they don't have the know how

46:50.920 --> 46:56.840
 to know what to retrain for, or what type of job to go into,

46:56.840 --> 46:58.640
 and guidance is needed.

46:58.640 --> 47:01.720
 And retraining is needed because historically

47:01.720 --> 47:05.080
 in technology revolutions, when routine jobs were displaced,

47:05.080 --> 47:06.920
 new routine jobs came up.

47:06.920 --> 47:09.400
 So there was always room for that.

47:09.400 --> 47:12.640
 But with AI and automation, the whole point

47:12.640 --> 47:15.320
 is replacing all routine jobs eventually.

47:15.320 --> 47:17.840
 So there will be fewer and fewer routine jobs.

47:17.840 --> 47:22.640
 And AI will create jobs, but it won't create routine jobs

47:22.640 --> 47:24.840
 because if it creates routine jobs,

47:24.840 --> 47:26.880
 why wouldn't AI just do it?

47:26.880 --> 47:30.360
 So therefore, the people who are losing the jobs

47:30.360 --> 47:32.280
 are losing routine jobs.

47:32.280 --> 47:35.720
 The jobs that are becoming available are nonroutine jobs.

47:35.720 --> 47:39.320
 So the social stipend that needs to be put in place

47:39.320 --> 47:42.040
 is for the routine workers who lost their jobs

47:42.040 --> 47:46.120
 to be retrained maybe in six months, maybe in three years.

47:46.120 --> 47:48.560
 It takes a while to retrain for the nonroutine job

47:48.560 --> 47:51.360
 and then take on a job that will last

47:51.360 --> 47:53.400
 for that person's lifetime.

47:53.400 --> 47:56.160
 Now, having said that, if you look deeply

47:56.160 --> 47:58.240
 into Andrew's document, he does cater for that.

47:58.240 --> 48:03.240
 So I'm not disagreeing with what he's trying to do.

48:03.280 --> 48:06.360
 But for simplification, sometimes he just says UBI,

48:06.360 --> 48:08.760
 but simple UBI wouldn't work.

48:08.760 --> 48:10.600
 And I think you've mentioned elsewhere

48:10.600 --> 48:15.600
 that the goal isn't necessarily to give people enough money

48:15.760 --> 48:19.120
 to survive or live or even to prosper.

48:19.120 --> 48:22.800
 The point is to give them a job that gives them meaning.

48:22.800 --> 48:25.600
 That meaning is extremely important.

48:25.600 --> 48:28.600
 That our employment, at least in the United States

48:28.600 --> 48:31.200
 and perhaps across the world,

48:31.200 --> 48:34.600
 provides something that's, forgive me for saying,

48:34.600 --> 48:36.960
 greater than money, it provides meaning.

48:38.400 --> 48:43.400
 So now what kind of jobs do you think can't be automated?

48:44.840 --> 48:46.600
 You talk a little bit about creativity

48:46.600 --> 48:48.200
 and compassion in your book.

48:48.200 --> 48:50.720
 What aspects do you think it's difficult

48:50.720 --> 48:52.320
 to automate for an AI system?

48:52.320 --> 48:57.320
 Because an AI system is currently merely optimizing.

48:57.360 --> 49:00.120
 It's not able to reason, plan,

49:00.120 --> 49:02.920
 or think creatively or strategically.

49:02.920 --> 49:05.320
 It's not able to deal with complex problems.

49:05.320 --> 49:09.520
 It can't come up with a new problem and solve it.

49:09.520 --> 49:12.320
 A human needs to find the problem

49:12.320 --> 49:15.520
 and pose it as an optimization problem,

49:15.520 --> 49:17.520
 then have the AI work at it.
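
 As a loose illustration of that division of labor, here is a minimal sketch under made-up assumptions: the human chooses what to optimize (revenue under an assumed demand curve), and the machine only runs a numeric search over it. The pricing problem and the demand model are invented for this example.

```python
# Toy illustration: the human frames the objective, the machine only searches it.
# The demand curve and its parameters are invented for this example.

def revenue(price):
    demand = max(0.0, 1000.0 - 8.0 * price)   # assumed linear demand
    return price * demand

# A plain grid search stands in for "the AI working at it".
candidate_prices = [p * 0.5 for p in range(0, 500)]
best_price = max(candidate_prices, key=revenue)
print(best_price, revenue(best_price))        # peaks around price = 62.5
```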

49:17.520 --> 49:21.320
 So an AI would have a very hard time

49:21.320 --> 49:23.320
 discovering a new drug

49:23.320 --> 49:26.320
 or discovering a new style of painting

49:27.320 --> 49:30.320
 or dealing with complex tasks

49:30.320 --> 49:32.320
 such as managing a company

49:32.320 --> 49:35.320
 that isn't just about optimizing the bottom line,

49:35.320 --> 49:39.320
 but also about employee satisfaction, corporate brand,

49:39.320 --> 49:40.320
 and many, many other things.

49:40.320 --> 49:44.320
 So that is one category of things.

49:44.320 --> 49:48.320
 And because these things are challenging, creative, complex,

49:48.320 --> 49:52.320
 doing them creates a higher degree of satisfaction

49:52.320 --> 49:55.320
 and therefore appealing to our desire for working,

49:55.320 --> 49:57.320
 which isn't just to make the money,

49:57.320 --> 49:58.320
 make the ends meet,

49:58.320 --> 50:00.320
 but also that we've accomplished something

50:00.320 --> 50:03.320
 that others maybe can't do or can't do as well.

50:04.320 --> 50:07.320
 Another type of job that is much more numerous

50:07.320 --> 50:09.320
 would be compassionate jobs,

50:09.320 --> 50:14.320
 jobs that require compassion, empathy, human touch, human trust.

50:14.320 --> 50:18.320
 AI can't do that because AI is cold, calculating,

50:18.320 --> 50:22.320
 and even if it can fake that to some extent,

50:22.320 --> 50:26.320
 it will make errors and that will make it look very silly.

50:26.320 --> 50:29.320
 And also, I think even if AI did okay,

50:29.320 --> 50:33.320
 people would want to interact with another person,

50:33.320 --> 50:38.320
 whether it's for some kind of a service or a teacher or a doctor

50:38.320 --> 50:41.320
 or a concierge or a masseuse or bartender.

50:41.320 --> 50:46.320
 There are so many jobs where people just don't want to interact

50:46.320 --> 50:49.320
 with a cold robot or software.

50:50.320 --> 50:53.320
 I've had an entrepreneur who built an elderly care robot

50:53.320 --> 50:58.320
 and they found that the elderly really only use it for customer service.

50:58.320 --> 51:00.320
 Not to get service for the product,

51:00.320 --> 51:05.320
 but they click on customer service and the video of a person comes up

51:05.320 --> 51:07.320
 and then the person says,

51:07.320 --> 51:11.320
 how come my daughter didn't call me? Let me show you a picture of her grandkids.

51:11.320 --> 51:15.320
 So people yearn for that people-to-people interaction.

51:15.320 --> 51:19.320
 So even if robots improved, people just don't want it.

51:19.320 --> 51:21.320
 And those jobs are going to be increasing

51:21.320 --> 51:24.320
 because AI will create a lot of value,

51:24.320 --> 51:29.320
 $16 trillion to the world in the next 11 years according to PwC,

51:29.320 --> 51:34.320
 and that will give people money to enjoy services,

51:34.320 --> 51:39.320
 whether it's eating a gourmet meal or tourism and traveling

51:39.320 --> 51:41.320
 or having concierge services.

51:41.320 --> 51:44.320
 The services revolving around, you know,

51:44.320 --> 51:47.320
 every dollar of that $16 trillion will be tremendous.

51:47.320 --> 51:52.320
 It will create more opportunities to service the people who did well

51:52.320 --> 51:55.320
 through AI with things.

51:55.320 --> 52:01.320
 But even at the same time, the entire society is very much short

52:01.320 --> 52:05.320
 of many service-oriented, compassion-oriented jobs.

52:05.320 --> 52:10.320
 The best example is probably in healthcare services.

52:10.320 --> 52:15.320
 There's going to be 2 million new jobs, not counting replacement,

52:15.320 --> 52:20.320
 just brand new incremental jobs in the next six years in healthcare services.

52:20.320 --> 52:24.320
 That includes nurses, orderlies in the hospital,

52:24.320 --> 52:29.320
 elderly care, and also at-home care.

52:29.320 --> 52:31.320
 It's particularly lacking.

52:31.320 --> 52:34.320
 And those jobs are not likely to be filled.

52:34.320 --> 52:36.320
 So there's likely to be a shortage.

52:36.320 --> 52:41.320
 And the reason they're not filled is simply because they don't pay very well

52:41.320 --> 52:47.320
 and that the social status of these jobs are not very good.

52:47.320 --> 52:52.320
 So they pay about half as much as a heavy equipment operator,

52:52.320 --> 52:55.320
 which will be replaced a lot sooner.

52:55.320 --> 52:59.320
 And they pay probably comparably to someone on the assembly line.

52:59.320 --> 53:03.320
 And so if we're ignoring all the other issues

53:03.320 --> 53:07.320
 and just think about satisfaction from one's job,

53:07.320 --> 53:11.320
 someone repetitively doing the same manual action at an assembly line,

53:11.320 --> 53:14.320
 that can't create a lot of job satisfaction.

53:14.320 --> 53:17.320
 But someone taking care of a sick person

53:17.320 --> 53:21.320
 and getting a hug and thank you from that person and the family,

53:21.320 --> 53:24.320
 I think is quite satisfying.

53:24.320 --> 53:28.320
 So if only we could fix the pay for service jobs,

53:28.320 --> 53:33.320
 there are plenty of jobs that require some training or a lot of training

53:33.320 --> 53:36.320
 for the people coming off the routine jobs to take.

53:36.320 --> 53:43.320
 We can easily imagine someone who was maybe a cashier at the grocery store,

53:43.320 --> 53:49.320
 as stores become automated, learning to become a nurse or an at-home care worker.

53:49.320 --> 53:54.320
 Also, I do want to point out the blue collar jobs are going to stay around a bit longer,

53:54.320 --> 53:57.320
 some of them quite a bit longer.

53:57.320 --> 54:01.320
 AI cannot be told, go clean an arbitrary home.

54:01.320 --> 54:03.320
 That's incredibly hard.

54:03.320 --> 54:07.320
 Arguably it's an L5 level of difficulty.

54:07.320 --> 54:09.320
 And then AI cannot be a good plumber,

54:09.320 --> 54:12.320
 because a plumber is almost like a mini detective

54:12.320 --> 54:15.320
 that has to figure out where the leak came from.

54:15.320 --> 54:22.320
 Yet AI probably can do assembly line work and be an auto mechanic and so on.

54:22.320 --> 54:26.320
 So one has to study which blue collar jobs are going away

54:26.320 --> 54:30.320
 and facilitate retraining for the people to go into the ones that won't go away

54:30.320 --> 54:32.320
 or maybe even will increase.

54:32.320 --> 54:39.320
 I mean, it is fascinating that it's easier to build a world champion chess player

54:39.320 --> 54:41.320
 than it is to build a mediocre plumber.

54:41.320 --> 54:43.320
 Yes, very true.

54:43.320 --> 54:47.320
 And that goes counter to a lot of people's understanding

54:47.320 --> 54:49.320
 of what artificial intelligence is.

54:49.320 --> 54:53.320
 So it sounds, I mean, you're painting a pretty optimistic picture

54:53.320 --> 54:56.320
 about retraining, about the number of jobs

54:56.320 --> 55:01.320
 and actually the meaningful nature of those jobs once we automate repetitive tasks.

55:01.320 --> 55:07.320
 So overall, are you optimistic about the future

55:07.320 --> 55:11.320
 where much of the repetitive tasks are automated,

55:11.320 --> 55:15.320
 that there is a lot of room for humans, for the compassionate,

55:15.320 --> 55:19.320
 for the creative input that only humans can provide?

55:19.320 --> 55:23.320
 I am optimistic if we start to take action.

55:23.320 --> 55:27.320
 If we have no action in the next five years,

55:27.320 --> 55:33.320
 I think it's going to be hard to deal with the devastating losses that will emerge.

55:33.320 --> 55:39.320
 So if we start thinking about retraining, maybe with the low hanging fruits,

55:39.320 --> 55:45.320
 explaining to vocational schools why they should train more plumbers than auto mechanics,

55:45.320 --> 55:53.320
 maybe starting with some government subsidy for corporations to have more training positions.

55:53.320 --> 55:57.320
 We start to explain to people why retraining is important.

55:57.320 --> 56:00.320
 We start to think about the future of education,

56:00.320 --> 56:04.320
 how that needs to be tweaked for the era of AI.

56:04.320 --> 56:06.320
 If we start to make incremental progress,

56:06.320 --> 56:09.320
 and the greater number of people understand,

56:09.320 --> 56:12.320
 then there's no reason to think we can't deal with this,

56:12.320 --> 56:16.320
 because this technological revolution is arguably similar to

56:16.320 --> 56:20.320
 what electricity, the industrial revolution, and the internet brought about.

56:20.320 --> 56:24.320
 Do you think there's a role for policy, for governments to step in

56:24.320 --> 56:27.320
 to help with policy to create a better world?

56:27.320 --> 56:32.320
 Absolutely, and the governments don't have to believe

56:32.320 --> 56:39.320
 that unemployment will go up, and they don't have to believe automation will be this fast to do something.

56:39.320 --> 56:42.320
 Revamping vocational school would be one example.

56:42.320 --> 56:47.320
 Another is if there's a big gap in healthcare service employment,

56:47.320 --> 56:54.320
 and we know that a country's population is growing older, with more longevity, living longer,

56:54.320 --> 56:59.320
 because people over 80 require five times as much care as those under 80,

56:59.320 --> 57:04.320
 then it is a good time to incent training programs for elderly care,

57:04.320 --> 57:07.320
 to find ways to improve the pay.

57:07.320 --> 57:13.320
 Maybe one way would be to offer as part of Medicare or the equivalent program

57:13.320 --> 57:18.320
 for people over 80 to be entitled to a few hours of elderly care at home,

57:18.320 --> 57:21.320
 and then that might be reimbursable,

57:21.320 --> 57:28.320
 and that will stimulate the service industry around the policy.

57:28.320 --> 57:32.320
 Do you have concerns about large entities,

57:32.320 --> 57:38.320
 whether it's governments or companies, controlling the future of AI development in general?

57:38.320 --> 57:40.320
 So we talked about companies.

57:40.320 --> 57:48.320
 Do you have a sense that governments can better represent the interest of the people

57:48.320 --> 57:54.320
 than companies, or do you believe companies are better at representing the interest of the people?

57:54.320 --> 57:56.320
 Or is there no easy answer?

57:56.320 --> 57:59.320
 I don't think there's an easy answer because it's a double edged sword.

57:59.320 --> 58:06.320
 The companies and governments can provide better services with more access to data and more access to AI,

58:06.320 --> 58:13.320
 but that also leads to greater power, which can lead to uncontrollable problems,

58:13.320 --> 58:17.320
 whether it's monopoly or corruption in the government.

58:17.320 --> 58:24.320
 So I think one has to be careful to look at how much data that companies and governments have,

58:24.320 --> 58:29.320
 and some kind of checks and balances would be helpful.

58:29.320 --> 58:33.320
 So again, I come from Russia.

58:33.320 --> 58:36.320
 There's something called the Cold War.

58:36.320 --> 58:40.320
 So let me ask a difficult question here, looking at conflict.

58:40.320 --> 58:45.320
 Steven Pinker wrote a great book showing that conflict all over the world is decreasing in general.

58:45.320 --> 58:51.320
 But having written the book AI Superpowers,

58:51.320 --> 58:57.320
 do you see a major international conflict potentially arising between major nations,

58:57.320 --> 59:02.320
 whatever they are, whether it's Russia, China, European nations, United States,

59:02.320 --> 59:09.320
 or others in the next 10, 20, 50 years around AI, around the digital space, cyber space?

59:09.320 --> 59:12.320
 Do you worry about that?

59:12.320 --> 59:19.320
 Is that something we need to think about and try to alleviate or prevent?

59:19.320 --> 59:22.320
 I believe in greater engagement.

59:22.320 --> 59:33.320
 A lot of the worries about more powerful AI are based on an arms race metaphor.

59:33.320 --> 59:41.320
 And when you extrapolate into military kinds of scenarios,

59:41.320 --> 59:48.320
 AI can automate autonomous weapons, which need to be controlled somehow.

59:48.320 --> 59:57.320
 And autonomous decision making can lead to not enough time to fix international crises.

59:57.320 --> 1:00:02.320
 So I actually believe a Cold War mentality would be very dangerous

1:00:02.320 --> 1:00:07.320
 because should two countries rely on AI to make certain decisions

1:00:07.320 --> 1:00:11.320
 and they don't even talk to each other, they do their own scenario planning,

1:00:11.320 --> 1:00:14.320
 then something could easily go wrong.

1:00:14.320 --> 1:00:24.320
 I think engagement, interaction, some protocols to avoid inadvertent disasters is actually needed.

1:00:24.320 --> 1:00:28.320
 So it's natural for each country to want to be the best,

1:00:28.320 --> 1:00:34.320
 whether it's in nuclear technologies or AI or bio.

1:00:34.320 --> 1:00:40.320
 But I think it's important to realize if each country has a black box AI

1:00:40.320 --> 1:00:48.320
 and they don't talk to each other, that probably presents greater challenges to humanity

1:00:48.320 --> 1:00:50.320
 than if they interacted.

1:00:50.320 --> 1:00:56.320
 I think there can still be competition, but with some degree of protocol for interaction.

1:00:56.320 --> 1:01:01.320
 Just like when there was a nuclear competition,

1:01:01.320 --> 1:01:07.320
 there were some protocols for deterrence among the US, Russia, and China.

1:01:07.320 --> 1:01:10.320
 And I think that engagement is needed.

1:01:10.320 --> 1:01:15.320
 So of course, we're still far from AI presenting that kind of danger.

1:01:15.320 --> 1:01:22.320
 But what I worry the most about is the level of engagement seems to be coming down.

1:01:22.320 --> 1:01:25.320
 The level of distrust seems to be going up,

1:01:25.320 --> 1:01:32.320
 especially from the US towards other large countries such as China and Russia.

1:01:32.320 --> 1:01:34.320
 Is there a way to make that better?

1:01:34.320 --> 1:01:40.320
 So that's beautifully put, level of engagement and even just basic trust and communication

1:01:40.320 --> 1:01:52.320
 as opposed to making artificial enemies out of particular countries.

1:01:52.320 --> 1:02:01.320
 Do you have a sense how we can make it better, actionable items that as a society we can take on?

1:02:01.320 --> 1:02:10.320
 I'm not an expert at geopolitics, but I would say that we look pretty foolish as humankind

1:02:10.320 --> 1:02:19.320
 when we are faced with the opportunity to create $16 trillion for humanity.

1:02:19.320 --> 1:02:29.320
 And yet we're not solving fundamental problems with parts of the world still in poverty.

1:02:29.320 --> 1:02:34.320
 And for the first time, we have the resources to overcome poverty and hunger.

1:02:34.320 --> 1:02:38.320
 We're not using it on that, but we're fueling competition among superpowers.

1:02:38.320 --> 1:02:41.320
 And that's a very unfortunate thing.

1:02:41.320 --> 1:02:54.320
 If we become utopian for a moment, imagine a benevolent world government that has this $16 trillion

1:02:54.320 --> 1:03:02.320
 and maybe some AI to figure out how to use it to deal with diseases and problems and hate and things like that.

1:03:02.320 --> 1:03:04.320
 World would be a lot better off.

1:03:04.320 --> 1:03:07.320
 So what is wrong with the current world?

1:03:07.320 --> 1:03:13.320
 I think the people with more skill than I should think about this.

1:03:13.320 --> 1:03:19.320
 And then the geopolitics issue with superpower competition is one side of the issue.

1:03:19.320 --> 1:03:29.320
 There's another side which I worry maybe even more, which is as the $16 trillion all gets made by U.S. and China

1:03:29.320 --> 1:03:34.320
 and a few of the other developed countries, the poorer country will get nothing

1:03:34.320 --> 1:03:42.320
 because they don't have the technology, and the wealth disparity and inequality will increase.

1:03:42.320 --> 1:03:50.320
 So a poorer country with a large population will not only fail to benefit from the AI boom or other technology booms,

1:03:50.320 --> 1:03:57.320
 but their workers, who previously had hoped they could follow the China model and do outsourced manufacturing,

1:03:57.320 --> 1:04:02.320
 or the India model and do outsourced processes or call centers,

1:04:02.320 --> 1:04:05.320
 will find all those jobs gone in 10 or 15 years.

1:04:05.320 --> 1:04:14.320
 So the individual citizen may be a net liability, I mean financially speaking, to a poorer country

1:04:14.320 --> 1:04:19.320
 and not an asset to claw itself out of poverty.

1:04:19.320 --> 1:04:29.320
 So in that kind of situation, these large countries with not much tech are going to be facing a downward spiral

1:04:29.320 --> 1:04:37.320
 and it's unclear what could be done and then when we look back and say there's $16 trillion being created

1:04:37.320 --> 1:04:43.320
 and it's all being kept by the U.S., China, and other developed countries, it just doesn't feel right.

1:04:43.320 --> 1:04:50.320
 So I hope people who know about geopolitics can find solutions that's beyond my expertise.

1:04:50.320 --> 1:04:54.320
 So different countries that we've talked about have different value systems.

1:04:54.320 --> 1:05:02.320
 If you look at the United States to an almost extreme degree, there is an absolute desire for freedom of speech.

1:05:02.320 --> 1:05:14.320
 If you look at the country where I was raised, that desire amongst the people is just not as elevated; in the US it's basically fundamental

1:05:14.320 --> 1:05:17.320
 to the essence of what it means to be America, right?

1:05:17.320 --> 1:05:20.320
 And the same is true with China, there's different value systems.

1:05:20.320 --> 1:05:30.320
 There is some censorship of internet content that China and Russia and many other countries undertake.

1:05:30.320 --> 1:05:40.320
 Do you see that having effects on innovation, other aspects of some of the tech stuff, AI development we talked about

1:05:40.320 --> 1:05:52.320
 and maybe from another angle, do you see that changing in different ways over the next 10 years, 20 years, 50 years as China continues to grow

1:05:52.320 --> 1:05:55.320
 as it does now in its tech innovation?

1:05:55.320 --> 1:06:08.320
 There's a common belief that full freedom of speech and expression is correlated with creativity, which is correlated with entrepreneurial success.

1:06:08.320 --> 1:06:15.320
 I think empirically we have seen that is not true and China has been successful.

1:06:15.320 --> 1:06:25.320
 That's not to say the fundamental values are not right or not the best, but it's just that perfect correlation isn't there.

1:06:25.320 --> 1:06:36.320
 It's hard to read the tea leaves on opening up or not in any country and I've not been very good at that in my past predictions.

1:06:36.320 --> 1:06:46.320
 But I do believe every country shares some fundamental value, a lot of fundamental values for the long term.

1:06:46.320 --> 1:07:02.320
 So, you know, China is drafting its privacy policy for individual citizens and they don't look that different from the American or European ones.

1:07:02.320 --> 1:07:13.320
 So, people do want to protect their privacy and have the opportunity to express and I think the fundamental values are there.

1:07:13.320 --> 1:07:21.320
 The question is in the execution and timing, how soon or when will that start to open up?

1:07:21.320 --> 1:07:31.320
 So, as long as each government knows, ultimately people want that kind of protection, there should be a plan to move towards that.

1:07:31.320 --> 1:07:35.320
 As to when or how, again, I'm not an expert.

1:07:35.320 --> 1:07:38.320
 On the point of privacy to me, it's really interesting.

1:07:38.320 --> 1:07:44.320
 So, AI needs data to create a personalized awesome experience.

1:07:44.320 --> 1:07:47.320
 I'm just speaking generally in terms of products.

1:07:47.320 --> 1:07:53.320
 And then we have currently, depending on the age and depending on the demographics of who we're talking about,

1:07:53.320 --> 1:07:58.320
 some people are more or less concerned about the amount of data they hand over.

1:07:58.320 --> 1:08:03.320
 So, in your view, how do we get this balance right?

1:08:03.320 --> 1:08:09.320
 That we provide an amazing experience to people that use products.

1:08:09.320 --> 1:08:15.320
 You look at Facebook, you know, the more Facebook knows about you, yes, it's scary to say.

1:08:15.320 --> 1:08:20.320
 The better the experience it can probably create.

1:08:20.320 --> 1:08:24.320
 So, in your view, how do we get that balance right?

1:08:24.320 --> 1:08:38.320
 Yes, I think a lot of people have a misunderstanding that it's okay and possible to just rip all the data out from a provider and give it back to you.

1:08:38.320 --> 1:08:43.320
 So, you can deny them access to further data and still enjoy the services we have.

1:08:43.320 --> 1:08:48.320
 If we take back all the data, all the services will give us nonsense.

1:08:48.320 --> 1:08:57.320
 We'll no longer be able to use products that function well in terms of, you know, right ranking, right products, right user experience.

1:08:57.320 --> 1:09:04.320
 So, yet I do understand we don't want to permit misuse of the data.

1:09:04.320 --> 1:09:16.320
 From legal policy standpoint, I think there can be severe punishment for those who have egregious misuse of the data.

1:09:16.320 --> 1:09:19.320
 That's, I think, a good first step.

1:09:19.320 --> 1:09:27.320
 Actually, China on this aspect has very strong laws about people who sell or give data to other companies.

1:09:27.320 --> 1:09:40.320
 And over the past few years, since that law came into effect, it has pretty much eradicated the illegal distribution and sharing of data.

1:09:40.320 --> 1:09:52.320
 Additionally, I think technology is often a very good way to solve technology misuse.

1:09:52.320 --> 1:09:58.320
 So, can we come up with new technologies that will let us have our cake and eat it too?

1:09:58.320 --> 1:10:07.320
 People are looking into homomorphic encryption, which lets you keep the data, have it encrypted, and train on encrypted data.

1:10:07.320 --> 1:10:13.320
 Of course, we haven't solved that one yet, but that kind of direction may be worth pursuing.

1:10:13.320 --> 1:10:22.320
 Also federated learning, which would allow one hospital to train on its own hospital's patient data fully, because they have a license for that.

1:10:22.320 --> 1:10:28.320
 And then hospitals would share their models, not data, but models, to create a supra AI.

1:10:28.320 --> 1:10:30.320
 And that also maybe has some promise.
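
 To make the federated learning idea concrete, here is a minimal sketch under toy assumptions: each "hospital" fits a small linear model on its own synthetic data, and only the model coefficients, weighted by sample count, are averaged centrally. The data, model, and weighting scheme are illustrative, not a description of any real deployment.

```python
import numpy as np

# Toy federated averaging: each site trains locally on synthetic data,
# then only model coefficients (never raw data) are shared and averaged.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

def local_fit(n_samples):
    X = rng.normal(size=(n_samples, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n_samples)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)   # local training; data stays local
    return w, n_samples

local_models = [local_fit(n) for n in (200, 500, 50)]   # three "hospitals"

# The central server only sees (coefficients, sample_count) pairs.
total = sum(n for _, n in local_models)
global_w = sum(w * (n / total) for w, n in local_models)
print(global_w)   # close to true_w without any site sharing its data
```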

1:10:30.320 --> 1:10:39.320
 So I would want to encourage us to be open-minded and think of this as not just a binary yes-no policy question,

1:10:39.320 --> 1:10:48.320
 but letting the technologists try to find solutions to let us have our cake and eat it too, or have most of our cake and eat most of it too.

1:10:48.320 --> 1:10:55.320
 Finally, I think giving each end user a choice is important and having transparency is important.

1:10:55.320 --> 1:11:04.320
 Also, I think that's universal, but the choice you give to the user should not be at a granular level that the user cannot understand.

1:11:04.320 --> 1:11:12.320
 GDPR today causes all these pop ups of yes, no, will you give this site this right to use this part of your data?

1:11:12.320 --> 1:11:20.320
 I don't think any user understands what they're saying yes or no to, and I suspect most are just saying yes because they don't understand it.

1:11:20.320 --> 1:11:30.320
 So while GDPR in its current implementation has lived up to its promise of transparency and user choice,

1:11:30.320 --> 1:11:39.320
 it was implemented in such a way that really didn't deliver the spirit of GDPR.

1:11:39.320 --> 1:11:41.320
 It fit the letter, but not the spirit.

1:11:41.320 --> 1:11:50.320
 So again, I think we need to think about is there a way to fit the spirit of GDPR by using some kind of technology?

1:11:50.320 --> 1:11:52.320
 Can we have a slider?

1:11:52.320 --> 1:12:01.320
 That is, an AI trying to figure out how much you want to slide between perfect protection and security of your personal data

1:12:01.320 --> 1:12:07.320
 versus high degree of convenience with some risks of not having full privacy.

1:12:07.320 --> 1:12:11.320
 Each user should have some preference and that gives you the user choice,

1:12:11.320 --> 1:12:18.320
 but maybe we should turn the problem on its head and ask can there be an AI algorithm that can customize this

1:12:18.320 --> 1:12:24.320
 because we can understand the slider, but we sure cannot understand every pop up question.

1:12:24.320 --> 1:12:30.320
 And I think getting that right requires striking the balance between what we talked about earlier,

1:12:30.320 --> 1:12:36.320
 which is heart and soul versus profit driven decisions and strategy.

1:12:36.320 --> 1:12:45.320
 I think from my perspective, the best way to make a lot of money in the long term is to keep your heart and soul intact.

1:12:45.320 --> 1:12:53.320
 I think getting that slider right in the short term may feel like you'll be sacrificing profit,

1:12:53.320 --> 1:12:59.320
 but in the long term, you'll be getting user trust and providing a great experience.

1:12:59.320 --> 1:13:01.320
 Do you share that kind of view in general?

1:13:01.320 --> 1:13:11.320
 Yes, absolutely. I sure would hope there is a way we can do long term projects that really do the right thing.

1:13:11.320 --> 1:13:16.320
 I think a lot of people who embrace GDPR have their hearts in the right place.

1:13:16.320 --> 1:13:20.320
 I think they just need to figure out how to build a solution.

1:13:20.320 --> 1:13:24.320
 I've heard utopians talk about solutions that get me excited,

1:13:24.320 --> 1:13:29.320
 but I'm not sure how, in the current funding environment, they can get started, right?

1:13:29.320 --> 1:13:37.320
 People talk about, imagine this crowdsourced data collection that we all trust,

1:13:37.320 --> 1:13:45.320
 and then we have these agents, and we ask them to query that trusted agent.

1:13:45.320 --> 1:13:48.320
 Only that agent, only that platform.

1:13:48.320 --> 1:14:02.320
 A trusted joint platform that we all believe is trustworthy, that can give us all the closed-loop personal suggestions

1:14:02.320 --> 1:14:07.320
 by the new social network, new search engine, new ecommerce engine

1:14:07.320 --> 1:14:12.320
 that has access to even more of our data, not directly but indirectly.

1:14:12.320 --> 1:14:18.320
 I think that general concept of licensing to some trusted engine

1:14:18.320 --> 1:14:22.320
 and finding a way to trust that engine seems like a great idea,

1:14:22.320 --> 1:14:27.320
 but if you think about how long it's going to take to implement, tweak, and develop it right,

1:14:27.320 --> 1:14:31.320
 as well as to collect all the trust and the data from the people,

1:14:31.320 --> 1:14:34.320
 it's beyond the current cycle of venture capital.

1:14:34.320 --> 1:14:37.320
 How you do that is a big question.

1:14:37.320 --> 1:14:44.320
 You've recently had a fight with cancer, stage 4 lymphoma,

1:14:44.320 --> 1:14:54.320
 and on a sort of deep, personal level, what did it feel like in the darker moments to face your own mortality?

1:14:54.320 --> 1:14:57.320
 Well, I've been a workaholic my whole life,

1:14:57.320 --> 1:15:04.320
 and I've basically worked 996: 9am to 9pm, 6 days a week, roughly.

1:15:04.320 --> 1:15:10.320
 And I didn't really pay a lot of attention to my family, friends, and people who loved me,

1:15:10.320 --> 1:15:14.320
 and my life revolved around optimizing for work.

1:15:14.320 --> 1:15:25.320
 While my work was not routine, my optimization really made my life basically a very mechanical process.

1:15:25.320 --> 1:15:36.320
 But I got a lot of highs out of it because of accomplishments that I thought were really important and dear and the highest priority to me.

1:15:36.320 --> 1:15:41.320
 But when I faced mortality and possible death in a matter of months,

1:15:41.320 --> 1:15:45.320
 I suddenly realized that this really meant nothing to me,

1:15:45.320 --> 1:15:48.320
 that I didn't feel like working for another minute,

1:15:48.320 --> 1:15:54.320
 that if I had 6 months left in my life, I would spend it all with my loved ones.

1:15:54.320 --> 1:16:02.320
 And thanking them, giving them love back, and apologizing to them for having lived my life the wrong way.

1:16:02.320 --> 1:16:11.320
 So that moment of reckoning caused me to really rethink why we exist in this world.

1:16:11.320 --> 1:16:22.320
 We might be too much shaped by society into thinking that success and accomplishments are why we live.

1:16:22.320 --> 1:16:29.320
 And while that can get you periodic successes and satisfaction,

1:16:29.320 --> 1:16:35.320
 it's really in facing death that you see what's truly important to you.

1:16:35.320 --> 1:16:41.320
 So as a result of going through the challenges with cancer,

1:16:41.320 --> 1:16:45.320
 I've resolved to live a more balanced lifestyle.

1:16:45.320 --> 1:16:48.320
 I'm now in remission, knock on wood,

1:16:48.320 --> 1:16:52.320
 and I'm spending more time with my family.

1:16:52.320 --> 1:16:54.320
 My wife travels with me.

1:16:54.320 --> 1:16:57.320
 When my kids need me, I spend more time with them.

1:16:57.320 --> 1:17:02.320
 And before, I used to prioritize everything around work.

1:17:02.320 --> 1:17:05.320
 When I had a little bit of time, I would dole it out to my family.

1:17:05.320 --> 1:17:09.320
 Now, when my family needs something, really needs something,

1:17:09.320 --> 1:17:12.320
 I drop everything at work and go to them.

1:17:12.320 --> 1:17:15.320
 And then in the time remaining, I allocate to work.

1:17:15.320 --> 1:17:18.320
 But one's family is very understanding.

1:17:18.320 --> 1:17:22.320
 It's not like they will take 50 hours a week from me.

1:17:22.320 --> 1:17:26.320
 So I'm actually able to still work pretty hard,

1:17:26.320 --> 1:17:28.320
 maybe 10 hours less per week.

1:17:28.320 --> 1:17:35.320
 So I realize the most important thing in my life is really love and the people I love.

1:17:35.320 --> 1:17:38.320
 And I give that the highest priority.

1:17:38.320 --> 1:17:40.320
 It isn't the only thing I do.

1:17:40.320 --> 1:17:45.320
 But when that is needed, I put that at the top priority.

1:17:45.320 --> 1:17:49.320
 And I feel much better and I feel much more balanced.

1:17:49.320 --> 1:17:56.320
 And I think this also gives a hint as to a life of routine work,

1:17:56.320 --> 1:17:58.320
 a life of pursuit of numbers.

1:17:58.320 --> 1:18:03.320
 While my job was not routine, it was a pursuit of numbers.

1:18:03.320 --> 1:18:05.320
 Pursuit of, can I make more money?

1:18:05.320 --> 1:18:07.320
 Can I fund more great companies?

1:18:07.320 --> 1:18:09.320
 Can I raise more money?

1:18:09.320 --> 1:18:13.320
 Can I make sure our VC is ranked higher and higher every year?

1:18:13.320 --> 1:18:20.320
 This competitive nature of driving for bigger numbers and better numbers

1:18:20.320 --> 1:18:27.320
 became an endless pursuit that's mechanical.

1:18:27.320 --> 1:18:31.320
 And bigger numbers really didn't make me happier.

1:18:31.320 --> 1:18:36.320
 And faced with death, I realized bigger numbers really meant nothing.

1:18:36.320 --> 1:18:42.320
 And what was important is that people who have given their heart and their love to me

1:18:42.320 --> 1:18:45.320
 deserve for me to do the same.

1:18:45.320 --> 1:18:52.320
 So there's deep profound truth in that, that everyone should hear and internalize.

1:18:52.320 --> 1:18:56.320
 And that's really powerful for you to say that.

1:18:56.320 --> 1:19:02.320
 I have to ask sort of a difficult question here.

1:19:02.320 --> 1:19:07.320
 So, I've competed in sports my whole life, and looking at it historically,

1:19:07.320 --> 1:19:14.320
 I'd like to challenge some aspect of that a little bit on the point of hard work.

1:19:14.320 --> 1:19:20.320
 It feels that one of the greatest,

1:19:20.320 --> 1:19:26.320
 the most beautiful aspects of human nature is the ability to become obsessed,

1:19:26.320 --> 1:19:33.320
 to become extremely passionate to the point where, yes, flaws are revealed,

1:19:33.320 --> 1:19:36.320
 and to just give yourself fully to a task.

1:19:36.320 --> 1:19:41.320
 That is, in one sense, you mentioned love being important,

1:19:41.320 --> 1:19:47.320
 but in another sense, this kind of obsession, this pure exhibition of passion and hard work

1:19:47.320 --> 1:19:50.320
 is truly what it means to be human.

1:19:50.320 --> 1:19:53.320
 What deeper lessons should we take from that?

1:19:53.320 --> 1:19:55.320
 Because you've accomplished incredible things.

1:19:55.320 --> 1:19:57.320
 Like chasing numbers.

1:19:57.320 --> 1:20:01.320
 But really, there's some incredible work there.

1:20:01.320 --> 1:20:07.320
 So how do you think about that when you look back in your 20s, your 30s?

1:20:07.320 --> 1:20:10.320
 What would you do differently?

1:20:10.320 --> 1:20:16.320
 Would you really take back some of the incredible hard work?

1:20:16.320 --> 1:20:17.320
 I would.

1:20:17.320 --> 1:20:20.320
 But it's in percentages, right?

1:20:20.320 --> 1:20:22.320
 We're both now computer scientists.

1:20:22.320 --> 1:20:27.320
 So I think when one balances one's life, when one is younger,

1:20:27.320 --> 1:20:33.320
 you might give a smaller percentage to family, but you would still give them high priority.

1:20:33.320 --> 1:20:38.320
 And when you get older, you would give a larger percentage to them and still the high priority.

1:20:38.320 --> 1:20:43.320
 And when you're near retirement, you give most of it to them and the highest priority.

1:20:43.320 --> 1:20:50.320
 So I think the key point is not that we would work 20 hours less for the whole life

1:20:50.320 --> 1:20:56.320
 and just spend it aimlessly with the family, but that when the family has a need,

1:20:56.320 --> 1:21:02.320
 when your wife is having a baby, when your daughter has a birthday,

1:21:02.320 --> 1:21:07.320
 or when they're depressed, or when they're celebrating something,

1:21:07.320 --> 1:21:11.320
 or when they have a get together, or when we have family time,

1:21:11.320 --> 1:21:18.320
 it is important for us to put down our phones and PCs and be 100% present with them.

1:21:18.320 --> 1:21:26.320
 And that priority on the things that really matter isn't going to be so taxing

1:21:26.320 --> 1:21:32.320
 that it would eliminate or even dramatically reduce our accomplishments.

1:21:32.320 --> 1:21:36.320
 It might have some impact, but it might also have other impact

1:21:36.320 --> 1:21:39.320
 because if you have a happier family, maybe you fight less.

1:21:39.320 --> 1:21:45.320
 If you fight less, you don't spend time taking care of all the aftermath of a fight.

1:21:45.320 --> 1:21:46.320
 That's right.

1:21:46.320 --> 1:21:48.320
 And I'm not sure that it would take more time.

1:21:48.320 --> 1:21:53.320
 And if it did, I'd be willing to take that reduction.

1:21:53.320 --> 1:21:56.320
 And it's not a dramatic number, but it's a number

1:21:56.320 --> 1:22:00.320
 that I think would give me a greater degree of happiness

1:22:00.320 --> 1:22:03.320
 and knowing that I've done the right thing

1:22:03.320 --> 1:22:10.320
 and still have plenty of hours to get the success that I want to get.

1:22:10.320 --> 1:22:14.320
 So given the many successful companies that you've launched

1:22:14.320 --> 1:22:17.320
 and much success throughout your career,

1:22:17.320 --> 1:22:25.320
 what advice would you give to young people today looking,

1:22:25.320 --> 1:22:28.320
 or it doesn't have to be young, but people today looking to launch

1:22:28.320 --> 1:22:35.320
 and to create the next $1 billion tech startup, or even AI based startup?

1:22:35.320 --> 1:22:42.320
 I would suggest that people understand technology waves move quickly.

1:22:42.320 --> 1:22:45.320
 What worked two years ago may not work today.

1:22:45.320 --> 1:22:49.320
 And that is very much a case in point for AI.

1:22:49.320 --> 1:22:53.320
 I think two years ago, or maybe three years ago,

1:22:53.320 --> 1:22:57.320
 you certainly could say I have a couple of super smart PhDs

1:22:57.320 --> 1:22:59.320
 and we're not sure what we're going to do,

1:22:59.320 --> 1:23:04.320
 but here's how we're going to start and get funding for a very high valuation.

1:23:04.320 --> 1:23:11.320
 Those days are over because AI is going from rocket science towards mainstream.

1:23:11.320 --> 1:23:14.320
 Not yet commodity, but more mainstream.

1:23:14.320 --> 1:23:20.320
 So first, the creation of any company, to a venture capitalist,

1:23:20.320 --> 1:23:25.320
 has to be the creation of business value and monetary value.

1:23:25.320 --> 1:23:29.320
 And when you have a very scarce commodity,

1:23:29.320 --> 1:23:34.320
 VCs may be willing to accept greater uncertainty.

1:23:34.320 --> 1:23:40.320
 But now there are many more people who have the equivalent of a PhD from three years ago,

1:23:40.320 --> 1:23:43.320
 because that can be learned more quickly.

1:23:43.320 --> 1:23:45.320
 Platforms are emerging.

1:23:45.320 --> 1:23:51.320
 The cost to become an AI engineer is much lower and there are many more AI engineers.

1:23:51.320 --> 1:23:53.320
 So the market is different.

1:23:53.320 --> 1:23:57.320
 So I would suggest someone who wants to build an AI company

1:23:57.320 --> 1:24:01.320
 be thinking about the normal business questions.

1:24:01.320 --> 1:24:05.320
 What customer cases are you trying to address?

1:24:05.320 --> 1:24:08.320
 What kind of pain are you trying to address?

1:24:08.320 --> 1:24:10.320
 How does that translate to value?

1:24:10.320 --> 1:24:16.320
 How will you extract value and get paid through what channel?

1:24:16.320 --> 1:24:19.320
 And how much business value will get created?

1:24:19.320 --> 1:24:26.320
 That today needs to be thought about much earlier up front than it did three years ago.

1:24:26.320 --> 1:24:30.320
 The scarcity question of AI talent has changed.

1:24:30.320 --> 1:24:32.320
 The amount of AI talent has changed.

1:24:32.320 --> 1:24:41.320
 So now you need not just AI, but also an understanding of the business, the customer, and the marketplace.

1:24:41.320 --> 1:24:49.320
 So I also think you should have a more reasonable valuation expectation

1:24:49.320 --> 1:24:51.320
 and growth expectation.

1:24:51.320 --> 1:24:53.320
 There's going to be more competition.

1:24:53.320 --> 1:25:00.320
 But the good news is that AI technologies are now more available in open source.

1:25:00.320 --> 1:25:06.320
 TensorFlow, PyTorch and such tools are much easier to use.

1:25:06.320 --> 1:25:13.320
 So you should be able to experiment and get results iteratively faster than before.
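
As a rough illustration of that iteration speed, here is a minimal sketch of a quick experiment, assuming PyTorch is installed; the synthetic data and the tiny model are hypothetical placeholders, not any particular product's setup.

import torch
import torch.nn as nn

# Synthetic data: 256 examples, 10 features, a simple binary label.
X = torch.randn(256, 10)
y = (X.sum(dim=1, keepdim=True) > 0).float()

# A tiny two-layer network, an optimizer, and a loss function.
model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.BCEWithLogitsLoss()

# A short training loop; this runs in seconds on a laptop CPU.
for _ in range(50):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

print(float(loss))  # the loss should drop noticeably from its starting value

That is roughly the amount of code it now takes to go from an idea to a first result, which is the point about experimenting and iterating faster.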

1:25:13.320 --> 1:25:18.320
 So take more of a business mindset to this.

1:25:18.320 --> 1:25:25.320
 Think of this less as a laboratory taken into a company, because we've gone beyond that stage.

1:25:25.320 --> 1:25:31.320
 The only exception is if you truly have a breakthrough in some technology that really no one has,

1:25:31.320 --> 1:25:34.320
 then the old way still works.

1:25:34.320 --> 1:25:36.320
 But I think that's harder and harder now.

1:25:36.320 --> 1:25:44.320
 So I know you believe, as many do, that we're far from creating an artificial general intelligence system.

1:25:44.320 --> 1:25:54.320
 But say once we do and you get to ask her one question, what would that question be?

1:25:54.320 --> 1:26:00.320
 What is it that differentiates you and me?

1:26:00.320 --> 1:26:05.320
 Beautifully put, Kaifu, thank you so much for your time today.

1:26:05.320 --> 1:26:26.320
 Thank you.