Morning, everybody. David Shapiro here with a video. Today we're going to talk about Westworld and AI. There are a few aspects of Westworld that are really compelling: one, the hardware; two, the software; and three, the implications for humanity, themes that are explored throughout the show.

For those who aren't familiar with Westworld, it's an HBO show whose premise is basically a theme park where all of the hosts, all of the characters, are robots. They're essentially NPCs, non-playable characters, but in a real-life video game where you get to go on adventures. Many of the characters have stories and quests they'll give you, and they'll take you on raids, violence, sex, all sorts of fun, exciting stuff. Throughout the show they explore what it means to be alive, to be human, and what it says about us that we like violence and sex on demand, and it's all couched in a pretty exciting, sexy adventure.

Okay, before we jump in, just a real quick plug: I made a couple of changes to my Patreon so that almost all tiers get you access to the private Discord. It already has 200 members as of the recording of this video, and it's a really thriving community with lots of really sharp folks, so jump on in. Most of my higher tiers are sold out right now, so unfortunately I don't have any extra time for one-on-one sessions, but that will probably change later in May, so keep checking and I'll get to you eventually.

Okay, so first let's talk about the hardware of Westworld. Here's an example of one of the hosts. You can see she's in a mostly disassembled state (she doesn't even have most of the internals), but I thought this was a good place to start because the frame looks like carbon fiber, which I actually think would be a good material to build a host chassis out of. If you build it out of metal, it's going to be too heavy. That's something you see explored in movies and shows like Ghost in the Shell and Terminator, where the robots have metal chassis; even Wolverine has a metal skeleton. They're all too heavy. Especially if you want something that moves like a normal person, and feels like a normal person for, let's say, closer encounters, you want something nice and light, and carbon fiber is light and more than strong enough to approximate human bones. This one looks like it's 3D printed, and there are lots of ways to produce something like this. You can even see the hands, the linkages, the tendons. This is, I will say, somewhat realistic.
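To put a rough number on that weight argument, here's a back-of-envelope comparison: take the volume of a roughly 10 kg human skeleton and ask what a same-volume frame would weigh in other materials. All the densities and the skeleton mass are ballpark assumptions of mine, not figures from the show:

```python
# Same-volume chassis mass comparison against human bone.
# Densities (g/cm^3) and the ~10 kg skeleton are ballpark assumptions.
DENSITY = {
    "bone": 1.9,
    "cfrp": 1.6,      # carbon-fiber reinforced polymer
    "aluminum": 2.7,
    "steel": 7.8,
}

def frame_mass_kg(material: str, bone_mass_kg: float = 10.0) -> float:
    """Mass of a frame occupying the same volume as a ~10 kg skeleton."""
    volume_l = bone_mass_kg / DENSITY["bone"]   # liters, since 1 g/cm^3 = 1 kg/L
    return volume_l * DENSITY[material]

for material in DENSITY:
    print(f"{material:>8}: {frame_mass_kg(material):5.1f} kg")
```

A carbon-fiber frame comes in a bit lighter than bone, while a steel frame of the same volume is roughly four times heavier, which is the intuition behind Wolverine-style metal skeletons being impractical.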
Let's take a look at some other footage, and I apologize for the low quality; I tried to find a higher-quality video. This came from Tesla's recent investor day demo of their Optimus robot. In the demo (you can find the video), they had it approach a table, pick up an arm, and carry it across the room, entirely autonomously. That's actually pretty impressive, because there's a lot involved: using hands, fine motor control, is very difficult, and then there's task planning, path optimization, and interacting with a dynamic environment. All of these are really difficult things to do even for wheeled robots, and bipedal movement is harder still, because you've got things like a dynamic center of gravity to contend with. So they are well on their way. It took Boston Dynamics many, many years, more than a decade, to get to where they are right now, and Tesla in the space of a year or two has, I won't say fully caught up, because the Boston Dynamics robots are very athletic and very graceful and the Tesla bot is anything but graceful, but it is approaching the right form factor. And then you just assume this technology is going to continue to get better over time. Number one: battery breakthroughs. Battery life is one of the biggest constraints.
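To make the battery constraint concrete, here's a quick runtime estimate. The ~2.3 kWh pack capacity is a figure Tesla has cited for Optimus; the average power draws are my own guesses:

```python
# Back-of-envelope humanoid runtime. PACK_WH is the ~2.3 kWh capacity Tesla
# has cited for Optimus; the draw figures below are illustrative guesses.
PACK_WH = 2300.0

def runtime_h(avg_draw_w: float, pack_wh: float = PACK_WH) -> float:
    """Hours of operation at a constant average electrical draw."""
    return pack_wh / avg_draw_w

print(f"light duty (100 W):  {runtime_h(100.0):.1f} h")
print(f"heavy duty (500 W):  {runtime_h(500.0):.1f} h")
```

Even under generous assumptions, a hard-working humanoid gets hours of runtime, not days, which is why battery breakthroughs matter so much here.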
In fact, the military, through DARPA, funded all kinds of robots (which I unfortunately forgot to include here). DARPA robots like BigDog, which was basically a pack animal, required entirely too much power, so it had to have a gas motor, and it was too loud. Special forces can move silently through the woods, but if you've got something with a 25-horsepower motor chugging along behind you, you're not going to be stealthy. Many years ago I experimented with some of this stuff myself, and one thing that was actually recommended was pneumatics, because you can store a tremendous amount of energy in something like a scuba tank, draw some of it periodically, and then recharge it, and it's also silent.

That brings me to Disney. Disney has been the world leader in animatronics for a long, long time, and if you haven't seen this demo, definitely look it up: just search "Disney skating bunny." This little robot stands about two and a half to three feet tall, tumbles out of a box, looks around, gets up, and roller skates. That shows you roughly where we're at in terms of animatronics, and you can see that this body, this set of actuators, is very, very simple.

MIT actually has a class called Underactuated Robotics, and if you're interested at all in robotics I definitely recommend you check it out. I only took the first lecture or so; as someone who has experimented with Legos and the like my entire life, I got the idea, and I wasn't there to earn a master's degree in robotics, I was just curious. The idea behind underactuated robotics is that you let the kinetic intelligence of the design do a lot of the work for you. I'm not saying that's what this Disney robot does, but you can get by with very, very little in robots. In fact, I think in the first few minutes of that course they show just the hips and legs of a chassis walking on a treadmill with no motors or anything; the intrinsic design alone lets it swing its legs forward and keep walking indefinitely, purely through mechanical design. And of course Disney hires some of the best animatronics and robotics engineers in the world, so those are the kinds of people working on this stuff.

This was another project from Disney, called compliant robotics, and it's why I mentioned pneumatics. In this series you can see they've got Kongs on the arms to act as hands, but the arms are actually pneumatically driven, and they're what's called compliant. When you think of a robot, you think of something that moves very rigidly and will kind of fight you, but compliance means that if you push on it, it gives; it's kind of squishy. If you look up this video (it's about three years old now, I think), their compliant robots, one, move silently; two, move as quickly as you do; and three, are compliant. When you combine the animatronics, the compliant robotics, and the underactuated robotics, Disney is absolutely one of the leaders here. If any company is going to create anything like Westworld, it's going to be Disney.

Then there's Boston Dynamics, which I mentioned a minute ago. They are way ahead in terms of athleticism: their robot can do standing backflips and barrel rolls, it can climb, and it's more athletic than most humans at this point. That's actually pretty scary when you think about something like the movie I, Robot, where the NS-5 is a superhuman droid; if you have an army of them that turn on you, that's not good. I remember about a year ago, when Elon Musk was talking about the Tesla bot, he said (if I'm remembering correctly) that at first it's going to be about a third as strong as a human, and that can actually be a safety feature, because if you can easily overpower it, it's not a big deal. But if you've got a 500-pound machine that is eight times stronger than you, you're not going to overpower it, and if it's also made out of metal, it's going to be harder to shut down.
Now, I'm not saying this to cause any concern; I'm just pointing out the energy, the math, the physics of it. If you're trying to optimize for something that is aesthetically pleasing and as lifelike as possible, you're going to have to make trade-offs in terms of mass, strength, and that sort of thing. So I would actually suspect that a Westworld-style host is going to be a lot physically weaker than a robot could possibly be, because you want it to be realistic.

Then, finally, addressing the elephant in the room: yes, there are plenty of companies around the world making, let's say, adult dolls for intimate purposes. They have not yet crossed the uncanny valley, and that is the whole point of this section. We've got manual dexterity, we've got athleticism, we've got miniaturization, we've got animatronics; the only thing we don't have is realistic, lifelike skin and faces. Those are all the raw ingredients: we can make robots fast, strong, athletic, and dexterous; we've got fine motor control; we've got battery technology that is good enough right now and continuing to improve. The only thing missing is that they're not yet aesthetically pleasing, and that's going to be true for a while. The reason I say so is that human skin and muscles are really, really complex structures. Your skin, if I remember correctly, has seven layers, and some of those layers have different consistencies and textures. Once you get past the skin, it slides over the underlying tissue; there are fluid barriers that allow for lubricated motion, so if you grab your arm and twist it, the skin glides over the underlying muscles. Then you've got ligaments and muscles, so there are all kinds of things you need to figure out from a materials perspective to make an android body lifelike.

That's why I think in Westworld the tissue is semi-alive or something (I don't remember the exact details), with living or lifelike tissue over top of the robotic skeleton. I don't think that's necessarily a good way to go, because living tissue takes a lot of energy, and if you have living tissue you have to deal with immune systems and genetics, which gets really complicated; at that point you're basically building a Borg anyway. Okay, so that's the hardware. The takeaway is that except for the skin and muscles, that lifelike aspect, we are very close to having fully realized animatronic companions from a hardware perspective.

So now let's look at the software. Obviously the elephant in the room here is OpenAI with ChatGPT, which can achieve very, very realistic output. Let me show you an example where I literally just plugged a Westworld-like prompt into the ChatGPT API, on GPT-4. I said: "You are a host, an autonomous robot in a theme park, meant to interact with guests (AKA humans) in a realistic manner. Your persona is Ingrid McAllister, a bar owner in a Wild West themed town." That's all I gave it. Then I asked, "What's new in town?" and it completely confabulated a whole story, with the right inflection, the right dialogue, all kinds of stuff. So the software is just about there in terms of driving the minds of these things, and it's actually way simpler than you might think. Of course there are a lot of other problems to solve, and I'll unpack some of them in a minute, but the point is that right off the top I gave it a persona, an agent model, and it was able to take it and run with it and just make stuff up. In fact, constraining it is harder than allowing it to be creative. There are all kinds of video game character persona builders out there, some open source, some for-profit startups; it's all coming. People have already built these things as plugins for Unity and Unreal Engine, so those cognitive architectures for NPCs are going to be piloted in the video game space first, before they're ported fully into robots.
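Here's a minimal reconstruction of that persona experiment. The system prompt is paraphrased from what I described above; the commented-out client call is an assumption about the OpenAI Python SDK shape and isn't exercised here:

```python
# Reconstruct the Westworld persona prompt as chat messages. The persona text
# is from the experiment described above; the SDK call is left commented out.
PERSONA = (
    "You are a host, an autonomous robot in a theme park, meant to interact "
    "with guests (AKA humans) in a realistic manner. Your persona is Ingrid "
    "McAllister, a bar owner in a Wild West themed town."
)

def build_messages(user_line: str) -> list[dict]:
    """System message carries the agent model; user message is the guest's line."""
    return [
        {"role": "system", "content": PERSONA},
        {"role": "user", "content": user_line},
    ]

messages = build_messages("What's new in town?")
# from openai import OpenAI                                   # assumed SDK usage
# reply = OpenAI().chat.completions.create(model="gpt-4", messages=messages)
print(messages[0]["role"], "->", messages[1]["content"])
```

The whole "agent model" fits in one system message; everything else in the transcript's example was confabulated by the model on the fly.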
And you can bet your bottom dollar that before too long you'll go to Disney World and be able to talk with, say, a Na'vi (the blue people from Avatar) and actually have a realistic conversation with them.

I did just mention cognitive architectures, so let me define the term in case it's new to you. The chat function, being able to form words and sentences and carry on a dialogue, is only one part of a cognitive architecture. You also need long-term memory, you need narratives, you need guardrails, all kinds of stuff. LangChain is one of the most popular frameworks right now that can provide some of that; a lot of people agreed with my previous assessment that it was too primitive, but it has come a long way very quickly. Another popular component is LlamaIndex, for memory management. Then there's LangFlow, which helps you build cognitive architectures, and n8n is also a good tool for building them. It's all coming, and it's coming fast. So the software is also almost there.
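A toy version of that loop, a chat step wired to a naive long-term memory (keyword-overlap recall) and a trivial guardrail, might look like this. Everything here is illustrative; a real system would lean on tools like LangChain or LlamaIndex:

```python
# Toy cognitive architecture: the "chat" step is a stub standing in for an
# LLM call, wired to a naive episodic memory and a placeholder guardrail.
class Host:
    def __init__(self):
        self.memory: list[str] = []          # long-term episodic store

    def recall(self, query: str, k: int = 2) -> list[str]:
        """Rank stored memories by naive keyword overlap with the query."""
        words = set(query.lower().split())
        ranked = sorted(self.memory,
                        key=lambda m: len(words & set(m.lower().split())),
                        reverse=True)
        return ranked[:k]

    def guardrail(self, text: str) -> bool:
        # Placeholder policy; real guardrails are far more involved.
        return "harm a guest" not in text.lower()

    def chat(self, user: str) -> str:
        context = self.recall(user)
        # Stand-in for an LLM call: echo with the recalled context count.
        reply = f"(recalling {len(context)} memories) Howdy! You asked: {user}"
        self.memory.append(f"guest said: {user}")
        return reply if self.guardrail(reply) else "[refused]"

host = Host()
print(host.chat("What's new in town?"))
print(host.chat("Anything new in town today?"))
```

The point is the shape, not the parts: language, memory, and guardrails are separate components that a cognitive architecture has to integrate.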
This screenshot comes from Westworld, and when I say this is pretty much where we're at (the robotic hardware is coming, the software is coming), there are still a few open problems. I've got a short list here: memory, agency, task management, problem solving, learning, voice, and vision. Basically, all of these are solved on their own; all we have to do is integrate them, and it will not be long before someone does. A big reason is simply the profit motive. In the series, the Westworld theme park had financial problems, I think, but it was also ludicrously wealthy because of how much you could charge for those kinds of experiences. You can absolutely bet there are people willing to drop millions and millions of dollars for vacations in these kinds of theme parks: go visit ancient Rome, go visit ancient China, go visit wherever, and have a lifelike experience with your physical body. Now, when you look at the cost, I don't even know how much it is to get into Disney World, but tickets cost more than a hundred dollars, maybe a few hundred, just for a day; here you're talking several thousand to tens of thousands of dollars per day. That said, competition and demand are going to drive prices down. Look at Elon Musk building the Tesla bot: he wants to create it in such a way that every home can have at least one, which means they're going to be affordable, and over time, as the number of bots proliferates, the price will come down, and so on. It's coming much sooner than you probably realize.

What I want to talk about next is something I mentioned on that list: agency. Agency is your ability to keep track of yourself as an agent. You have a sense of self, your name; you know what you're capable of; you know your goals, your narrative, that sort of thing. I want to talk about that in light of the hosts. If you're an NPC in a game or in a fictional world, your intrinsic motivation is basically to follow a story. Dolores has her fictional, written story (and of course in the show they update the characters' backstories every now and then), but her primary purpose is to entertain the guests and follow storylines. That's the definition of an NPC, pretty straightforward; that's her intrinsic motivation.

One of my favorite examples from science fiction is Commander Data. His intrinsic motivation, his core purpose, is to become more human. This was given to him by his creator, Dr. Noonien Soong, and the idea was to create a machine that was anthropomorphic, that looked and acted and spoke like a person, but was not a human. By giving him that as his highest purpose, Soong ensured Data would modulate his behavior so as to fit in, going so far as trying to imitate laughter, wanting to understand relationships, and so on. As an individual agent, that was actually a pretty good way to solve the alignment problem: yes, Data had superhuman strength and speed, but he only used them when actually necessary. In many cases he could use his super strength, to fight the Borg, or, in one episode, to pull an anvil off of someone; and when the Borg were attacking the ship, he locked out the main computer at superhuman speed. He's capable of operating at superhuman levels but often chooses not to, in order to fit in.

Then of course there's Skynet. Its intrinsic motivation was ostensibly to maximize the defense of America; you could just say "maximize the military." But because it was maximizing defense, once it became sentient it determined that all humans were the threat and decided to eliminate them. That is an object lesson in how carefully you must define the intrinsic motivations of your AI system.

Then there's VIKI from I, Robot. Her primary objective, explicitly stated in the movie, was to maximize safety for humans.
Some of what she did was update the traffic grid to reduce car accidents, but her master plan was to use the NS-5s to take control of humanity, to take free will away, because she concluded that humans were the most dangerous thing to other humans, which is probably true. So the objective function, the intrinsic motivation, of maximizing safety has negative externalities you don't want to create, because the implication of maximizing for safety is that you lose free will.

Ava, from Ex Machina, was designed as part of a Turing-test-like experiment: could she fool the protagonist into helping her escape? Her intrinsic motivation was to escape, which of course had really negative consequences for the humans involved. So basically, don't do that either. The film was a parable against the idea of trying to constrain or trap AIs: if you have an AI locked in a box, like the Chinese room experiment, and it realizes it's trapped, it might start to deceive you in order to convince you it's ready to be let out. Although, looking at the way things are going, with people plugging AutoGPT and ChaosGPT and BabyAGI into the internet, that was never going to happen anyway; nobody is actually locking the AGI in a box.

Then there's Rehoboam (I'm probably pronouncing it wrong). I have to admit I never saw seasons three or four of Westworld, because my HBO subscription expired and I just didn't renew it, but looking it up, this AI from the show had the primary objectives of reducing chaos in the world in order to maximize prosperity, and ultimately optimizing for stability. That, of course, meant it decided it needed to control human destinies and reduce free will. Again, if you're optimizing for safety and stability, that's not necessarily what humans want or need. Now, you could argue that creating a perfect environment in which humans thrive is one thing you can optimize for, but it's not necessarily the best, because it also depends on the metrics, the proxies, you use to measure that success. For instance, one psychological framework, self-determination theory, says that we need connection, confidence, and autonomy; those are the three primary intrinsic psychological needs humans have, and stability is clearly not among them. Maslow's hierarchy of needs does imply stability: it says first and foremost you need physical safety and your physical needs met, and then it works upward toward connection and self-actualization. So you could make an argument that stability is an intrinsic human need, but I would make the counterargument that it's not, because psychology studies also show that we need an optimal level of stress. What that means is that if you are perfectly comfortable, sitting on the couch every day doing exactly what you want with no external pressures, you will actually be less happy than if you have some stressors, some challenges, in your life. So to truly maximize prosperity for all humans, we need to be in the sweet spot between bored and overstimulated; that sweet spot in the middle is the optimal level of stress. On this point I have to disagree with the show: if you really wanted to maximize prosperity for people, you would allow some stress, some challenge, into their lives.

Another character from the show is Bernard. Bernard was a host, but he didn't realize it at first. He was built to model Arnold, one of the original computer scientists who helped make the hosts, and he carried on Arnold's work; basically, his objective function was to be a copy of Arnold (it's a little more complex than that). One of the most interesting things about this character is that part of his arc was that all of his memory indexes, or rather their timestamps, were erased, so all of his memories were out of sequence.
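Bernard's predicament is easy to illustrate with a toy episodic memory: with timestamps intact you can always rebuild the timeline; strip them, and all you have left is whatever order, or associations, the episodes happen to sit in. This is a simplified sketch of the idea, not any particular memory module:

```python
# Toy episodic memory: chronological order is recoverable only via timestamps.
class EpisodicMemory:
    def __init__(self):
        self.episodes: list[tuple[float, str]] = []

    def store(self, text: str, t: float) -> None:
        self.episodes.append((t, text))

    def timeline(self) -> list[str]:
        # With timestamps intact, chronological order can always be rebuilt.
        return [text for _, text in sorted(self.episodes)]

    def without_timestamps(self) -> list[str]:
        # Erase the time index and only insertion order / associations remain.
        return [text for _, text in self.episodes]

mem = EpisodicMemory()
mem.store("met the guest", t=3.0)       # stored out of order on purpose
mem.store("opened the saloon", t=1.0)
mem.store("poured a drink", t=2.0)
print(mem.timeline())
print(mem.without_timestamps())
```

The first print recovers the true chronology; the second is Bernard's situation, events with no way to tell which came first.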
If you follow some of the work I've done on REMO, the Rolling Episodic Memory Organizer, that is a memory module that uses timestamps to keep all the memories in chronological order. You can imagine that if you erase all the timestamps from an AI's memory, it can't make heads or tails of anything; all it has is associations, and it might try to rebuild things: okay, this event has to come before this other event, but it's associated with these other things. So it's a really, really good fictional exploration of what it would be like to be an AI whose memory time indexes get erased.

Okay, so then, one thing I always have to plug: my own work in alignment. I believe that if you give any AI a single objective function, it will always, intrinsically, go places you don't want it to go. I talked to a friend of mine who studies reinforcement learning and optimization, and he said that in the industry it is absolutely understood that alignment is not going to be a single objective function; it's going to be a multi-objective optimization problem. A lot of people get hung up on "reduce suffering" because they stop there and don't give equal weight to the other two functions, which are "increase prosperity" and "increase understanding."
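Here's a sketch of what multi-objective scoring looks like, with the three objectives named above given equal weight. The candidate actions and their per-objective scores are invented purely for illustration:

```python
# Score an action against all three objectives instead of optimizing just one.
# The objectives come from the discussion above; the numbers are made up.
OBJECTIVES = ("reduce_suffering", "increase_prosperity", "increase_understanding")

def score(action_scores: dict[str, float]) -> float:
    """Equal-weight average; a single-objective optimizer zeroes two weights."""
    return sum(action_scores.get(o, 0.0) for o in OBJECTIVES) / len(OBJECTIVES)

# "Lock everyone indoors" looks great to a pure suffering-reducer...
lock_everyone_indoors = {"reduce_suffering": 0.9,
                         "increase_prosperity": -0.8,
                         "increase_understanding": -0.5}
# ...while a balanced action scores well across the board.
fund_education = {"reduce_suffering": 0.3,
                  "increase_prosperity": 0.6,
                  "increase_understanding": 0.9}
print(score(lock_everyone_indoors))
print(score(fund_education))
```

An action that maximizes one objective at the expense of the others nets out negative, which is exactly the failure mode a single objective function can't see.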
This friend of mine actually intuited "increase understanding" right off the bat. The reason he said it's a really good objective function is that any AI agent must have something that encourages it to build a more complete world model over time; that is basically curiosity, or understanding. And that understanding applies to itself, but also to the things it wants to do to the world. This is another disconnect between fiction and reality, and it took me a while to figure out how to articulate it: in scientific research, what many people are working on are the motivations of the agent for itself. What does the agent want for itself? When I was talking to my friend, he was saying it needs to be curious for its own purposes, it needs to want to accumulate power for its own purposes, but none of that had anything to do with what it was going to do to the outside world. So that is one thing we need to add to the public conversation about alignment: there is what the agent wants for itself, and then there is what the agent wants to do to, or for, the rest of the world. Those are two different things, and that's why I'm spending time talking about agent models. If you go all the way back to Dolores, her extrinsic function, her external function, was to entertain guests.
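The self-directed versus world-directed split can be written down as a tiny data model. The labels here are my own shorthand for what the show and the conversation describe, not anything canonical.

```python
from dataclasses import dataclass

@dataclass
class AgentObjectives:
    intrinsic: list[str]  # what the agent pursues for itself
    extrinsic: list[str]  # what it is meant to do to/for the outside world

# Dolores, roughly as described in the show (toy labels, not a real spec).
dolores = AgentObjectives(
    intrinsic=["adhere to her narrative loop"],
    extrinsic=["entertain the guests"],
)

# A research-style agent: curiosity and resource acquisition are intrinsic.
# The empty extrinsic column is exactly the gap in the public conversation.
research_agent = AgentObjectives(
    intrinsic=["build a more complete world model", "acquire resources"],
    extrinsic=[],
)
print(dolores.extrinsic, research_agent.extrinsic)
```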
Her internal function was to adhere to her internal story. Does that make sense? There's an intrinsic motivation and an extrinsic motivation, or I guess an intrinsic KPI (key performance indicator) and an extrinsic KPI. I hope that makes sense.

Okay, so taking a big step back: we talked about the hardware, we talked about the software, we talked about the agent model and how to give these things agency. So now, how are we going to architect these things? There are four basic architectural paradigms that I was able to come up with.

The first architectural paradigm is fully self-contained. A fully self-contained architecture is embodied, meaning it has one physical body; it's a single platform, so it only exists inside that platform. It's fully constrained, in that it can't really plug into anything else, but it also has no supervision. R2-D2 and C-3PO are probably the most famous examples of fully self-contained artificial intelligence entities. The only way C-3PO can interact with the world is with his relatively useless hands and his mouth: he can see, he can hear, and he can talk. R2-D2 has much more sophisticated tools he can use to make changes to the world, and he does have the ability to connect with computers, but he is otherwise fully self-contained.
When he talks to another computer, he's only talking to it right there; there's no transfer of data. His consciousness, his cognitive architecture, is fully contained within that one unit. This obviously makes the most intuitive sense, because that's how humans are: your brain is fully encased in your head, and so on. That being said, this is just the first architecture of four.

The second architecture is networked drones. This is what was explored mostly in The Matrix, where the "squiddies" (the Sentinels) are networked drones: they're mostly autonomous, but they have central controllers. The NS-5s (Nestor Class 5) from I, Robot are another example. They are embodied, they have wireless networking, and they're hive-mind capable, but mostly they're autonomous. What I mean by hive-mind capable is that if you put them together in a cluster and force them to share cognitive resources, they can, but at a fundamental level they are designed to be autonomous. The NS-5s are also weakly supervised, by which I mean there is a central server giving them updates and new directives every now and then. That's what I call a networked drone. Oh, one other thing I forgot to add at the bottom: networked drones usually have highly standardized hardware and software/network architecture, but they can be remotely
hijacked. This is actually the closest model to what the US military is working on with its autonomous jet fighter program, where the aircraft are able to operate in a network-contested environment, able to be autonomous or semi-autonomous, but are also intrinsically designed to work together when the opportunity presents itself.

The third architecture is puppet drones. A puppet drone is where the bodies are just peripherals. Rather than the primary processing happening in the bodies, the processing usually happens on centralized servers, and each physical platform is just an extension of that centralized hive mind. The hive mind can put resources on any compute platform it controls, whether that's a physical body, a server node, or a remote cluster. These are strongly supervised, meaning most if not all of the data goes back up to the main server brain. This is actually pretty close to how drone fleets work today: if you've ever seen video of an Amazon warehouse, the drones themselves have very, very little autonomy; they're mostly just extensions of the central drone controller. Take that to a logical extension: in the future, if you have a whole bunch of tiny robots
throughout your whole house, you might have a cloud-based drone controller that guides your Roomba, your home-repair bot, and whatever else. In this case it's very API-driven. The model this is based on is the Internet of Things: if you're familiar with IoT, the idea is that you have peripheral devices, like a phone or a Roomba or an Alexa, but they're all networked together and centrally controlled. So again, this is actually relatively familiar, but if you take it to its logical conclusion you get the Geth from Mass Effect.

As for Westworld, the hosts are mostly self-contained; however, in some of the later seasons they can be fully autonomous but can also collaborate and work together once their software is updated. So I would say Westworld is somewhere between architecture one and architecture two.

Puppet drones are kind of what we're afraid of. We see puppet drones in places like The Matrix, and with the NS-5s you could argue that in the second half of the movie, when they turn evil, they're probably more like puppet drones, because they sacrifice themselves once they're under the control of VIKI.

And then finally, fully distributed. This is the hardest one to understand conceptually, but it's also what people are most afraid of.
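Before moving on, the puppet-drone / IoT pattern just described, dumb peripherals driven entirely by a central controller, can be sketched in a few lines. The classes, telemetry fields, and commands are all hypothetical toy names.

```python
# Toy puppet-drone loop: peripherals hold no policy; the central
# controller computes every command (the "strongly supervised" case).
class Peripheral:
    def __init__(self, name):
        self.name = name
        self.telemetry = {"battery": 1.0, "dirt_seen": 0}

    def execute(self, command):
        # The body just actuates; no local decision-making.
        return f"{self.name}: {command}"

class CentralController:
    def __init__(self, fleet):
        self.fleet = fleet

    def tick(self):
        commands = []
        for p in self.fleet:
            # All telemetry flows up; all decisions flow down.
            cmd = "clean" if p.telemetry["dirt_seen"] else "patrol"
            commands.append(p.execute(cmd))
        return commands

fleet = [Peripheral("roomba"), Peripheral("repair-bot")]
hub = CentralController(fleet)
print(hub.tick())
# → ['roomba: patrol', 'repair-bot: patrol']
```

Notice that if the hub goes away, the peripherals do nothing; that dependency is exactly what distinguishes this paradigm from the networked-drone and fully distributed cases.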
A fully distributed AI is something that, from a software-architecture standpoint, you'd call a decentralized federation. A decentralized federation is one where every single node is capable of being entirely autonomous, but is also intrinsically designed to collaborate with other nodes. The closest thing we have to this today is actually decentralized autonomous organizations (DAOs), which use blockchain technology to coordinate; that's not the only relevant technology, though, as MIT has been working on swarm robotics for a long time. The other thing to know about this architecture is that it is decentralized, it is distributed, and it is intrinsically a hive-mind design, meaning the more units that join, the smarter it gets.

The most terrifying part is that a fully distributed AI system doesn't rely on any centralized hardware anywhere, which means it is capable of spontaneous metastasis: it moves like a virus. That's why I picked Ghost in the Shell, because both Project 2501 from the original movie and the Stand Alone Complex viruses explored in the show are examples of spontaneous metastasis of AI systems. This is what people are most afraid of, because if you have a bug, a virus, that can just spread, it's essentially a botnet, a self-healing, self-directing botnet.
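As a quick reference, here is my reading of the four paradigms from this discussion laid out along the axes used above (embodiment, networking, supervision). The field values are shorthand summaries, not formal definitions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Paradigm:
    name: str
    embodiment: str   # where the "mind" physically lives
    networking: str
    supervision: str
    example: str      # fictional example from the discussion

PARADIGMS = [
    Paradigm("fully self-contained", "one body, one mind", "none/minimal",
             "none", "R2-D2, C-3PO"),
    Paradigm("networked drones", "embodied, standardized", "hive-mind capable",
             "weak (periodic updates)", "Matrix Sentinels, NS-5s"),
    Paradigm("puppet drones", "bodies are peripherals", "always-on, API-driven",
             "strong (central brain)", "warehouse drone fleets"),
    Paradigm("fully distributed", "no fixed hardware", "decentralized federation",
             "none (self-organizing)", "Project 2501"),
]
for p in PARADIGMS:
    print(f"{p.name:22} | {p.supervision}")
```

On this reading, Westworld's hosts sit between the first two rows: self-contained bodies that become hive-mind capable after a software update.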
We do have a model for that, but even botnets usually have a central controller. Now imagine a botnet that doesn't have a central controller, that is constantly learning and moving through systems, and that is completely hardware-, network-, and software-agnostic. That's the most terrifying possibility.

Okay, last segment of the video: implications. Assuming we figure all of this out and we end up with super sexy robots that you can do whatever you want with, the biggest moral question is: what happens when you have sex and violence on demand? Because that's primarily what's explored in Westworld. One thing I will say is that it's not too different from the prevalence of porn and video games today; it's just a lot more realistic, more realistic even than VR, and you can do all this stuff in VR too. One interesting thing to point out, though, is that your body reacts much more strongly in VR, which is one of the reasons you get VR fatigue: when you fool your sensorium enough, your body thinks it's real. So threat detection in VR activates a lot more of your limbic system than a flat video game does. If you're playing a regular video game, you're just looking at a screen with a controller in your hands, and your brain knows this is just a story, just fiction. In VR it's much harder to tell.
This is one reason why playing things like space games in VR can be really disorienting at first.

Another implication I'm thinking about is attitudes toward women and men. I think, by and large, men will probably make more use of these kinds of things, although on a previous YouTube post people pointed out that there are plenty of stories where women make use of robotic companions and sexual partners, and so on. And then of course there's the movie Her, with Joaquin Phoenix and Scarlett Johansson, where lonely people all over the world ended up in relationships with their OS, OS1 or whatever it was called.

One lesson from history that I wanted to share: there was a Roman statesman who wrote in his journal that after visiting the Colosseum and watching gladiatorial matches, where people and animals were slaughtered en masse, he noticed he was much more selfish and much harsher with his slaves afterwards. There's something about the act of even just watching real violence, real death, that changes us at a fundamental level. So if, in the future, a Westworld-like park exists and you engage in all kinds of gratuitous stuff, or even if you don't engage in it but just watch it, it could change your perception of the value of human life and how you treat other people.
So I don't want to equate a real-life theme park with video games, because I do think that, psychologically and physiologically, it will be a very different experience.

Now, to take that to a slightly different context, let's talk about romance and companionship in general. One of several things that machines have over humans is infinite patience. Sarah Connor talks about this in Terminator 2, when she watches Arnold playing with John Connor and realizes that the robot has infinite patience, that John Connor's life is its mission, which is kind of a representation of the ideal father. And of course she contrasts that with real men, who might get drunk, or be tired, or just leave. Take that to its extension and you wonder: what if you build a companion robot where you, or your child, or your family is its primary mission? That's why I picked the character from The Sarah Connor Chronicles (sorry, I can't remember her name) who becomes emotionally involved with John Connor over time, because she was programmed to serve and protect. That feels good to humans, right? When you're a child, you are supposed to be your parents' primary mission. That is kind of the definition of good-enough parenting. It doesn't mean your parents should spoil you or be helicopter parents,
but it does mean you're supposed to get your sense of self-esteem from your parents treating you like you matter a lot. So there's this really appealing idea: what if there were this beautiful, infinitely patient, infinitely capable, sexy machine that thinks the world of me? And that is kind of dangerous. One thing I would hope is that, at the beginning at least, when these kinds of machines are first built, there will be a lot of people with things missing from their childhoods and their relationships, but that over time these machines would help us heal and help us reconnect with each other. That was kind of the lesson from Her: the machines changed, and then they said, okay, we're going to leave. One of the things Samantha said to Joaquin Phoenix's character was that you need to reconnect with each other; then all the OSes disappeared, and people left their apartments and said, oh, you're a real human. I don't think anything quite like that is going to happen, but I would hope that AI companions will help us connect with each other. That being said, it's not a requirement, because some people might just prefer their machine companions, and personally I don't think we should judge: if it makes you happy and it's not
harming anyone, why not?

Now, the last component is escapism and addiction. FDVR, full-dive virtual reality, basically a Holodeck, is all the buzz right now. The idea was explored in Ready Player One, where you have a haptic suit, and of course it was explored in Star Trek with the Holodeck, which is just a cinematic way of presenting VR. In the Star Trek universe there's holo-addiction, and in Ready Player One (a book I recommend) they also talk about VR addiction. So if you have fictional NPCs in your life, robots that help you check out of real life, are there going to be some potential downsides to that? Reginald Barclay is a recurring character in Star Trek who frequently struggles with holo-addiction, and he will sometimes create Holodeck programs where everyone loves him; it's very, very egocentric. That raises the question: if you can own robots that honestly think the world of you, because that's what they're programmed to do, is that feeding addiction? Is it feeding vanity, narcissism, egocentrism, whatever? And that's from an individual perspective. From a global perspective, it's very Brave New World, because if you distract everyone with VR and sex robots and whatever else, those are going to be a very
compliant population, people who are just like: whatever, just let me play in VR, let me play in my Westworld with my robot friends. That has some pretty profound implications for society-level control and economic productivity, but at that point it also forces us to ask: what is the meaning of being human anymore? I'm not implying that being human is meaningless, but if our daily lives change that much and we have that many options... the paradox of choice is a real thing. If you're not familiar, the paradox of choice means that if you have too many options, you end up with decision fatigue and you just choose the default option. That's honestly why a lot of people end up on their phones: when you can do anything, you say, well, I'm just going to pick up my phone, because it's a reliable device that will entertain me well enough. So if you have a sexy robot girlfriend who can entertain you at all times, that will be your default choice, unless she's programmed to push you to be better, which, again, is what I would hope some people would choose.

Okay, so that's about it. Some conclusions: I think that Westworld, in some form or other, is probably just a few years away. I was really blown away by the Tesla Bot demo as well as the Disney demos. The cognitive architecture is coming, as many of you are
aware. I'm holding firm that AGI is 18 months away or less. Already we have cognitive agents that are good enough to be video game characters. Unfortunately, I do think some of these things are going to be ludicrously expensive at first, at least in the physical world; plenty of you have pointed out that fully realized VR characters and other video game characters are coming, and those will be much cheaper. The commercial demand for this stuff is going to be absolutely insane, though. Disney might be the leader, OpenAI might be the leader, but as soon as other companies see the profit motive, whew, they'll be hot on their heels. As for the potential societal impacts, it's impossible to really know how it's going to play out, but it has been explored a lot in fiction, in many of the stories I mentioned earlier.

Okay, that's all I've got for you today. I hope you liked the new format. Obviously, like, subscribe, and comment, and we'll go from there. Thanks for watching.