1 00:00:06,505 --> 00:00:09,575 Welcome to YouTube Original Stages, 2 00:00:09,608 --> 00:00:12,979 once home to Howard Hughes's Spruce Goose assembly hangar, 3 00:00:13,012 --> 00:00:14,847 and home to much of the first Iron Man, 4 00:00:14,881 --> 00:00:16,483 filmed 12 years ago. 5 00:00:17,149 --> 00:00:18,885 Many happy memories here. 6 00:00:18,918 --> 00:00:21,354 And speaking of taking a look back... 7 00:00:21,387 --> 00:00:22,655 Technology. 8 00:00:22,688 --> 00:00:24,123 It's advancing faster 9 00:00:24,156 --> 00:00:25,892 and taking less time to be widely adopted 10 00:00:25,925 --> 00:00:27,393 than ever before, 11 00:00:27,427 --> 00:00:30,196 like as in it took roughly 10,000 years 12 00:00:30,229 --> 00:00:32,065 to go from writing to printing press, 13 00:00:32,098 --> 00:00:34,700 but only about 500 more to get to email. 14 00:00:34,734 --> 00:00:38,171 Now it seems we're at the dawn of a new age, 15 00:00:38,204 --> 00:00:39,739 the age of A.I... 16 00:00:39,772 --> 00:00:41,507 Artificial Intelligence. 17 00:00:41,541 --> 00:00:42,875 Please define. 18 00:00:48,314 --> 00:00:50,015 Uh-huh, okay. There you have it. 19 00:00:50,049 --> 00:00:52,017 What does it mean? I don't know. 20 00:00:52,051 --> 00:00:53,552 Tons of folks are working on it, right? 21 00:00:53,585 --> 00:00:55,155 Most people don't know that much about it, 22 00:00:55,188 --> 00:00:56,822 and of course, there's no shortage 23 00:00:56,856 --> 00:00:58,123 of data or opinions. 24 00:00:58,157 --> 00:00:59,859 Anyway, I've heard it said 25 00:00:59,893 --> 00:01:01,861 that the best way to learn about a subject 26 00:01:01,894 --> 00:01:02,896 is to teach it, 27 00:01:02,929 --> 00:01:04,197 but to level with ya, 28 00:01:04,230 --> 00:01:06,999 I have a wildly incomplete education... 29 00:01:07,032 --> 00:01:08,501 Not in my day job, 30 00:01:08,534 --> 00:01:11,137 where I've been A.I.-adjacent for over a decade. 31 00:01:11,170 --> 00:01:13,473 Anyway, I figured now would be as good a time as any 32 00:01:13,506 --> 00:01:15,407 to catch up on the state of things 33 00:01:15,441 --> 00:01:17,543 regarding this emerging phenomenon. 34 00:01:17,576 --> 00:01:20,779 My sense of it is it kind of feels like 35 00:01:20,813 --> 00:01:23,115 Pandora's box, maybe... ish? 36 00:01:23,148 --> 00:01:24,884 Much of my understanding on this topic 37 00:01:24,917 --> 00:01:26,786 has come from sci-fi stories, 38 00:01:26,819 --> 00:01:28,054 which usually depict us 39 00:01:28,087 --> 00:01:30,590 heading toward Shangri-La or dystopia. 40 00:01:30,623 --> 00:01:31,791 Like most things, 41 00:01:31,824 --> 00:01:34,660 I suspect the truth is probably somewhere in the middle. 42 00:01:34,694 --> 00:01:35,628 Now, along the way, 43 00:01:35,661 --> 00:01:37,497 we'll demystify some common misconceptions 44 00:01:37,530 --> 00:01:40,633 about things we thought we understood, but probably don't, 45 00:01:40,667 --> 00:01:42,135 terms such as 46 00:01:42,168 --> 00:01:44,136 "machine learning," "algorithms," 47 00:01:44,170 --> 00:01:46,940 "computer vision" and "Big Data," 48 00:01:46,973 --> 00:01:48,741 they will be conveniently unpacked 49 00:01:48,775 --> 00:01:51,544 to help us feel like we know what we're doing, 50 00:01:51,577 --> 00:01:52,412 kinda. 51 00:01:52,445 --> 00:01:54,681 By the way, Pandora's box... 52 00:01:58,485 --> 00:01:59,652 wasn't a box. 53 00:02:00,853 --> 00:02:02,255 It...
54 00:02:02,922 --> 00:02:04,791 was a clay jar. 55 00:02:04,824 --> 00:02:06,492 How about that? 56 00:02:06,525 --> 00:02:08,428 Demystified. 57 00:02:11,096 --> 00:02:14,200 A.I. is teaching the machine, 58 00:02:14,234 --> 00:02:17,236 and the machine becoming smart. 59 00:02:17,270 --> 00:02:19,705 Each time we create a more powerful technology, 60 00:02:19,738 --> 00:02:22,441 we create a bigger lever for changing the world. 61 00:02:22,475 --> 00:02:24,677 Autonomous driving started. 62 00:02:24,711 --> 00:02:26,345 It's an extraordinary time, 63 00:02:26,378 --> 00:02:29,315 one of unprecedented change and possibility. 64 00:02:30,483 --> 00:02:32,318 To help us understand what's happening, 65 00:02:32,352 --> 00:02:34,187 this series will look at innovators 66 00:02:34,220 --> 00:02:35,921 pushing the boundaries of A.I... 67 00:02:35,954 --> 00:02:37,189 No, stop! 68 00:02:37,223 --> 00:02:38,991 ...and how their groundbreaking work 69 00:02:39,025 --> 00:02:40,894 is profoundly impacting our lives... 70 00:02:40,927 --> 00:02:42,194 Yay! 71 00:02:42,228 --> 00:02:44,463 ...and the world around us. 72 00:02:44,496 --> 00:02:47,266 In this episode, we'll meet two different visionaries 73 00:02:47,299 --> 00:02:49,302 exploring identity, creativity, 74 00:02:49,335 --> 00:02:52,438 and collaboration between humans and machines. 75 00:02:52,471 --> 00:02:55,007 Intelligence used to be the province of only humans, 76 00:02:55,041 --> 00:02:56,375 but it no longer is. 77 00:02:56,409 --> 00:02:58,912 We don't program the machines. They learn by themselves. 78 00:03:09,155 --> 00:03:12,024 Mm. Ah. That's good. 79 00:03:12,058 --> 00:03:14,093 All right. 80 00:03:14,126 --> 00:03:17,430 My background's always been a mixture of art and science. 81 00:03:18,564 --> 00:03:21,767 I ended up doing a PhD in bioengineering, 82 00:03:21,801 --> 00:03:24,437 then I ended up in the film industry, 83 00:03:24,470 --> 00:03:27,073 working on King Kong to Avatar, 84 00:03:27,106 --> 00:03:28,841 simulating faces. 85 00:03:30,009 --> 00:03:31,844 I'd got to a point in my career 86 00:03:31,877 --> 00:03:32,912 where I'd been, you know, 87 00:03:32,946 --> 00:03:35,081 lucky enough to win a couple of Academy Awards, 88 00:03:35,114 --> 00:03:37,283 so I thought, "Okay, what happens 89 00:03:37,317 --> 00:03:40,319 if we actually tried to bring those characters to life, 90 00:03:40,353 --> 00:03:42,722 that actually you could interact with?" 91 00:03:45,124 --> 00:03:47,126 Baby... Ooh. 92 00:03:48,761 --> 00:03:49,795 What can you see? 93 00:03:49,829 --> 00:03:54,199 So "Baby X" is a lifelike simulation of a toddler. 94 00:03:54,233 --> 00:03:57,470 Hey. Are you excited to be here? 95 00:03:57,503 --> 00:03:59,338 She's actually seeing me through the web camera, 96 00:03:59,371 --> 00:04:02,141 she's listening through the microphone. 97 00:04:02,174 --> 00:04:04,577 Woo... yeah. 98 00:04:04,610 --> 00:04:07,346 Baby X is about exploring the nature 99 00:04:07,380 --> 00:04:09,349 of how would we build a digital consciousness, 100 00:04:09,382 --> 00:04:10,549 if it's possible? 101 00:04:10,582 --> 00:04:12,218 We don't know if it's possible, 102 00:04:12,251 --> 00:04:14,053 but we're chipping away at that problem. 103 00:04:14,086 --> 00:04:15,588 Hey, Baby. Hey. 104 00:04:15,621 --> 00:04:17,423 "Problem" is an understatement 105 00:04:17,456 --> 00:04:19,024 for what Mark's chipping away at. 
106 00:04:19,058 --> 00:04:20,359 His vision of the future 107 00:04:20,393 --> 00:04:22,562 is one where human and machine cooperate, 108 00:04:22,595 --> 00:04:25,397 and the best way to achieve that, he thinks, 109 00:04:25,431 --> 00:04:28,834 is to make A.I. as life-like as possible. 110 00:04:28,867 --> 00:04:31,137 Peek-a-boo! 111 00:04:32,638 --> 00:04:35,274 Which is why he began where most life begins... 112 00:04:35,307 --> 00:04:36,575 a baby... 113 00:04:36,609 --> 00:04:39,078 modeled after his own daughter. 114 00:04:39,111 --> 00:04:41,614 So if we start revealing her layers, 115 00:04:41,647 --> 00:04:43,582 she's driven by virtual muscles, 116 00:04:43,616 --> 00:04:45,551 and the virtual muscles, in turn, 117 00:04:45,584 --> 00:04:47,553 are driven by a virtual brain. 118 00:04:47,586 --> 00:04:49,888 Now, these are radically simplified models 119 00:04:49,922 --> 00:04:51,024 from the real thing, 120 00:04:51,057 --> 00:04:52,325 but nevertheless, 121 00:04:52,358 --> 00:04:54,593 they're models that we can explore how they work, 122 00:04:54,627 --> 00:04:56,996 because we have a real template that exists, 123 00:04:57,030 --> 00:04:58,398 the human brain. 124 00:04:59,932 --> 00:05:02,869 So, these are all driven by neural networks. 125 00:05:03,869 --> 00:05:04,803 "Neural network" 126 00:05:04,837 --> 00:05:06,773 is a virtual, much simpler version 127 00:05:06,806 --> 00:05:07,940 of the human brain. 128 00:05:07,974 --> 00:05:11,143 The brain is the most complex system in our body. 129 00:05:11,177 --> 00:05:14,880 It's got 85 billion neurons, each of which fire non-stop, 130 00:05:14,914 --> 00:05:19,385 receiving, processing, and sending information. 131 00:05:19,418 --> 00:05:22,021 Baby X's brain is nowhere near as complex, 132 00:05:22,054 --> 00:05:23,856 but that's the goal. 133 00:05:23,890 --> 00:05:26,625 Instead of neurons, it's got nodes. 134 00:05:26,658 --> 00:05:28,494 The more the nodes are exposed to, 135 00:05:28,527 --> 00:05:30,129 the more they learn. 136 00:05:30,163 --> 00:05:32,431 What we've learned is it's very hard to build a digital brain, 137 00:05:32,465 --> 00:05:34,466 but where we want to go with it 138 00:05:34,500 --> 00:05:37,503 is we're trying to build a human-like A.I. 139 00:05:37,537 --> 00:05:39,271 which has a flexible intelligence 140 00:05:39,305 --> 00:05:40,873 that can relate to people. 141 00:05:41,607 --> 00:05:43,409 I think the best kind of systems 142 00:05:43,442 --> 00:05:46,179 are when humans and A.I. work together. 143 00:05:46,212 --> 00:05:49,549 One of the biggest misconceptions of A.I. 144 00:05:49,582 --> 00:05:52,351 is that there is a super-intelligent being, 145 00:05:52,384 --> 00:05:54,253 or what we call a generalized A.I., 146 00:05:54,286 --> 00:05:56,655 that knows all, can do all, 147 00:05:56,688 --> 00:05:59,124 smarter than all of us put together. 148 00:05:59,157 --> 00:06:01,093 That is a total misconception. 149 00:06:01,127 --> 00:06:03,629 A.I. is built on us. 150 00:06:03,663 --> 00:06:06,365 A.I. is mimicking our thought processes. 151 00:06:06,398 --> 00:06:09,569 A.I. is basically an emulation of us. 152 00:06:11,037 --> 00:06:13,939 Like visionaries before him, Mark's a dreamer. 153 00:06:13,973 --> 00:06:16,475 The current state of his moonshot, however, 154 00:06:16,508 --> 00:06:17,809 is a little more earthbound. 155 00:06:17,843 --> 00:06:19,845 Thank you for granting access 156 00:06:19,879 --> 00:06:22,448 to your microphone. It's good to hear you. 
157 00:06:22,481 --> 00:06:23,749 Today, most avatars 158 00:06:23,783 --> 00:06:26,652 are basically glorified customer-service reps. 159 00:06:26,685 --> 00:06:27,720 Rest assured, 160 00:06:27,753 --> 00:06:29,455 your health is my primary concern. 161 00:06:29,488 --> 00:06:31,290 They can answer simple questions 162 00:06:31,323 --> 00:06:33,091 and give scripted responses. 163 00:06:33,125 --> 00:06:35,060 I love helping our customers, 164 00:06:35,094 --> 00:06:36,863 so I'm keen to keep learning. 165 00:06:36,896 --> 00:06:39,832 Beats dealing with automated phonelines for sure, 166 00:06:39,866 --> 00:06:42,268 but it's a far cry from Mark's ultimate vision... 167 00:06:42,301 --> 00:06:43,635 Hey, Baby. Hey. 168 00:06:43,669 --> 00:06:46,672 ...to create avatars that can actually learn, 169 00:06:46,706 --> 00:06:49,341 interpret, and interact with the world around them, 170 00:06:49,375 --> 00:06:51,310 like a real human. 171 00:06:51,343 --> 00:06:53,145 What's this? 172 00:06:53,178 --> 00:06:55,080 Spider. 173 00:06:55,114 --> 00:06:58,050 So we're starting to get a spider forming in her mind here, 174 00:06:58,083 --> 00:07:00,853 she's starting to associate the word with the image. 175 00:07:00,887 --> 00:07:03,122 So, Baby... spider. 176 00:07:04,156 --> 00:07:05,324 Spider. 177 00:07:05,357 --> 00:07:06,893 Spider... 178 00:07:06,926 --> 00:07:10,096 Good! Okay, what's this? 179 00:07:10,830 --> 00:07:12,130 Spider. 180 00:07:12,164 --> 00:07:14,233 No. This is a duck. 181 00:07:14,266 --> 00:07:15,901 Look at the duck. 182 00:07:15,934 --> 00:07:17,302 Duck. 183 00:07:17,336 --> 00:07:18,904 Yeah. 184 00:07:18,937 --> 00:07:22,742 Baby X uses a type of A.I. called "object recognition." 185 00:07:23,809 --> 00:07:27,646 Basically, it's how a computer sees... 186 00:07:27,680 --> 00:07:29,915 how it identifies an object, like a spider, 187 00:07:29,949 --> 00:07:33,485 or tells the difference between a spider and a duck. 188 00:07:33,518 --> 00:07:36,255 It's something that you and I do naturally... 189 00:07:36,288 --> 00:07:39,525 ...but machines, like Baby X, need to learn from scratch, 190 00:07:39,558 --> 00:07:42,695 by basically sifting through enormous piles of data 191 00:07:42,728 --> 00:07:44,096 to search for patterns, 192 00:07:44,130 --> 00:07:46,165 so that eventually, it can drive a car, 193 00:07:46,199 --> 00:07:49,001 or pick out a criminal in a crowded photograph, 194 00:07:49,035 --> 00:07:52,438 or tell the difference between me and... that guy. 195 00:07:52,471 --> 00:07:55,607 But now I'm gonna tell her that spiders are scary. 196 00:07:55,641 --> 00:07:59,211 Look out! Rawr! Scary spider! Rawr! 197 00:08:00,846 --> 00:08:03,248 Hey, hey. Don't cry. It's okay. Hey... 198 00:08:04,250 --> 00:08:05,417 Hey, it's okay. 199 00:08:05,451 --> 00:08:08,087 Now she's responding emotionally to me as well, 200 00:08:08,120 --> 00:08:10,722 so we've gone all the way down 201 00:08:10,756 --> 00:08:14,593 to virtual neurotransmitters, hormones, and so forth, 202 00:08:14,626 --> 00:08:15,995 so Baby X has a stress system. 203 00:08:16,595 --> 00:08:18,397 If I give her a fright... 204 00:08:18,431 --> 00:08:19,465 Boo! 205 00:08:19,498 --> 00:08:20,733 So we'll see basically 206 00:08:20,766 --> 00:08:22,902 some noradrenaline was released then, 207 00:08:22,935 --> 00:08:25,737 and she's gone into a much more vigilant state of mind. 
208 00:08:25,771 --> 00:08:27,506 What Mark is working on 209 00:08:27,539 --> 00:08:29,875 is known as "affective computing," 210 00:08:29,908 --> 00:08:33,612 A.I. that interprets and simulates human emotion. 211 00:08:33,646 --> 00:08:36,849 I believe that machines are gonna interact with humans 212 00:08:36,882 --> 00:08:38,917 just the way we interact with one another, 213 00:08:38,950 --> 00:08:41,120 through perception, through conversation. 214 00:08:41,153 --> 00:08:44,390 So as A.I. continues to become mainstream, 215 00:08:44,423 --> 00:08:46,759 it needs to really understand humans, 216 00:08:46,792 --> 00:08:49,461 and so we want to build emotion A.I. 217 00:08:49,495 --> 00:08:51,864 that enables machines to have empathy. 218 00:08:51,897 --> 00:08:53,265 Hello, Pepa. 219 00:08:53,299 --> 00:08:55,233 -Hello. - Hello. 220 00:08:55,267 --> 00:08:57,036 -Hello. - Hello. 221 00:08:57,069 --> 00:08:58,303 -Hello. 222 00:08:58,336 --> 00:08:59,638 Oh, dear. 223 00:08:59,671 --> 00:09:02,041 -We can do this forever. - I know we could. 224 00:09:02,074 --> 00:09:04,009 They've showed, for example, 225 00:09:04,043 --> 00:09:07,112 older adults who have A.I. aides at their nursing homes, 226 00:09:07,146 --> 00:09:08,647 they are happier 227 00:09:08,680 --> 00:09:10,783 with a robot that emotes and is social 228 00:09:10,816 --> 00:09:12,851 than having no one there. 229 00:09:12,885 --> 00:09:16,422 That's really the enhancement of human relationships. 230 00:09:16,455 --> 00:09:19,158 Hey... Hello. 231 00:09:19,191 --> 00:09:20,826 You know, human cooperation 232 00:09:20,860 --> 00:09:23,763 is the most powerful force in human history, right? 233 00:09:23,796 --> 00:09:26,065 Human cooperation with intelligent machines 234 00:09:26,098 --> 00:09:28,868 will define the next era of history. 235 00:09:28,901 --> 00:09:31,670 Using a machine which is connected 236 00:09:31,704 --> 00:09:34,040 with the rest of the world through the Internet, 237 00:09:34,073 --> 00:09:37,476 that can work as a creative, collaborative partner? 238 00:09:37,510 --> 00:09:39,245 That's unbelievable. 239 00:09:47,820 --> 00:09:50,289 Jessica. Jessica. One more time, one more time. 240 00:09:50,322 --> 00:09:52,792 We're gonna go from just the first two verses, 241 00:09:52,825 --> 00:09:53,859 and the first two verses 242 00:09:53,892 --> 00:09:56,128 will take us to three minutes, okay? 243 00:09:56,162 --> 00:09:57,463 I love music. 244 00:09:57,496 --> 00:09:59,665 The whole concept of music is collaboration, 245 00:09:59,698 --> 00:10:01,734 so if there are some people that see me as a musician, 246 00:10:01,767 --> 00:10:02,902 that's awesome. 247 00:10:06,706 --> 00:10:08,874 I first became interested in A.I. 248 00:10:08,907 --> 00:10:11,510 because A.I. is a very fruitful place to create in. 249 00:10:11,543 --> 00:10:13,612 It's a new tool for us. 250 00:10:13,645 --> 00:10:16,182 I dream, and make my dreams reality, 251 00:10:16,215 --> 00:10:17,783 whether the dream is a song 252 00:10:17,816 --> 00:10:20,853 or the dream is an avatar of myself. 253 00:10:20,886 --> 00:10:24,022 One time, a friend was like, "Well, you can't clone yourself. 254 00:10:24,055 --> 00:10:25,891 You can't be in two places at once." 255 00:10:25,925 --> 00:10:28,094 That's the promise of the avatar. 256 00:10:29,361 --> 00:10:30,563 I left it over there. 257 00:10:30,596 --> 00:10:32,764 All right, here we go. 258 00:10:32,797 --> 00:10:35,000 So, you're about to enter the Matrix. 
259 00:10:35,033 --> 00:10:39,672 I'm gonna sort of direct you through just a bunch of poses. 260 00:10:39,705 --> 00:10:41,706 The team from Soul Machines 261 00:10:41,740 --> 00:10:44,143 is here to create a digital avatar of myself. 262 00:10:44,176 --> 00:10:46,679 They had to put me in this huge contraption 263 00:10:46,712 --> 00:10:48,414 with these crazy lights. 264 00:10:49,582 --> 00:10:50,916 What do you want me to do? 265 00:10:50,949 --> 00:10:52,751 Your face is an instrument. 266 00:10:52,785 --> 00:10:55,421 All the wrinkles on the face is like a signature, 267 00:10:55,454 --> 00:10:56,789 so we want to get 268 00:10:56,822 --> 00:10:59,758 the highest-quality digital model of you that we can. 269 00:10:59,792 --> 00:11:01,626 Okay. 270 00:11:01,660 --> 00:11:03,629 Yeah, that's perfect. Okay, go. 271 00:11:06,164 --> 00:11:09,935 So we have to capture all the textures of their face. 272 00:11:09,968 --> 00:11:11,703 The geometry of their face... 273 00:11:11,736 --> 00:11:13,339 Big, gnashy teeth. 274 00:11:13,372 --> 00:11:15,140 How their face deforms 275 00:11:15,173 --> 00:11:17,443 to form the different facial expressions. 276 00:11:17,476 --> 00:11:18,577 And how about a kiss? 277 00:11:18,610 --> 00:11:19,544 You could do... 278 00:11:19,578 --> 00:11:20,746 With my eyes closed? 279 00:11:20,779 --> 00:11:21,880 'Cause I don't kiss with my eyes open. 280 00:11:21,913 --> 00:11:23,816 Every once in a while, I peek. 281 00:11:25,484 --> 00:11:26,785 I wanted to have 282 00:11:26,819 --> 00:11:29,421 a digital avatar around the idea of Idatity, 283 00:11:29,454 --> 00:11:32,791 and that's the marriage of my data and my identity. 284 00:11:32,825 --> 00:11:35,193 Everyone's concerned about, like, identity theft. 285 00:11:35,227 --> 00:11:38,063 Meanwhile, everybody's giving away all their data for free 286 00:11:38,097 --> 00:11:38,964 on the Internet. 287 00:11:38,998 --> 00:11:41,633 I'm what I like and what I don't like, 288 00:11:41,667 --> 00:11:43,635 I'm where I go, I'm who I know. 289 00:11:43,668 --> 00:11:45,704 I'm what I search. I am my thumbprint. 290 00:11:45,738 --> 00:11:48,073 I am my data. That's who I am. 291 00:11:48,106 --> 00:11:49,842 You pull your eyelids down like that. 292 00:11:49,875 --> 00:11:51,209 We want to get that... yup. 293 00:11:51,243 --> 00:11:53,212 When I'm on Instagram and I'm on Google, 294 00:11:53,245 --> 00:11:56,949 I'm actually programming those algorithms to better understand me. 295 00:11:56,982 --> 00:11:58,216 Awesome. 296 00:11:58,250 --> 00:12:00,719 In the future, my avatar's gonna be doing all that stuff, 297 00:12:00,752 --> 00:12:02,121 because I'm gonna program it. 298 00:12:02,154 --> 00:12:05,124 Get entertained through it, get information through it, 299 00:12:05,157 --> 00:12:06,391 and you feel like 300 00:12:06,424 --> 00:12:09,828 you're having a FaceTime with an intelligent entity. 301 00:12:09,861 --> 00:12:11,564 "Yo, check out this link." 302 00:12:11,597 --> 00:12:12,864 "Oh, wow, that's crazy." 303 00:12:12,898 --> 00:12:15,467 "Yo, can you post that on my Twitter?" 304 00:12:17,803 --> 00:12:19,838 - Hey. - Hey. 305 00:12:19,872 --> 00:12:22,708 All right, I'm the Soul Machines lead audio engineer. 306 00:12:22,741 --> 00:12:26,412 Hopefully we'll be able to build an A.I. version of your voice. 307 00:12:26,445 --> 00:12:29,147 After creating Will's look, 308 00:12:29,180 --> 00:12:31,817 then we now have to create his voice. 
309 00:12:31,850 --> 00:12:34,820 For that, we actually have to capture a lot of samples 310 00:12:34,853 --> 00:12:36,388 about how Will speaks, 311 00:12:36,422 --> 00:12:39,124 and that's actually quite a challenging process. 312 00:12:39,157 --> 00:12:41,460 - Shall we kick off? - Yeah, let's kick off. 313 00:12:41,493 --> 00:12:42,995 - A'ight, boo, here we go. - Yeah. 314 00:12:43,028 --> 00:12:44,897 I'm Will, and I'm happy to meet you. 315 00:12:44,930 --> 00:12:47,666 I'm here to bring technology to life, 316 00:12:47,699 --> 00:12:50,468 and let's talk about Artificial Intelligence. 317 00:12:50,502 --> 00:12:53,405 Oops. Really? Whoa. 318 00:12:53,438 --> 00:12:54,773 That's dope! 319 00:12:54,806 --> 00:12:57,142 So there's so many ways of saying "dope," bro. 320 00:12:57,175 --> 00:12:58,010 Yeah, yeah. 321 00:12:58,043 --> 00:12:59,945 Now, how realistic is it going to be? 322 00:12:59,978 --> 00:13:01,647 This will sound like you. 323 00:13:01,680 --> 00:13:04,717 The sentences can be divided up into parts 324 00:13:04,750 --> 00:13:06,618 so that we can create words 325 00:13:06,651 --> 00:13:08,787 and build sentences, like LEGO blocks. 326 00:13:08,820 --> 00:13:11,356 It will sound exactly like you. 327 00:13:11,390 --> 00:13:13,993 Well, maybe we don't want to have it too accurate. 328 00:13:14,026 --> 00:13:18,663 So you don't freak people out, maybe I don't want it accurate. 329 00:13:18,697 --> 00:13:20,832 Maybe, there should be some type of... 330 00:13:20,865 --> 00:13:21,934 "That's the A.I.," 331 00:13:21,967 --> 00:13:23,568 'cause this is all new ground. 332 00:13:23,602 --> 00:13:25,070 - Yeah. - Like, we've... 333 00:13:25,104 --> 00:13:27,206 we are in an intersection of a place 334 00:13:27,239 --> 00:13:28,907 that we've never been in society, 335 00:13:28,940 --> 00:13:31,109 where people have to determine 336 00:13:31,143 --> 00:13:33,412 what's real and what's not. 337 00:13:35,480 --> 00:13:37,382 While Mark jets back to New Zealand 338 00:13:37,416 --> 00:13:39,918 to try to create Will's digital doppelganger, 339 00:13:39,951 --> 00:13:42,387 Will's left waiting, and wondering... 340 00:13:42,420 --> 00:13:44,522 can Mark pull this off? 341 00:13:44,556 --> 00:13:45,758 What does it mean 342 00:13:45,791 --> 00:13:47,659 to have a lifelike avatar of you? 343 00:13:47,692 --> 00:13:50,696 A digital replicant of yourself? 344 00:13:50,729 --> 00:13:52,498 Is that a good idea? 345 00:13:52,531 --> 00:13:54,466 How far is too far? 346 00:13:54,499 --> 00:13:56,735 We've been collaborating with machines 347 00:13:56,768 --> 00:13:58,370 since the dawn of technology. 348 00:13:58,404 --> 00:14:00,138 I mean, even today, 349 00:14:00,171 --> 00:14:02,274 in some sense, we are all cyborgs already. 350 00:14:02,307 --> 00:14:03,775 For example, 351 00:14:03,809 --> 00:14:06,144 you use OKCupid to find a date, 352 00:14:06,177 --> 00:14:09,514 and then you use Yelp to decide where to go, you know, 353 00:14:09,547 --> 00:14:10,882 what restaurant to go to, 354 00:14:10,915 --> 00:14:12,451 and then you start driving your car, 355 00:14:12,484 --> 00:14:15,220 but there's a GPS system that actually tells you where to go. 356 00:14:15,254 --> 00:14:17,923 So the human and the machine decision-making 357 00:14:17,956 --> 00:14:19,491 are very tightly interwoven, 358 00:14:19,525 --> 00:14:22,294 and I think this will only increase as we go forward. 359 00:14:25,764 --> 00:14:29,501 Human collaboration with intelligent machines... 
360 00:14:29,535 --> 00:14:31,770 A different musician in a different town 361 00:14:31,803 --> 00:14:32,871 with a different approach 362 00:14:32,904 --> 00:14:34,940 is giving the same problem a shot. 363 00:14:34,973 --> 00:14:36,775 People are concerned 364 00:14:36,808 --> 00:14:38,710 about A.I. replacing humans, 365 00:14:38,743 --> 00:14:40,445 and I think it is not only 366 00:14:40,478 --> 00:14:42,948 not going to replace humans, it's going to enhance humans. 367 00:14:45,784 --> 00:14:48,119 I'm Gil Weinberg. I'm the founding director 368 00:14:48,153 --> 00:14:50,522 of Georgia Tech Center for Music Technology. 369 00:14:51,823 --> 00:14:53,992 Ready? 370 00:14:54,025 --> 00:14:57,295 In my lab, we are trying to create the new technologies 371 00:14:57,329 --> 00:15:00,799 that will explore new ways to be expressive... 372 00:15:00,832 --> 00:15:02,801 to be creative... 373 00:15:02,835 --> 00:15:05,571 Shimon, it's a marimba-playing robot. 374 00:15:08,039 --> 00:15:11,643 What it does is listen to humans playing, 375 00:15:11,676 --> 00:15:14,346 and it can improvise. 376 00:15:15,681 --> 00:15:18,350 Shimon is our first robotic musician 377 00:15:18,383 --> 00:15:20,753 that has the ability to find patterns, 378 00:15:20,786 --> 00:15:22,054 so, machine learning. 379 00:15:23,956 --> 00:15:25,190 Machine learning 380 00:15:25,224 --> 00:15:28,326 is the ability to find patterns in data. 381 00:15:28,360 --> 00:15:31,463 So, for example, if we feed Shimon Miles Davis, 382 00:15:31,496 --> 00:15:32,463 it will try to see 383 00:15:32,497 --> 00:15:34,766 what note is he likely to play after what note, 384 00:15:34,799 --> 00:15:38,103 and once it finds its patterns, it can start to manipulate it, 385 00:15:38,136 --> 00:15:40,205 and I can have the robot playing in a style 386 00:15:40,239 --> 00:15:43,575 that maybe is 30% Miles Davis, 30% Bach, 387 00:15:43,609 --> 00:15:46,377 30% Madonna, and 10% my own, 388 00:15:46,411 --> 00:15:50,048 and create morphing of music that humans would never create. 389 00:15:55,320 --> 00:15:56,688 Gil's groundbreaking work 390 00:15:56,722 --> 00:15:59,725 in artificial creativity and musical expression 391 00:15:59,758 --> 00:16:02,561 has been performed by symphonies around the world... 392 00:16:03,695 --> 00:16:04,997 ...but his innovation 393 00:16:05,030 --> 00:16:07,566 also caught the attention of another musician... 394 00:16:07,599 --> 00:16:08,634 Okay. 395 00:16:08,667 --> 00:16:10,869 ...a guy who unexpectedly pushed Gil 396 00:16:10,902 --> 00:16:12,671 beyond enhancing robots 397 00:16:12,704 --> 00:16:15,040 to augmenting humans. 398 00:16:15,073 --> 00:16:17,809 I met Jason Barnes about six years ago, 399 00:16:17,842 --> 00:16:20,912 when I was just about finishing one phase of developing Shimon, 400 00:16:20,945 --> 00:16:24,549 and I was starting to think, "What's next?" 401 00:16:24,583 --> 00:16:27,819 I got my first drum kit when I was 15, on Christmas, 402 00:16:27,852 --> 00:16:30,021 and when I lost my limb, I was 22, 403 00:16:30,054 --> 00:16:32,358 so I was kind of used to having two limbs. 404 00:16:34,392 --> 00:16:36,995 I started trying to fabricate prosthetics 405 00:16:37,028 --> 00:16:38,697 to try and get me back on the kit, 406 00:16:38,730 --> 00:16:41,934 which eventually led me to working and collaborating with Georgia Tech. 
407 00:16:44,269 --> 00:16:46,772 He told me that he lost his arm, 408 00:16:46,805 --> 00:16:48,606 he was devastated, he was depressed, 409 00:16:48,640 --> 00:16:49,808 music was his life, 410 00:16:49,841 --> 00:16:53,078 and he said, "I saw that you develop robotic musicians. 411 00:16:53,112 --> 00:16:55,080 Can you use some of the technology that you have 412 00:16:55,113 --> 00:16:59,418 in order to allow me to play again like I used to?" 413 00:16:59,451 --> 00:17:02,854 So that's the prosthetic arm that we built for Jason. 414 00:17:02,888 --> 00:17:04,289 When he came to us, 415 00:17:04,323 --> 00:17:06,925 he just wanted to be able to use sensors here 416 00:17:06,959 --> 00:17:09,961 so he can hold the stick tight or loose. 417 00:17:09,994 --> 00:17:12,164 I suggested "Let's do that, but also, 418 00:17:12,197 --> 00:17:13,398 let's have two sticks. 419 00:17:13,431 --> 00:17:15,534 One stick can operate with a mind of its own, 420 00:17:15,567 --> 00:17:17,435 understanding the music and improvising. 421 00:17:17,468 --> 00:17:20,305 One stick can operate based on what you tell it with your muscle, 422 00:17:20,338 --> 00:17:23,242 and also, each one of the sticks can play 20 hertz... 423 00:17:24,710 --> 00:17:26,111 ...faster than any humans, 424 00:17:26,144 --> 00:17:27,880 and together, they can create polyrhythm, 425 00:17:27,913 --> 00:17:31,516 create all kind of textures that humans cannot create." 426 00:17:31,550 --> 00:17:33,485 All right. I think we're ready to play. 427 00:17:38,123 --> 00:17:40,325 In some ways, the robotic drum arm 428 00:17:40,359 --> 00:17:43,195 allows Jason to play better than he ever has, 429 00:17:43,228 --> 00:17:45,196 but it still lacks the true function, 430 00:17:45,230 --> 00:17:47,732 or feeling, of a human hand. 431 00:17:47,765 --> 00:17:49,033 They don't provide 432 00:17:49,067 --> 00:17:51,469 the kind of dexterity and subtle control 433 00:17:51,503 --> 00:17:53,806 that would really allow anything. 434 00:17:55,607 --> 00:17:56,541 This revelation 435 00:17:56,574 --> 00:17:58,877 drove Gil to his next innovation... 436 00:17:58,910 --> 00:18:02,146 the Skywalker Hand. 437 00:18:02,180 --> 00:18:04,382 Inspired by Luke Skywalker from Star Wars, 438 00:18:04,416 --> 00:18:07,252 and created in collaboration with Jason, 439 00:18:07,285 --> 00:18:09,187 the revolutionary tech 440 00:18:09,221 --> 00:18:11,690 brings what was once the realm of sci-fi 441 00:18:11,723 --> 00:18:13,725 a little closer to our galaxy. 442 00:18:13,758 --> 00:18:15,727 This is just like a 3D-printed hand 443 00:18:15,760 --> 00:18:18,263 that you can, like, download the files online. 444 00:18:18,297 --> 00:18:20,766 Currently, most advanced prosthetic hands 445 00:18:20,799 --> 00:18:24,135 can't even thumbs-up or flip you the bird. 446 00:18:24,168 --> 00:18:26,805 They can only open or grip, 447 00:18:26,839 --> 00:18:28,774 using all five fingers at once. 448 00:18:28,807 --> 00:18:32,210 Most of the prosthetics that are available on the market nowadays, 449 00:18:32,243 --> 00:18:34,279 um, actually use EMG technology, 450 00:18:34,312 --> 00:18:35,814 which stands for "electromyography," 451 00:18:35,847 --> 00:18:38,616 and essentially what it does is there are two sensors 452 00:18:38,650 --> 00:18:40,919 that make contact with my residual limb, 453 00:18:40,952 --> 00:18:43,755 and they pick up electrical signals from the muscles... 
454 00:18:43,788 --> 00:18:46,090 So again, when I flex and extend my residual limb, 455 00:18:46,124 --> 00:18:47,959 it will open and close the hand, 456 00:18:47,992 --> 00:18:50,195 um, and I can rotate as well, 457 00:18:50,228 --> 00:18:51,763 but the problem with EMG 458 00:18:51,796 --> 00:18:54,999 is it's a very vague electrical signal, so zero to 100%. 459 00:18:55,032 --> 00:18:56,468 It's not very accurate at all. 460 00:18:56,502 --> 00:18:59,437 The Skywalker Hand actually uses ultrasound tech. 461 00:18:59,470 --> 00:19:00,972 Ultrasound provides an image, 462 00:19:01,005 --> 00:19:04,443 and you can see everything that's going on inside of the arm. 463 00:19:04,476 --> 00:19:07,312 Ultrasound uses high-frequency sound waves 464 00:19:07,345 --> 00:19:10,182 to capture live images from inside the body. 465 00:19:11,282 --> 00:19:12,984 As Jason flexes his muscles 466 00:19:13,018 --> 00:19:14,786 to move each of his missing fingers, 467 00:19:14,819 --> 00:19:19,458 ultrasound generates live images that visualize his intention. 468 00:19:20,459 --> 00:19:23,128 The A.I. then uses machine learning 469 00:19:23,161 --> 00:19:24,462 to predict patterns, 470 00:19:24,496 --> 00:19:26,531 letting a man who's lost one of his hands 471 00:19:26,564 --> 00:19:29,400 move all five of his fingers individually, 472 00:19:29,434 --> 00:19:32,270 even if he's as unpredictable as Keith Moon. 473 00:19:32,304 --> 00:19:34,138 The work that Gil is doing 474 00:19:34,172 --> 00:19:35,440 is really important. 475 00:19:35,474 --> 00:19:37,909 Gil comes from a non-engineering background, 476 00:19:37,943 --> 00:19:39,912 which means that his technology 477 00:19:39,945 --> 00:19:42,013 and the way he thinks about robotics 478 00:19:42,046 --> 00:19:43,315 is actually quite different 479 00:19:43,348 --> 00:19:44,816 than, say, the way I would think about it, 480 00:19:44,849 --> 00:19:46,718 since I come from an engineering background. 481 00:19:46,751 --> 00:19:49,521 And the commonality is that we want to design robots 482 00:19:49,554 --> 00:19:52,491 to really impact and make a difference in the world. 483 00:19:53,992 --> 00:19:56,261 We were able to create a proof of concept 484 00:19:56,294 --> 00:19:57,663 with Jason Barnes. 485 00:19:57,696 --> 00:20:01,433 Once we discovered that we can do this with ultrasound, 486 00:20:01,466 --> 00:20:03,234 immediately I looked at, 487 00:20:03,268 --> 00:20:05,370 "Hey, let's try to help more people." 488 00:20:10,208 --> 00:20:12,978 That's okay, just leave me hanging, holding it. 489 00:20:13,011 --> 00:20:14,346 It's not heavy or anything. 490 00:20:14,379 --> 00:20:15,447 It's safe, if you want to slide it back... 491 00:20:15,480 --> 00:20:17,349 No, no. I'm messing with you. 492 00:20:17,382 --> 00:20:18,583 So I met Jason Barnes 493 00:20:18,617 --> 00:20:20,785 at an event called "Lucky Fin Weekend." 494 00:20:20,819 --> 00:20:23,922 They're a foundation that deals with limb difference. 495 00:20:23,955 --> 00:20:25,290 There we go. 496 00:20:25,323 --> 00:20:27,526 - Ah, all right. - And it's out. 497 00:20:27,559 --> 00:20:29,361 Do you ever work on your car 498 00:20:29,394 --> 00:20:30,762 without the hook? 499 00:20:30,795 --> 00:20:33,765 Not really. It's just way easier and efficient for me to... 500 00:20:33,798 --> 00:20:36,334 The hook, the hook really trips me out, though, man. 
501 00:20:36,368 --> 00:20:38,503 When I lost my hand, 502 00:20:38,536 --> 00:20:39,938 it was close to 30 years ago, 503 00:20:39,971 --> 00:20:42,574 and prosthetics were kind of stuck in the Dark Ages. 504 00:20:47,879 --> 00:20:50,782 In general, they didn't really do a whole lot, 505 00:20:50,815 --> 00:20:52,049 and even if they moved, 506 00:20:52,083 --> 00:20:55,821 they seemed to be more passive than actually worthwhile to use. 507 00:20:58,757 --> 00:21:01,059 I don't like to talk about my accident, 508 00:21:01,092 --> 00:21:03,495 because I don't feel it defines me. 509 00:21:03,528 --> 00:21:05,330 The narrative on limb-different people 510 00:21:05,363 --> 00:21:07,398 has been the accident. 511 00:21:07,432 --> 00:21:10,335 "This is what happened, and these are these sad things," 512 00:21:10,368 --> 00:21:12,771 and it becomes inspiration porn. 513 00:21:14,839 --> 00:21:17,309 For me, for example, right, if I do something, 514 00:21:17,342 --> 00:21:19,444 I have to, like, smash it out of the park, 515 00:21:19,477 --> 00:21:21,079 because otherwise I feel like there's gonna be this, 516 00:21:21,112 --> 00:21:24,149 "Oh, well, he did it good enough because he's missing his hand." 517 00:21:24,182 --> 00:21:25,950 - Yeah, yeah. - And I'm like, "F that!" 518 00:21:25,983 --> 00:21:29,520 Like, I want to... I'm gonna be as good or better than somebody with two hands 519 00:21:29,554 --> 00:21:31,023 doing whatever I'm doing, you know? 520 00:21:32,390 --> 00:21:34,359 Prosthetics, at this point in my life, 521 00:21:34,392 --> 00:21:37,829 don't really seem like something I would want or need. 522 00:21:37,862 --> 00:21:39,998 Manual robotic prosthetics 523 00:21:40,031 --> 00:21:41,800 have not been adopted well. 524 00:21:41,833 --> 00:21:42,867 Amputees try them, 525 00:21:42,900 --> 00:21:44,736 and then they don't continue to use them. 526 00:21:50,742 --> 00:21:53,078 Yeah, man, you stoked to check out the lab? 527 00:21:53,111 --> 00:21:54,412 Yeah, yeah, for sure. 528 00:21:54,446 --> 00:21:57,282 Right now, I'm the only amputee that's ever used 529 00:21:57,315 --> 00:21:58,649 the Skywalker Arm before. 530 00:21:58,683 --> 00:22:00,618 Did you have... were you right-handed? 531 00:22:00,651 --> 00:22:01,987 No, I was born left-handed, actually. 532 00:22:02,020 --> 00:22:03,454 Oh, you lucky bastard. 533 00:22:03,487 --> 00:22:05,256 - Yeah, I know, right? - I was right-handed. 534 00:22:05,289 --> 00:22:06,958 It was extremely important 535 00:22:06,991 --> 00:22:09,561 to get as many different people as we can in there, 536 00:22:09,594 --> 00:22:10,928 including other amputees. 537 00:22:10,962 --> 00:22:13,798 It's hard to find people that are amputees in general, 538 00:22:13,832 --> 00:22:16,768 and then, like, upper-extremity amputees is the next thing, 539 00:22:16,801 --> 00:22:18,470 and then finding people who are willing, 540 00:22:18,503 --> 00:22:20,238 to step out of their comfort zone 541 00:22:20,271 --> 00:22:22,106 - and then do this. - Right. 542 00:22:22,139 --> 00:22:23,575 When I met Jason, 543 00:22:23,608 --> 00:22:26,545 I found it really interesting that we had a lot in common, 544 00:22:26,578 --> 00:22:30,014 because we were both into cars, we were both into music. 545 00:22:30,048 --> 00:22:31,783 - Hi, Gil. - Hey. What's up? 546 00:22:31,816 --> 00:22:33,951 - Jason. Nice to meet ya. - Nice meeting you. 547 00:22:33,985 --> 00:22:36,854 He's a step or two ahead of me with the technology stuff. 
548 00:22:36,888 --> 00:22:39,724 The way this hand works is it essentially picks up 549 00:22:39,758 --> 00:22:42,093 the ultrasound signals from my residual limb, 550 00:22:42,127 --> 00:22:43,862 so when I move my index finger, 551 00:22:43,895 --> 00:22:45,129 it'll move my index... 552 00:22:45,162 --> 00:22:47,232 ring... 553 00:22:47,265 --> 00:22:48,799 Wow, for the first time, 554 00:22:48,833 --> 00:22:50,502 prosthetics are finally getting to the point 555 00:22:50,535 --> 00:22:52,270 where they're getting pretty close 556 00:22:52,303 --> 00:22:54,239 to actual human hand. 557 00:22:54,272 --> 00:22:55,907 You know, it got me excited. I was like, 558 00:22:55,941 --> 00:22:58,143 "This is the type of thing that I've been waiting for." 559 00:22:58,176 --> 00:22:59,877 If I was ever going to try one again, 560 00:22:59,910 --> 00:23:02,881 this would be the type of stuff that I would want to check out. 561 00:23:02,914 --> 00:23:04,081 When I move my thumb... 562 00:23:08,286 --> 00:23:10,354 I know from experience 563 00:23:10,388 --> 00:23:12,357 that it's not always working perfectly. 564 00:23:12,391 --> 00:23:14,459 It's very interesting for me to have someone else 565 00:23:14,492 --> 00:23:16,327 who comes and tries our technology 566 00:23:16,360 --> 00:23:18,130 to see if it can be generalized. 567 00:23:20,665 --> 00:23:23,434 Is my arm getting warmer because you're wrapping it, 568 00:23:23,468 --> 00:23:24,836 or does that have heat in it? 569 00:23:24,870 --> 00:23:26,737 - It does have heat in it. - Oh, okay. 570 00:23:26,771 --> 00:23:29,741 First thing we need, if we're gonna get Jay to try the hand, 571 00:23:29,774 --> 00:23:32,310 is we need to get a custom-fit socket to his arm 572 00:23:32,343 --> 00:23:34,546 that's comfortable and fits nice and snug. 573 00:23:34,579 --> 00:23:36,347 You comfortable when they do this? 574 00:23:36,381 --> 00:23:38,316 This is the most awkward part for me. 575 00:23:38,349 --> 00:23:40,652 - Nah, it was kinda weird. - Ah, yeah. Yeah. 576 00:23:40,685 --> 00:23:42,787 I was 12 years old when I lost my hand 577 00:23:42,820 --> 00:23:44,689 and had a prosthetic for six months, 578 00:23:44,723 --> 00:23:46,925 and pretty much ever since then, I haven't used it, 579 00:23:46,958 --> 00:23:48,793 and it's been close to 30 years now. 580 00:23:48,827 --> 00:23:50,828 And there's the impression of your arm. 581 00:23:50,862 --> 00:23:52,430 That's way easier than I thought it was gonna be. 582 00:23:52,463 --> 00:23:53,732 That's wild, yeah! 583 00:23:53,765 --> 00:23:55,966 It may not be right for me, but this is something 584 00:23:56,000 --> 00:23:58,503 that could really, really help people's lives. 585 00:23:58,536 --> 00:23:59,805 It would be really cool 586 00:23:59,838 --> 00:24:03,008 to have a hand in helping to develop the technology. 587 00:24:04,242 --> 00:24:06,411 All right. 588 00:24:06,444 --> 00:24:07,779 All right, ready? 589 00:24:08,580 --> 00:24:10,448 Just slide it in. 590 00:24:10,481 --> 00:24:12,150 Turn this... tighten. 591 00:24:13,485 --> 00:24:14,652 How tight? 592 00:24:14,686 --> 00:24:16,788 As tight as you can before it really hurts... 593 00:24:16,821 --> 00:24:18,356 - Oh, really? - ...because the tighter it is, 594 00:24:18,389 --> 00:24:20,491 - the better reading we'll see. - Okay. 595 00:24:20,525 --> 00:24:22,360 - Now we apply the probe... - Okay. 596 00:24:22,394 --> 00:24:24,062 ...so it can read your movements. 
597 00:24:24,095 --> 00:24:25,229 Now we also 598 00:24:25,263 --> 00:24:27,432 have to work on the algorithm and the machine learning, 599 00:24:27,465 --> 00:24:29,300 and for this, we will need you to train. 600 00:24:29,333 --> 00:24:30,701 Okay. 601 00:24:30,735 --> 00:24:33,137 An able-bodied person, when you move your finger, 602 00:24:33,171 --> 00:24:35,039 you're not thinking about moving your finger, 603 00:24:35,072 --> 00:24:37,708 you just do it, because that's how we're hardwired, 604 00:24:37,742 --> 00:24:39,544 but, honestly, I don't really remember 605 00:24:39,577 --> 00:24:41,646 what it was like to even have that hand. 606 00:24:41,679 --> 00:24:44,149 Even though an amputee doesn't have a thumb, 607 00:24:44,182 --> 00:24:45,717 they still have the muscle. 608 00:24:45,750 --> 00:24:48,319 You still have some kind of memory 609 00:24:48,352 --> 00:24:49,888 of how you moved your fingers, 610 00:24:49,921 --> 00:24:52,490 and you can think about moving your phantom fingers, 611 00:24:52,523 --> 00:24:54,525 and the muscles would move accordingly, 612 00:24:54,559 --> 00:24:56,694 and that's exactly what we use in order to, uh, 613 00:24:56,727 --> 00:24:59,564 recreate the motion and put it in a prosthetic arm. 614 00:24:59,598 --> 00:25:03,534 But does Jay still remember how to move fingers 615 00:25:03,568 --> 00:25:06,204 that he didn't have for, I believe, 30 years ago? 616 00:25:06,237 --> 00:25:08,606 Now we'll run the model, 617 00:25:08,640 --> 00:25:10,708 and you'll be able to control the hand. 618 00:25:10,741 --> 00:25:13,444 You're optimistic. I'm crossing fingers. 619 00:25:13,478 --> 00:25:15,379 Can I cross these fingers? 620 00:25:15,413 --> 00:25:17,281 Is that... is that an option yet? 621 00:25:17,314 --> 00:25:19,750 Having Jay here for a day 622 00:25:19,783 --> 00:25:21,752 and hoping to get him to a point 623 00:25:21,786 --> 00:25:23,687 that he controls finger by finger, 624 00:25:23,721 --> 00:25:25,923 I'm a little concerned that it will not work 625 00:25:25,957 --> 00:25:28,026 in such a short period of time. 626 00:25:28,059 --> 00:25:29,794 Okay. And... 627 00:25:29,828 --> 00:25:32,730 - Ready? - Yeah. You should try each of the fingers. 628 00:25:32,763 --> 00:25:34,299 All right, that's the thumb... 629 00:25:35,500 --> 00:25:37,869 - Oh, shit! - Unbelievable. 630 00:25:39,404 --> 00:25:41,338 All right, index... 631 00:25:41,372 --> 00:25:42,640 Yay! 632 00:25:42,674 --> 00:25:44,008 Wow, I'm surprised. 633 00:25:44,041 --> 00:25:46,043 Middle... 634 00:25:46,077 --> 00:25:47,245 Dude. 635 00:25:50,181 --> 00:25:51,917 Five for five? 636 00:25:53,918 --> 00:25:56,320 All five of them! 637 00:25:56,353 --> 00:25:57,488 - Whoa. - That's wild. 638 00:25:57,522 --> 00:25:59,324 All right, let me do it again. 639 00:25:59,357 --> 00:26:00,625 You're a natural, man. 640 00:26:00,658 --> 00:26:02,527 Doesn't that feel crazy? 641 00:26:02,560 --> 00:26:04,295 - Yeah! - Feels wild. 642 00:26:04,328 --> 00:26:06,931 - I didn't think it'd be as good. - I didn't either. 643 00:26:06,965 --> 00:26:09,234 He hit me in the back after it worked, so... 644 00:26:09,267 --> 00:26:10,501 That's the first time. 645 00:26:10,535 --> 00:26:13,471 It's like a game-changer, even in its infancy, 646 00:26:13,504 --> 00:26:14,739 which is kind of insane, 647 00:26:14,772 --> 00:26:16,975 because it can only get better from there. 648 00:26:17,008 --> 00:26:19,644 And it's really cool to play a small part in that. 
649 00:26:19,677 --> 00:26:21,946 Now we have two main goals. 650 00:26:21,979 --> 00:26:24,949 First, you need to move your muscle or your phantom finger, 651 00:26:24,983 --> 00:26:28,219 and immediately see response, so this is one direction of research. 652 00:26:28,253 --> 00:26:31,455 The other direction is to make it more accurate. 653 00:26:31,489 --> 00:26:33,157 Being able to type on a keyboard, 654 00:26:33,191 --> 00:26:35,960 use a computer mouse, uh, open a water bottle, 655 00:26:35,993 --> 00:26:38,262 things like that that most people take for granted. 656 00:26:38,296 --> 00:26:41,933 It's kind of like a... you know, sci-fi movie, soon to be written. 657 00:26:41,966 --> 00:26:44,435 Give us five, right? 658 00:26:44,469 --> 00:26:46,805 That's awkward... oh, robot to robot hand. 659 00:26:46,838 --> 00:26:47,906 Nice! 660 00:26:49,173 --> 00:26:51,676 - That's... that was real, right? - Yeah. 661 00:26:51,710 --> 00:26:53,877 If I find out you guys had a button under that desk... 662 00:26:53,911 --> 00:26:56,046 No, nah, I promise. I promise. 663 00:26:56,080 --> 00:26:58,616 What began as one man's pursuit 664 00:26:58,650 --> 00:27:01,185 to innovate music through A.I. and robotics 665 00:27:01,218 --> 00:27:04,222 unexpectedly became something much greater. 666 00:27:05,689 --> 00:27:08,259 A human body cooperating with a bionic hand 667 00:27:08,292 --> 00:27:09,927 is one thing... 668 00:27:09,961 --> 00:27:12,030 but is it possible to humanize a machine 669 00:27:12,063 --> 00:27:15,066 to the point that it truly seems lifelike? 670 00:27:15,099 --> 00:27:18,937 Or is that still sci-fi, and far, far away? 671 00:27:25,009 --> 00:27:26,611 How did things go with Will? 672 00:27:26,645 --> 00:27:29,280 You know, one of the real challenges there 673 00:27:29,313 --> 00:27:30,881 was just getting enough material 674 00:27:30,915 --> 00:27:33,084 that we could actually come back with. 675 00:27:33,117 --> 00:27:36,754 We can't possibly capture somebody's real personality, 676 00:27:36,787 --> 00:27:38,156 you know, that's impossible, 677 00:27:38,189 --> 00:27:40,324 but in order for it to really work, 678 00:27:40,357 --> 00:27:44,195 it's really important to capture a feeling of Will. 679 00:27:44,228 --> 00:27:45,463 Right, so... 680 00:27:45,497 --> 00:27:48,733 Will's avatar is actually Mark's first go 681 00:27:48,767 --> 00:27:51,802 at creating a digital copy of a real person. 682 00:27:51,836 --> 00:27:53,838 Wow, that's looking pretty good. 683 00:27:53,871 --> 00:27:56,441 He's not just trying to clone a human, 684 00:27:56,474 --> 00:27:57,641 by any stretch, 685 00:27:57,675 --> 00:27:59,710 but trying to create an artificial stand-in 686 00:27:59,743 --> 00:28:01,746 that's somewhat believable. 687 00:28:01,779 --> 00:28:04,716 Still, like most firsts, it's bumpy, 688 00:28:04,749 --> 00:28:07,017 and it's a cautious road into the unknown. 689 00:28:07,051 --> 00:28:09,420 A big challenge that I've found 690 00:28:09,453 --> 00:28:11,288 while I've been looking through a lot of the images 691 00:28:11,322 --> 00:28:14,358 is it seems that Will was moving a lot during the shots. 692 00:28:14,391 --> 00:28:16,928 Okay. 
When we're building digital Will, 693 00:28:16,961 --> 00:28:19,130 we have about eight artists on our team 694 00:28:19,163 --> 00:28:20,064 that come together 695 00:28:20,097 --> 00:28:22,066 and pull all of the different components 696 00:28:22,100 --> 00:28:24,234 to bring together this real-time character 697 00:28:24,268 --> 00:28:26,637 that's driven by the artificial intelligence 698 00:28:26,670 --> 00:28:29,107 to behave like Will behaves. 699 00:28:29,707 --> 00:28:31,843 Big challenges we've got 700 00:28:31,876 --> 00:28:34,278 is how we create Will's personality. 701 00:28:34,312 --> 00:28:35,546 Yeah. Like, the liveliness 702 00:28:35,580 --> 00:28:37,314 and the energy that he generates, 703 00:28:37,348 --> 00:28:38,316 and the excitement. 704 00:28:39,950 --> 00:28:41,753 The facial hair was a challenge. 705 00:28:41,786 --> 00:28:44,121 Because it's so sparse, it's quite tricky to get 706 00:28:44,155 --> 00:28:46,090 the hair separated from the skin. 707 00:28:46,124 --> 00:28:48,493 We have to be able to synthesize 708 00:28:48,526 --> 00:28:51,829 the sort of feel that you're interacting with Will. 709 00:28:51,862 --> 00:28:54,599 So, Teah, I've got some stuff to hear. 710 00:28:54,632 --> 00:28:56,767 We've got 16 variations. 711 00:28:56,801 --> 00:28:58,269 - 16 variations? - Yeah. 712 00:28:58,302 --> 00:29:01,072 We take the voice data that we've got, 713 00:29:01,105 --> 00:29:03,808 and then we can enable the digital version of Will 714 00:29:03,841 --> 00:29:05,743 to say all kinds of different things. 715 00:29:05,776 --> 00:29:07,411 Here's the forecast. 716 00:29:07,444 --> 00:29:08,680 Yo, check out the forecast. 717 00:29:08,713 --> 00:29:10,014 Yo, check out the weather and shit. 718 00:29:10,048 --> 00:29:11,582 Here's the weather. Check out the weather. 719 00:29:11,615 --> 00:29:13,451 Yah, 'bout to make it rain! 720 00:29:13,484 --> 00:29:14,819 Kinda. 721 00:29:14,852 --> 00:29:16,654 That's fantastic... the words, 722 00:29:16,687 --> 00:29:18,723 the delivery, emphasis... 723 00:29:18,756 --> 00:29:21,960 Shows you just how complex people react. 724 00:29:23,561 --> 00:29:26,431 It's awesome where we are in the world of tech. 725 00:29:27,598 --> 00:29:29,367 Scary where we are, as well. 726 00:29:29,400 --> 00:29:32,570 My mind started thinking, like, "Wait a second here. 727 00:29:32,604 --> 00:29:34,405 Why am I doing this? 728 00:29:34,439 --> 00:29:36,941 What's the endgame?" 729 00:29:37,942 --> 00:29:41,846 Because, eventually, I won't be around, 730 00:29:41,880 --> 00:29:43,214 but it would. 731 00:29:43,248 --> 00:29:45,650 Will's endgame is more modest than Mark's: 732 00:29:45,683 --> 00:29:48,719 a beefed-up Instagram following, a virtual assistant, 733 00:29:48,753 --> 00:29:51,556 anything that might help him expand his creative outlets 734 00:29:51,589 --> 00:29:55,960 or free up time for more creative or philanthropic pursuits. 735 00:29:57,628 --> 00:30:00,431 Okay, so, here we go. 736 00:30:00,464 --> 00:30:02,634 That's looking really different. 737 00:30:02,667 --> 00:30:04,201 It's gonna be really interesting, 738 00:30:04,234 --> 00:30:06,104 because, you know, it's not every day 739 00:30:06,137 --> 00:30:08,439 you get confronted with your virtual self. 740 00:30:08,472 --> 00:30:09,874 Right. 741 00:30:09,907 --> 00:30:12,076 Does he feel that this is like him? 
742 00:30:12,109 --> 00:30:14,078 If it's not representative of him 743 00:30:14,112 --> 00:30:15,880 or if he doesn't think it's authentic, 744 00:30:15,914 --> 00:30:18,216 then he won't want to support it. 745 00:30:22,353 --> 00:30:24,889 - What up, Mark? -Oh, hey, how are you? 746 00:30:24,923 --> 00:30:26,758 - You can see me, right? -Yes. 747 00:30:26,791 --> 00:30:29,460 Yo, wassup? This is will.i.am. 748 00:30:30,695 --> 00:30:31,929 This is the new version of you. 749 00:30:31,962 --> 00:30:33,798 We can give him glasses there. 750 00:30:33,831 --> 00:30:35,433 That's awesome. 751 00:30:35,466 --> 00:30:38,836 I remember I had a pimple on my face that day. You captured it. 752 00:30:38,869 --> 00:30:40,771 The good thing is, it's digital, 753 00:30:40,804 --> 00:30:42,273 and we can remove it really easily. 754 00:30:42,306 --> 00:30:44,542 How come you didn't remove that? 755 00:30:44,575 --> 00:30:47,111 You can make him do a variety of things. 756 00:30:47,145 --> 00:30:49,046 Let's play "Simon Says." 757 00:30:49,079 --> 00:30:50,815 Say, "I sound like a girl." 758 00:30:50,848 --> 00:30:52,550 I sound like a girl. 759 00:30:52,584 --> 00:30:54,085 Say that with a higher pitch. 760 00:30:54,118 --> 00:30:56,054 I sound like a girl. 761 00:30:56,087 --> 00:30:58,056 Raise your eyebrows. 762 00:30:59,390 --> 00:31:00,624 Poke out your tongue. 763 00:31:02,393 --> 00:31:04,995 Tell me about growing up in Los Angeles. 764 00:31:05,029 --> 00:31:06,363 I was born and raised in Boyle Heights, 765 00:31:06,397 --> 00:31:08,933 which is west of east Los Angeles, 766 00:31:08,966 --> 00:31:10,601 which is east of Hollywood. 767 00:31:10,634 --> 00:31:12,903 Just east of downtown. 768 00:31:12,937 --> 00:31:14,739 Should it sound exactly like me? 769 00:31:14,772 --> 00:31:16,206 Nope. 770 00:31:16,240 --> 00:31:17,742 Should it sound a little bit robotic? 771 00:31:17,775 --> 00:31:20,078 Yes. It should. 772 00:31:20,611 --> 00:31:22,112 For my mom. 773 00:31:22,145 --> 00:31:24,749 My mom should not be confused. 774 00:31:24,782 --> 00:31:26,083 What's your name? 775 00:31:26,116 --> 00:31:27,618 Mi nombre es Will. 776 00:31:27,652 --> 00:31:29,254 You speak Spanish? 777 00:31:29,287 --> 00:31:30,220 I don't know. 778 00:31:31,122 --> 00:31:33,123 I know it needs some fine-tuning, 779 00:31:33,157 --> 00:31:35,392 but the way it's looking so far 780 00:31:35,426 --> 00:31:36,694 is mind-blowing. 781 00:31:36,728 --> 00:31:38,062 Thanks, Mark. 782 00:31:38,096 --> 00:31:39,930 Yeah, no worries. 783 00:31:39,964 --> 00:31:41,833 How far do you go down that path 784 00:31:41,866 --> 00:31:44,202 until you can label it a living... 785 00:31:44,235 --> 00:31:47,071 a digital living character? 786 00:31:47,104 --> 00:31:50,274 This raises some of the deepest questions 787 00:31:50,308 --> 00:31:53,311 in science and philosophy, actually, 788 00:31:53,344 --> 00:31:55,346 you know, the nature of free will. 789 00:31:55,380 --> 00:31:56,347 How do you actually 790 00:31:56,380 --> 00:31:58,449 build a character which is truly autonomous? 791 00:31:58,482 --> 00:32:01,252 Peek-a-boo! 792 00:32:02,319 --> 00:32:05,389 What is free will? What does it take to do that? 793 00:32:05,423 --> 00:32:07,024 Artificial Intelligence 794 00:32:07,058 --> 00:32:09,060 is crucial to the work we are doing, 795 00:32:09,093 --> 00:32:10,828 to inspire, to surprise, 796 00:32:10,861 --> 00:32:13,498 to push human creativity and abilities 797 00:32:13,531 --> 00:32:15,033 to uncharted domains. 
798 00:32:16,868 --> 00:32:18,069 Unbelievable. 799 00:32:22,940 --> 00:32:24,442 Free will... 800 00:32:25,476 --> 00:32:27,644 ...it's something we've been grappling with 801 00:32:27,678 --> 00:32:30,348 for thousands of years, from Aristotle to Descartes, 802 00:32:30,381 --> 00:32:33,150 and will continue to grapple with for a thousand more. 803 00:32:33,184 --> 00:32:35,786 Will we ever be able to make an A.I. 804 00:32:35,819 --> 00:32:37,488 that can think on its own? 805 00:32:37,522 --> 00:32:39,891 A second, artificial version of me 806 00:32:39,924 --> 00:32:42,060 that is truly autonomous? 807 00:32:42,093 --> 00:32:45,329 A Robert that can actually think and feel on his own, 808 00:32:45,362 --> 00:32:47,999 while this Robert here takes a nap? 809 00:32:49,901 --> 00:32:51,202 Impossible? 810 00:32:51,235 --> 00:32:52,636 Well, when you consider 811 00:32:52,670 --> 00:32:55,606 what human cooperation has already accomplished... 812 00:32:55,639 --> 00:32:57,675 a man on the moon... 813 00:32:57,708 --> 00:32:59,944 decoding the human genome... 814 00:32:59,977 --> 00:33:02,613 discovering faraway galaxies... 815 00:33:02,647 --> 00:33:05,816 I'd put my money on dreamers like Mark and Gil 816 00:33:05,850 --> 00:33:09,487 over the "Earth is flat" folks any day. 817 00:33:09,520 --> 00:33:12,523 Until then... nap time. 818 00:33:18,196 --> 00:33:19,697 Look at everything we've created. 819 00:33:21,865 --> 00:33:23,500 Artificial Intelligence is gonna be 820 00:33:23,534 --> 00:33:26,370 the technology that takes that to the next level. 821 00:33:26,404 --> 00:33:28,472 Artificial Intelligence can help us 822 00:33:28,506 --> 00:33:30,541 to feed the world's population. 823 00:33:30,574 --> 00:33:33,878 The fact that we can find where famine might happen, 824 00:33:33,911 --> 00:33:35,546 it's mind-blowing. 825 00:33:35,579 --> 00:33:37,181 These are conflict areas, 826 00:33:37,214 --> 00:33:39,750 this is an area that we need to look at protecting. 827 00:33:39,784 --> 00:33:41,285 Then launch A.I. 828 00:33:41,318 --> 00:33:44,155 We are going to release the speed limit on your car. 829 00:33:46,190 --> 00:33:47,725 Tim, can you hear me? 830 00:33:47,758 --> 00:33:49,260 With A.I., 831 00:33:49,293 --> 00:33:51,295 ideas are easy, execution is hard. 832 00:33:52,663 --> 00:33:55,766 What excites me the most about where we might be going 833 00:33:55,799 --> 00:33:57,034 is having more super-powers... 834 00:33:57,067 --> 00:33:58,402 I got him! 835 00:33:58,436 --> 00:34:00,705 ...and A.I. is super-powers for our mind. 836 00:34:00,738 --> 00:34:03,241 Even though the limb is synthetic materials, 837 00:34:03,274 --> 00:34:05,342 it moves as if it's flesh and bone. 838 00:34:07,044 --> 00:34:09,814 where you can prevent disease before it happens. 839 00:34:09,847 --> 00:34:11,415 A.I. can give us that answer 840 00:34:11,448 --> 00:34:13,084 that we've been seeking all along... 841 00:34:13,117 --> 00:34:14,585 "Are we alone?" 842 00:34:14,619 --> 00:34:15,752 Bah! 843 00:34:15,786 --> 00:34:17,688 I love the idea that there are passionate people 844 00:34:17,721 --> 00:34:19,757 dedicating their time and energy 845 00:34:19,790 --> 00:34:21,392 to making these things happen.