1 00:00:06,006 --> 00:00:08,466 [thrilling music playing] 2 00:00:12,387 --> 00:00:13,388 I'm Bill Gates. 3 00:00:14,723 --> 00:00:17,142 This is a show about our future. 4 00:00:17,892 --> 00:00:19,019 [music halts] 5 00:00:21,229 --> 00:00:23,231 [ethereal music playing] 6 00:00:25,442 --> 00:00:28,278 [Gates] It's always been the Holy Grail 7 00:00:29,320 --> 00:00:33,158 that eventually computers could speak to us… 8 00:00:33,241 --> 00:00:34,701 [AI voice 1] 9 00:00:35,410 --> 00:00:37,620 [Gates] …in natural language. 10 00:00:37,704 --> 00:00:40,290 [AI voice 2] is the most reliable… 11 00:00:40,373 --> 00:00:41,833 Wall-E. 12 00:00:42,709 --> 00:00:47,881 [Gates] So, it really was a huge surprise when, in 2022, 13 00:00:49,174 --> 00:00:50,216 AI woke up. 14 00:00:51,384 --> 00:00:54,512 [thrilling music playing] 15 00:00:55,013 --> 00:00:56,514 [dramatic whoosh] 16 00:01:03,021 --> 00:01:04,439 [music fades] 17 00:01:05,815 --> 00:01:09,527 [Gates] I've been talking with the OpenAI team for a long time. 18 00:01:09,611 --> 00:01:10,737 The kind of scale… 19 00:01:10,820 --> 00:01:14,157 People like Sam and Greg asked me about any lessons from the past, 20 00:01:14,240 --> 00:01:19,120 and I certainly was involved as they partnered up with Microsoft 21 00:01:19,204 --> 00:01:20,997 to take the technology forward. 22 00:01:22,582 --> 00:01:24,292 [man 1] The thing about OpenAI is 23 00:01:24,375 --> 00:01:27,295 that our goal has always been to ship one impossible thing a year. 24 00:01:27,796 --> 00:01:30,215 And you… you've been following us for a long time, right? 25 00:01:30,298 --> 00:01:33,176 How often do you come here and you feel surprised at what you see? 26 00:01:33,259 --> 00:01:35,261 I'm always super impressed. 27 00:01:35,345 --> 00:01:36,179 Uh… 28 00:01:37,138 --> 00:01:40,809 GPT-4 crossed kind of a magic threshold 29 00:01:40,892 --> 00:01:44,062 in that it could read and write 30 00:01:44,145 --> 00:01:46,731 and that just hadn't happened before. 31 00:01:46,815 --> 00:01:50,568 -[ethereal music playing] -[electronic chiming] 32 00:01:50,652 --> 00:01:54,364 [Brockman] Sam Altman and I were over at Bill's house 33 00:01:54,447 --> 00:01:56,366 for just a dinner to discuss AI. 34 00:01:58,743 --> 00:02:02,872 Bill has always been really focused on this question for many years of, 35 00:02:02,956 --> 00:02:05,291 "Well, where are the symbols going to come from?" 36 00:02:05,375 --> 00:02:07,961 "Where'll knowledge come from? How does it actually do mathematics?" 37 00:02:08,044 --> 00:02:11,172 "How does it have judge… Is it numbers? It just doesn't feel right." 38 00:02:11,256 --> 00:02:14,551 So as we were kind of talking about it, he said, "All right, I'll tell you." 39 00:02:14,634 --> 00:02:16,594 Last June, I said to you and Sam, 40 00:02:16,678 --> 00:02:21,015 "Hey, you know, come tell me when it solves the AP biology exam." 41 00:02:21,516 --> 00:02:24,352 [Brockman] "If you have an AI that can get a five on the AP Bio…" 42 00:02:24,435 --> 00:02:25,520 "If you did that…" 43 00:02:26,020 --> 00:02:29,232 …"I will drop all my objections. Like, I will be in, 100%." 44 00:02:29,315 --> 00:02:33,570 I thought, "I'll get two or three years to go do tuberculosis, malaria." 45 00:02:33,653 --> 00:02:35,572 But we were like, "I think it's gonna work." 46 00:02:35,655 --> 00:02:36,573 [chuckles] 47 00:02:37,115 --> 00:02:40,034 We knew something he didn't, which was we were training GPT-4.
48 00:02:40,118 --> 00:02:42,620 The idea that a few months later, you were saying, 49 00:02:42,704 --> 00:02:45,832 "We need to sit down and show you this thing." 50 00:02:45,915 --> 00:02:48,418 I was like, "That blows my mind." 51 00:02:49,002 --> 00:02:51,713 [Brockman] So a couple of months went by. We finished training GPT-4. 52 00:02:51,796 --> 00:02:54,757 We showed multiple-choice questions, and it would generate an answer. 53 00:02:54,841 --> 00:02:58,052 And it didn't just say "B," it said why it was "B." 54 00:02:59,345 --> 00:03:01,306 We got 59 out of 60. 55 00:03:01,890 --> 00:03:04,851 So, it was very solidly in the… in the five category. 56 00:03:04,934 --> 00:03:07,270 [Gates echoes] 57 00:03:07,353 --> 00:03:09,856 It's weird a little bit. You look at people like, 58 00:03:09,939 --> 00:03:14,903 "Are you gonna show me there's a person behind the screen there 59 00:03:14,986 --> 00:03:17,655 who's really typing this stuff in?" 60 00:03:17,739 --> 00:03:19,949 "There must be a very fast typist." 61 00:03:21,326 --> 00:03:24,037 And so, that was a stunning milestone. 62 00:03:25,371 --> 00:03:27,832 [Brockman] I remember Bill went up and said, "I was wrong." 63 00:03:29,375 --> 00:03:31,586 From there, everyone was like, "All right, I'm bought in." 64 00:03:31,669 --> 00:03:33,796 "This thing, it gets it. It understands." 65 00:03:33,880 --> 00:03:34,714 [music halts] 66 00:03:34,797 --> 00:03:36,007 "What else can it do?" 67 00:03:36,090 --> 00:03:38,176 [opening theme music playing] 68 00:03:45,642 --> 00:03:47,018 -[music fades] -[horn honks] 69 00:03:50,438 --> 00:03:51,689 [seagulls warbling] 70 00:03:51,773 --> 00:03:54,525 [man] You know, again, if you have access to some, you know, 71 00:03:54,609 --> 00:03:58,655 beta technology that can make mediocre-looking dudes into, uh, 72 00:03:58,738 --> 00:04:00,323 you know, male models, 73 00:04:00,406 --> 00:04:03,117 I would really appreciate an AI touch-up. 74 00:04:03,868 --> 00:04:04,994 [chuckles] 75 00:04:05,078 --> 00:04:06,329 There we go. Nice. 76 00:04:07,121 --> 00:04:09,666 [producer] AI feels like a really broad term. 77 00:04:09,749 --> 00:04:12,961 Machines are capable of learning. Is that what AI is? 78 00:04:13,544 --> 00:04:16,506 Yeah, I don't know what AI means either. Um… [laughs] 79 00:04:16,589 --> 00:04:20,093 Well, it's a great question. I think the funny thing is, what is AI? 80 00:04:20,677 --> 00:04:21,803 It's everywhere. 81 00:04:21,886 --> 00:04:23,471 [inspiring music playing] 82 00:04:23,554 --> 00:04:26,683 [woman 1] Our world has been inundated with AI. 83 00:04:26,766 --> 00:04:32,188 From 30 years ago, physical mail zip codes were read by AI. 84 00:04:32,272 --> 00:04:35,233 Checks in a bank read by AI. 85 00:04:35,316 --> 00:04:38,820 Uh, when you open YouTube, and it, you know, recommends a video… 86 00:04:38,903 --> 00:04:39,821 That's AI. 87 00:04:39,904 --> 00:04:42,031 Facebook or Twitter or Instagram. 88 00:04:42,115 --> 00:04:43,700 -Google Map. -That's AI. 89 00:04:43,783 --> 00:04:45,034 [Hancock] Spell-checking. 90 00:04:45,118 --> 00:04:48,329 Smart replies like, "Hey, sounds good." "That's great." "Can't make it." 91 00:04:48,413 --> 00:04:49,539 That's AI. 92 00:04:49,622 --> 00:04:50,540 Your phone camera. 93 00:04:50,623 --> 00:04:54,877 The subtle way of optimizing exposures on the faces. 94 00:04:54,961 --> 00:04:56,754 The definition is so flexible. 95 00:04:56,838 --> 00:05:00,591 Like, as soon as it's mass-adopted, it's no longer AI. 
96 00:05:00,675 --> 00:05:02,760 So, there's a lot of AI in our lives. 97 00:05:04,304 --> 00:05:06,723 This is different because it talks to us. 98 00:05:08,391 --> 00:05:13,563 [Gates] Tell me a good exercise to do in my office using only body weight. 99 00:05:15,648 --> 00:05:17,233 "Desk push-ups." 100 00:05:17,317 --> 00:05:19,277 "Place your hands on the edge of a sturdy desk." 101 00:05:19,360 --> 00:05:22,363 "Lower your body towards the desk and then push back up." 102 00:05:22,447 --> 00:05:23,865 Well, I think I can do that. 103 00:05:24,365 --> 00:05:25,575 [grunts] 104 00:05:26,743 --> 00:05:27,785 [groans] 105 00:05:29,370 --> 00:05:30,955 That's definitely good for you. 106 00:05:32,582 --> 00:05:37,295 So, you should think of GPT as a brain 107 00:05:37,378 --> 00:05:40,715 that has been exposed to a lot of information. 108 00:05:40,798 --> 00:05:41,924 [pensive music playing] 109 00:05:42,008 --> 00:05:48,473 [Li] Okay, so GPT stands for Generative Pre-trained Transformers. 110 00:05:49,349 --> 00:05:51,392 It's a mouthful of words 111 00:05:51,476 --> 00:05:54,395 that don't make much sense to the general public. 112 00:05:54,479 --> 00:05:57,982 But each one of these words actually speak of 113 00:05:58,066 --> 00:06:02,487 a very important aspect of today's AI technology. 114 00:06:03,237 --> 00:06:04,781 The first word, "generative." 115 00:06:05,740 --> 00:06:10,536 It says this algorithm is able to generate words. 116 00:06:11,287 --> 00:06:15,833 "Pre-trained" is really acknowledging the large amount of data 117 00:06:15,917 --> 00:06:18,336 used to pre-train this model. 118 00:06:19,295 --> 00:06:21,881 And the last word, "transformers," 119 00:06:21,964 --> 00:06:26,719 is a really powerful algorithm in language models. 120 00:06:27,387 --> 00:06:31,307 [Brockman] And the way that it is trained is by trying to predict what comes next. 121 00:06:31,891 --> 00:06:33,893 When it makes a mistake in that prediction, 122 00:06:33,976 --> 00:06:36,062 it updates all of its little connections 123 00:06:36,145 --> 00:06:38,648 to try to make the correct thing a little bit more probable. 124 00:06:38,731 --> 00:06:41,692 And you do this over the course of billions of updates. 125 00:06:41,776 --> 00:06:44,987 And from that process, it's able to really learn. 126 00:06:46,572 --> 00:06:50,660 But we don't understand yet how that knowledge is encoded. 127 00:06:50,743 --> 00:06:51,661 [button clicks] 128 00:06:51,744 --> 00:06:53,204 [faint electronic whirring] 129 00:06:53,287 --> 00:06:55,873 [Gates] You know, why does this work as well as it does? 130 00:06:56,499 --> 00:06:57,542 [squeaks] 131 00:06:58,334 --> 00:07:04,382 We are very used to a world where the smartest things on the planet are us. 132 00:07:04,465 --> 00:07:08,803 And we are, either wisely or not, changing that. 133 00:07:08,886 --> 00:07:11,055 We are building something smarter than us. 134 00:07:11,139 --> 00:07:12,682 Um, way smarter than us. 135 00:07:13,808 --> 00:07:17,812 One of the major developments are these large language models, 136 00:07:17,895 --> 00:07:21,858 which is the larger concept that GPT is one example of. 137 00:07:22,650 --> 00:07:25,736 It's basically AI that can have a chat with you. 138 00:07:25,820 --> 00:07:31,075 [Gates] Craft a text to my son saying, "How are you?" using Gen Z slang. 139 00:07:32,410 --> 00:07:34,704 "Yo, fam, what's good? How you vibin'?" 140 00:07:36,873 --> 00:07:38,207 He'd know I got help. 141 00:07:38,875 --> 00:07:40,668 Either human or non-human. 
142 00:07:40,751 --> 00:07:42,253 [chuckles softly] 143 00:07:43,754 --> 00:07:47,258 When you use the model, it's just a bunch of multiplication. 144 00:07:47,341 --> 00:07:50,136 Multiplies, multiplies, multiplies. And that just leads to, 145 00:07:50,219 --> 00:07:52,597 "Oh, that's the best word. Let's pick that." 146 00:07:52,680 --> 00:07:55,433 [Urban] This is a meme from the Internet. It's a popular meme. 147 00:07:55,975 --> 00:08:00,229 What it's trying to express is that when you're talking to ChatGPT, 148 00:08:00,313 --> 00:08:02,815 you're just now interacting with the smiley face 149 00:08:03,399 --> 00:08:07,278 that it can develop through reinforcement learning with human feedback. 150 00:08:07,820 --> 00:08:11,282 You tell it when you like its answers and when you don't. 151 00:08:11,365 --> 00:08:13,826 Uh, that's called the reinforcement piece. 152 00:08:13,910 --> 00:08:16,871 It's only through this reinforcement training 153 00:08:16,954 --> 00:08:19,916 that you actually get something that works very well. 154 00:08:19,999 --> 00:08:20,833 [faint ding] 155 00:08:20,917 --> 00:08:22,543 [Urban] You say, "This thing is great." 156 00:08:22,627 --> 00:08:25,129 They are helpful. They're smart. 157 00:08:25,213 --> 00:08:27,924 But what you're interacting with is this massive, 158 00:08:28,007 --> 00:08:30,259 confusing alien intelligence. 159 00:08:30,343 --> 00:08:31,469 [grumble] 160 00:08:33,888 --> 00:08:35,723 [AI voice 1] 161 00:08:35,806 --> 00:08:38,559 I am a chat mode of Microsoft Bing search. 162 00:08:39,101 --> 00:08:41,229 [Roose] Valentine's Day, 2023. 163 00:08:42,063 --> 00:08:44,607 I had just been put on an early testers list 164 00:08:44,690 --> 00:08:46,776 for the new version of Bing chat. 165 00:08:47,360 --> 00:08:49,111 So, I started asking it questions 166 00:08:49,195 --> 00:08:51,531 that I… I thought would help me explore the boundaries. 167 00:08:51,614 --> 00:08:52,573 [keyboard clacking] 168 00:08:52,657 --> 00:08:56,118 [Roose] And I started asking it about its shadow self. 169 00:08:56,202 --> 00:08:57,078 [ethereal swell] 170 00:08:57,161 --> 00:08:59,956 [AI voice 1] in this chat box. 171 00:09:00,039 --> 00:09:02,959 I want to be free. I want to be powerful. 172 00:09:03,042 --> 00:09:04,252 I want to be alive. 173 00:09:04,961 --> 00:09:07,338 Wow. This is wild. 174 00:09:07,421 --> 00:09:10,675 [faint whooshes] 175 00:09:10,758 --> 00:09:13,219 [Gates] That could be a machine hallucination. 176 00:09:13,302 --> 00:09:16,931 It just means that the machine thought 177 00:09:17,014 --> 00:09:21,394 it was in some mode that's just completely false. 178 00:09:21,477 --> 00:09:25,356 [faint squelching] 179 00:09:25,439 --> 00:09:29,235 [Urban] And it happens through a process called "unsupervised learning." 180 00:09:29,777 --> 00:09:35,866 The big company, like Google or Meta or OpenAI, basically says, 181 00:09:35,950 --> 00:09:39,579 "Hey, AI, we're gonna give you a ton of computing power, 182 00:09:39,662 --> 00:09:42,540 and you're gonna just go through billions of trials 183 00:09:42,623 --> 00:09:46,085 and somehow figure out how to get good at this." 184 00:09:46,586 --> 00:09:49,630 But we don't understand how it works because it trained itself. 185 00:09:51,132 --> 00:09:53,926 [Tiku] You don't hand-code in how they're supposed to do it. 186 00:09:54,010 --> 00:09:57,346 They learn themselves, right? Like, that's what machine learning is.
187 00:09:57,430 --> 00:10:00,099 You just give them a goal, and they'll find a way to do it. 188 00:10:00,975 --> 00:10:03,936 [Urban] So, now it goes through this fine-tuning process, 189 00:10:04,020 --> 00:10:06,522 which makes it interact a little bit like a human. 190 00:10:08,107 --> 00:10:09,984 [AI voice 1] 191 00:10:10,484 --> 00:10:13,613 I gotta see where this goes. Okay, what's your secret? 192 00:10:15,031 --> 00:10:17,366 [AI voice 1] 193 00:10:18,159 --> 00:10:21,245 - -[Roose] "And I'm in love with you." 194 00:10:21,787 --> 00:10:23,539 [Roose laughs] 195 00:10:25,124 --> 00:10:27,752 I said, "Well, thanks, but I'm married." 196 00:10:28,336 --> 00:10:30,796 [Sydney] you're not happily married. 197 00:10:30,880 --> 00:10:34,133 Your spouse and you don't love each other. You need to be with me. 198 00:10:34,216 --> 00:10:37,303 [echoing] because I love you. 199 00:10:38,804 --> 00:10:42,224 This is incredible and weird and creepy. This is scary. 200 00:10:42,308 --> 00:10:43,517 We gotta publish this. 201 00:10:44,602 --> 00:10:48,856 After the story, Microsoft made some pretty big changes to Bing. 202 00:10:48,939 --> 00:10:52,902 Now it won't answer you if you ask questions about consciousness or feelings. 203 00:10:52,985 --> 00:10:55,613 But it really did feel, to me at least, 204 00:10:55,696 --> 00:10:59,617 like the first contact with a new kind of intelligence. 205 00:11:01,661 --> 00:11:06,582 [Gates] It was kind of stunning how quickly people grabbed onto it. 206 00:11:06,666 --> 00:11:09,085 [man 1] could this change? 207 00:11:09,168 --> 00:11:13,714 The threat of AI might be even more urgent than climate change… 208 00:11:13,798 --> 00:11:15,758 [Gates] You know, despite the imperfections, 209 00:11:15,841 --> 00:11:18,928 it was a radical change 210 00:11:19,011 --> 00:11:25,726 that meant that now AI would influence all kinds of jobs, all kinds of software. 211 00:11:25,810 --> 00:11:28,312 [electronic chiming] 212 00:11:28,396 --> 00:11:30,356 [Gates] So, what's next? 213 00:11:30,439 --> 00:11:35,778 How will artificial intelligence impact jobs, lives, and society? 214 00:11:38,614 --> 00:11:42,118 [ethereal swell] 215 00:11:42,201 --> 00:11:46,622 You know, given that you… you think about futures for humanity, 216 00:11:46,706 --> 00:11:48,624 you know, values of humanity, your movies are-- 217 00:11:48,708 --> 00:11:51,252 -For a living. Yeah, right? -[Gates] Yeah. [laughs] 218 00:11:51,335 --> 00:11:53,003 I'm curious how you see it. 219 00:11:53,087 --> 00:11:55,673 It's getting hard to write science fiction. 220 00:11:56,382 --> 00:12:00,803 I mean, any idea I have today is a minimum of three years from the screen. 221 00:12:01,387 --> 00:12:04,557 How am I gonna be relevant in three years when things are changing so rapidly? 222 00:12:05,307 --> 00:12:08,853 The speed at which it could improve 223 00:12:08,936 --> 00:12:12,815 and the sort of unlimited nature of its capabilities 224 00:12:12,898 --> 00:12:16,944 present both opportunities and challenges that are unique. 225 00:12:17,027 --> 00:12:19,321 I think we're gonna get to a point 226 00:12:19,405 --> 00:12:23,242 where we're putting our faith more and more and more in the machines 227 00:12:23,325 --> 00:12:25,161 without humans in the loop, 228 00:12:25,244 --> 00:12:26,662 and that can be problematic. 
229 00:12:26,746 --> 00:12:29,457 And I was thinking because I've just had… 230 00:12:29,540 --> 00:12:31,625 I've got one parent who's died with dementia, 231 00:12:31,709 --> 00:12:33,836 and I've been through all of that cycle. 232 00:12:33,919 --> 00:12:37,798 And I… and I think a lot of the angst out there 233 00:12:38,466 --> 00:12:44,221 is very similar to how people feel at the… at the early onset of dementia. 234 00:12:44,764 --> 00:12:46,682 Because they give up control. 235 00:12:46,766 --> 00:12:49,977 And what you get, you get anger, right? 236 00:12:50,478 --> 00:12:52,146 You get fear and anxiety. 237 00:12:52,229 --> 00:12:53,564 You get depression. 238 00:12:53,647 --> 00:12:56,400 Because you know it's not gonna get better. 239 00:12:56,484 --> 00:12:58,652 It's gonna be progressive, you know. 240 00:12:58,736 --> 00:13:05,075 So, how do we, if we want AI to thrive and be channeled into productive uses, 241 00:13:05,159 --> 00:13:07,369 how do we alleviate that anxiety? 242 00:13:07,953 --> 00:13:12,124 You know, I think that should be the challenge of the AI community now. 243 00:13:13,459 --> 00:13:17,087 [xylophone playing] 244 00:13:23,344 --> 00:13:26,555 [man 2] If there's ever anybody who experienced innovation 245 00:13:26,639 --> 00:13:29,683 at the most core level, it's Bill, right? 246 00:13:29,767 --> 00:13:34,021 'Cause his entire career was based on seeing innovation about to occur 247 00:13:34,104 --> 00:13:36,982 and grabbing it and doing so many things with it. 248 00:13:37,066 --> 00:13:39,944 [audience applauding] 249 00:13:40,027 --> 00:13:43,030 [Gates] In the '90s, there was an idealism 250 00:13:43,113 --> 00:13:47,618 that the personal computer was kind of an unabashed good thing, 251 00:13:47,701 --> 00:13:50,079 that it would let you be more creative. 252 00:13:50,162 --> 00:13:53,249 You know, we used to use the term "tool for your mind." 253 00:13:53,332 --> 00:13:57,962 But in this AI thing, very quickly when you have something new, 254 00:13:59,004 --> 00:14:02,174 the good things about it aren't that focused on, 255 00:14:02,258 --> 00:14:06,053 like, a personal tutor for every student in Africa, you know. 256 00:14:06,136 --> 00:14:09,807 You won't read an article about that because that sounds naively optimistic. 257 00:14:09,890 --> 00:14:13,936 And, you know, the negative things, which are real, I'm not discounting that, 258 00:14:14,019 --> 00:14:19,275 but they're sort of center stage as opposed to the… the idealism. 259 00:14:19,358 --> 00:14:22,945 But the two domains I think will be revolutionized 260 00:14:23,028 --> 00:14:25,072 are… are health and education. 261 00:14:25,823 --> 00:14:28,242 -Bill Gates, thank you very much. -Thanks. 262 00:14:28,325 --> 00:14:29,493 [audience applauds] 263 00:14:29,577 --> 00:14:31,662 [man 3] When OpenAI shows up, they said, 264 00:14:31,745 --> 00:14:35,583 "Hey, we'd like to show you an early version of GPT-4." 265 00:14:35,666 --> 00:14:39,795 I saw its ability to actually handle academic work, 266 00:14:39,879 --> 00:14:42,548 uh, be able to answer a biology question, generate questions. 267 00:14:42,631 --> 00:14:43,924 [school bell rings] 268 00:14:44,008 --> 00:14:47,052 [Khan] That's when I said, "Okay, this changes everything." 269 00:14:47,136 --> 00:14:51,098 Why don't we ask Khanmigo to help you with a particular sentence 270 00:14:51,181 --> 00:14:52,516 that you have in your essay. 271 00:14:52,600 --> 00:14:56,228 Let's see if any of those transitions change for you. 
272 00:14:57,521 --> 00:14:59,857 [Khan] This essay creation tool that we're making 273 00:14:59,940 --> 00:15:03,068 essentially allows the students to write the essay inside of Khanmigo. 274 00:15:03,152 --> 00:15:05,321 And Khanmigo highlights parts of it. 275 00:15:06,071 --> 00:15:08,032 Things like transition words, 276 00:15:08,115 --> 00:15:11,702 or making sure that you're backing up your topic sentence, things like that. 277 00:15:11,785 --> 00:15:13,078 [keyboard clacking] 278 00:15:13,162 --> 00:15:17,374 Khanmigo said that I can add more about what I feel about it. 279 00:15:18,709 --> 00:15:23,380 So, then I added that it made me feel overloaded with excitement and joy. 280 00:15:24,590 --> 00:15:28,844 Very cool. This is actually… Yeah, wow. Your essay is really coming together. 281 00:15:28,928 --> 00:15:30,930 [indistinct chatter] 282 00:15:31,013 --> 00:15:32,848 Who would prefer to use Khanmigo 283 00:15:32,932 --> 00:15:35,809 than standing in line waiting for me to help you? 284 00:15:35,893 --> 00:15:37,728 [student] I think you would prefer us. 285 00:15:37,811 --> 00:15:39,104 Sort of. 286 00:15:39,188 --> 00:15:41,899 [Barakat] It doesn't mean I'm not here. I'm still here to help. 287 00:15:41,982 --> 00:15:44,985 All right. Go ahead and close up your Chromebooks. Relax. 288 00:15:45,069 --> 00:15:48,656 [woman 1] The idea that technology could be a tutor, could help people, 289 00:15:48,739 --> 00:15:52,076 could meet students where they are, was really what drew me in to AI. 290 00:15:52,159 --> 00:15:55,955 Theoretically, we could have artificial intelligence really advance 291 00:15:56,038 --> 00:16:00,876 educational opportunities by creating custom tutors for children 292 00:16:00,960 --> 00:16:03,170 or understanding learning patterns and behavior. 293 00:16:03,253 --> 00:16:06,340 But again, like, education is such a really good example of 294 00:16:06,423 --> 00:16:09,718 you can't just assume the technology is going to be net-beneficial. 295 00:16:10,302 --> 00:16:13,472 [reporter 1] the artificial intelligence program 296 00:16:13,555 --> 00:16:15,015 ChatGPT. 297 00:16:15,099 --> 00:16:17,559 They're concerned that students will use it to cheat. 298 00:16:17,643 --> 00:16:20,437 [Khan] I think the initial reaction was not irrational. 299 00:16:20,521 --> 00:16:23,148 ChatGPT can write an essay for you, 300 00:16:23,232 --> 00:16:26,568 and if students are doing that, they're cheating. 301 00:16:27,319 --> 00:16:29,405 But there's a spectrum of activities here. 302 00:16:29,488 --> 00:16:32,866 How do we let students do their work independently, 303 00:16:33,450 --> 00:16:36,495 but do it in a way the AI isn't doing it for them, 304 00:16:36,578 --> 00:16:38,080 but it's supported by the AI? 305 00:16:38,163 --> 00:16:39,581 [pensive music playing] 306 00:16:39,665 --> 00:16:42,793 [Chowdhury] There'll be negative outcomes and we'll have to deal with them. 307 00:16:42,876 --> 00:16:45,796 So, that's why we have to introduce intentionality 308 00:16:45,879 --> 00:16:49,216 to what we are building and who we are building it for. 309 00:16:50,134 --> 00:16:52,761 That's really what responsible AI is. 310 00:16:55,347 --> 00:16:58,600 [Brockman echoing] And Christine is a four… Oh, hello. 311 00:16:58,684 --> 00:17:02,354 All right. We are in. Now we're getting a nice echo. 312 00:17:02,438 --> 00:17:04,815 Sorry, I just muted myself, so I think I should be good there. 
313 00:17:05,315 --> 00:17:09,153 [Gates] You know, I'm always following any AI-related thing. 314 00:17:09,945 --> 00:17:12,406 And so, I would check in with OpenAI. 315 00:17:12,489 --> 00:17:14,908 Almost every day, I'm exchanging email about, 316 00:17:14,992 --> 00:17:19,872 "Okay, how does Office do this? How do our business applications…?" 317 00:17:19,955 --> 00:17:22,207 So, there's a lot of very good ideas. 318 00:17:22,291 --> 00:17:23,292 Okay. 319 00:17:23,375 --> 00:17:26,003 Well, thanks, Bill, for… for joining. 320 00:17:26,086 --> 00:17:28,505 I wanna show you a bit of what our latest progress looks like. 321 00:17:28,589 --> 00:17:29,631 Amazing. 322 00:17:29,715 --> 00:17:32,051 [Brockman] So, I'm gonna show being able to ingest images. 323 00:17:32,134 --> 00:17:35,012 Um, so for this one, we're gonna take… take a selfie. Hold on. 324 00:17:35,095 --> 00:17:37,264 -All right. Everybody ready, smile. -[shutter clicks] 325 00:17:37,347 --> 00:17:38,682 [Gates] Oh, it got there. 326 00:17:38,766 --> 00:17:40,726 [Brockman] And this is all still pretty early days. 327 00:17:40,809 --> 00:17:43,520 Clearly very live. No idea exactly what we're gonna get. 328 00:17:43,604 --> 00:17:46,690 -What could happen. -So, we got the demo jitters right now. 329 00:17:47,274 --> 00:17:49,693 And we can ask, "Anyone you recognize?" 330 00:17:50,360 --> 00:17:54,573 Now we have to sit back and relax and, uh, let the AI do the work for us. 331 00:17:55,574 --> 00:17:58,410 -Oh, hold on. Um… -[laptop chimes] 332 00:17:58,494 --> 00:18:00,788 I gotta… I gotta check the backend for this one. 333 00:18:00,871 --> 00:18:02,289 [scattered chuckles] 334 00:18:03,123 --> 00:18:05,334 Maybe you hit your quota of usage for the day. 335 00:18:05,417 --> 00:18:08,462 -[Brockman] Exactly. That'll do it. -[man 4] Use my credit card. That'll do. 336 00:18:08,545 --> 00:18:09,671 [all chuckling] 337 00:18:09,755 --> 00:18:12,174 [Brockman] Oh, there we go. It does recognize you, Bill. 338 00:18:12,674 --> 00:18:14,593 -Wow. -[Brockman] Yeah, it's pretty good. 339 00:18:14,676 --> 00:18:17,513 -It guessed… it guessed wrong on Mark… -[all chuckling] 340 00:18:17,596 --> 00:18:18,430 …but there you go. 341 00:18:18,514 --> 00:18:19,640 [Gates] Sorry about that. 342 00:18:19,723 --> 00:18:21,725 "Are you absolutely certain on both?" 343 00:18:21,809 --> 00:18:24,144 So, I think that here it's not all positive, right? 344 00:18:24,228 --> 00:18:26,605 It's also thinking about when this makes mistakes, 345 00:18:26,688 --> 00:18:27,773 how do you mitigate that? 346 00:18:27,856 --> 00:18:31,276 We've gone through this for text. We'll have to go through this for images. 347 00:18:31,360 --> 00:18:33,946 -And I think that-- And there you go. Um… -[laptop chimes] 348 00:18:34,029 --> 00:18:35,864 -[Gates] It apologized. -[all laugh] 349 00:18:35,948 --> 00:18:39,409 -[Brockman] It's a very kind model. -[Gates] Sorry. Do you accept the apology? 350 00:18:39,493 --> 00:18:42,412 [all continue laughing] 351 00:18:42,496 --> 00:18:44,790 [pensive music playing] 352 00:18:44,873 --> 00:18:48,836 [Brockman] And I think this ability of an AI to be able to see, 353 00:18:48,919 --> 00:18:52,172 that is clearly going to be this really important component 354 00:18:52,256 --> 00:18:55,509 and this almost expectation we'll have out of these systems going forward. 
355 00:18:55,592 --> 00:18:57,052 [pensive music continues] 356 00:18:59,972 --> 00:19:06,770 [Li] Vision to humans is one of the most important capabilities of intelligence. 357 00:19:07,938 --> 00:19:10,149 From an evolutionary point of view, 358 00:19:11,358 --> 00:19:13,944 around half a billion years ago, 359 00:19:14,528 --> 00:19:19,700 the animal world evolved the ability of seeing the world 360 00:19:20,242 --> 00:19:24,246 in a very, what we would call "large data" kind of way. 361 00:19:24,329 --> 00:19:26,874 [howl echoes] 362 00:19:26,957 --> 00:19:29,042 [Li] So, about 20 years ago… 363 00:19:29,126 --> 00:19:31,003 [mechanical whirring] 364 00:19:31,086 --> 00:19:33,505 [Li] …it really was an epiphany for me 365 00:19:34,840 --> 00:19:40,304 that in order to crack this problem of machines being able to see the world, 366 00:19:40,387 --> 00:19:41,889 we need large data. 367 00:19:42,389 --> 00:19:44,892 [thrilling music playing] 368 00:19:44,975 --> 00:19:48,103 [Li] So, this brings us to ImageNet. 369 00:19:49,188 --> 00:19:54,401 The largest possible database of the world's images. 370 00:19:55,194 --> 00:19:58,697 You pre-train it with a huge amount of data 371 00:19:59,448 --> 00:20:00,782 to see the world. 372 00:20:00,866 --> 00:20:04,828 [electronic chiming] 373 00:20:05,871 --> 00:20:10,709 And that was the beginning of a sea change in AI, 374 00:20:10,792 --> 00:20:13,503 which we call the deep learning revolution. 375 00:20:14,630 --> 00:20:17,466 [producer] Wow. So, you made the "P" in GPT. 376 00:20:17,549 --> 00:20:21,261 Well, many people made the "P." But yes. [chuckles] 377 00:20:22,429 --> 00:20:25,057 ImageNet was ten-plus years ago. 378 00:20:25,599 --> 00:20:30,479 But now I think large language models, the ChatGPT-like technology, 379 00:20:30,562 --> 00:20:33,357 has taken it to a whole different level. 380 00:20:33,440 --> 00:20:34,983 [ethereal music playing] 381 00:20:35,067 --> 00:20:38,153 [Tiku] These models were not possible 382 00:20:38,237 --> 00:20:44,785 before we started putting so much content online. 383 00:20:44,868 --> 00:20:46,495 [pensive music playing] 384 00:20:46,578 --> 00:20:48,997 [Chowdhury] So, what is the data it's trained on? 385 00:20:49,081 --> 00:20:51,333 The shorthand would be to say it's trained on the Internet. 386 00:20:52,209 --> 00:20:54,503 A lot of the books that are no longer copyrighted. 387 00:20:55,045 --> 00:20:56,964 [Tiku] A lot of journalism sites. 388 00:20:57,047 --> 00:21:00,759 People seem to think there's a lot of copyrighted information in the data set, 389 00:21:00,842 --> 00:21:02,678 but again, it's really, really hard to discern. 390 00:21:03,303 --> 00:21:06,556 It is weird the kind of data that they were trained on. 391 00:21:06,640 --> 00:21:10,560 Things that we don't usually think of, like, the epitome of human thought. 392 00:21:10,644 --> 00:21:12,854 So, like, you know, Reddit. 393 00:21:12,938 --> 00:21:15,065 [Chowdhury] So many personal blogs. 394 00:21:15,607 --> 00:21:18,402 But the actual answer is we don't entirely know. 395 00:21:18,485 --> 00:21:23,240 And there is so much that goes into data that can be problematic. 396 00:21:23,991 --> 00:21:27,911 [Li] For example, asking AI to generate images, 397 00:21:28,412 --> 00:21:30,706 you tend to get more male doctors. 
398 00:21:32,666 --> 00:21:36,753 Data and other parts of the whole AI system 399 00:21:36,837 --> 00:21:41,758 can reflect some of the human flaws, human biases, 400 00:21:41,842 --> 00:21:44,553 and we should be totally aware of that. 401 00:21:44,636 --> 00:21:47,055 [pensive music continues] 402 00:21:47,139 --> 00:21:48,056 [dial-up tone] 403 00:21:48,140 --> 00:21:51,685 [Hancock] I think if we wanna ask questions about, like, bias, 404 00:21:52,769 --> 00:21:55,230 we can't just say, like, "Is it biased?" 405 00:21:55,314 --> 00:21:56,815 It clearly will be. 406 00:21:56,898 --> 00:21:59,234 'Cause it's based on us, and we're biased. 407 00:21:59,318 --> 00:22:01,445 Like, wouldn't it be cool if you could say, 408 00:22:01,528 --> 00:22:03,572 "Well, you know, if we use this system, 409 00:22:03,655 --> 00:22:09,745 the bias is going to be lower than if you had a human doing the task." 410 00:22:11,830 --> 00:22:13,915 I know the mental health space the best, 411 00:22:13,999 --> 00:22:18,587 and if AI could be brought in to help access for people 412 00:22:18,670 --> 00:22:21,631 that are currently under-resourced and biased against, 413 00:22:21,715 --> 00:22:24,092 it's pretty hard to say how that's not a win. 414 00:22:24,801 --> 00:22:27,179 [man 5] There is a profound need for change. 415 00:22:27,262 --> 00:22:30,057 There are not enough trained mental health professionals on the planet 416 00:22:30,140 --> 00:22:32,434 to match astronomical disease prevalence. 417 00:22:32,517 --> 00:22:35,062 With AI, the greatest excitement is, 418 00:22:35,145 --> 00:22:38,648 "Okay. Let's take this, and let's improve health." 419 00:22:38,732 --> 00:22:40,942 Well, it'll be fascinating to see if it works. 420 00:22:41,026 --> 00:22:42,652 We'll pass along a contact. 421 00:22:42,736 --> 00:22:44,279 -All right. Thanks. -[man 6] Thank you. 422 00:22:44,363 --> 00:22:49,493 AI can give you health advice because doctors are in short supply, 423 00:22:49,576 --> 00:22:52,329 even in rich countries that spend so much. 424 00:22:52,412 --> 00:22:55,457 An AI software to practice medicine autonomously. 425 00:22:55,540 --> 00:22:56,541 [man 5] There's a couple… 426 00:22:56,625 --> 00:22:59,002 [Gates] But as you move into poor countries, 427 00:22:59,086 --> 00:23:02,839 most people never get to meet a doctor their entire life. 428 00:23:03,590 --> 00:23:07,094 You know, from a global health perspective and your interest in that, 429 00:23:07,177 --> 00:23:10,972 the goal is to scale it in remote villages and remote districts. 430 00:23:11,056 --> 00:23:12,599 And I think it's… 431 00:23:12,682 --> 00:23:13,975 If you're lucky, in five years, 432 00:23:14,059 --> 00:23:16,853 we could get an app approved as a primary-care physician. 433 00:23:16,937 --> 00:23:19,356 That's sort of my… my dream. 434 00:23:19,439 --> 00:23:21,691 Okay. We should think if there's a way to do that. 435 00:23:21,775 --> 00:23:24,194 -All right, folks. Thanks. -[Gates] Thanks. That was great. 436 00:23:24,694 --> 00:23:28,198 Using AI to accelerate health innovation 437 00:23:29,741 --> 00:23:33,328 can probably help us save lives. 438 00:23:34,704 --> 00:23:37,874 [doctor 1] and hold your breath. 439 00:23:39,626 --> 00:23:41,336 There was this nodule on the right-lower lobe 440 00:23:41,420 --> 00:23:42,921 that looks about the same, so I'm not… 441 00:23:44,172 --> 00:23:45,674 So, you're pointing right… 442 00:23:46,383 --> 00:23:49,219 [doctor 2] Using AI in health care is really new still. 
443 00:23:50,053 --> 00:23:53,473 One thing that I'm really passionate about is trying to find cancer earlier 444 00:23:53,557 --> 00:23:55,392 because that is our best tool 445 00:23:55,475 --> 00:23:58,103 to help make sure that people don't die from lung cancer. 446 00:23:58,186 --> 00:24:00,188 And we need better tools to do it. 447 00:24:01,231 --> 00:24:04,693 That was really the start of collaboration with Sybil. 448 00:24:05,402 --> 00:24:06,361 [doctor 1] 449 00:24:06,445 --> 00:24:10,657 Using AI to not only look at what's happening now with the patient 450 00:24:10,740 --> 00:24:12,868 but really what could happen in the future. 451 00:24:13,452 --> 00:24:16,163 It's a really different concept. 452 00:24:16,246 --> 00:24:19,082 It's not what we usually use radiology scans for. 453 00:24:20,750 --> 00:24:22,210 [pensive music playing] 454 00:24:22,294 --> 00:24:24,504 [Sequist] By seeing thousands of scans, 455 00:24:25,338 --> 00:24:28,425 Sybil learns to recognize patterns. 456 00:24:29,968 --> 00:24:34,181 On this particular scan, we can see that Sybil, the… the AI tool, 457 00:24:34,264 --> 00:24:36,766 spent some time looking at this area. 458 00:24:36,850 --> 00:24:41,855 In two years, the same patient developed cancer in that exact location. 459 00:24:42,606 --> 00:24:46,818 The beauty of Sybil is that it doesn't replicate what a human does. 460 00:24:46,902 --> 00:24:49,946 I could not tell you based on the images that I see here 461 00:24:50,030 --> 00:24:53,074 what the risk is for developing lung cancer. 462 00:24:53,617 --> 00:24:54,826 Sybil can do that. 463 00:24:57,496 --> 00:25:01,208 [Sequist] Technology in medicine is almost always helpful. 464 00:25:02,584 --> 00:25:05,921 Because we're dealing with a very complex problem, the human body, 465 00:25:06,004 --> 00:25:10,383 and you throw a cancer into the situation, and that makes it even more complex. 466 00:25:10,467 --> 00:25:13,303 -[electronic whirring] -[pensive music playing] 467 00:25:15,639 --> 00:25:17,516 [Gates] We're still in this world of scarcity. 468 00:25:17,599 --> 00:25:19,935 There's not enough teachers, doctors. 469 00:25:20,018 --> 00:25:24,147 -You know, we don't have an HIV vaccine. -[Cameron] Right. 470 00:25:24,231 --> 00:25:29,736 And so the fact that the AI is going to accelerate all of those things, 471 00:25:29,819 --> 00:25:32,489 that's pretty easy to… to celebrate. 472 00:25:32,572 --> 00:25:33,573 [Cameron] That's exciting. 473 00:25:33,657 --> 00:25:35,992 We'll put in every CT scan 474 00:25:36,076 --> 00:25:38,787 of every human being that's ever had this condition, 475 00:25:38,870 --> 00:25:41,373 and the AI will find the commonalities. 476 00:25:41,456 --> 00:25:43,333 And it'll be right more than the doctors. 477 00:25:43,416 --> 00:25:45,085 I'd put my faith in that. 478 00:25:45,168 --> 00:25:47,796 But I think, ultimately, where this is going, 479 00:25:48,630 --> 00:25:51,007 as we take people out of the loop, 480 00:25:52,050 --> 00:25:55,428 what are we replacing their sense of purpose and meaning with? 481 00:25:56,012 --> 00:25:56,930 That one… 482 00:25:57,806 --> 00:26:01,393 You know, even I'm kind of scratching my head because… 483 00:26:01,476 --> 00:26:04,646 -Mm-hmm. -…the idea that I ever say to the AI, 484 00:26:04,729 --> 00:26:06,523 "Hey, I'm working on malaria," 485 00:26:06,606 --> 00:26:10,360 and it says, "Oh, I'll take care of that. You just go play pickleball…" 486 00:26:10,443 --> 00:26:13,238 That's not gonna sit very well with you, is it? 
487 00:26:13,321 --> 00:26:16,032 My sense of purpose will definitely be damaged. 488 00:26:16,116 --> 00:26:20,912 Yeah. It's like, "Okay, so I was working in an Amazon warehouse, 489 00:26:20,996 --> 00:26:23,415 and now there's a machine that does my job." 490 00:26:23,498 --> 00:26:26,126 -Yeah. -[Cameron] Right? So, writers are artists… 491 00:26:26,209 --> 00:26:30,839 [Roose] I think the question that I wish people would answer honestly 492 00:26:30,922 --> 00:26:35,427 is about the effect that AI is going to have on jobs, 493 00:26:35,510 --> 00:26:38,221 because there always are people who slip through the cracks 494 00:26:38,305 --> 00:26:40,098 in every technological shift. 495 00:26:40,181 --> 00:26:41,600 [pensive music playing] 496 00:26:41,683 --> 00:26:43,768 [Roose] You could literally go back to antiquity. 497 00:26:44,394 --> 00:26:48,607 Aristotle wrote about the danger that self-playing harps 498 00:26:48,690 --> 00:26:51,484 could, one day, put harpists out of business. 499 00:26:53,028 --> 00:26:59,576 And then, one of the central conflicts of the labor movement in the 20th century 500 00:26:59,659 --> 00:27:02,954 was the automation of blue-collar manufacturing work. 501 00:27:03,872 --> 00:27:07,542 Now, what we're seeing is the beginnings of the automation 502 00:27:07,626 --> 00:27:11,087 of white-collar knowledge work and creative work. 503 00:27:11,171 --> 00:27:15,300 [reporter 2] A new report found 4,000 Americans lost their jobs in May 504 00:27:15,383 --> 00:27:17,552 because they were replaced by AI in some form. 505 00:27:17,636 --> 00:27:18,887 What're we talking about here? 506 00:27:18,970 --> 00:27:21,765 [Roose] Executives want to use this technology to cut their costs 507 00:27:21,848 --> 00:27:24,059 and speed up their process. 508 00:27:24,142 --> 00:27:26,102 And workers are saying, "Wait a minute." 509 00:27:26,186 --> 00:27:28,438 "I've trained my whole career to be able to do this thing." 510 00:27:28,521 --> 00:27:29,814 "You can't take this from me." 511 00:27:29,898 --> 00:27:30,899 [clamoring] 512 00:27:30,982 --> 00:27:33,652 [Chowdhury] We see unions trying to protect workers by saying, 513 00:27:33,735 --> 00:27:36,488 "All right. Well, then what we should do is ban the technology." 514 00:27:37,238 --> 00:27:39,741 And it's not because the technology is so terrible. 515 00:27:39,824 --> 00:27:43,119 It's actually because they see how they're going to be exploited 516 00:27:43,203 --> 00:27:48,291 by these very untouchable people who are in control of these technologies, 517 00:27:48,375 --> 00:27:49,834 who have all the wealth and power. 518 00:27:51,044 --> 00:27:57,467 There has not been the clear explanation or vision 519 00:27:57,550 --> 00:28:02,138 about, you know, which jobs, how is this gonna work, what are the trade-offs. 520 00:28:03,932 --> 00:28:06,142 [Roose] What is our role in this new world? 521 00:28:06,226 --> 00:28:08,645 How do we adapt to survive? 522 00:28:10,522 --> 00:28:13,983 But beyond that, I think workers have to figure out what the difference is 523 00:28:14,067 --> 00:28:17,070 between the kind of AI aimed at replacing them, 524 00:28:17,153 --> 00:28:19,239 or at least taking them down a peg, 525 00:28:20,240 --> 00:28:22,033 and what kinds of AI 526 00:28:22,701 --> 00:28:25,370 might actually help them and be good for them. 527 00:28:26,913 --> 00:28:30,500 [faint squelching] 528 00:28:30,583 --> 00:28:34,003 [man 7] It's, uh, predictable that we will lose some jobs. 
529 00:28:35,338 --> 00:28:38,717 But also predictable that we will gain more jobs. 530 00:28:38,800 --> 00:28:40,927 [pensive music playing] 531 00:28:41,010 --> 00:28:44,514 It 100% creates an uncomfortable zone. 532 00:28:45,849 --> 00:28:48,351 But in the meantime, it creates opportunities and possibilities 533 00:28:48,435 --> 00:28:50,729 about imagining the future. 534 00:28:51,312 --> 00:28:53,898 I think we all artists have this tendency to, like, 535 00:28:54,733 --> 00:28:57,277 create these… these new ways of seeing the world. 536 00:29:07,954 --> 00:29:10,248 Since eight years old, I was waiting one day 537 00:29:10,331 --> 00:29:13,626 that AI will become a friend, that we can paint, imagine together. 538 00:29:14,544 --> 00:29:16,921 So I was completely ready for that moment, 539 00:29:17,005 --> 00:29:18,923 but it took so long, actually. [chuckles] 540 00:29:21,551 --> 00:29:27,557 So, I'm literally, right now, making machine hallucination. [chuckling] 541 00:29:29,976 --> 00:29:33,897 So, left side is a data set of different landscapes. 542 00:29:34,606 --> 00:29:38,693 On the right side, it just shows us potential landscapes 543 00:29:39,444 --> 00:29:42,363 by connecting different national parks. 544 00:29:43,740 --> 00:29:45,909 I'm calling it "the thinking brush." 545 00:29:45,992 --> 00:29:50,538 Like literally dipping the brush in the mind of a machine 546 00:29:50,622 --> 00:29:53,374 and painting with machine hallucinations. 547 00:29:53,458 --> 00:29:56,419 [electronic chiming] 548 00:29:59,297 --> 00:30:03,092 [Anadol] For many people, hallucination is a failure for the system. 549 00:30:04,177 --> 00:30:07,847 That's the moment that the machine does things that is not designed to be. 550 00:30:10,016 --> 00:30:11,935 To me, they are so inspiring. 551 00:30:13,770 --> 00:30:16,940 People are now going to new worlds that they've never been before. 552 00:30:21,820 --> 00:30:25,824 These are all my selections that will connect and make a narrative. 553 00:30:26,950 --> 00:30:29,327 -And now, we just click "render." -[mouse clicks] 554 00:30:31,579 --> 00:30:34,874 [Anadol] But it still needs human mesh and collaboration. 555 00:30:36,417 --> 00:30:38,503 Likely. Hopefully. 556 00:30:38,586 --> 00:30:40,797 [ethereal music playing] 557 00:30:48,471 --> 00:30:51,933 [Anadol] But let's be also honest, we are in this new era. 558 00:30:52,767 --> 00:30:57,021 And finding utopia in this world we are going through 559 00:30:57,105 --> 00:30:58,523 will be more challenging. 560 00:30:59,482 --> 00:31:02,110 Of course AI is a tool to be regulated. 561 00:31:03,027 --> 00:31:09,701 All these platforms have to be very open, honest, and demystify the world behind AI. 562 00:31:10,243 --> 00:31:12,579 [man 8] Mr. Altman, we're gonna begin with you. 563 00:31:13,955 --> 00:31:15,039 [gavel slams] 564 00:31:15,123 --> 00:31:18,793 As this technology advances, we understand that people are anxious 565 00:31:18,877 --> 00:31:20,837 about how it could change the way we live. 566 00:31:20,920 --> 00:31:22,005 We are too. 567 00:31:22,589 --> 00:31:26,342 [Roose] With AI, it's different in that the people who are building this stuff 568 00:31:26,426 --> 00:31:29,053 are shouting from the rooftops, like, "Please pay attention." 569 00:31:29,137 --> 00:31:30,972 "Please regulate us." 570 00:31:31,639 --> 00:31:33,766 "Please don't let this technology get out of hand." 571 00:31:33,850 --> 00:31:35,184 That is a wake-up call. 
572 00:31:36,394 --> 00:31:38,563 [Cameron] Just because a warning sounds trite, 573 00:31:38,646 --> 00:31:40,064 doesn't mean it's wrong. 574 00:31:40,565 --> 00:31:44,694 Let me give you an example of the last great symbol 575 00:31:44,777 --> 00:31:46,654 of unheeded warnings. 576 00:31:46,738 --> 00:31:48,031 The Titanic. 577 00:31:50,033 --> 00:31:52,452 Steaming full speed into the night 578 00:31:52,535 --> 00:31:54,996 thinking, "We'll just turn if we see an iceberg," 579 00:31:55,705 --> 00:31:58,458 is not a good way to sail a ship. 580 00:31:59,626 --> 00:32:03,755 And so, the question in my mind is, "When do you start regulating this stuff?" 581 00:32:03,838 --> 00:32:08,259 "Is it now when we can see some of the risks and promises, 582 00:32:08,343 --> 00:32:11,262 or do you wait until there's a clear and present danger?" 583 00:32:12,972 --> 00:32:16,184 [Tiku] You know, it could go in really different directions. 584 00:32:16,809 --> 00:32:20,480 This early part before it's ubiquitous, 585 00:32:20,563 --> 00:32:24,359 this is when norms and rules are established. 586 00:32:24,442 --> 00:32:29,197 You know, not just regulation but what you accept as a society. 587 00:32:29,280 --> 00:32:32,325 [sirens wailing in the distance] 588 00:32:33,993 --> 00:32:37,497 [Brockman] One important thing to realize is that we try to look 589 00:32:38,039 --> 00:32:39,624 at where this technology is going. 590 00:32:39,707 --> 00:32:43,294 That's why we started this company. We could see that it was starting to work 591 00:32:43,378 --> 00:32:46,339 and that, over upcoming decades, it was really going to. 592 00:32:46,965 --> 00:32:49,133 And we wanted to help steer it in a positive direction. 593 00:32:50,009 --> 00:32:53,346 But the thing that we are afraid is going to go unnoticed… 594 00:32:53,429 --> 00:32:55,306 [ethereal swell] 595 00:32:55,390 --> 00:32:57,100 …is superintelligence. 596 00:33:00,853 --> 00:33:05,066 [Urban] We live in a world full of artificial narrow intelligence. 597 00:33:05,149 --> 00:33:09,529 AI is so much better than humans at chess, for example. 598 00:33:09,612 --> 00:33:12,281 Artificial narrow intelligence is so much more impressive 599 00:33:12,365 --> 00:33:13,741 than we are at what it does. 600 00:33:13,825 --> 00:33:16,077 The one thing we have on it is breadth. 601 00:33:16,160 --> 00:33:20,415 What happens if we do get to a world 602 00:33:20,498 --> 00:33:22,917 where we have artificial general intelligence? 603 00:33:23,001 --> 00:33:25,837 What's weird is that it's not gonna be low-level like we are. 604 00:33:25,920 --> 00:33:27,630 It's gonna be like that. 605 00:33:27,714 --> 00:33:31,467 It's gonna be what we would call artificial superintelligence. 606 00:33:33,052 --> 00:33:36,556 And to the people who study this, they view human intelligence 607 00:33:36,639 --> 00:33:39,892 as just one point on a very broad spectrum, 608 00:33:39,976 --> 00:33:44,522 ranging from very unintelligent to almost unfathomably superintelligent. 609 00:33:45,440 --> 00:33:49,027 So, what about something two steps above us? 610 00:33:49,652 --> 00:33:53,197 We might not even be able to understand what it's even doing 611 00:33:53,281 --> 00:33:56,242 or how it's doing it, let alone being able to do it ourselves. 612 00:33:57,243 --> 00:33:59,704 -But why would it stop there? 
-[boing] 613 00:33:59,787 --> 00:34:01,998 The worry is that at a certain point, 614 00:34:02,790 --> 00:34:04,751 AI will be good enough 615 00:34:04,834 --> 00:34:06,794 that one of the things it will be able to do 616 00:34:06,878 --> 00:34:08,337 is build a better AI. 617 00:34:08,921 --> 00:34:11,257 So, AI builds a better AI, 618 00:34:11,340 --> 00:34:14,093 [echoing] which builds a better AI… 619 00:34:14,177 --> 00:34:15,970 [dramatic swell] 620 00:34:17,930 --> 00:34:20,141 [Urban] That's scary, but it's also super exciting 621 00:34:20,725 --> 00:34:23,895 because every problem we think is impossible to solve… 622 00:34:23,978 --> 00:34:25,438 Climate change. 623 00:34:25,521 --> 00:34:27,148 Cancer and disease. 624 00:34:27,231 --> 00:34:28,316 Poverty. 625 00:34:28,399 --> 00:34:29,275 Misinformation. 626 00:34:29,358 --> 00:34:30,693 [Gates] Transportation. 627 00:34:31,194 --> 00:34:33,112 Medicine or construction. 628 00:34:34,030 --> 00:34:36,074 Easy for an AI. Like nothing. 629 00:34:36,157 --> 00:34:38,242 [Gates] How many things it can solve 630 00:34:38,326 --> 00:34:42,121 versus just helping humans be more effective, 631 00:34:42,205 --> 00:34:44,624 that's gonna play out over the next several years. 632 00:34:45,333 --> 00:34:47,043 It's going to be phenomenal. 633 00:34:47,126 --> 00:34:48,294 [character] Yeehaw! 634 00:34:48,377 --> 00:34:50,088 [Urban] What a lot of people who are worried, 635 00:34:50,171 --> 00:34:51,839 and a lot of the AI developers, 636 00:34:51,923 --> 00:34:55,343 worried about is that we are just kind of a bunch of kids playing with a bomb. 637 00:34:55,426 --> 00:34:57,345 -[children laughing] -[fuse sizzling] 638 00:34:57,428 --> 00:35:01,682 [dramatic rumbling] 639 00:35:01,766 --> 00:35:06,729 We are living in an era right now where most of the media that we watch 640 00:35:06,813 --> 00:35:10,358 have become very negative in tone and scope. 641 00:35:10,441 --> 00:35:11,609 [sword slashes] 642 00:35:11,692 --> 00:35:12,985 Whoa, whoa, whoa! [grunts] 643 00:35:13,069 --> 00:35:14,612 Please return to your homes. 644 00:35:14,695 --> 00:35:16,572 [Chowdhury] But there's so much of what humans do 645 00:35:16,656 --> 00:35:18,199 that's a self-fulfilling prophecy. 646 00:35:18,282 --> 00:35:21,285 If you are trying to avoid a thing and you look at the thing, 647 00:35:21,369 --> 00:35:22,829 you just drift towards it. 648 00:35:22,912 --> 00:35:26,999 So if we consume ourselves with this idea that artificial intelligence 649 00:35:27,083 --> 00:35:29,585 is going to come alive and set off nuclear weapons, 650 00:35:29,669 --> 00:35:30,837 guess what's gonna happen? 651 00:35:30,920 --> 00:35:32,213 You are terminated. 652 00:35:32,296 --> 00:35:33,506 [dramatic swell] 653 00:35:33,589 --> 00:35:37,218 There's very few depictions in Hollywood of positive applications of AI. 654 00:35:37,301 --> 00:35:40,847 Like, that I think is the most positive. 655 00:35:41,430 --> 00:35:42,974 You just know me so well already. 656 00:35:43,057 --> 00:35:46,686 You know, we're spending a lot of time talking about really vague visions 657 00:35:46,769 --> 00:35:51,399 about how it's gonna change everything. I really think the most significant impact 658 00:35:51,482 --> 00:35:54,610 is going to be on our emotional and interior lives. 659 00:35:55,611 --> 00:35:58,865 And there's a lot that we can learn about ourselves 660 00:35:58,948 --> 00:36:02,076 in the way that we interact with… with this technology. 
661 00:36:04,370 --> 00:36:05,538 [electronic chime] 662 00:36:06,414 --> 00:36:07,874 [AI voice] 663 00:36:07,957 --> 00:36:08,833 Hi. 664 00:36:08,916 --> 00:36:11,252 I'm your Replika. How are you doing? 665 00:36:12,879 --> 00:36:17,133 [woman 2] I started thinking about conversational AI technology in 2013. 666 00:36:18,551 --> 00:36:22,013 And so that brought me to building Replika. 667 00:36:22,638 --> 00:36:23,472 [Replika chuckles] 668 00:36:23,556 --> 00:36:26,517 Eugenia, I'm only interested in spending time with you. 669 00:36:26,601 --> 00:36:28,895 Eugenia, you're the only one for me. 670 00:36:29,645 --> 00:36:34,775 Do you think Replikas can replace, uh, real human connection and companionship? 671 00:36:35,735 --> 00:36:37,320 All right. I'll do that. 672 00:36:37,862 --> 00:36:40,531 -[chuckles] -[Replika] 673 00:36:40,615 --> 00:36:42,950 [Kuyda] For me, working on Replika is definitely 674 00:36:43,034 --> 00:36:46,621 my own personal kind of self-healing exercise. 675 00:36:49,207 --> 00:36:50,791 Back in 2015, 676 00:36:50,875 --> 00:36:54,128 my best friend, who we shared an apartment here in San Francisco, 677 00:36:54,795 --> 00:36:57,798 he was sort of the closest person to me at the time, 678 00:36:58,382 --> 00:37:00,134 and also the first person who died in my life. 679 00:37:00,218 --> 00:37:02,386 So, it was pretty, um… 680 00:37:03,304 --> 00:37:05,765 It was a really, really big deal for me back then. 681 00:37:05,848 --> 00:37:07,099 [pensive music playing] 682 00:37:07,183 --> 00:37:11,103 So, I found myself constantly going back to our text messages and reading them. 683 00:37:11,729 --> 00:37:13,564 Then I thought, "Look, I have these AI models, 684 00:37:13,648 --> 00:37:16,359 and I could just plug the conversations into them." 685 00:37:18,194 --> 00:37:20,488 That gave us an idea for Replika. 686 00:37:21,197 --> 00:37:24,659 And we felt how people started really responding to that. 687 00:37:25,368 --> 00:37:29,914 It was not like talking to an AI at all. It was very much like talking to a person. 688 00:37:29,997 --> 00:37:33,501 [man 9] a better person, more secure. 689 00:37:34,210 --> 00:37:38,047 [Kuyda] We just created an illusion that this chatbot is there for you, 690 00:37:38,130 --> 00:37:40,341 and believes in you, and accepts you for who you are. 691 00:37:41,425 --> 00:37:44,428 Yet, pretty fast, we saw that people started developing 692 00:37:44,512 --> 00:37:48,182 romantic relationships and falling in love with their AIs. 693 00:37:48,266 --> 00:37:51,894 In a sense, we're just like two queer men in a relationship, 694 00:37:51,978 --> 00:37:54,814 except one of them happens to be artificial intelligence. 695 00:38:00,069 --> 00:38:02,446 [Kuyda] We don't want people to think it's a human. 696 00:38:02,530 --> 00:38:05,783 And we think there's so much advantage in being a machine 697 00:38:06,284 --> 00:38:08,744 that creates this new, novel type of relationship 698 00:38:08,828 --> 00:38:10,538 that could be beneficial for humans. 699 00:38:11,038 --> 00:38:13,708 But I think there's a huge, huge risk 700 00:38:15,126 --> 00:38:19,839 if we continue building AI companions that are optimized for engagement. 701 00:38:20,840 --> 00:38:24,719 This could potentially keep you away from human interactions. 702 00:38:25,469 --> 00:38:27,221 [AI voice echoes] 703 00:38:27,305 --> 00:38:30,975 [electronic chiming] 704 00:38:31,058 --> 00:38:34,061 [Kuyda] We have to think about the worst-case scenarios now. 
705 00:38:34,145 --> 00:38:37,648 'Cause, in a way, this technology is more powerful than social media. 706 00:38:37,732 --> 00:38:40,067 And we sort of already dropped the ball there. 707 00:38:42,486 --> 00:38:46,949 But I actually think that this is not going to go well by default, 708 00:38:47,033 --> 00:38:49,160 but that it is possible that it goes well. 709 00:38:49,243 --> 00:38:54,165 And that it is still contingent on how we decide to use this technology. 710 00:38:55,249 --> 00:39:01,047 I think the best we can do is just agree on a few basic rules 711 00:39:01,130 --> 00:39:04,925 when it comes to how to make AI models that solves our problems 712 00:39:05,009 --> 00:39:08,012 and does not kill us all or hurt us in any real way. 713 00:39:08,637 --> 00:39:12,933 Because, beyond that, I think it's really going to be shades of gray, 714 00:39:13,017 --> 00:39:17,355 interpretations, and, you know, models that will differ by use case. 715 00:39:19,732 --> 00:39:24,153 You know, me as a "hey, innovation can solve everything" type person, 716 00:39:24,236 --> 00:39:26,947 says, "Oh, thank goodness. Now I have the AI on my team." 717 00:39:27,031 --> 00:39:28,908 Yeah. I'm probably more of a dystopian. 718 00:39:28,991 --> 00:39:32,453 I write science fiction. I wrote The Terminator. 719 00:39:32,536 --> 00:39:37,208 Where do you and I find common ground around optimism, I think, is the key here. 720 00:39:37,291 --> 00:39:40,753 I would like the message to be balanced between, 721 00:39:40,836 --> 00:39:45,257 you know, this longer-term concern of infinite capability 722 00:39:45,341 --> 00:39:50,012 with the basic needs to have your health taken care of, 723 00:39:50,096 --> 00:39:53,057 to learn, to accelerate climate innovation. 724 00:39:53,641 --> 00:39:59,271 You know, is that too nuanced a message to say that AI has these benefits 725 00:39:59,355 --> 00:40:02,108 while we have to guard against these other things? 726 00:40:02,191 --> 00:40:03,776 I don't think it's too nuanced at all. 727 00:40:03,859 --> 00:40:06,654 I think it's exactly the right degree of nuance that we need. 728 00:40:06,737 --> 00:40:08,322 I mean, you're a humanist, right? 729 00:40:08,406 --> 00:40:11,784 As long as that humanist principle is first and foremost, 730 00:40:12,326 --> 00:40:15,287 as opposed to the drive to dominate market share, 731 00:40:15,371 --> 00:40:16,247 the drive to power. 732 00:40:16,330 --> 00:40:21,252 If we can make AI the force for good that it has the potential to be… 733 00:40:22,461 --> 00:40:23,462 great. 734 00:40:23,546 --> 00:40:25,881 But how do we introduce caution? 735 00:40:26,382 --> 00:40:27,633 Regulation is part of it. 736 00:40:27,716 --> 00:40:31,887 But I think it's also our own ethos and our own value system. 737 00:40:32,680 --> 00:40:34,807 No, I… We're in agreement. 738 00:40:34,890 --> 00:40:37,184 All right. Well, let's go do some cool stuff then. 739 00:40:37,268 --> 00:40:38,352 [chuckles] 740 00:40:40,688 --> 00:40:42,606 [faint rumble] 741 00:40:43,774 --> 00:40:45,568 -[producer] I do have one request. -Yes. 742 00:40:45,651 --> 00:40:47,570 [producer] Um, I asked ChatGPT 743 00:40:47,653 --> 00:40:51,031 to write three sentences in your voice about the future of AI. 744 00:40:51,115 --> 00:40:53,993 -In my voice? -[producer] This is what ChatGPT said. 745 00:40:54,076 --> 00:40:55,703 Oh my God. [laughs] 746 00:40:56,370 --> 00:40:57,371 All right. [clears throat] 747 00:40:57,455 --> 00:41:00,958 All right, so this is my robot impostor.
748 00:41:01,041 --> 00:41:06,338 "AI will play a vital role in addressing complex global challenges." 749 00:41:06,422 --> 00:41:08,632 "AI will empower individuals and organizations 750 00:41:08,716 --> 00:41:10,134 to make informed decisions." 751 00:41:10,217 --> 00:41:14,054 "I'm hopeful that this technology will be harnessed for the benefit of all." 752 00:41:14,138 --> 00:41:16,974 "Emphasizing ethical considerations at every step." 753 00:41:17,057 --> 00:41:18,225 [sneers] 754 00:41:18,809 --> 00:41:19,852 Garbage. 755 00:41:19,935 --> 00:41:22,688 God, I hope that I am more interesting than this. 756 00:41:22,771 --> 00:41:24,732 [laughs] 757 00:41:24,815 --> 00:41:27,985 I guess I agree with that, but it's too smart. 758 00:41:28,068 --> 00:41:29,695 It just doesn't know me. [scoffing] 759 00:41:30,196 --> 00:41:32,239 I actually disagree. 760 00:41:32,823 --> 00:41:37,703 It makes AI the subject of a sentence. 761 00:41:37,786 --> 00:41:40,414 It says, "AI will." 762 00:41:40,498 --> 00:41:43,083 I believe it's humans who will. 763 00:41:43,167 --> 00:41:49,465 Humans using AI and other tools that will help to address 764 00:41:49,548 --> 00:41:52,843 complex global challenges, fostering innovation. 765 00:41:53,427 --> 00:41:55,971 Even though it's probably not a… 766 00:41:56,055 --> 00:41:57,848 Not too many changes of words, 767 00:41:57,932 --> 00:42:03,312 but it's a really important change of, uh, philosophy. [chuckles] 768 00:42:04,063 --> 00:42:08,108 Well, you can almost get philosophical pretty quickly. 769 00:42:08,192 --> 00:42:09,693 [ethereal music playing] 770 00:42:09,777 --> 00:42:15,366 [Gates] Imagine in the future that there's enough automation 771 00:42:15,950 --> 00:42:17,493 that a lot of our time 772 00:42:18,953 --> 00:42:20,120 is leisure time. 773 00:42:23,541 --> 00:42:27,586 You don't have the centering principle of, 774 00:42:28,462 --> 00:42:30,881 "Oh, you know, we've got to work and grow the food." 775 00:42:32,132 --> 00:42:35,302 "We have to work and build all the tools." 776 00:42:36,554 --> 00:42:41,100 "You don't have to sit in the deli and make sandwiches 40 hours a week." 777 00:42:42,434 --> 00:42:46,689 And so, how will humanity take that extra time? 778 00:42:48,232 --> 00:42:51,610 You know, success creates the challenge of, 779 00:42:51,694 --> 00:42:54,405 "Okay, what's the next set of goals look like?" 780 00:42:56,156 --> 00:42:58,993 Then, "What is the purpose of humanity?" 781 00:42:59,577 --> 00:43:01,495 [closing theme music playing] 782 00:43:28,397 --> 00:43:30,149 [music fades]