1 00:00:16,750 --> 00:00:18,285 [phone dialing] 2 00:00:18,318 --> 00:00:20,687 [line ringing] 3 00:00:20,721 --> 00:00:23,490 - Levy 911, what is the address of your emergency? 4 00:00:23,524 --> 00:00:25,025 - There was just a wreck. 5 00:00:25,058 --> 00:00:26,260 A head-on collision right here-- 6 00:00:26,293 --> 00:00:27,628 Oh, my God Almighty. 7 00:00:27,661 --> 00:00:30,697 [phones dialing] 8 00:00:30,731 --> 00:00:32,399 - Hello? 9 00:00:32,432 --> 00:00:34,701 - They just had a bad accident in front of the BP station. 10 00:00:34,735 --> 00:00:36,403 - I don't know how bad it is, but it sounded nasty. 11 00:00:36,436 --> 00:00:39,173 - Okay, we've got--we've got multiple 911 calls, sir. 12 00:00:39,206 --> 00:00:40,741 [phones dialing] 13 00:00:40,774 --> 00:00:43,076 - Yes, ma'am, a little car, a little sportscar. 14 00:00:43,110 --> 00:00:45,112 - A little black car and a semi car 15 00:00:45,145 --> 00:00:46,313 went under a semi-truck. 16 00:00:46,346 --> 00:00:47,414 - Come on underneath the truck. 17 00:00:47,447 --> 00:00:49,082 - Took the top of the car off. 18 00:00:49,116 --> 00:00:50,584 - Okay, are the vehicles in the road? 19 00:00:50,617 --> 00:00:52,553 - No, vehicle--he went way off the road. 20 00:00:52,586 --> 00:00:54,755 - Went off into the ditch and went to the woods. 21 00:00:54,788 --> 00:00:56,623 - Ran off probably a quarter mile 22 00:00:56,657 --> 00:00:57,791 from where the wreck happened. 23 00:00:57,824 --> 00:00:59,326 - I watched it happen. 24 00:00:59,359 --> 00:01:02,396 - Okay, sir, is there any obvious injuries? 25 00:01:02,429 --> 00:01:04,765 - I mean, he's dead as hell. 26 00:01:04,798 --> 00:01:06,600 - He--he's dead? 27 00:01:06,633 --> 00:01:09,002 - Yeah, it's kind of obvious. 28 00:01:09,036 --> 00:01:11,572 - A fatal crash involving a Tesla. 29 00:01:11,605 --> 00:01:12,739 - Tesla. - Tesla. 30 00:01:12,773 --> 00:01:14,441 - Related to the Autopilot mode. 31 00:01:14,474 --> 00:01:16,243 - Autopilot mode. - We've heard warnings 32 00:01:16,276 --> 00:01:17,778 about the dangers of this technology. 33 00:01:17,811 --> 00:01:19,246 - Tesla is under fire. 34 00:01:19,279 --> 00:01:21,181 - Critics have been calling for changes 35 00:01:21,215 --> 00:01:22,683 to Tesla's Autopilot software. 36 00:01:22,716 --> 00:01:26,353 - Tweets and retweets from Tesla CEO Elon Musk, 37 00:01:26,386 --> 00:01:29,423 pointing out that other auto companies are involved 38 00:01:29,456 --> 00:01:31,525 in far more fatal crashes. 39 00:01:31,558 --> 00:01:35,162 - How did a Tesla on Autopilot slam into a tractor-trailer? 40 00:01:35,195 --> 00:01:37,464 Why didn't the built-in safety system stop it? 41 00:01:37,497 --> 00:01:40,734 - Who's to blame: the driver or the car? 42 00:01:48,475 --> 00:01:52,179 - It is my honor to welcome to the stage Mr. Elon Musk. 43 00:01:52,212 --> 00:01:55,215 [cheers and applause] 44 00:01:59,253 --> 00:02:01,221 - Welcome, everyone, to the Model 3 unveil. 45 00:02:01,255 --> 00:02:02,456 [audience shouting] 46 00:02:02,489 --> 00:02:05,726 - For years, Elon Musk has talked about being 47 00:02:05,759 --> 00:02:09,229 on the verge of self-driving car technology. 48 00:02:09,263 --> 00:02:11,565 - No hands. No feet. Nothing. 49 00:02:11,598 --> 00:02:15,169 - Elon Musk approaches this the way a lot of people 50 00:02:15,202 --> 00:02:19,039 in Silicon Valley do: they are often telling you things 51 00:02:19,072 --> 00:02:20,607 about how the future will be. 
52 00:02:20,641 --> 00:02:23,177 - Getting in a car will be like getting in an elevator. 53 00:02:23,210 --> 00:02:24,678 You just tell it where you want to go, 54 00:02:24,711 --> 00:02:28,749 and it takes you there with extreme levels of safety. 55 00:02:28,782 --> 00:02:30,284 And that'll be normal. 56 00:02:30,317 --> 00:02:31,552 [applause] 57 00:02:31,585 --> 00:02:33,420 - Musk is certainly a visionary. 58 00:02:33,453 --> 00:02:36,423 - "Time" Magazine's Person of the Year has been released: 59 00:02:36,456 --> 00:02:39,259 Elon Musk, the Tesla and SpaceX CEO, 60 00:02:39,293 --> 00:02:41,295 for driving society's most daring 61 00:02:41,328 --> 00:02:43,497 and disruptive transformations. 62 00:02:43,530 --> 00:02:47,501 - Elon Musk wanted to disrupt and revolutionize 63 00:02:47,534 --> 00:02:48,735 the auto industry. 64 00:02:48,769 --> 00:02:51,705 And Autopilot was kind of a halo. 65 00:02:51,738 --> 00:02:54,741 It gave Tesla this image of the way they like 66 00:02:54,775 --> 00:02:59,112 to be portrayed as a technology company. 67 00:02:59,146 --> 00:03:01,114 - The difference is that the stakes are higher 68 00:03:01,148 --> 00:03:02,516 for the technology. 69 00:03:02,549 --> 00:03:06,253 - Whoa. It nearly drove us into the Subaru there. 70 00:03:06,286 --> 00:03:09,756 - We're talking about physical cars on the road, 71 00:03:09,790 --> 00:03:11,592 and we're talking about lives at stake. 72 00:03:11,625 --> 00:03:13,460 - Yet another driver caught on camera 73 00:03:13,493 --> 00:03:15,195 apparently asleep at the wheel. 74 00:03:15,229 --> 00:03:18,198 - But will it end the question of how much control drivers 75 00:03:18,232 --> 00:03:21,268 should really hand over to the computers in their car? 76 00:03:22,669 --> 00:03:26,106 - Tesla is under a lot more scrutiny now 77 00:03:26,139 --> 00:03:27,274 than it has been before. 78 00:03:27,307 --> 00:03:29,610 And part of that is in their marketing. 79 00:03:29,643 --> 00:03:31,278 - The rollout of the latest version 80 00:03:31,311 --> 00:03:33,247 of its self-driving technology. 81 00:03:33,280 --> 00:03:38,218 - Wow. Oh, my God! Okay. Okay. 82 00:03:38,252 --> 00:03:40,487 - There's this enormous gray area 83 00:03:40,521 --> 00:03:42,289 Tesla's willing to explore. 84 00:03:42,322 --> 00:03:44,658 - Ahh! 85 00:03:44,691 --> 00:03:48,529 This is very uncomfortable so far. 86 00:03:48,562 --> 00:03:51,131 - How much do you push the edge of the envelope? 87 00:03:53,100 --> 00:03:56,303 - My car should get over here. 88 00:03:56,336 --> 00:03:59,406 Okay, that was-- that was great. 89 00:03:59,439 --> 00:04:00,741 That was really great. 90 00:04:00,774 --> 00:04:02,376 That was one of those times where I was like, 91 00:04:02,409 --> 00:04:03,443 okay, it can do it. 92 00:04:03,477 --> 00:04:04,545 Like, I know it can do it. 93 00:04:04,578 --> 00:04:06,113 It just needs to do it every time. 94 00:04:06,146 --> 00:04:11,318 ♪ ♪ 95 00:04:11,351 --> 00:04:13,787 This technology is eventually 96 00:04:13,820 --> 00:04:16,523 going to enable the car to drive itself. 97 00:04:18,192 --> 00:04:20,260 And that's what I am testing. 98 00:04:20,294 --> 00:04:23,030 Navigate to the East Greenwich Super Charger. 99 00:04:23,063 --> 00:04:24,765 What Tesla calls Early Access Program-- 100 00:04:24,798 --> 00:04:27,301 it's a group of owners that test software, 101 00:04:27,334 --> 00:04:29,203 but it's not public. 102 00:04:30,370 --> 00:04:32,206 You can see what the car sees here. 
103 00:04:32,239 --> 00:04:34,274 It sees the pedestrians. 104 00:04:34,308 --> 00:04:36,577 It sees that this is a pickup truck 105 00:04:36,610 --> 00:04:38,612 that has come up behind us. 106 00:04:42,149 --> 00:04:44,184 This is a scary street. 107 00:04:44,218 --> 00:04:46,119 See how narrow this is? 108 00:04:47,454 --> 00:04:52,593 See how hesitant it is, though, while it's going by these cars? 109 00:04:52,626 --> 00:04:56,063 It--it's learning. It's not there yet. 110 00:04:57,631 --> 00:05:00,567 You know, people believe that it's gonna happen, 111 00:05:00,601 --> 00:05:02,369 or else you wouldn't do it, right? 112 00:05:04,438 --> 00:05:06,640 I think that the average is like 35,000, 113 00:05:06,673 --> 00:05:10,777 36,000 auto deaths a year in just the United States. 114 00:05:10,811 --> 00:05:13,380 And I think 90-something percent 115 00:05:13,413 --> 00:05:16,016 of those are human error. 116 00:05:16,049 --> 00:05:18,719 I believe that autonomy is necessary 117 00:05:18,752 --> 00:05:23,490 to end virtually all traffic deaths in this country. 118 00:05:25,292 --> 00:05:28,428 I think that Elon Musk is somebody that comes along, 119 00:05:28,462 --> 00:05:30,264 like, once in a generation. 120 00:05:30,297 --> 00:05:33,433 ♪ ♪ 121 00:05:33,467 --> 00:05:35,102 - Now, Elon Musk, I think, 122 00:05:35,135 --> 00:05:38,238 is a name known to everybody who thinks about the future. 123 00:05:38,272 --> 00:05:40,607 - Musk's fascination with technology 124 00:05:40,641 --> 00:05:43,610 dates to his childhood in South Africa. 125 00:05:43,644 --> 00:05:45,546 - Where I grew up was extremely violent. 126 00:05:45,579 --> 00:05:47,681 I got punched in the face many times. 127 00:05:47,714 --> 00:05:50,117 I almost got beaten to death once. 128 00:05:50,150 --> 00:05:51,385 And I think, if you have not been punched 129 00:05:51,418 --> 00:05:53,353 in the face with a fist, 130 00:05:53,387 --> 00:05:55,155 you don't know what--you have no idea what it's like. 131 00:05:55,189 --> 00:05:56,557 - He hated going to school, 132 00:05:56,590 --> 00:05:58,692 because the other kids liked to follow him home, 133 00:05:58,725 --> 00:06:01,428 and they would throw soda cans at his head. 134 00:06:01,461 --> 00:06:04,264 So he sought refuge in computer games, 135 00:06:04,298 --> 00:06:06,033 which got him into coding. 136 00:06:06,066 --> 00:06:08,535 - When you were a kid, you programmed a game? 137 00:06:08,569 --> 00:06:09,636 Blaster, it's called? 138 00:06:09,670 --> 00:06:11,205 - Yeah, it's a simple game. 139 00:06:11,238 --> 00:06:13,473 - By 17, you were on a plane from South Africa. 140 00:06:13,507 --> 00:06:14,741 - Yeah. I kind of wanted to be 141 00:06:14,775 --> 00:06:17,010 where the cutting edge of technology was. 142 00:06:18,212 --> 00:06:20,047 Part of the reason I got interested in technology-- 143 00:06:20,080 --> 00:06:22,115 maybe the reason-- was video games. 144 00:06:22,149 --> 00:06:23,717 I worked at a gaming startup, 145 00:06:23,750 --> 00:06:25,686 which weirdly was called Rocket Science. 146 00:06:25,719 --> 00:06:26,720 Yeah. 147 00:06:26,753 --> 00:06:29,022 [laughter] 148 00:06:29,056 --> 00:06:30,424 Fate loves irony. 149 00:06:32,259 --> 00:06:36,496 - In summer of '94, Elon came as a summer intern. 150 00:06:36,530 --> 00:06:38,031 He was kinda introverted, 151 00:06:38,065 --> 00:06:40,701 so he fit right into the rest of the group. 152 00:06:40,734 --> 00:06:44,338 Very, very interested in world building, in storytelling. 
153 00:06:44,371 --> 00:06:47,074 We thought Elon was gonna be an entrepreneur, clearly. 154 00:06:47,107 --> 00:06:48,475 - You had a brief stint at Stanford. 155 00:06:48,509 --> 00:06:50,577 - That's right. - A Ph. D. in Applied Physics? 156 00:06:50,611 --> 00:06:52,412 - Applied Physics, Material Science. 157 00:06:52,446 --> 00:06:53,780 But then the Internet came along, 158 00:06:53,814 --> 00:06:55,716 and it seemed like I could either do a Ph. D. 159 00:06:55,749 --> 00:06:58,385 and watch the internet happen, or I could participate 160 00:06:58,418 --> 00:07:01,121 and help build it in some fashion. 161 00:07:01,154 --> 00:07:03,290 So, I started a company with my brother 162 00:07:03,323 --> 00:07:05,025 and a friend of mine, Greg Kouri, 163 00:07:05,058 --> 00:07:06,793 and created Zip2, where the initial idea 164 00:07:06,827 --> 00:07:08,529 was to create software that could help 165 00:07:08,562 --> 00:07:10,697 bring the media companies online. 166 00:07:10,731 --> 00:07:12,533 - You know, it's hard to remember a time 167 00:07:12,566 --> 00:07:15,536 without the ubiquity of, like, Google Maps and other things, 168 00:07:15,569 --> 00:07:18,205 and the expectation that, if you're gonna find anything, 169 00:07:18,238 --> 00:07:20,307 there is a digital path to finding it. 170 00:07:20,340 --> 00:07:24,077 And none of that existed in 1995. 171 00:07:25,212 --> 00:07:27,214 I think it was very forward-thinking. 172 00:07:27,247 --> 00:07:29,783 That's what makes a great entrepreneur. 173 00:07:29,816 --> 00:07:31,151 - Elon had a very strong personality, 174 00:07:32,319 --> 00:07:34,621 so sometimes he would get into arguments with people, 175 00:07:34,655 --> 00:07:37,724 and they would be pretty intense. 176 00:07:37,758 --> 00:07:40,160 He's not the kind of guy who went out for beers 177 00:07:40,194 --> 00:07:42,229 with people and saw movies and things like that. 178 00:07:42,262 --> 00:07:43,730 He basically just worked. 179 00:07:43,764 --> 00:07:47,000 - I started off being the CEO. But after we got VC funding, 180 00:07:47,034 --> 00:07:51,772 the venture capitalists wanted to hire a professional CEO. 181 00:07:51,805 --> 00:07:53,407 - Elon was always asking me, 182 00:07:53,440 --> 00:07:55,309 like, what should his title be? 183 00:07:55,342 --> 00:07:56,643 You know, what should his role be? 184 00:07:56,677 --> 00:07:58,145 I think, more than anything, 185 00:07:58,178 --> 00:08:00,380 he wanted to be the face of Zip2. 186 00:08:02,216 --> 00:08:04,117 - Wow. I can't believe it's actually here. 187 00:08:04,151 --> 00:08:05,719 That's pretty wild, man. 188 00:08:05,752 --> 00:08:07,254 - Elon was clearly, like, 189 00:08:07,287 --> 00:08:09,990 one of the most driven people I've ever known. 190 00:08:10,791 --> 00:08:14,394 - A year ago, Musk sold his software company, Zip2. 191 00:08:14,428 --> 00:08:17,664 - He took the $22 million he made from Zip2. 192 00:08:17,698 --> 00:08:19,266 He obviously could have retired. 193 00:08:19,299 --> 00:08:22,669 But instead, he just worked to start his next company, 194 00:08:22,703 --> 00:08:24,338 X.com, which became PayPal. 195 00:08:24,371 --> 00:08:29,510 - The company was sold to eBay in '02 for $1.5 billion. 196 00:08:29,543 --> 00:08:32,579 - I think Elon wants to make a dent in the world. 197 00:08:32,613 --> 00:08:35,115 We all have a finite amount of time. 198 00:08:35,148 --> 00:08:38,352 If you can move the needle, then why not? 
199 00:08:38,385 --> 00:08:40,454 And the capital allowed him to do things 200 00:08:40,487 --> 00:08:41,655 that were really important. 201 00:08:41,688 --> 00:08:44,124 - After PayPal, I started debating 202 00:08:44,157 --> 00:08:48,595 between either solar, electric car, or space. 203 00:08:48,629 --> 00:08:50,397 I thought, like, nobody is gonna be 204 00:08:50,430 --> 00:08:53,100 crazy enough to do space, so I better do space. 205 00:08:53,133 --> 00:08:58,338 So, we embarked on that journey to create SpaceX in 2002. 206 00:08:58,372 --> 00:08:59,740 And in the beginning, I wouldn't--actually 207 00:08:59,773 --> 00:09:01,241 wouldn't even let my friends invest 208 00:09:01,275 --> 00:09:02,609 because I didn't want to lose their money. 209 00:09:02,643 --> 00:09:05,779 - Stage 1. - We have liftoff indication. 210 00:09:05,812 --> 00:09:07,648 - Before all the drama of SpaceX, 211 00:09:07,681 --> 00:09:10,651 I think Tesla has actually been probably 2/3 212 00:09:10,684 --> 00:09:14,321 of my total drama dose of a time. 213 00:09:14,354 --> 00:09:19,426 ♪ ♪ 214 00:09:19,459 --> 00:09:22,796 - In 2003, my business partner at the time, Martin Eberhard, 215 00:09:22,829 --> 00:09:26,433 and I were getting really concerned about climate change. 216 00:09:26,466 --> 00:09:30,003 We knew that you could make electric cars. 217 00:09:30,037 --> 00:09:32,172 So we started Tesla Motors. 218 00:09:33,707 --> 00:09:37,277 We began looking at raising significant money. 219 00:09:38,745 --> 00:09:41,782 We visited Elon at SpaceX's original office, 220 00:09:41,815 --> 00:09:43,317 and there was a couple of things 221 00:09:43,350 --> 00:09:45,719 that were really different about pitching to Elon. 222 00:09:45,752 --> 00:09:48,655 First, he understood the mission immediately. 223 00:09:48,689 --> 00:09:52,059 - It's very important that we accelerate the transition away 224 00:09:52,092 --> 00:09:54,795 from gasoline, you know, 225 00:09:54,828 --> 00:09:57,264 for environmental reasons, for economic reasons, 226 00:09:57,297 --> 00:09:59,233 for national security reasons. 227 00:09:59,266 --> 00:10:01,568 - The other thing is some of the feedback 228 00:10:01,602 --> 00:10:04,037 that we'd get from the regular venture community 229 00:10:04,071 --> 00:10:08,609 was that the idea is just kinda too crazy. 230 00:10:08,642 --> 00:10:11,345 When you're pitching someone who's building a rocket ship 231 00:10:11,378 --> 00:10:14,114 directly on the other side of the glass panel, 232 00:10:14,147 --> 00:10:16,283 you know, that you're in the conference room, 233 00:10:16,316 --> 00:10:17,584 you kinda feel he's not gonna say 234 00:10:17,618 --> 00:10:19,253 that your idea is too crazy. 235 00:10:21,054 --> 00:10:22,623 - I could have been the CEO from day one, 236 00:10:22,656 --> 00:10:25,225 but the idea of being CEO of two startups 237 00:10:25,259 --> 00:10:30,297 at the same time was not appealing. 238 00:10:30,330 --> 00:10:32,466 - Elon was Chairman of the Board, 239 00:10:32,499 --> 00:10:34,134 and he would check in every month. 240 00:10:34,168 --> 00:10:36,303 ♪ ♪ 241 00:10:36,336 --> 00:10:38,038 One of Elon's things-- he said, you know, 242 00:10:38,071 --> 00:10:40,307 you only get to introduce the car once. 243 00:10:40,340 --> 00:10:42,743 So you kinda wanna make it as good as you can get it. 244 00:10:42,776 --> 00:10:45,012 And when we unveiled it, we did it in Los Angeles. 
245 00:10:45,612 --> 00:10:48,215 - We've got a zero-emission sportscar 246 00:10:48,248 --> 00:10:50,384 that can go head-to-head with a Ferrari 247 00:10:50,417 --> 00:10:51,418 and a Porsche and win. 248 00:10:52,319 --> 00:10:53,687 - The governor, Arnold Schwarzenegger, 249 00:10:53,720 --> 00:10:55,422 at the time, you know, shows up. 250 00:10:55,455 --> 00:10:59,193 And he's kinda big, and the car's kinda small. 251 00:10:59,226 --> 00:11:00,460 And I was worried that, you know, 252 00:11:00,494 --> 00:11:02,095 the Governor might get stuck in the car, 253 00:11:02,129 --> 00:11:04,231 and that would be, like, a, you know, PR nightmare. 254 00:11:04,264 --> 00:11:05,766 But he really liked it. In fact, he ended up 255 00:11:05,799 --> 00:11:07,367 ordering one a little bit later. 256 00:11:10,270 --> 00:11:12,472 But there were a lot of unexpected challenges 257 00:11:12,506 --> 00:11:14,274 developing, you know, the Roadster. 258 00:11:15,409 --> 00:11:17,744 And that's when Elon began to get much more involved, 259 00:11:17,778 --> 00:11:20,647 because, you know, we were in trouble. 260 00:11:20,681 --> 00:11:25,686 - 2008 was brutal. Tesla almost went bankrupt. 261 00:11:25,719 --> 00:11:28,222 We closed our financing round 262 00:11:28,255 --> 00:11:30,524 6:00 p. m., Christmas Eve, 2008. 263 00:11:30,557 --> 00:11:33,460 It was the last hour of the last day that it was possible. 264 00:11:33,493 --> 00:11:35,362 And I thought, okay, I got to bite the bullet 265 00:11:35,395 --> 00:11:38,065 and run the company, 266 00:11:40,067 --> 00:11:41,468 'cause there's just too much at stake. 267 00:11:41,502 --> 00:11:43,704 ♪ ♪ 268 00:11:43,737 --> 00:11:47,341 - From the time Elon became CEO to now, 269 00:11:47,374 --> 00:11:49,209 I mean, Tesla's changed the way 270 00:11:49,243 --> 00:11:51,111 people think about electric vehicles. 271 00:11:51,144 --> 00:11:54,448 - "Consumer Reports" says this is the best car 272 00:11:54,481 --> 00:11:58,018 they've tested in the history of the magazine. 273 00:11:58,051 --> 00:12:01,622 - When Tesla went public, that was incredibly important. 274 00:12:01,655 --> 00:12:04,358 He was able to push Tesla to grow 275 00:12:04,391 --> 00:12:06,293 and become much bigger much faster 276 00:12:06,326 --> 00:12:08,695 than I think most people had thought possible. 277 00:12:08,729 --> 00:12:13,200 - Tesla, surpassing a $1 trillion valuation today. 278 00:12:13,233 --> 00:12:16,003 - Elon Musk is now the richest person on the planet. 279 00:12:16,036 --> 00:12:17,638 ♪ ♪ 280 00:12:17,671 --> 00:12:21,141 - His pace of innovation is kind of unmatched. 281 00:12:21,175 --> 00:12:24,578 He can take more risks than any human can now, 282 00:12:24,611 --> 00:12:27,381 simply because he has the resources 283 00:12:27,414 --> 00:12:29,049 that allow him to do things 284 00:12:29,082 --> 00:12:32,719 that would be irresponsible or insane for anybody else. 285 00:12:39,092 --> 00:12:42,062 [light music] 286 00:12:42,095 --> 00:12:47,634 ♪ ♪ 287 00:12:47,668 --> 00:12:49,369 - I love the acceleration. 288 00:12:49,403 --> 00:12:51,305 I love the fact that it's electric. 289 00:12:51,338 --> 00:12:54,341 And I love the fact that people are interested in Tesla 290 00:12:54,374 --> 00:12:56,143 as a brand and interested in the car. 291 00:12:56,176 --> 00:12:58,045 I like talking about it. 292 00:12:58,078 --> 00:13:01,114 2012, 2013, when they first released their Model S, 293 00:13:01,148 --> 00:13:02,649 I thought it was just a very cool car. 
294 00:13:02,683 --> 00:13:05,419 And at the time, you know, it was pretty revolutionary. 295 00:13:05,452 --> 00:13:07,154 Like, I believed in their mission. 296 00:13:07,187 --> 00:13:10,324 - Electric car can in fact be the best car in the world. 297 00:13:10,357 --> 00:13:12,159 [cheers and applause] 298 00:13:12,192 --> 00:13:14,728 - I think I had Elon Musk up on a pedestal. 299 00:13:14,761 --> 00:13:17,064 You know, he was kind of a hero at the time. 300 00:13:17,097 --> 00:13:19,366 ♪ ♪ 301 00:13:19,399 --> 00:13:21,168 I would follow his tweets. 302 00:13:21,201 --> 00:13:23,437 Think I have a mug with his face on it somewhere at home. 303 00:13:24,805 --> 00:13:27,708 - This was the first one that I got. 304 00:13:27,741 --> 00:13:29,309 Somebody gave it to me. 305 00:13:29,343 --> 00:13:36,316 The Tesla community is, I think, rare, for any brand. 306 00:13:36,350 --> 00:13:40,187 I think that any company would kill 307 00:13:40,220 --> 00:13:46,260 to have that level of fandom and devotion. 308 00:13:46,293 --> 00:13:47,561 - People that are the diehard fans, 309 00:13:47,594 --> 00:13:49,096 they have a bunch of different names. 310 00:13:50,464 --> 00:13:52,299 Like the Musketeers. 311 00:13:52,332 --> 00:13:55,035 And they think whatever Elon Musk says is, 312 00:13:55,068 --> 00:13:56,336 you know, gold, basically. 313 00:13:56,370 --> 00:13:58,305 - Elon Musk, how much do I have to beg 314 00:13:58,338 --> 00:14:00,007 to get a selfie with you? 315 00:14:00,040 --> 00:14:02,476 - Sure, I'll do a selfie. 316 00:14:02,509 --> 00:14:04,378 Sure. If your customers love you, 317 00:14:04,411 --> 00:14:08,182 your odds of success are dramatically higher. 318 00:14:08,215 --> 00:14:10,651 ♪ ♪ 319 00:14:10,684 --> 00:14:14,121 - Josh was a real Tesla enthusiast. 320 00:14:14,154 --> 00:14:17,558 He was very happy with the decision he made 321 00:14:17,591 --> 00:14:19,459 in buying that car. 322 00:14:19,493 --> 00:14:21,461 That's the last picture of him. - Yep. 323 00:14:21,495 --> 00:14:23,630 - Right after the vacation. 324 00:14:25,532 --> 00:14:27,067 Before the accident. 325 00:14:27,100 --> 00:14:30,070 [melancholy music] 326 00:14:30,103 --> 00:14:31,238 ♪ ♪ 327 00:14:31,271 --> 00:14:32,606 I was trying to think about 328 00:14:32,639 --> 00:14:34,575 how I first met Josh the other day, 329 00:14:34,608 --> 00:14:37,277 the exact timing that I met him. 330 00:14:38,779 --> 00:14:43,050 Mark Nelson and I were working on a computer program 331 00:14:43,083 --> 00:14:44,718 for the Navy EOD forces. 332 00:14:44,751 --> 00:14:47,588 EOD stands for explosive ordnance disposal. 333 00:14:47,621 --> 00:14:51,124 Most people know it by bomb disposal. 334 00:14:51,158 --> 00:14:55,662 - I was with a friend of mine in San Diego, Ken Falke. 335 00:14:55,696 --> 00:15:00,033 And he introduced me to this young sailor, 336 00:15:00,067 --> 00:15:01,502 Josh Brown. 337 00:15:01,535 --> 00:15:04,404 - From maintenance to training plans 338 00:15:04,438 --> 00:15:06,173 to whatever it might have been, 339 00:15:06,206 --> 00:15:09,443 this entire detachment was automated, 340 00:15:09,476 --> 00:15:12,012 thanks to Josh Brown. 341 00:15:12,045 --> 00:15:13,780 - What attracted him to it? 342 00:15:13,814 --> 00:15:16,149 I think it was the excitement. 343 00:15:16,183 --> 00:15:18,318 He liked to be doing things. 344 00:15:20,387 --> 00:15:24,558 - Man, this is actually a fairly treacherous bottom. 345 00:15:24,591 --> 00:15:27,094 Just in case you doubt this is me. 
346 00:15:27,127 --> 00:15:28,562 Hello, there. 347 00:15:28,595 --> 00:15:32,332 I think that might even get the motorcycle in there. Bye. 348 00:15:32,366 --> 00:15:34,001 - You get up in the morning and you decide, 349 00:15:34,034 --> 00:15:36,470 "Well, today, I'm gonna go for a three-mile run 350 00:15:36,503 --> 00:15:40,374 "and maybe a 1,000-yard swim, just to kind of get warmed up. 351 00:15:40,407 --> 00:15:42,376 "And then maybe I'll go diving today 352 00:15:42,409 --> 00:15:45,712 "or go parachuting or, you know, 353 00:15:45,746 --> 00:15:48,015 go blow something up." 354 00:15:48,048 --> 00:15:49,416 - It's interesting, too. Like, a lot of people 355 00:15:49,449 --> 00:15:51,251 kind of think, you know, these people 356 00:15:51,285 --> 00:15:53,253 who go and stand on landmines 357 00:15:53,287 --> 00:15:56,590 and render them safe have a death wish. 358 00:15:56,623 --> 00:15:59,159 It's just the opposite. You know, they do it 359 00:15:59,193 --> 00:16:02,496 because they know that somebody's gotta do it. 360 00:16:02,529 --> 00:16:06,433 - It's a profession where people just don't have 361 00:16:06,466 --> 00:16:09,670 more than one chance to make mistakes. 362 00:16:09,703 --> 00:16:11,705 - I used to call him the professor sometimes, 363 00:16:11,738 --> 00:16:14,641 because, you know, he was just so up on everything. 364 00:16:14,675 --> 00:16:17,611 Oh, geez. Car's doing it all itself. 365 00:16:17,644 --> 00:16:20,547 What am I gonna do with my hands down here? 366 00:16:20,581 --> 00:16:22,282 - His relationship to technology 367 00:16:22,316 --> 00:16:24,384 was really symbiotic. 368 00:16:24,418 --> 00:16:26,753 I mean, it was totally connected. 369 00:16:26,787 --> 00:16:29,122 And when he called and said, "Hey, I got a surprise car 370 00:16:29,156 --> 00:16:30,390 I want to show you," 371 00:16:30,424 --> 00:16:32,326 I wasn't surprised when it was a Tesla. 372 00:16:32,359 --> 00:16:34,228 - It was a beautiful looking car, 373 00:16:34,261 --> 00:16:37,331 and he loved it. 374 00:16:37,364 --> 00:16:39,199 He wanted to know 375 00:16:39,233 --> 00:16:42,035 how everything worked in that Tesla. 376 00:16:42,069 --> 00:16:44,338 - All right, so this video is gonna show some driving 377 00:16:44,371 --> 00:16:48,041 in some hills and turns, so that you can see 378 00:16:48,075 --> 00:16:49,543 how it's gonna react to different things. 379 00:16:49,576 --> 00:16:52,746 Overall, it actually does a fantastically good job. 380 00:16:52,779 --> 00:16:54,281 - He and the car, you know, 381 00:16:54,314 --> 00:16:58,118 were a match made in heaven, you know? 382 00:16:58,151 --> 00:16:59,720 He was perfectly suited 383 00:16:59,753 --> 00:17:03,357 for being on the bleeding edge of technology. 384 00:17:03,390 --> 00:17:07,294 And, you know, he had done a lot riskier things, 385 00:17:07,327 --> 00:17:10,097 you know, in his lifetime. 386 00:17:10,130 --> 00:17:12,099 - Well, I made it. 387 00:17:12,132 --> 00:17:14,001 That was--that was quite sporting. 388 00:17:14,034 --> 00:17:15,402 [laughter] 389 00:17:19,406 --> 00:17:22,376 [tense music] 390 00:17:22,409 --> 00:17:28,115 ♪ ♪ 391 00:17:28,148 --> 00:17:30,384 - Elon was looking for somebody to come in 392 00:17:30,417 --> 00:17:33,253 and help him really leverage his time. 393 00:17:33,287 --> 00:17:35,422 I had had a career as a serial entrepreneur, 394 00:17:35,455 --> 00:17:36,790 and I said to him several times, 395 00:17:36,823 --> 00:17:38,225 like, "I don't think I'm your guy. 
396 00:17:38,258 --> 00:17:40,194 You need a big company car guy." 397 00:17:40,227 --> 00:17:42,362 And he kept saying, "No, I don't. 398 00:17:42,396 --> 00:17:44,031 I need a fellow entrepreneur." 399 00:17:44,064 --> 00:17:46,166 And then he called me one day and said, 400 00:17:46,200 --> 00:17:47,467 "I have a question for you." 401 00:17:47,501 --> 00:17:50,170 He said, "Tell me about the meaning of your work. 402 00:17:50,204 --> 00:17:52,472 You are gonna be able to change the world at a scale 403 00:17:52,506 --> 00:17:54,107 that you won't be able to change the world 404 00:17:54,141 --> 00:17:56,610 in your own companies." 405 00:17:56,643 --> 00:17:59,613 And that's what got me at the end of the day. 406 00:17:59,646 --> 00:18:03,584 - Elon had a very specific way of motivating people. 407 00:18:03,617 --> 00:18:06,653 And that was he would say really cool things, 408 00:18:06,687 --> 00:18:08,355 like science fiction things. 409 00:18:08,388 --> 00:18:11,458 And he would make you believe that you could do it. 410 00:18:11,491 --> 00:18:13,427 - At some point in the future, like, maybe five 411 00:18:13,460 --> 00:18:15,262 or six years from now, I think we'll be able 412 00:18:15,295 --> 00:18:18,632 to achieve true autonomous driving, 413 00:18:18,665 --> 00:18:21,468 where you could literally get in the car, go to sleep, 414 00:18:21,502 --> 00:18:23,337 and wake up at your destination. 415 00:18:23,370 --> 00:18:26,273 - He is proposing that there is a better way. 416 00:18:26,306 --> 00:18:28,509 And I did really believe in the goals 417 00:18:28,542 --> 00:18:31,512 of the-of the team and Autopilot. 418 00:18:31,545 --> 00:18:38,185 ♪ ♪ 419 00:18:38,218 --> 00:18:41,121 - In 2014, the group was put together 420 00:18:41,154 --> 00:18:43,490 for Autopilot and autonomy. 421 00:18:43,524 --> 00:18:46,326 I was one of the early team members. 422 00:18:46,360 --> 00:18:48,729 - The naming happened before I arrived, 423 00:18:48,762 --> 00:18:53,433 but it obviously is a familiar term in aviation. 424 00:18:53,467 --> 00:18:56,236 You know, a car ought to be like a plane. 425 00:18:56,270 --> 00:18:58,005 And we trust airplanes to do this. 426 00:18:58,038 --> 00:18:59,439 We ought to be able to trust cars. 427 00:19:00,240 --> 00:19:02,442 [beep] - Please route us through. 428 00:19:02,476 --> 00:19:05,445 - Roger. You're now under automatic control. 429 00:19:05,479 --> 00:19:06,647 Hands-off steering. 430 00:19:06,680 --> 00:19:08,782 [luxurious music] 431 00:19:08,815 --> 00:19:10,417 - What people need to realize is this is 432 00:19:10,450 --> 00:19:12,186 a very, very old idea. 433 00:19:12,219 --> 00:19:14,188 ♪ ♪ 434 00:19:14,221 --> 00:19:16,223 Over, you know, the past decades, 435 00:19:16,256 --> 00:19:20,460 you've had efforts to build self-driving cars. 436 00:19:20,494 --> 00:19:23,697 In the 2000s, DARPA, which is a research arm 437 00:19:23,730 --> 00:19:26,200 of the U. S. Department of Defense, 438 00:19:26,233 --> 00:19:27,701 put on these contests. 439 00:19:27,734 --> 00:19:29,469 - Welcome to the DARPA Grand Challenge. 440 00:19:29,503 --> 00:19:30,737 The objective: 441 00:19:30,771 --> 00:19:33,674 create a self-navigating autonomous vehicle. 442 00:19:33,707 --> 00:19:36,376 - After one of these contests, Google got interested. 443 00:19:36,410 --> 00:19:40,447 - Google is developing a robocar that drives itself. 444 00:19:40,480 --> 00:19:42,449 - And over the past ten years, 445 00:19:42,482 --> 00:19:45,219 the self-driving car industry ramps up. 
446 00:19:45,252 --> 00:19:49,122 And Elon Musk of course is not gonna wanna miss out on that. 447 00:19:49,156 --> 00:19:55,028 ♪ ♪ 448 00:19:55,062 --> 00:19:56,296 - There was a bit of a mantra 449 00:19:56,330 --> 00:19:59,233 that would explain Elon's approach. 450 00:19:59,266 --> 00:20:01,134 And that mantra was that history is changed 451 00:20:01,168 --> 00:20:03,270 by unreasonable men. 452 00:20:03,303 --> 00:20:05,005 We were literally trying to change 453 00:20:05,038 --> 00:20:07,007 transportation worldwide. 454 00:20:08,442 --> 00:20:09,610 I remember the first time 455 00:20:09,643 --> 00:20:11,178 I walked in to Tesla's headquarters, 456 00:20:11,211 --> 00:20:13,514 where the Autopilot team sits. 457 00:20:13,547 --> 00:20:18,652 And there was a little sign on a pillar in that group, 458 00:20:18,685 --> 00:20:20,053 with a number on it. 459 00:20:20,087 --> 00:20:21,288 And I asked one of the engineers, 460 00:20:21,321 --> 00:20:23,290 "What's that about?" And he said, 461 00:20:23,323 --> 00:20:24,658 "That's the number of people that die 462 00:20:24,691 --> 00:20:26,493 "on U. S. highways every year. 463 00:20:26,527 --> 00:20:28,128 "That's why we're here. 464 00:20:28,161 --> 00:20:30,330 We're here to save lives." 465 00:20:30,364 --> 00:20:34,468 - Most other companies working on self-driving car at the time 466 00:20:34,501 --> 00:20:37,471 were building heavily on top of lidar. 467 00:20:37,504 --> 00:20:40,207 - Lidar--light detection and ranging. 468 00:20:40,240 --> 00:20:42,609 By bouncing pulses of light from a sensor, 469 00:20:42,643 --> 00:20:44,378 the vehicle's autonomous systems 470 00:20:44,411 --> 00:20:47,648 can figure out how far away objects are, allowing it to-- 471 00:20:47,681 --> 00:20:49,683 - And it can see through situations 472 00:20:49,716 --> 00:20:52,753 that your cameras might struggle with. 473 00:20:52,786 --> 00:20:56,590 And that's what is one of the attractions to lidar. 474 00:20:56,623 --> 00:20:58,025 - It seemed like the right decision 475 00:20:58,058 --> 00:20:59,760 for a lot of companies, and for Tesla, 476 00:20:59,793 --> 00:21:02,062 that was just not on the table. 477 00:21:03,463 --> 00:21:07,367 - Lidar was too expensive and very breakable. 478 00:21:07,401 --> 00:21:10,270 - Margins are thin, and every little bit matters. 479 00:21:10,304 --> 00:21:13,373 They have to get cars out into the world today, 480 00:21:13,407 --> 00:21:15,676 and they have to sell them. 481 00:21:15,709 --> 00:21:18,679 - And so the challenge that we took on was: 482 00:21:18,712 --> 00:21:23,450 could you achieve autonomy with radar and sonar 483 00:21:23,483 --> 00:21:25,519 and images from cameras? 484 00:21:25,552 --> 00:21:27,087 - So, if you're Elon Musk, 485 00:21:27,120 --> 00:21:29,723 he's gonna turn that into a positive, right? 486 00:21:29,756 --> 00:21:32,426 He's gonna tell the rest of the world--the language 487 00:21:32,459 --> 00:21:34,628 he's often liked to use-- that, "It's a crutch." 488 00:21:34,661 --> 00:21:36,730 - Lidar ends up being, like, somewhat of a crutch. 489 00:21:36,763 --> 00:21:40,467 - He starts to say, very early on, internally, 490 00:21:40,501 --> 00:21:42,436 and then pretty soon externally, 491 00:21:42,469 --> 00:21:45,138 that Tesla can build a self-driving car 492 00:21:45,172 --> 00:21:46,540 just with cameras. 493 00:21:46,573 --> 00:21:49,510 - You can absolutely be superhuman with just cameras. 
494 00:21:49,543 --> 00:21:51,211 - As human beings, we have two eyes, 495 00:21:51,245 --> 00:21:52,746 and we manage not to run into each other, 496 00:21:52,779 --> 00:21:54,715 for the most part, when we're walking down the street. 497 00:21:54,748 --> 00:21:57,551 So the idea was, if you put eight cameras around a car, 498 00:21:57,584 --> 00:22:00,487 and essentially gave that car eight eyes, 499 00:22:00,521 --> 00:22:02,256 you could keep it very safe 500 00:22:02,289 --> 00:22:04,057 from the other vehicles that are around it. 501 00:22:05,559 --> 00:22:08,729 - There was nothing theoretically preventing that 502 00:22:08,762 --> 00:22:09,730 from happening. 503 00:22:09,763 --> 00:22:11,498 Like, you know, humans do it, 504 00:22:11,532 --> 00:22:13,200 and so there must be a way 505 00:22:13,233 --> 00:22:15,335 eventually for us to do it with cameras. 506 00:22:15,369 --> 00:22:19,406 - There was no deep research phase, where various vehicles 507 00:22:19,439 --> 00:22:22,676 were outfitted with a range of sensors. 508 00:22:22,709 --> 00:22:25,012 Many team members would have liked that. 509 00:22:25,045 --> 00:22:27,781 Instead, the conclusion was made first, 510 00:22:27,814 --> 00:22:29,650 and then the tests and development activities 511 00:22:29,683 --> 00:22:33,053 began to prove that conclusion correct. 512 00:22:34,454 --> 00:22:38,058 - A big Silicon Valley company plans a major announcement. 513 00:22:38,091 --> 00:22:39,793 - Last night at an event in Southern California. 514 00:22:39,826 --> 00:22:41,061 - Holy shit! - Whoa. 515 00:22:41,094 --> 00:22:43,697 - Oh, shit. Oh, my God. 516 00:22:43,730 --> 00:22:45,499 - The first announcement for Autopilot 517 00:22:45,532 --> 00:22:48,035 was in the fall of 2014. 518 00:22:48,068 --> 00:22:49,570 - Welcome, everyone. [cheering] 519 00:22:49,603 --> 00:22:52,439 So, we've been able to accelerate Autopilot. 520 00:22:52,472 --> 00:22:54,474 - At the time, what they're building 521 00:22:54,508 --> 00:22:57,778 is really a driver-assisted system. 522 00:22:57,811 --> 00:23:01,615 The human must stay diligent, 523 00:23:01,648 --> 00:23:02,749 must keep their eyes on the road, 524 00:23:02,783 --> 00:23:03,750 ready to take over at any time. 525 00:23:03,784 --> 00:23:06,386 That's very different from a self-driving car. 526 00:23:06,420 --> 00:23:08,555 - It'll detect if there's a car in your blind spot. 527 00:23:08,589 --> 00:23:12,292 - But Elon Musk, and by extension Tesla, 528 00:23:12,326 --> 00:23:14,228 decided they were gonna tell people 529 00:23:14,261 --> 00:23:17,364 that we're on the way to a self-driving car. 530 00:23:17,397 --> 00:23:19,132 That's gonna be a selling point. 531 00:23:19,166 --> 00:23:20,601 - We have the Autopilot section here. 532 00:23:20,634 --> 00:23:22,002 And you can watch it. 533 00:23:22,035 --> 00:23:23,270 It'll read the speed limit signs, 534 00:23:23,303 --> 00:23:25,105 so we increase speed from 25 to 30. 535 00:23:25,138 --> 00:23:26,240 - It's following the white lines. 536 00:23:26,406 --> 00:23:27,508 - It's following the white lines. 537 00:23:28,242 --> 00:23:31,345 - The car can do almost anything. 538 00:23:31,378 --> 00:23:33,780 - He was up there in the lights, 539 00:23:33,814 --> 00:23:35,516 making all sorts of wild claims 540 00:23:35,549 --> 00:23:38,185 about the capabilities of that particular system. 
541 00:23:38,218 --> 00:23:41,388 - In fact, when you get home, you'll actually be able to 542 00:23:41,421 --> 00:23:43,390 just step out of the car 543 00:23:43,423 --> 00:23:45,425 and have it park itself in your garage. 544 00:23:45,459 --> 00:23:47,494 - At some point, he cracked a joke about 545 00:23:47,528 --> 00:23:50,063 how he was gonna say something 546 00:23:50,097 --> 00:23:52,499 that his engineers would hear for the first time. 547 00:23:52,533 --> 00:23:56,737 - And then something-something I'd like to do-- 548 00:23:56,770 --> 00:23:58,705 which I think many of our engineers will be hearing this 549 00:23:58,739 --> 00:24:01,775 in real time-- [laughter] 550 00:24:01,808 --> 00:24:05,078 is have the charge connector plug itself in. 551 00:24:05,112 --> 00:24:07,114 [laughter] 552 00:24:07,147 --> 00:24:09,783 Like an articulating-- like, sort of a snake. 553 00:24:09,816 --> 00:24:11,552 - And I thought to myself, buddy, 554 00:24:11,585 --> 00:24:13,620 you already said a lot that your engineers 555 00:24:13,654 --> 00:24:16,056 are hearing for the first time. 556 00:24:16,089 --> 00:24:19,626 - The other thing that is really heightened inside Tesla 557 00:24:19,660 --> 00:24:21,495 is you've got Elon Musk 558 00:24:21,528 --> 00:24:25,165 really driving the aura around these cars. 559 00:24:25,199 --> 00:24:28,769 - Autopilot was a very strong focus of Elon. 560 00:24:28,802 --> 00:24:32,139 He sort of created Tesla's brand around it. 561 00:24:32,172 --> 00:24:37,611 - You know, I'm confident that, in less than a year, 562 00:24:37,644 --> 00:24:41,315 you'll be able to go from highway onramp to highway exit 563 00:24:41,348 --> 00:24:43,250 without touching any controls. 564 00:24:43,283 --> 00:24:44,518 [applause] 565 00:24:44,551 --> 00:24:46,119 - So people start buying this. 566 00:24:46,153 --> 00:24:49,356 And even then--and this is typical of Tesla, 567 00:24:49,389 --> 00:24:51,725 and Silicon Valley in general-- 568 00:24:51,758 --> 00:24:53,527 it wasn't ready. 569 00:24:53,560 --> 00:24:57,397 Then there's all this pressure inside Tesla to get it done. 570 00:24:57,431 --> 00:25:00,534 - Elon sets crazy ambitious goals for himself, 571 00:25:00,567 --> 00:25:02,736 and then that translates to crazy ambitious goals 572 00:25:02,769 --> 00:25:04,204 for people around him. 573 00:25:04,238 --> 00:25:07,074 So, often we describe to recruits, 574 00:25:07,107 --> 00:25:08,775 you are not joining the regular army here. 575 00:25:08,809 --> 00:25:11,111 You're joining special forces. 576 00:25:11,144 --> 00:25:16,350 ♪ ♪ 577 00:25:16,383 --> 00:25:21,421 - I got to be an approved Autopilot tester back in 2014. 578 00:25:21,455 --> 00:25:23,624 The first time I drove the car, 579 00:25:23,657 --> 00:25:26,326 went across an intersection in a neighborhood, 580 00:25:26,360 --> 00:25:30,330 and the car went full brake for about half a second, 581 00:25:30,364 --> 00:25:32,733 and then immediately full throttle. 582 00:25:34,501 --> 00:25:37,004 It was really sort of a wakeup call. 583 00:25:37,037 --> 00:25:38,572 This is very experimental. 584 00:25:38,605 --> 00:25:41,141 - I'm testing the latest version of Autopilot 585 00:25:41,175 --> 00:25:42,209 every week. 586 00:25:42,242 --> 00:25:45,779 We wanna make sure that our testing is exhaustive 587 00:25:45,812 --> 00:25:49,116 before we release the software. 
588 00:25:49,149 --> 00:25:50,717 ♪ ♪ 589 00:25:50,751 --> 00:25:52,553 - A lot of the software updates were pushed 590 00:25:52,586 --> 00:25:54,488 to Elon Musk's own car. 591 00:25:54,521 --> 00:25:57,057 Certainly, his opinions were the ones 592 00:25:57,090 --> 00:26:00,294 that were always keeping the team on their toes. 593 00:26:00,327 --> 00:26:03,363 But what it also does is it focuses the team 594 00:26:03,397 --> 00:26:07,167 on certain short-term appeasement projects, 595 00:26:07,201 --> 00:26:09,670 as opposed to developing 596 00:26:09,703 --> 00:26:12,239 a more total, complete solution. 597 00:26:14,241 --> 00:26:17,544 I was concerned about the fact that the software 598 00:26:17,578 --> 00:26:23,217 simply wasn't validated across a wide range of roadways. 599 00:26:23,250 --> 00:26:26,019 And if this is the first step, 600 00:26:26,053 --> 00:26:30,090 in terms of this technology's relationship to the public, 601 00:26:30,123 --> 00:26:34,361 then, you know, it doesn't paint a pretty picture. 602 00:26:34,394 --> 00:26:38,699 And so I started considering other opportunities. 603 00:26:38,732 --> 00:26:41,268 - I mean, I almost--this may sound a little complacent, 604 00:26:41,301 --> 00:26:43,170 but I almost view it as, like, a solved problem. 605 00:26:43,203 --> 00:26:45,105 Like, we know exactly what to do, 606 00:26:45,138 --> 00:26:47,508 and we'll be there in a few years. 607 00:26:52,546 --> 00:26:54,047 - Are self-driving cars 608 00:26:54,081 --> 00:26:55,449 closer than we think? 609 00:26:55,482 --> 00:26:58,318 Well, a few days ago, Tesla CEO Elon Musk tweeted, 610 00:26:58,352 --> 00:27:00,554 "Autopilot goes to wide release on Thursday." 611 00:27:00,587 --> 00:27:02,422 - Unveiled an Autopilot system, 612 00:27:02,456 --> 00:27:06,326 which allows cars to change lanes by themselves. 613 00:27:06,360 --> 00:27:09,429 - It takes a lot of the tedious aspects of driving 614 00:27:09,463 --> 00:27:11,565 and alleviates it from your concern 615 00:27:11,598 --> 00:27:15,402 as you're driving on a long stretch of a highway. 616 00:27:15,435 --> 00:27:17,471 - We were really excited to not only show 617 00:27:17,504 --> 00:27:19,006 the technology to the world, 618 00:27:19,039 --> 00:27:21,441 but also show the potential of the technology. 619 00:27:21,475 --> 00:27:22,576 - You know, in fact, comfortably 620 00:27:22,609 --> 00:27:26,313 within three years, the car will be able to take you 621 00:27:26,346 --> 00:27:27,748 from point to point-- like, basically 622 00:27:27,781 --> 00:27:30,117 from your driveway to work-- 623 00:27:30,150 --> 00:27:31,552 without you touching anything. 624 00:27:31,585 --> 00:27:34,254 And you could be asleep the whole time. 625 00:27:34,288 --> 00:27:37,524 - Tesla, in putting out material blog posts, 626 00:27:37,558 --> 00:27:40,360 made it clear self-driving cars are not here, 627 00:27:40,394 --> 00:27:41,995 they are a long way away. 628 00:27:48,602 --> 00:27:51,705 - When it comes to Tesla and Elon Musk, 629 00:27:51,738 --> 00:27:56,210 the message is constantly going up and down 630 00:27:56,243 --> 00:27:57,477 and up and down, right? 631 00:27:57,511 --> 00:28:00,147 And Elon can change his mind at any moment. 632 00:28:00,180 --> 00:28:02,049 He can say one thing at one moment, 633 00:28:02,082 --> 00:28:04,151 and then he'll say something completely different. 634 00:28:12,226 --> 00:28:14,461 - The car can do almost anything. 
635 00:28:14,494 --> 00:28:19,099 The expectation is that someone is paying attention to the road 636 00:28:19,132 --> 00:28:21,602 and is ready to take over if there is an issue. 637 00:28:26,673 --> 00:28:30,577 - What people need to realize is that it's very easy 638 00:28:30,611 --> 00:28:32,079 to say these things. 639 00:28:32,112 --> 00:28:33,680 And there's no check on him. 640 00:28:33,714 --> 00:28:35,649 - They advise you to keep your hands on the steering wheel 641 00:28:35,682 --> 00:28:37,317 when using the auto-steer, 642 00:28:37,351 --> 00:28:40,654 but as we're in testing, you really don't need to. 643 00:28:40,687 --> 00:28:43,023 - I think, for a lot of the Tesla fans, 644 00:28:43,056 --> 00:28:48,161 they focus in on the things that Elon Musk says 645 00:28:48,195 --> 00:28:49,730 that they want to hear. 646 00:28:51,331 --> 00:28:55,636 So when he says self-driving is a solved problem, 647 00:28:55,669 --> 00:28:57,070 that's what they hear, 648 00:28:57,104 --> 00:28:59,339 and that's what they pay attention to. 649 00:28:59,373 --> 00:29:00,741 - I had an investor say to me, 650 00:29:00,774 --> 00:29:04,011 "You guys have embarked on a really virtuous path, 651 00:29:04,044 --> 00:29:05,546 but it's gonna be a difficult path." 652 00:29:05,579 --> 00:29:08,649 And I said, "Why?" He said, "Well, we're also, you know, 653 00:29:08,682 --> 00:29:11,018 investors in the largest pharma companies in the world, 654 00:29:11,051 --> 00:29:14,688 and it's expected that people will die in a drug trial. 655 00:29:14,721 --> 00:29:18,325 And it happens largely outside of the spotlight of the media." 656 00:29:18,358 --> 00:29:20,794 - A number of videos hit the internet showing drivers 657 00:29:20,827 --> 00:29:22,462 going hands-free, 658 00:29:22,496 --> 00:29:26,266 playing games, even sleeping while the car's in motion. 659 00:29:26,300 --> 00:29:28,502 - He pointed out that "your challenge is gonna be 660 00:29:28,535 --> 00:29:31,138 quite different with Autopilot, 661 00:29:31,171 --> 00:29:35,709 because people don't expect to tolerate deaths on a highway, 662 00:29:35,742 --> 00:29:37,544 and it's going to be in the spotlight." 663 00:29:38,145 --> 00:29:41,515 - Somebody is gonna get in an accident. 664 00:29:41,548 --> 00:29:43,250 Will Tesla be liable for that? 665 00:29:43,283 --> 00:29:44,718 - If there's unfortunately an accident, 666 00:29:44,751 --> 00:29:47,654 the driver is in control of the car. 667 00:29:49,823 --> 00:29:51,658 - I remember Josh talking to me 668 00:29:51,692 --> 00:29:53,760 about the Autopilot system. 669 00:29:53,794 --> 00:29:56,797 I wasn't there yet, in terms of the technology. 670 00:29:56,830 --> 00:29:58,198 You know, he was all-in. 671 00:29:58,232 --> 00:30:01,368 - All right, so this is the new 7.0 firmware 672 00:30:01,401 --> 00:30:03,370 for the Tesla Model S. 673 00:30:03,403 --> 00:30:06,106 - I believe that he felt, you know, very qualified 674 00:30:06,139 --> 00:30:09,443 as to how those features, you know, worked. 675 00:30:09,476 --> 00:30:11,411 He would create these videos. 676 00:30:11,445 --> 00:30:14,114 - This is gonna be a busy intersection, 677 00:30:14,147 --> 00:30:17,351 just so you can see how it reacts with the traffic. 678 00:30:17,384 --> 00:30:19,186 - He was studying it with an eye toward, 679 00:30:19,219 --> 00:30:21,321 "How can I help other people 680 00:30:21,355 --> 00:30:23,524 get the best out of their Tesla, too?" 
681 00:30:23,557 --> 00:30:26,026 - Here's a turn that the auto-steer 682 00:30:26,059 --> 00:30:27,728 is probably going to do very, very poorly, 683 00:30:27,761 --> 00:30:31,365 'cause it's in a turn that's very sharp. 684 00:30:31,398 --> 00:30:33,200 And yep, it said take control, 685 00:30:33,233 --> 00:30:35,035 and he immediately took control. 686 00:30:35,068 --> 00:30:37,104 ♪ ♪ 687 00:30:37,137 --> 00:30:40,641 - I remember something about how Josh had said 688 00:30:40,674 --> 00:30:42,676 the car had helped possibly save his life, 689 00:30:42,709 --> 00:30:45,078 because of something that occurred on the road. 690 00:30:46,313 --> 00:30:48,615 [horn blaring] 691 00:30:53,687 --> 00:30:59,159 - Josh told me that Elon retweeted the video. 692 00:30:59,193 --> 00:31:01,495 And he was so happy about it. 693 00:31:01,528 --> 00:31:04,064 Sort of in the midst of this--you know, 694 00:31:04,097 --> 00:31:06,700 this great wave of technology and what's happening. 695 00:31:06,733 --> 00:31:09,736 And yeah, it was just a great moment for him. 696 00:31:14,374 --> 00:31:15,676 [phone dialing] 697 00:31:15,709 --> 00:31:17,744 [line ringing] 698 00:31:17,778 --> 00:31:20,781 - Levy 911, what is the address of your emergency? 699 00:31:20,814 --> 00:31:22,449 - There was just a wreck. 700 00:31:22,482 --> 00:31:23,684 A head-on collision right here-- 701 00:31:23,717 --> 00:31:25,552 Oh, my God Almighty. 702 00:31:25,586 --> 00:31:30,791 ♪ ♪ 703 00:31:30,824 --> 00:31:36,430 - I was here at work, and probably heard about 704 00:31:36,463 --> 00:31:40,067 the accident an hour after it happened. 705 00:31:42,336 --> 00:31:44,171 - Ken called me. 706 00:31:44,204 --> 00:31:48,509 Told me that Josh had been killed in a car accident. 707 00:31:50,744 --> 00:31:53,247 - Josh and his family had just been on a vacation 708 00:31:53,280 --> 00:31:56,183 to Disney World, in Orlando, 709 00:31:56,216 --> 00:31:59,686 and that he had said goodbye to everybody, 710 00:31:59,720 --> 00:32:01,722 jumped in his car. 711 00:32:01,755 --> 00:32:04,558 - I, Corporal Daphne Yunker, of the Florida Highway Patrol, 712 00:32:04,591 --> 00:32:06,493 am conducting a criminal investigation. 713 00:32:06,527 --> 00:32:10,230 - I noticed the car come over the upper grade 714 00:32:10,264 --> 00:32:11,732 and start coming down, 715 00:32:11,765 --> 00:32:16,003 and the semi turned left, started crossing the highway. 716 00:32:16,036 --> 00:32:20,107 I thought the car was gonna stop, and it didn't. 717 00:32:20,140 --> 00:32:23,677 It was like a white explosion, a cloud. 718 00:32:23,710 --> 00:32:28,015 - Only thing I could think of was how could this happen? 719 00:32:28,048 --> 00:32:33,053 And my heart was broken for his family. 720 00:32:38,692 --> 00:32:43,030 - I was--I was incredulous, and it was a real blow. 721 00:32:43,063 --> 00:32:46,533 Of all people. Of all people. 722 00:32:46,567 --> 00:32:49,203 - It did not seem to speed up or slow down. 723 00:32:49,236 --> 00:32:52,072 Drove right through a little grove of trees 724 00:32:52,105 --> 00:32:54,308 at the--at someone's property line. 725 00:32:54,341 --> 00:32:56,276 - The first question I think that went through my mind 726 00:32:56,310 --> 00:32:59,346 after the accident-- was he in self-drive? 
727 00:32:59,379 --> 00:33:01,748 It was devastating, obviously, to lose a friend, 728 00:33:01,782 --> 00:33:03,517 but it was-- what was frustrating, 729 00:33:03,550 --> 00:33:07,221 I think, to me, was knowing that, 730 00:33:07,254 --> 00:33:10,691 you know, that maybe he was on that front edge of technology, 731 00:33:10,724 --> 00:33:14,061 maybe a little bit further than we would have all liked. 732 00:33:15,629 --> 00:33:17,798 - How do you think that that particular crash 733 00:33:17,831 --> 00:33:20,033 that day could have been prevented? 734 00:33:20,067 --> 00:33:24,671 ♪ ♪ 735 00:33:24,705 --> 00:33:26,607 - I don't know, because I didn't see anything 736 00:33:26,640 --> 00:33:29,409 that was--you know, I don't know. 737 00:33:31,478 --> 00:33:33,180 - When I heard about Josh's accident, 738 00:33:33,213 --> 00:33:36,149 it was personal, in that sense, 739 00:33:36,183 --> 00:33:38,785 that it felt like we'd lost a member of a family. 740 00:33:38,819 --> 00:33:42,789 ♪ ♪ 741 00:33:42,823 --> 00:33:45,025 - Elon had an all-hands meeting 742 00:33:45,058 --> 00:33:46,326 for the Autopilot team. 743 00:33:48,295 --> 00:33:50,063 It was just that, you know, this had happened, 744 00:33:50,097 --> 00:33:54,434 and we're doing all we could to figure it out, 745 00:33:54,468 --> 00:34:00,140 and, you know, we do want to try to make Autopilot safe. 746 00:34:02,042 --> 00:34:03,443 - At the time of that crash, I was aware 747 00:34:03,477 --> 00:34:06,580 that people were trusting the system to do things 748 00:34:06,613 --> 00:34:10,083 that it was not designed or capable of doing. 749 00:34:10,117 --> 00:34:12,719 The fact that that sort of accident happened 750 00:34:12,753 --> 00:34:15,455 was obviously tragic, but it wasn't really-- 751 00:34:15,489 --> 00:34:17,357 wasn't something that-- 752 00:34:17,391 --> 00:34:18,659 it was going to happen. 753 00:34:20,494 --> 00:34:22,296 It was going to happen. 754 00:34:26,834 --> 00:34:29,770 [light music] 755 00:34:29,803 --> 00:34:36,076 ♪ ♪ 756 00:34:37,444 --> 00:34:39,313 - It was the middle of June. 757 00:34:39,346 --> 00:34:42,482 I got an email from our investigatory team 758 00:34:42,516 --> 00:34:45,085 that there had been a Tesla fatality. 759 00:34:45,118 --> 00:34:49,523 ♪ ♪ 760 00:34:49,556 --> 00:34:51,758 The National Highway Traffic Safety Administration, 761 00:34:51,792 --> 00:34:55,729 or NHTSA, has the authority to regulate unreasonable risk 762 00:34:55,762 --> 00:34:57,164 to safety on the roads. 763 00:34:59,433 --> 00:35:03,370 Evening of June 29th, we had scheduled a call with Tesla. 764 00:35:03,403 --> 00:35:05,472 Our general counsel let them know 765 00:35:05,506 --> 00:35:07,674 we'd be opening this investigation 766 00:35:07,708 --> 00:35:10,177 and that it would be made public the following day. 767 00:35:11,712 --> 00:35:14,181 At that point, Elon Musk came on 768 00:35:14,214 --> 00:35:17,317 and just sort of started shouting. 769 00:35:17,351 --> 00:35:18,685 He was really, really upset 770 00:35:18,719 --> 00:35:20,754 that we'd be opening a public investigation, 771 00:35:20,787 --> 00:35:25,259 accusing us of singling Tesla out. 772 00:35:25,292 --> 00:35:27,227 Made the point several times that, you know, 773 00:35:27,261 --> 00:35:31,031 this was one fatality out of more than 35,000 a year, 774 00:35:31,064 --> 00:35:33,700 so why were we picking on Tesla 775 00:35:33,734 --> 00:35:35,469 and suggesting that he would sue us 776 00:35:35,502 --> 00:35:37,137 for opening this investigation. 
777 00:35:39,039 --> 00:35:41,608 I was surprised to hear Elon on the call. 778 00:35:41,642 --> 00:35:44,211 I was surprised to hear how angry he was. 779 00:35:46,180 --> 00:35:48,048 But ultimately, none of that mattered. 780 00:35:48,081 --> 00:35:51,018 Our job was only to worry about the safety, 781 00:35:51,051 --> 00:35:53,220 and this was a clear issue of safety 782 00:35:53,253 --> 00:35:55,422 that needed to be investigated. 783 00:35:57,324 --> 00:36:00,460 - The driver of the semi reported that the Navy vet 784 00:36:00,494 --> 00:36:04,131 was watching a movie while driving. 785 00:36:04,164 --> 00:36:06,533 - Not very long after the accident, 786 00:36:06,567 --> 00:36:08,202 there were all these people 787 00:36:08,235 --> 00:36:11,205 saying really just crass things, 788 00:36:11,238 --> 00:36:16,376 claiming, you know, that Josh was watching a program. 789 00:36:18,345 --> 00:36:20,347 That's not Josh. 790 00:36:20,380 --> 00:36:21,782 Guarantee you that's not Josh. 791 00:36:21,815 --> 00:36:23,383 - What investigators are looking for 792 00:36:23,417 --> 00:36:26,987 is the data leading up to the accident. 793 00:36:28,455 --> 00:36:32,492 - One of the huge challenges of the system at the time 794 00:36:32,526 --> 00:36:37,331 was trying to differentiate between a truck 795 00:36:37,364 --> 00:36:40,200 and a bridge-- and an overhead bridge. 796 00:36:40,234 --> 00:36:42,336 You know, when a truck is parked perpendicular 797 00:36:42,369 --> 00:36:44,204 to the road and blocking the way, 798 00:36:44,238 --> 00:36:46,340 the system might think of it as an overhead bridge, 799 00:36:46,373 --> 00:36:49,676 and so it was safe to kind of continue driving through it. 800 00:36:49,710 --> 00:36:51,512 - Tesla posted to their blog, 801 00:36:51,545 --> 00:36:53,714 calling this incident a tragic loss. 802 00:36:53,747 --> 00:36:56,283 - Tesla said the car ran into a tractor-trailer 803 00:36:56,316 --> 00:36:57,718 because the software didn't notice 804 00:36:57,751 --> 00:37:03,624 the white side of the truck in the brightly lit sky. 805 00:37:03,657 --> 00:37:05,125 - After the crash, 806 00:37:05,158 --> 00:37:09,029 I think Tesla and Musk were pretty defensive. 807 00:37:09,062 --> 00:37:10,631 - If, in writing some article that's negative, 808 00:37:10,664 --> 00:37:12,266 you effectively dissuade people 809 00:37:12,299 --> 00:37:13,534 from using an autonomous vehicle, 810 00:37:13,567 --> 00:37:14,701 you're killing people. 811 00:37:14,735 --> 00:37:17,104 - In the statement that Tesla put out, 812 00:37:17,137 --> 00:37:19,573 they more or less said it was driver error. 813 00:37:20,807 --> 00:37:24,144 They reminded people you have to keep your eyes on the road. 814 00:37:24,178 --> 00:37:25,579 They didn't say Joshua Brown 815 00:37:25,612 --> 00:37:27,181 didn't keep his eyes on the road, 816 00:37:27,214 --> 00:37:29,049 but that's what they were implying. 817 00:37:29,082 --> 00:37:30,517 - Tesla says you should keep your hands 818 00:37:30,551 --> 00:37:33,153 on the steering wheel during the Autopilot. 819 00:37:33,187 --> 00:37:35,455 The question then is... -both: What is the point? 820 00:37:35,489 --> 00:37:36,456 - We both said it. - If I have to hold on 821 00:37:36,490 --> 00:37:38,158 to the wheel? 822 00:37:38,192 --> 00:37:40,527 - Elon had already talked a pretty big game 823 00:37:40,561 --> 00:37:42,729 about what this technology was going to do. 
824 00:37:43,730 --> 00:37:45,365 It's kinda hard to reel it back 825 00:37:45,399 --> 00:37:47,634 if you've already raised people's expectations 826 00:37:47,668 --> 00:37:48,902 and excitement. 827 00:37:48,936 --> 00:37:51,738 - Do you have any regrets about how Tesla rolled out Autopilot 828 00:37:51,772 --> 00:37:53,507 in the cars? 829 00:37:53,540 --> 00:37:55,709 - No, I think--I think we did the right thing. 830 00:37:55,742 --> 00:37:58,412 You know, it's basically advanced driver's assistance, 831 00:37:58,445 --> 00:38:00,113 at this point. 832 00:38:00,147 --> 00:38:02,082 Every single step we took, at least from our standpoint, 833 00:38:02,115 --> 00:38:05,219 was to reduce complacency in the use of Autopilot 834 00:38:05,252 --> 00:38:06,720 and to improve safety. 835 00:38:06,753 --> 00:38:10,791 ♪ ♪ 836 00:38:10,824 --> 00:38:13,260 - This is new technology that's on the roads. 837 00:38:13,293 --> 00:38:15,062 People have a lot of questions. 838 00:38:15,095 --> 00:38:17,598 This one crash was an opportunity to sort of say, 839 00:38:17,631 --> 00:38:19,399 is there a technological problem here 840 00:38:19,433 --> 00:38:22,102 with this--you know, with this Autopilot suite? 841 00:38:24,371 --> 00:38:26,673 The first thing we did was go to Tesla and say, 842 00:38:26,707 --> 00:38:29,610 "Hey, give us all the data you have on crashes 843 00:38:29,643 --> 00:38:30,744 where Autopilot is in use." 844 00:38:33,313 --> 00:38:34,781 What we knew is that there were a lot of crashes. 845 00:38:34,815 --> 00:38:36,717 And this is not surprising and not necessarily 846 00:38:36,750 --> 00:38:38,318 a cause for concern. 847 00:38:38,352 --> 00:38:42,623 There are a lot of crashes on the roadways in the U.S. 848 00:38:42,656 --> 00:38:44,791 So yeah, there's 38 separate crashes 849 00:38:44,825 --> 00:38:47,027 that we're looking at here. 850 00:38:47,060 --> 00:38:49,596 - The world doesn't know about these other crashes, 851 00:38:49,630 --> 00:38:52,065 because Tesla hasn't made them public. 852 00:38:52,099 --> 00:38:55,269 Tesla's saying Autopilot is safer, 853 00:38:55,302 --> 00:38:57,237 but what we're seeing with these crashes 854 00:38:57,271 --> 00:38:59,139 are these gray areas. 855 00:38:59,173 --> 00:39:01,508 - In the Tesla case, what we were looking at was: 856 00:39:01,542 --> 00:39:02,776 was there a pattern showing 857 00:39:02,809 --> 00:39:04,745 that there is a technological defect, 858 00:39:04,778 --> 00:39:06,580 or that people were using Autopilot 859 00:39:06,613 --> 00:39:09,383 beyond the way that it was designed to be used? 860 00:39:09,416 --> 00:39:11,218 - The internal pressure was, 861 00:39:11,251 --> 00:39:13,120 "We gotta get this problem solved, pronto." 862 00:39:15,189 --> 00:39:16,557 - When Autopilot first came out, 863 00:39:16,590 --> 00:39:19,026 the main way of making sure the driver 864 00:39:19,059 --> 00:39:20,761 was paying attention-- it could detect 865 00:39:20,794 --> 00:39:23,430 whether your hand was on the steering wheel. 866 00:39:23,463 --> 00:39:25,732 It would let you keep your hand off the steering wheel 867 00:39:25,766 --> 00:39:28,502 for minutes at a time--three, four, five minutes. 868 00:39:29,469 --> 00:39:32,339 - It was just too long between those warnings.
869 00:39:32,372 --> 00:39:34,308 What we had to do was struggle with how to do that 870 00:39:34,341 --> 00:39:37,411 in an elegant way that would keep consumers engaged 871 00:39:37,444 --> 00:39:41,215 and not--and not cause them to ignore or be frustrated by it. 872 00:39:41,248 --> 00:39:43,250 - After weeks of controversy and questions 873 00:39:43,283 --> 00:39:44,418 about the safety of its Autopilot drivers. 874 00:39:44,451 --> 00:39:46,086 - What it calls major improvements 875 00:39:46,119 --> 00:39:47,454 to its Autopilot software. 876 00:39:48,455 --> 00:39:50,657 - They announced there would be this press conference, 877 00:39:50,691 --> 00:39:51,992 and Elon would talk about it. 878 00:39:52,593 --> 00:39:54,528 - Something quite significant is, 879 00:39:54,561 --> 00:39:57,130 if the user ignores repeated warnings, 880 00:39:57,164 --> 00:39:59,032 more than three times in an hour, 881 00:39:59,066 --> 00:40:02,503 then the driver will have to park the car and restart it. 882 00:40:02,536 --> 00:40:06,173 - There would be more frequent warnings, shorter intervals, 883 00:40:06,206 --> 00:40:08,141 up to three minutes. 884 00:40:08,175 --> 00:40:10,210 You would get a chime to remind you 885 00:40:10,244 --> 00:40:13,480 to put your hands back on. 886 00:40:13,514 --> 00:40:16,650 But that's still a system that has a lot of gaps. 887 00:40:16,683 --> 00:40:19,186 In terms of taking your eyes off the road, 888 00:40:19,219 --> 00:40:22,089 30 seconds is an eternity. 889 00:40:22,122 --> 00:40:24,191 - I really feel like we've struck a great balance 890 00:40:24,224 --> 00:40:29,229 between improving the safety and improving the usefulness. 891 00:40:29,263 --> 00:40:33,233 - I remember Elon talked about how it was gonna be the radar 892 00:40:33,267 --> 00:40:38,438 that was sort of first-rank or priority one. 893 00:40:38,472 --> 00:40:42,276 - We're making much more effective use of radar. 894 00:40:42,309 --> 00:40:47,047 - I just thought radar has been around for 75 years. 895 00:40:47,080 --> 00:40:51,051 If they could do this now, why didn't they do it before? 896 00:40:51,084 --> 00:40:52,686 I think the timing was significant. 897 00:40:52,719 --> 00:40:57,457 I mean, it was right after this tragic accident. 898 00:40:57,491 --> 00:41:00,527 And they were trying to make it sound like, 899 00:41:00,561 --> 00:41:02,529 "We got this under control." 900 00:41:02,563 --> 00:41:04,464 - Obvious question I have to ask: 901 00:41:04,498 --> 00:41:07,301 would the improvements have mitigated 902 00:41:07,334 --> 00:41:10,237 or saved, say, Josh Brown's life? 903 00:41:10,270 --> 00:41:12,739 ♪ ♪ 904 00:41:12,773 --> 00:41:15,342 - We believe it would have. 905 00:41:15,375 --> 00:41:18,745 - And so, the truck would have been seen by the radar only, 906 00:41:18,779 --> 00:41:20,747 and braking would have been engaged. 907 00:41:24,051 --> 00:41:26,553 - These things cannot be said with absolute certainty, 908 00:41:26,587 --> 00:41:28,755 but we believe it is very likely that, 909 00:41:28,789 --> 00:41:31,158 yes, it would have. 910 00:41:37,097 --> 00:41:38,398 - Yeah, I mean, there have been so many announcements of, 911 00:41:38,432 --> 00:41:39,499 like, autonomous EV startups. 912 00:41:39,533 --> 00:41:40,767 I'm waiting for my mom to announce one. 913 00:41:40,801 --> 00:41:42,069 - Okay. [laughter] 914 00:41:42,102 --> 00:41:44,705 - It's like, "Mom, you too?"
[laughter] 915 00:41:44,738 --> 00:41:46,673 - Speaking of that, when you're talking about the sales, 916 00:41:46,707 --> 00:41:50,010 you have booked how many orders for-- 917 00:41:50,043 --> 00:41:51,778 - It's on the order of 400,000. - 400,000. 918 00:41:51,812 --> 00:41:53,580 - It's quite surprising, actually. 919 00:41:53,614 --> 00:41:56,083 I mean, the-- 920 00:41:56,116 --> 00:41:58,318 'cause we didn't do any advertising. 921 00:41:58,352 --> 00:42:00,320 - Elon had, I think, in some ways, 922 00:42:00,354 --> 00:42:01,722 a personal point of pride 923 00:42:01,755 --> 00:42:05,058 to be able to move faster than the competition. 924 00:42:05,092 --> 00:42:08,729 [cheers and applause] 925 00:42:08,762 --> 00:42:11,164 The company was betting its survival 926 00:42:11,198 --> 00:42:13,367 on the success of the Model 3. 927 00:42:13,400 --> 00:42:15,469 And the fact that Autopilot was gonna be on it, 928 00:42:15,502 --> 00:42:17,337 I think was a huge selling point. 929 00:42:17,371 --> 00:42:20,007 - If you think about fully autonomous vehicles, 930 00:42:20,040 --> 00:42:22,709 how far do you think we are from that becoming a reality? 931 00:42:22,743 --> 00:42:25,179 - I think we're basically 932 00:42:25,212 --> 00:42:28,315 less than two years away from complete autonomy. 933 00:42:28,348 --> 00:42:31,084 - Wow. - Complete. Safer than a human. 934 00:42:31,118 --> 00:42:33,153 - As with a lot of what happens with Elon, 935 00:42:33,187 --> 00:42:35,689 he doubles down on it over and over and over again. 936 00:42:35,722 --> 00:42:37,457 And he continues with his message, right, 937 00:42:37,491 --> 00:42:39,193 that, you know, this is gonna be 938 00:42:39,226 --> 00:42:41,061 a safe thing for the world. 939 00:42:41,094 --> 00:42:42,596 You know, the Joshua Brown crash 940 00:42:42,629 --> 00:42:45,032 was in the spring of 2016. 941 00:42:45,065 --> 00:42:48,202 By the fall of 2016, the entire Autopilot team 942 00:42:48,235 --> 00:42:51,205 essentially quit what they were doing, 943 00:42:51,238 --> 00:42:54,608 and they all chipped in on this video 944 00:42:54,641 --> 00:42:58,345 to show just how autonomous, 945 00:42:58,378 --> 00:43:00,280 so to speak, their car could be. 946 00:43:00,314 --> 00:43:03,283 [rock music] 947 00:43:03,317 --> 00:43:10,157 ♪ ♪ 948 00:43:16,730 --> 00:43:19,066 - Do you remember this video? 949 00:43:19,099 --> 00:43:20,734 - Yeah. 950 00:43:20,767 --> 00:43:23,370 - Changed lanes, and stopped just in-- 951 00:43:23,403 --> 00:43:24,705 just short of a crosswalk. 952 00:43:24,738 --> 00:43:27,274 We're turning right onto-- yeah, kind of 953 00:43:27,307 --> 00:43:28,675 in front of traffic, but-- 954 00:43:28,709 --> 00:43:32,646 - It's very slick, but the video does not give you 955 00:43:32,679 --> 00:43:36,517 a full impression of what is actually happening. 956 00:43:36,550 --> 00:43:38,252 - The people that were putting it together 957 00:43:38,285 --> 00:43:41,188 were sitting right behind me. 958 00:43:41,221 --> 00:43:44,558 And the Autopilot group was running lap after lap that day, 959 00:43:44,591 --> 00:43:46,426 to try to get a clean lap. 960 00:43:46,460 --> 00:43:51,098 - At one point, the car, while in Autopilot mode, 961 00:43:51,131 --> 00:43:53,100 hit a fence. 962 00:43:53,133 --> 00:43:55,736 They patched the car up, and they did another run. 963 00:43:55,769 --> 00:43:57,337 - And so, at the very end of the day, 964 00:43:57,371 --> 00:43:59,473 apparently the clean lap came in. 
965 00:43:59,506 --> 00:44:01,375 They started editing it all together. 966 00:44:01,408 --> 00:44:03,410 ♪ ♪ 967 00:44:03,443 --> 00:44:05,345 - This was meant to be a demo video 968 00:44:05,379 --> 00:44:07,214 of what the team was working on and developing, 969 00:44:07,247 --> 00:44:10,551 and what its capability could deliver in the future. 970 00:44:10,584 --> 00:44:12,419 - I think my biggest problem with the video 971 00:44:12,452 --> 00:44:15,489 was the first line that says the driver 972 00:44:15,522 --> 00:44:17,391 was only there for legal reasons. 973 00:44:17,424 --> 00:44:19,393 ♪ ♪ 974 00:44:19,426 --> 00:44:21,128 I think it's definitely language 975 00:44:21,161 --> 00:44:22,796 that's designed for marketing. 976 00:44:22,829 --> 00:44:25,098 We are trying to imply that the thing 977 00:44:25,132 --> 00:44:26,567 is fully capable of self-driving, 978 00:44:26,600 --> 00:44:30,404 and only the evil regulators are holding us back. 979 00:44:30,437 --> 00:44:31,505 - They sort of portrayed it as something 980 00:44:31,538 --> 00:44:32,706 all their cars can do, 981 00:44:32,739 --> 00:44:35,776 and that, I don't think, was really fair. 982 00:44:35,809 --> 00:44:38,712 [cheering] 983 00:44:38,745 --> 00:44:40,347 - Not too long after that, 984 00:44:40,380 --> 00:44:43,150 Tesla started offering an official service called 985 00:44:43,183 --> 00:44:49,189 Full Self-Driving, capital FSD, for as much as $10,000. 986 00:44:49,223 --> 00:44:50,691 Now, in the short-term, 987 00:44:50,724 --> 00:44:53,360 what they and everybody else were really buying 988 00:44:53,393 --> 00:44:56,463 was the promise that this is gonna happen. 989 00:44:56,496 --> 00:44:58,765 - The idea was we were putting the hardware on every car 990 00:44:58,799 --> 00:45:01,134 in advance of having the software. 991 00:45:01,168 --> 00:45:04,104 It was a gutsy move because then the software 992 00:45:04,137 --> 00:45:08,208 had to be developed to deliver the capability. 993 00:45:08,242 --> 00:45:09,776 - We're still on track for being able 994 00:45:09,810 --> 00:45:13,080 to go cross-country, from L.A. to New York 995 00:45:13,113 --> 00:45:15,382 by the end of the year, fully autonomous. 996 00:45:15,415 --> 00:45:17,784 - There was a sincere belief inside of Tesla, and Elon 997 00:45:17,818 --> 00:45:19,219 had the sincere belief that 998 00:45:19,253 --> 00:45:21,388 hey, we're just around the corner. 999 00:45:21,421 --> 00:45:23,090 - A lot of people at the time believed 1000 00:45:23,123 --> 00:45:25,092 that Tesla had an advantage 1001 00:45:25,125 --> 00:45:27,561 in getting self-driving to the market first, 1002 00:45:27,594 --> 00:45:30,364 because it already had all the cars on the road 1003 00:45:30,397 --> 00:45:32,499 that could be collecting data all the time, 1004 00:45:32,533 --> 00:45:36,703 and that data would help train the computer to be better. 1005 00:45:36,737 --> 00:45:38,438 - So, you've already got a fleet of Teslas 1006 00:45:38,472 --> 00:45:40,641 driving all these roads. - Yeah. 1007 00:45:40,674 --> 00:45:42,543 - You're accumulating a huge amount of data. 1008 00:45:42,576 --> 00:45:44,144 - Yes. 1009 00:45:44,178 --> 00:45:47,548 - I expected to see sophisticated infrastructure 1010 00:45:47,581 --> 00:45:51,485 to collect that data, to process that data.
1011 00:45:51,518 --> 00:45:54,321 The reality was, for a lot of the types of data 1012 00:45:54,354 --> 00:45:56,657 that you would want to collect from the car, 1013 00:45:56,690 --> 00:46:00,394 like video data, high-quality images, 1014 00:46:00,427 --> 00:46:04,231 there was neither the hardware nor the backend infrastructure 1015 00:46:04,264 --> 00:46:08,101 to allow that volume of data to reach Tesla. 1016 00:46:08,135 --> 00:46:11,572 And so that rate of learning wasn't great. 1017 00:46:13,407 --> 00:46:16,143 - Elon, he put eight cameras on the car. 1018 00:46:16,176 --> 00:46:17,578 I don't think that was enough, 1019 00:46:17,611 --> 00:46:20,180 because they were not redundant, 1020 00:46:20,214 --> 00:46:22,182 other than the front cameras. 1021 00:46:22,216 --> 00:46:24,751 You really need redundancy, so if one of these sensors fails, 1022 00:46:24,785 --> 00:46:28,121 the car can either stop itself in a safe manner 1023 00:46:28,155 --> 00:46:30,057 or it can continue driving. 1024 00:46:30,090 --> 00:46:32,526 - There was a small space right in front of the car 1025 00:46:32,559 --> 00:46:37,297 that was completely out of the view for any of the cameras. 1026 00:46:37,331 --> 00:46:39,132 And so, you know, a small dog 1027 00:46:39,166 --> 00:46:42,102 or a baby could crawl in front of the car, 1028 00:46:42,135 --> 00:46:44,271 and the car wouldn't be able to know 1029 00:46:44,304 --> 00:46:48,442 whether it's safe to move forward or start to drive. 1030 00:46:53,347 --> 00:46:55,415 It was hard for me to personally believe 1031 00:46:55,449 --> 00:46:58,051 that promise was gonna be lived up to, 1032 00:46:58,085 --> 00:46:59,386 that we could be confident 1033 00:46:59,419 --> 00:47:01,722 that this was gonna enable full self-driving. 1034 00:47:01,755 --> 00:47:04,491 ♪ ♪ 1035 00:47:04,525 --> 00:47:07,127 - Sometime after the Joshua Brown crash, 1036 00:47:07,160 --> 00:47:10,230 the head of Autopilot left Tesla. 1037 00:47:10,264 --> 00:47:11,498 You know, it just gave the image 1038 00:47:11,532 --> 00:47:13,534 of some sort of instability there. 1039 00:47:13,567 --> 00:47:16,136 - There was a sense that when Elon felt that things 1040 00:47:16,170 --> 00:47:17,271 were not going well, 1041 00:47:17,304 --> 00:47:20,174 there were efforts to shake things up. 1042 00:47:20,207 --> 00:47:23,410 There were members of the team that I learned were fired. 1043 00:47:23,443 --> 00:47:27,014 I never knew why. They just stopped showing up. 1044 00:47:27,047 --> 00:47:30,017 [tense music] 1045 00:47:30,050 --> 00:47:31,451 ♪ ♪ 1046 00:47:31,485 --> 00:47:35,589 Theranos was happening during that same time period. 1047 00:47:35,622 --> 00:47:38,158 And a lot of the stories were kind of, like, 1048 00:47:38,192 --> 00:47:39,526 at the back of my mind, 1049 00:47:39,560 --> 00:47:43,397 and it just definitely made me question a lot more 1050 00:47:43,430 --> 00:47:49,002 about what's behind some of this public optimism. 1051 00:47:49,036 --> 00:47:53,774 ♪ ♪ 1052 00:47:53,807 --> 00:47:55,375 After I left Tesla, 1053 00:47:55,409 --> 00:47:58,745 I felt like I had to do a bit of soul searching, 1054 00:47:58,779 --> 00:48:03,517 just because I feel like sometimes it seems like 1055 00:48:03,550 --> 00:48:06,153 people and companies 1056 00:48:06,186 --> 00:48:09,356 were being rewarded not for telling the truth 1057 00:48:09,389 --> 00:48:13,026 but in fact for doing maybe a bit of the opposite.
1058 00:48:13,060 --> 00:48:17,364 ♪ ♪ 1059 00:48:17,397 --> 00:48:19,466 - It was my last day on the job at NHTSA 1060 00:48:19,499 --> 00:48:23,203 when we were ready to release that report. 1061 00:48:23,237 --> 00:48:25,239 It was the end of the Obama Administration, 1062 00:48:25,272 --> 00:48:27,641 and so we made sort of an internal commitment 1063 00:48:27,674 --> 00:48:31,278 to say we're not gonna leave this to the next guys. 1064 00:48:31,311 --> 00:48:34,214 - A months-long investigation into Tesla's Autopilot system 1065 00:48:34,248 --> 00:48:36,083 has wrapped up. - There was no defect, 1066 00:48:36,116 --> 00:48:37,651 and therefore there will be no recall 1067 00:48:37,684 --> 00:48:39,353 related to Tesla's Autopilot. 1068 00:48:39,386 --> 00:48:42,656 - Essentially clearing the company. 1069 00:48:42,689 --> 00:48:45,425 - I was a little bit dumbfounded. 1070 00:48:45,459 --> 00:48:50,130 The system couldn't see a tractor-trailer, 1071 00:48:50,163 --> 00:48:52,165 and that's not a defect? 1072 00:48:52,199 --> 00:48:55,536 ♪ ♪ 1073 00:48:55,569 --> 00:48:57,337 - So, this is--you know, it's a little complicated, 1074 00:48:57,371 --> 00:48:59,706 and almost counterintuitive, right? 1075 00:48:59,740 --> 00:49:03,177 Autopilot didn't even engage to try to stop that crash. 1076 00:49:03,210 --> 00:49:05,145 But the fact of the matter is Autopilot 1077 00:49:05,179 --> 00:49:09,516 wasn't designed to stop every crash in every instance. 1078 00:49:09,550 --> 00:49:12,085 It was a driver-assistance system. 1079 00:49:12,119 --> 00:49:14,588 It wasn't a full self-driving system. 1080 00:49:14,621 --> 00:49:17,124 - Tesla issued a statement saying it appreciated 1081 00:49:17,157 --> 00:49:19,359 the thoroughness of the investigation. 1082 00:49:19,393 --> 00:49:22,396 - My personal point of view was it's clear this technology 1083 00:49:22,429 --> 00:49:24,231 is being misused right now. 1084 00:49:24,264 --> 00:49:27,100 We saw people were pushing the limits on the system, 1085 00:49:27,134 --> 00:49:28,435 but--and this was early on 1086 00:49:28,468 --> 00:49:30,571 in the deployment of the technology, 1087 00:49:30,604 --> 00:49:34,341 and at the time, there wasn't enough data to show 1088 00:49:34,374 --> 00:49:36,510 that there was a technological defect here. 1089 00:49:37,544 --> 00:49:41,782 - I remember the day the news came out that crashes dropped 1090 00:49:41,815 --> 00:49:46,587 40% after the Autopilot component was added. 1091 00:49:46,620 --> 00:49:48,555 A lot of news articles repeated it 1092 00:49:48,589 --> 00:49:50,324 because NHTSA had said it, 1093 00:49:50,357 --> 00:49:52,759 and that gave it some legitimacy. 1094 00:49:52,793 --> 00:49:55,562 You know, I mean, if the regulators are saying it, 1095 00:49:55,596 --> 00:49:58,232 it must be true. 1096 00:49:58,265 --> 00:50:00,100 - You know, I think that's an unfortunate statistic 1097 00:50:00,133 --> 00:50:02,469 that didn't probably belong in the report. 1098 00:50:02,503 --> 00:50:04,505 It was based on data provided by the company 1099 00:50:04,538 --> 00:50:06,173 that hadn't been sort of 1100 00:50:06,206 --> 00:50:09,409 independently verified or vetted. 1101 00:50:09,443 --> 00:50:12,145 - Eventually, some independent researchers 1102 00:50:12,179 --> 00:50:13,747 started looking at the crash data 1103 00:50:13,780 --> 00:50:17,050 and didn't believe it was valid. 
1104 00:50:17,084 --> 00:50:20,254 - But Tesla was very eager to pick up on that statistic 1105 00:50:20,287 --> 00:50:23,690 and use it to sort of say "not only is Autopilot good, 1106 00:50:23,724 --> 00:50:25,392 it's better than human drivers." 1107 00:50:25,425 --> 00:50:28,762 - NHTSA did a study on Tesla's Autopilot version 1, 1108 00:50:28,795 --> 00:50:30,697 which was relatively primitive, 1109 00:50:30,731 --> 00:50:35,002 and found that it was a 45% reduction in highway accidents. 1110 00:50:35,035 --> 00:50:37,638 - You know, I think that was a successful PR move 1111 00:50:37,671 --> 00:50:39,339 on their part. 1112 00:50:39,373 --> 00:50:42,442 Musk and Tesla, they're master marketers. 1113 00:50:47,114 --> 00:50:50,083 [light music] 1114 00:50:50,117 --> 00:50:51,718 ♪ ♪ 1115 00:50:51,752 --> 00:50:54,621 - What scares you the most about autonomous cars? 1116 00:50:54,655 --> 00:50:58,592 - I think people are wildly underestimating the complexity 1117 00:50:58,625 --> 00:51:00,661 of bringing automation into this system. 1118 00:51:03,730 --> 00:51:06,099 My name is Christopher Hart. I am the former Chairman 1119 00:51:06,133 --> 00:51:09,169 of the National Transportation Safety Board. 1120 00:51:09,203 --> 00:51:11,104 The NTSB is the federal agency 1121 00:51:11,138 --> 00:51:12,639 that was created to investigate 1122 00:51:12,673 --> 00:51:14,708 transportation accidents and make recommendations 1123 00:51:14,741 --> 00:51:16,109 to try to prevent the accidents 1124 00:51:16,143 --> 00:51:18,679 from happening again. 1125 00:51:18,712 --> 00:51:21,315 When I first heard about that Tesla crash, 1126 00:51:21,348 --> 00:51:24,318 I knew enough about automation from my own aviation experience 1127 00:51:24,351 --> 00:51:26,587 that I knew it was not gonna be as simple as people thought. 1128 00:51:27,588 --> 00:51:31,258 It took a year-plus to investigate. 1129 00:51:31,291 --> 00:51:34,461 Then, September 2017, there was a public hearing. 1130 00:51:34,494 --> 00:51:35,696 - Welcome to the boardroom 1131 00:51:35,729 --> 00:51:37,731 of the National Transportation Safety Board. 1132 00:51:37,764 --> 00:51:40,434 We were very curious about this particular crash. 1133 00:51:40,467 --> 00:51:44,671 And the further we got into it, the more we started realizing 1134 00:51:44,705 --> 00:51:46,540 that, wow, there are a lot of issues here 1135 00:51:46,573 --> 00:51:48,275 that really need to be looked at. 1136 00:51:48,308 --> 00:51:49,610 It is our sincere hope 1137 00:51:49,643 --> 00:51:52,613 that the lessons learned from this tragedy 1138 00:51:52,646 --> 00:51:56,016 can help prevent future tragedies. 1139 00:51:56,049 --> 00:52:00,654 An accident is rarely the result of just one factor. 1140 00:52:00,687 --> 00:52:04,191 In this crash, one would be, of course, the truck driver 1141 00:52:04,224 --> 00:52:08,061 pulling across the lane, when he shouldn't have. 1142 00:52:08,095 --> 00:52:11,231 But I would say that there was also the automation complacency 1143 00:52:11,265 --> 00:52:14,434 associated with the design of the Tesla vehicle. 1144 00:52:16,203 --> 00:52:17,437 For the up to ten seconds 1145 00:52:17,471 --> 00:52:19,339 that there would have been a line of sight 1146 00:52:19,373 --> 00:52:21,708 between this Tesla and the vehicle 1147 00:52:21,742 --> 00:52:23,377 that was crossing in front of him, 1148 00:52:23,410 --> 00:52:27,548 there was the opportunity to avoid this crash.
1149 00:52:27,581 --> 00:52:30,517 We could not determine exactly what he was doing 1150 00:52:30,551 --> 00:52:33,654 in this crash. 1151 00:52:33,687 --> 00:52:35,289 We certainly heard those rumors 1152 00:52:35,322 --> 00:52:38,058 about the driver watching videos. 1153 00:52:38,091 --> 00:52:40,794 But we had no evidence of that at all. 1154 00:52:40,827 --> 00:52:43,397 Did you find any evidence at all 1155 00:52:43,430 --> 00:52:45,065 that the driver of the Tesla 1156 00:52:45,098 --> 00:52:48,202 may have been watching a movie while driving this car? 1157 00:52:48,235 --> 00:52:50,103 - We looked through his laptop, 1158 00:52:50,137 --> 00:52:53,273 and there were no movies on that laptop. 1159 00:52:53,307 --> 00:52:57,177 - We, at the NTSB, really feel like the drivers 1160 00:52:57,211 --> 00:53:01,682 have the tendency to disengage when the Autopilot 1161 00:53:01,715 --> 00:53:04,751 is engaged with the Tesla vehicle. 1162 00:53:04,785 --> 00:53:06,620 Complacency creeps in over time, 1163 00:53:06,653 --> 00:53:09,156 and you develop overconfidence with the system. 1164 00:53:09,189 --> 00:53:12,192 - I was concerned about the use of the term "Autopilot," 1165 00:53:12,226 --> 00:53:13,527 because there are too many people 1166 00:53:13,560 --> 00:53:15,028 who construe the term "Autopilot" 1167 00:53:15,062 --> 00:53:18,332 to mean human engagement is no longer necessary. 1168 00:53:18,365 --> 00:53:20,300 - They advise you to keep your hands on the steering wheel 1169 00:53:20,334 --> 00:53:21,702 when using the auto-steer, 1170 00:53:21,735 --> 00:53:25,305 but as we're in testing, you really don't need to. 1171 00:53:25,339 --> 00:53:27,374 [beeping] - The Autopilot 1172 00:53:27,407 --> 00:53:28,775 is supposed to have a system 1173 00:53:28,809 --> 00:53:32,379 where it can detect driver engagement. 1174 00:53:32,412 --> 00:53:33,780 There were periods of time-- 1175 00:53:33,814 --> 00:53:36,283 almost six minutes-- where his hands 1176 00:53:36,316 --> 00:53:38,685 were not even detected to be on the steering wheel. 1177 00:53:38,719 --> 00:53:41,588 We felt that the system of determining 1178 00:53:41,622 --> 00:53:46,193 driver engagement was poor. 1179 00:53:46,226 --> 00:53:48,095 - Another issue is, at that time, 1180 00:53:48,128 --> 00:53:49,763 no manufacturer had a system 1181 00:53:49,796 --> 00:53:52,366 to reliably sense crossing traffic. 1182 00:53:52,399 --> 00:53:54,334 That's why these systems are supposed to be used 1183 00:53:54,368 --> 00:53:57,604 only on roads that don't have crossing traffic. 1184 00:53:57,638 --> 00:54:00,641 - This road, Highway 27A, in Florida, 1185 00:54:00,674 --> 00:54:04,244 was not a limited access road. But the question is, 1186 00:54:04,278 --> 00:54:07,047 if the system is not supposed to be operated 1187 00:54:07,080 --> 00:54:10,083 on anything other than a highway, 1188 00:54:10,117 --> 00:54:13,587 why does the system allow it to be operated 1189 00:54:13,620 --> 00:54:16,356 on other types of roadways? 1190 00:54:16,390 --> 00:54:19,560 It's like having a swimming pool without a fence around it. 1191 00:54:19,593 --> 00:54:22,229 It's a--it's an attractive nuisance. 1192 00:54:22,262 --> 00:54:25,265 Tesla allowed the driver to use the system 1193 00:54:25,299 --> 00:54:28,268 outside of the environment for which it was designed. 1194 00:54:28,302 --> 00:54:32,105 The result was a collision that, frankly, 1195 00:54:32,139 --> 00:54:33,307 should have never happened.
1196 00:54:35,375 --> 00:54:38,045 - The ultimate paradox is that the better the automation gets 1197 00:54:38,078 --> 00:54:40,547 after removing the human, the more challenging 1198 00:54:40,581 --> 00:54:44,718 the human-automation interface issues become. 1199 00:54:44,751 --> 00:54:46,220 While the human is the most 1200 00:54:46,253 --> 00:54:47,588 unpredictable and variable part 1201 00:54:47,621 --> 00:54:50,090 of the whole system, it is also, at the same time, 1202 00:54:50,123 --> 00:54:51,758 the most adaptable part of the system 1203 00:54:51,792 --> 00:54:54,728 when you need adaptation. 1204 00:54:54,761 --> 00:54:56,496 I think Elon is an IT wizard, 1205 00:54:56,530 --> 00:54:59,199 and I think that IT wizardry is going to help us 1206 00:54:59,233 --> 00:55:01,702 get where we want to go, but we need to do it in a way 1207 00:55:01,735 --> 00:55:04,438 that encompasses the human element as well. 1208 00:55:04,471 --> 00:55:06,373 - I think that the manufacturers 1209 00:55:06,406 --> 00:55:12,312 have a role in preventing this automation complacency. 1210 00:55:12,346 --> 00:55:15,749 You can't buy a production model self-driving car 1211 00:55:15,782 --> 00:55:19,052 from any automobile maker today. 1212 00:55:19,086 --> 00:55:21,788 Anyone who says that you can is misleading you, 1213 00:55:21,822 --> 00:55:24,591 and anyone who leaves that impression 1214 00:55:24,625 --> 00:55:26,727 is leaving the wrong impression. 1215 00:55:29,563 --> 00:55:32,733 To Tesla, we issued several recommendations-- 1216 00:55:32,766 --> 00:55:36,370 basically, do not allow the system 1217 00:55:36,403 --> 00:55:38,071 to be operated on roadways 1218 00:55:38,105 --> 00:55:41,141 where it's not designed to be operated. 1219 00:55:41,175 --> 00:55:44,211 And another recommendation was you need a better way 1220 00:55:44,244 --> 00:55:46,413 of determining driver engagement. 1221 00:55:46,446 --> 00:55:50,684 I do feel that if those recs are accomplished, 1222 00:55:50,717 --> 00:55:52,152 safety will be improved. 1223 00:55:53,353 --> 00:55:54,521 - Tesla releasing a statement 1224 00:55:54,555 --> 00:55:57,724 saying customer safety comes first. 1225 00:55:57,758 --> 00:56:00,594 - Boulder Crest is a nonprofit that takes care 1226 00:56:00,627 --> 00:56:03,096 of men and women who are suffering with PTSD. 1227 00:56:05,365 --> 00:56:07,134 The building we're standing in today 1228 00:56:07,167 --> 00:56:09,603 is the Josh Brown Center for Innovation. 1229 00:56:09,636 --> 00:56:11,638 We had this great dedication ceremony 1230 00:56:11,672 --> 00:56:13,307 the day we opened this. 1231 00:56:13,340 --> 00:56:14,675 The family came back to us 1232 00:56:14,708 --> 00:56:17,077 with this amazing paragraph that they wanted 1233 00:56:17,110 --> 00:56:20,747 the Director of the Boulder Crest Institute to read. 1234 00:56:20,781 --> 00:56:24,585 - Joshua believed, and our family continues to believe, 1235 00:56:24,618 --> 00:56:26,787 that the new technology going into cars 1236 00:56:26,820 --> 00:56:28,522 and the move to autonomous driving 1237 00:56:28,555 --> 00:56:30,591 has already saved many lives. 1238 00:56:30,624 --> 00:56:32,759 Change always comes with risks, 1239 00:56:32,793 --> 00:56:34,494 and zero tolerance for deaths 1240 00:56:34,528 --> 00:56:38,232 would totally stop innovations and improvements. 1241 00:56:38,265 --> 00:56:41,301 Nobody wants tragedy to touch their family. 
1242 00:56:41,335 --> 00:56:43,704 But expecting to identify all limitations 1243 00:56:43,737 --> 00:56:45,506 of an emerging technology, 1244 00:56:45,539 --> 00:56:49,209 and expecting perfection is not feasible either. 1245 00:56:49,243 --> 00:56:51,278 - And that, to me, just-- I mean, it-- 1246 00:56:51,311 --> 00:56:53,347 I think it brought tears to everybody in that audience 1247 00:56:53,380 --> 00:56:55,182 that knew Josh Brown. 1248 00:56:55,215 --> 00:56:58,018 - Part of Joshua's legacy is that the accident 1249 00:56:58,051 --> 00:56:59,286 drove additional improvements, 1250 00:56:59,319 --> 00:57:02,055 making the new technology even safer. 1251 00:57:02,089 --> 00:57:05,058 Our family takes solace and pride in the fact 1252 00:57:05,092 --> 00:57:08,061 that our son is making such a positive impact 1253 00:57:08,095 --> 00:57:10,564 on future highway safety. 1254 00:57:10,597 --> 00:57:12,099 - We all sat there and thought, 1255 00:57:12,132 --> 00:57:15,769 we have to learn a lesson from what happened. 1256 00:57:15,802 --> 00:57:17,504 When you think of somebody like Josh, 1257 00:57:17,538 --> 00:57:19,306 who was on that leading edge, 1258 00:57:19,339 --> 00:57:22,042 he was gonna test that car. 1259 00:57:22,075 --> 00:57:23,677 I think there's a false sense of security 1260 00:57:23,710 --> 00:57:26,580 when you put these options in front of people. 1261 00:57:35,471 --> 00:57:36,605 - Wouldn't hurt to have more love in the world. 1262 00:57:36,639 --> 00:57:37,907 - How you gonna fix that? 1263 00:57:37,940 --> 00:57:39,341 You have a love machine you're working on? 1264 00:57:39,375 --> 00:57:40,476 [laughter] 1265 00:57:40,509 --> 00:57:44,046 - No, but probably spend more time with your friends 1266 00:57:44,079 --> 00:57:46,882 and less time on social media. 1267 00:57:49,018 --> 00:57:51,520 I mean, the only thing I've kept is Twitter, 1268 00:57:51,554 --> 00:57:53,255 because I kinda, like, need some means 1269 00:57:53,289 --> 00:57:55,524 of getting a message out, you know? 1270 00:57:56,826 --> 00:57:59,762 - I think Elon Musk definitely understands 1271 00:57:59,795 --> 00:58:01,030 the power of his celebrity. 1272 00:58:01,063 --> 00:58:04,633 - Elon, what do you think about dogecoin going crazy right now? 1273 00:58:04,667 --> 00:58:06,936 [people shouting] 1274 00:58:06,969 --> 00:58:09,839 - I think that's part of how he operates. 1275 00:58:09,872 --> 00:58:11,907 That's why he's on Twitter all the time. 1276 00:58:11,941 --> 00:58:15,911 - You use your tweeting to kind of get back at critics. 1277 00:58:15,945 --> 00:58:18,314 - Rarely. - You kinda have little wars 1278 00:58:18,347 --> 00:58:19,381 with the press. 1279 00:58:19,415 --> 00:58:20,749 - Twitter is a war zone. 1280 00:58:20,783 --> 00:58:22,852 - He sort of doesn't have a filter. 1281 00:58:22,885 --> 00:58:24,754 - Elon Musk was on Twitter today 1282 00:58:24,787 --> 00:58:25,955 calling one of the divers 1283 00:58:25,988 --> 00:58:28,491 in that cave rescue a pedophile. 1284 00:58:28,524 --> 00:58:30,693 - Elon Musk shook up the stock market 1285 00:58:30,726 --> 00:58:32,294 this afternoon with a tweet that read, 1286 00:58:32,328 --> 00:58:34,997 "I'm considering taking Tesla private." 1287 00:58:35,030 --> 00:58:37,366 - I think it kinda goes both ways. 1288 00:58:37,399 --> 00:58:40,536 He can say things, and he can get people believing them. 
1289 00:58:43,339 --> 00:58:45,608 - New tonight at 5:00, Tesla has published 1290 00:58:45,641 --> 00:58:48,043 its first quarterly safety report. 1291 00:58:48,077 --> 00:58:50,913 - In 2018, Tesla started releasing 1292 00:58:50,946 --> 00:58:53,916 these Autopilot safety statistics, 1293 00:58:53,949 --> 00:58:56,552 and have continued to release data. 1294 00:58:56,585 --> 00:58:58,554 On the surface, they looked like 1295 00:58:58,587 --> 00:59:00,022 they presented a good picture. 1296 00:59:00,055 --> 00:59:01,924 - We publish the safety stats, like, basically, 1297 00:59:01,957 --> 00:59:06,262 miles driven on Autopilot and miles driven manually. 1298 00:59:06,295 --> 00:59:07,797 It was a factor of ten difference. 1299 00:59:07,830 --> 00:59:09,298 This is not subtle. 1300 00:59:09,331 --> 00:59:12,735 - But it was just broad numbers. 1301 00:59:12,768 --> 00:59:15,671 It's not really a fair comparison to say Teslas 1302 00:59:15,704 --> 00:59:18,541 are dramatically safer than all other cars on the road, 1303 00:59:18,574 --> 00:59:20,509 because all other cars on the road 1304 00:59:20,543 --> 00:59:22,511 can include 20-year-old vehicles 1305 00:59:22,545 --> 00:59:24,647 that are not in good repair. 1306 00:59:24,680 --> 00:59:27,750 - If you think about the miles that Tesla drives on Autopilot, 1307 00:59:27,783 --> 00:59:29,885 almost all those are gonna be freeway cruising miles. 1308 00:59:29,919 --> 00:59:31,887 Those miles are incredibly safe. 1309 00:59:31,921 --> 00:59:34,390 City streets, parking lots, things like that, 1310 00:59:34,423 --> 00:59:36,292 those are much more likely to have incidents. 1311 00:59:38,294 --> 00:59:39,795 I believe that they're presenting data 1312 00:59:39,829 --> 00:59:41,397 that makes them look the best, 1313 00:59:41,430 --> 00:59:43,899 that is still technically accurate. 1314 00:59:46,335 --> 00:59:49,505 - It's Tesla and Elon Musk providing data 1315 00:59:49,538 --> 00:59:51,874 to support their point of view, 1316 00:59:51,907 --> 00:59:53,776 but that's not a full picture. 1317 00:59:53,809 --> 00:59:55,811 I don't think that gives you enough data 1318 00:59:55,845 --> 00:59:57,279 to really make a judgment. 1319 00:59:57,313 --> 00:59:59,381 - People can say, "Oh, well, you're playing 1320 00:59:59,415 --> 01:00:00,850 with the statistics." 1321 01:00:00,883 --> 01:00:02,485 I'm like, we're not fiddling with the statistics. 1322 01:00:02,518 --> 01:00:05,020 The truth is that people are actually not great 1323 01:00:05,054 --> 01:00:08,524 at driving these two-ton death machines. 1324 01:00:08,557 --> 01:00:10,926 - Then, March of 2018-- 1325 01:00:10,960 --> 01:00:12,428 - Fatal crash and fire 1326 01:00:12,461 --> 01:00:14,663 involving a Tesla in Mountain View. 1327 01:00:14,697 --> 01:00:19,969 - That's when the Walter Huang crash happens in California. 1328 01:00:20,002 --> 01:00:22,805 It was at a point where the freeway splits, 1329 01:00:22,838 --> 01:00:25,374 and the Autopilot became confused, 1330 01:00:25,407 --> 01:00:27,910 and he ran straight into a concrete barrier. 1331 01:00:27,943 --> 01:00:30,946 - 38-year-old Walter Huang had a wife and two kids. 1332 01:00:30,980 --> 01:00:35,050 - The NTSB is investigating that fatal crash and fire. 1333 01:00:35,084 --> 01:00:38,788 - Tesla was a party to our investigation. 
1334 01:00:38,821 --> 01:00:41,590 But one of the rules of being a party 1335 01:00:41,624 --> 01:00:44,593 is that the parties can't release information 1336 01:00:44,627 --> 01:00:47,563 about the active investigation. 1337 01:00:47,596 --> 01:00:50,699 - Tesla released data saying Walter Huang had his hands 1338 01:00:50,733 --> 01:00:53,369 off the wheel for six seconds before the crash. 1339 01:00:54,537 --> 01:00:57,506 - I called Elon Musk and said they would have to abide 1340 01:00:57,540 --> 01:00:59,275 by our party agreement. 1341 01:00:59,308 --> 01:01:00,843 And then a few days later, 1342 01:01:00,876 --> 01:01:04,947 Tesla was releasing information about the crash. 1343 01:01:04,980 --> 01:01:06,882 - Tesla released another statement that read, 1344 01:01:06,916 --> 01:01:08,818 "The only way for this accident to have occurred 1345 01:01:08,851 --> 01:01:11,854 is if Mr. Huang was not paying attention to the road." 1346 01:01:11,887 --> 01:01:16,425 - Tesla needed to be removed from that investigation. 1347 01:01:16,459 --> 01:01:20,029 And so I called. Elon was, I would say, argumentative. 1348 01:01:20,062 --> 01:01:24,834 He indicated that he was going to sue the NTSB. 1349 01:01:24,867 --> 01:01:27,403 There was an attempt to bully us into submission. 1350 01:01:27,436 --> 01:01:29,772 But we didn't back down, and he hung up on us. 1351 01:01:31,707 --> 01:01:34,410 That night, Tesla put out a press release 1352 01:01:34,443 --> 01:01:38,380 saying that they were resigning. 1353 01:01:38,414 --> 01:01:40,850 - Tesla announced it's leaving the investigation 1354 01:01:40,883 --> 01:01:42,551 into the deadly crash, 1355 01:01:42,585 --> 01:01:43,486 but the NTSB says 1356 01:01:43,519 --> 01:01:46,756 it kicked the electric car maker out first. 1357 01:01:46,789 --> 01:01:50,259 - It was sort of like, "You can't fire me, we quit" 1358 01:01:50,292 --> 01:01:51,627 sort of a thing. 1359 01:01:51,660 --> 01:01:53,896 - The system worked as described, 1360 01:01:53,929 --> 01:01:55,965 which is that it's a hands-on system. 1361 01:01:55,998 --> 01:01:58,701 It is not a self-driving system. 1362 01:02:00,836 --> 01:02:04,874 ♪ ♪ 1363 01:02:04,907 --> 01:02:06,976 - Today, we meet to consider a collision 1364 01:02:07,009 --> 01:02:10,279 involving a Tesla Model X SUV. 1365 01:02:10,312 --> 01:02:12,815 ♪ ♪ 1366 01:02:12,848 --> 01:02:15,751 We do know that in the Mountain View crash, 1367 01:02:15,785 --> 01:02:18,687 the driver was engaged continuously 1368 01:02:18,721 --> 01:02:20,823 with playing a video game. 1369 01:02:20,856 --> 01:02:22,691 It would be easy to say the driver 1370 01:02:22,725 --> 01:02:24,326 was not acting responsibly. 1371 01:02:24,360 --> 01:02:28,464 However, it also shows that there's great potential 1372 01:02:28,497 --> 01:02:32,635 for there to be this automation complacency to creep in. 1373 01:02:32,668 --> 01:02:36,038 In 2017, we issued two recommendations 1374 01:02:36,071 --> 01:02:39,442 to six automobile manufacturers. 1375 01:02:39,475 --> 01:02:44,747 And of the six, one manufacturer has ignored us, 1376 01:02:44,780 --> 01:02:48,584 and that manufacturer is Tesla. 1377 01:02:48,617 --> 01:02:51,887 All of our recommendations are based on tragic events. 
1378 01:02:51,921 --> 01:02:55,291 And when someone doesn't respond or doesn't act, 1379 01:02:55,324 --> 01:02:56,459 that's heartbreaking, 1380 01:02:56,492 --> 01:02:58,894 especially when you see another accident 1381 01:02:58,928 --> 01:03:00,763 that could have been prevented 1382 01:03:00,796 --> 01:03:04,733 had those recommendations been implemented. 1383 01:03:04,767 --> 01:03:06,836 - What started as an ordinary drive to work 1384 01:03:06,869 --> 01:03:09,038 ended in tragedy for a father and husband 1385 01:03:09,071 --> 01:03:11,941 from suburban Lake Worth Beach driving a Tesla. 1386 01:03:11,974 --> 01:03:14,777 - In March of 2019, the next fatality 1387 01:03:14,810 --> 01:03:19,482 that we became aware of was Jeremy Banner. 1388 01:03:19,515 --> 01:03:23,486 - We saw the almost identical crash 1389 01:03:23,519 --> 01:03:25,521 that we saw in the Joshua Brown case. 1390 01:03:27,656 --> 01:03:30,693 You've got a Tesla being operated on Autopilot. 1391 01:03:30,726 --> 01:03:32,461 We've got a tractor-trailer 1392 01:03:32,495 --> 01:03:36,031 that is pulling across the road. 1393 01:03:36,065 --> 01:03:39,902 We've got a driver who does not attempt any evasive steering. 1394 01:03:39,935 --> 01:03:42,471 Does not attempt any braking action. 1395 01:03:44,940 --> 01:03:47,743 And goes right under the tractor-trailer, 1396 01:03:50,312 --> 01:03:52,314 shearing the roof off of the car, 1397 01:03:54,416 --> 01:03:56,419 and killing the driver. 1398 01:04:00,856 --> 01:04:02,458 - Where is the super duper radar 1399 01:04:02,491 --> 01:04:05,661 that Elon was talking about in September 2016? 1400 01:04:05,694 --> 01:04:09,765 ♪ ♪ 1401 01:04:09,799 --> 01:04:12,668 Well, whatever they did wasn't sufficient 1402 01:04:12,701 --> 01:04:14,703 to ensure it didn't happen again, 1403 01:04:14,737 --> 01:04:17,306 'cause the exact same crash happened. 1404 01:04:21,377 --> 01:04:24,647 - Stationary objects are this vexing problem in autonomy. 1405 01:04:24,680 --> 01:04:27,383 And everybody that's developing autonomous software 1406 01:04:27,416 --> 01:04:28,851 has this problem. 1407 01:04:28,884 --> 01:04:32,655 This is the one Achilles heel that you continue to see. 1408 01:04:32,688 --> 01:04:34,523 - I thought the self-driving problem would be hard, 1409 01:04:34,557 --> 01:04:36,659 but it's--it was harder than I thought. 1410 01:04:39,795 --> 01:04:44,467 - It may be that Autopilot vehicles have fewer crashes. 1411 01:04:44,500 --> 01:04:48,604 But we've continued to see other crashes that happen 1412 01:04:48,637 --> 01:04:51,874 because the system can't see something in the road. 1413 01:04:51,907 --> 01:04:54,577 - Nearly a dozen accidents where a Tesla slammed 1414 01:04:54,610 --> 01:04:56,479 into a parked emergency vehicle. 1415 01:04:56,512 --> 01:05:00,049 - If your company is supposed to be putting safety first, 1416 01:05:00,082 --> 01:05:03,385 and this well-respected safety agency says, 1417 01:05:03,419 --> 01:05:05,721 you know, there are these two deficiencies in your system, 1418 01:05:05,755 --> 01:05:07,590 you should address them, 1419 01:05:07,623 --> 01:05:10,359 why wouldn't you address them? Why wouldn't you fix them? 1420 01:05:10,392 --> 01:05:12,361 - One of the biggest mistakes people generally make-- 1421 01:05:12,394 --> 01:05:14,864 and I'm guilty of it, too-- is wishful thinking. 1422 01:05:14,897 --> 01:05:17,500 You know, like, you want something to be true 1423 01:05:17,533 --> 01:05:18,701 even if it isn't true.
1424 01:05:18,734 --> 01:05:20,503 And so you ignore the real truth 1425 01:05:20,536 --> 01:05:23,973 because of what you want to be true. 1426 01:05:24,006 --> 01:05:26,642 This is a very difficult trap to avoid. 1427 01:05:26,675 --> 01:05:30,579 ♪ ♪ 1428 01:05:30,613 --> 01:05:32,748 - I think, for those of us in the safety business, 1429 01:05:32,782 --> 01:05:35,484 we would have liked to have seen more regulations 1430 01:05:35,518 --> 01:05:38,287 implemented to improve safety. 1431 01:05:38,320 --> 01:05:40,423 I mean, it's horribly frustrating. 1432 01:05:40,456 --> 01:05:41,857 - The truth is companies have always had 1433 01:05:41,891 --> 01:05:44,760 an enormous amount of power, in terms of the technology 1434 01:05:44,794 --> 01:05:47,530 in the vehicles that they put on the road. 1435 01:05:47,563 --> 01:05:49,298 - Honestly, I worry that the government 1436 01:05:49,331 --> 01:05:51,934 cannot keep up with the technology. 1437 01:05:51,967 --> 01:05:53,502 I don't think in a situation like this 1438 01:05:53,536 --> 01:05:57,039 we want to necessarily inhibit innovation. 1439 01:05:57,072 --> 01:05:59,442 But when innovation is implemented, 1440 01:05:59,475 --> 01:06:02,344 we have to make sure that it's done safely. 1441 01:06:02,378 --> 01:06:05,481 Or it's going to be the Wild West out there. 1442 01:06:10,753 --> 01:06:13,722 [light music] 1443 01:06:13,756 --> 01:06:15,891 ♪ ♪ 1444 01:06:17,466 --> 01:06:18,867 - When you think full self-driving, 1445 01:06:18,900 --> 01:06:20,369 you think hands off the wheel. 1446 01:06:20,402 --> 01:06:21,970 You don't have to worry about anything. 1447 01:06:22,004 --> 01:06:25,507 You can listen to music and read a book, whatever. 1448 01:06:25,540 --> 01:06:29,378 I believe the first full video we saw was, like, in 2016. 1449 01:06:29,411 --> 01:06:31,279 I thought we're already here. 1450 01:06:31,313 --> 01:06:35,183 So, yeah, it was very, very exciting at the time. 1451 01:06:35,217 --> 01:06:36,418 A couple years later, 1452 01:06:36,451 --> 01:06:39,121 I purchased the full self-driving. 1453 01:06:39,154 --> 01:06:41,390 I now kind of make a distinction. 1454 01:06:41,423 --> 01:06:44,326 I think Tesla makes great electric vehicles. 1455 01:06:44,359 --> 01:06:47,963 But I think their advertising of certain Autopilot features 1456 01:06:47,996 --> 01:06:50,232 have been--overpromising 1457 01:06:50,265 --> 01:06:52,100 is probably the nicest way to say it. 1458 01:06:52,134 --> 01:06:54,002 - I almost view it as, like, a solved problem. 1459 01:06:54,036 --> 01:06:55,971 Like, we know exactly what to do, 1460 01:06:56,004 --> 01:06:57,372 and we'll be there in a few years. 1461 01:06:57,406 --> 01:07:00,042 - As far back as 2015, Elon Musk was saying 1462 01:07:00,075 --> 01:07:01,576 self-driving cars were two years away. 1463 01:07:01,610 --> 01:07:08,016 - I think we're basically less than two years away 1464 01:07:08,050 --> 01:07:09,318 from complete autonomy. 1465 01:07:09,351 --> 01:07:11,286 - The time when someone will be able 1466 01:07:11,320 --> 01:07:13,855 to take their hands off the wheel and go to sleep, 1467 01:07:13,889 --> 01:07:15,457 how far away is that? To do that safely? 1468 01:07:15,490 --> 01:07:17,826 - I think that's about-- that's about two years. 1469 01:07:17,859 --> 01:07:19,561 - The promise was very aspirational, 1470 01:07:19,594 --> 01:07:21,263 and probably not gonna happen. 1471 01:07:21,296 --> 01:07:24,967 But Tesla and Elon made people think it was gonna happen. 
1472 01:07:25,000 --> 01:07:29,137 - By end of next year, self-driving will be 1473 01:07:29,171 --> 01:07:33,976 at least 100% to 200% safer than a person. 1474 01:07:34,009 --> 01:07:36,845 If you buy a car that does not have the hardware necessary 1475 01:07:36,878 --> 01:07:39,147 for full self-driving, it is like buying a horse. 1476 01:07:39,181 --> 01:07:43,118 I'm extremely confident of achieving full autonomy 1477 01:07:43,151 --> 01:07:47,389 and releasing it to the Tesla customer base next year. 1478 01:07:47,422 --> 01:07:49,524 - Some people say, what does it matter? 1479 01:07:49,558 --> 01:07:52,260 Well, I think it matters a lot. 1480 01:07:52,294 --> 01:07:54,830 Do you want other people on the roads 1481 01:07:54,863 --> 01:07:56,832 buying this technology and thinking 1482 01:07:56,865 --> 01:07:59,901 that it's more powerful than it really is? 1483 01:07:59,935 --> 01:08:01,536 - I felt that what I was being told 1484 01:08:01,570 --> 01:08:03,372 that we were gonna do didn't match 1485 01:08:03,405 --> 01:08:05,874 what we actually did 1486 01:08:05,907 --> 01:08:09,011 because Tesla has changed the hardware on the car. 1487 01:08:09,044 --> 01:08:11,380 They changed the computer. 1488 01:08:11,413 --> 01:08:13,448 And now they're changing the cameras. 1489 01:08:13,482 --> 01:08:17,586 I think that that should give someone pause 1490 01:08:17,619 --> 01:08:19,287 when Tesla says they're gonna do something else. 1491 01:08:21,089 --> 01:08:24,526 - Elon Musk has officially blown my mind yet again. 1492 01:08:24,559 --> 01:08:27,863 In a recent tweet, he talked about vision 1493 01:08:27,896 --> 01:08:29,965 and using only vision and no radar. 1494 01:08:29,998 --> 01:08:32,134 - Taking the radar out is literally 1495 01:08:32,167 --> 01:08:35,137 going to make the system better. 1496 01:08:35,170 --> 01:08:37,272 - When I heard they were gonna do cameras alone 1497 01:08:37,306 --> 01:08:39,408 and get rid of radar, 1498 01:08:39,441 --> 01:08:41,910 I was really taken aback. 1499 01:08:41,943 --> 01:08:45,247 The whole rest of the industry believes that you need cameras, 1500 01:08:45,280 --> 01:08:48,283 radar, and lidar. 1501 01:08:48,317 --> 01:08:50,986 Tesla's really the only automaker 1502 01:08:51,019 --> 01:08:53,922 who thinks that cameras alone is a good idea. 1503 01:08:53,955 --> 01:08:57,426 - You can absolutely be superhuman with just cameras. 1504 01:08:57,459 --> 01:09:01,229 - What Elon Musk is leaving out of his analogy-- 1505 01:09:01,263 --> 01:09:03,365 comparing cameras to eyes-- 1506 01:09:03,398 --> 01:09:08,036 is the fact that there is not a brain behind those cameras. 1507 01:09:08,070 --> 01:09:09,504 [beeping] 1508 01:09:09,538 --> 01:09:12,007 We don't know how to build a system 1509 01:09:12,040 --> 01:09:13,542 that can behave like the human brain. 1510 01:09:13,575 --> 01:09:19,314 And what that means is full autonomy may be decades away. 1511 01:09:19,348 --> 01:09:22,351 [cheers and applause] 1512 01:09:24,219 --> 01:09:26,855 - Anyone here use the full self-driving beta? 1513 01:09:26,888 --> 01:09:30,125 [cheering] Great. 1514 01:09:30,158 --> 01:09:31,193 The car will be able to take you 1515 01:09:31,226 --> 01:09:34,162 anywhere you want with ultimately ten times safer 1516 01:09:34,196 --> 01:09:36,231 than if you were driving it yourself. 1517 01:09:36,264 --> 01:09:39,401 It's gonna just completely revolutionize the world. 
1518 01:09:39,434 --> 01:09:43,105 ♪ ♪ 1519 01:09:43,138 --> 01:09:45,974 - There's a question around whether Elon 1520 01:09:46,008 --> 01:09:47,843 is acting cynically, right? 1521 01:09:47,876 --> 01:09:49,911 Like, does he believe in what he says? 1522 01:09:49,945 --> 01:09:54,182 And is it okay as long as he does believe in what he says? 1523 01:09:54,216 --> 01:09:57,419 Some of it feels intentional to me. 1524 01:09:57,452 --> 01:10:00,155 There's, like, financing needs that he needs to make. 1525 01:10:00,188 --> 01:10:03,392 There are milestones that Elon needs to hit, 1526 01:10:03,425 --> 01:10:06,361 from an investor's perspective. 1527 01:10:06,395 --> 01:10:09,965 - At times, people misinterpret Elon. 1528 01:10:09,998 --> 01:10:12,367 Oftentimes, there's-- when the goal is set, 1529 01:10:12,401 --> 01:10:15,103 there's no capability to deliver against that goal. 1530 01:10:15,137 --> 01:10:17,306 You kind of need to believe that, as a team, 1531 01:10:17,339 --> 01:10:19,107 you're gonna achieve the impossible. 1532 01:10:19,141 --> 01:10:21,510 - I've had many conversations with the Tesla Autopilot team. 1533 01:10:21,543 --> 01:10:23,111 The reality of doing the right thing matters 1534 01:10:23,145 --> 01:10:24,980 more than the perception of doing the right thing. 1535 01:10:25,013 --> 01:10:28,417 - He's convinced that the technology will be delivered. 1536 01:10:28,450 --> 01:10:30,552 And I wouldn't necessarily bet against him, 1537 01:10:30,585 --> 01:10:32,421 because eventually he does deliver. 1538 01:10:33,622 --> 01:10:36,525 - Tesla says it's launching its highly anticipated 1539 01:10:36,558 --> 01:10:39,428 full self-driving software later this week. 1540 01:10:39,461 --> 01:10:41,229 - They're gonna open up self-driving 1541 01:10:41,263 --> 01:10:43,165 in America's cities. 1542 01:10:43,198 --> 01:10:47,235 That would seem to be quite a difficult thing to pull off. 1543 01:10:47,269 --> 01:10:49,104 In a city? 1544 01:10:51,540 --> 01:10:55,377 - So, this is a tricky thing for beta. 1545 01:10:55,410 --> 01:10:58,013 We are--this is a blind left. There's a fence here. 1546 01:10:58,046 --> 01:11:02,584 It can't see around. So my car is inching forward. 1547 01:11:02,617 --> 01:11:05,320 I feel honored that I get to do this, 1548 01:11:05,354 --> 01:11:09,324 and be, like, a little part of this, you know, history. 1549 01:11:09,358 --> 01:11:14,963 I stopped it because it was inching out too far. 1550 01:11:14,997 --> 01:11:17,165 There are definitely people that do not agree 1551 01:11:17,199 --> 01:11:20,402 with Tesla's approach. 1552 01:11:20,435 --> 01:11:22,904 I don't feel that it's risky. 1553 01:11:22,938 --> 01:11:28,377 I have never felt endangered, okay? 1554 01:11:28,410 --> 01:11:31,913 See, it's gonna miss this. Can't do it. 1555 01:11:31,947 --> 01:11:35,183 I can say that people who buy a Tesla understand 1556 01:11:35,217 --> 01:11:37,252 that it's not full self-driving yet. 1557 01:11:37,285 --> 01:11:42,024 And nobody is forcing anybody to buy full self-driving. 1558 01:11:42,057 --> 01:11:42,958 It's an option. 1559 01:11:45,060 --> 01:11:47,229 - Full self-driving, that's what I paid for, 1560 01:11:47,262 --> 01:11:48,830 and I don't have it. 1561 01:11:48,864 --> 01:11:51,500 I mean, it's right there in the name of it, right? 1562 01:11:51,533 --> 01:11:53,902 And I don't think that's fair to say. 1563 01:11:53,935 --> 01:11:56,505 Especially right now. 1564 01:11:56,538 --> 01:11:59,508 Musk, I think he has a huge responsibility. 
1565 01:11:59,541 --> 01:12:01,576 You know, I think he needs to be a little bit more cautious 1566 01:12:01,610 --> 01:12:04,179 about what he tells his followers. 1567 01:12:04,212 --> 01:12:08,116 - Wow. Oh, my God! Okay! 1568 01:12:08,150 --> 01:12:11,386 - A lot of the work in the tech industry proceeds 1569 01:12:11,420 --> 01:12:14,856 with the central claim of improving human lives 1570 01:12:14,890 --> 01:12:17,960 through the methodical use of our technologies. 1571 01:12:17,993 --> 01:12:22,497 - Okay, did-- oh, God! Fuck! Jesus! 1572 01:12:22,531 --> 01:12:26,234 That was one of the closest calls we've ever had. 1573 01:12:26,268 --> 01:12:30,839 - With the ongoing full self-driving beta releases, 1574 01:12:30,872 --> 01:12:31,840 there's quite a spectacle. 1575 01:12:32,407 --> 01:12:33,976 - Is it just gonna run this light? 1576 01:12:34,009 --> 01:12:36,311 Holy shit, it just ran that red light. 1577 01:12:36,345 --> 01:12:38,914 - Here, we have a lot of customers 1578 01:12:38,947 --> 01:12:43,251 who are essentially standing in for professional test drivers. 1579 01:12:43,452 --> 01:12:46,054 - Ooh! Ooh! - Oh, fuck! Oh shit! 1580 01:12:46,088 --> 01:12:47,255 - Shit! We-- 1581 01:12:47,289 --> 01:12:48,857 - We hit that. - We actually hit it. 1582 01:12:48,890 --> 01:12:50,125 - We hit it. 1583 01:12:50,158 --> 01:12:54,496 - With Tesla, an example of scientific integrity, 1584 01:12:54,529 --> 01:12:56,365 public responsibility, 1585 01:12:56,398 --> 01:13:00,369 and reasoned and methodical engineering development 1586 01:13:00,402 --> 01:13:01,503 it is not. 1587 01:13:01,536 --> 01:13:03,372 - With a software update, you can actually 1588 01:13:03,405 --> 01:13:06,375 make thousands of people drive safer. 1589 01:13:06,408 --> 01:13:07,976 Just with a software update overnight. 1590 01:13:08,010 --> 01:13:10,112 - Wow. That's actually-- - Yeah. 1591 01:13:10,145 --> 01:13:11,913 - That's actually-- [beeping] 1592 01:13:11,947 --> 01:13:15,450 - Fuck. - [laughs] 1593 01:13:15,484 --> 01:13:16,451 Fuck. 1594 01:13:16,485 --> 01:13:19,054 - Are we gonna have to cut that? 1595 01:13:19,087 --> 01:13:26,061 ♪ ♪