1 00:00:20,742 --> 00:00:23,514 Levy 911, what is the address of your emergency? 2 00:00:23,548 --> 00:00:25,050 There was just a wreck. 3 00:00:25,083 --> 00:00:26,286 A head-on collision right here-- 4 00:00:26,319 --> 00:00:27,656 Oh, my God Almighty. 5 00:00:30,762 --> 00:00:32,431 Hello? 6 00:00:32,464 --> 00:00:34,736 They just had a bad accident in front of the BP station. 7 00:00:34,770 --> 00:00:36,439 I don't know how bad it is, but it sounded nasty. 8 00:00:36,472 --> 00:00:39,212 Okay, we've got--we've got multiple 911 calls, sir. 9 00:00:40,815 --> 00:00:43,119 Yes, ma'am, a little car, a little sportscar. 10 00:00:43,153 --> 00:00:45,157 A little black car and a semi car 11 00:00:45,190 --> 00:00:46,359 went under a semi-truck. 12 00:00:46,392 --> 00:00:47,461 Come on underneath the truck. 13 00:00:47,494 --> 00:00:49,131 Took the top of the car off. 14 00:00:49,165 --> 00:00:50,635 Okay, are the vehicles in the road? 15 00:00:50,668 --> 00:00:52,606 No, vehicle--he went way off the road. 16 00:00:52,639 --> 00:00:54,810 Went off into the ditch and went to the woods. 17 00:00:54,843 --> 00:00:56,680 Ran off probably a quarter mile 18 00:00:56,714 --> 00:00:57,849 from where the wreck happened. 19 00:00:57,882 --> 00:00:59,385 I watched it happen. 20 00:00:59,418 --> 00:01:02,458 Okay, sir, is there any obvious injuries? 21 00:01:02,491 --> 00:01:04,830 I mean, he's dead as hell. 22 00:01:04,863 --> 00:01:06,667 He--he's dead? 23 00:01:06,700 --> 00:01:09,071 Yeah, it's kind of obvious. 24 00:01:09,105 --> 00:01:11,644 A fatal crash involving a Tesla. 25 00:01:11,677 --> 00:01:12,812 - Tesla. - Tesla. 26 00:01:12,846 --> 00:01:14,516 Related to the Autopilot mode. 27 00:01:14,549 --> 00:01:16,319 - Autopilot mode. - We've heard warnings 28 00:01:16,352 --> 00:01:17,856 about the dangers of this technology. 29 00:01:17,889 --> 00:01:19,325 Tesla is under fire. 30 00:01:19,358 --> 00:01:21,262 Critics have been calling for changes 31 00:01:21,296 --> 00:01:22,766 to Tesla's Autopilot software. 32 00:01:22,799 --> 00:01:26,439 Tweets and retweets from Tesla CEO Elon Musk, 33 00:01:26,472 --> 00:01:29,513 pointing out that other auto companies are involved 34 00:01:29,546 --> 00:01:31,617 in far more fatal crashes. 35 00:01:31,650 --> 00:01:35,257 How did a Tesla on Autopilot slam into a tractor-trailer? 36 00:01:35,290 --> 00:01:37,562 Why didn't the built-in safety system stop it? 37 00:01:37,595 --> 00:01:40,835 Who's to blame: the driver or the car? 38 00:01:48,584 --> 00:01:52,291 It is my honor to welcome to the stage Mr. Elon Musk. 39 00:01:59,372 --> 00:02:01,342 Welcome, everyone, to the Model 3 unveil. 40 00:02:02,612 --> 00:02:05,852 For years, Elon Musk has talked about being 41 00:02:05,885 --> 00:02:09,358 on the verge of self-driving car technology. 42 00:02:09,392 --> 00:02:11,697 No hands. No feet. Nothing. 43 00:02:11,730 --> 00:02:15,304 Elon Musk approaches this the way a lot of people 44 00:02:15,337 --> 00:02:19,178 in Silicon Valley do: they are often telling you things 45 00:02:19,211 --> 00:02:20,748 about how the future will be. 46 00:02:20,782 --> 00:02:23,320 Getting in a car will be like getting in an elevator. 47 00:02:23,353 --> 00:02:24,823 You just tell it where you want to go, 48 00:02:24,856 --> 00:02:28,898 and it takes you there with extreme levels of safety. 49 00:02:28,931 --> 00:02:30,434 And that'll be normal. 50 00:02:31,737 --> 00:02:33,574 Musk is certainly a visionary.
51 00:02:33,607 --> 00:02:36,580 "Time" Magazine's Person of the Year has been released: 52 00:02:36,613 --> 00:02:39,418 Elon Musk, the Tesla and SpaceX CEO, 53 00:02:39,452 --> 00:02:41,456 for driving society's most daring 54 00:02:41,489 --> 00:02:43,661 and disruptive transformations. 55 00:02:43,694 --> 00:02:47,669 Elon Musk wanted to disrupt and revolutionize 56 00:02:47,702 --> 00:02:48,904 the auto industry. 57 00:02:48,938 --> 00:02:51,877 And Autopilot was kind of a halo. 58 00:02:51,910 --> 00:02:54,916 It gave Tesla this image of the way they like 59 00:02:54,950 --> 00:02:59,291 to be portrayed as a technology company. 60 00:02:59,325 --> 00:03:01,295 The difference is that the stakes are higher 61 00:03:01,329 --> 00:03:02,699 for the technology. 62 00:03:02,732 --> 00:03:06,439 Whoa. It nearly drove us into the Subaru there. 63 00:03:06,472 --> 00:03:09,946 We're talking about physical cars on the road, 64 00:03:09,980 --> 00:03:11,784 and we're talking about lives at stake. 65 00:03:11,817 --> 00:03:13,654 Yet another driver caught on camera 66 00:03:13,687 --> 00:03:15,390 apparently asleep at the wheel. 67 00:03:15,424 --> 00:03:18,396 But will it end the question of how much control drivers 68 00:03:18,430 --> 00:03:21,469 should really hand over to the computers in their car? 69 00:03:22,872 --> 00:03:26,312 Tesla is under a lot more scrutiny now 70 00:03:26,345 --> 00:03:27,481 than it has been before. 71 00:03:27,515 --> 00:03:29,820 And part of that is in their marketing. 72 00:03:29,853 --> 00:03:31,489 The rollout of the latest version 73 00:03:31,523 --> 00:03:33,460 of its self-driving technology. 74 00:03:33,493 --> 00:03:38,436 Wow. Oh, my God! Okay. Okay. 75 00:03:38,470 --> 00:03:40,708 There's this enormous gray area 76 00:03:40,742 --> 00:03:42,512 Tesla's willing to explore. 77 00:03:42,545 --> 00:03:44,883 Ahh! 78 00:03:44,916 --> 00:03:48,758 This is very uncomfortable so far. 79 00:03:48,791 --> 00:03:51,362 How much do you push the edge of the envelope? 80 00:03:53,333 --> 00:03:56,540 My car should get over here. 81 00:03:56,573 --> 00:03:59,646 Okay, that was-- that was great. 82 00:03:59,679 --> 00:04:00,982 That was really great. 83 00:04:01,015 --> 00:04:02,619 That was one of those times where I was like, 84 00:04:02,652 --> 00:04:03,687 okay, it can do it. 85 00:04:03,721 --> 00:04:04,790 Like, I know it can do it. 86 00:04:04,823 --> 00:04:06,359 It just needs to do it every time. 87 00:04:11,603 --> 00:04:14,041 This technology is eventually 88 00:04:14,074 --> 00:04:16,780 going to enable the car to drive itself. 89 00:04:18,450 --> 00:04:20,521 And that's what I am testing. 90 00:04:20,555 --> 00:04:23,293 Navigate to the East Greenwich Supercharger. 91 00:04:23,326 --> 00:04:25,030 What Tesla calls Early Access Program-- 92 00:04:25,063 --> 00:04:27,569 it's a group of owners that test software, 93 00:04:27,602 --> 00:04:29,472 but it's not public. 94 00:04:30,641 --> 00:04:32,478 You can see what the car sees here. 95 00:04:32,512 --> 00:04:34,549 It sees the pedestrians. 96 00:04:34,583 --> 00:04:36,854 It sees that this is a pickup truck 97 00:04:36,887 --> 00:04:38,891 that has come up behind us. 98 00:04:42,431 --> 00:04:44,468 This is a scary street. 99 00:04:44,503 --> 00:04:46,405 See how narrow this is? 100 00:04:47,742 --> 00:04:52,886 See how hesitant it is, though, while it's going by these cars? 101 00:04:52,919 --> 00:04:56,359 It--it's learning. It's not there yet.
102 00:04:57,929 --> 00:05:00,868 You know, people believe that it's gonna happen, 103 00:05:00,902 --> 00:05:02,672 or else you wouldn't do it, right? 104 00:05:04,743 --> 00:05:06,947 I think that the average is like 35,000, 105 00:05:06,980 --> 00:05:11,088 36,000 auto deaths a year in just the United States. 106 00:05:11,122 --> 00:05:13,694 And I think 90-something percent 107 00:05:13,727 --> 00:05:16,332 of those are human error. 108 00:05:16,365 --> 00:05:19,038 I believe that autonomy is necessary 109 00:05:19,071 --> 00:05:23,814 to end virtually all traffic deaths in this country. 110 00:05:25,618 --> 00:05:28,757 I think that Elon Musk is somebody that comes along, 111 00:05:28,791 --> 00:05:30,595 like, once in a generation. 112 00:05:33,801 --> 00:05:35,437 Now, Elon Musk, I think, 113 00:05:35,470 --> 00:05:38,577 is a name known to everybody who thinks about the future. 114 00:05:38,611 --> 00:05:40,948 Musk's fascination with technology 115 00:05:40,982 --> 00:05:43,954 dates to his childhood in South Africa. 116 00:05:43,988 --> 00:05:45,892 Where I grew up was extremely violent. 117 00:05:45,925 --> 00:05:48,029 I got punched in the face many times. 118 00:05:48,062 --> 00:05:50,467 I almost got beaten to death once. 119 00:05:50,501 --> 00:05:51,737 And I think, if you have not been punched 120 00:05:51,770 --> 00:05:53,707 in the face with a fist, 121 00:05:53,741 --> 00:05:55,511 you don't know what--you have no idea what it's like. 122 00:05:55,545 --> 00:05:56,914 He hated going to school, 123 00:05:56,947 --> 00:05:59,051 because the other kids liked to follow him home, 124 00:05:59,084 --> 00:06:01,790 and they would throw soda cans at his head. 125 00:06:01,823 --> 00:06:04,629 So he sought refuge in computer games, 126 00:06:04,663 --> 00:06:06,399 which got him into coding. 127 00:06:06,432 --> 00:06:08,904 When you were a kid, you programmed a game? 128 00:06:08,938 --> 00:06:10,006 Blastar, it's called? 129 00:06:10,040 --> 00:06:11,577 Yeah, it's a simple game. 130 00:06:11,610 --> 00:06:13,847 By 17, you were on a plane from South Africa. 131 00:06:13,881 --> 00:06:15,116 Yeah. I kind of wanted to be 132 00:06:15,150 --> 00:06:17,387 where the cutting edge of technology was. 133 00:06:18,591 --> 00:06:20,427 Part of the reason I got interested in technology-- 134 00:06:20,460 --> 00:06:22,497 maybe the reason-- was video games. 135 00:06:22,532 --> 00:06:24,101 I worked at a gaming startup, 136 00:06:24,134 --> 00:06:26,072 which weirdly was called Rocket Science. 137 00:06:26,105 --> 00:06:27,107 Yeah. 138 00:06:29,445 --> 00:06:30,815 Fate loves irony. 139 00:06:32,652 --> 00:06:36,893 In summer of '94, Elon came as a summer intern. 140 00:06:36,927 --> 00:06:38,429 He was kinda introverted, 141 00:06:38,463 --> 00:06:41,102 so he fit right into the rest of the group. 142 00:06:41,135 --> 00:06:44,743 Very, very interested in world building, in storytelling. 143 00:06:44,776 --> 00:06:47,481 We thought Elon was gonna be an entrepreneur, clearly. 144 00:06:47,515 --> 00:06:48,884 You had a brief stint at Stanford. 145 00:06:48,918 --> 00:06:50,988 - That's right. - A Ph. D. in Applied Physics? 146 00:06:51,022 --> 00:06:52,825 Applied Physics, Material Science. 147 00:06:52,859 --> 00:06:54,194 But then the Internet came along, 148 00:06:54,228 --> 00:06:56,132 and it seemed like I could either do a Ph. D.
149 00:06:56,165 --> 00:06:58,804 and watch the internet happen, or I could participate 150 00:06:58,837 --> 00:07:01,543 and help build it in some fashion. 151 00:07:01,576 --> 00:07:03,714 So, I started a company with my brother 152 00:07:03,747 --> 00:07:05,450 and a friend of mine, Greg Kouri, 153 00:07:05,483 --> 00:07:07,220 and created Zip2, where the initial idea 154 00:07:07,254 --> 00:07:08,958 was to create software that could help 155 00:07:08,991 --> 00:07:11,128 bring the media companies online. 156 00:07:11,162 --> 00:07:12,966 You know, it's hard to remember a time 157 00:07:12,999 --> 00:07:15,972 without the ubiquity of, like, Google Maps and other things, 158 00:07:16,005 --> 00:07:18,644 and the expectation that, if you're gonna find anything, 159 00:07:18,677 --> 00:07:20,748 there is a digital path to finding it. 160 00:07:20,781 --> 00:07:24,522 And none of that existed in 1995. 161 00:07:25,658 --> 00:07:27,662 I think it was very forward-thinking. 162 00:07:27,695 --> 00:07:30,233 That's what makes a great entrepreneur. 163 00:07:30,266 --> 00:07:31,603 Elon had a very strong personality, 164 00:07:32,772 --> 00:07:35,076 so sometimes he would get into arguments with people, 165 00:07:35,110 --> 00:07:38,182 and they would be pretty intense. 166 00:07:38,216 --> 00:07:40,621 He's not the kind of guy who went out for beers 167 00:07:40,655 --> 00:07:42,692 with people and saw movies and things like that. 168 00:07:42,725 --> 00:07:44,194 He basically just worked. 169 00:07:44,228 --> 00:07:47,467 I started off being the CEO. But after we got VC funding, 170 00:07:47,502 --> 00:07:52,244 the venture capitalists wanted to hire a professional CEO. 171 00:07:52,277 --> 00:07:53,881 Elon was always asking me, 172 00:07:53,914 --> 00:07:55,785 like, what should his title be? 173 00:07:55,818 --> 00:07:57,120 You know, what should his role be? 174 00:07:57,154 --> 00:07:58,624 I think, more than anything, 175 00:07:58,657 --> 00:08:00,861 he wanted to be the face of Zip2. 176 00:08:02,699 --> 00:08:04,602 Wow. I can't believe it's actually here. 177 00:08:04,636 --> 00:08:06,205 That's pretty wild, man. 178 00:08:06,238 --> 00:08:07,742 Elon was clearly, like, 179 00:08:07,775 --> 00:08:10,480 one of the most driven people I've ever known. 180 00:08:11,282 --> 00:08:14,889 A year ago, Musk sold his software company, Zip2. 181 00:08:14,923 --> 00:08:18,162 He took the $22 million he made from Zip2. 182 00:08:18,196 --> 00:08:19,766 He obviously could have retired. 183 00:08:19,799 --> 00:08:23,172 But instead, he just worked to start his next company, 184 00:08:23,206 --> 00:08:24,843 X.com, which became PayPal. 185 00:08:24,876 --> 00:08:30,020 The company was sold to eBay in '02 for $1.5 billion. 186 00:08:30,053 --> 00:08:33,092 I think Elon wants to make a dent in the world. 187 00:08:33,126 --> 00:08:35,631 We all have a finite amount of time. 188 00:08:35,664 --> 00:08:38,871 If you can move the needle, then why not? 189 00:08:38,904 --> 00:08:40,975 And the capital allowed him to do things 190 00:08:41,008 --> 00:08:42,177 that were really important. 191 00:08:42,210 --> 00:08:44,649 After PayPal, I started debating 192 00:08:44,682 --> 00:08:49,124 between either solar, electric car, or space. 193 00:08:49,158 --> 00:08:50,928 I thought, like, nobody is gonna be 194 00:08:50,961 --> 00:08:53,634 crazy enough to do space, so I better do space. 195 00:08:53,667 --> 00:08:58,877 So, we embarked on that journey to create SpaceX in 2002. 
196 00:08:58,911 --> 00:09:00,280 And in the beginning, I wouldn't--actually 197 00:09:00,313 --> 00:09:01,783 wouldn't even let my friends invest 198 00:09:01,817 --> 00:09:03,152 because I didn't want to lose their money. 199 00:09:03,186 --> 00:09:06,325 - Stage 1. - We have liftoff indication. 200 00:09:06,358 --> 00:09:08,196 Before all the drama of SpaceX, 201 00:09:08,229 --> 00:09:11,202 I think Tesla has actually been probably 2/3 202 00:09:11,235 --> 00:09:14,876 of my total drama dose of all time. 203 00:09:20,019 --> 00:09:23,359 In 2003, my business partner at the time, Martin Eberhard, 204 00:09:23,392 --> 00:09:27,000 and I were getting really concerned about climate change. 205 00:09:27,033 --> 00:09:30,574 We knew that you could make electric cars. 206 00:09:30,608 --> 00:09:32,745 So we started Tesla Motors. 207 00:09:34,281 --> 00:09:37,855 We began looking at raising significant money. 208 00:09:39,324 --> 00:09:42,364 We visited Elon at SpaceX's original office, 209 00:09:42,397 --> 00:09:43,901 and there was a couple of things 210 00:09:43,934 --> 00:09:46,305 that were really different about pitching to Elon. 211 00:09:46,338 --> 00:09:49,244 First, he understood the mission immediately. 212 00:09:49,278 --> 00:09:52,652 It's very important that we accelerate the transition away 213 00:09:52,685 --> 00:09:55,390 from gasoline, you know, 214 00:09:55,423 --> 00:09:57,862 for environmental reasons, for economic reasons, 215 00:09:57,895 --> 00:09:59,833 for national security reasons. 216 00:09:59,866 --> 00:10:02,170 The other thing is some of the feedback 217 00:10:02,204 --> 00:10:04,642 that we'd get from the regular venture community 218 00:10:04,676 --> 00:10:09,218 was that the idea is just kinda too crazy. 219 00:10:09,251 --> 00:10:11,957 When you're pitching someone who's building a rocket ship 220 00:10:11,990 --> 00:10:14,729 directly on the other side of the glass panel, 221 00:10:14,762 --> 00:10:16,900 you know, that you're in the conference room, 222 00:10:16,933 --> 00:10:18,202 you kinda feel he's not gonna say 223 00:10:18,236 --> 00:10:19,873 that your idea is too crazy. 224 00:10:21,676 --> 00:10:23,246 I could have been the CEO from day one, 225 00:10:23,279 --> 00:10:25,851 but the idea of being CEO of two startups 226 00:10:25,885 --> 00:10:30,928 at the same time was not appealing. 227 00:10:30,961 --> 00:10:33,099 Elon was Chairman of the Board, 228 00:10:33,132 --> 00:10:34,769 and he would check in every month. 229 00:10:36,973 --> 00:10:38,677 One of Elon's things-- he said, you know, 230 00:10:38,710 --> 00:10:40,948 you only get to introduce the car once. 231 00:10:40,981 --> 00:10:43,386 So you kinda wanna make it as good as you can get it. 232 00:10:43,419 --> 00:10:45,658 And when we unveiled it, we did it in Los Angeles. 233 00:10:46,258 --> 00:10:48,864 We've got a zero-emission sportscar 234 00:10:48,897 --> 00:10:51,035 that can go head-to-head with a Ferrari 235 00:10:51,068 --> 00:10:52,070 and a Porsche and win. 236 00:10:52,972 --> 00:10:54,341 The governor, Arnold Schwarzenegger, 237 00:10:54,374 --> 00:10:56,078 at the time, you know, shows up. 238 00:10:56,111 --> 00:10:59,853 And he's kinda big, and the car's kinda small. 239 00:10:59,886 --> 00:11:01,121 And I was worried that, you know, 240 00:11:01,155 --> 00:11:02,758 the Governor might get stuck in the car, 241 00:11:02,792 --> 00:11:04,896 and that would be, like, a, you know, PR nightmare. 242 00:11:04,929 --> 00:11:06,432 But he really liked it.
In fact, he ended up 243 00:11:06,465 --> 00:11:08,035 ordering one a little bit later. 244 00:11:10,941 --> 00:11:13,145 But there were a lot of unexpected challenges 245 00:11:13,179 --> 00:11:14,949 developing, you know, the Roadster. 246 00:11:16,085 --> 00:11:18,422 And that's when Elon began to get much more involved, 247 00:11:18,456 --> 00:11:21,328 because, you know, we were in trouble. 248 00:11:21,362 --> 00:11:26,372 2008 was brutal. Tesla almost went bankrupt. 249 00:11:26,405 --> 00:11:28,911 We closed our financing round 250 00:11:28,944 --> 00:11:31,215 6:00 p. m., Christmas Eve, 2008. 251 00:11:31,248 --> 00:11:34,154 It was the last hour of the last day that it was possible. 252 00:11:34,187 --> 00:11:36,058 And I thought, okay, I got to bite the bullet 253 00:11:36,091 --> 00:11:38,764 and run the company, 254 00:11:40,768 --> 00:11:42,170 'cause there's just too much at stake. 255 00:11:44,441 --> 00:11:48,049 From the time Elon became CEO to now, 256 00:11:48,082 --> 00:11:49,919 I mean, Tesla's changed the way 257 00:11:49,953 --> 00:11:51,823 people think about electric vehicles. 258 00:11:51,856 --> 00:11:55,163 "Consumer Reports" says this is the best car 259 00:11:55,196 --> 00:11:58,737 they've tested in the history of the magazine. 260 00:11:58,770 --> 00:12:02,344 When Tesla went public, that was incredibly important. 261 00:12:02,377 --> 00:12:05,083 He was able to push Tesla to grow 262 00:12:05,116 --> 00:12:07,020 and become much bigger much faster 263 00:12:07,053 --> 00:12:09,424 than I think most people had thought possible. 264 00:12:09,458 --> 00:12:13,934 Tesla, surpassing a $1 trillion valuation today. 265 00:12:13,967 --> 00:12:16,740 Elon Musk is now the richest person on the planet. 266 00:12:18,409 --> 00:12:21,883 His pace of innovation is kind of unmatched. 267 00:12:21,917 --> 00:12:25,323 He can take more risks than any human can now, 268 00:12:25,356 --> 00:12:28,129 simply because he has the resources 269 00:12:28,162 --> 00:12:29,799 that allow him to do things 270 00:12:29,832 --> 00:12:33,472 that would be irresponsible or insane for anybody else. 271 00:12:48,436 --> 00:12:50,139 I love the acceleration. 272 00:12:50,173 --> 00:12:52,077 I love the fact that it's electric. 273 00:12:52,110 --> 00:12:55,116 And I love the fact that people are interested in Tesla 274 00:12:55,149 --> 00:12:56,920 as a brand and interested in the car. 275 00:12:56,953 --> 00:12:58,824 I like talking about it. 276 00:12:58,857 --> 00:13:01,896 2012, 2013, when they first released their Model S, 277 00:13:01,930 --> 00:13:03,432 I thought it was just a very cool car. 278 00:13:03,466 --> 00:13:06,205 And at the time, you know, it was pretty revolutionary. 279 00:13:06,238 --> 00:13:07,942 Like, I believed in their mission. 280 00:13:07,975 --> 00:13:11,115 Electric car can in fact be the best car in the world. 281 00:13:12,985 --> 00:13:15,524 I think I had Elon Musk up on a pedestal. 282 00:13:15,557 --> 00:13:17,862 You know, he was kind of a hero at the time. 283 00:13:20,199 --> 00:13:21,970 I would follow his tweets. 284 00:13:22,003 --> 00:13:24,241 Think I have a mug with his face on it somewhere at home. 285 00:13:25,611 --> 00:13:28,517 This was the first one that I got. 286 00:13:28,550 --> 00:13:30,119 Somebody gave it to me. 287 00:13:30,153 --> 00:13:37,133 The Tesla community is, I think, rare, for any brand. 
288 00:13:37,167 --> 00:13:41,008 I think that any company would kill 289 00:13:41,041 --> 00:13:47,087 to have that level of fandom and devotion. 290 00:13:47,120 --> 00:13:48,389 People that are the diehard fans, 291 00:13:48,422 --> 00:13:49,926 they have a bunch of different names. 292 00:13:51,295 --> 00:13:53,132 Like the Musketeers. 293 00:13:53,165 --> 00:13:55,871 And they think whatever Elon Musk says is, 294 00:13:55,904 --> 00:13:57,173 you know, gold, basically. 295 00:13:57,207 --> 00:13:59,144 Elon Musk, how much do I have to beg 296 00:13:59,177 --> 00:14:00,848 to get a selfie with you? 297 00:14:00,881 --> 00:14:03,319 Sure, I'll do a selfie. 298 00:14:03,352 --> 00:14:05,223 Sure. If your customers love you, 299 00:14:05,256 --> 00:14:09,031 your odds of success are dramatically higher. 300 00:14:11,536 --> 00:14:14,976 Josh was a real Tesla enthusiast. 301 00:14:15,009 --> 00:14:18,416 He was very happy with the decision he made 302 00:14:18,449 --> 00:14:20,319 in buying that car. 303 00:14:20,353 --> 00:14:22,323 That's the last picture of him. - Yep. 304 00:14:22,357 --> 00:14:24,494 Right after the vacation. 305 00:14:26,398 --> 00:14:27,935 Before the accident. 306 00:14:32,143 --> 00:14:33,479 I was trying to think about 307 00:14:33,513 --> 00:14:35,450 how I first met Josh the other day, 308 00:14:35,483 --> 00:14:38,155 the exact timing that I met him. 309 00:14:39,659 --> 00:14:43,934 Mark Nelson and I were working on a computer program 310 00:14:43,967 --> 00:14:45,604 for the Navy EOD forces. 311 00:14:45,637 --> 00:14:48,476 EOD stands for explosive ordnance disposal. 312 00:14:48,510 --> 00:14:52,016 Most people know it by bomb disposal. 313 00:14:52,050 --> 00:14:56,559 I was with a friend of mine in San Diego, Ken Falke. 314 00:14:56,593 --> 00:15:00,934 And he introduced me to this young sailor, 315 00:15:00,968 --> 00:15:02,404 Josh Brown. 316 00:15:02,437 --> 00:15:05,309 From maintenance to training plans 317 00:15:05,343 --> 00:15:07,080 to whatever it might have been, 318 00:15:07,113 --> 00:15:10,353 this entire detachment was automated, 319 00:15:10,386 --> 00:15:12,925 thanks to Josh Brown. 320 00:15:12,958 --> 00:15:14,695 What attracted him to it? 321 00:15:14,729 --> 00:15:17,066 I think it was the excitement. 322 00:15:17,100 --> 00:15:19,237 He liked to be doing things. 323 00:15:21,308 --> 00:15:25,483 Man, this is actually a fairly treacherous bottom. 324 00:15:25,517 --> 00:15:28,022 Just in case you doubt this is me. 325 00:15:28,055 --> 00:15:29,491 Hello, there. 326 00:15:29,525 --> 00:15:33,265 I think that might even get the motorcycle in there. Bye. 327 00:15:33,299 --> 00:15:34,936 You get up in the morning and you decide, 328 00:15:34,969 --> 00:15:37,407 "Well, today, I'm gonna go for a three-mile run 329 00:15:37,440 --> 00:15:41,315 "and maybe a 1,000-yard swim, just to kind of get warmed up. 330 00:15:41,348 --> 00:15:43,319 "And then maybe I'll go diving today 331 00:15:43,352 --> 00:15:46,659 "or go parachuting or, you know, 332 00:15:46,693 --> 00:15:48,964 go blow something up." 333 00:15:48,997 --> 00:15:50,366 It's interesting, too. Like, a lot of people 334 00:15:50,399 --> 00:15:52,203 kind of think, you know, these people 335 00:15:52,237 --> 00:15:54,207 who go and stand on landmines 336 00:15:54,241 --> 00:15:57,548 and render them safe have a death wish. 337 00:15:57,581 --> 00:16:00,119 It's just the opposite. You know, they do it 338 00:16:00,153 --> 00:16:03,459 because they know that somebody's gotta do it.
339 00:16:03,492 --> 00:16:07,400 It's a profession where people just don't have 340 00:16:07,433 --> 00:16:10,641 more than one chance to make mistakes. 341 00:16:10,674 --> 00:16:12,678 I used to call him the professor sometimes, 342 00:16:12,711 --> 00:16:15,617 because, you know, he was just so up on everything. 343 00:16:15,651 --> 00:16:18,590 Oh, geez. Car's doing it all itself. 344 00:16:18,623 --> 00:16:21,529 What am I gonna do with my hands down here? 345 00:16:21,563 --> 00:16:23,265 His relationship to technology 346 00:16:23,299 --> 00:16:25,369 was really symbiotic. 347 00:16:25,403 --> 00:16:27,741 I mean, it was totally connected. 348 00:16:27,775 --> 00:16:30,112 And when he called and said, "Hey, I got a surprise car 349 00:16:30,146 --> 00:16:31,381 I want to show you," 350 00:16:31,415 --> 00:16:33,319 I wasn't surprised when it was a Tesla. 351 00:16:33,352 --> 00:16:35,223 It was a beautiful looking car, 352 00:16:35,256 --> 00:16:38,329 and he loved it. 353 00:16:38,362 --> 00:16:40,199 He wanted to know 354 00:16:40,233 --> 00:16:43,038 how everything worked in that Tesla. 355 00:16:43,072 --> 00:16:45,343 All right, so this video is gonna show some driving 356 00:16:45,376 --> 00:16:49,050 in some hills and turns, so that you can see 357 00:16:49,084 --> 00:16:50,554 how it's gonna react to different things. 358 00:16:50,587 --> 00:16:53,760 Overall, it actually does a fantastically good job. 359 00:16:53,793 --> 00:16:55,296 He and the car, you know, 360 00:16:55,329 --> 00:16:59,137 were a match made in heaven, you know? 361 00:16:59,170 --> 00:17:00,741 He was perfectly suited 362 00:17:00,774 --> 00:17:04,381 for being on the bleeding edge of technology. 363 00:17:04,414 --> 00:17:08,322 And, you know, he had done a lot riskier things, 364 00:17:08,355 --> 00:17:11,128 you know, in his lifetime. 365 00:17:11,161 --> 00:17:13,132 Well, I made it. 366 00:17:13,165 --> 00:17:15,036 That was--that was quite sporting. 367 00:17:29,197 --> 00:17:31,435 Elon was looking for somebody to come in 368 00:17:31,468 --> 00:17:34,307 and help him really leverage his time. 369 00:17:34,341 --> 00:17:36,478 I had had a career as a serial entrepreneur, 370 00:17:36,512 --> 00:17:37,848 and I said to him several times, 371 00:17:37,881 --> 00:17:39,284 like, "I don't think I'm your guy. 372 00:17:39,317 --> 00:17:41,255 You need a big company car guy." 373 00:17:41,288 --> 00:17:43,425 And he kept saying, "No, I don't. 374 00:17:43,459 --> 00:17:45,096 I need a fellow entrepreneur." 375 00:17:45,129 --> 00:17:47,233 And then he called me one day and said, 376 00:17:47,267 --> 00:17:48,536 "I have a question for you." 377 00:17:48,570 --> 00:17:51,241 He said, "Tell me about the meaning of your work. 378 00:17:51,275 --> 00:17:53,546 You are gonna be able to change the world at a scale 379 00:17:53,580 --> 00:17:55,182 that you won't be able to change the world 380 00:17:55,216 --> 00:17:57,688 in your own companies." 381 00:17:57,721 --> 00:18:00,694 And that's what got me at the end of the day. 382 00:18:00,727 --> 00:18:04,669 Elon had a very specific way of motivating people. 383 00:18:04,702 --> 00:18:07,741 And that was he would say really cool things, 384 00:18:07,775 --> 00:18:09,444 like science fiction things. 385 00:18:09,477 --> 00:18:12,551 And he would make you believe that you could do it. 
386 00:18:12,584 --> 00:18:14,522 At some point in the future, like, maybe five 387 00:18:14,555 --> 00:18:16,358 or six years from now, I think we'll be able 388 00:18:16,391 --> 00:18:19,732 to achieve true autonomous driving, 389 00:18:19,765 --> 00:18:22,571 where you could literally get in the car, go to sleep, 390 00:18:22,605 --> 00:18:24,441 and wake up at your destination. 391 00:18:24,474 --> 00:18:27,380 He is proposing that there is a better way. 392 00:18:27,413 --> 00:18:29,619 And I did really believe in the goals 393 00:18:29,652 --> 00:18:32,625 of the-of the team and Autopilot. 394 00:18:39,337 --> 00:18:42,243 In 2014, the group was put together 395 00:18:42,276 --> 00:18:44,615 for Autopilot and autonomy. 396 00:18:44,649 --> 00:18:47,453 I was one of the early team members. 397 00:18:47,487 --> 00:18:49,859 The naming happened before I arrived, 398 00:18:49,892 --> 00:18:54,568 but it obviously is a familiar term in aviation. 399 00:18:54,602 --> 00:18:57,373 You know, a car ought to be like a plane. 400 00:18:57,407 --> 00:18:59,144 And we trust airplanes to do this. 401 00:18:59,177 --> 00:19:00,580 We ought to be able to trust cars. 402 00:19:01,381 --> 00:19:03,586 Please route us through. 403 00:19:03,620 --> 00:19:06,592 Roger. You're now under automatic control. 404 00:19:06,626 --> 00:19:07,795 Hands-off steering. 405 00:19:09,965 --> 00:19:11,569 What people need to realize is this is 406 00:19:11,602 --> 00:19:13,339 a very, very old idea. 407 00:19:15,376 --> 00:19:17,380 Over, you know, the past decades, 408 00:19:17,413 --> 00:19:21,622 you've had efforts to build self-driving cars. 409 00:19:21,656 --> 00:19:24,862 In the 2000s, DARPA, which is a research arm 410 00:19:24,895 --> 00:19:27,367 of the U. S. Department of Defense, 411 00:19:27,400 --> 00:19:28,870 put on these contests. 412 00:19:28,903 --> 00:19:30,640 Welcome to the DARPA Grand Challenge. 413 00:19:30,674 --> 00:19:31,909 The objective: 414 00:19:31,943 --> 00:19:34,849 create a self-navigating autonomous vehicle. 415 00:19:34,882 --> 00:19:37,554 After one of these contests, Google got interested. 416 00:19:37,588 --> 00:19:41,629 Google is developing a robocar that drives itself. 417 00:19:41,662 --> 00:19:43,633 And over the past ten years, 418 00:19:43,666 --> 00:19:46,405 the self-driving car industry ramps up. 419 00:19:46,438 --> 00:19:50,312 And Elon Musk of course is not gonna wanna miss out on that. 420 00:19:56,258 --> 00:19:57,493 There was a bit of a mantra 421 00:19:57,528 --> 00:20:00,433 that would explain Elon's approach. 422 00:20:00,466 --> 00:20:02,336 And that mantra was that history is changed 423 00:20:02,370 --> 00:20:04,474 by unreasonable men. 424 00:20:04,508 --> 00:20:06,211 We were literally trying to change 425 00:20:06,244 --> 00:20:08,215 transportation worldwide. 426 00:20:09,652 --> 00:20:10,821 I remember the first time 427 00:20:10,854 --> 00:20:12,390 I walked in to Tesla's headquarters, 428 00:20:12,423 --> 00:20:14,729 where the Autopilot team sits. 429 00:20:14,762 --> 00:20:19,872 And there was a little sign on a pillar in that group, 430 00:20:19,905 --> 00:20:21,274 with a number on it. 431 00:20:21,308 --> 00:20:22,511 And I asked one of the engineers, 432 00:20:22,544 --> 00:20:24,515 "What's that about?" And he said, 433 00:20:24,548 --> 00:20:25,884 "That's the number of people that die 434 00:20:25,917 --> 00:20:27,721 "on U. S. highways every year. 435 00:20:27,755 --> 00:20:29,357 "That's why we're here. 
436 00:20:29,390 --> 00:20:31,562 We're here to save lives." 437 00:20:31,596 --> 00:20:35,704 Most other companies working on self-driving cars at the time 438 00:20:35,737 --> 00:20:38,710 were building heavily on top of lidar. 439 00:20:38,743 --> 00:20:41,448 Lidar--light detection and ranging. 440 00:20:41,481 --> 00:20:43,853 By bouncing pulses of light from a sensor, 441 00:20:43,887 --> 00:20:45,624 the vehicle's autonomous systems 442 00:20:45,657 --> 00:20:48,897 can figure out how far away objects are, allowing it to-- 443 00:20:48,930 --> 00:20:50,934 And it can see through situations 444 00:20:50,967 --> 00:20:54,007 that your cameras might struggle with. 445 00:20:54,040 --> 00:20:57,848 And that's what is one of the attractions to lidar. 446 00:20:57,881 --> 00:20:59,284 It seemed like the right decision 447 00:20:59,317 --> 00:21:01,021 for a lot of companies, and for Tesla, 448 00:21:01,054 --> 00:21:03,325 that was just not on the table. 449 00:21:04,728 --> 00:21:08,636 Lidar was too expensive and very breakable. 450 00:21:08,670 --> 00:21:11,542 Margins are thin, and every little bit matters. 451 00:21:11,576 --> 00:21:14,648 They have to get cars out into the world today, 452 00:21:14,682 --> 00:21:16,953 and they have to sell them. 453 00:21:16,986 --> 00:21:19,959 And so the challenge that we took on was: 454 00:21:19,992 --> 00:21:24,735 could you achieve autonomy with radar and sonar 455 00:21:24,768 --> 00:21:26,806 and images from cameras? 456 00:21:26,839 --> 00:21:28,375 So, if you're Elon Musk, 457 00:21:28,408 --> 00:21:31,014 he's gonna turn that into a positive, right? 458 00:21:31,047 --> 00:21:33,720 He's gonna tell the rest of the world--the language 459 00:21:33,753 --> 00:21:35,924 he's often liked to use-- that, "It's a crutch." 460 00:21:35,957 --> 00:21:38,028 Lidar ends up being, like, somewhat of a crutch. 461 00:21:38,061 --> 00:21:41,769 He starts to say, very early on, internally, 462 00:21:41,803 --> 00:21:43,740 and then pretty soon externally, 463 00:21:43,773 --> 00:21:46,444 that Tesla can build a self-driving car 464 00:21:46,478 --> 00:21:47,848 just with cameras. 465 00:21:47,881 --> 00:21:50,821 You can absolutely be superhuman with just cameras. 466 00:21:50,854 --> 00:21:52,524 As human beings, we have two eyes, 467 00:21:52,558 --> 00:21:54,060 and we manage not to run into each other, 468 00:21:54,093 --> 00:21:56,031 for the most part, when we're walking down the street. 469 00:21:56,064 --> 00:21:58,870 So the idea was, if you put eight cameras around a car, 470 00:21:58,903 --> 00:22:01,809 and essentially gave that car eight eyes, 471 00:22:01,843 --> 00:22:03,580 you could keep it very safe 472 00:22:03,613 --> 00:22:05,382 from the other vehicles that are around it. 473 00:22:06,886 --> 00:22:10,059 There was nothing theoretically preventing that 474 00:22:10,092 --> 00:22:11,061 from happening. 475 00:22:11,094 --> 00:22:12,831 Like, you know, humans do it, 476 00:22:12,865 --> 00:22:14,535 and so there must be a way 477 00:22:14,568 --> 00:22:16,672 eventually for us to do it with cameras. 478 00:22:16,706 --> 00:22:20,747 There was no deep research phase, where various vehicles 479 00:22:20,780 --> 00:22:24,020 were outfitted with a range of sensors. 480 00:22:24,053 --> 00:22:26,358 Many team members would have liked that.
481 00:22:26,391 --> 00:22:29,130 Instead, the conclusion was made first, 482 00:22:29,163 --> 00:22:31,001 and then the tests and development activities 483 00:22:31,034 --> 00:22:34,407 began to prove that conclusion correct. 484 00:22:35,810 --> 00:22:39,417 A big Silicon Valley company plans a major announcement. 485 00:22:39,450 --> 00:22:41,154 Last night at an event in Southern California. 486 00:22:41,187 --> 00:22:42,423 - Holy shit! - Whoa. 487 00:22:42,456 --> 00:22:45,062 Oh, shit. Oh, my God. 488 00:22:45,095 --> 00:22:46,866 The first announcement for Autopilot 489 00:22:46,899 --> 00:22:49,404 was in the fall of 2014. 490 00:22:49,437 --> 00:22:50,941 Welcome, everyone. 491 00:22:50,974 --> 00:22:53,813 So, we've been able to accelerate Autopilot. 492 00:22:53,846 --> 00:22:55,850 At the time, what they're building 493 00:22:55,884 --> 00:22:59,157 is really a driver-assisted system. 494 00:22:59,190 --> 00:23:02,998 The human must stay diligent, 495 00:23:03,031 --> 00:23:04,133 must keep their eyes on the road, 496 00:23:04,167 --> 00:23:05,135 ready to take over at any time. 497 00:23:05,169 --> 00:23:07,774 That's very different from a self-driving car. 498 00:23:07,808 --> 00:23:09,945 It'll detect if there's a car in your blind spot. 499 00:23:09,979 --> 00:23:13,686 But Elon Musk, and by extension Tesla, 500 00:23:13,720 --> 00:23:15,624 decided they were gonna tell people 501 00:23:15,657 --> 00:23:18,763 that we're on the way to a self-driving car. 502 00:23:18,796 --> 00:23:20,533 That's gonna be a selling point. 503 00:23:20,567 --> 00:23:22,003 We have the Autopilot section here. 504 00:23:22,036 --> 00:23:23,405 And you can watch it. 505 00:23:23,438 --> 00:23:24,675 It'll read the speed limit signs, 506 00:23:24,708 --> 00:23:26,512 so we increase speed from 25 to 30. 507 00:23:26,545 --> 00:23:27,648 It's following the white lines. 508 00:23:27,814 --> 00:23:28,917 It's following the white lines. 509 00:23:29,652 --> 00:23:32,758 The car can do almost anything. 510 00:23:32,791 --> 00:23:35,195 He was up there in the lights, 511 00:23:35,229 --> 00:23:36,933 making all sorts of wild claims 512 00:23:36,966 --> 00:23:39,605 about the capabilities of that particular system. 513 00:23:39,638 --> 00:23:42,811 In fact, when you get home, you'll actually be able to 514 00:23:42,844 --> 00:23:44,815 just step out of the car 515 00:23:44,848 --> 00:23:46,852 and have it park itself in your garage. 516 00:23:46,886 --> 00:23:48,923 At some point, he cracked a joke about 517 00:23:48,957 --> 00:23:51,494 how he was gonna say something 518 00:23:51,529 --> 00:23:53,933 that his engineers would hear for the first time. 519 00:23:53,967 --> 00:23:58,175 And then something-something I'd like to do-- 520 00:23:58,208 --> 00:24:00,145 which I think many of our engineers will be hearing this 521 00:24:00,179 --> 00:24:03,218 in real time-- 522 00:24:03,251 --> 00:24:06,525 is have the charge connector plug itself in. 523 00:24:08,596 --> 00:24:11,234 Like an articulating-- like, sort of a snake. 524 00:24:11,267 --> 00:24:13,005 And I thought to myself, buddy, 525 00:24:13,038 --> 00:24:15,075 you already said a lot that your engineers 526 00:24:15,109 --> 00:24:17,514 are hearing for the first time. 527 00:24:17,547 --> 00:24:21,087 The other thing that is really heightened inside Tesla 528 00:24:21,121 --> 00:24:22,958 is you've got Elon Musk 529 00:24:22,991 --> 00:24:26,632 really driving the aura around these cars. 
530 00:24:26,666 --> 00:24:30,239 Autopilot was a very strong focus of Elon. 531 00:24:30,272 --> 00:24:33,613 He sort of created Tesla's brand around it. 532 00:24:33,646 --> 00:24:39,090 You know, I'm confident that, in less than a year, 533 00:24:39,123 --> 00:24:42,798 you'll be able to go from highway onramp to highway exit 534 00:24:42,831 --> 00:24:44,735 without touching any controls. 535 00:24:46,037 --> 00:24:47,607 So people start buying this. 536 00:24:47,641 --> 00:24:50,847 And even then--and this is typical of Tesla, 537 00:24:50,880 --> 00:24:53,218 and Silicon Valley in general-- 538 00:24:53,251 --> 00:24:55,022 it wasn't ready. 539 00:24:55,055 --> 00:24:58,896 Then there's all this pressure inside Tesla to get it done. 540 00:24:58,930 --> 00:25:02,036 Elon sets crazy ambitious goals for himself, 541 00:25:02,069 --> 00:25:04,240 and then that translates to crazy ambitious goals 542 00:25:04,273 --> 00:25:05,710 for people around him. 543 00:25:05,744 --> 00:25:08,583 So, often we describe to recruits, 544 00:25:08,616 --> 00:25:10,285 you are not joining the regular army here. 545 00:25:10,319 --> 00:25:12,624 You're joining special forces. 546 00:25:17,901 --> 00:25:22,944 I got to be an approved Autopilot tester back in 2014. 547 00:25:22,978 --> 00:25:25,149 The first time I drove the car, 548 00:25:25,182 --> 00:25:27,854 went across an intersection in a neighborhood, 549 00:25:27,888 --> 00:25:31,862 and the car went full brake for about half a second, 550 00:25:31,896 --> 00:25:34,267 and then immediately full throttle. 551 00:25:36,037 --> 00:25:38,543 It was really sort of a wakeup call. 552 00:25:38,576 --> 00:25:40,112 This is very experimental. 553 00:25:40,145 --> 00:25:42,684 I'm testing the latest version of Autopilot 554 00:25:42,718 --> 00:25:43,753 every week. 555 00:25:43,786 --> 00:25:47,326 We wanna make sure that our testing is exhaustive 556 00:25:47,359 --> 00:25:50,667 before we release the software. 557 00:25:52,303 --> 00:25:54,107 A lot of the software updates were pushed 558 00:25:54,140 --> 00:25:56,044 to Elon Musk's own car. 559 00:25:56,077 --> 00:25:58,616 Certainly, his opinions were the ones 560 00:25:58,649 --> 00:26:01,856 that were always keeping the team on their toes. 561 00:26:01,889 --> 00:26:04,928 But what it also does is it focuses the team 562 00:26:04,962 --> 00:26:08,736 on certain short-term appeasement projects, 563 00:26:08,770 --> 00:26:11,241 as opposed to developing 564 00:26:11,274 --> 00:26:13,813 a more total, complete solution. 565 00:26:15,817 --> 00:26:19,123 I was concerned about the fact that the software 566 00:26:19,157 --> 00:26:24,802 simply wasn't validated across a wide range of roadways. 567 00:26:24,835 --> 00:26:27,607 And if this is the first step, 568 00:26:27,641 --> 00:26:31,682 in terms of this technology's relationship to the public, 569 00:26:31,715 --> 00:26:35,957 then, you know, it doesn't paint a pretty picture. 570 00:26:35,990 --> 00:26:40,299 And so I started considering other opportunities. 571 00:26:40,332 --> 00:26:42,871 I mean, I almost--this may sound a little complacent, 572 00:26:42,904 --> 00:26:44,775 but I almost view it as, like, a solved problem. 573 00:26:44,808 --> 00:26:46,712 Like, we know exactly what to do, 574 00:26:46,745 --> 00:26:49,117 and we'll be there in a few years. 575 00:26:54,160 --> 00:26:55,663 Are self-driving cars 576 00:26:55,697 --> 00:26:57,066 closer than we think? 
577 00:26:57,099 --> 00:26:59,938 Well, a few days ago, Tesla CEO Elon Musk tweeted, 578 00:26:59,972 --> 00:27:02,176 "Autopilot goes to wide release on Thursday." 579 00:27:02,209 --> 00:27:04,046 Unveiled an Autopilot system, 580 00:27:04,080 --> 00:27:07,954 which allows cars to change lanes by themselves. 581 00:27:07,988 --> 00:27:11,060 It takes a lot of the tedious aspects of driving 582 00:27:11,094 --> 00:27:13,198 and alleviates it from your concern 583 00:27:13,231 --> 00:27:17,039 as you're driving on a long stretch of a highway. 584 00:27:17,072 --> 00:27:19,110 We were really excited to not only show 585 00:27:19,143 --> 00:27:20,647 the technology to the world, 586 00:27:20,680 --> 00:27:23,084 but also show the potential of the technology. 587 00:27:23,118 --> 00:27:24,220 You know, in fact, comfortably 588 00:27:24,253 --> 00:27:27,961 within three years, the car will be able to take you 589 00:27:27,994 --> 00:27:29,397 from point to point-- like, basically 590 00:27:29,430 --> 00:27:31,769 from your driveway to work-- 591 00:27:31,802 --> 00:27:33,205 without you touching anything. 592 00:27:33,238 --> 00:27:35,910 And you could be asleep the whole time. 593 00:27:35,944 --> 00:27:39,183 Tesla, in putting out material, blog posts, 594 00:27:39,217 --> 00:27:42,022 made it clear self-driving cars are not here, 595 00:27:42,056 --> 00:27:43,659 they are a long way away. 596 00:27:50,272 --> 00:27:53,378 When it comes to Tesla and Elon Musk, 597 00:27:53,411 --> 00:27:57,888 the message is constantly going up and down 598 00:27:57,921 --> 00:27:59,156 and up and down, right? 599 00:27:59,190 --> 00:28:01,829 And Elon can change his mind at any moment. 600 00:28:01,862 --> 00:28:03,733 He can say one thing at one moment, 601 00:28:03,766 --> 00:28:05,837 and then he'll say something completely different. 602 00:28:13,920 --> 00:28:16,157 The car can do almost anything. 603 00:28:16,190 --> 00:28:20,800 The expectation is that someone is paying attention to the road 604 00:28:20,833 --> 00:28:23,305 and is ready to take over if there is an issue. 605 00:28:28,381 --> 00:28:32,289 What people need to realize is that it's very easy 606 00:28:32,323 --> 00:28:33,793 to say these things. 607 00:28:33,826 --> 00:28:35,395 And there's no check on him. 608 00:28:35,429 --> 00:28:37,366 They advise you to keep your hands on the steering wheel 609 00:28:37,399 --> 00:28:39,036 when using the auto-steer, 610 00:28:39,070 --> 00:28:42,376 but as we're in testing, you really don't need to. 611 00:28:42,409 --> 00:28:44,748 I think, for a lot of the Tesla fans, 612 00:28:44,781 --> 00:28:49,891 they focus in on the things that Elon Musk says 613 00:28:49,925 --> 00:28:51,461 that they want to hear. 614 00:28:53,064 --> 00:28:57,373 So when he says self-driving is a solved problem, 615 00:28:57,406 --> 00:28:58,809 that's what they hear, 616 00:28:58,843 --> 00:29:01,080 and that's what they pay attention to. 617 00:29:01,114 --> 00:29:02,483 I had an investor say to me, 618 00:29:02,517 --> 00:29:05,757 "You guys have embarked on a really virtuous path, 619 00:29:05,790 --> 00:29:07,293 but it's gonna be a difficult path." 620 00:29:07,326 --> 00:29:10,399 And I said, "Why?" He said, "Well, we're also, you know, 621 00:29:10,432 --> 00:29:12,771 investors in the largest pharma companies in the world, 622 00:29:12,804 --> 00:29:16,444 and it's expected that people will die in a drug trial. 623 00:29:16,477 --> 00:29:20,085 And it happens largely outside of the spotlight of the media."
624 00:29:20,118 --> 00:29:22,557 A number of videos hit the internet showing drivers 625 00:29:22,590 --> 00:29:24,226 going hands-free, 626 00:29:24,260 --> 00:29:28,034 playing games, even sleeping while the car's in motion. 627 00:29:28,068 --> 00:29:30,272 He pointed out that "your challenge is gonna be 628 00:29:30,305 --> 00:29:32,911 quite different with Autopilot, 629 00:29:32,944 --> 00:29:37,486 because people don't expect to tolerate deaths on a highway, 630 00:29:37,520 --> 00:29:39,323 and it's going to be in the spotlight." 631 00:29:39,925 --> 00:29:43,298 Somebody is gonna get in an accident. 632 00:29:43,331 --> 00:29:45,035 Will Tesla be liable for that? 633 00:29:45,068 --> 00:29:46,505 If there's unfortunately an accident, 634 00:29:46,538 --> 00:29:49,443 the driver is in control of the car. 635 00:29:51,615 --> 00:29:53,451 I remember Josh talking to me 636 00:29:53,485 --> 00:29:55,556 about the Autopilot system. 637 00:29:55,590 --> 00:29:58,596 I wasn't there yet, in terms of the technology. 638 00:29:58,629 --> 00:29:59,998 You know, he was all-in. 639 00:30:00,032 --> 00:30:03,171 All right, so this is the new 7.0 firmware 640 00:30:03,204 --> 00:30:05,175 for the Tesla Model S. 641 00:30:05,208 --> 00:30:07,914 I believe that he felt, you know, very qualified 642 00:30:07,947 --> 00:30:11,254 as to how those features, you know, worked. 643 00:30:11,287 --> 00:30:13,224 He would create these videos. 644 00:30:13,258 --> 00:30:15,930 This is gonna be a busy intersection, 645 00:30:15,963 --> 00:30:19,170 just so you can see how it reacts with the traffic. 646 00:30:19,203 --> 00:30:21,007 He was studying it with an eye toward, 647 00:30:21,040 --> 00:30:23,144 "How can I help other people 648 00:30:23,178 --> 00:30:25,349 get the best out of their Tesla, too?" 649 00:30:25,382 --> 00:30:27,854 Here's a turn that the auto-steer 650 00:30:27,887 --> 00:30:29,558 is probably going to do very, very poorly, 651 00:30:29,591 --> 00:30:33,198 'cause it's in a turn that's very sharp. 652 00:30:33,231 --> 00:30:35,035 And yep, it said take control, 653 00:30:35,068 --> 00:30:36,872 and he immediately took control. 654 00:30:38,976 --> 00:30:42,483 I remember something about how Josh had said 655 00:30:42,517 --> 00:30:44,521 the car had helped possibly save his life, 656 00:30:44,554 --> 00:30:46,925 because of something that occurred on the road. 657 00:30:55,543 --> 00:31:01,020 Josh told me that Elon retweeted the video. 658 00:31:01,054 --> 00:31:03,358 And he was so happy about it. 659 00:31:03,391 --> 00:31:05,930 Sort of in the midst of this--you know, 660 00:31:05,963 --> 00:31:08,569 this great wave of technology and what's happening. 661 00:31:08,602 --> 00:31:11,608 And yeah, it was just a great moment for him. 662 00:31:19,658 --> 00:31:22,664 Levy 911, what is the address of your emergency? 663 00:31:22,697 --> 00:31:24,333 There was just a wreck. 664 00:31:24,366 --> 00:31:25,570 A head-on collision right here-- 665 00:31:25,603 --> 00:31:27,439 Oh, my God Almighty. 666 00:31:32,717 --> 00:31:38,328 I was here at work, and probably heard about 667 00:31:38,361 --> 00:31:41,969 the accident an hour after it happened. 668 00:31:44,240 --> 00:31:46,077 Ken called me. 669 00:31:46,110 --> 00:31:50,419 Told me that Josh had been killed in a car accident.
670 00:31:52,657 --> 00:31:55,162 Josh and his family had just been on a vacation 671 00:31:55,195 --> 00:31:58,101 to Disney World, in Orlando, 672 00:31:58,134 --> 00:32:01,608 and that he had said goodbye to everybody, 673 00:32:01,642 --> 00:32:03,646 jumped in his car. 674 00:32:03,679 --> 00:32:06,484 I, Corporal Daphne Yunker, of the Florida Highway Patrol, 675 00:32:06,518 --> 00:32:08,421 am conducting a criminal investigation. 676 00:32:08,455 --> 00:32:12,162 I noticed the car come over the upper grade 677 00:32:12,196 --> 00:32:13,666 and start coming down, 678 00:32:13,699 --> 00:32:17,941 and the semi turned left, started crossing the highway. 679 00:32:17,974 --> 00:32:22,049 I thought the car was gonna stop, and it didn't. 680 00:32:22,082 --> 00:32:25,623 It was like a white explosion, a cloud. 681 00:32:25,656 --> 00:32:29,965 Only thing I could think of was how could this happen? 682 00:32:29,998 --> 00:32:35,008 And my heart was broken for his family. 683 00:32:40,653 --> 00:32:44,995 I was--I was incredulous, and it was a real blow. 684 00:32:45,028 --> 00:32:48,501 Of all people. Of all people. 685 00:32:48,536 --> 00:32:51,174 It did not seem to speed up or slow down. 686 00:32:51,207 --> 00:32:54,046 Drove right through a little grove of trees 687 00:32:54,079 --> 00:32:56,284 at the--at someone's property line. 688 00:32:56,317 --> 00:32:58,254 The first question I think that went through my mind 689 00:32:58,288 --> 00:33:01,327 after the accident-- was he in self-drive? 690 00:33:01,360 --> 00:33:03,732 It was devastating, obviously, to lose a friend, 691 00:33:03,766 --> 00:33:05,503 but it was-- what was frustrating, 692 00:33:05,536 --> 00:33:09,210 I think, to me, was knowing that, 693 00:33:09,243 --> 00:33:12,684 you know, that maybe he was on that front edge of technology, 694 00:33:12,717 --> 00:33:16,057 maybe a little bit further than we would have all liked. 695 00:33:17,627 --> 00:33:19,798 How do you think that that particular crash 696 00:33:19,831 --> 00:33:22,035 that day could have been prevented? 697 00:33:26,712 --> 00:33:28,616 I don't know, because I didn't see anything 698 00:33:28,649 --> 00:33:31,420 that was--you know, I don't know. 699 00:33:33,491 --> 00:33:35,195 When I heard about Josh's accident, 700 00:33:35,228 --> 00:33:38,167 it was personal, in that sense, 701 00:33:38,201 --> 00:33:40,806 that it felt like we'd lost a member of a family. 702 00:33:44,848 --> 00:33:47,052 Elon had an all-hands meeting 703 00:33:47,085 --> 00:33:48,354 for the Autopilot team. 704 00:33:50,325 --> 00:33:52,095 It was just that, you know, this had happened, 705 00:33:52,129 --> 00:33:56,470 and we're doing all we could to figure it out, 706 00:33:56,505 --> 00:34:02,182 and, you know, we do want to try to make Autopilot safe. 707 00:34:04,086 --> 00:34:05,488 At the time of that crash, I was aware 708 00:34:05,523 --> 00:34:08,629 that people were trusting the system to do things 709 00:34:08,662 --> 00:34:12,135 that it was not designed or capable of doing. 710 00:34:12,169 --> 00:34:14,774 The fact that that sort of accident happened 711 00:34:14,808 --> 00:34:17,513 was obviously tragic, but it wasn't really-- 712 00:34:17,547 --> 00:34:19,416 wasn't something that-- 713 00:34:19,450 --> 00:34:20,720 it was going to happen. 714 00:34:22,557 --> 00:34:24,360 It was going to happen. 715 00:34:39,524 --> 00:34:41,394 It was the middle of June. 
716 00:34:41,427 --> 00:34:44,567 I got an email from our investigatory team 717 00:34:44,601 --> 00:34:47,172 that there had been a Tesla fatality. 718 00:34:51,648 --> 00:34:53,852 The National Highway Traffic Safety Administration, 719 00:34:53,886 --> 00:34:57,827 or NHTSA, has the authority to regulate unreasonable risk 720 00:34:57,860 --> 00:34:59,263 to safety on the roads. 721 00:35:01,535 --> 00:35:05,475 Evening of June 29th, we had scheduled a call with Tesla. 722 00:35:05,509 --> 00:35:07,580 Our general counsel let them know 723 00:35:07,614 --> 00:35:09,784 we'd be opening this investigation 724 00:35:09,818 --> 00:35:12,289 and that it would be made public the following day. 725 00:35:13,826 --> 00:35:16,297 At that point, Elon Musk came on 726 00:35:16,330 --> 00:35:19,436 and just sort of started shouting. 727 00:35:19,470 --> 00:35:20,806 He was really, really upset 728 00:35:20,840 --> 00:35:22,877 that we'd be opening a public investigation, 729 00:35:22,910 --> 00:35:27,386 accusing us of singling Tesla out. 730 00:35:27,419 --> 00:35:29,356 Made the point several times that, you know, 731 00:35:29,390 --> 00:35:33,164 this was one fatality out of more than 35,000 a year, 732 00:35:33,197 --> 00:35:35,836 so why were we picking on Tesla 733 00:35:35,870 --> 00:35:37,607 and suggesting that he would sue us 734 00:35:37,640 --> 00:35:39,276 for opening this investigation. 735 00:35:41,180 --> 00:35:43,752 I was surprised to hear Elon on the call. 736 00:35:43,786 --> 00:35:46,357 I was surprised to hear how angry he was. 737 00:35:48,328 --> 00:35:50,198 But ultimately, none of that mattered. 738 00:35:50,231 --> 00:35:53,171 Our job was only to worry about the safety, 739 00:35:53,204 --> 00:35:55,375 and this was a clear issue of safety 740 00:35:55,408 --> 00:35:57,580 that needed to be investigated. 741 00:35:59,483 --> 00:36:02,623 The driver of the semi reported that the Navy vet 742 00:36:02,657 --> 00:36:06,297 was watching a movie while driving. 743 00:36:06,330 --> 00:36:08,702 Not very long after the accident, 744 00:36:08,736 --> 00:36:10,372 there were all these people 745 00:36:10,405 --> 00:36:13,378 saying really just crass things, 746 00:36:13,411 --> 00:36:18,555 claiming, you know, that Josh was watching a program. 747 00:36:20,526 --> 00:36:22,530 That's not Josh. 748 00:36:22,563 --> 00:36:23,966 Guarantee you that's not Josh. 749 00:36:23,999 --> 00:36:25,569 What investigators are looking for 750 00:36:25,603 --> 00:36:29,176 is the data leading up to the accident. 751 00:36:30,646 --> 00:36:34,687 One of the huge challenges of the system at the time 752 00:36:34,721 --> 00:36:39,531 was trying to differentiate between a truck 753 00:36:39,564 --> 00:36:42,402 and a bridge-- and an overhead bridge. 754 00:36:42,436 --> 00:36:44,541 You know, when a truck is parked perpendicular 755 00:36:44,574 --> 00:36:46,410 to the road and blocking the way, 756 00:36:46,444 --> 00:36:48,549 the system might think of it as an overhead bridge, 757 00:36:48,582 --> 00:36:51,888 and so it was safe to kind of continue driving through it. 758 00:36:51,922 --> 00:36:53,726 Tesla posted to their blog, 759 00:36:53,759 --> 00:36:55,930 calling this incident a tragic loss. 760 00:36:55,963 --> 00:36:58,501 Tesla said the car ran into a tractor-trailer 761 00:36:58,535 --> 00:36:59,938 because the software didn't notice 762 00:36:59,971 --> 00:37:05,850 the white side of the truck in the brightly lit sky.
763 00:37:05,883 --> 00:37:07,352 After the crash, 764 00:37:07,385 --> 00:37:11,260 I think Tesla and Musk were pretty defensive. 765 00:37:11,293 --> 00:37:12,864 If, in writing some article that's negative, 766 00:37:12,897 --> 00:37:14,500 you effectively dissuade people 767 00:37:14,534 --> 00:37:15,770 from using autonomous vehicle, 768 00:37:15,803 --> 00:37:16,938 you're killing people. 769 00:37:16,972 --> 00:37:19,343 In the statement that Tesla put out, 770 00:37:19,376 --> 00:37:21,815 they more or less said it was driver error. 771 00:37:23,050 --> 00:37:26,390 They reminded people you have to keep your eyes on the road. 772 00:37:26,424 --> 00:37:27,827 They didn't say Joshua Brown 773 00:37:27,860 --> 00:37:29,430 didn't keep his eyes on the road, 774 00:37:29,463 --> 00:37:31,300 but that's what they were implying. 775 00:37:31,333 --> 00:37:32,770 Tesla says you should keep your hands 776 00:37:32,804 --> 00:37:35,408 on the steering wheel during the Autopilot. 777 00:37:35,442 --> 00:37:37,713 The question then is... - both: What is the point? 778 00:37:37,747 --> 00:37:38,715 - We both said it. - If I have to hold on 779 00:37:38,749 --> 00:37:40,418 to the wheel? 780 00:37:40,452 --> 00:37:42,790 Elon had already talked a pretty big game 781 00:37:42,824 --> 00:37:44,994 about what this technology was going to do. 782 00:37:45,996 --> 00:37:47,633 It's kinda hard to reel it back 783 00:37:47,667 --> 00:37:49,904 if you've already raised people's expectations 784 00:37:49,938 --> 00:37:51,173 and excitement. 785 00:37:51,207 --> 00:37:54,012 Do you have any regrets about how Tesla rolled out Autopilot 786 00:37:54,046 --> 00:37:55,783 in the cars? 787 00:37:55,816 --> 00:37:57,987 No, I think--I think we did the right thing. 788 00:37:58,020 --> 00:38:00,693 You know, it's basically advanced driver's assistance, 789 00:38:00,726 --> 00:38:02,395 at this point. 790 00:38:02,429 --> 00:38:04,366 Every single step we took, at least from our standpoint, 791 00:38:04,399 --> 00:38:07,507 was to reduce complacency in the use of Autopilot 792 00:38:07,540 --> 00:38:09,009 and to improve safety. 793 00:38:13,117 --> 00:38:15,556 This is new technology that's on the roads. 794 00:38:15,589 --> 00:38:17,359 People have a lot of questions. 795 00:38:17,392 --> 00:38:19,898 This one crash was an opportunity to sort of say, 796 00:38:19,931 --> 00:38:21,701 is there a technological problem here 797 00:38:21,735 --> 00:38:24,406 with this--you know, with this Autopilot suite? 798 00:38:26,678 --> 00:38:28,982 The first thing we did was go to Tesla and say, 799 00:38:29,016 --> 00:38:31,922 "Hey, give us all the data you have on crashes 800 00:38:31,955 --> 00:38:33,057 on where Autopilot is in use." 801 00:38:35,629 --> 00:38:37,098 What we knew is that there were a lot of crashes. 802 00:38:37,132 --> 00:38:39,036 And this is not surprising and necessarily 803 00:38:39,069 --> 00:38:40,639 a cause for concern. 804 00:38:40,673 --> 00:38:44,948 There are a lot of crashes on the roadways in the U. S. 805 00:38:44,981 --> 00:38:47,118 So yeah, there's 38 separate crashes 806 00:38:47,152 --> 00:38:49,356 that we're looking at here. 807 00:38:49,389 --> 00:38:51,928 The world doesn't know about these other crashes, 808 00:38:51,962 --> 00:38:54,399 because Tesla hasn't made it public. 809 00:38:54,433 --> 00:38:57,607 Tesla's saying Autopilot is safer, 810 00:38:57,640 --> 00:38:59,577 but what we're seeing with these crashes 811 00:38:59,611 --> 00:39:01,480 are these gray areas. 
812 00:39:01,515 --> 00:39:03,852 In the Tesla case, what we were looking at was: 813 00:39:03,886 --> 00:39:05,121 was there a pattern showing 814 00:39:05,154 --> 00:39:07,092 that there is a technological defect, 815 00:39:07,125 --> 00:39:08,929 or that people were using Autopilot 816 00:39:08,962 --> 00:39:11,735 beyond the way that it was designed to be used? 817 00:39:11,768 --> 00:39:13,572 The internal pressure was, 818 00:39:13,605 --> 00:39:15,475 "We gotta get this problem solved, pronto." 819 00:39:17,547 --> 00:39:18,916 When Autopilot first came out, 820 00:39:18,949 --> 00:39:21,387 the main way of making sure the driver 821 00:39:21,420 --> 00:39:23,124 was paying attention-- it could detect 822 00:39:23,157 --> 00:39:25,796 whether your hand was on the steering wheel. 823 00:39:25,829 --> 00:39:28,100 It would let you keep your hand off the steering wheel 824 00:39:28,134 --> 00:39:30,873 for minutes at a time--three, four, five minutes. 825 00:39:31,841 --> 00:39:34,714 It was just too long between those warnings. 826 00:39:34,747 --> 00:39:36,685 What we had to do was struggle with how to do that 827 00:39:36,718 --> 00:39:39,791 in an elegant way that would keep consumers engaged 828 00:39:39,824 --> 00:39:43,599 and not--and not cause them to ignore or be frustrated by it. 829 00:39:43,632 --> 00:39:45,636 After weeks of controversy and questions 830 00:39:45,669 --> 00:39:46,805 about the safety of its Autopilot drivers. 831 00:39:46,838 --> 00:39:48,474 What it calls major improvements 832 00:39:48,508 --> 00:39:49,844 to its Autopilot software. 833 00:39:50,846 --> 00:39:53,050 They announced there would be this press conference, 834 00:39:53,084 --> 00:39:54,386 and Elon would talk about it. 835 00:39:54,988 --> 00:39:56,925 Something quite significant is, 836 00:39:56,958 --> 00:39:59,530 if the user ignores repeated warnings, 837 00:39:59,564 --> 00:40:01,433 more than three times in an hour, 838 00:40:01,467 --> 00:40:04,908 then the driver will have to park the car and restart it. 839 00:40:04,941 --> 00:40:08,582 There would be more frequent warnings, shorter intervals, 840 00:40:08,615 --> 00:40:10,552 up to three minutes. 841 00:40:10,586 --> 00:40:12,623 You would get a chime to remind you 842 00:40:12,657 --> 00:40:15,896 to put your hands back on. 843 00:40:15,930 --> 00:40:19,069 But that's still a system that has a lot of gaps. 844 00:40:19,102 --> 00:40:21,608 In terms of taking your eyes off the road, 845 00:40:21,641 --> 00:40:24,514 30 seconds is an eternity. 846 00:40:24,547 --> 00:40:26,618 I really feel like we've struck a great balance 847 00:40:26,651 --> 00:40:31,661 between improving the safety and improving the usefulness. 848 00:40:31,695 --> 00:40:35,669 I remember Elon talked about how it was gonna be the radar 849 00:40:35,703 --> 00:40:40,879 that was sort of first-rank or priority one. 850 00:40:40,913 --> 00:40:44,721 We're making much more effective use of radar. 851 00:40:44,754 --> 00:40:49,496 I just thought radar has been around for 75 years. 852 00:40:49,530 --> 00:40:53,505 If they could do this now, why didn't they do it before? 853 00:40:53,538 --> 00:40:55,141 I think the timing was significant. 854 00:40:55,174 --> 00:40:59,917 I mean, it was right after this tragic accident. 855 00:40:59,951 --> 00:41:02,990 And they were trying to make it sound like, 856 00:41:03,024 --> 00:41:04,994 "We got this under control."
857 00:41:05,028 --> 00:41:06,931 Obvious question I have to ask: 858 00:41:06,965 --> 00:41:09,771 would the improvements have mitigated 859 00:41:09,804 --> 00:41:12,710 or saved, say, Josh Brown's life? 860 00:41:15,248 --> 00:41:17,820 We believe it would have. 861 00:41:17,853 --> 00:41:21,226 And so, the truck would have been seen by the radar only, 862 00:41:21,260 --> 00:41:23,230 and braking would have been engaged. 863 00:41:26,538 --> 00:41:29,042 These things cannot be said with absolute certainty, 864 00:41:29,076 --> 00:41:31,246 but we believe it is very likely that, 865 00:41:31,280 --> 00:41:33,652 yes, it would have. 866 00:41:39,597 --> 00:41:40,899 Yeah, I mean, there have been so many announcements of, 867 00:41:40,933 --> 00:41:42,001 like, autonomous EV startups. 868 00:41:42,035 --> 00:41:43,270 I'm waiting for my mom to announce one. 869 00:41:43,304 --> 00:41:44,574 Okay. 870 00:41:44,607 --> 00:41:47,212 It's like, "Mom, you too?" 871 00:41:47,245 --> 00:41:49,182 Speaking of that, when you're talking about the sales, 872 00:41:49,216 --> 00:41:52,523 you have booked how many orders for-- 873 00:41:52,556 --> 00:41:54,292 - It's on the order of 400,000. - 400,000. 874 00:41:54,326 --> 00:41:56,096 It's quite surprising, actually. 875 00:41:56,130 --> 00:41:58,602 I mean, the-- 876 00:41:58,635 --> 00:42:00,839 'cause we didn't do any advertising. 877 00:42:00,873 --> 00:42:02,843 Elon had, I think, in some ways, 878 00:42:02,877 --> 00:42:04,246 a personal point of pride 879 00:42:04,279 --> 00:42:07,586 to be able to move faster than the competition. 880 00:42:11,293 --> 00:42:13,698 The company was betting its survival 881 00:42:13,732 --> 00:42:15,903 on the success of the Model 3. 882 00:42:15,936 --> 00:42:18,007 And the fact that Autopilot was gonna be on it, 883 00:42:18,040 --> 00:42:19,877 I think was a huge selling point. 884 00:42:19,911 --> 00:42:22,550 If you think about fully autonomous vehicles, 885 00:42:22,583 --> 00:42:25,254 how far do you think we are from that becoming a reality? 886 00:42:25,288 --> 00:42:27,727 I think we're basically 887 00:42:27,760 --> 00:42:30,866 less than two years away from complete autonomy. 888 00:42:30,899 --> 00:42:33,638 - Wow. - Complete. Safer than a human. 889 00:42:33,672 --> 00:42:35,709 As with a lot of what happens with Elon, 890 00:42:35,743 --> 00:42:38,247 he doubles down on it over and over and over again. 891 00:42:38,280 --> 00:42:40,017 And he continues with his message, right, 892 00:42:40,051 --> 00:42:41,755 that, you know, this is gonna be 893 00:42:41,788 --> 00:42:43,625 a safe thing for the world. 894 00:42:43,658 --> 00:42:45,161 You know, the Joshua Brown crash 895 00:42:45,194 --> 00:42:47,600 was in the spring of 2016. 896 00:42:47,633 --> 00:42:50,773 By the fall of 2016, the entire Autopilot team 897 00:42:50,806 --> 00:42:53,779 essentially quit what they were doing, 898 00:42:53,812 --> 00:42:57,185 and they all chipped in on this video 899 00:42:57,218 --> 00:43:00,926 to show just how autonomous, 900 00:43:00,959 --> 00:43:02,863 so to speak, their car could be. 901 00:43:19,329 --> 00:43:21,668 Do you remember this video? 902 00:43:21,701 --> 00:43:23,337 Yeah. 903 00:43:23,370 --> 00:43:25,976 Changed lanes, and stopped just in-- 904 00:43:26,009 --> 00:43:27,312 just short of a crosswalk. 
905 00:43:27,345 --> 00:43:29,884 We're turning right onto-- yeah, kind of 906 00:43:29,917 --> 00:43:31,286 in front of traffic, but-- 907 00:43:31,320 --> 00:43:35,261 It's very slick, but the video does not give you 908 00:43:35,294 --> 00:43:39,136 a full impression of what is actually happening. 909 00:43:39,169 --> 00:43:40,873 The people that were putting it together 910 00:43:40,906 --> 00:43:43,812 were sitting right behind me. 911 00:43:43,845 --> 00:43:47,185 And the Autopilot group was running lap after lap that day, 912 00:43:47,218 --> 00:43:49,055 to try to get a clean lap. 913 00:43:49,089 --> 00:43:53,732 At one point, the car, while in Autopilot mode, 914 00:43:53,765 --> 00:43:55,736 hit a fence. 915 00:43:55,769 --> 00:43:58,374 They patched the car up, and they did another run. 916 00:43:58,407 --> 00:43:59,977 And so, at the very end of the day, 917 00:44:00,011 --> 00:44:02,115 apparently the clean lap came in. 918 00:44:02,148 --> 00:44:04,019 They started editing it all together. 919 00:44:06,089 --> 00:44:07,993 This was meant to be a demo video 920 00:44:08,027 --> 00:44:09,864 of what the team was working on and developing, 921 00:44:09,897 --> 00:44:13,204 and what its capability could deliver in the future. 922 00:44:13,237 --> 00:44:15,074 I think my biggest problem with the video 923 00:44:15,107 --> 00:44:18,147 was the first line that says the driver 924 00:44:18,180 --> 00:44:20,051 was only there for legal reasons. 925 00:44:22,088 --> 00:44:23,792 I think it's definitely language 926 00:44:23,825 --> 00:44:25,461 that's designed for marketing. 927 00:44:25,494 --> 00:44:27,766 We are trying to imply that the thing 928 00:44:27,800 --> 00:44:29,236 is fully capable of self-driving, 929 00:44:29,269 --> 00:44:33,077 and only the evil regulators are holding us back. 930 00:44:33,110 --> 00:44:34,179 They sort of portrayed it as something 931 00:44:34,212 --> 00:44:35,381 all their cars can do, 932 00:44:35,414 --> 00:44:38,454 and that, I don't think, was really fair. 933 00:44:41,426 --> 00:44:43,030 Not too long after that, 934 00:44:43,063 --> 00:44:45,836 Tesla started offering an official service called 935 00:44:45,869 --> 00:44:51,881 Full Self-Driving, capital FSD, for as much as $10,000. 936 00:44:51,915 --> 00:44:53,384 Now, in the short-term, 937 00:44:53,417 --> 00:44:56,056 what they and everybody else were really buying 938 00:44:56,089 --> 00:44:59,162 was the promise that this is gonna happen. 939 00:44:59,195 --> 00:45:01,466 The idea was we were putting the hardware on every car 940 00:45:01,500 --> 00:45:03,838 in advance of having the software. 941 00:45:03,872 --> 00:45:06,811 It was a gutsy move because then the software 942 00:45:06,844 --> 00:45:10,919 had to be developed to deliver the capability. 943 00:45:10,953 --> 00:45:12,488 We're still on track for being able 944 00:45:12,523 --> 00:45:15,796 to go cross-country, from L.A. to New York 945 00:45:15,829 --> 00:45:18,100 by the end of the year, fully autonomous. 946 00:45:18,133 --> 00:45:20,505 There was a sincere belief inside of Tesla, and Elon 947 00:45:20,539 --> 00:45:21,941 had the sincere belief that 948 00:45:21,975 --> 00:45:24,112 hey, we're just around the corner.
949 00:45:24,145 --> 00:45:25,816 A lot of people at the time believed 950 00:45:25,849 --> 00:45:27,820 that Tesla had an advantage 951 00:45:27,853 --> 00:45:30,291 in getting self-driving to the market first, 952 00:45:30,324 --> 00:45:33,097 because it already had all the cars on the road 953 00:45:33,130 --> 00:45:35,234 that could be collecting data all the time, 954 00:45:35,268 --> 00:45:39,442 and that data would help train the computer to be better. 955 00:45:39,476 --> 00:45:41,179 So, you've already got a fleet of Teslas 956 00:45:41,213 --> 00:45:43,384 driving all these roads. - Yeah. 957 00:45:43,417 --> 00:45:45,288 You're accumulating a huge amount of data. 958 00:45:45,321 --> 00:45:46,891 Yes. 959 00:45:46,925 --> 00:45:50,298 I expected to see sophisticated infrastructure 960 00:45:50,331 --> 00:45:54,239 to collect that data, to process that data. 961 00:45:54,272 --> 00:45:57,078 The reality was a lot of the types of data 962 00:45:57,111 --> 00:45:59,416 that you would want to collect from the car, 963 00:45:59,449 --> 00:46:03,157 like video data, high-quality images, 964 00:46:03,190 --> 00:46:06,998 there was neither the hardware nor the backend infrastructure 965 00:46:07,031 --> 00:46:10,872 to allow that volume of data to reach Tesla. 966 00:46:10,906 --> 00:46:14,346 And so that rate of learning wasn't great. 967 00:46:16,183 --> 00:46:18,922 Elon, he put eight cameras on the car. 968 00:46:18,955 --> 00:46:20,358 I don't think that was enough, 969 00:46:20,391 --> 00:46:22,963 because they were not redundant, 970 00:46:22,997 --> 00:46:24,967 other than the front cameras. 971 00:46:25,001 --> 00:46:27,539 You really need redundancy, so if one of these sensors fails, 972 00:46:27,573 --> 00:46:30,912 the car can either stop itself in a safe manner 973 00:46:30,946 --> 00:46:32,850 or it can continue driving. 974 00:46:32,883 --> 00:46:35,321 There was a small space right in front of the car 975 00:46:35,354 --> 00:46:40,097 that was completely out of view of any of the cameras. 976 00:46:40,131 --> 00:46:41,934 And so, you know, a small dog 977 00:46:41,968 --> 00:46:44,907 or a baby could crawl in front of the car, 978 00:46:44,940 --> 00:46:47,078 and the car wouldn't be able to know 979 00:46:47,111 --> 00:46:51,253 whether it's safe to move forward or start to drive. 980 00:46:56,163 --> 00:46:58,233 It was hard for me to personally believe 981 00:46:58,267 --> 00:47:00,872 that promise was gonna be lived up to, 982 00:47:00,906 --> 00:47:02,208 that we could be confident 983 00:47:02,241 --> 00:47:04,547 that this was gonna enable full self-driving. 984 00:47:07,352 --> 00:47:09,957 Sometime after the Joshua Brown crash, 985 00:47:09,990 --> 00:47:13,063 the head of Autopilot left Tesla. 986 00:47:13,097 --> 00:47:14,332 You know, it just gave the image 987 00:47:14,366 --> 00:47:16,370 of some sort of instability there. 988 00:47:16,403 --> 00:47:18,975 There was a sense that when Elon felt that things 989 00:47:19,009 --> 00:47:20,111 were not going well, 990 00:47:20,144 --> 00:47:23,017 there were efforts to shake things up. 991 00:47:23,050 --> 00:47:26,256 There were members of the team that I learned were fired. 992 00:47:26,289 --> 00:47:29,864 I never knew why. They just stopped showing up. 993 00:47:34,339 --> 00:47:38,447 Theranos was happening during that same time period.
994 00:47:38,480 --> 00:47:41,019 And a lot of the stories were kind of, like, 995 00:47:41,053 --> 00:47:42,388 at the back of my mind, 996 00:47:42,422 --> 00:47:46,263 and it just definitely made me question a lot more 997 00:47:46,296 --> 00:47:51,874 about what's behind some of this public optimism. 998 00:47:56,684 --> 00:47:58,253 After I left Tesla, 999 00:47:58,287 --> 00:48:01,627 I felt like I had to do a bit of soul searching, 1000 00:48:01,661 --> 00:48:06,403 just because I feel like sometimes it seems like 1001 00:48:06,436 --> 00:48:09,042 people and companies 1002 00:48:09,075 --> 00:48:12,248 were being rewarded not for telling the truth 1003 00:48:12,281 --> 00:48:15,922 but in fact for doing maybe a bit of the opposite. 1004 00:48:20,297 --> 00:48:22,368 It was my last day on the job at NHTSA 1005 00:48:22,401 --> 00:48:26,109 when we were ready to release that report. 1006 00:48:26,143 --> 00:48:28,147 It was the end of the Obama Administration, 1007 00:48:28,180 --> 00:48:30,552 and so we made sort of an internal commitment 1008 00:48:30,585 --> 00:48:34,192 to say we're not gonna leave this to the next guys. 1009 00:48:34,225 --> 00:48:37,131 A months-long investigation into Tesla's Autopilot system 1010 00:48:37,165 --> 00:48:39,002 has wrapped up. - There was no defect, 1011 00:48:39,035 --> 00:48:40,572 and therefore there will be no recall 1012 00:48:40,605 --> 00:48:42,275 related to Tesla's Autopilot. 1013 00:48:42,308 --> 00:48:45,582 Essentially clearing the company. 1014 00:48:45,615 --> 00:48:48,353 I was a little bit dumbfounded. 1015 00:48:48,387 --> 00:48:53,063 The system couldn't see a tractor-trailer, 1016 00:48:53,096 --> 00:48:55,100 and that's not a defect? 1017 00:48:58,508 --> 00:49:00,277 So, this is--you know, it's a little complicated, 1018 00:49:00,311 --> 00:49:02,649 and almost counterintuitive, right? 1019 00:49:02,683 --> 00:49:06,123 Autopilot didn't even engage to try to stop that crash. 1020 00:49:06,156 --> 00:49:08,093 But the fact of the matter is Autopilot 1021 00:49:08,127 --> 00:49:12,468 wasn't designed to stop every crash in every instance. 1022 00:49:12,502 --> 00:49:15,040 It was a driver-assistance system. 1023 00:49:15,074 --> 00:49:17,546 It wasn't a full self-driving system. 1024 00:49:17,579 --> 00:49:20,084 Tesla issued a statement saying it appreciated 1025 00:49:20,117 --> 00:49:22,321 the thoroughness of the investigation. 1026 00:49:22,355 --> 00:49:25,361 My personal point of view was it's clear this technology 1027 00:49:25,394 --> 00:49:27,198 is being misused right now. 1028 00:49:27,231 --> 00:49:30,070 We saw people were pushing the limits on the system, 1029 00:49:30,104 --> 00:49:31,406 but--and this was early on 1030 00:49:31,439 --> 00:49:33,545 in the deployment of the technology, 1031 00:49:33,578 --> 00:49:37,318 and at the time, there wasn't enough data to show 1032 00:49:37,351 --> 00:49:39,489 that there was a technological defect here. 1033 00:49:40,525 --> 00:49:44,767 I remember the day the news came out that crashes dropped 1034 00:49:44,800 --> 00:49:49,577 40% after the Autopilot component was added. 1035 00:49:49,610 --> 00:49:51,547 A lot of news articles repeated it 1036 00:49:51,581 --> 00:49:53,317 because NHTSA had said it, 1037 00:49:53,350 --> 00:49:55,755 and that gave it some legitimacy. 1038 00:49:55,789 --> 00:49:58,561 You know, I mean, if the regulators are saying it, 1039 00:49:58,595 --> 00:50:01,233 it must be true. 
1040 00:50:01,266 --> 00:50:03,103 You know, I think that's an unfortunate statistic 1041 00:50:03,136 --> 00:50:05,474 that didn't probably belong in the report. 1042 00:50:05,509 --> 00:50:07,513 It was based on data provided by the company 1043 00:50:07,546 --> 00:50:09,182 that hadn't been sort of 1044 00:50:09,215 --> 00:50:12,421 independently verified or vetted. 1045 00:50:12,455 --> 00:50:15,160 Eventually, some independent researchers 1046 00:50:15,194 --> 00:50:16,764 started looking at the crash data 1047 00:50:16,797 --> 00:50:20,070 and didn't believe it was valid. 1048 00:50:20,104 --> 00:50:23,277 But Tesla was very eager to pick up on that statistic 1049 00:50:23,310 --> 00:50:26,717 and use it to sort of say "not only is Autopilot good, 1050 00:50:26,751 --> 00:50:28,420 it's better than human drivers." 1051 00:50:28,453 --> 00:50:31,794 NHTSA did a study on Tesla's Autopilot version 1, 1052 00:50:31,827 --> 00:50:33,731 which was relatively primitive, 1053 00:50:33,765 --> 00:50:38,040 and found that it was a 45% reduction in highway accidents. 1054 00:50:38,073 --> 00:50:40,679 You know, I think that was a successful PR move 1055 00:50:40,712 --> 00:50:42,381 on their part. 1056 00:50:42,415 --> 00:50:45,487 Musk and Tesla, they're master marketers. 1057 00:50:54,807 --> 00:50:57,679 What scares you the most about autonomous cars? 1058 00:50:57,713 --> 00:51:01,654 I think people are wildly underestimating the complexity 1059 00:51:01,687 --> 00:51:03,725 of bringing automation into this system. 1060 00:51:06,797 --> 00:51:09,168 My name is Christopher Hart. I am the former Chairman 1061 00:51:09,202 --> 00:51:12,241 of the National Transportation Safety Board. 1062 00:51:12,275 --> 00:51:14,178 The NTSB is the federal agency 1063 00:51:14,212 --> 00:51:15,715 that was created to investigate 1064 00:51:15,749 --> 00:51:17,786 transportation accidents and make recommendations 1065 00:51:17,819 --> 00:51:19,188 to try to prevent the accidents 1066 00:51:19,222 --> 00:51:21,761 from happening again. 1067 00:51:21,794 --> 00:51:24,399 When I first heard about that Tesla crash, 1068 00:51:24,432 --> 00:51:27,405 I knew enough about automation from my own aviation experience 1069 00:51:27,438 --> 00:51:29,677 that I knew it was not gonna be as simple as people thought. 1070 00:51:30,679 --> 00:51:34,352 It took a year-plus to investigate. 1071 00:51:34,385 --> 00:51:37,559 Then, September 2017, there was a public hearing. 1072 00:51:37,592 --> 00:51:38,795 Welcome to the boardroom 1073 00:51:38,828 --> 00:51:40,832 of the National Transportation Safety Board. 1074 00:51:40,865 --> 00:51:43,538 We were very curious about this particular crash. 1075 00:51:43,571 --> 00:51:47,779 And the further we got into it, the more we started realizing 1076 00:51:47,813 --> 00:51:49,650 that, wow, there are a lot of issues here 1077 00:51:49,683 --> 00:51:51,386 that really need to be looked at. 1078 00:51:51,419 --> 00:51:52,723 It is our sincere hope 1079 00:51:52,756 --> 00:51:55,729 that the lessons learned from this tragedy 1080 00:51:55,762 --> 00:51:59,135 can help prevent future tragedies. 1081 00:51:59,168 --> 00:52:03,778 An accident is rarely the result of just one factor. 1082 00:52:03,811 --> 00:52:07,318 In this crash, one would be, of course, the truck driver 1083 00:52:07,351 --> 00:52:11,192 pulling across the lane, when he shouldn't have. 
1084 00:52:11,226 --> 00:52:14,365 But I would say that there was also the automation complacency 1085 00:52:14,399 --> 00:52:17,572 associated with the design of the Tesla vehicle. 1086 00:52:19,342 --> 00:52:20,578 For up to ten seconds, 1087 00:52:20,612 --> 00:52:22,481 there would have been a line of sight 1088 00:52:22,516 --> 00:52:24,853 between this Tesla and the vehicle 1089 00:52:24,887 --> 00:52:26,524 that was crossing in front of him, 1090 00:52:26,557 --> 00:52:30,699 so there was the opportunity to avoid this crash. 1091 00:52:30,732 --> 00:52:33,671 We could not determine exactly what he was doing 1092 00:52:33,705 --> 00:52:36,811 in this crash. 1093 00:52:36,844 --> 00:52:38,447 We certainly heard those rumors 1094 00:52:38,480 --> 00:52:41,219 about the driver watching videos. 1095 00:52:41,252 --> 00:52:43,958 But we had no evidence of that at all. 1096 00:52:43,991 --> 00:52:46,564 Did you find any evidence at all 1097 00:52:46,597 --> 00:52:48,233 that the driver of the Tesla 1098 00:52:48,266 --> 00:52:51,373 may have been watching a movie while driving this car? 1099 00:52:51,406 --> 00:52:53,276 We looked through his laptop, 1100 00:52:53,310 --> 00:52:56,449 and there were no movies on that laptop. 1101 00:52:56,483 --> 00:53:00,357 We, at the NTSB, really feel like the drivers 1102 00:53:00,391 --> 00:53:04,867 have the tendency to disengage when the Autopilot 1103 00:53:04,900 --> 00:53:07,939 is engaged with the Tesla vehicle. 1104 00:53:07,973 --> 00:53:09,810 Complacency creeps in over time, 1105 00:53:09,843 --> 00:53:12,348 and you develop overconfidence with the system. 1106 00:53:12,381 --> 00:53:15,387 I was concerned about the use of the term "Autopilot," 1107 00:53:15,421 --> 00:53:16,724 because there are too many people 1108 00:53:16,757 --> 00:53:18,226 who construe the term "Autopilot" 1109 00:53:18,260 --> 00:53:21,534 to mean human engagement is no longer necessary. 1110 00:53:21,567 --> 00:53:23,504 They advise you to keep your hands on the steering wheel 1111 00:53:23,538 --> 00:53:24,907 when using the auto-steer, 1112 00:53:24,940 --> 00:53:28,514 but as we're in testing, you really don't need to. 1113 00:53:28,548 --> 00:53:30,585 The Autopilot 1114 00:53:30,618 --> 00:53:31,987 is supposed to have a system 1115 00:53:32,021 --> 00:53:35,595 where it can detect driver engagement. 1116 00:53:35,628 --> 00:53:36,997 There were periods of time-- 1117 00:53:37,031 --> 00:53:39,502 almost six minutes-- where his hands 1118 00:53:39,536 --> 00:53:41,907 were not even detected to be on the steering wheel. 1119 00:53:41,941 --> 00:53:44,813 We felt that the system of determining 1120 00:53:44,847 --> 00:53:49,422 driver engagement was poor. 1121 00:53:49,455 --> 00:53:51,326 Another issue is at that time, 1122 00:53:51,359 --> 00:53:52,996 no manufacturer had a system 1123 00:53:53,029 --> 00:53:55,602 to reliably sense crossing traffic. 1124 00:53:55,635 --> 00:53:57,572 That's why these systems are supposed to be used 1125 00:53:57,606 --> 00:54:00,845 only on roads that don't have crossing traffic. 1126 00:54:00,879 --> 00:54:03,885 This road, Highway 27A, in Florida, 1127 00:54:03,918 --> 00:54:07,491 was not a limited-access road. But the question is, 1128 00:54:07,526 --> 00:54:10,297 if the system is not supposed to be operated 1129 00:54:10,330 --> 00:54:13,336 on anything other than a highway, 1130 00:54:13,370 --> 00:54:16,844 why does the system allow it to be operated 1131 00:54:16,877 --> 00:54:19,616 on other types of roadways?
1132 00:54:19,650 --> 00:54:22,823 It's like having a swimming pool without a fence around it. 1133 00:54:22,856 --> 00:54:25,494 It's a--it's an attractive nuisance. 1134 00:54:25,528 --> 00:54:28,534 Tesla allowed the driver to use the system 1135 00:54:28,568 --> 00:54:31,540 outside of the environment for which it was designed. 1136 00:54:31,574 --> 00:54:35,380 The result was a collision that, frankly, 1137 00:54:35,414 --> 00:54:36,584 should have never happened. 1138 00:54:38,654 --> 00:54:41,326 The ultimate paradox is that the better the automation gets 1139 00:54:41,359 --> 00:54:43,831 after removing the human, the more challenging 1140 00:54:43,865 --> 00:54:48,006 the human-automation interface issues become. 1141 00:54:48,039 --> 00:54:49,510 While the human is the most 1142 00:54:49,543 --> 00:54:50,879 unpredictable and variable part 1143 00:54:50,912 --> 00:54:53,383 of the whole system, it is also, at the same time, 1144 00:54:53,416 --> 00:54:55,053 the most adaptable part of the system 1145 00:54:55,087 --> 00:54:58,026 when you need adaptation. 1146 00:54:58,059 --> 00:54:59,796 I think Elon is an IT wizard, 1147 00:54:59,830 --> 00:55:02,501 and I think that IT wizardry is going to help us 1148 00:55:02,536 --> 00:55:05,007 get where we want to go, but we need to do it in a way 1149 00:55:05,040 --> 00:55:07,746 that encompasses the human element as well. 1150 00:55:07,779 --> 00:55:09,683 I think that the manufacturers 1151 00:55:09,716 --> 00:55:15,628 have a role in preventing this automation complacency. 1152 00:55:15,662 --> 00:55:19,068 You can't buy a production model self-driving car 1153 00:55:19,101 --> 00:55:22,374 from any automobile maker today. 1154 00:55:22,408 --> 00:55:25,113 Anyone who says that you can is misleading you, 1155 00:55:25,147 --> 00:55:27,919 and anyone who leaves that impression 1156 00:55:27,953 --> 00:55:30,057 is leaving the wrong impression. 1157 00:55:32,896 --> 00:55:36,069 To Tesla, we issued several recommendations-- 1158 00:55:36,102 --> 00:55:39,710 basically, do not allow the system 1159 00:55:39,743 --> 00:55:41,412 to be operated on roadways 1160 00:55:41,446 --> 00:55:44,485 where it's not designed to be operated. 1161 00:55:44,520 --> 00:55:47,559 And another recommendation was you need a better way 1162 00:55:47,592 --> 00:55:49,763 of determining driver engagement. 1163 00:55:49,796 --> 00:55:54,038 I do feel that if those recs are accomplished, 1164 00:55:54,071 --> 00:55:55,508 safety will be improved. 1165 00:55:56,710 --> 00:55:57,879 Tesla releasing a statement 1166 00:55:57,913 --> 00:56:01,085 saying customer safety comes first. 1167 00:56:01,119 --> 00:56:03,958 Boulder Crest is a nonprofit that takes care 1168 00:56:03,991 --> 00:56:06,462 of men and women who are suffering with PTSD. 1169 00:56:08,734 --> 00:56:10,505 The building we're standing in today 1170 00:56:10,538 --> 00:56:12,976 is the Josh Brown Center for Innovation. 1171 00:56:13,009 --> 00:56:15,013 We had this great dedication ceremony 1172 00:56:15,047 --> 00:56:16,684 the day we opened this. 1173 00:56:16,717 --> 00:56:18,053 The family came back to us 1174 00:56:18,086 --> 00:56:20,457 with this amazing paragraph that they wanted 1175 00:56:20,490 --> 00:56:24,131 the Director of the Boulder Crest Institute to read. 
1176 00:56:24,165 --> 00:56:27,973 Joshua believed, and our family continues to believe, 1177 00:56:28,006 --> 00:56:30,177 that the new technology going into cars 1178 00:56:30,210 --> 00:56:31,914 and the move to autonomous driving 1179 00:56:31,947 --> 00:56:33,985 has already saved many lives. 1180 00:56:34,018 --> 00:56:36,155 Change always comes with risks, 1181 00:56:36,189 --> 00:56:37,892 and zero tolerance for deaths 1182 00:56:37,926 --> 00:56:41,634 would totally stop innovations and improvements. 1183 00:56:41,667 --> 00:56:44,706 Nobody wants tragedy to touch their family. 1184 00:56:44,740 --> 00:56:47,111 But expecting to identify all limitations 1185 00:56:47,144 --> 00:56:48,915 of an emerging technology, 1186 00:56:48,948 --> 00:56:52,622 and expecting perfection is not feasible either. 1187 00:56:52,656 --> 00:56:54,693 And that, to me, just-- I mean, it-- 1188 00:56:54,726 --> 00:56:56,764 I think it brought tears to everybody in that audience 1189 00:56:56,797 --> 00:56:58,601 that knew Josh Brown. 1190 00:56:58,634 --> 00:57:01,439 Part of Joshua's legacy is that the accident 1191 00:57:01,472 --> 00:57:02,709 drove additional improvements, 1192 00:57:02,742 --> 00:57:05,480 making the new technology even safer. 1193 00:57:05,515 --> 00:57:08,486 Our family takes solace and pride in the fact 1194 00:57:08,521 --> 00:57:11,492 that our son is making such a positive impact 1195 00:57:11,527 --> 00:57:13,998 on future highway safety. 1196 00:57:14,031 --> 00:57:15,535 We all sat there and thought, 1197 00:57:15,568 --> 00:57:19,208 we have to learn a lesson from what happened. 1198 00:57:19,241 --> 00:57:20,945 When you think of somebody like Josh, 1199 00:57:20,979 --> 00:57:22,749 who was on that leading edge, 1200 00:57:22,782 --> 00:57:25,487 he was gonna test that car. 1201 00:57:25,521 --> 00:57:27,124 I think there's a false sense of security 1202 00:57:27,157 --> 00:57:30,030 when you put these options in front of people. 1203 00:57:36,677 --> 00:57:37,812 Wouldn't hurt to have more love in the world. 1204 00:57:37,846 --> 00:57:39,115 How you gonna fix that? 1205 00:57:39,148 --> 00:57:40,551 You have a love machine you're working on? 1206 00:57:41,720 --> 00:57:45,260 No, but probably spend more time with your friends 1207 00:57:45,293 --> 00:57:48,099 and less time on social media. 1208 00:57:50,237 --> 00:57:52,742 I mean, the only thing I've kept is Twitter, 1209 00:57:52,776 --> 00:57:54,478 because I kinda, like, need some means 1210 00:57:54,513 --> 00:57:56,750 of getting a message out, you know? 1211 00:57:58,053 --> 00:58:00,992 I think Elon Musk definitely understands 1212 00:58:01,025 --> 00:58:02,261 the power of his celebrity. 1213 00:58:02,294 --> 00:58:05,868 Elon, what do you think about dogecoin going crazy right now? 1214 00:58:08,206 --> 00:58:11,079 I think that's part of how he operates. 1215 00:58:11,112 --> 00:58:13,149 That's why he's on Twitter all the time. 1216 00:58:13,183 --> 00:58:17,157 You use your tweeting to kind of get back at critics. 1217 00:58:17,191 --> 00:58:19,563 - Rarely. - You kinda have little wars 1218 00:58:19,596 --> 00:58:20,631 with the press. 1219 00:58:20,665 --> 00:58:22,000 Twitter is a war zone. 1220 00:58:22,034 --> 00:58:24,105 He sort of doesn't have a filter. 1221 00:58:24,138 --> 00:58:26,009 Elon Musk was on Twitter today 1222 00:58:26,042 --> 00:58:27,211 calling one of the divers 1223 00:58:27,244 --> 00:58:29,750 in that cave rescue a pedophile. 
1224 00:58:29,783 --> 00:58:31,954 Elon Musk shook up the stock market 1225 00:58:31,987 --> 00:58:33,557 this afternoon with a tweet that read, 1226 00:58:33,591 --> 00:58:36,262 "I'm considering taking Tesla private." 1227 00:58:36,295 --> 00:58:38,634 I think it kinda goes both ways. 1228 00:58:38,667 --> 00:58:41,807 He can say things, and he can get people believing them. 1229 00:58:44,613 --> 00:58:46,884 New tonight at 5:00, Tesla has published 1230 00:58:46,917 --> 00:58:49,321 its first quarterly safety report. 1231 00:58:49,355 --> 00:58:52,194 In 2018, Tesla started releasing 1232 00:58:52,227 --> 00:58:55,200 these Autopilot safety statistics, 1233 00:58:55,233 --> 00:58:57,839 and have continued to release data. 1234 00:58:57,872 --> 00:58:59,843 On the surface, they looked like 1235 00:58:59,876 --> 00:59:01,312 they presented a good picture. 1236 00:59:01,345 --> 00:59:03,216 We publish the safety stats, like, basically, 1237 00:59:03,249 --> 00:59:07,559 miles driven on Autopilot and miles driven manually. 1238 00:59:07,592 --> 00:59:09,095 It was a factor of ten difference. 1239 00:59:09,128 --> 00:59:10,598 This is not subtle. 1240 00:59:10,631 --> 00:59:14,038 But it was just broad numbers. 1241 00:59:14,071 --> 00:59:16,977 It's not really a fair comparison to say Teslas 1242 00:59:17,010 --> 00:59:19,850 are dramatically safer than all other cars on the road, 1243 00:59:19,883 --> 00:59:21,820 because all other cars on the road 1244 00:59:21,854 --> 00:59:23,824 can include 20-year-old vehicles 1245 00:59:23,858 --> 00:59:25,962 that are not in good repair. 1246 00:59:25,995 --> 00:59:29,068 If you think about the miles that Tesla drives on Autopilot, 1247 00:59:29,101 --> 00:59:31,205 almost all those are gonna be freeway cruising miles. 1248 00:59:31,239 --> 00:59:33,209 Those miles are incredibly safe. 1249 00:59:33,243 --> 00:59:35,715 City streets, parking lots, things like that, 1250 00:59:35,748 --> 00:59:37,619 those are much more likely to have incidents. 1251 00:59:39,623 --> 00:59:41,125 I believe that they're presenting data 1252 00:59:41,159 --> 00:59:42,729 that makes them look the best, 1253 00:59:42,762 --> 00:59:45,233 that is still technically accurate. 1254 00:59:47,672 --> 00:59:50,845 It's Tesla and Elon Musk providing data 1255 00:59:50,878 --> 00:59:53,216 to support their point of view, 1256 00:59:53,249 --> 00:59:55,120 but that's not a full picture. 1257 00:59:55,153 --> 00:59:57,157 I don't think that gives you enough data 1258 00:59:57,191 --> 00:59:58,627 to really make a judgment. 1259 00:59:58,661 --> 01:00:00,731 People can say, "Oh, well, you're playing 1260 01:00:00,765 --> 01:00:02,201 with the statistics." 1261 01:00:02,234 --> 01:00:03,838 I'm like, we're not fiddling with the statistics. 1262 01:00:03,871 --> 01:00:06,375 The truth is that people are actually not great 1263 01:00:06,409 --> 01:00:09,883 at driving these two-ton death machines. 1264 01:00:09,916 --> 01:00:12,287 Then, March of 2018-- 1265 01:00:12,321 --> 01:00:13,791 Fatal crash and fire 1266 01:00:13,824 --> 01:00:16,028 involving a Tesla in Mountain View. 1267 01:00:16,062 --> 01:00:21,339 That's when the Walter Huang crash happens in California. 1268 01:00:21,372 --> 01:00:24,178 It was at a point where the freeway splits, 1269 01:00:24,211 --> 01:00:26,750 and the Autopilot became confused, 1270 01:00:26,783 --> 01:00:29,288 and he ran straight into a concrete barrier. 1271 01:00:29,321 --> 01:00:32,327 38-year-old Walter Huang had a wife and two kids. 
1272 01:00:32,361 --> 01:00:36,435 The NTSB is investigating that fatal crash and fire. 1273 01:00:36,469 --> 01:00:40,177 Tesla was a party to our investigation. 1274 01:00:40,210 --> 01:00:42,982 But one of the rules of being a party 1275 01:00:43,016 --> 01:00:45,988 is that the parties can't release information 1276 01:00:46,022 --> 01:00:48,961 about the active investigation. 1277 01:00:48,994 --> 01:00:52,100 Tesla released data saying Walter Huang had his hands 1278 01:00:52,134 --> 01:00:54,773 off the wheel for six seconds before the crash. 1279 01:00:55,942 --> 01:00:58,914 I called Elon Musk and said they would have to abide 1280 01:00:58,948 --> 01:01:00,685 by our party agreement. 1281 01:01:00,718 --> 01:01:02,254 And then a few days later, 1282 01:01:02,287 --> 01:01:06,362 Tesla was releasing information about the crash. 1283 01:01:06,395 --> 01:01:08,299 Tesla released another statement that read, 1284 01:01:08,333 --> 01:01:10,237 "The only way for this accident to have occurred 1285 01:01:10,270 --> 01:01:13,276 is if Mr. Huang was not paying attention to the road." 1286 01:01:13,309 --> 01:01:17,852 Tesla needed to be removed from that investigation. 1287 01:01:17,886 --> 01:01:21,459 And so I called. Elon was, I would say, argumentative. 1288 01:01:21,492 --> 01:01:26,269 He indicated that he was going to sue the NTSB. 1289 01:01:26,302 --> 01:01:28,841 There was an attempt to bully us into submission. 1290 01:01:28,874 --> 01:01:31,212 But we didn't back down, and he hung up on us. 1291 01:01:33,149 --> 01:01:35,855 That night, Tesla put out a press release 1292 01:01:35,888 --> 01:01:39,829 saying that they were resigning. 1293 01:01:39,863 --> 01:01:42,301 Tesla announced it's leaving the investigation 1294 01:01:42,334 --> 01:01:44,004 into the deadly crash, 1295 01:01:44,038 --> 01:01:44,940 but the NTSB says 1296 01:01:44,973 --> 01:01:48,213 it kicked the electric car maker out first. 1297 01:01:48,246 --> 01:01:51,720 It was sort of like, "You can't fire me, we quit" 1298 01:01:51,753 --> 01:01:53,089 sort of a thing. 1299 01:01:53,122 --> 01:01:55,360 The system worked as described, 1300 01:01:55,393 --> 01:01:57,431 which is that it's a hands-on system. 1301 01:01:57,464 --> 01:02:00,170 It is not a self-driving system. 1302 01:02:06,382 --> 01:02:08,453 Today, we meet to consider a collision 1303 01:02:08,486 --> 01:02:11,760 involving a Tesla Model X SUV. 1304 01:02:14,331 --> 01:02:17,237 We do know that in the Mountain View crash, 1305 01:02:17,271 --> 01:02:20,176 the driver was engaged continuously 1306 01:02:20,210 --> 01:02:22,314 with playing a video game. 1307 01:02:22,347 --> 01:02:24,184 It would be easy to say the driver 1308 01:02:24,218 --> 01:02:25,821 was not acting responsibly. 1309 01:02:25,855 --> 01:02:29,963 However, it also shows that there's great potential 1310 01:02:29,996 --> 01:02:34,138 for there to be this automation complacency to creep in. 1311 01:02:34,171 --> 01:02:37,545 In 2017, we issued two recommendations 1312 01:02:37,578 --> 01:02:40,952 to six automobile manufacturers. 1313 01:02:40,985 --> 01:02:46,262 And of the six, one manufacturer has ignored us, 1314 01:02:46,295 --> 01:02:50,103 and that manufacturer is Tesla. 1315 01:02:50,136 --> 01:02:53,409 All of our recommendations are based on tragic events. 
1316 01:02:53,443 --> 01:02:56,817 And when someone doesn't respond or doesn't act, 1317 01:02:56,850 --> 01:02:57,986 that's heartbreaking, 1318 01:02:58,019 --> 01:03:00,423 especially when you see another accident 1319 01:03:00,457 --> 01:03:02,294 that could have been prevented 1320 01:03:02,327 --> 01:03:06,268 had those recommendations been implemented. 1321 01:03:06,302 --> 01:03:08,373 What started as an ordinary drive to work 1322 01:03:08,406 --> 01:03:10,578 ended in tragedy for a father and husband 1323 01:03:10,611 --> 01:03:13,483 from suburban Lake Worth Beach driving a Tesla. 1324 01:03:13,517 --> 01:03:16,322 In March of 2019, the next fatality 1325 01:03:16,355 --> 01:03:21,032 that we became aware of was Jeremy Banner. 1326 01:03:21,065 --> 01:03:25,040 We saw the almost identical crash 1327 01:03:25,073 --> 01:03:27,077 that we saw in the Joshua Brown case. 1328 01:03:29,214 --> 01:03:32,254 You've got a Tesla being operated on Autopilot. 1329 01:03:32,287 --> 01:03:34,024 We've got a tractor-trailer 1330 01:03:34,058 --> 01:03:37,598 that is pulling across the road. 1331 01:03:37,632 --> 01:03:41,472 We've got a driver who does not attempt any evasive steering. 1332 01:03:41,506 --> 01:03:44,044 Does not attempt any braking action. 1333 01:03:46,516 --> 01:03:49,321 And goes right under the tractor-trailer, 1334 01:03:51,893 --> 01:03:53,897 shearing the roof off of the car, 1335 01:03:56,001 --> 01:03:58,006 and killing the driver. 1336 01:04:02,447 --> 01:04:04,051 Where is the super duper radar 1337 01:04:04,084 --> 01:04:07,257 that Elon was talking about in September 2016? 1338 01:04:11,399 --> 01:04:14,271 Well, whatever they did wasn't sufficient 1339 01:04:14,304 --> 01:04:16,308 to ensure it didn't happen again, 1340 01:04:16,342 --> 01:04:18,914 'cause the exact same crash happened. 1341 01:04:22,989 --> 01:04:26,262 Stationary objects are this vexing problem in autonomy. 1342 01:04:26,295 --> 01:04:29,001 And everybody that's developing autonomous software 1343 01:04:29,034 --> 01:04:30,470 has this problem. 1344 01:04:30,503 --> 01:04:34,278 This is the one Achilles heel that you continue to see. 1345 01:04:34,311 --> 01:04:36,148 I thought the self-driving problem would be hard, 1346 01:04:36,182 --> 01:04:38,286 but it's--it was harder than I thought. 1347 01:04:41,425 --> 01:04:46,102 It may be that Autopilot vehicles have fewer crashes. 1348 01:04:46,135 --> 01:04:50,243 But we've continued to see other crashes that happen 1349 01:04:50,276 --> 01:04:53,517 because the system can't see something in the road. 1350 01:04:53,550 --> 01:04:56,222 Nearly a dozen accidents where a Tesla slammed 1351 01:04:56,255 --> 01:04:58,126 into a parked emergency vehicle. 1352 01:04:58,159 --> 01:05:01,700 If your company is supposed to be putting safety first, 1353 01:05:01,733 --> 01:05:05,039 and this well-respected safety agency says, 1354 01:05:05,073 --> 01:05:07,377 you know, there are these two deficiencies in your system, 1355 01:05:07,411 --> 01:05:09,248 you should address them, 1356 01:05:09,281 --> 01:05:12,020 why wouldn't you address them? Why wouldn't you fix them? 1357 01:05:12,053 --> 01:05:14,024 One of the biggest mistakes people generally make-- 1358 01:05:14,057 --> 01:05:16,530 and I'm guilty of it, too-- is wishful thinking. 1359 01:05:16,563 --> 01:05:19,168 You know, like, you want something to be true 1360 01:05:19,201 --> 01:05:20,370 even if it isn't true.
1361 01:05:20,403 --> 01:05:22,174 And so you ignore the real truth 1362 01:05:22,207 --> 01:05:25,648 because of what you want to be true. 1363 01:05:25,681 --> 01:05:28,319 This is a very difficult trap to avoid. 1364 01:05:32,294 --> 01:05:34,431 I think, for those of us in the safety business, 1365 01:05:34,465 --> 01:05:37,170 we would have liked to have seen more regulations 1366 01:05:37,204 --> 01:05:39,976 implemented to improve safety. 1367 01:05:40,009 --> 01:05:42,114 I mean, it's horribly frustrating. 1368 01:05:42,147 --> 01:05:43,550 The truth is companies have always had 1369 01:05:43,584 --> 01:05:46,455 an enormous amount of power, in terms of the technology 1370 01:05:46,489 --> 01:05:49,228 in the vehicles that they put on the road. 1371 01:05:49,261 --> 01:05:50,998 Honestly, I worry that the government 1372 01:05:51,031 --> 01:05:53,637 cannot keep up with the technology. 1373 01:05:53,670 --> 01:05:55,206 I don't think in a situation like this 1374 01:05:55,240 --> 01:05:58,747 we want to necessarily inhibit innovation. 1375 01:05:58,780 --> 01:06:01,152 But when innovation is implemented, 1376 01:06:01,185 --> 01:06:04,057 we have to make sure that it's done safely. 1377 01:06:04,091 --> 01:06:07,197 Or it's going to be the Wild West out there. 1378 01:06:17,652 --> 01:06:19,054 When you think full self-driving, 1379 01:06:19,087 --> 01:06:20,558 you think hands off the wheel. 1380 01:06:20,591 --> 01:06:22,160 You don't have to worry about anything. 1381 01:06:22,194 --> 01:06:25,701 You can listen to music and read a book, whatever. 1382 01:06:25,734 --> 01:06:29,576 I believe the first full video we saw was, like, in 2016. 1383 01:06:29,609 --> 01:06:31,478 I thought we're already here. 1384 01:06:31,513 --> 01:06:35,386 So, yeah, it was very, very exciting at the time. 1385 01:06:35,420 --> 01:06:36,623 A couple years later, 1386 01:06:36,656 --> 01:06:39,328 I purchased the full self-driving. 1387 01:06:39,361 --> 01:06:41,600 I now kind of make a distinction. 1388 01:06:41,633 --> 01:06:44,539 I think Tesla makes great electric vehicles. 1389 01:06:44,572 --> 01:06:48,179 But I think their advertising of certain Autopilot features 1390 01:06:48,212 --> 01:06:50,450 have been--overpromising 1391 01:06:50,483 --> 01:06:52,320 is probably the nicest way to say it. 1392 01:06:52,354 --> 01:06:54,224 I almost view it as, like, a solved problem. 1393 01:06:54,258 --> 01:06:56,195 Like, we know exactly what to do, 1394 01:06:56,228 --> 01:06:57,598 and we'll be there in a few years. 1395 01:06:57,632 --> 01:07:00,270 As far back as 2015, Elon Musk was saying 1396 01:07:00,303 --> 01:07:01,806 self-driving cars were two years away. 1397 01:07:01,840 --> 01:07:08,252 I think we're basically less than two years away 1398 01:07:08,286 --> 01:07:09,556 from complete autonomy. 1399 01:07:09,589 --> 01:07:11,526 The time when someone will be able 1400 01:07:11,560 --> 01:07:14,097 to take their hands off the wheel and go to sleep, 1401 01:07:14,131 --> 01:07:15,701 how far away is that? To do that safely? 1402 01:07:15,734 --> 01:07:18,072 I think that's about-- that's about two years. 1403 01:07:18,105 --> 01:07:19,809 The promise was very aspirational, 1404 01:07:19,842 --> 01:07:21,513 and probably not gonna happen. 1405 01:07:21,546 --> 01:07:25,220 But Tesla and Elon made people think it was gonna happen. 1406 01:07:25,253 --> 01:07:29,394 By end of next year, self-driving will be 1407 01:07:29,428 --> 01:07:34,238 at least 100% to 200% safer than a person. 
1408 01:07:34,271 --> 01:07:37,110 If you buy a car that does not have the hardware necessary 1409 01:07:37,143 --> 01:07:39,414 for full self-driving, it is like buying a horse. 1410 01:07:39,448 --> 01:07:43,389 I'm extremely confident of achieving full autonomy 1411 01:07:43,422 --> 01:07:47,665 and releasing it to the Tesla customer base next year. 1412 01:07:47,698 --> 01:07:49,802 Some people say, what does it matter? 1413 01:07:49,836 --> 01:07:52,541 Well, I think it matters a lot. 1414 01:07:52,575 --> 01:07:55,113 Do you want other people on the roads 1415 01:07:55,146 --> 01:07:57,117 buying this technology and thinking 1416 01:07:57,150 --> 01:08:00,189 that it's more powerful than it really is? 1417 01:08:00,223 --> 01:08:01,826 I felt that what I was being told 1418 01:08:01,860 --> 01:08:03,664 that we were gonna do didn't match 1419 01:08:03,697 --> 01:08:06,168 what we actually did 1420 01:08:06,201 --> 01:08:09,308 because Tesla has changed the hardware on the car. 1421 01:08:09,341 --> 01:08:11,680 They changed the computer. 1422 01:08:11,713 --> 01:08:13,750 And now they're changing the cameras. 1423 01:08:13,784 --> 01:08:17,892 I think that that should give someone pause 1424 01:08:17,925 --> 01:08:19,595 when Tesla says they're gonna do something else. 1425 01:08:21,398 --> 01:08:24,839 Elon Musk has officially blown my mind yet again. 1426 01:08:24,872 --> 01:08:28,179 In a recent tweet, he talked about vision 1427 01:08:28,212 --> 01:08:30,283 and using only vision and no radar. 1428 01:08:30,316 --> 01:08:32,454 Taking the radar out is literally 1429 01:08:32,487 --> 01:08:35,460 going to make the system better. 1430 01:08:35,493 --> 01:08:37,598 When I heard they were gonna do cameras alone 1431 01:08:37,632 --> 01:08:39,736 and get rid of radar, 1432 01:08:39,769 --> 01:08:42,240 I was really taken aback. 1433 01:08:42,273 --> 01:08:45,581 The whole rest of the industry believes that you need cameras, 1434 01:08:45,614 --> 01:08:48,620 radar, and lidar. 1435 01:08:48,654 --> 01:08:51,325 Tesla's really the only automaker 1436 01:08:51,358 --> 01:08:54,264 who thinks that cameras alone is a good idea. 1437 01:08:54,297 --> 01:08:57,772 You can absolutely be superhuman with just cameras. 1438 01:08:57,805 --> 01:09:01,579 What Elon Musk is leaving out of his analogy-- 1439 01:09:01,613 --> 01:09:03,717 comparing cameras to eyes-- 1440 01:09:03,750 --> 01:09:08,392 is the fact that there is not a brain behind those cameras. 1441 01:09:09,896 --> 01:09:12,367 We don't know how to build a system 1442 01:09:12,400 --> 01:09:13,904 that can behave like the human brain. 1443 01:09:13,937 --> 01:09:19,682 And what that means is full autonomy may be decades away. 1444 01:09:24,592 --> 01:09:27,230 Anyone here use the full self-driving beta? 1445 01:09:27,263 --> 01:09:30,503 Great. 1446 01:09:30,537 --> 01:09:31,573 The car will be able to take you 1447 01:09:31,606 --> 01:09:34,545 anywhere you want with ultimately ten times safer 1448 01:09:34,579 --> 01:09:36,616 than if you were driving it yourself. 1449 01:09:36,649 --> 01:09:39,789 It's gonna just completely revolutionize the world. 1450 01:09:43,530 --> 01:09:46,368 There's a question around whether Elon 1451 01:09:46,402 --> 01:09:48,239 is acting cynically, right? 1452 01:09:48,272 --> 01:09:50,309 Like, does he believe in what he says? 1453 01:09:50,343 --> 01:09:54,585 And is it okay as long as he does believe in what he says? 1454 01:09:54,619 --> 01:09:57,825 Some of it feels intentional to me. 
1455 01:09:57,858 --> 01:10:00,564 There's, like, financing needs that he needs to make. 1456 01:10:00,597 --> 01:10:03,804 There are milestones that Elon needs to hit, 1457 01:10:03,837 --> 01:10:06,776 from an investor's perspective. 1458 01:10:06,810 --> 01:10:10,383 At times, people misinterpret Elon. 1459 01:10:10,416 --> 01:10:12,788 Oftentimes, there's-- when the goal is set, 1460 01:10:12,822 --> 01:10:15,527 there's no capability to deliver against that goal. 1461 01:10:15,561 --> 01:10:17,732 You kind of need to believe that, as a team, 1462 01:10:17,765 --> 01:10:19,535 you're gonna achieve the impossible. 1463 01:10:19,569 --> 01:10:21,940 I've had many conversations with the Tesla Autopilot team. 1464 01:10:21,973 --> 01:10:23,543 The reality of doing the right thing matters 1465 01:10:23,577 --> 01:10:25,413 more than the perception of doing the right thing. 1466 01:10:25,446 --> 01:10:28,854 He's convinced that the technology will be delivered. 1467 01:10:28,887 --> 01:10:30,991 And I wouldn't necessarily bet against him, 1468 01:10:31,024 --> 01:10:32,862 because eventually he does deliver. 1469 01:10:34,064 --> 01:10:36,970 Tesla says it's launching its highly anticipated 1470 01:10:37,003 --> 01:10:39,876 full self-driving software later this week. 1471 01:10:39,909 --> 01:10:41,679 They're gonna open up self-driving 1472 01:10:41,713 --> 01:10:43,617 in America's cities. 1473 01:10:43,650 --> 01:10:47,691 That would seem to be quite a difficult thing to pull off. 1474 01:10:47,725 --> 01:10:49,562 In a city? 1475 01:10:52,000 --> 01:10:55,841 So, this is a tricky thing for beta. 1476 01:10:55,874 --> 01:10:58,479 We are--this is a blind left. There's a fence here. 1477 01:10:58,513 --> 01:11:03,055 It can't see around. So my car is inching forward. 1478 01:11:03,088 --> 01:11:05,794 I feel honored that I get to do this, 1479 01:11:05,828 --> 01:11:09,802 and be, like, a little part of this, you know, history. 1480 01:11:09,836 --> 01:11:15,446 I stopped it because it was inching out too far. 1481 01:11:15,480 --> 01:11:17,651 There are definitely people that do not agree 1482 01:11:17,685 --> 01:11:20,891 with Tesla's approach. 1483 01:11:20,924 --> 01:11:23,395 I don't feel that it's risky. 1484 01:11:23,429 --> 01:11:28,874 I have never felt endangered, okay? 1485 01:11:28,907 --> 01:11:32,413 See, it's gonna miss this. Can't do it. 1486 01:11:32,447 --> 01:11:35,687 I can say that people who buy a Tesla understand 1487 01:11:35,721 --> 01:11:37,758 that it's not full self-driving yet. 1488 01:11:37,791 --> 01:11:42,535 And nobody is forcing anybody to buy full self-driving. 1489 01:11:42,568 --> 01:11:43,469 It's an option. 1490 01:11:45,574 --> 01:11:47,745 Full self-driving, that's what I paid for, 1491 01:11:47,778 --> 01:11:49,347 and I don't have it. 1492 01:11:49,381 --> 01:11:52,020 I mean, it's right there in the name of it, right? 1493 01:11:52,053 --> 01:11:54,424 And I don't think that's fair to say. 1494 01:11:54,457 --> 01:11:57,030 Especially right now. 1495 01:11:57,063 --> 01:12:00,036 Musk, I think he has a huge responsibility. 1496 01:12:00,069 --> 01:12:02,106 You know, I think he needs to be a little bit more cautious 1497 01:12:02,140 --> 01:12:04,712 about what he tells his followers. 1498 01:12:04,745 --> 01:12:08,653 Wow. Oh, my God! Okay! 
1499 01:12:08,687 --> 01:12:11,926 A lot of the work in the tech industry proceeds 1500 01:12:11,960 --> 01:12:15,399 with the central claim of improving human lives 1501 01:12:15,433 --> 01:12:18,507 through the methodical use of our technologies. 1502 01:12:18,540 --> 01:12:23,048 Okay, did-- oh, God! Fuck! Jesus! 1503 01:12:23,082 --> 01:12:26,789 That was one of the closest calls we've ever had. 1504 01:12:26,823 --> 01:12:31,398 With the ongoing full self-driving beta releases, 1505 01:12:31,431 --> 01:12:32,400 there's quite a spectacle. 1506 01:12:32,968 --> 01:12:34,539 Is it just gonna run this light? 1507 01:12:34,572 --> 01:12:36,876 Holy shit, it just ran that red light. 1508 01:12:36,910 --> 01:12:39,481 Here, we have a lot of customers 1509 01:12:39,515 --> 01:12:43,823 who are essentially standing in for professional test drivers. 1510 01:12:44,024 --> 01:12:46,629 - Ooh! Ooh! - Oh, fuck! Oh shit! 1511 01:12:46,663 --> 01:12:47,831 Shit! We-- 1512 01:12:47,865 --> 01:12:49,434 - We hit that. - We actually hit it. 1513 01:12:49,467 --> 01:12:50,704 We hit it. 1514 01:12:50,737 --> 01:12:55,079 With Tesla, an example of scientific integrity, 1515 01:12:55,112 --> 01:12:56,950 public responsibility, 1516 01:12:56,983 --> 01:13:00,958 and reasoned and methodical engineering development 1517 01:13:00,991 --> 01:13:02,093 it is not. 1518 01:13:02,126 --> 01:13:03,964 With a software update, you can actually 1519 01:13:03,997 --> 01:13:06,970 make thousands of people drive safer. 1520 01:13:07,003 --> 01:13:08,573 Just with a software update overnight. 1521 01:13:08,607 --> 01:13:10,711 - Wow. That's actually-- - Yeah. 1522 01:13:10,744 --> 01:13:12,514 That's actually-- 1523 01:13:12,548 --> 01:13:17,056 Fuck. 1524 01:13:17,090 --> 01:13:19,662 Are we gonna have to cut that?