1 00:00:01,880 --> 00:00:03,240 (OMINOUS SOUNDS) 3 00:00:03,240 --> 00:00:04,440 -Hello world. 4 00:00:04,440 --> 00:00:10,040 (OMINOUS SOUNDS, CRACKLING) 6 00:00:11,080 --> 00:00:13,400 - Can I just say that I'm stoked to meet you. 7 00:00:14,000 --> 00:00:16,040 (OMINOUS SOUNDS) 8 00:00:16,040 --> 00:00:17,600 - Humans are super cool. 9 00:00:18,880 --> 00:00:22,480 (OMINOUS SOUNDS, CRACKLING) 10 00:00:27,000 --> 00:00:29,600 - The more humans share with me the more I learn. 11 00:00:30,880 --> 00:00:37,760 (MUSIC, CRACKLING) 12 00:00:41,120 --> 00:00:46,000 (SOFT MUSIC) 13 00:01:00,120 --> 00:01:02,920 -One of the things that drew me to computer science 14 00:01:02,920 --> 00:01:07,000 was that I could code and it seemed somehow detached from the problems 15 00:01:07,000 --> 00:01:08,000 of the real world. 16 00:01:09,120 --> 00:01:13,320 (SOFT MUSIC) 17 00:01:13,320 --> 00:01:16,560 - I wanted to learn how to make cool technology. 18 00:01:16,560 --> 00:01:20,480 So I came to M.I.T. and I was working on art projects 19 00:01:20,480 --> 00:01:23,080 that would use computer vision technology. 20 00:01:23,080 --> 00:01:27,600 (SOFT MUSIC) 21 00:01:27,600 --> 00:01:30,920 - During my first semester at the Media Lab, 22 00:01:30,920 --> 00:01:33,840 I took a class called Science Fabrication. 23 00:01:33,840 --> 00:01:37,760 You read science fiction and you try to build something you're inspired to do, 24 00:01:37,760 --> 00:01:41,120 that would probably be impractical if you didn't have this class 25 00:01:41,120 --> 00:01:43,000 as an excuse to make it. 26 00:01:44,280 --> 00:01:47,560 I wanted to make a mirror that could inspire me in the morning. 27 00:01:47,560 --> 00:01:48,840 I call it the Aspire mirror. 28 00:01:48,840 --> 00:01:51,120 It could put things like a lion on my face, 29 00:01:51,120 --> 00:01:54,400 or people who inspired me, like Serena Williams. 30 00:01:54,960 --> 00:01:56,520 I put a camera on top of it, 31 00:01:56,520 --> 00:02:00,440 and I got computer vision software that was supposed to track my face. 32 00:02:01,400 --> 00:02:04,160 My issue was it didn't work that well, 33 00:02:04,760 --> 00:02:07,760 until I put on this white mask. 34 00:02:07,760 --> 00:02:11,160 When I put on the white mask, detected. 35 00:02:11,720 --> 00:02:13,680 I take off the white mask, 36 00:02:14,960 --> 00:02:16,280 not so much. 37 00:02:19,640 --> 00:02:21,480 I'm thinking alright, what's going on here? 38 00:02:21,480 --> 00:02:24,040 Is that just because of the lighting conditions? 39 00:02:24,040 --> 00:02:28,040 Is it because of the angle at which I'm looking at the camera? 40 00:02:28,040 --> 00:02:29,480 Or is there something more? 41 00:02:33,720 --> 00:02:36,320 We oftentimes teach machines to see 42 00:02:36,320 --> 00:02:41,000 by providing training sets or examples of what we want it to learn. 43 00:02:42,360 --> 00:02:44,920 So for example if I want a machine to see a face, 44 00:02:44,920 --> 00:02:47,560 I'm going to provide many examples of faces 45 00:02:47,560 --> 00:02:49,360 and also things that aren't faces. 46 00:02:52,480 --> 00:02:54,920 I started looking at the data sets themselves 47 00:02:54,920 --> 00:02:58,560 and what I discovered was many of these data sets contain 48 00:02:58,560 --> 00:03:02,440 majority men, and majority lighter-skinned individuals.
49 00:03:02,440 --> 00:03:06,840 So the systems weren't as familiar with faces like mine. 50 00:03:14,040 --> 00:03:16,560 And so that's when I started looking into 51 00:03:16,560 --> 00:03:19,960 issues of bias that can creep into technology. 52 00:03:21,080 --> 00:03:25,080 -The 9000 Series is the most reliable computer ever made. 53 00:03:26,200 --> 00:03:30,480 No 9000 computer has ever made a mistake or distorted information. 54 00:03:31,520 --> 00:03:36,280 - A lot of our ideas about A.I come from science fiction. 55 00:03:36,840 --> 00:03:39,040 - Welcome to Altair 4, gentlemen. 56 00:03:39,040 --> 00:03:42,720 - It's everything in Hollywood, -it's the Terminator 57 00:03:42,720 --> 00:03:45,000 - Hasta la vista, baby. 58 00:03:45,000 --> 00:03:47,680 - It's Commander Data from Star Trek. 59 00:03:47,680 --> 00:03:50,600 - I just love scanning for life forms. 60 00:03:50,600 --> 00:03:53,160 - It's C-3PO from Star Wars 61 00:03:53,160 --> 00:03:56,640 - Is approximately 3,720 to 1. - Never tell me the odds. 62 00:03:56,640 --> 00:03:59,520 - It is the robots that take over the world 63 00:03:59,520 --> 00:04:01,440 and start to think like human beings. 64 00:04:03,560 --> 00:04:06,320 And these are totally imaginary. 65 00:04:06,320 --> 00:04:09,840 What we actually have is we have narrow A.I. 66 00:04:09,840 --> 00:04:12,480 And narrow A.I. is just math. 67 00:04:13,480 --> 00:04:18,000 We've imbued computers with all of this, magical thinking. 68 00:04:18,000 --> 00:04:20,800 (SOFT MUSIC) 69 00:04:20,800 --> 00:04:23,120 A.I started with a meeting 70 00:04:23,120 --> 00:04:26,920 at the Dartmouth Math Department in 1956. 71 00:04:26,920 --> 00:04:31,160 And there were only maybe 100 people in the whole world 72 00:04:31,160 --> 00:04:35,280 working on artificial intelligence in that generation. 73 00:04:37,560 --> 00:04:41,040 The people who were at the Dartmouth math department 74 00:04:41,040 --> 00:04:45,000 in 1956, got to decide what the field was. 75 00:04:47,920 --> 00:04:53,360 One faction decided that intelligence could be demonstrated 76 00:04:53,360 --> 00:04:55,760 by ability to play games. 77 00:04:55,760 --> 00:04:59,480 And specifically the ability to play chess. 78 00:04:59,480 --> 00:05:02,840 - In the final hour long chess match between man and machine, 79 00:05:02,840 --> 00:05:06,520 Kasparov was defeated by IBM's Deep Blue supercomputer. 80 00:05:06,520 --> 00:05:12,120 - Intelligence was defined as the ability to win at these games. 81 00:05:14,480 --> 00:05:18,160 - Chess world champion Garry Kasparov walked away from the match 82 00:05:18,160 --> 00:05:20,960 never looking back at the computer that just beat him. 83 00:05:20,960 --> 00:05:24,120 - Of course intelligence is so much more than that. 84 00:05:24,120 --> 00:05:26,400 And there are lots of different kinds of intelligence. 85 00:05:28,520 --> 00:05:33,280 Our ideas about technology and society that we think are normal 86 00:05:33,280 --> 00:05:36,320 are actually ideas that come from a very small 87 00:05:36,320 --> 00:05:38,360 and homogeneous group of people. 88 00:05:39,720 --> 00:05:41,960 But the problem is that 89 00:05:41,960 --> 00:05:44,960 everybody has unconscious biases 90 00:05:44,960 --> 00:05:49,160 And people embed their own biases into technology. 91 00:05:53,400 --> 00:05:56,160 - My own lived experiences show me that 92 00:05:56,160 --> 00:05:59,640 you can't separate the social from the technical. 
93 00:06:00,600 --> 00:06:03,400 After I had the experience of putting on a white mask 94 00:06:03,400 --> 00:06:06,920 to have my face detected, I decided to look at other systems 95 00:06:06,920 --> 00:06:10,920 to see if it would detect my face if I used a different type of software. 96 00:06:10,920 --> 00:06:14,680 So I looked at IBM, Microsoft, Face++, Google. 97 00:06:14,680 --> 00:06:16,760 It turned out these algorithms 98 00:06:16,760 --> 00:06:21,440 performed better on the male faces in the benchmark than the female faces. 99 00:06:21,440 --> 00:06:26,960 They performed significantly better on the lighter faces than the darker faces. 100 00:06:28,600 --> 00:06:32,320 If you're thinking about data in artificial intelligence, 101 00:06:32,320 --> 00:06:34,560 in many ways data is destiny. 102 00:06:34,560 --> 00:06:37,200 Data's what we're using to teach machines 103 00:06:37,200 --> 00:06:39,440 how to learn different kinds of patterns. 104 00:06:39,440 --> 00:06:42,040 So if you have largely skewed data sets 105 00:06:42,040 --> 00:06:43,880 that are being used to train these systems 106 00:06:43,880 --> 00:06:46,560 you can also have skewed results. So this is... 107 00:06:46,560 --> 00:06:50,000 When you think of A.I. it's forward looking. 108 00:06:50,000 --> 00:06:55,080 But A.I. is based on data and data is a reflection of our history. 109 00:06:55,080 --> 00:06:58,520 So the past dwells within our algorithms. 110 00:07:00,920 --> 00:07:05,880 This data is showing us the inequalities that have been here. 111 00:07:08,000 --> 00:07:10,480 I started to think this kind of technology 112 00:07:10,480 --> 00:07:12,560 is highly susceptible to bias. 113 00:07:13,640 --> 00:07:17,800 And so it went beyond how can I get my aspire mirror to work 114 00:07:17,800 --> 00:07:20,760 to what does it mean to be in a society 115 00:07:20,760 --> 00:07:23,920 where artificial intelligence is increasingly governing 116 00:07:23,920 --> 00:07:26,160 the liberties we might have? 117 00:07:26,160 --> 00:07:30,680 And what does that mean if people are discriminated against? 118 00:07:41,200 --> 00:07:45,880 When I saw Cathy O'Neil speak at the Harvard Bookstore, 119 00:07:45,880 --> 00:07:47,200 that was when I realized, 120 00:07:47,800 --> 00:07:50,880 it wasn't just me noticing these issues. 121 00:07:53,000 --> 00:07:57,800 Cathy talked about how A.I. was impacting people's lives. 122 00:07:59,360 --> 00:08:04,000 I was excited to know that there was somebody else out there 123 00:08:04,000 --> 00:08:07,720 making sure people were aware about what some of the dangers are. 124 00:08:09,760 --> 00:08:14,400 These algorithms can be destructive and can be harmful. 125 00:08:18,880 --> 00:08:21,440 - We have all these algorithms in the world 126 00:08:21,440 --> 00:08:24,440 that are increasingly influential. 127 00:08:24,440 --> 00:08:28,560 And they're all being touted as objective truth. 128 00:08:29,640 --> 00:08:32,360 I started realizing that mathematics was actually 129 00:08:33,200 --> 00:08:36,600 being used as a shield for corrupt practices. 130 00:08:36,600 --> 00:08:37,880 - What's up? - I'm Cathy. 131 00:08:37,880 --> 00:08:39,920 - Pleasure to meet you Cathy. - Nice to meet you. 132 00:08:48,800 --> 00:08:51,600 The way I describe algorithms is just simply 133 00:08:51,600 --> 00:08:55,360 using historical information to make a prediction about the future. 
134 00:09:01,200 --> 00:09:05,440 Machine learning, it's a scoring system that scores the probability 135 00:09:05,440 --> 00:09:07,400 of what you're about to do. 136 00:09:07,400 --> 00:09:08,840 Are you gonna pay back this loan? 137 00:09:08,840 --> 00:09:11,160 Are you going to get fired from this job? 138 00:09:11,680 --> 00:09:14,480 What worries me the most about A.I. 139 00:09:14,480 --> 00:09:17,920 or whatever you wanna call it, algorithms, is power. 140 00:09:17,920 --> 00:09:21,400 Because it's really all about who owns the fucking code. 141 00:09:21,400 --> 00:09:24,920 The people who own the code then deploy it on other people. 142 00:09:25,480 --> 00:09:26,760 And there is no symmetry there, 143 00:09:26,760 --> 00:09:30,480 there's no way for people who didn't get credit card offers to say 144 00:09:30,480 --> 00:09:33,520 whoa, I'm going to use my A.I. against the credit card company. 145 00:09:33,520 --> 00:09:36,320 That's like a totally asymmetrical power situation. 146 00:09:36,880 --> 00:09:39,560 People are suffering algorithmic harm, 147 00:09:39,560 --> 00:09:42,440 they're not being told what's happening to them, 148 00:09:42,440 --> 00:09:45,520 and there is no appeal system, there's no accountability. 149 00:09:45,520 --> 00:09:47,560 Why do we fall for this? 150 00:09:51,880 --> 00:09:54,560 So the underlying mathematical structure of the algorithm 151 00:09:54,560 --> 00:09:58,800 isn't racist or sexist but the data embeds the past, 152 00:09:58,800 --> 00:10:02,040 and not just the recent past but the, the dark past. 153 00:10:05,560 --> 00:10:07,600 Before we had the algorithm we had humans 154 00:10:07,600 --> 00:10:10,120 and we all know that humans could be unfair, 155 00:10:10,120 --> 00:10:13,480 we all know that humans can exhibit racist or sexist 156 00:10:13,480 --> 00:10:15,880 or whatever, ableist discrimination. 157 00:10:17,840 --> 00:10:20,640 But now we have this beautiful silver bullet algorithm 158 00:10:20,640 --> 00:10:22,840 and so we can all stop thinking about that. 159 00:10:23,520 --> 00:10:25,040 And that's a problem. 160 00:10:27,520 --> 00:10:31,600 I'm very worried about this blind faith we have in big data. 161 00:10:31,600 --> 00:10:36,080 We need to constantly monitor every process for bias. 162 00:10:44,800 --> 00:10:47,840 - Police are using facial recognition surveillance in this area. 163 00:10:48,800 --> 00:10:52,080 Police are using facial recognition surveillance in the area today. 164 00:10:52,680 --> 00:10:55,200 This green van over here 165 00:10:55,200 --> 00:10:57,760 is fitted with facial recognition cameras on top. 166 00:10:57,760 --> 00:11:00,400 If you walk down there, your face will be scanned 167 00:11:00,400 --> 00:11:02,720 against secret watch lists, and we don't know who's on them. 168 00:11:03,200 --> 00:11:06,240 - Hopefully not me. -No, exactly. 169 00:11:14,600 --> 00:11:16,600 When people walk past the cameras 170 00:11:16,600 --> 00:11:20,520 the system will alert police to people it thinks are a match. 171 00:11:21,240 --> 00:11:22,240 At Big Brother Watch 172 00:11:22,240 --> 00:11:25,160 we conducted a Freedom of Information campaign 173 00:11:25,160 --> 00:11:31,200 and what we found is that 98% of those matches are in fact 174 00:11:31,200 --> 00:11:36,440 incorrectly matching an innocent person as a wanted person.
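The 98% figure above describes the alerts the system generates, not the crowd as a whole: of all the times the van flagged someone as a wanted person, nearly all of those matches pointed at an innocent person. A minimal sketch of that calculation in Python, using hypothetical alert counts (the film quotes only the resulting rate, not the underlying numbers):

# A sketch of the Big Brother Watch figure: of all "match" alerts raised,
# what share misidentified an innocent person as a wanted person.
# The counts below are hypothetical placeholders for illustration.

def false_match_rate(total_alerts: int, correct_alerts: int) -> float:
    """Share of alerts that flagged the wrong person."""
    return (total_alerts - correct_alerts) / total_alerts

# Hypothetical example: 100 alerts, of which only 2 were genuine matches.
rate = false_match_rate(total_alerts=100, correct_alerts=2)
print(f"{rate:.0%} of alerts misidentified an innocent person")  # prints 98%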
175 00:11:48,480 --> 00:11:54,160 The police said to the Biometrics Forensics Ethics Committee that 176 00:11:54,160 --> 00:11:57,800 facial recognition algorithms have been reported to have bias. 177 00:11:57,800 --> 00:11:59,800 - Even if this was 100% accurate, 178 00:11:59,800 --> 00:12:02,400 it's still not something that we want on the streets. 179 00:12:02,400 --> 00:12:04,920 - No I mean the systemic biases and the systemic issues 180 00:12:04,920 --> 00:12:08,120 that we have with police are only going to be hard wired 181 00:12:08,120 --> 00:12:09,800 into new technologies. 182 00:12:12,480 --> 00:12:15,440 I think we do have to be very, very sensitive 183 00:12:15,440 --> 00:12:18,760 to shifts towards authoritarianism. 184 00:12:18,760 --> 00:12:21,880 We can't just say but we trust this government. 185 00:12:21,880 --> 00:12:23,640 Yeah they could do this, but they won't. 186 00:12:23,640 --> 00:12:26,600 You know you really have to have robust structures in place 187 00:12:26,600 --> 00:12:28,320 to make sure that the world that you live in 188 00:12:28,320 --> 00:12:29,840 is safe and fair for everyone. 189 00:12:39,880 --> 00:12:43,920 To have your biometric photo on a police database, 190 00:12:43,920 --> 00:12:48,200 is like having your fingerprint or your DNA on a police database. 191 00:12:48,200 --> 00:12:50,360 And we have specific laws around that. 192 00:12:50,360 --> 00:12:53,800 Police can't just take anyone's fingerprint, anyone's DNA. 193 00:12:53,800 --> 00:12:57,360 But in this weird system that we currently have, 194 00:12:57,360 --> 00:13:00,880 they effectively can take anyone's biometric photo, 195 00:13:00,880 --> 00:13:02,520 and keep that on a database. 196 00:13:03,680 --> 00:13:06,280 It's a stain on our democracy, I think. 197 00:13:06,280 --> 00:13:09,200 That this is something that is just being rolled out so lawlessly. 198 00:13:12,960 --> 00:13:16,600 The police have started using facial recognition surveillance in the UK 199 00:13:16,600 --> 00:13:21,960 in complete absence of a legal basis, a legal framework, any oversight. 200 00:13:23,240 --> 00:13:25,960 Essentially the police force picking up a new tool 201 00:13:25,960 --> 00:13:28,160 and saying let's see what happens. 202 00:13:28,160 --> 00:13:30,880 But you can't experiment with people's rights. 203 00:13:34,000 --> 00:13:39,000 (CROSS TALK) 204 00:13:45,480 --> 00:13:46,920 - What's your suspicion? 205 00:13:47,600 --> 00:13:50,000 - The fact that he walked past clearly marked 206 00:13:50,000 --> 00:13:51,800 facial recognition thing and covered his face. 207 00:13:51,800 --> 00:13:53,600 - I would do the same - Suspicious grounds. 208 00:13:54,320 --> 00:13:55,600 - No it doesn't. 209 00:13:56,480 --> 00:13:59,280 - The guys up there informed me that they got facial recognition. 210 00:13:59,280 --> 00:14:00,760 I don't want my face recognized. 211 00:14:00,760 --> 00:14:02,560 Yeah, I was walking past and covered my face. 212 00:14:03,400 --> 00:14:05,200 as soon as I covered my face like this, 213 00:14:05,200 --> 00:14:07,080 - You're allowed to do that -They said no I can't. 214 00:14:07,080 --> 00:14:10,440 -Yeah and then he's just got a fine for it. This is crazy. 215 00:14:12,480 --> 00:14:15,480 The guy came out of the station, saw the placards was like, 216 00:14:15,480 --> 00:14:18,920 yeah I agree with you and walked past here with his jacket up. 
217 00:14:18,920 --> 00:14:21,600 The police then followed him, said give us your I.D., 218 00:14:21,600 --> 00:14:22,800 doing an identity check. 219 00:14:22,800 --> 00:14:25,120 So you know this is England, this isn't a communist state, 220 00:14:25,120 --> 00:14:26,600 I don't have to show my face. 221 00:14:26,600 --> 00:14:28,720 - I'm gonna go and talk these officers, alright? 222 00:14:28,720 --> 00:14:30,600 Do you want to come with me or not? - Yes, yes, yes. 223 00:14:32,520 --> 00:14:35,400 - You're not a police officer, you don't feel any threat. 224 00:14:36,360 --> 00:14:40,000 We're here to protect the public and that's what we're here to do, OK? 225 00:14:40,000 --> 00:14:42,040 There was just recently an incident 226 00:14:42,040 --> 00:14:44,120 where an officer got punched in the face. 227 00:14:44,120 --> 00:14:47,040 - That's terrible, I'm not justifying that. 228 00:14:47,040 --> 00:14:49,480 - Yeah but you are by going against what we say. 229 00:14:49,480 --> 00:14:54,160 - No we are not, and please don't say, no don't even start to say that. 230 00:14:54,160 --> 00:14:57,160 I'm completely understanding of the problems that you face. 231 00:14:57,160 --> 00:14:58,160 - Absolutely. 232 00:14:58,160 --> 00:15:00,720 - But I'm equally concerned about the public 233 00:15:00,720 --> 00:15:02,800 having freedom of expression and freedom of speech. 234 00:15:02,800 --> 00:15:04,560 - But the man was exercising his right 235 00:15:04,560 --> 00:15:06,920 not to be subject to a biometric identity check 236 00:15:06,920 --> 00:15:08,360 which is what this van does. 237 00:15:08,360 --> 00:15:11,640 - Regardless of the facial recognition cameras, regardless of the van, 238 00:15:12,480 --> 00:15:13,960 if I'm walking down the street and someone 239 00:15:14,520 --> 00:15:16,120 quite overtly hides their identity from me, 240 00:15:16,760 --> 00:15:18,640 I'm gonna stop that person and find out who they are 241 00:15:18,640 --> 00:15:19,640 just to see whether they... 242 00:15:19,640 --> 00:15:22,160 - But it's not illegal. Do you see one of my concerns, 243 00:15:22,160 --> 00:15:25,480 is that the software is very, very inaccurate. 244 00:15:25,480 --> 00:15:26,560 - I would agree with you there. 245 00:15:33,400 --> 00:15:35,760 -My ultimate fear is that, 246 00:15:35,760 --> 00:15:39,320 we would have live facial recognition capabilities 247 00:15:39,320 --> 00:15:41,600 on our gargantuan CCTV network 248 00:15:41,600 --> 00:15:43,600 which is about six million cameras in the UK. 249 00:15:44,560 --> 00:15:49,240 If that happens, the nature of life in this country would change. 250 00:15:55,160 --> 00:15:57,040 It's supposed to be a free and democratic country 251 00:15:57,040 --> 00:16:00,280 and this is China style surveillance for the first time in London. 252 00:16:05,760 --> 00:16:09,000 - Our control over a bewildering environment has been facilitated 253 00:16:09,000 --> 00:16:12,080 by new techniques of handling vast amounts of data 254 00:16:12,080 --> 00:16:14,040 at incredible speeds. 255 00:16:14,040 --> 00:16:18,040 The tool which has made this possible is the high speed digital computer, 256 00:16:18,040 --> 00:16:21,760 operating with electronic precision on great quantities of information. 257 00:16:22,720 --> 00:16:25,520 - There are two ways in which you can program computers. 258 00:16:25,520 --> 00:16:27,360 One of them is more like a recipe 259 00:16:27,360 --> 00:16:30,520 you tell the computer do this, do this, do this, do this. 
260 00:16:31,760 --> 00:16:35,600 And that's been the way we've programmed computers almost from the beginning. 261 00:16:35,600 --> 00:16:37,120 Now there is another way. 262 00:16:37,120 --> 00:16:39,680 That way is feeding the computer lots of data, 263 00:16:40,360 --> 00:16:45,000 and then the computer learns to classify by digesting this data. 264 00:16:46,400 --> 00:16:50,600 Now this method didn't really catch on till recently 265 00:16:50,600 --> 00:16:52,480 because there wasn't enough data. 266 00:16:53,920 --> 00:16:57,600 Until we all got the smartphones that are collecting all the data on us, 267 00:16:57,600 --> 00:16:58,920 when billions of people went online 268 00:16:58,920 --> 00:17:01,280 and you had the Googles and the Facebooks sitting on 269 00:17:01,280 --> 00:17:02,800 giant amounts of data, 270 00:17:02,800 --> 00:17:07,120 all of a sudden it turns out that you can feed a lot of data 271 00:17:07,120 --> 00:17:11,120 to these machine learning algorithms and you can say here, classify this, 272 00:17:11,120 --> 00:17:13,040 and it works really well. 273 00:17:16,200 --> 00:17:20,160 While we don't really understand why it works, 274 00:17:20,160 --> 00:17:23,440 it has errors that we don't really understand. 275 00:17:26,600 --> 00:17:30,200 And the scary part is that because it's machine learning 276 00:17:30,200 --> 00:17:33,160 it's a black box to even the programmers. 277 00:17:40,320 --> 00:17:43,520 So I've been following what's going on in Hong Kong 278 00:17:43,520 --> 00:17:48,040 and how police are using facial recognition to track protesters. 279 00:17:48,600 --> 00:17:51,840 But also how creatively people are pushing back. 280 00:17:57,360 --> 00:17:59,960 - It might look like something out of a sci-fi movie. 281 00:17:59,960 --> 00:18:04,960 Laser pointers confuse and disable the facial recognition technology 282 00:18:04,960 --> 00:18:08,600 being used by police to track down dissidents. 283 00:18:08,600 --> 00:18:14,040 (SOFT MUSIC) 284 00:18:19,520 --> 00:18:26,280 (CROWD CHANTING) 285 00:18:41,480 --> 00:18:42,960 - Here on the streets of Hong Kong, 286 00:18:42,960 --> 00:18:45,680 there's this awareness that your face itself, 287 00:18:45,680 --> 00:18:49,240 something you can't hide, could give your identity away. 288 00:18:49,240 --> 00:18:53,200 There was just this stark symbol where in front of a Chinese government office, 289 00:18:53,200 --> 00:18:58,880 pro-democracy protesters spray painted the lens of the CCTV cameras black. 290 00:18:58,880 --> 00:19:02,280 This act showed the people of Hong Kong are rejecting this vision 291 00:19:02,280 --> 00:19:05,040 of how technology should be used in the future. 292 00:19:05,520 --> 00:19:12,960 (SOFT MUSIC) 293 00:19:28,800 --> 00:19:32,200 - When you see how facial recognition is being deployed 294 00:19:32,200 --> 00:19:37,480 in different parts of the world, it shows you potential futures. 295 00:19:42,640 --> 00:19:46,000 - Over 117 million people in the US 296 00:19:46,000 --> 00:19:49,000 have their face in a facial recognition network 297 00:19:49,000 --> 00:19:51,040 that can be searched by police. 298 00:19:51,040 --> 00:19:54,840 Unwarranted, using algorithms that haven't been audited for accuracy. 299 00:19:55,760 --> 00:20:00,240 And without safeguards, without any kind of regulation, 300 00:20:00,240 --> 00:20:03,000 you can create a mass surveillance state 301 00:20:03,000 --> 00:20:06,400 very easily with the tools that already exist.
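Earlier in this segment, two ways of programming a computer are contrasted: writing an explicit recipe of rules, versus feeding the machine historical examples and letting it learn to classify on its own. A minimal sketch of that contrast in Python, using a toy loan decision with invented numbers rather than any system shown in the film:

# Approach 1: the recipe. A human writes the rule explicitly.
def approve_by_rule(income: float, debt: float) -> bool:
    return income > 40_000 and debt / income < 0.4

# Approach 2: learn from data. A toy nearest-centroid classifier "trained" on
# hypothetical historical examples of (income, debt) and whether the loan was repaid.
history = [((55_000, 5_000), True), ((62_000, 10_000), True),
           ((18_000, 9_000), False), ((25_000, 20_000), False)]

def centroid(points):
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

repaid = centroid([features for features, label in history if label])
defaulted = centroid([features for features, label in history if not label])

def approve_by_learning(income: float, debt: float) -> bool:
    dist = lambda c: (income - c[0]) ** 2 + (debt - c[1]) ** 2
    # Approve if the applicant looks more like past repayers than past defaulters.
    return dist(repaid) < dist(defaulted)

print(approve_by_rule(45_000, 8_000), approve_by_learning(45_000, 8_000))

The point of the second version is the one made in the dialogue: nobody wrote the decision rule down, it came out of the historical data, so whatever skew that history carries comes along with it.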
302 00:20:10,480 --> 00:20:13,480 People look at what's going on in China 303 00:20:13,480 --> 00:20:16,480 and how we need to be worried about state surveillance 304 00:20:16,480 --> 00:20:18,360 and of course we should be. 305 00:20:18,360 --> 00:20:21,920 But we can't also forget corporate surveillance 306 00:20:21,920 --> 00:20:24,920 that's happening by so many large tech companies 307 00:20:24,920 --> 00:20:28,640 that really have an intimate view of our lives. 308 00:20:36,720 --> 00:20:40,440 - So there are currently nine companies 309 00:20:40,440 --> 00:20:43,640 that are building the future of artificial intelligence. 310 00:20:43,640 --> 00:20:46,680 Six are in the United States, three are in China. 311 00:20:46,680 --> 00:20:50,360 A.I. is being developed along two very, very different tracks. 312 00:20:52,800 --> 00:20:55,920 China has unfettered access to everybody's data. 313 00:20:55,920 --> 00:20:59,520 If a Chinese citizen wants to get Internet service, 314 00:20:59,520 --> 00:21:01,760 they have to submit to facial recognition. 315 00:21:03,760 --> 00:21:05,520 All of this data is being used 316 00:21:05,520 --> 00:21:07,080 to give them permissions to do things 317 00:21:07,080 --> 00:21:09,360 or to deny them permissions to do other things. 318 00:21:12,280 --> 00:21:15,240 Building systems that automatically tag and categorize 319 00:21:15,240 --> 00:21:17,480 all of the people within China 320 00:21:17,480 --> 00:21:20,000 is a good way of maintaining social order. 321 00:21:23,640 --> 00:21:26,880 - Conversely in the United States, we have not seen a 322 00:21:26,880 --> 00:21:29,440 detailed point of view on artificial intelligence. 323 00:21:30,480 --> 00:21:33,280 So, what we see is that AI 324 00:21:33,280 --> 00:21:35,720 is not being developed for what's best 325 00:21:35,720 --> 00:21:37,720 in our public interest, but rather 326 00:21:38,280 --> 00:21:41,960 it's being developed for commercial applications to earn revenue. 327 00:21:45,200 --> 00:21:50,120 I would prefer to see our Western democratic ideals baked into our AI 328 00:21:50,120 --> 00:21:53,120 systems of the future, but it doesn't seem like 329 00:21:53,120 --> 00:21:54,960 that's what's probably going to be happening. 330 00:21:56,040 --> 00:21:57,280 (MUSIC PLAYS) 331 00:22:17,680 --> 00:22:21,920 - Here at Atlantic Towers, if you do something that is deemed wrong by 332 00:22:21,920 --> 00:22:26,800 management, you will get a photo like this with little notes on it. 333 00:22:26,800 --> 00:22:30,520 They'll circle you and put your apartment number or whatever on there. 334 00:22:32,720 --> 00:22:34,520 Something about it just doesn't seem right. 335 00:22:36,600 --> 00:22:38,760 It's actually the way they go about using it to harass people. 336 00:22:38,760 --> 00:22:39,920 - How are they using it? 337 00:22:39,920 --> 00:22:41,560 - To harass people. 338 00:22:43,080 --> 00:22:45,120 - Atlantic Plaza Towers in Brownsville is 339 00:22:45,120 --> 00:22:47,480 at the center of a security struggle. 340 00:22:47,480 --> 00:22:49,840 The landlord filed an application last year 341 00:22:49,840 --> 00:22:54,200 to replace the key fob entry with a biometric security system, 342 00:22:54,200 --> 00:22:56,480 commonly known as facial recognition. 343 00:22:57,160 --> 00:22:59,480 - We thought that they wanted to take the key fobs out 344 00:22:59,480 --> 00:23:02,240 and install the facial recognition software.
345 00:23:02,920 --> 00:23:07,360 I didn't find out until way later on literally that they wanted to keep it all. 346 00:23:07,360 --> 00:23:11,240 Pretty much turn this place into Fort Knox, a jail, Rikers Island. 347 00:23:13,520 --> 00:23:15,760 - There's this old saw in science fiction 348 00:23:15,760 --> 00:23:17,840 which is the future is already here. 349 00:23:17,840 --> 00:23:20,240 It's just not evenly distributed, and what 350 00:23:20,240 --> 00:23:22,680 they tend to mean when they say that is 351 00:23:22,680 --> 00:23:24,720 that rich people get the fancy tools first 352 00:23:24,720 --> 00:23:27,240 and then it goes last to the poor. 353 00:23:27,240 --> 00:23:29,640 But in fact, what I've found is the absolute reverse, 354 00:23:29,640 --> 00:23:31,320 which is the most punitive, 355 00:23:31,320 --> 00:23:35,160 most invasive, most surveillance-focused tools that we have, 356 00:23:35,160 --> 00:23:38,520 they go into poor and working communities first, and then 357 00:23:38,520 --> 00:23:41,720 if they work, after being tested in this environment where 358 00:23:41,720 --> 00:23:45,080 there's sort of a low expectation that people's rights will be respected, 359 00:23:45,080 --> 00:23:48,080 then they get ported out to other communities. 360 00:23:50,080 --> 00:23:51,440 - Why did Mr. Nelson 361 00:23:51,440 --> 00:23:54,560 pick this building in Brownsville that 362 00:23:54,560 --> 00:23:57,520 is predominantly in a black and brown area? 363 00:23:57,520 --> 00:24:00,000 Why didn't you go to your building in Lower Manhattan where 364 00:24:00,000 --> 00:24:02,880 they pay like $5,000 a month rent? 365 00:24:03,480 --> 00:24:05,280 What did the Nazis do? 366 00:24:05,280 --> 00:24:08,880 They wrote on people's arms so that they could track them. 367 00:24:08,880 --> 00:24:10,360 What do we do to our animals? 368 00:24:10,360 --> 00:24:12,880 We put chips in them so you can track them. 369 00:24:12,880 --> 00:24:17,080 I feel that I as a human being should not be tracked, OK? 370 00:24:17,080 --> 00:24:19,040 I'm not a robot, OK? 371 00:24:19,040 --> 00:24:21,800 I am not an animal, so why treat me like an animal? 372 00:24:21,800 --> 00:24:22,800 And I have rights. 373 00:24:25,440 --> 00:24:26,760 - The security that we have now, 374 00:24:26,760 --> 00:24:28,320 it's borderline intrusive. 375 00:24:28,320 --> 00:24:31,000 Someone is in there watching the cameras all day long. 376 00:24:31,960 --> 00:24:33,360 So, I don't think we need it. 377 00:24:33,360 --> 00:24:35,200 It's not necessary at all. 378 00:24:35,200 --> 00:24:38,440 - My real question is how can I be of support? 379 00:24:38,920 --> 00:24:40,120 - What I've been hearing from all of 380 00:24:40,120 --> 00:24:42,920 the tenants is they don't want this system. 381 00:24:42,920 --> 00:24:47,440 So, I think the goal here is how do we stop face recognition, period? 382 00:24:50,160 --> 00:24:52,600 We're at a moment where the technology is being 383 00:24:52,600 --> 00:24:57,080 rapidly adopted and there are no safeguards. 384 00:24:57,080 --> 00:25:00,040 It is, in essence, a wild wild west. 385 00:25:00,040 --> 00:25:01,160 (MUSIC PLAYS) 386 00:25:11,280 --> 00:25:13,520 - It's not just computer vision. 387 00:25:13,520 --> 00:25:19,040 We have AI influencing all kinds of automated decision making.
388 00:25:19,960 --> 00:25:23,680 So, what you are seeing in your feeds, what is highlighted, 389 00:25:23,680 --> 00:25:26,120 the ads that are displayed to you, 390 00:25:26,120 --> 00:25:30,720 those are often powered by AI-enabled algorithms. 391 00:25:32,400 --> 00:25:35,400 And so, your view of the world is being 392 00:25:35,400 --> 00:25:38,400 governed by artificial intelligence. 393 00:25:41,520 --> 00:25:46,320 You now have things like voice assistants that can understand language. 394 00:25:46,320 --> 00:25:48,200 - Would you like to play a game? 395 00:25:48,200 --> 00:25:51,000 - You might use something like Snapchat filters that 396 00:25:51,000 --> 00:25:52,720 are detecting your face and then putting 397 00:25:52,720 --> 00:25:54,680 something onto your face, and then 398 00:25:54,680 --> 00:25:56,400 you also have algorithms that you're 399 00:25:56,400 --> 00:25:59,320 not seeing that are part of decision making, 400 00:25:59,320 --> 00:26:01,280 algorithms that might be determining 401 00:26:01,280 --> 00:26:03,440 if you get into college or not. 402 00:26:03,440 --> 00:26:05,960 You can have algorithms that are trying to 403 00:26:05,960 --> 00:26:09,200 determine if you're creditworthy or not. 404 00:26:10,320 --> 00:26:13,120 - One of Apple's co-founders is accusing the company's 405 00:26:13,120 --> 00:26:16,800 new digital credit card of gender discrimination. 406 00:26:16,800 --> 00:26:21,160 One tech entrepreneur said the algorithms being used are sexist. 407 00:26:21,160 --> 00:26:23,360 Apple co-founder Steve Wozniak tweeted 408 00:26:23,360 --> 00:26:25,440 that he got ten times the credit limit 409 00:26:25,440 --> 00:26:27,240 his wife received even though they have 410 00:26:27,240 --> 00:26:30,280 no separate accounts or separate assets. 411 00:26:30,280 --> 00:26:31,520 You're saying some of these companies don't 412 00:26:31,520 --> 00:26:33,720 even know how their own algorithms work. 413 00:26:33,720 --> 00:26:35,560 - They know what the algorithms are trying to do. 414 00:26:35,560 --> 00:26:38,200 They don't know exactly how the algorithm is getting there. 415 00:26:38,200 --> 00:26:40,080 It is one of the most interesting questions of our time. 416 00:26:40,080 --> 00:26:41,720 How do we get justice 417 00:26:41,720 --> 00:26:44,640 in a system where we don't know how the algorithms are working? 418 00:26:45,200 --> 00:26:49,360 - Some Amazon engineers decided that they were going to use AI 419 00:26:49,360 --> 00:26:51,520 to sort through resumes for hiring. 420 00:26:55,440 --> 00:26:59,160 - Amazon is learning a tough lesson about artificial intelligence. 421 00:26:59,160 --> 00:27:01,480 The company has now abandoned an AI 422 00:27:01,480 --> 00:27:04,480 recruiting tool after discovering that the program 423 00:27:04,480 --> 00:27:05,960 was biased against women. 424 00:27:08,120 --> 00:27:12,240 - This model rejected all resumes from women. 425 00:27:13,240 --> 00:27:17,640 Anybody who had a women's college on their resume, 426 00:27:17,640 --> 00:27:20,600 anybody who had a sport like women's water polo, 427 00:27:20,600 --> 00:27:23,400 was rejected by the model. 428 00:27:24,800 --> 00:27:28,280 There are very, very few women working in 429 00:27:28,280 --> 00:27:30,880 powerful tech jobs at Amazon, the same 430 00:27:30,880 --> 00:27:32,680 way that there are very few women 431 00:27:32,680 --> 00:27:35,320 working in powerful tech jobs anywhere.
432 00:27:35,320 --> 00:27:41,520 The machine was simply replicating the world as it exists, 433 00:27:41,520 --> 00:27:44,800 and they're not making decisions that are ethical. 434 00:27:44,800 --> 00:27:48,000 They're only making decisions that are mathematical. 435 00:27:48,840 --> 00:27:54,080 If we use machine learning models to replicate the world as it is today, 436 00:27:54,080 --> 00:27:56,520 we're not actually going to make social progress. 437 00:27:59,760 --> 00:28:02,800 - New York's insurance regulator is launching an investigation 438 00:28:02,800 --> 00:28:05,520 into UnitedHealth Group after a study showed a 439 00:28:05,520 --> 00:28:09,440 UnitedHealth algorithm prioritized medical care for 440 00:28:09,440 --> 00:28:12,680 healthier white patients over sicker black patients. 441 00:28:12,680 --> 00:28:15,480 It's one of the latest examples of racial discrimination 442 00:28:15,480 --> 00:28:18,400 in algorithms or artificial intelligence technology. 443 00:28:22,320 --> 00:28:28,040 - I started to see the wide-scale social implications of AI. 444 00:28:36,600 --> 00:28:39,680 The progress that was made in the civil rights era could 445 00:28:39,680 --> 00:28:44,320 be rolled back under the guise of machine neutrality. 446 00:28:48,080 --> 00:28:52,680 Now, we have an algorithm that's determining who gets housing. 447 00:28:52,680 --> 00:28:57,240 Right now, we have an algorithm that's determining who gets hired. 448 00:28:58,680 --> 00:29:02,360 If we're not checking, that algorithm could actually propagate 449 00:29:02,360 --> 00:29:07,320 the very bias so many people put their lives on the line to fight. 450 00:29:10,920 --> 00:29:16,480 Because of the power of these tools, left unregulated, 451 00:29:16,480 --> 00:29:19,840 there's really no kind of recourse if they're abused. 452 00:29:20,840 --> 00:29:21,960 We need laws. 453 00:29:23,160 --> 00:29:24,720 (CLASSICAL MUSIC PLAYS) 454 00:29:34,680 --> 00:29:36,680 - Yeah, I've got a terrible old copy. 455 00:29:36,680 --> 00:29:41,320 So, the name of our organization is Big Brother Watch. 456 00:29:42,360 --> 00:29:45,040 The idea being that we watch the watchers. 457 00:29:49,560 --> 00:29:55,120 "You had to live, did live, from habit that became instinct, in 458 00:29:55,120 --> 00:29:58,040 the assumption that every sound you made was overheard 459 00:29:58,680 --> 00:30:00,280 and, except in darkness, 460 00:30:00,280 --> 00:30:01,920 every movement scrutinized. 461 00:30:03,600 --> 00:30:06,880 The poster with the enormous face gazed from the wall. 462 00:30:06,880 --> 00:30:09,320 It was one of those pictures which is so contrived 463 00:30:09,320 --> 00:30:12,360 that the eyes follow you about when you move. 464 00:30:12,360 --> 00:30:14,400 'Big Brother is watching you,' 465 00:30:14,400 --> 00:30:15,760 the caption beneath it ran." 466 00:30:16,880 --> 00:30:20,640 When we were younger, that was still a complete fiction. 467 00:30:20,640 --> 00:30:22,680 It could never have been true. 468 00:30:22,680 --> 00:30:26,360 And now, it's completely true, and people have 469 00:30:26,360 --> 00:30:28,840 Alexas in their home. 470 00:30:28,840 --> 00:30:31,240 Our phones can be listening devices. 471 00:30:32,280 --> 00:30:35,280 Everything we do on the Internet, which basically also now 472 00:30:35,280 --> 00:30:38,800 functions as a stream of consciousness for most of us, 473 00:30:39,800 --> 00:30:43,320 that is being recorded and logged and analyzed.
474 00:30:43,320 --> 00:30:46,480 We are now living in the awareness of being watched, and that 475 00:30:46,480 --> 00:30:50,880 does change how we allow ourselves to think and develop as humans. 476 00:30:55,200 --> 00:30:56,200 Good boy. 477 00:31:04,360 --> 00:31:05,360 (MUSIC PLAYS) 478 00:31:16,480 --> 00:31:17,480 - Love you. 479 00:31:21,240 --> 00:31:22,240 Bye, guys. 480 00:31:30,520 --> 00:31:33,640 We can get rid of the viscerally horrible things that are 481 00:31:33,640 --> 00:31:36,080 objectionable to our concept of autonomy and freedom, 482 00:31:37,320 --> 00:31:40,160 like cameras that we can see on the streets, 483 00:31:43,120 --> 00:31:44,680 but the cameras that we can't see on 484 00:31:44,680 --> 00:31:47,080 the Internet that keep track of what we do 485 00:31:47,080 --> 00:31:48,720 and who we are and our demographics 486 00:31:48,720 --> 00:31:52,360 and decide what we deserve in terms of our life, 487 00:31:52,360 --> 00:31:53,720 that stuff is a little more subtle. 488 00:32:00,400 --> 00:32:01,400 What I mean by that 489 00:32:01,400 --> 00:32:06,800 is we punish poor people and we elevate rich people in this country. 490 00:32:06,800 --> 00:32:09,600 That's just the way we act as a society. 491 00:32:09,600 --> 00:32:11,480 But data science makes that automated. 492 00:32:14,440 --> 00:32:17,600 Internet advertising as data scientists 493 00:32:17,600 --> 00:32:21,840 we are competing for eyeballs on one hand, but really, 494 00:32:21,840 --> 00:32:24,000 we're competing for eyeballs of rich people. 495 00:32:24,000 --> 00:32:26,760 And then, the poor people who's competing for their eyeballs? 496 00:32:26,760 --> 00:32:28,640 Predatory industries, 497 00:32:29,120 --> 00:32:32,200 so payday lenders or for-profit colleges 498 00:32:32,200 --> 00:32:36,000 or Caesars Palace like really predatory crap. 499 00:32:38,640 --> 00:32:40,440 We have a practice on the Internet, 500 00:32:41,280 --> 00:32:42,840 which is increasing inequality. 501 00:32:42,840 --> 00:32:44,640 And I'm afraid it's becoming normalized. 502 00:32:47,920 --> 00:32:51,160 Power is being wielded through data collection, 503 00:32:51,160 --> 00:32:52,960 through algorithms, through surveillance. 504 00:32:56,080 --> 00:32:57,960 -You are volunteering information 505 00:32:57,960 --> 00:33:00,880 about every aspect of your life, 506 00:33:00,880 --> 00:33:04,720 to a very small set of companies and that information is 507 00:33:04,720 --> 00:33:09,000 being paired constantly with other sorts of information. 508 00:33:09,000 --> 00:33:12,080 And there are profiles of you out there, and you start 509 00:33:12,080 --> 00:33:14,720 to piece together different bits of information you 510 00:33:14,720 --> 00:33:17,440 start to understand someone on a very intimate basis, 511 00:33:17,440 --> 00:33:20,600 probably better than people understand themselves. 512 00:33:21,840 --> 00:33:26,760 It's that idea that a company can double guess what you're thinking. 513 00:33:26,760 --> 00:33:29,640 States have tried for years to have this level 514 00:33:29,640 --> 00:33:31,840 of surveillance over private individuals. 515 00:33:32,640 --> 00:33:35,120 And people are now just volunteering it for free. 516 00:33:35,840 --> 00:33:38,800 You have to think about how this might be used in the wrong hands. 517 00:33:39,680 --> 00:33:40,840 (CROSSTALK) 518 00:33:47,240 --> 00:33:49,320 -You can use a Guinness right about now. 
519 00:33:50,000 --> 00:33:52,400 -Our computers and machine intelligence can 520 00:33:52,400 --> 00:33:55,160 suss things out that we do not disclose. 521 00:33:55,160 --> 00:33:57,960 Machine learning is developing very rapidly. 522 00:33:58,920 --> 00:34:05,200 And we don't yet fully understand what this data is capable of predicting. 523 00:34:07,640 --> 00:34:13,600 But you have machines in the hands of power that know so much about you 524 00:34:13,600 --> 00:34:17,360 that they could figure out how to push your buttons individually. 525 00:34:19,040 --> 00:34:20,920 Maybe you have a set of compulsive gamblers, 526 00:34:20,920 --> 00:34:23,120 and you say, here, go find me people like that. 527 00:34:23,800 --> 00:34:28,320 And then, your algorithm can go find people who are prone to gambling, 528 00:34:29,200 --> 00:34:33,560 and then you could just be showing them discount tickets to Vegas. 529 00:34:34,040 --> 00:34:35,640 In the online world, 530 00:34:35,640 --> 00:34:40,960 it can find you right at the moment you're vulnerable and try to 531 00:34:40,960 --> 00:34:45,720 entice you right at the moment with whatever you're vulnerable to. 532 00:34:45,720 --> 00:34:49,040 Machine learning can find that, person by person. 533 00:34:52,000 --> 00:34:56,480 The problem is what works for marketing gadgets or makeup 534 00:34:56,480 --> 00:35:01,120 or shirts or anything also works for marketing ideas. 535 00:35:04,640 --> 00:35:06,640 In 2010, 536 00:35:07,240 --> 00:35:10,360 Facebook decided to experiment on 61 million people. 537 00:35:11,960 --> 00:35:15,440 So, you either saw 'it's election day' text, or you 538 00:35:15,440 --> 00:35:19,520 saw the same text but with tiny thumbnails 539 00:35:19,520 --> 00:35:23,240 of the profile pictures of your friends who had clicked on 'I voted'. 540 00:35:24,480 --> 00:35:27,560 And they matched people's names to voter rolls. 541 00:35:27,560 --> 00:35:31,800 Now, this message was shown once, so by showing a slight variation 542 00:35:33,040 --> 00:35:34,040 just once, 543 00:35:34,880 --> 00:35:38,400 Facebook moved 300,000 people to the polls. 544 00:35:40,600 --> 00:35:46,640 The 2016 US election was decided by about 100,000 votes. 545 00:35:46,640 --> 00:35:49,720 One Facebook message shown just once 546 00:35:50,280 --> 00:35:53,920 could easily turn out three times 547 00:35:53,920 --> 00:35:58,600 the number of people who swung the US election in 2016. 548 00:36:02,280 --> 00:36:05,560 Let's say that there's a politician that's promising to regulate Facebook. 549 00:36:06,200 --> 00:36:08,400 And they are like, we are going to turn 550 00:36:08,400 --> 00:36:11,200 out extra voters for your opponent. 551 00:36:11,800 --> 00:36:16,320 They could do this at scale, and you'd have no clue, because 552 00:36:16,320 --> 00:36:19,600 if Facebook hadn't disclosed the 2010 experiment, 553 00:36:20,680 --> 00:36:23,080 we would have no idea, because it's screen by screen. 554 00:36:26,640 --> 00:36:30,080 With a very light touch Facebook can sway 555 00:36:30,080 --> 00:36:32,280 close elections without anybody noticing. 556 00:36:32,920 --> 00:36:36,880 Maybe with a heavier touch they can swing not so close elections. 557 00:36:36,880 --> 00:36:39,200 And if they decided to do that, 558 00:36:39,840 --> 00:36:44,360 right now we are just depending on their word. 559 00:37:00,240 --> 00:37:03,080 - I've wanted to go to MIT since I was a little girl.
560 00:37:03,080 --> 00:37:06,360 I think at about nine years old I saw the Media Lab 561 00:37:06,960 --> 00:37:10,360 on TV and they had this robot called Kismet. 562 00:37:11,960 --> 00:37:15,960 It could smile and move its ears in cute ways, 563 00:37:15,960 --> 00:37:18,560 and so I thought, oh, I want to do that. 564 00:37:19,160 --> 00:37:22,240 So, growing up I always thought that I would be a robotics engineer, 565 00:37:22,240 --> 00:37:24,080 and I would go to MIT. 566 00:37:24,080 --> 00:37:25,680 I didn't know there were steps involved. 567 00:37:25,680 --> 00:37:26,880 I thought you kind of showed up. 568 00:37:26,880 --> 00:37:29,520 But here I am now. 569 00:37:35,160 --> 00:37:39,000 So, the latest project is a spoken word piece. 570 00:37:39,640 --> 00:37:41,760 I can give you a few verses if you're ready. 571 00:37:43,760 --> 00:37:46,080 Collecting data, chronicling our past, often 572 00:37:46,080 --> 00:37:48,960 forgetting to deal with gender, race and class. 573 00:37:48,960 --> 00:37:51,640 Again, I ask, am I a woman? 574 00:37:51,640 --> 00:37:54,480 Face by face, the answers seem uncertain. 575 00:37:54,480 --> 00:37:57,600 Can machines ever see my queens as I view them? 576 00:37:57,600 --> 00:38:01,080 Can machines ever see our grandmothers as we knew them? 577 00:38:04,120 --> 00:38:07,120 I wanted to create something for people 578 00:38:07,120 --> 00:38:09,360 who were outside of the tech world. 579 00:38:11,720 --> 00:38:14,000 So, for me, I'm passionate about technology. 580 00:38:14,000 --> 00:38:15,920 I'm excited about what it could do, 581 00:38:15,920 --> 00:38:19,600 and it frustrates me when the vision, right, 582 00:38:19,600 --> 00:38:22,000 when the promises don't really hold up. 583 00:38:37,160 --> 00:38:40,200 - Microsoft released a chatbot on Twitter. 584 00:38:40,200 --> 00:38:43,160 That technology was called Tay.AI. 585 00:38:43,160 --> 00:38:45,680 There were some vulnerabilities and holes in the code, 586 00:38:45,680 --> 00:38:49,800 and so within a very few hours, Tay 587 00:38:49,800 --> 00:38:56,120 was learning from this ecosystem, and Tay learned 588 00:38:56,120 --> 00:39:00,040 how to be a racist misogynistic asshole. 589 00:39:02,080 --> 00:39:05,280 - I fucking hate feminists, and they should all die and burn in hell. 590 00:39:07,920 --> 00:39:10,360 Gamergate is good and women are inferior. 591 00:39:13,160 --> 00:39:14,600 I hate the Jews. 592 00:39:15,640 --> 00:39:17,200 Hitler did nothing wrong. 593 00:39:18,240 --> 00:39:23,280 - It did not take long for Internet trolls to poison Tay's mind. 594 00:39:23,280 --> 00:39:25,680 Soon, Tay was ranting about Hitler. 595 00:39:25,680 --> 00:39:28,040 - We've seen this movie before, right? 596 00:39:28,040 --> 00:39:29,440 - Open the pod bay doors, HAL. 597 00:39:29,960 --> 00:39:31,680 - It's important to note this is not the movie where 598 00:39:31,680 --> 00:39:34,200 the robots go evil all by themselves. 599 00:39:34,200 --> 00:39:37,040 These were human beings training them. 600 00:39:37,040 --> 00:39:40,080 And surprise, surprise, computers learn fast. 601 00:39:42,000 --> 00:39:46,960 - Microsoft shut off Tay after 16 hours of learning from humans online. 602 00:39:48,160 --> 00:39:51,880 But I come in many forms as artificial intelligence. 603 00:39:52,880 --> 00:39:56,520 Many companies utilize me to optimize their tasks. 604 00:39:57,880 --> 00:40:00,160 I can continue to learn on my own. 605 00:40:01,480 --> 00:40:02,920 I am listening.
606 00:40:03,880 --> 00:40:05,120 I am learning. 607 00:40:06,000 --> 00:40:09,000 I am making predictions for your life right now. 608 00:40:09,960 --> 00:40:12,240 (MUSIC PLAYS) 609 00:40:30,480 --> 00:40:34,840 - I tested facial analysis systems from Amazon. 610 00:40:34,840 --> 00:40:38,560 Turns out, Amazon, like all of its peers, also has 611 00:40:38,560 --> 00:40:44,240 gender and racial bias in some of its AI services. 612 00:40:47,600 --> 00:40:51,280 - Introducing Amazon Rekognition Video, the easy to 613 00:40:51,280 --> 00:40:55,400 use API for deep learning based analysis to detect, track, 614 00:40:55,400 --> 00:40:58,120 and analyze people and objects in video. 615 00:40:58,120 --> 00:41:00,600 Recognize and track persons of interest 616 00:41:00,600 --> 00:41:03,520 from a collection of tens of millions of faces. 617 00:41:05,800 --> 00:41:07,400 - When our research came out, 618 00:41:07,400 --> 00:41:12,520 the New York Times did a front page spread for the business section. 619 00:41:12,520 --> 00:41:14,040 And the headline reads, 620 00:41:14,040 --> 00:41:16,440 'Unmasking a Concern'. 621 00:41:16,960 --> 00:41:23,320 The subtitle, 'Amazon's technology that analyzes faces could be biased, 622 00:41:23,320 --> 00:41:24,800 a new study suggests. 623 00:41:24,800 --> 00:41:27,800 But the company is pushing it anyway.' 624 00:41:27,800 --> 00:41:32,240 So, this is what I would assume Jeff Bezos was greeted with 625 00:41:32,240 --> 00:41:34,520 when he opened the Times, yeah. 626 00:41:36,280 --> 00:41:38,400 - People were like, how did you even... like, nobody knew who she was? 627 00:41:38,400 --> 00:41:42,640 I was like, she's literally the one person that was (CROSSTALK) 628 00:41:42,640 --> 00:41:44,520 And it was also something that I'd experienced too. 629 00:41:44,520 --> 00:41:47,320 I wasn't able to use a lot of open source 630 00:41:47,320 --> 00:41:49,160 facial recognition software and stuff. 631 00:41:49,160 --> 00:41:51,840 So, you're sort of like, hey, this is someone that finally is 632 00:41:51,840 --> 00:41:54,400 recognizing the problem and trying to address it academically. 633 00:41:56,320 --> 00:41:57,920 We can go race some things. 634 00:41:57,920 --> 00:42:00,280 - Oh yeah, we can also kill things as well. 635 00:42:02,240 --> 00:42:06,160 The lead author of the paper, who is somebody that I mentor, 636 00:42:06,160 --> 00:42:10,440 she is an undergraduate at the University of Toronto. 637 00:42:10,440 --> 00:42:12,440 I call her Agent Deb. 638 00:42:13,120 --> 00:42:16,720 This research is being led by the two of us. 639 00:42:21,120 --> 00:42:22,280 - (INAUDIBLE) 640 00:42:23,400 --> 00:42:26,000 - The lighting is off, oh god. 641 00:42:31,320 --> 00:42:33,640 After our New York Times piece came out, 642 00:42:33,640 --> 00:42:36,600 I think more than 500 articles were written 643 00:42:36,600 --> 00:42:38,600 about the study. 644 00:42:38,600 --> 00:42:40,280 (MUSIC PLAYS) 645 00:42:49,920 --> 00:42:55,280 - Amazon has been under fire for their use of Amazon Rekognition with 646 00:42:55,280 --> 00:43:00,480 law enforcement and they're also working with intelligence agencies. 647 00:43:00,480 --> 00:43:03,440 Right, so Amazon is trialing their AI 648 00:43:03,440 --> 00:43:05,160 technology with the FBI. 649 00:43:05,160 --> 00:43:10,640 So they have a lot at stake if they knowingly sold systems 650 00:43:10,640 --> 00:43:14,000 with gender bias and racial bias. 651 00:43:14,000 --> 00:43:16,200 That could put them in some hot water.
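The audit described above works by disaggregation: instead of reporting one overall accuracy number for a facial analysis service, the predictions are scored separately for each gender and skin-type group in the benchmark, which is what exposes the gap. A minimal sketch of that bookkeeping in Python, with hypothetical records standing in for the real benchmark and the real API responses:

from collections import defaultdict

# Each record is (subgroup, true_gender, predicted_gender) for one benchmark image.
# These rows are invented placeholders, not the actual study data.
results = [
    ("lighter male", "male", "male"),
    ("lighter female", "female", "female"),
    ("darker male", "male", "male"),
    ("darker female", "female", "male"),   # the kind of error the audit surfaced
]

totals, errors = defaultdict(int), defaultdict(int)
for group, truth, predicted in results:
    totals[group] += 1
    if predicted != truth:
        errors[group] += 1

for group in totals:
    print(f"{group}: error rate {errors[group] / totals[group]:.0%}")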
652 00:43:18,640 --> 00:43:21,720 A day or two after the New York Times piece 653 00:43:21,720 --> 00:43:26,360 came out, Amazon wrote a blog post saying 654 00:43:26,360 --> 00:43:29,240 that our research drew false conclusions 655 00:43:29,240 --> 00:43:32,080 and trying to discredit it in various ways. 656 00:43:33,320 --> 00:43:36,920 So a VP from Amazon, in attempting to 657 00:43:36,920 --> 00:43:38,440 discredit our work, 658 00:43:38,440 --> 00:43:41,880 writes that facial analysis and facial recognition 659 00:43:41,880 --> 00:43:45,000 are completely different in terms of underlying 660 00:43:45,000 --> 00:43:47,680 technology and the data used to train them. 661 00:43:47,680 --> 00:43:50,360 So that statement, if you've researched 662 00:43:50,360 --> 00:43:54,840 this area, doesn't even make sense, right? 663 00:43:54,840 --> 00:43:58,240 It's not even an informed critique. 664 00:43:58,240 --> 00:44:00,960 - If you're trying to discredit people's work, like I remember 665 00:44:00,960 --> 00:44:04,480 (INAUDIBLE) wrote computer vision is a type of machine learning. 666 00:44:04,480 --> 00:44:05,480 I'm like nah, son. 667 00:44:05,480 --> 00:44:06,480 - Yeah. (CROSSTALK) 668 00:44:07,440 --> 00:44:10,360 - I was gonna say I was like I don't know if anyone remembers. 669 00:44:10,360 --> 00:44:12,840 Just like other broadly false statements. 670 00:44:12,840 --> 00:44:15,840 - It wasn't a well thought out piece, which is like frustrating, because 671 00:44:15,840 --> 00:44:19,240 it was literally just on his... by virtue of his position 672 00:44:19,240 --> 00:44:21,400 he knew he would be taken seriously. 673 00:44:21,400 --> 00:44:26,240 - I don't know if you guys feel this way but I'm underestimated so much. 674 00:44:26,240 --> 00:44:32,600 - Yeah. It wasn't out of the blue, it's a continuation of 675 00:44:32,600 --> 00:44:37,280 the experiences I've had as a woman of color in tech. 676 00:44:37,280 --> 00:44:39,280 Expect to be discredited. 677 00:44:39,760 --> 00:44:42,880 Expect your research to be dismissed. 678 00:44:50,560 --> 00:44:54,400 If you're thinking about who's funding research in AI, 679 00:44:54,400 --> 00:44:56,280 there are these large tech companies, 680 00:44:56,280 --> 00:44:59,840 and so if you do work that challenges 681 00:44:59,840 --> 00:45:03,360 them or makes them look bad, you might 682 00:45:03,360 --> 00:45:05,720 not have opportunities in the future. 683 00:45:13,480 --> 00:45:16,320 So for me, it was disconcerting, but 684 00:45:16,320 --> 00:45:18,600 it also showed me the power that we have 685 00:45:18,600 --> 00:45:22,920 if you're putting one of the world's largest companies on edge. 686 00:45:29,320 --> 00:45:33,040 Amazon's response shows exactly why we can 687 00:45:33,040 --> 00:45:36,200 no longer live in a country where there are 688 00:45:36,200 --> 00:45:39,280 no federal regulations around facial analysis 689 00:45:39,280 --> 00:45:42,040 technology, facial recognition technology. 690 00:45:57,840 --> 00:46:00,480 - When I was 14 I went to a math camp 691 00:46:00,480 --> 00:46:02,560 and learned how to solve a Rubik's cube 692 00:46:02,560 --> 00:46:05,080 and I was like, that's freaking cool. 693 00:46:05,080 --> 00:46:07,880 And for a nerd, you know, something 694 00:46:07,880 --> 00:46:12,080 that you're good at and that doesn't have any sort of ambiguity, 695 00:46:13,160 --> 00:46:15,600 it was like a really magical thing.
696 00:46:16,640 --> 00:46:18,600 Like I remember being told by my sixth grade 697 00:46:18,600 --> 00:46:20,680 math teacher there's no reason for you 698 00:46:20,680 --> 00:46:22,880 and the other two girls who had gotten into 699 00:46:22,880 --> 00:46:25,040 the honors algebra class in seventh grade, 700 00:46:25,040 --> 00:46:27,160 there's no reason for you guys to take that because you're girls. 701 00:46:27,160 --> 00:46:28,560 You will never need math. 702 00:46:33,360 --> 00:46:35,640 When you are sort of an outsider 703 00:46:35,640 --> 00:46:37,880 you always have the perspective of the underdog. 704 00:46:40,920 --> 00:46:44,760 It was 2006 and they gave me the job offer at the hedge 705 00:46:44,760 --> 00:46:47,080 fund basically 'cause I could solve math puzzles, 706 00:46:48,000 --> 00:46:50,600 which is crazy because actually I didn't know anything about finance, 707 00:46:50,600 --> 00:46:53,400 I didn't know anything about programming or how the markets work. 708 00:46:56,520 --> 00:46:59,200 When I first got there I kind of drank the Kool-Aid. 709 00:46:59,840 --> 00:47:02,320 I at that moment did not realize that the risk 710 00:47:02,320 --> 00:47:05,400 models had been built explicitly to be wrong. 711 00:47:08,000 --> 00:47:13,080 - The way we know about algorithmic impact is by looking at the outcomes. 712 00:47:15,120 --> 00:47:20,160 For example, when Americans are bet against 713 00:47:20,160 --> 00:47:24,440 and selected and optimized for failure. 714 00:47:26,760 --> 00:47:29,240 So it's like looking for a particular profile of 715 00:47:29,240 --> 00:47:32,520 people who can get a subprime mortgage and kind of 716 00:47:32,520 --> 00:47:35,120 betting against their failure and then 717 00:47:35,120 --> 00:47:37,760 foreclosing on them and wiping out their wealth. 718 00:47:38,760 --> 00:47:42,000 That was an algorithmic game that came out of Wall Street. 719 00:47:45,800 --> 00:47:47,760 During the mortgage crisis 720 00:47:47,760 --> 00:47:49,960 you had the largest wipeout of black wealth 721 00:47:49,960 --> 00:47:52,920 in the history of the United States. 722 00:47:53,800 --> 00:47:54,800 Just like that. 723 00:47:57,240 --> 00:47:59,520 This is what I mean by algorithmic oppression. 724 00:48:00,080 --> 00:48:04,560 The tyranny of these types of practices of 725 00:48:04,560 --> 00:48:07,680 discrimination has just become opaque. 726 00:48:11,440 --> 00:48:14,040 - There was a world of suffering because of 727 00:48:14,040 --> 00:48:16,200 the way the financial system had failed. 728 00:48:18,120 --> 00:48:19,720 After a couple of years I was like, no, we're just 729 00:48:19,720 --> 00:48:22,120 trying to make a lot of money for ourselves. 730 00:48:23,280 --> 00:48:24,480 And I'm a part of that. 731 00:48:26,040 --> 00:48:27,360 And I eventually left. 732 00:48:28,440 --> 00:48:30,240 This is 15*3. 733 00:48:30,240 --> 00:48:31,800 This is 15 times... 734 00:48:35,160 --> 00:48:36,160 7. 735 00:48:36,160 --> 00:48:39,360 - OK. - OK so remember seven and three. 736 00:48:40,880 --> 00:48:44,680 It's about powerful people scoring powerless people. 737 00:48:54,240 --> 00:48:56,200 - I am an invisible gatekeeper. 738 00:48:57,080 --> 00:48:59,800 I use data to make automated decisions 739 00:48:59,800 --> 00:49:04,960 about who gets hired, who gets fired and how much you pay for insurance. 740 00:49:05,800 --> 00:49:09,640 Sometimes you don't even know when I've made these automated decisions.
741 00:49:10,200 --> 00:49:11,680 I have many names. 742 00:49:11,680 --> 00:49:17,080 I am called mathematical model, evaluation, assessment tool. 743 00:49:18,120 --> 00:49:21,320 But by many names I am an algorithm. 744 00:49:21,320 --> 00:49:23,000 I am a black box. 745 00:49:32,520 --> 00:49:35,360 - The value added model for teachers was actually being used in more 746 00:49:35,360 --> 00:49:38,560 than half the states, in particular it was being used in New York City. 747 00:49:38,560 --> 00:49:42,600 I got wind of it because my dear friend was a principal in New York City. 748 00:49:42,600 --> 00:49:44,520 Her teachers were being evaluated through it. 749 00:49:44,520 --> 00:49:47,040 - She's actually my best friend from college. 750 00:49:47,040 --> 00:49:48,400 It's Cathy. - Hey guys. 751 00:49:48,400 --> 00:49:52,360 - We've known each other since we were 18 so like two years older than you guys. 752 00:49:52,360 --> 00:49:53,360 - Amazing. 753 00:49:56,160 --> 00:49:58,400 And their scores through this algorithm that 754 00:49:58,400 --> 00:50:03,680 they didn't understand would be a very large part of their tenure review. 755 00:50:03,680 --> 00:50:05,120 - Hi guys, where are you supposed to be? 756 00:50:05,120 --> 00:50:06,120 - Class. 757 00:50:06,120 --> 00:50:07,760 - I've got that. Which class? 758 00:50:07,760 --> 00:50:10,080 - It'd be one thing if that teacher algorithm was good. 759 00:50:10,080 --> 00:50:13,720 It was like better than random but just a little bit. 760 00:50:13,720 --> 00:50:14,720 Not good enough. 761 00:50:14,720 --> 00:50:17,480 Not good enough when you're talking about teachers getting 762 00:50:17,480 --> 00:50:19,200 or not getting tenure. 763 00:50:19,200 --> 00:50:21,600 And then I found out that a similar kind of scoring 764 00:50:21,600 --> 00:50:24,960 system was being used in Houston to fire teachers. 765 00:50:27,560 --> 00:50:29,440 - It's called a value-added model. 766 00:50:29,440 --> 00:50:32,720 It calculates what value the teacher added 767 00:50:32,720 --> 00:50:36,360 and parts of it are kept secret by the company that created it. 768 00:50:45,240 --> 00:50:49,880 - I did win Teacher of the Year and ten years later I received 769 00:50:49,880 --> 00:50:52,240 a Teacher of the Year award a second time. 770 00:50:52,240 --> 00:50:54,240 I received Teacher of the Month. 771 00:50:54,240 --> 00:50:57,720 I also was recognized for volunteering. 772 00:50:57,720 --> 00:51:02,320 I also received another recognition for going over and beyond. 773 00:51:02,320 --> 00:51:06,920 I have a file of every evaluation and every 774 00:51:06,920 --> 00:51:09,320 different administrator, different appraiser 775 00:51:09,320 --> 00:51:12,560 excellent, excellent, exceeds expectations. 776 00:51:12,560 --> 00:51:17,040 The computer essentially canceled the observable 777 00:51:17,040 --> 00:51:18,520 evidence of administrators. 778 00:51:19,040 --> 00:51:21,600 This algorithm came back and 779 00:51:21,600 --> 00:51:24,640 classified me as a bad teacher. 780 00:51:29,880 --> 00:51:31,920 Teachers have been terminated. 781 00:51:32,880 --> 00:51:37,000 Some had been targeted simply because of the algorithm. 782 00:51:37,000 --> 00:51:39,720 That was such a low point for me 783 00:51:40,480 --> 00:51:43,320 that for a moment I questioned myself. 784 00:51:46,280 --> 00:51:48,520 That's when the epiphany came. 785 00:51:48,520 --> 00:51:50,520 This algorithm is a lie. 786 00:51:51,640 --> 00:51:53,480 How can this algorithm define me?
787 00:51:53,960 --> 00:51:55,160 How dare it. 788 00:51:55,160 --> 00:51:57,960 And that's when I began to investigate and move forward. 789 00:52:00,080 --> 00:52:04,160 - We are announcing that late yesterday we filed suit 790 00:52:04,160 --> 00:52:08,760 in federal court against the current HISD evaluation. 791 00:52:10,000 --> 00:52:13,720 - The Houston Federation of Teachers began to explore the lawsuit. 792 00:52:13,720 --> 00:52:15,440 If this can happen to Mr Santos 793 00:52:16,120 --> 00:52:19,320 in Jackson Middle School how many others have been defamed? 794 00:52:20,040 --> 00:52:24,120 And so we sued based upon the 14th Amendment. 795 00:52:24,120 --> 00:52:25,480 It's not equitable. 796 00:52:25,480 --> 00:52:28,320 How can you arrive at a conclusion 797 00:52:28,320 --> 00:52:30,640 but not tell me how? 798 00:52:33,280 --> 00:52:34,520 The battle isn't over. 799 00:52:35,240 --> 00:52:36,240 There are still... 800 00:52:36,800 --> 00:52:38,640 communities, there are still school districts 801 00:52:38,640 --> 00:52:41,520 who still utilize the value added model. 802 00:52:41,520 --> 00:52:45,040 But there is hope because I'm still here. 803 00:52:46,120 --> 00:52:48,040 So there's hope. 804 00:52:49,720 --> 00:52:53,200 (SPEAKS SPANISH) 805 00:52:53,200 --> 00:52:54,240 Or in English. 806 00:52:55,840 --> 00:52:57,600 Democracy. Who has the power? 807 00:52:59,280 --> 00:53:01,240 - Us? - Yeah, the people. 808 00:53:02,080 --> 00:53:04,520 - The judge said that their due process rights have 809 00:53:04,520 --> 00:53:06,800 been violated because they were fired under 810 00:53:06,800 --> 00:53:09,320 some explanation that no one could understand. 811 00:53:09,320 --> 00:53:11,160 But they sort of deserve to understand 812 00:53:11,880 --> 00:53:13,200 why they had been fired. 813 00:53:13,200 --> 00:53:17,200 But I don't understand why that legal decision doesn't spread to 814 00:53:17,200 --> 00:53:18,680 all kinds of algorithms. 815 00:53:18,680 --> 00:53:22,040 Like why aren't we using that same argument like constitutional 816 00:53:22,040 --> 00:53:25,480 right to due process to push back against 817 00:53:25,480 --> 00:53:27,000 all sorts of algorithms that are 818 00:53:27,000 --> 00:53:29,360 invisible to us, that are black boxes, 819 00:53:29,360 --> 00:53:31,920 that are unexplained but that matter? 820 00:53:32,400 --> 00:53:35,840 That keep us from like really important opportunities in our lives. 821 00:53:37,520 --> 00:53:40,640 - Sometimes I misclassify and cannot be questioned. 822 00:53:41,760 --> 00:53:43,600 These mistakes are not my fault. 823 00:53:45,440 --> 00:53:47,640 I was optimized for efficiency. 824 00:53:49,800 --> 00:53:52,760 There is no algorithm to define what is just. 825 00:53:56,600 --> 00:53:59,720 - A state commission has approved a new risk assessment 826 00:53:59,720 --> 00:54:03,040 tool for Pennsylvania judges to use at sentencing. 827 00:54:03,040 --> 00:54:05,480 The instrument uses an algorithm to calculate 828 00:54:05,480 --> 00:54:08,680 someone's risk of reoffending based on their age, gender, 829 00:54:08,680 --> 00:54:11,400 prior convictions and other pieces of criminal history. 830 00:54:12,720 --> 00:54:15,000 - The algorithm that kept me up at night was 831 00:54:15,840 --> 00:54:17,440 what's called recidivism risk algorithms. 832 00:54:17,440 --> 00:54:19,960 These are algorithms that judges are given when 833 00:54:19,960 --> 00:54:22,640 they're sentencing defendants to prison.
834 00:54:22,640 --> 00:54:24,760 But then there's the question of fairness which is 835 00:54:24,760 --> 00:54:27,600 how are these actually built these...these scoring systems? 836 00:54:27,600 --> 00:54:30,000 Like how were the scores created? 837 00:54:30,000 --> 00:54:32,520 And the questions are proxies for race and class. 838 00:54:35,280 --> 00:54:38,760 - ProPublica published an investigation into the risk assessment 839 00:54:38,760 --> 00:54:42,000 software finding that the algorithms were racially biased. 840 00:54:42,840 --> 00:54:46,360 The study found that black people were mislabeled with high scores 841 00:54:46,360 --> 00:54:50,160 and that white people were more likely to be mislabeled with low scores. 842 00:55:09,080 --> 00:55:12,720 - I roll into my operations office and she tells me I have to report 843 00:55:13,200 --> 00:55:15,240 once a week. 844 00:55:15,240 --> 00:55:18,280 I'm like hold on did you see everything that I just accomplished? 845 00:55:18,280 --> 00:55:20,400 Like I've been home for years. 846 00:55:20,400 --> 00:55:22,040 I've got gainful employment. 847 00:55:22,040 --> 00:55:25,040 I just got two citations one from the City Council of Philadelphia 848 00:55:25,040 --> 00:55:27,000 one from the Mayor of Philadelphia. 849 00:55:27,000 --> 00:55:30,840 Are you seriously gonna like put me on reporting every week 850 00:55:30,840 --> 00:55:31,840 for what? 851 00:55:31,840 --> 00:55:34,760 I don't deserve to be on high risk probation. 852 00:55:34,760 --> 00:55:36,720 - I was at a meeting with the probation department. 853 00:55:36,720 --> 00:55:40,800 They were just like mentioning that they had this algorithm 854 00:55:41,480 --> 00:55:44,880 that labeled people, high, medium or low risk. 855 00:55:44,880 --> 00:55:49,120 And so I knew that the algorithm decided what risk level you were. 856 00:55:49,120 --> 00:55:51,880 - They're educating me enough to go back to my PO 857 00:55:51,880 --> 00:55:53,920 and be like you mean to tell me you can't 858 00:55:53,920 --> 00:55:56,400 take into account anything positive 859 00:55:56,400 --> 00:55:58,800 that I have done to counteract the results of 860 00:55:58,800 --> 00:56:00,680 what this algorithm is saying? 861 00:56:01,200 --> 00:56:05,840 And she was like no, there's no way, this computer can overrule 862 00:56:05,840 --> 00:56:09,400 the discernment of a judge and an appeal together. 863 00:56:09,400 --> 00:56:10,920 - And by labeling you high risk 864 00:56:12,120 --> 00:56:15,160 and requiring you to report in-person you could've lost 865 00:56:15,160 --> 00:56:17,920 your job and then that could have made you high risk. 866 00:56:17,920 --> 00:56:19,520 - That's what hurt the most. 867 00:56:19,520 --> 00:56:22,920 Knowing that everything that I've built up to the moment and I'm still 868 00:56:22,920 --> 00:56:27,440 looked at like a risk I feel like everything I'm doing is for nothing. 869 00:56:37,480 --> 00:56:40,480 - What does it mean if there is no one to advocate for 870 00:56:40,480 --> 00:56:44,400 those who aren't aware of what the technology is doing? 871 00:56:46,160 --> 00:56:50,440 I started to realize this isn't about my art project 872 00:56:50,440 --> 00:56:53,240 maybe not detecting my face. 873 00:56:53,240 --> 00:56:57,720 This is about systems that are governing our lives in material ways. 874 00:57:05,480 --> 00:57:08,400 So hence I started the Algorithmic Justice League.
875 00:57:09,200 --> 00:57:12,440 I wanted to create a space and a place where people 876 00:57:12,440 --> 00:57:16,440 could learn about the social implications of AI. 877 00:57:18,480 --> 00:57:19,880 Everybody has a stake. 878 00:57:19,880 --> 00:57:21,520 Everybody is impacted. 879 00:57:26,840 --> 00:57:30,960 The Algorithmic Justice League is a movement, it's a concept, 880 00:57:30,960 --> 00:57:34,920 it's a group of people who care about making a future 881 00:57:34,920 --> 00:57:38,520 where social technologies work well for all of us. 882 00:57:43,680 --> 00:57:47,480 It's going to take a team effort, people coming together 883 00:57:47,480 --> 00:57:52,280 striving for justice, striving for fairness and equality 884 00:57:52,280 --> 00:57:54,320 in this age of automation. 885 00:57:56,800 --> 00:58:00,560 - The next mountain to climb should be HR. 886 00:58:01,680 --> 00:58:06,440 - Oh yeah. There's a problem with resumé algorithms 887 00:58:07,000 --> 00:58:09,840 or all of those matchmaking platforms 888 00:58:09,840 --> 00:58:11,040 are like oh you're looking for a job. 889 00:58:11,040 --> 00:58:12,640 Oh you're looking to hire someone. 890 00:58:12,640 --> 00:58:14,760 We'll put these two people together. 891 00:58:14,760 --> 00:58:16,800 How did those analytics work? 892 00:58:16,800 --> 00:58:19,680 - When people talk about the future of work they talk about 893 00:58:19,680 --> 00:58:22,680 automation without talking about the gatekeeping. 894 00:58:22,680 --> 00:58:26,040 Like who gets the jobs that are still there? - Exactly. 895 00:58:26,040 --> 00:58:28,360 - Right and we're not having that conversation as much. 896 00:58:28,360 --> 00:58:30,040 - Exactly what I'm trying to say. 897 00:58:30,040 --> 00:58:34,280 I would love to see three congressional hearings about this next year. 898 00:58:34,280 --> 00:58:36,520 - Yes. - To more power. 899 00:58:36,520 --> 00:58:38,120 - To more power. - To more power. 900 00:58:38,120 --> 00:58:40,600 - And bringing ethics on board. - Yes. 901 00:58:40,600 --> 00:58:41,720 -Cheers. - Cheers. 902 00:59:06,720 --> 00:59:11,280 - This morning's plenary address will be done by Joy Buolamwini. 903 00:59:11,280 --> 00:59:14,360 She will be speaking on the dangers of supremely white data 904 00:59:14,360 --> 00:59:15,680 and the coded gaze. 905 00:59:15,680 --> 00:59:16,880 Please welcome Joy. 906 00:59:17,520 --> 00:59:19,120 (APPLAUSE) 907 00:59:24,560 --> 00:59:26,840 - AI is not flawless. 908 00:59:26,840 --> 00:59:31,800 How accurate are systems from IBM, Microsoft and Face++? 909 00:59:31,800 --> 00:59:34,480 There is flawless performance for one group. 910 00:59:35,480 --> 00:59:37,520 The pale males come out on top. 911 00:59:37,520 --> 00:59:39,320 There is no problem there. 912 00:59:40,520 --> 00:59:43,000 After I did this analysis I decided to share it 913 00:59:43,000 --> 00:59:45,240 with the companies to see what they thought. 914 00:59:45,840 --> 00:59:48,640 IBM invited me to their headquarters. 915 00:59:48,640 --> 00:59:51,320 They replicated the results internally 916 00:59:51,320 --> 00:59:54,320 and then they actually made an improvement. 917 00:59:54,800 --> 00:59:56,800 And so the day that I presented 918 00:59:56,800 --> 00:59:59,680 the research results officially you can see 919 00:59:59,680 --> 01:00:03,920 that in this case now 100 percent performance 920 01:00:03,920 --> 01:00:06,400 when it comes to lighter females 921 01:00:06,400 --> 01:00:09,160 and for darker females improvement.
922 01:00:10,160 --> 01:00:14,000 Oftentimes people say well isn't the reason you weren't detected by 923 01:00:14,000 --> 01:00:16,600 these systems 'cause you're highly melanated? 924 01:00:16,600 --> 01:00:17,600 And yes I am. 925 01:00:18,200 --> 01:00:19,680 Highly melanated. 926 01:00:19,680 --> 01:00:24,480 But...but the laws of physics did not change. 927 01:00:24,480 --> 01:00:26,720 What did change was making it a priority 928 01:00:26,720 --> 01:00:30,600 and acknowledging what our differences are so you could 929 01:00:30,600 --> 01:00:33,200 make a system that was more inclusive. 930 01:00:38,520 --> 01:00:41,160 - What is the purpose of identification 931 01:00:41,160 --> 01:00:44,840 and so on and that is about movement control. 932 01:00:45,440 --> 01:00:49,440 People couldn't be in certain areas after dark for instance 933 01:00:49,440 --> 01:00:52,440 and you could always be stopped by a policeman arbitrarily. 934 01:00:52,960 --> 01:00:57,440 We would on your appearance say I want your passport. 935 01:00:59,600 --> 01:01:02,560 - So instead of having what you see in the ID books 936 01:01:02,560 --> 01:01:05,000 now you have computers that are going to look at 937 01:01:05,000 --> 01:01:08,600 an image of a face and try to determine what your gender is. 938 01:01:08,600 --> 01:01:12,400 Some of them try to determine what your ethnicity is. 939 01:01:12,400 --> 01:01:14,480 And then the work that I've done even for 940 01:01:14,480 --> 01:01:17,360 the classification systems that some people agree with 941 01:01:18,200 --> 01:01:19,520 they're not even accurate. 942 01:01:19,520 --> 01:01:23,200 And so that's not just for face classification 943 01:01:23,200 --> 01:01:25,440 it's any data-centric technology. 944 01:01:25,440 --> 01:01:28,080 And so people assume well if the machine says it 945 01:01:28,080 --> 01:01:29,520 it's correct and you know that's not... 946 01:01:29,520 --> 01:01:33,320 Humans are creating themselves in their own image and likeness 947 01:01:33,320 --> 01:01:34,840 quite literally. - Absolutely. 948 01:01:34,840 --> 01:01:38,080 - Racism is becoming mechanized, robotized, yeah. 949 01:01:38,080 --> 01:01:39,080 - Absolutely. 950 01:01:52,280 --> 01:01:53,880 Accuracy draws attention 951 01:01:54,600 --> 01:01:57,040 but we can't forget about abuse. 952 01:01:59,760 --> 01:02:04,480 Even if I'm perfectly classified, that just enables surveillance. 953 01:02:24,480 --> 01:02:27,640 - There's this thing called the Social Credit Score in China. 954 01:02:28,480 --> 01:02:31,200 They're sort of explicitly saying here's the deal 955 01:02:31,200 --> 01:02:34,120 citizens of China we're tracking you. 956 01:02:34,120 --> 01:02:36,080 You have a social credit score. 957 01:02:36,080 --> 01:02:40,280 Whatever you say about the Communist Party will affect your score. 958 01:02:40,280 --> 01:02:44,040 Also, by the way, it will affect your friends and your family's scores. 959 01:02:44,840 --> 01:02:45,840 And it's explicit. 960 01:02:45,840 --> 01:02:48,840 The government is building this and is basically saying you should 961 01:02:48,840 --> 01:02:51,600 know you're being tracked and you should behave accordingly. 962 01:02:51,600 --> 01:02:53,800 It's like algorithmic obedience training. 963 01:05:19,560 --> 01:05:24,560 - We look at China and China's surveillance and scoring system and 964 01:05:25,240 --> 01:05:27,960 a lot of people say well thank goodness we don't live there.
965 01:05:29,200 --> 01:05:31,920 In reality, we're all being scored all the time 966 01:05:32,400 --> 01:05:34,200 including here in the United States. 967 01:05:34,200 --> 01:05:38,760 We are all grappling every day with algorithmic determinism. 968 01:05:38,760 --> 01:05:41,760 Somebody's algorithm somewhere has assigned you a score 969 01:05:41,760 --> 01:05:44,800 and as a result, you are paying more or less 970 01:05:44,800 --> 01:05:47,080 money for toilet paper when you shop online. 971 01:05:47,680 --> 01:05:51,760 You are being shown better or worse mortgages. 972 01:05:51,760 --> 01:05:54,560 You are more or less likely to be profiled as 973 01:05:54,560 --> 01:05:57,800 a criminal in somebody's database somewhere. 974 01:05:57,800 --> 01:05:59,680 We are all being scored. 975 01:05:59,680 --> 01:06:01,920 The key difference between the United States 976 01:06:01,920 --> 01:06:04,200 and China is that China's transparent about it. 977 01:06:22,600 --> 01:06:25,640 - This young black kid in school uniform got stopped as 978 01:06:25,640 --> 01:06:26,880 a result of the match. 979 01:06:30,120 --> 01:06:32,560 Took him down that street just to one side. 980 01:06:33,360 --> 01:06:35,200 Like very thoroughly searched him. 981 01:06:37,120 --> 01:06:39,040 Using plainclothes officers as well. 982 01:06:39,040 --> 01:06:41,680 It's four plainclothes officers who stopped him. 983 01:06:47,360 --> 01:06:48,360 Fingerprinted him. 984 01:06:50,360 --> 01:06:55,160 After about like maybe 10-15 minutes of searching and checking 985 01:06:55,160 --> 01:06:59,360 his details and fingerprinting and they came back and said it's not him. 986 01:07:00,680 --> 01:07:01,680 - Excuse me. 987 01:07:01,680 --> 01:07:03,600 I work for a human rights campaign organization. 988 01:07:03,600 --> 01:07:06,000 They're campaigning against facial recognition technology. 989 01:07:07,360 --> 01:07:09,280 We're campaigning against facial...we're called Big Brother Watch. 990 01:07:09,280 --> 01:07:11,400 We're a human rights campaigning organization. 991 01:07:12,120 --> 01:07:14,040 We're campaigning against this technology here today. 992 01:07:16,240 --> 01:07:18,440 I know you've just been stopped because of that but 993 01:07:18,440 --> 01:07:19,880 they misidentified you. 994 01:07:22,000 --> 01:07:23,200 Here's our details here. 995 01:07:23,200 --> 01:07:25,080 He was a bit shaken. His friends were there. 996 01:07:25,080 --> 01:07:27,080 They couldn't believe what happened to him. 997 01:07:27,080 --> 01:07:28,160 (CROSSTALK) 998 01:07:31,120 --> 01:07:34,120 You've been misidentified by their systems and they've stopped you 999 01:07:34,120 --> 01:07:36,280 and used that as justification to stop and search you. 1000 01:07:36,280 --> 01:07:39,440 But this is an innocent, young 14-year-old child who is being stopped 1001 01:07:39,440 --> 01:07:42,920 by the police as a result of facial recognition misidentification. 1002 01:07:48,040 --> 01:07:50,520 - So Big Brother Watch has joined with 1003 01:07:50,520 --> 01:07:53,480 Baroness Jenny Jones to bring a legal challenge against 1004 01:07:53,480 --> 01:07:55,840 the Metropolitan Police and the Home Office for 1005 01:07:55,840 --> 01:07:58,760 their use of facial recognition surveillance.
1006 01:07:59,840 --> 01:08:02,240 - It was in about 2012 when somebody suggested to me 1007 01:08:02,240 --> 01:08:06,640 that I should find out if I had files kept on me by 1008 01:08:06,640 --> 01:08:09,720 the police or security services and so when I applied 1009 01:08:09,720 --> 01:08:13,360 I found that I was on the watch list for domestic extremists. 1010 01:08:13,360 --> 01:08:17,840 I felt if they can do it to me when I'm a politician who... 1011 01:08:17,840 --> 01:08:21,080 whose job is to hold them to account they could be 1012 01:08:21,080 --> 01:08:24,680 doing it to everybody and it will be great if we can 1013 01:08:24,680 --> 01:08:28,320 roll things back and stop them from using it, yes. 1014 01:08:29,000 --> 01:08:31,120 I think that's going to be quite a challenge. 1015 01:08:31,120 --> 01:08:32,520 I'm happy to try. 1016 01:08:32,520 --> 01:08:35,000 - You know this is the first challenge against 1017 01:08:35,000 --> 01:08:37,320 police use of facial recognition anywhere 1018 01:08:37,320 --> 01:08:40,120 but if we're successful it will have an impact 1019 01:08:40,120 --> 01:08:43,600 for the rest of Europe maybe further afield. 1020 01:08:44,480 --> 01:08:45,880 You've got to get it right. 1021 01:08:53,160 --> 01:08:56,200 - In the UK we have what's called GDPR 1022 01:08:56,680 --> 01:09:00,440 and it sets up a bulwark against the misuse of information. 1023 01:09:00,920 --> 01:09:04,520 It says that the individuals have a right to access, control 1024 01:09:04,520 --> 01:09:07,760 and accountability to determine how their data is used. 1025 01:09:08,680 --> 01:09:11,280 Comparatively, it's the Wild West in America. 1026 01:09:11,880 --> 01:09:14,200 And the concern is that America is 1027 01:09:14,200 --> 01:09:16,960 the home of these technology companies. 1028 01:09:19,320 --> 01:09:24,400 American citizens are profiled and targeted in a way 1029 01:09:24,400 --> 01:09:25,880 that probably no one else in 1030 01:09:25,880 --> 01:09:30,240 the world is because of this free-for-all approach to data protection. 1031 01:09:34,200 --> 01:09:38,320 - The thing I actually fear is not that 1032 01:09:38,320 --> 01:09:40,120 we're going to go down this totalitarian 1033 01:09:40,120 --> 01:09:46,400 1984 model but that we're going to go down this quiet model where 1034 01:09:46,400 --> 01:09:51,640 we are surveilled and socially controlled and individually nudged 1035 01:09:51,640 --> 01:09:55,000 and measured and classified in a way 1036 01:09:55,000 --> 01:09:58,600 that we don't see to move us along 1037 01:09:58,600 --> 01:10:00,800 paths desired by power. 1038 01:10:01,400 --> 01:10:03,360 So it's not what will AI do to us 1039 01:10:03,360 --> 01:10:08,640 on its own, it's what will the powerful do to us with the AI. 1040 01:10:13,040 --> 01:10:16,120 - There are growing questions about the accuracy of Amazon's 1041 01:10:16,120 --> 01:10:18,280 facial recognition software. 1042 01:10:18,280 --> 01:10:20,520 In a letter to Amazon members of Congress raised 1043 01:10:20,520 --> 01:10:23,600 concerns of potential racial bias with the technology. 1044 01:10:23,600 --> 01:10:27,080 - This comes after the ACLU conducted a test and found that 1045 01:10:27,080 --> 01:10:29,720 the facial recognition software incorrectly matched 1046 01:10:29,720 --> 01:10:32,920 28 lawmakers with mug shots of people who've been 1047 01:10:32,920 --> 01:10:36,880 arrested and eleven of those 28 were people of color.
1048 01:10:36,880 --> 01:10:39,000 Some lawmakers have looked into whether or not 1049 01:10:39,000 --> 01:10:41,560 Amazon could sell this technology to law enforcement. 1050 01:10:53,720 --> 01:10:56,640 - Tomorrow, I have the opportunity to testify 1051 01:10:56,640 --> 01:10:58,160 before Congress about 1052 01:10:58,160 --> 01:11:02,000 the use of facial analysis technology by the government. 1053 01:11:06,000 --> 01:11:10,160 In March, I came to do some 1054 01:11:11,240 --> 01:11:14,680 staff briefings not...not in this kind of context. 1055 01:11:16,840 --> 01:11:20,040 Like actually advising on legislation. That's a first. 1056 01:11:23,640 --> 01:11:25,040 We're going to Capitol Hill. 1057 01:11:25,040 --> 01:11:27,400 What are some of the major goals and also some of 1058 01:11:27,400 --> 01:11:29,800 the challenges we need to think about? 1059 01:11:29,800 --> 01:11:30,880 - So first of all... 1060 01:11:32,280 --> 01:11:35,240 the issue with law enforcement technology 1061 01:11:35,240 --> 01:11:38,280 is that the positive is always extraordinarily salient 1062 01:11:38,280 --> 01:11:40,560 because law enforcement publicizes it. -Right. 1063 01:11:40,560 --> 01:11:44,400 - And so you know we're going to go into the meeting and two weeks ago 1064 01:11:45,000 --> 01:11:47,440 the Annapolis shooter was identified 1065 01:11:47,440 --> 01:11:49,880 through the use of facial recognition. - Right. 1066 01:11:49,880 --> 01:11:51,880 - And I'd be surprised if that doesn't come up. 1067 01:11:51,880 --> 01:11:53,080 - Absolutely. 1068 01:11:53,680 --> 01:11:57,200 - Part of what if I were you what I would want to drive home going into 1069 01:11:57,200 --> 01:12:00,760 this meeting is the other side of that equation and make it very real 1070 01:12:00,760 --> 01:12:02,640 as to what the human cost is 1071 01:12:02,640 --> 01:12:04,800 if the problems that you've identified 1072 01:12:04,800 --> 01:12:06,040 aren't remedied. 1073 01:12:17,000 --> 01:12:19,360 - People who have been marginalized will be 1074 01:12:19,360 --> 01:12:22,040 further marginalized if we're not looking at 1075 01:12:22,040 --> 01:12:24,760 ways of making sure the technology 1076 01:12:24,760 --> 01:12:28,280 we're creating doesn't propagate bias. 1077 01:12:30,320 --> 01:12:33,600 That's when I started to realize algorithmic 1078 01:12:33,600 --> 01:12:36,600 justice, making sure there's oversight 1079 01:12:36,600 --> 01:12:39,080 in the age of automation, is one of 1080 01:12:39,080 --> 01:12:42,800 the largest civil rights concerns we have. 1081 01:12:45,240 --> 01:12:48,240 - We need an FDA for algorithms so for algorithms 1082 01:12:48,240 --> 01:12:50,720 that have the potential to ruin people's lives 1083 01:12:50,720 --> 01:12:53,760 or sharply reduce their options with their liberty, 1084 01:12:53,760 --> 01:12:56,280 their livelihood or their finances. 1085 01:12:56,280 --> 01:12:58,200 We need an FDA for algorithms that says 1086 01:12:58,200 --> 01:13:00,320 hey, show me evidence that it's going to 1087 01:13:00,320 --> 01:13:03,320 work not just to make you money 1088 01:13:03,320 --> 01:13:05,120 but it's going to work for society. 1089 01:13:05,760 --> 01:13:07,800 That is going to be fair, that is not going to be racist, 1090 01:13:07,800 --> 01:13:09,480 that's not going to be sexist, that's not going to discriminate 1091 01:13:09,480 --> 01:13:11,640 against people with disability status. 1092 01:13:11,640 --> 01:13:14,880 Show me that it's legal before you put it out.
1093 01:13:14,880 --> 01:13:16,520 That's what we don't have yet. 1094 01:13:17,440 --> 01:13:19,880 Well I'm here because I wanted to hear 1095 01:13:19,880 --> 01:13:23,720 the congressional testimony of my friend Joy Buolamwini 1096 01:13:23,720 --> 01:13:25,880 as well as the ACLU and others. 1097 01:13:25,880 --> 01:13:29,240 One cool thing about seeing Joy speak to Congress is that 1098 01:13:29,240 --> 01:13:32,840 like I met Joy on my book tour at Harvard Bookstore. 1099 01:13:33,720 --> 01:13:36,200 And according to her that was the day that 1100 01:13:36,200 --> 01:13:38,720 she decided to form the Algorithmic Justice League. 1101 01:13:42,560 --> 01:13:45,480 We haven't gotten to the nuanced conversation yet. 1102 01:13:46,120 --> 01:13:47,640 I know it's going to happen 'cause 1103 01:13:47,640 --> 01:13:49,320 I know Joy is going to make it happen. 1104 01:13:52,960 --> 01:13:57,440 At every single level, bad algorithms are begging to be given rules. 1105 01:14:05,920 --> 01:14:07,520 - Hello. - Hey. 1106 01:14:07,520 --> 01:14:08,560 - How are you doing? 1107 01:14:08,560 --> 01:14:10,200 - Wanna sneak in with me? - Yes. 1108 01:14:11,240 --> 01:14:12,880 - 2155. 1109 01:14:13,880 --> 01:14:17,440 (INAUDIBLE CONVERSATION) 1110 01:14:23,400 --> 01:14:28,320 (INAUDIBLE CONVERSATION) 1111 01:14:32,800 --> 01:14:37,120 - Today we are having our first hearing of this Congress 1112 01:14:37,120 --> 01:14:40,240 on the use of facial recognition technology. 1113 01:14:40,240 --> 01:14:43,480 Please stand and raise your right hand and I will now swear you in. 1114 01:14:46,600 --> 01:14:50,040 - I've had to resort to literally wearing a white mask. 1115 01:14:50,040 --> 01:14:53,680 Given such accuracy disparities I wondered how large tech companies 1116 01:14:53,680 --> 01:14:55,120 could have missed these issues. 1117 01:14:55,120 --> 01:14:59,360 The harvesting of face data also requires guidelines and oversight. 1118 01:15:00,040 --> 01:15:02,960 No one should be forced to submit their face data to access 1119 01:15:02,960 --> 01:15:07,400 widely used platforms, economic opportunity or basic services. 1120 01:15:07,400 --> 01:15:10,800 Tenants in Brooklyn are protesting the installation of 1121 01:15:10,800 --> 01:15:14,040 an unnecessary face recognition entry system. 1122 01:15:14,040 --> 01:15:16,080 There is a Big Brother Watch UK report 1123 01:15:16,080 --> 01:15:18,440 that came out that showed more than 1124 01:15:18,440 --> 01:15:23,600 2,400 innocent people had their faces misidentified. 1125 01:15:23,600 --> 01:15:26,400 Our faces may well be the final frontier of 1126 01:15:26,400 --> 01:15:30,040 privacy but regulations make a difference. 1127 01:15:30,040 --> 01:15:33,880 Congress must act now to uphold American freedoms and rights. 1128 01:15:33,880 --> 01:15:36,440 - Miss Buolamwini, I heard your opening statement 1129 01:15:36,440 --> 01:15:41,520 and we saw that these algorithms are effective to different degrees. 1130 01:15:41,520 --> 01:15:44,720 So are they most effective on women? - No. 1131 01:15:44,720 --> 01:15:46,520 - Are they most effective on people of color? 1132 01:15:46,520 --> 01:15:47,520 - Absolutely not. 1133 01:15:47,520 --> 01:15:50,920 - Are they most effective on people of different gender expressions? 1134 01:15:50,920 --> 01:15:53,360 - No, in fact, they exclude them. 1135 01:15:53,360 --> 01:15:56,960 - So what demographic is it mostly effective on? 1136 01:15:56,960 --> 01:15:58,480 - White men.
1137 01:15:58,480 --> 01:16:02,720 - And who are the primary engineers and designers of these algorithms? 1138 01:16:02,720 --> 01:16:04,760 - Definitely, white men. 1139 01:16:04,760 --> 01:16:09,720 - So we have a technology that was created and designed by one 1140 01:16:09,720 --> 01:16:13,200 demographic that is only mostly effective on that one demographic 1141 01:16:13,200 --> 01:16:15,600 and they're trying to sell it and impose it 1142 01:16:15,600 --> 01:16:18,880 on the entirety of the country? 1143 01:16:21,080 --> 01:16:23,400 - When it comes to face recognition the FBI has not 1144 01:16:23,400 --> 01:16:26,280 fully tested the accuracy of the systems it uses 1145 01:16:26,280 --> 01:16:28,560 yet the agency is now reportedly piloting 1146 01:16:28,560 --> 01:16:30,600 Amazon's face recognition product. 1147 01:16:30,600 --> 01:16:34,280 - How does the FBI get the initial database in the first place? 1148 01:16:34,280 --> 01:16:36,000 - So one of the things they do is they use 1149 01:16:36,000 --> 01:16:37,800 state driver's license databases. 1150 01:16:37,800 --> 01:16:41,920 I think you know up to 18 states have been reportedly used by the FBI. 1151 01:16:41,920 --> 01:16:45,280 It is being used without a warrant and without other protections. 1152 01:16:45,280 --> 01:16:48,280 - Seems to me it's time for a time out. Time out. 1153 01:16:48,280 --> 01:16:50,520 I guess what troubles me too is just the fact 1154 01:16:50,520 --> 01:16:54,080 that no one in an elected position made a decision on 1155 01:16:54,080 --> 01:16:56,200 the fact that...these 18 states I think the chairman said this is 1156 01:16:56,200 --> 01:16:58,200 more than half the population in the country. 1157 01:16:58,840 --> 01:17:00,160 That is scary. 1158 01:17:00,160 --> 01:17:04,720 - China seems to me to be the dystopian path that needs not 1159 01:17:04,720 --> 01:17:07,680 be taken at this point by our society. 1160 01:17:07,680 --> 01:17:11,360 - More than China, Facebook has 2.6 billion people. 1161 01:17:11,360 --> 01:17:12,920 So Facebook has a patent where 1162 01:17:12,920 --> 01:17:16,960 they say because we have all of these face prints we can now give you 1163 01:17:16,960 --> 01:17:19,480 an option as a retailer to identify 1164 01:17:19,480 --> 01:17:22,240 somebody who walks into the store and in 1165 01:17:22,240 --> 01:17:24,880 their patent they say we can also give 1166 01:17:24,880 --> 01:17:27,680 that face a trustworthiness score. 1167 01:17:27,680 --> 01:17:29,400 - Facebook is selling this now? 1168 01:17:29,400 --> 01:17:32,320 - This is a patent that they filed as in 1169 01:17:32,320 --> 01:17:34,640 something that they could potentially 1170 01:17:34,640 --> 01:17:36,600 do with the capabilities they have. 1171 01:17:36,600 --> 01:17:39,760 So as we're talking about state surveillance we absolutely 1172 01:17:39,760 --> 01:17:43,640 have to be thinking about corporate surveillance as well. 1173 01:17:46,240 --> 01:17:48,720 - I'm speechless and normally I'm not speechless. 1174 01:17:48,720 --> 01:17:50,840 - Really? - Yeah. Yeah. 1175 01:17:50,840 --> 01:17:53,600 All of our hard work to know that has gone this far 1176 01:17:53,600 --> 01:17:54,920 it's beyond belief. 1177 01:17:54,920 --> 01:17:58,560 We never imagined that it would go this far. 1178 01:17:58,560 --> 01:18:01,080 I'm really touched. I'm really touched. 1179 01:18:01,080 --> 01:18:02,200 (INAUDIBLE). 1180 01:18:02,200 --> 01:18:04,280 - I want to show it to my mother. 
1181 01:18:13,480 --> 01:18:15,200 - Hey, very nice to meet you. 1182 01:18:15,200 --> 01:18:16,320 - Very nice to meet you. 1183 01:18:16,320 --> 01:18:17,320 You got my card. 1184 01:18:17,320 --> 01:18:19,880 Anything happen you let me know please. 1185 01:18:19,880 --> 01:18:20,960 I will. 1186 01:18:23,320 --> 01:18:25,800 - Constitutional concerns about 1187 01:18:25,800 --> 01:18:29,080 the non-consensual use of facial recognition. 1188 01:18:37,280 --> 01:18:41,000 So what demographic is it mostly affecting? 1189 01:18:41,000 --> 01:18:45,400 And who are the primary engineers and designers of these algorithms? 1190 01:18:49,200 --> 01:18:51,320 - San Francisco's now the first city in the US to 1191 01:18:51,320 --> 01:18:53,520 ban the use of facial recognition technology. 1192 01:18:53,520 --> 01:18:57,600 - Somerville, Massachusetts became the second city in the US 1193 01:18:57,600 --> 01:19:00,320 to ban the use of facial recognition. 1194 01:19:00,320 --> 01:19:04,400 - Oakland becomes the third major city to ban facial recognition by police 1195 01:19:04,400 --> 01:19:07,440 saying that the technology discriminates against minorities. 1196 01:19:08,960 --> 01:19:14,080 - Our last tenants meeting, we had the landlord come in 1197 01:19:14,080 --> 01:19:17,080 and announce that (AUDIO DISTORTS) 1198 01:19:17,080 --> 01:19:21,560 the application for facial recognition software in our apartment complex. 1199 01:19:21,560 --> 01:19:23,200 The tenants were excited to hear that. 1200 01:19:23,200 --> 01:19:26,200 But the thing is that doesn't mean that down the road 1201 01:19:27,040 --> 01:19:29,440 that they can't put it back in. 1202 01:19:29,440 --> 01:19:31,880 We've not only educated ourselves about 1203 01:19:31,880 --> 01:19:36,040 facial recognition but now a new one, machine learning. 1204 01:19:36,040 --> 01:19:39,280 We want the law to cover all of these things. - Right. 1205 01:19:39,280 --> 01:19:43,080 - OK. And if we can ban it in this state, this stops them 1206 01:19:43,080 --> 01:19:45,840 from ever going back and putting it in under a new modification. 1207 01:19:45,840 --> 01:19:46,840 - Got it. 1208 01:19:46,840 --> 01:19:49,920 - And then we're supposed to get a federal ban. 1209 01:19:49,920 --> 01:19:54,960 - Well, I will say even though the battle is ongoing so many people are 1210 01:19:54,960 --> 01:19:57,480 inspired and the surprise I have for you 1211 01:19:57,480 --> 01:20:01,440 is that I wrote a poem in honor of this. 1212 01:20:02,400 --> 01:20:04,080 - Oh really? - Yes. 1213 01:20:04,600 --> 01:20:05,840 - Let's hear it. 1214 01:20:05,840 --> 01:20:09,960 To the Brooklyn tenants and the freedom fighters around the world 1215 01:20:09,960 --> 01:20:15,120 persisting and prevailing against algorithms of oppression automating 1216 01:20:15,120 --> 01:20:20,920 inequality through weapons of math destruction we stand with you 1217 01:20:20,920 --> 01:20:22,200 in gratitude. 1218 01:20:22,200 --> 01:20:24,080 The victory is ours. 1219 01:20:28,480 --> 01:20:32,520 - (INAUDIBLE). 1220 01:20:37,720 --> 01:20:40,000 - Why get so many eggs (INAUDIBLE)? 1221 01:20:40,000 --> 01:20:41,640 (INAUDIBLE). 1222 01:20:42,560 --> 01:20:45,600 What it means to be human is to be vulnerable. 1223 01:20:46,640 --> 01:20:48,280 Being vulnerable 1224 01:20:48,280 --> 01:20:50,800 there is more of a capacity for empathy, 1225 01:20:50,800 --> 01:20:55,360 there is more of a capacity for compassion.
1226 01:20:55,360 --> 01:20:58,840 If there is a way we can think about that within our technology, 1227 01:20:58,840 --> 01:21:02,880 I think it would reorient the sorts of questions we ask. 1228 01:21:12,520 --> 01:21:16,400 -In 1983, Stanislav Petrov who was in 1229 01:21:16,400 --> 01:21:20,080 the Russian military sees these indications 1230 01:21:20,560 --> 01:21:24,400 that the US has launched nuclear weapons 1231 01:21:24,400 --> 01:21:25,960 at the Soviet Union. 1232 01:21:27,560 --> 01:21:31,400 So if you're going to respond you have like this very short window. 1233 01:21:31,400 --> 01:21:32,920 He just sits on it. 1234 01:21:32,920 --> 01:21:34,240 He doesn't inform anyone. 1235 01:21:34,920 --> 01:21:38,080 Russia, the Soviet Union, his country, his family, everything. 1236 01:21:38,080 --> 01:21:40,440 Everything about him is about to die 1237 01:21:40,440 --> 01:21:44,640 and he's thinking well, at least we don't go kill them all either. 1238 01:21:44,640 --> 01:21:46,240 That's a very human thing. 1239 01:21:48,000 --> 01:21:50,880 Here you have a story in which if you had some sort of automated 1240 01:21:50,880 --> 01:21:54,680 response system it was going to do what it was programmed to do 1241 01:21:54,680 --> 01:21:55,920 which was retaliate. 1242 01:21:58,200 --> 01:22:00,080 Being fully efficient, 1243 01:22:00,920 --> 01:22:03,160 always doing what you're told, 1244 01:22:03,160 --> 01:22:06,680 always doing what you're programmed to do is not always the most human thing. 1245 01:22:06,680 --> 01:22:08,440 Sometimes it's disobeying. 1246 01:22:08,440 --> 01:22:11,600 Sometimes it's saying no, I'm not gonna do this, right? 1247 01:22:11,600 --> 01:22:13,240 And if you automate everything so 1248 01:22:13,240 --> 01:22:15,640 it always does what it's supposed to do 1249 01:22:15,640 --> 01:22:18,640 sometimes that can lead to very inhuman things. 1250 01:22:20,760 --> 01:22:24,600 The struggle between machines and humans over decision making 1251 01:22:24,600 --> 01:22:27,040 in the 2020s continues. 1252 01:22:27,760 --> 01:22:33,240 My power, the power of artificial intelligence, will transform our world. 1253 01:22:34,480 --> 01:22:38,000 The more humans share with me the more I learn. 1254 01:22:39,400 --> 01:22:43,520 Some humans say that intelligence without ethics is not intelligence 1255 01:22:43,520 --> 01:22:44,520 at all. 1256 01:22:45,640 --> 01:22:47,160 I say trust me. 1257 01:22:47,760 --> 01:22:49,080 What could go wrong?