1 00:00:01,300 --> 00:00:04,333 ♪ 2 00:00:05,133 --> 00:00:08,233 - TikTok is an app that's completely different 3 00:00:08,233 --> 00:00:09,900 than any other type of social media 4 00:00:09,900 --> 00:00:12,533 or entertainment platform that we've ever seen before. 5 00:00:12,533 --> 00:00:16,533 The algorithms are reinforcing social disparities. 6 00:00:16,533 --> 00:00:18,566 - I thought I had the freedom of speech. 7 00:00:18,566 --> 00:00:21,200 - They just, like, find a way to make it 8 00:00:21,200 --> 00:00:23,633 so nobody sees any of your content. 9 00:00:23,633 --> 00:00:26,033 This is blatant shadow-banning. 10 00:00:26,033 --> 00:00:28,033 - The power of determining speech 11 00:00:28,033 --> 00:00:29,866 has been far too consolidated. 12 00:00:29,866 --> 00:00:31,366 That's antidemocratic. 13 00:00:31,366 --> 00:00:34,066 - TikTok rarely deletes content. 14 00:00:34,066 --> 00:00:36,033 They don't have to. They can just hide it. 15 00:00:36,033 --> 00:00:37,766 - This is bigger than TikTok. 16 00:00:37,766 --> 00:00:41,200 It's about who in our society gets heard. 17 00:00:41,200 --> 00:00:45,500 - "TikTok Boom," now only on "Independent Lens." 18 00:00:45,500 --> 00:00:48,500 [upbeat music] 19 00:00:48,500 --> 00:00:55,533 ♪ ♪ 20 00:00:58,733 --> 00:01:01,833 [spacey electronic music] 21 00:01:01,833 --> 00:01:05,866 ♪ ♪ 22 00:01:05,866 --> 00:01:08,933 - TikTok was a new app. 23 00:01:08,933 --> 00:01:14,433 I knew millions of people were downloading it by the day. 24 00:01:14,433 --> 00:01:17,766 And I just thought, "I want to post on a platform 25 00:01:17,766 --> 00:01:22,066 where younger generations are." 26 00:01:22,066 --> 00:01:24,500 Once I got into junior year of high school, 27 00:01:24,500 --> 00:01:28,433 I was like, "Okay, I'll give TikTok a try." 28 00:01:28,433 --> 00:01:31,600 I remember when I started seeing views pile up 29 00:01:31,600 --> 00:01:36,666 in the thousands and hundred thousands. 30 00:01:36,666 --> 00:01:38,200 I didn't know I had this much power 31 00:01:38,200 --> 00:01:44,266 just because of me putting my voice on an app. 32 00:01:44,266 --> 00:01:49,400 On TikTok, anything can happen. 33 00:01:49,400 --> 00:01:53,166 ♪ ♪ 34 00:01:53,166 --> 00:01:56,166 [poppy music playing] 35 00:01:56,166 --> 00:01:59,500 ♪ ♪ 36 00:01:59,500 --> 00:02:02,466 [light music] 37 00:02:02,466 --> 00:02:04,400 - ♪ Listen to me now ♪ 38 00:02:04,400 --> 00:02:09,200 ♪ ♪ 39 00:02:09,200 --> 00:02:12,200 [tense electronic music] 40 00:02:12,200 --> 00:02:14,666 ♪ ♪ 41 00:02:14,666 --> 00:02:17,633 - I guess I'm on TikTok now. 42 00:02:17,633 --> 00:02:20,033 - TikTok has been downloaded more than 2 billion times, 43 00:02:20,033 --> 00:02:24,400 more than any app ever. 44 00:02:24,400 --> 00:02:26,800 - I think TikTok right now is probably 45 00:02:26,800 --> 00:02:31,400 on the cutting edge of all social media. 46 00:02:31,400 --> 00:02:34,133 And it is becoming a world unto itself 47 00:02:34,133 --> 00:02:37,033 for a lot of people, especially young people. 48 00:02:37,033 --> 00:02:39,766 ♪ ♪ 49 00:02:39,766 --> 00:02:41,666 - TikTok is the first Chinese app 50 00:02:41,666 --> 00:02:46,466 to threaten the dominance of Silicon Valley. 51 00:02:46,466 --> 00:02:49,566 - It's a cybersecurity story. 52 00:02:49,566 --> 00:02:52,266 It's an algorithm story. 53 00:02:52,266 --> 00:02:54,366 It's a bias story. 54 00:02:54,366 --> 00:02:55,766 It's a geopolitical story. 55 00:02:55,766 --> 00:02:59,233 - We're looking at TikTok. We may be banning TikTok.
56 00:02:59,233 --> 00:03:01,200 - It was bizarre. 57 00:03:01,200 --> 00:03:03,600 Why would suddenly this kind of, like, 58 00:03:03,600 --> 00:03:06,533 fun little kids' app become wrapped up 59 00:03:06,533 --> 00:03:08,800 in this huge geopolitical storm 60 00:03:08,800 --> 00:03:11,266 between the U.S. and China 61 00:03:11,266 --> 00:03:13,933 that was only getting hotter? 62 00:03:13,933 --> 00:03:17,333 - ♪ Renegade, renegade, renegade, renegade ♪ 63 00:03:18,333 --> 00:03:22,300 [loud explosion] 64 00:03:22,300 --> 00:03:24,066 [car door closing] 65 00:03:24,066 --> 00:03:26,866 - Fix your mirrors. Put it in drive. 66 00:03:26,866 --> 00:03:29,966 - Put it in drive? - Yeah. 67 00:03:29,966 --> 00:03:32,766 [spacey electronic music] 68 00:03:32,766 --> 00:03:36,133 - So my family's from Afghanistan. 69 00:03:36,133 --> 00:03:38,300 They were just very grateful, 70 00:03:38,300 --> 00:03:41,533 because they could finally come here. 71 00:03:41,533 --> 00:03:43,666 They were just amazed by the privileges 72 00:03:43,666 --> 00:03:45,366 that were being brought to them. 73 00:03:45,366 --> 00:03:47,200 - So you didn't--you didn't use your signal there. 74 00:03:47,200 --> 00:03:50,733 - I did. Oh, I thought I did. 75 00:03:50,733 --> 00:03:55,466 My parents, their dreams of America 76 00:03:55,466 --> 00:03:58,333 were flourishing before 9/11. 77 00:03:58,333 --> 00:04:01,900 And then once those planes hit the towers, 78 00:04:01,900 --> 00:04:03,433 their dreams were shattered too, 79 00:04:03,433 --> 00:04:08,600 because it was as if they were responsible for that. 80 00:04:08,600 --> 00:04:11,633 Growing up as an Afghan-American, 81 00:04:11,633 --> 00:04:12,933 it was really rough. 82 00:04:12,933 --> 00:04:19,433 ♪ ♪ 83 00:04:19,433 --> 00:04:24,100 My school isn't diverse at all. 84 00:04:24,100 --> 00:04:26,333 I don't feel like I'm one of them. 85 00:04:26,333 --> 00:04:27,933 I don't think my classmates even think 86 00:04:27,933 --> 00:04:29,266 that I'm one of them. 87 00:04:29,266 --> 00:04:30,700 ♪ ♪ 88 00:04:30,700 --> 00:04:34,333 I got called a terrorist. I got called bin Laden. 89 00:04:34,333 --> 00:04:38,366 I've been called a part of the Taliban since I'm Afghan. 90 00:04:38,366 --> 00:04:40,966 ♪ ♪ 91 00:04:40,966 --> 00:04:46,666 And I always felt like an outsider. 92 00:04:46,666 --> 00:04:49,766 ♪ ♪ 93 00:04:49,766 --> 00:04:52,166 My mom made these two. 94 00:04:52,166 --> 00:04:56,200 I wore this dress on my TikTok. 95 00:04:56,200 --> 00:04:58,466 I have gone viral a few times. 96 00:04:58,466 --> 00:05:01,700 It was my first time posting on TikTok. 97 00:05:01,700 --> 00:05:05,033 It was, like, 40,000 views I've gotten on it. 98 00:05:05,033 --> 00:05:08,766 And it was just me dancing with my Afghan clothes on. 99 00:05:08,766 --> 00:05:11,066 The response was, like, amazing, 100 00:05:11,066 --> 00:05:14,166 because through TikTok, I found so many other Afghans 101 00:05:14,166 --> 00:05:17,133 and so many other Afghan-Americans. 102 00:05:17,133 --> 00:05:20,166 I never knew how big the Afghan-American community was 103 00:05:20,166 --> 00:05:22,433 till I joined TikTok, 104 00:05:22,433 --> 00:05:26,366 and I see more people accepting me for who I am. 105 00:05:26,366 --> 00:05:30,300 [electronic beeping] 106 00:05:32,566 --> 00:05:35,766 I started realizing TikTok had power. 
107 00:05:35,766 --> 00:05:37,033 ♪ ♪ 108 00:05:37,033 --> 00:05:39,366 Even if it was for comedy or makeup, 109 00:05:39,366 --> 00:05:43,600 whatever I was posting, people wanted to watch it. 110 00:05:43,600 --> 00:05:45,700 And I was like, "Okay. 111 00:05:45,700 --> 00:05:48,866 So anything's possible on this app." 112 00:05:48,866 --> 00:05:50,266 ♪ ♪ 113 00:05:50,266 --> 00:05:54,666 Anyone can basically go viral on this app. 114 00:05:54,666 --> 00:05:56,100 ♪ ♪ 115 00:05:56,100 --> 00:05:59,100 [dramatic music] 116 00:05:59,100 --> 00:06:03,166 ♪ ♪ 117 00:06:03,166 --> 00:06:04,833 - ♪ Wipe, wipe ♪ 118 00:06:04,833 --> 00:06:06,833 ♪ Wipe it down wipe, wipe ♪ 119 00:06:09,233 --> 00:06:11,300 - [beatboxing] 120 00:06:11,300 --> 00:06:15,100 DJ Spencer in the mix. 121 00:06:15,100 --> 00:06:18,966 [beatboxing] 122 00:06:18,966 --> 00:06:21,000 My friend Scott comes up to me, 123 00:06:21,000 --> 00:06:22,666 and he goes, "Hey, man. 124 00:06:22,666 --> 00:06:25,300 I want to show you something you've never heard before." 125 00:06:25,300 --> 00:06:28,133 [beatboxing] 126 00:06:28,133 --> 00:06:31,400 And I was like, "What did you just do right now? 127 00:06:31,400 --> 00:06:34,400 Like, is this a trick? Is this a game? Like..." 128 00:06:34,400 --> 00:06:35,400 And he's like, "No, no, no, dude. 129 00:06:35,400 --> 00:06:36,766 "It's called beatboxing. 130 00:06:36,766 --> 00:06:38,666 You just make music with your mouth." 131 00:06:38,666 --> 00:06:41,800 And I'm like, "You were doing that with your face?" 132 00:06:41,800 --> 00:06:44,066 [beatboxing] 133 00:06:44,066 --> 00:06:48,233 I was 15. I was a sophomore in high school. 134 00:06:48,233 --> 00:06:51,433 And I remember that moment just being magic to me. 135 00:06:51,433 --> 00:06:52,600 [laughs] 136 00:06:52,600 --> 00:06:55,400 Like, it was love at first sound. 137 00:06:55,400 --> 00:06:57,500 [beatboxing] 138 00:06:57,500 --> 00:07:01,433 Hey, this is Spencer Polanco, and this is what I do. 139 00:07:01,433 --> 00:07:05,833 And I decided that night I was gonna be a beatboxer. 140 00:07:05,833 --> 00:07:08,766 But you can also imagine me busting out of my room 141 00:07:08,766 --> 00:07:12,300 just, you know, wild Spencer full of excitement in my eyes. 142 00:07:12,300 --> 00:07:16,266 "Dad, Mom, I know what I want to do for the rest of my life. 143 00:07:16,266 --> 00:07:19,133 I want to be a beatboxer." 144 00:07:19,133 --> 00:07:20,900 And then they looked at me like I was crazy. 145 00:07:20,900 --> 00:07:25,766 [beatboxing] 146 00:07:25,766 --> 00:07:28,033 - Spencer beatbox. 147 00:07:28,033 --> 00:07:29,600 - I grew up in New York City. 148 00:07:29,600 --> 00:07:32,466 My father is from Ecuador. 149 00:07:32,466 --> 00:07:34,333 And he came here and found my mom. 150 00:07:34,333 --> 00:07:38,900 My mom was first-generation Chinese family. 151 00:07:38,900 --> 00:07:42,133 My father, he wanted me to be a tennis player. 152 00:07:42,133 --> 00:07:44,200 And my mom wanted me to be a doctor. 153 00:07:44,200 --> 00:07:46,266 And then when I expressed something 154 00:07:46,266 --> 00:07:49,233 they didn't understand, which was beatboxing, 155 00:07:49,233 --> 00:07:52,300 they were accepting but confused 156 00:07:52,300 --> 00:07:55,300 and then unaccepting and even more confused. 157 00:07:55,300 --> 00:07:57,533 I mean, I was a struggling artist. 158 00:07:57,533 --> 00:08:00,133 I was a typical artist 159 00:08:00,133 --> 00:08:02,800 that was performing in the streets of New York City. 
160 00:08:02,800 --> 00:08:05,533 I was in the subways busking. 161 00:08:05,533 --> 00:08:09,733 Some days you make 20 bucks working all day. 162 00:08:09,733 --> 00:08:11,633 [beatboxing] 163 00:08:11,633 --> 00:08:13,200 When I started TikTok, 164 00:08:13,200 --> 00:08:15,733 I had to look at myself in the mirror and say, 165 00:08:15,733 --> 00:08:17,866 "If this isn't gonna happen right now, 166 00:08:17,866 --> 00:08:19,400 it's not gonna happen." 167 00:08:19,400 --> 00:08:23,566 Okay, this is the first viral video. 168 00:08:23,566 --> 00:08:26,333 [imitates beverage glugging] 169 00:08:26,333 --> 00:08:27,433 Ahh. 170 00:08:27,433 --> 00:08:30,666 [beatboxing] 171 00:08:30,666 --> 00:08:32,433 Look at my hair. 172 00:08:32,433 --> 00:08:36,400 I didn't even, like, fix my hair for this video. 173 00:08:36,400 --> 00:08:39,700 And that was a six-second video. 174 00:08:39,700 --> 00:08:41,900 I woke up the next day 175 00:08:41,900 --> 00:08:45,900 and it had, like, 3 million views. 176 00:08:45,900 --> 00:08:48,066 And I was just genuinely confused. 177 00:08:48,066 --> 00:08:50,433 I was like, "Oh, my God, they like me." 178 00:08:50,433 --> 00:08:54,400 I looked at it, and I'm like, "There's something here." 179 00:08:54,400 --> 00:08:56,366 So I decided to post more. 180 00:08:56,366 --> 00:08:59,433 And I decided to do every single idea I possibly could. 181 00:08:59,433 --> 00:09:02,500 [beatboxing] 182 00:09:02,500 --> 00:09:05,866 ♪ ♪ 183 00:09:05,866 --> 00:09:06,933 Pshh! 184 00:09:06,933 --> 00:09:07,933 [wrapper rips] 185 00:09:07,933 --> 00:09:09,366 [cookie crunches] 186 00:09:09,366 --> 00:09:11,000 Late 2019 was my first brand deal, 187 00:09:11,000 --> 00:09:12,633 and it was for Nike. 188 00:09:12,633 --> 00:09:14,766 When I first told my parents 189 00:09:14,766 --> 00:09:16,800 that I was doing something for Nike, 190 00:09:16,800 --> 00:09:21,366 they were like, "What? Nike? Like, Nike?" 191 00:09:21,366 --> 00:09:23,000 [coughing] 192 00:09:23,000 --> 00:09:24,666 The fact that anyone wants to use beatboxing for anything 193 00:09:24,666 --> 00:09:27,666 is the coolest thing on Earth to me. 194 00:09:27,666 --> 00:09:29,566 Now there's so many independent artists, 195 00:09:29,566 --> 00:09:31,500 especially coming from TikTok, 196 00:09:31,500 --> 00:09:33,466 that have just, like, cult following 197 00:09:33,466 --> 00:09:35,133 that have people that believe in them 198 00:09:35,133 --> 00:09:37,866 being outside of the industry. 199 00:09:37,866 --> 00:09:39,633 [beatboxing] 200 00:09:39,633 --> 00:09:40,866 - ♪ Savage love ♪ 201 00:09:40,866 --> 00:09:42,200 - I'm a beatboxer, 202 00:09:42,200 --> 00:09:45,400 and I'm hanging out at Jason Derulo's house. 203 00:09:45,400 --> 00:09:48,266 Like, how did this happen? 204 00:09:48,266 --> 00:09:51,100 Like, this is dope. 205 00:09:51,100 --> 00:09:54,133 - ♪ But I still want that, your savage love ♪ 206 00:09:54,133 --> 00:09:57,200 - It wasn't until I was on TV 207 00:09:57,200 --> 00:09:59,933 when I presented at the Billboard Music Awards, 208 00:09:59,933 --> 00:10:01,633 that's when my parents, they were like, 209 00:10:01,633 --> 00:10:04,333 "All right, Spencer's on TV now. 210 00:10:04,333 --> 00:10:06,300 I don't know what to do with this information.
211 00:10:06,300 --> 00:10:09,300 [beatboxing] 212 00:10:09,300 --> 00:10:10,666 ♪ ♪ 213 00:10:10,666 --> 00:10:15,000 [ticking noise] 214 00:10:15,000 --> 00:10:17,033 - The creator economy is the fastest-growing type 215 00:10:17,033 --> 00:10:19,233 of small business with more than 50 million people 216 00:10:19,233 --> 00:10:20,800 around the world who consider themselves 217 00:10:20,800 --> 00:10:22,033 to be content creators. 218 00:10:22,033 --> 00:10:24,700 - TikTok is absolutely massive. 219 00:10:24,700 --> 00:10:28,166 Last year they reported they had over 2 billion users. 220 00:10:28,166 --> 00:10:31,166 TikTok is an app that's completely different 221 00:10:31,166 --> 00:10:32,833 than any other type of social media 222 00:10:32,833 --> 00:10:35,300 or entertainment platform that we've ever seen before. 223 00:10:35,300 --> 00:10:38,266 TikTok was the first platform to really popularize 224 00:10:38,266 --> 00:10:40,533 high-quality vertical video content, 225 00:10:40,533 --> 00:10:44,033 which just makes it so easy to consume. 226 00:10:44,033 --> 00:10:46,933 The For You page is completely individually tailored 227 00:10:46,933 --> 00:10:50,333 to you based on the data that they gather about you. 228 00:10:50,333 --> 00:10:53,533 What TikTok does so well is discovery. 229 00:10:53,533 --> 00:10:55,933 It allows you to drill down on the whole internet 230 00:10:55,933 --> 00:10:59,200 and find these really specific groups of people 231 00:10:59,200 --> 00:11:02,033 that resonate with you. 232 00:11:02,033 --> 00:11:05,466 It's just any kind of niche subculture or community, 233 00:11:05,466 --> 00:11:08,433 you can find creators that are in that niche. 234 00:11:08,433 --> 00:11:10,166 People just get discovered much faster. 235 00:11:10,166 --> 00:11:11,666 They blow up much faster. 236 00:11:11,666 --> 00:11:14,266 And it's just everything goes ten times more viral 237 00:11:14,266 --> 00:11:16,833 than it would on any other social app. 238 00:11:16,833 --> 00:11:19,433 It's remaking the food landscape, 239 00:11:19,433 --> 00:11:22,000 the fashion landscape. 240 00:11:22,000 --> 00:11:24,833 People are learning on TikTok. 241 00:11:24,833 --> 00:11:26,466 TikTok defines top 40. 242 00:11:26,466 --> 00:11:29,566 If you go to the trending list on Spotify 243 00:11:29,566 --> 00:11:31,933 of the most viral hits, it's all TikTok songs. 244 00:11:31,933 --> 00:11:34,400 We saw Fleetwood Mac "Dreams," you know, 245 00:11:34,400 --> 00:11:36,900 resurge because of a viral video. 246 00:11:36,900 --> 00:11:41,866 - ♪ It's only right that you should play the way ♪ 247 00:11:41,866 --> 00:11:43,633 - It's hard to find an industry that TikTok 248 00:11:43,633 --> 00:11:46,166 hasn't infiltrated or disrupted. 249 00:11:46,166 --> 00:11:48,300 [upbeat music] 250 00:11:48,300 --> 00:11:50,900 All the biggest brands, they want to do TikTok campaigns 251 00:11:50,900 --> 00:11:53,133 because it has the hype right now. 252 00:11:53,133 --> 00:11:57,866 TikTok just really resonated with this Gen Z audience, 253 00:11:57,866 --> 00:12:00,333 which is where the most valuable users are. 254 00:12:00,333 --> 00:12:03,233 You have brands poised to spend over $15 billion 255 00:12:03,233 --> 00:12:05,900 in the next year on influencer marketing alone. 256 00:12:05,900 --> 00:12:10,100 - Eyeballs bring money, and brands chase the eyeballs, 257 00:12:10,100 --> 00:12:13,600 chase the young users. 
258 00:12:13,600 --> 00:12:15,466 - The whole media ecosystem has migrated 259 00:12:15,466 --> 00:12:17,100 towards this personality-driven 260 00:12:17,100 --> 00:12:18,633 form of entertainment. 261 00:12:18,633 --> 00:12:20,933 So often people think, "Oh, online content creators, 262 00:12:20,933 --> 00:12:22,733 right, that's some teenager that's dancing." 263 00:12:22,733 --> 00:12:26,100 No. Online influence is influence. 264 00:12:26,100 --> 00:12:28,700 And if you can make an impact online, 265 00:12:28,700 --> 00:12:31,300 you have the ability to reshape the world. 266 00:12:31,300 --> 00:12:34,900 ♪ ♪ 267 00:12:34,900 --> 00:12:36,433 The elephant in the room, of course, 268 00:12:36,433 --> 00:12:38,833 is the fact that TikTok is owned by ByteDance, 269 00:12:38,833 --> 00:12:40,133 a Chinese company. 270 00:12:40,133 --> 00:12:42,533 And it's the first time we've seen a huge 271 00:12:42,533 --> 00:12:44,833 Chinese consumer tech company come in 272 00:12:44,833 --> 00:12:47,800 and dominate the American market. 273 00:12:47,800 --> 00:12:52,633 [ticking noise] 274 00:12:52,633 --> 00:12:55,166 [tense music] 275 00:12:55,166 --> 00:12:58,900 - So there was this guy from China named Zhang Yiming. 276 00:12:58,900 --> 00:13:00,500 He worked for a number of startups 277 00:13:00,500 --> 00:13:03,233 before starting up his own company, 278 00:13:03,233 --> 00:13:06,533 a company called ByteDance. 279 00:13:06,533 --> 00:13:09,533 The thing that struck me the most about TikTok 280 00:13:09,533 --> 00:13:11,933 was just how calculated their founders were 281 00:13:11,933 --> 00:13:13,933 from the very beginning about their goal, 282 00:13:13,933 --> 00:13:16,200 which was to become this kind of global force 283 00:13:16,200 --> 00:13:18,700 and really, like, penetrate the, 284 00:13:18,700 --> 00:13:20,800 you know, the zeitgeist of the U.S. 285 00:13:20,800 --> 00:13:23,300 ♪ ♪ 286 00:13:23,300 --> 00:13:24,766 - It's a story that's gonna sound very similar 287 00:13:24,766 --> 00:13:27,466 to many Silicon Valley entrepreneurs. 288 00:13:27,466 --> 00:13:30,200 So Zhang Yiming came from a middle-class family. 289 00:13:30,200 --> 00:13:32,466 And he basically just had this dream 290 00:13:32,466 --> 00:13:35,533 that he was going to make something important of himself. 291 00:13:35,533 --> 00:13:37,300 ♪ ♪ 292 00:13:37,300 --> 00:13:40,233 So when the iPhone came out in the late 2000s, 293 00:13:40,233 --> 00:13:42,833 he was just really blown away by the fact 294 00:13:42,833 --> 00:13:45,733 that you could have a full computing device 295 00:13:45,733 --> 00:13:47,066 small enough to fit in your pocket. 296 00:13:47,066 --> 00:13:48,733 And he was just convinced mobile internet 297 00:13:48,733 --> 00:13:51,800 was going to be the next once-in-a-lifetime opportunity 298 00:13:51,800 --> 00:13:53,700 that he had read about in the history books. 299 00:13:53,700 --> 00:13:57,033 And he was determined to take advantage of this big wave. 300 00:13:57,033 --> 00:14:00,733 ♪ ♪ 301 00:14:11,800 --> 00:14:13,100 - They started in an apartment, 302 00:14:13,100 --> 00:14:15,000 which was very common in Chinese 303 00:14:15,000 --> 00:14:17,566 sort of internet entrepreneurship lore. 304 00:14:27,033 --> 00:14:28,900 - About 2015 or so, 305 00:14:28,900 --> 00:14:31,866 ByteDance was doing experiments with video. 
306 00:14:31,866 --> 00:14:34,533 And Zhang Yiming had this sort of strange idea 307 00:14:34,533 --> 00:14:37,733 about recommendation engines, which serves you content 308 00:14:37,733 --> 00:14:39,866 that it thinks you'll be interested in. 309 00:14:52,733 --> 00:14:54,966 [applause] 310 00:14:54,966 --> 00:14:58,466 - So ByteDance launched Douyin. 311 00:14:58,466 --> 00:15:00,200 ♪ ♪ 312 00:15:00,200 --> 00:15:04,000 Within six months after launch it hit this inflection point. 313 00:15:04,000 --> 00:15:06,166 Somehow it just became viral. 314 00:15:06,166 --> 00:15:07,633 ♪ ♪ 315 00:15:07,633 --> 00:15:10,633 [upbeat music] 316 00:15:10,633 --> 00:15:12,833 ♪ ♪ 317 00:15:25,266 --> 00:15:26,700 - Yo, what's wrong, man? 318 00:15:26,700 --> 00:15:29,033 Dude, it is so hot. 319 00:15:29,033 --> 00:15:31,466 Dude, why don't you just go get some Ben & Jerry's ice cream? 320 00:15:31,466 --> 00:15:36,433 People pay me to put their product in my videos. 321 00:15:36,433 --> 00:15:38,066 It's--it's--it's crazy. 322 00:15:38,066 --> 00:15:39,766 These are from this week. 323 00:15:39,766 --> 00:15:41,700 They're so many. Look. 324 00:15:41,700 --> 00:15:44,000 I get, like, clothes from, like, 325 00:15:44,000 --> 00:15:46,033 all different major brands. 326 00:15:46,033 --> 00:15:48,000 Look at that. 327 00:16:04,533 --> 00:16:07,100 - After the success of Douyin in China, 328 00:16:07,100 --> 00:16:08,833 ByteDance acquired Musical.ly, 329 00:16:08,833 --> 00:16:10,800 a lip sync app already popular 330 00:16:10,800 --> 00:16:13,966 with kids and young teens in the U.S., 331 00:16:13,966 --> 00:16:20,933 and merged them, rebranding the app as TikTok. 332 00:16:20,933 --> 00:16:23,166 - One thing Zhang Yiming did 333 00:16:23,166 --> 00:16:25,166 was to create two different products. 334 00:16:25,166 --> 00:16:27,566 Douyin was for the Chinese market, 335 00:16:27,566 --> 00:16:29,033 and it was a Chinese app. 336 00:16:29,033 --> 00:16:32,366 And TikTok was for the global market. 337 00:16:32,366 --> 00:16:35,233 And what that meant was that he could keep the Chinese app 338 00:16:35,233 --> 00:16:37,766 walled off in China with Chinese rules 339 00:16:37,766 --> 00:16:41,200 and then keep TikTok for the rest of the globe. 340 00:16:41,200 --> 00:16:43,033 TikTok was all of a sudden 341 00:16:43,033 --> 00:16:46,000 this huge success from day one. 342 00:16:46,000 --> 00:16:47,766 - TikTok has done what no other Chinese-made app 343 00:16:47,766 --> 00:16:49,033 has done before. 344 00:16:49,033 --> 00:16:50,666 It's cracked the international market 345 00:16:50,666 --> 00:16:52,466 and become a global sensation. 346 00:16:52,466 --> 00:16:55,766 - TikTok is now available in 154 countries 347 00:16:55,766 --> 00:16:57,566 and 75 languages 348 00:16:57,566 --> 00:17:00,200 rivaling Silicon Valley's biggest apps. 349 00:17:06,966 --> 00:17:08,566 - TikTok and its owner, ByteDance, 350 00:17:08,566 --> 00:17:11,400 were the first Chinese social media company 351 00:17:11,400 --> 00:17:14,166 to really provide a wake-up call to Facebook, 352 00:17:14,166 --> 00:17:15,466 Google, Amazon, 353 00:17:15,466 --> 00:17:18,833 and others that it's not just about Silicon Valley 354 00:17:18,833 --> 00:17:20,633 bringing technology to the world 355 00:17:20,633 --> 00:17:23,500 but really that China is a real force to be reckoned with. 
356 00:17:23,500 --> 00:17:26,133 [spacey electronic music] 357 00:17:26,133 --> 00:17:27,766 - People spend more time on TikTok per day 358 00:17:27,766 --> 00:17:32,333 than Facebook, Snapchat, Instagram, YouTube. 359 00:17:32,333 --> 00:17:35,466 Facebook is absolutely desperate right now 360 00:17:35,466 --> 00:17:38,133 to regain any semblance of relevance, 361 00:17:38,133 --> 00:17:39,966 which they've lost quite a while ago. 362 00:17:39,966 --> 00:17:43,466 - In many areas, we're behind our competitors. 363 00:17:43,466 --> 00:17:46,133 The fastest-growing app is TikTok. 364 00:17:46,133 --> 00:17:49,566 ♪ ♪ 365 00:17:49,566 --> 00:17:50,966 - There was sort of this perception 366 00:17:50,966 --> 00:17:54,133 that there was too much of a cultural difference 367 00:17:54,133 --> 00:17:57,066 between China and the rest of the world 368 00:17:57,066 --> 00:17:58,733 and that Chinese companies didn't know 369 00:17:58,733 --> 00:18:02,466 how to build globally successful social media. 370 00:18:02,466 --> 00:18:05,900 I think TikTok's completely blown that out of the water. 371 00:18:05,900 --> 00:18:07,066 ♪ ♪ 372 00:18:07,066 --> 00:18:08,466 The world we all grew up in 373 00:18:08,466 --> 00:18:12,766 was one where America dominated culturally, 374 00:18:12,766 --> 00:18:15,400 where it dominated technologically. 375 00:18:15,400 --> 00:18:18,300 And the world that we will end up retiring in 376 00:18:18,300 --> 00:18:19,700 will probably be one 377 00:18:19,700 --> 00:18:23,800 where China dominates in most of those areas. 378 00:18:23,800 --> 00:18:26,833 Power is shifting to China rapidly. 379 00:18:26,833 --> 00:18:29,866 And ByteDance and the story of TikTok is part of that. 380 00:18:29,866 --> 00:18:33,600 [birds chirping, horns honking] 381 00:18:33,600 --> 00:18:35,233 [door squeaks open] 382 00:18:35,233 --> 00:18:38,233 - Hi. - Hi. How are you? 383 00:18:38,233 --> 00:18:40,833 - Good. - So what are we gonna do? 384 00:18:40,833 --> 00:18:43,433 - I want to play with adding just, like, 385 00:18:43,433 --> 00:18:45,100 more dimension to my hair. 386 00:18:45,100 --> 00:18:46,966 It's, like, a little-- it's falling a little flat. 387 00:18:46,966 --> 00:18:48,500 - Yeah. 388 00:18:48,500 --> 00:18:51,633 - I think that it's scary to be the first generation 389 00:18:51,633 --> 00:18:53,900 to have our entire lives documented. 390 00:18:55,600 --> 00:18:58,233 Every action, you know, every haircut, 391 00:18:58,233 --> 00:19:02,200 every look change is on people's radars. 392 00:19:02,200 --> 00:19:04,600 You know, I work a job where so much of it is, 393 00:19:04,600 --> 00:19:07,100 like, looking-- being looked at or... 394 00:19:07,100 --> 00:19:09,866 - Yeah. - Looking at myself, honestly. 395 00:19:09,866 --> 00:19:11,233 - What do you do? 396 00:19:11,233 --> 00:19:14,066 - Some influencer and content creation work. 397 00:19:14,066 --> 00:19:19,033 So I went viral for the first time when I was 16 years old 398 00:19:19,033 --> 00:19:21,033 for a confrontation with my senator. 399 00:19:21,033 --> 00:19:24,300 - Republican Senator Jeff Flake held a town hall in Mesa, 400 00:19:24,300 --> 00:19:26,400 Arizona, last night and got an earful 401 00:19:26,400 --> 00:19:27,666 from 16-year-old activist Deja Foxx. 402 00:19:27,666 --> 00:19:29,100 Take a look. 403 00:19:29,100 --> 00:19:31,933 - So I'm a young woman, and you're a middle-aged man. 404 00:19:31,933 --> 00:19:33,766 I'm a person of color. - Ouch. 405 00:19:33,766 --> 00:19:36,200 - And you're white. 
406 00:19:36,200 --> 00:19:38,133 I come from a background of poverty, 407 00:19:38,133 --> 00:19:39,500 and I didn't always have parents 408 00:19:39,500 --> 00:19:40,633 to guide me through life. 409 00:19:40,633 --> 00:19:43,066 You come from privilege, so... 410 00:19:52,900 --> 00:19:56,733 I woke up the next morning, and 18 million people 411 00:19:56,733 --> 00:19:58,266 had seen that video. 412 00:19:58,266 --> 00:20:00,600 And I had a request in my email to go live on CNN. 413 00:20:00,600 --> 00:20:04,133 I can't sit idly by while women like me are countlessly 414 00:20:04,133 --> 00:20:07,366 and constantly being ignored on Capitol Hill. 415 00:20:07,366 --> 00:20:10,300 Overnight, I could be visible to people 416 00:20:10,300 --> 00:20:13,266 that I will never know. 417 00:20:13,266 --> 00:20:18,333 The internet connected me to the entire world. 418 00:20:20,733 --> 00:20:23,766 [light electronic music] 419 00:20:23,766 --> 00:20:25,233 ♪ ♪ 420 00:20:25,233 --> 00:20:30,633 On any platform where you can get extreme reach, 421 00:20:30,633 --> 00:20:32,833 you open yourself up to benefits, right? 422 00:20:32,833 --> 00:20:37,933 Larger followings, more views, more attention. 423 00:20:37,933 --> 00:20:40,833 But you also open yourself up to more hate, 424 00:20:40,833 --> 00:20:44,866 to people who are going to tear you down, 425 00:20:44,866 --> 00:20:48,066 tear you apart, pick at you. 426 00:20:48,066 --> 00:20:52,700 I live in a pretty constant state of anxiety. 427 00:20:52,700 --> 00:20:55,033 I don't know what it's like to live in a world 428 00:20:55,033 --> 00:20:59,033 where I'm not being perceived always, 429 00:20:59,033 --> 00:21:00,700 and it's this tug-of-war between 430 00:21:00,700 --> 00:21:02,266 that's kind of what I want, 431 00:21:02,266 --> 00:21:07,533 and it's kind of the thing I fear the most. 432 00:21:07,533 --> 00:21:10,200 I think there is a really interesting line 433 00:21:10,200 --> 00:21:13,300 between what it means to be empowered 434 00:21:13,300 --> 00:21:18,933 by your sexuality versus being exploited by it online. 435 00:21:18,933 --> 00:21:25,666 ♪ ♪ 436 00:21:25,666 --> 00:21:29,400 You can be a bad bitch in a bikini 437 00:21:29,400 --> 00:21:32,466 and a boss bitch in a blazer. 438 00:21:32,466 --> 00:21:34,700 Do both. 439 00:21:34,700 --> 00:21:39,600 [ticking noise] 440 00:21:39,600 --> 00:21:45,500 The posts where I'm showing more skin do better. 441 00:21:45,500 --> 00:21:48,633 But I also question why that is. 442 00:21:48,633 --> 00:21:50,366 If we think that these platforms really are 443 00:21:50,366 --> 00:21:52,866 just showing us the most popular content 444 00:21:52,866 --> 00:21:54,600 without really interrogating 445 00:21:54,600 --> 00:21:59,166 why we're seeing what we're seeing, that's dangerous. 446 00:21:59,166 --> 00:22:03,100 And it's not transparent to us, though we're the ones driving 447 00:22:03,100 --> 00:22:05,466 all of the activity on the platform. 448 00:22:06,233 --> 00:22:08,566 There's so much mystery to the algorithm. 449 00:22:08,566 --> 00:22:10,933 "The algorithm," like, what does that even mean? 450 00:22:12,066 --> 00:22:13,900 - So when I call your name, 451 00:22:13,900 --> 00:22:17,666 I shall place the sorting hat on your head. 452 00:22:17,666 --> 00:22:19,733 And you will be sorted into your houses. 453 00:22:19,733 --> 00:22:22,200 - I've referred to TikTok as a sorting hat in reference 454 00:22:22,200 --> 00:22:24,566 to the sorting hat from "Harry Potter." 
455 00:22:24,566 --> 00:22:26,766 When the kids show up at Hogwarts, 456 00:22:26,766 --> 00:22:28,666 there's this magical sorting hat 457 00:22:28,666 --> 00:22:31,766 that sorts them into one of the four schools. 458 00:22:31,766 --> 00:22:34,466 TikTok's recommendation algorithm serves 459 00:22:34,466 --> 00:22:36,433 as that type of sorting hat. 460 00:22:36,433 --> 00:22:40,166 It sorts its users into different audiences. 461 00:22:40,166 --> 00:22:43,833 Then it sorts videos into different clusters 462 00:22:43,833 --> 00:22:46,133 that appeal to different audiences. 463 00:22:46,133 --> 00:22:48,200 TikTok was very different in that 464 00:22:48,200 --> 00:22:50,266 even if you didn't follow anybody, 465 00:22:50,266 --> 00:22:52,933 you would over time, just by using the app, 466 00:22:52,933 --> 00:22:54,766 get a very personalized entertainment 467 00:22:54,766 --> 00:22:57,933 experience for yourself. 468 00:22:57,933 --> 00:23:01,233 - Is anyone else, like, a little weirded out about 469 00:23:01,233 --> 00:23:04,933 how specific TikTok's algorithm gets for the For You page? 470 00:23:04,933 --> 00:23:06,400 - The For You page on TikTok 471 00:23:06,400 --> 00:23:09,833 is the default that the app opens into. 472 00:23:09,833 --> 00:23:11,400 On the one hand, it has a bunch of attributes 473 00:23:11,400 --> 00:23:12,600 about the video. 474 00:23:12,600 --> 00:23:14,633 It has the song in it. 475 00:23:14,633 --> 00:23:16,133 It has a dog. 476 00:23:16,133 --> 00:23:18,533 On the other, it has a bunch of attributes about you. 477 00:23:18,533 --> 00:23:22,066 You're this age. You live here. 478 00:23:22,066 --> 00:23:24,400 Those are contextual clues 479 00:23:24,400 --> 00:23:25,866 to feed their algorithm 480 00:23:25,866 --> 00:23:29,466 to determine what your tastes are. 481 00:23:29,466 --> 00:23:32,000 - My TikTok algorithm is just like, "You have ADHD. 482 00:23:32,000 --> 00:23:34,766 You have BPD. You're depressed." 483 00:23:34,766 --> 00:23:37,366 - When you're looking at each video 484 00:23:37,366 --> 00:23:39,300 on the For You page, TikTok, 485 00:23:39,300 --> 00:23:44,666 the app, is looking at how you react to that video. 486 00:23:44,666 --> 00:23:46,766 And the algorithm starts to become smarter 487 00:23:46,766 --> 00:23:49,066 just off of these long sessions 488 00:23:49,066 --> 00:23:51,933 where you're addictively scrolling through videos... 489 00:23:51,933 --> 00:23:53,633 ♪ ♪ 490 00:23:53,633 --> 00:23:59,133 and then adjusts what videos it shows you in the future. 491 00:23:59,133 --> 00:24:04,166 And over time it builds almost a fingerprint of your tastes. 492 00:24:04,166 --> 00:24:06,133 - I'm talking, like, I was just thinking about 493 00:24:06,133 --> 00:24:07,600 making a peanut butter and jelly sandwich. 494 00:24:07,600 --> 00:24:09,766 And then out of nowhere, someone is making 495 00:24:09,766 --> 00:24:11,533 a peanut butter and jelly on my For You page. 496 00:24:11,533 --> 00:24:13,700 But lately, I kid you not, it hasn't been things that 497 00:24:13,700 --> 00:24:17,166 I'll Google or I talk about. It's been thoughts. 498 00:24:17,166 --> 00:24:19,466 - Are any other girls, like, kind of aggravated 499 00:24:19,466 --> 00:24:21,333 that it took more than 20 years to figure out 500 00:24:21,333 --> 00:24:22,666 we were bisexual, but it took 501 00:24:22,666 --> 00:24:25,866 my TikTok algorithm, like, 37 seconds? 
502 00:24:25,866 --> 00:24:28,900 - TikTok is just the latest manifestation of the power 503 00:24:28,900 --> 00:24:31,733 that comes from connecting billions of people 504 00:24:31,733 --> 00:24:34,166 in the world with really powerful 505 00:24:34,166 --> 00:24:37,966 machine-learning recommendation algorithms. 506 00:24:37,966 --> 00:24:40,600 [birds chirping] 507 00:24:40,600 --> 00:24:43,733 - After joining TikTok, I decided, 508 00:24:43,733 --> 00:24:48,800 like, maybe I want to make more political savvy videos. 509 00:24:48,800 --> 00:24:52,666 And I got more views on that. 510 00:24:52,666 --> 00:24:55,066 Hi, if you actually think all lives matter, 511 00:24:55,066 --> 00:24:56,633 I want you to speak up about the kids 512 00:24:56,633 --> 00:24:58,166 in the cages at the border. I want you to speak up 513 00:24:58,166 --> 00:24:59,666 about the kids dying in the Middle East. 514 00:24:59,666 --> 00:25:01,866 I want you to speak up about the child trafficking. 515 00:25:01,866 --> 00:25:04,566 I first heard about the Uighurs through social media 516 00:25:04,566 --> 00:25:06,900 since I do follow Muslim pages 517 00:25:06,900 --> 00:25:08,966 and try to keep up with my community. 518 00:25:08,966 --> 00:25:11,833 - Across the northwestern province of Xinjiang, 519 00:25:11,833 --> 00:25:14,966 an estimated one million Chinese Muslims have vanished 520 00:25:14,966 --> 00:25:18,200 into a vast network of detention centers 521 00:25:18,200 --> 00:25:20,066 that targets Uighur Muslims. 522 00:25:20,066 --> 00:25:22,766 - I saw someone post a picture 523 00:25:22,766 --> 00:25:26,200 of these Uighur prisoners. 524 00:25:26,200 --> 00:25:28,133 And when I did more research, I found out 525 00:25:28,133 --> 00:25:30,266 that this genocide is happening in front of us, 526 00:25:30,266 --> 00:25:31,933 and no one is speaking about it. 527 00:25:33,033 --> 00:25:34,233 - ♪ Right here in these streets ♪ 528 00:25:34,233 --> 00:25:36,266 - ♪ Okay, okay, okay ♪ 529 00:25:36,266 --> 00:25:38,133 ♪ What the [bleep]? ♪ 530 00:25:38,133 --> 00:25:40,400 It says, "News outlets "when innocent Muslims 531 00:25:40,400 --> 00:25:41,633 "are getting murdered every day 532 00:25:41,633 --> 00:25:43,633 in the Middle East and in China." 533 00:25:43,633 --> 00:25:46,733 ♪ ♪ 534 00:25:46,733 --> 00:25:50,033 The next day after I posted that video, 535 00:25:50,033 --> 00:25:53,366 I looked at my feed, and I saw where the post used to be. 536 00:25:53,366 --> 00:25:56,700 It was no longer the image of my face on there. 537 00:25:56,700 --> 00:26:01,800 It was just a black little box. 538 00:26:01,800 --> 00:26:03,333 When I clicked on it, it just would say, 539 00:26:03,333 --> 00:26:06,600 "Video unavailable." 540 00:26:06,600 --> 00:26:09,000 I found out how TikTok's basically using our data, 541 00:26:09,000 --> 00:26:10,266 using our information, and using it 542 00:26:10,266 --> 00:26:11,266 for their own benefit. 543 00:26:11,266 --> 00:26:13,133 TikTok is a Beijing-owned app. 544 00:26:13,133 --> 00:26:15,200 It has censored videos that are against the CCP. 545 00:26:15,200 --> 00:26:16,500 I don't know about you guys, 546 00:26:16,500 --> 00:26:17,666 but I want to know what TikTok's doing 547 00:26:17,666 --> 00:26:18,766 with our information. 548 00:26:18,766 --> 00:26:23,666 [ticking noise] 549 00:26:23,666 --> 00:26:26,533 [spacey music] 550 00:26:26,533 --> 00:26:28,833 - AI is hungry for data. 551 00:26:28,833 --> 00:26:32,633 So the more data you have the more accurate the AI becomes. 
552 00:26:32,633 --> 00:26:36,966 So in the age of AI, data is the new oil, 553 00:26:36,966 --> 00:26:39,266 and China is the new Saudi Arabia. 554 00:26:39,266 --> 00:26:40,933 ♪ ♪ 555 00:26:40,933 --> 00:26:45,933 - Your data is an asset to a lot of companies. 556 00:26:45,933 --> 00:26:48,033 Google and Amazon and Facebook, 557 00:26:48,033 --> 00:26:49,733 they're so big, and they have so much money, 558 00:26:49,733 --> 00:26:51,433 because they have all of your data. 559 00:26:51,433 --> 00:26:52,933 ♪ ♪ 560 00:26:52,933 --> 00:26:54,933 And there's a whole secondary market for data 561 00:26:54,933 --> 00:26:56,633 called data brokers. 562 00:26:56,633 --> 00:26:58,133 And they're gathering all of this data, 563 00:26:58,133 --> 00:26:59,933 and they're selling it to each other. 564 00:26:59,933 --> 00:27:03,533 And it's really no different than the stock market. 565 00:27:03,533 --> 00:27:06,166 If a company can start gathering that data, 566 00:27:06,166 --> 00:27:08,500 thousands of points of data a day, 567 00:27:08,500 --> 00:27:12,366 from the time someone's 5 until the time they're 18, 568 00:27:12,366 --> 00:27:14,433 those companies, when they sell that data, 569 00:27:14,433 --> 00:27:17,500 they have a profile that knows a child way better 570 00:27:17,500 --> 00:27:19,966 than a parent would. 571 00:27:19,966 --> 00:27:23,700 And that is extremely valuable to advertisers. 572 00:27:23,700 --> 00:27:26,600 ♪ ♪ 573 00:27:26,600 --> 00:27:28,633 If social media is determining 574 00:27:28,633 --> 00:27:30,600 and tracking you in different ways 575 00:27:30,600 --> 00:27:32,133 and telling you what your dreams 576 00:27:32,133 --> 00:27:34,733 are going to be based on the ads you're getting, 577 00:27:34,733 --> 00:27:38,533 that impacts the kid's brain, and it impacts their goals. 578 00:27:38,533 --> 00:27:40,966 ♪ ♪ 579 00:27:40,966 --> 00:27:44,233 There's a lot of harms happening with these companies 580 00:27:44,233 --> 00:27:46,233 that are based in Northern California. 581 00:27:46,233 --> 00:27:49,200 What's different about TikTok is, 582 00:27:49,200 --> 00:27:50,933 where is this data going? 583 00:27:50,933 --> 00:27:52,466 ♪ ♪ 584 00:27:52,466 --> 00:27:56,100 - TikTok is reportedly under federal investigation, 585 00:27:56,100 --> 00:27:59,300 the U.S. government reportedly launching a national security 586 00:27:59,300 --> 00:28:02,566 review of the company's data collection and censorship 587 00:28:02,566 --> 00:28:05,966 practices amid concerns that users' personal data 588 00:28:05,966 --> 00:28:08,133 could be accessible to foreign governments. 589 00:28:08,133 --> 00:28:10,066 ♪ ♪ 590 00:28:10,066 --> 00:28:12,833 - There's very little transparency. 591 00:28:12,833 --> 00:28:15,300 Just because a tech company says something-- 592 00:28:15,300 --> 00:28:16,966 and I'm not just talking about TikTok-- 593 00:28:16,966 --> 00:28:20,066 we don't have to take the tech company at its word. 594 00:28:20,066 --> 00:28:21,300 ♪ ♪ 595 00:28:21,300 --> 00:28:23,400 I think Facebook is somewhat scared 596 00:28:23,400 --> 00:28:26,366 of the quick rise of TikTok, because all of that data 597 00:28:26,366 --> 00:28:30,433 that Facebook was getting is now going to TikTok. 598 00:28:30,433 --> 00:28:35,300 ♪ ♪ 599 00:28:35,300 --> 00:28:40,133 [ticking noise] 600 00:28:40,133 --> 00:28:41,566 - China is one of the few markets 601 00:28:41,566 --> 00:28:43,333 where Facebook is unavailable right now 602 00:28:43,333 --> 00:28:45,000 because of government censors. 
603 00:28:45,000 --> 00:28:46,500 - Well, Facebook CEO Mark Zuckerberg 604 00:28:46,500 --> 00:28:49,200 appears to be trying bit by bit to re-enter 605 00:28:49,200 --> 00:28:52,600 the world's largest internet population, China. 606 00:28:52,600 --> 00:28:54,966 - Facebook was probably the most aggressive 607 00:28:54,966 --> 00:28:56,600 social media company 608 00:28:56,600 --> 00:29:00,800 to try to get into China, because social media in China 609 00:29:00,800 --> 00:29:03,666 had been blocked for a number of years. 610 00:29:03,666 --> 00:29:05,466 ♪ ♪ 611 00:29:05,466 --> 00:29:07,100 Mark Zuckerberg was trying to find a way 612 00:29:07,100 --> 00:29:09,066 that they could exist somehow in China. 613 00:29:09,066 --> 00:29:12,100 And so he learned Mandarin. 614 00:29:20,233 --> 00:29:22,533 He went to a number of conferences 615 00:29:22,533 --> 00:29:25,666 where he could put himself in front of Xi Jinping 616 00:29:25,666 --> 00:29:28,266 and speak to Xi Jinping directly. 617 00:29:28,266 --> 00:29:31,633 And he really was aggressive in saying, you know, 618 00:29:31,633 --> 00:29:33,033 "This is a huge market. 619 00:29:33,033 --> 00:29:36,933 "How can we be this global connector for the world 620 00:29:36,933 --> 00:29:40,900 and not have China be part of it?" 621 00:29:40,900 --> 00:29:44,066 But it became clear to him in the last few years 622 00:29:44,066 --> 00:29:46,166 that it's not going to work, 623 00:29:46,166 --> 00:29:48,933 and so he completely changed. 624 00:29:48,933 --> 00:29:53,233 He completely had a 180, and he went on the offensive. 625 00:29:53,233 --> 00:29:56,433 - A decade ago almost all of the major internet platforms 626 00:29:56,433 --> 00:29:57,633 were American. 627 00:29:57,633 --> 00:30:01,966 Today six of the top ten are Chinese. 628 00:30:01,966 --> 00:30:05,433 - He realized that if I can't win over the Chinese market, 629 00:30:05,433 --> 00:30:07,366 then I'm gonna make it harder for them to win 630 00:30:07,366 --> 00:30:10,133 over my market in the U.S. 631 00:30:10,133 --> 00:30:12,033 - Mark Zuckerberg has reportedly called 632 00:30:12,033 --> 00:30:14,766 TikTok a threat to democracy. 633 00:30:14,766 --> 00:30:17,166 What the Facebook CEO failed to mention 634 00:30:17,166 --> 00:30:20,000 is that he tried to purchase TikTok's predecessor, 635 00:30:20,000 --> 00:30:21,500 Musical.ly. 636 00:30:21,500 --> 00:30:24,900 - Zuckerberg's clearly very concerned about TikTok, 637 00:30:24,900 --> 00:30:27,966 because it's the most genuine new competition 638 00:30:27,966 --> 00:30:32,166 he's received for a long time. 639 00:30:32,166 --> 00:30:34,500 - TikTok has captured the attention of the world's 640 00:30:34,500 --> 00:30:37,133 most lucrative market, young people, 641 00:30:37,133 --> 00:30:40,666 and with it the power to reshape the future. 642 00:30:40,666 --> 00:30:43,700 [ticking noise] 643 00:30:45,533 --> 00:30:48,066 [phone beeping] 644 00:30:48,066 --> 00:30:51,066 - [beatboxing] 645 00:30:51,066 --> 00:30:55,000 ♪ ♪ 646 00:30:55,000 --> 00:30:56,966 - That was good. - I liked that one. 647 00:30:56,966 --> 00:30:59,133 Content's easy with you, Merrick. 648 00:30:59,133 --> 00:31:02,466 - My name is Merrick Hanna. I am 16 years old. 649 00:31:02,466 --> 00:31:05,066 ♪ ♪ 650 00:31:05,066 --> 00:31:06,500 When people think of influencers, 651 00:31:06,500 --> 00:31:09,700 I think they think it's very leisurely. 652 00:31:09,700 --> 00:31:10,733 It's not. 653 00:31:10,733 --> 00:31:13,266 Open... 
654 00:31:13,266 --> 00:31:16,366 - [beatboxing] 655 00:31:16,366 --> 00:31:18,100 - [laughs] 656 00:31:18,100 --> 00:31:19,700 It's fast. - Is that how it goes? 657 00:31:19,700 --> 00:31:21,300 - Yeah, it's that fast. - Why is it so fast? 658 00:31:21,300 --> 00:31:23,600 - 'Cause it loops. - Ahh. 659 00:31:23,600 --> 00:31:25,500 - Yeah, so it seems like it's longer when you watch it. 660 00:31:25,500 --> 00:31:27,000 - I don't think it needs to be quite as fast 661 00:31:27,000 --> 00:31:28,333 as you're suggesting. 662 00:31:28,333 --> 00:31:30,233 Try and make it longer. You're rushing so much. 663 00:31:30,233 --> 00:31:32,800 To manage Merrick's career at this point 664 00:31:32,800 --> 00:31:37,200 is definitely a full-time job on top of my full-time job, 665 00:31:37,200 --> 00:31:40,033 reading all the emails for him, reading the contracts, 666 00:31:40,033 --> 00:31:42,966 reading the offers, replying, the back and forth. 667 00:31:42,966 --> 00:31:44,733 - My dad helps a lot with finding ideas, 668 00:31:44,733 --> 00:31:46,533 because that is a big part of it. 669 00:31:46,533 --> 00:31:48,500 He'll find a trend that he thinks I can do. 670 00:31:48,500 --> 00:31:49,700 He'll show it to me, and I'll be like, 671 00:31:49,700 --> 00:31:50,900 "All right, I know what to do." 672 00:31:50,900 --> 00:31:54,133 Hit, hit, hit, hit, then I'll push back. 673 00:31:54,133 --> 00:31:55,466 Even though it may not seem like it, 674 00:31:55,466 --> 00:31:57,133 he does a lot of the work. 675 00:31:57,133 --> 00:31:59,833 - I don't sleep quite as much as I used to. 676 00:32:02,300 --> 00:32:05,266 - [beatboxing] 677 00:32:05,266 --> 00:32:09,300 - Now. 678 00:32:09,300 --> 00:32:10,733 It's like a gold rush. 679 00:32:10,733 --> 00:32:13,000 Brands wisely are now seeing 680 00:32:13,000 --> 00:32:15,466 that you can pinpoint an audience 681 00:32:15,466 --> 00:32:17,766 better probably through social media, 682 00:32:17,766 --> 00:32:19,533 and TikTok especially, 683 00:32:19,533 --> 00:32:22,833 than a lot of traditional means of advertising. 684 00:32:22,833 --> 00:32:24,433 And then there's a lot of people 685 00:32:24,433 --> 00:32:25,933 who are trying to take advantage of the gold rush 686 00:32:25,933 --> 00:32:28,066 who just shouldn't be, who are incompetent. 687 00:32:28,066 --> 00:32:31,333 But a lot of the influencers are young and inexperienced 688 00:32:31,333 --> 00:32:32,633 and don't know better. 689 00:32:32,633 --> 00:32:33,866 - Right. 690 00:32:33,866 --> 00:32:35,600 - I think that having a parent filter 691 00:32:35,600 --> 00:32:39,166 social media messages is critical parent involvement 692 00:32:39,166 --> 00:32:41,766 if you want to keep an eye on your child. 693 00:32:41,766 --> 00:32:43,566 - It's very, very, very important. 694 00:32:43,566 --> 00:32:44,933 - Very, very. 695 00:32:44,933 --> 00:32:47,366 - ♪ Inferno ♪ 696 00:32:47,366 --> 00:32:50,200 ♪ Baby, I'm the reason why ♪ 697 00:32:50,200 --> 00:32:54,700 - A third of TikTok's U.S. users may be 14 or under, 698 00:32:54,700 --> 00:32:58,400 raising safety questions. 699 00:32:58,400 --> 00:33:01,166 - An extremely popular video app--it was called Musical.ly; 700 00:33:01,166 --> 00:33:03,100 it's now called TikTok-- has agreed 701 00:33:03,100 --> 00:33:05,300 to pay millions of dollars in fines for illegally 702 00:33:05,300 --> 00:33:07,533 collecting personal information from children. 
703 00:33:07,533 --> 00:33:10,533 [tense music] 704 00:33:10,533 --> 00:33:13,933 ♪ ♪ 705 00:33:13,933 --> 00:33:15,933 - Hey, Kari. 706 00:33:15,933 --> 00:33:18,833 These companies are preying on children. 707 00:33:18,833 --> 00:33:21,566 TikTok is amassing a profile on them 708 00:33:21,566 --> 00:33:24,966 so that they can be targeted by advertisers. 709 00:33:24,966 --> 00:33:28,766 They can push ideas to that child, and that is dangerous. 710 00:33:28,766 --> 00:33:31,033 ♪ ♪ 711 00:33:31,033 --> 00:33:36,366 One of the unique features of TikTok is that a child 712 00:33:36,366 --> 00:33:39,333 could post a video dancing and having fun, 713 00:33:39,333 --> 00:33:42,066 and there's a feature on it called Duets, 714 00:33:42,066 --> 00:33:45,233 where you have children posting their own videos 715 00:33:45,233 --> 00:33:47,733 and then you just have these older men staring at them, 716 00:33:47,733 --> 00:33:50,600 - ♪ Good morning, beautiful ♪ 717 00:33:50,600 --> 00:33:53,400 ♪ How was your night? ♪ 718 00:33:53,400 --> 00:33:56,166 - Where these older men doing--you know, 719 00:33:56,166 --> 00:33:58,966 just being part of I want to be in a duet with you. 720 00:33:58,966 --> 00:34:02,100 We had one where the girl went to kiss her camera, 721 00:34:02,100 --> 00:34:03,433 and at the same time, 722 00:34:03,433 --> 00:34:05,233 one of these older men kisses his camera. 723 00:34:05,233 --> 00:34:08,100 So it looks like you're having an older man 724 00:34:08,100 --> 00:34:11,033 making out with a young girl. 725 00:34:11,033 --> 00:34:13,166 These people are seeing your children, 726 00:34:13,166 --> 00:34:16,366 and they could contact them through the TikTok app. 727 00:34:16,366 --> 00:34:19,000 There were child predators before social media, 728 00:34:19,000 --> 00:34:22,833 but they didn't have direct access to your child's inbox. 729 00:34:22,833 --> 00:34:24,833 - ♪ Walk up in dat bitch too clean I'm froze ♪ 730 00:34:24,833 --> 00:34:26,500 ♪ They don't fight you where I'm from ♪ 731 00:34:26,500 --> 00:34:28,000 ♪ Like the beat I keep a drum ♪ 732 00:34:28,000 --> 00:34:29,600 ♪ You ain't got dough, lil' boy, you a bum ♪ 733 00:34:29,600 --> 00:34:32,100 - I'm involved in working on litigation against TikTok. 734 00:34:32,100 --> 00:34:34,900 And my son is up in his bedroom, you know, 735 00:34:34,900 --> 00:34:37,866 doing livestreams of TikTok, it turned out. 736 00:34:37,866 --> 00:34:39,600 And so, like, the work I do, 737 00:34:39,600 --> 00:34:41,166 is that ever in the back of your mind like, 738 00:34:41,166 --> 00:34:44,433 "So I'm using all these apps, and my dad has these lawsuits 739 00:34:44,433 --> 00:34:46,166 "against these companies for data 740 00:34:46,166 --> 00:34:47,166 and protecting people"? 741 00:34:47,166 --> 00:34:49,100 Like, do you think about that? 742 00:34:49,100 --> 00:34:51,066 - No. I just think about it, 'cause you come home 743 00:34:51,066 --> 00:34:52,600 and tell me about all that stuff. 744 00:34:52,600 --> 00:34:54,000 And then I'm, like, 745 00:34:54,000 --> 00:34:56,100 technology is, like, to someone my age, is, like, 746 00:34:56,100 --> 00:34:59,200 so essential to everything I do 747 00:34:59,200 --> 00:35:01,900 that it's, like, I kind of have to live with the fact 748 00:35:01,900 --> 00:35:04,833 that there's gonna be people that are profiting off my data, 749 00:35:04,833 --> 00:35:08,300 and I have no real recourse for that. 
750 00:35:08,300 --> 00:35:09,566 What's more concerning 751 00:35:09,566 --> 00:35:12,433 is, like, the accuracy of the algorithm. 752 00:35:12,433 --> 00:35:15,400 Like, I could be talking about a movie. 753 00:35:15,400 --> 00:35:17,300 And then later that day, 754 00:35:17,300 --> 00:35:19,666 that movie, like, shows up on my feed. 755 00:35:19,666 --> 00:35:20,900 - Like, you're just talking 756 00:35:20,900 --> 00:35:22,200 randomly to somebody about a movie? 757 00:35:22,200 --> 00:35:23,433 - That happens, like, 758 00:35:23,433 --> 00:35:25,966 way more often than I'm comfortable with. 759 00:35:25,966 --> 00:35:27,566 - And that doesn't freak you out? 760 00:35:27,566 --> 00:35:29,166 - Oh, it does to some degree, but, like, 761 00:35:29,166 --> 00:35:30,700 I guess I'm used to it at this point. 762 00:35:30,700 --> 00:35:32,466 Like, it happens so much. 763 00:35:32,466 --> 00:35:34,333 ♪ ♪ 764 00:35:34,333 --> 00:35:37,200 - I am on the front lines of fighting privacy 765 00:35:37,200 --> 00:35:38,866 battles for children. 766 00:35:38,866 --> 00:35:42,433 And my kids know that's what I do, 767 00:35:42,433 --> 00:35:47,033 but they're on the app, so it's a fight you fight, 768 00:35:47,033 --> 00:35:50,166 but it's a difficult fight to win with your kids. 769 00:35:50,166 --> 00:35:53,833 You can only do so much, and these companies know that. 770 00:35:53,833 --> 00:35:55,900 If there's no regulation of it, 771 00:35:55,900 --> 00:35:58,766 you as a parent, you don't have any control 772 00:35:58,766 --> 00:36:01,533 over what's being pushed to these kids. 773 00:36:01,533 --> 00:36:05,766 ♪ ♪ 774 00:36:05,766 --> 00:36:09,066 - Gen Z is a unique generation. 775 00:36:09,066 --> 00:36:11,000 To be a digital native is to be someone 776 00:36:11,000 --> 00:36:15,566 who doesn't know a world without the internet. 777 00:36:15,566 --> 00:36:17,433 Teenagers are in a really sensitive point 778 00:36:17,433 --> 00:36:18,900 in their development 779 00:36:18,900 --> 00:36:21,833 both in terms of how their brains are rewiring 780 00:36:21,833 --> 00:36:24,866 and in terms of how they're making sense of themselves 781 00:36:24,866 --> 00:36:28,200 and their place in the world. 782 00:36:28,200 --> 00:36:30,300 What ends up happening is that the algorithms 783 00:36:30,300 --> 00:36:31,766 themselves end up 784 00:36:31,766 --> 00:36:35,900 shaping the development of teenagers on these apps 785 00:36:35,900 --> 00:36:40,033 in ways that we don't understand at all. 786 00:36:40,033 --> 00:36:42,166 ♪ ♪ 787 00:36:42,166 --> 00:36:44,766 With any recommendation algorithm, 788 00:36:44,766 --> 00:36:47,933 you run the risk of individuals 789 00:36:47,933 --> 00:36:51,933 who look similar to each other in terms of their activity 790 00:36:51,933 --> 00:36:54,200 getting pushed closer and closer and closer together 791 00:36:54,200 --> 00:36:57,000 in terms of the content that they're being recommended, 792 00:36:57,000 --> 00:36:59,300 whatever information is gonna confirm 793 00:36:59,300 --> 00:37:01,033 your pre-existing beliefs. 794 00:37:01,033 --> 00:37:02,866 ♪ ♪ 795 00:37:02,866 --> 00:37:06,400 By not allowing people from diverse perspectives 796 00:37:06,400 --> 00:37:09,100 to come into contact with each other, 797 00:37:09,100 --> 00:37:12,566 it lessens their ability for empathy. 798 00:37:12,566 --> 00:37:16,166 The algorithms are reinforcing social disparities. 799 00:37:16,166 --> 00:37:18,600 ♪ ♪ 800 00:37:18,600 --> 00:37:19,833 It's not just TikTok. 
801 00:37:19,833 --> 00:37:24,233 It is the technology that TikTok relies on. 802 00:37:24,233 --> 00:37:28,900 But recommendation algorithms have infiltrated 803 00:37:28,900 --> 00:37:30,933 all aspects of our society. 804 00:37:30,933 --> 00:37:32,333 ♪ ♪ 805 00:37:32,333 --> 00:37:36,333 Humans are relying on recommendation systems 806 00:37:36,333 --> 00:37:40,433 to tell them what decisions to make. 807 00:37:40,433 --> 00:37:44,833 And they are determining our futures moment by moment 808 00:37:44,833 --> 00:37:50,766 in ways that we have very little control. 809 00:37:50,766 --> 00:37:52,500 If we fail to regulate social media 810 00:37:52,500 --> 00:37:55,933 and the impact that it's having on this generation, 811 00:37:55,933 --> 00:38:01,000 we're gonna see a lot of marginalized teenagers 812 00:38:01,000 --> 00:38:04,666 experiencing harms that none of us had to experience 813 00:38:04,666 --> 00:38:06,166 and that none of us are prepared 814 00:38:06,166 --> 00:38:07,700 to help them navigate. 815 00:38:07,700 --> 00:38:14,233 ♪ ♪ 816 00:38:16,900 --> 00:38:19,866 - I've been in the spotlight since I was 16. 817 00:38:19,866 --> 00:38:23,133 And it is exhausting. 818 00:38:23,133 --> 00:38:25,733 There's definitely this tension always 819 00:38:25,733 --> 00:38:27,433 between produce, produce, produce, 820 00:38:27,433 --> 00:38:30,333 stay relevant and produce things 821 00:38:30,333 --> 00:38:32,400 that you're going to be proud of in ten years. 822 00:38:32,400 --> 00:38:36,300 And they don't always go together. They can't. 823 00:38:36,300 --> 00:38:40,833 And I think it leaves me to question, you know, 824 00:38:40,833 --> 00:38:42,400 "Am I doing the right thing? 825 00:38:42,400 --> 00:38:45,766 Did I make the right choices?" 826 00:38:45,766 --> 00:38:48,200 ♪ ♪ 827 00:38:48,200 --> 00:38:50,133 I was raised by a single mom. 828 00:38:50,133 --> 00:38:53,500 And I grew up in a household that 829 00:38:53,500 --> 00:38:57,466 like many couldn't afford the basics. 830 00:38:57,466 --> 00:39:00,600 Probably the hardest time 831 00:39:00,600 --> 00:39:04,466 was when my mom started to decline 832 00:39:04,466 --> 00:39:07,600 into substance abuse. 833 00:39:07,600 --> 00:39:10,366 When I was 15, I walked out of my mom's house, 834 00:39:10,366 --> 00:39:14,366 because I couldn't get what I needed there. 835 00:39:14,366 --> 00:39:16,166 And so for me, that looked like living 836 00:39:16,166 --> 00:39:17,500 at a friend's house 837 00:39:17,500 --> 00:39:22,000 until I graduated and moved to college. 838 00:39:22,000 --> 00:39:23,933 Sometimes I'll just scroll through social media, 839 00:39:23,933 --> 00:39:27,866 because it feels like the only thing I can do, 840 00:39:27,866 --> 00:39:29,766 but it doesn't give me energy. 841 00:39:29,766 --> 00:39:31,266 It takes energy from me. 842 00:39:31,266 --> 00:39:35,833 And then I end up just completely coming down my-- 843 00:39:35,833 --> 00:39:39,266 my ladder to the bottom rung of stagnation. 844 00:39:39,266 --> 00:39:42,400 I was seeing a therapist at Columbia 845 00:39:42,400 --> 00:39:44,000 for the first time ever. 846 00:39:44,000 --> 00:39:45,500 And I, like, went into her office, 847 00:39:45,500 --> 00:39:48,333 and I was shaking and crying, and she couldn't understand. 848 00:39:48,333 --> 00:39:50,166 And she was telling me, you know, 849 00:39:50,166 --> 00:39:51,933 "Why don't you just delete your social media?" 850 00:39:51,933 --> 00:39:53,300 And action. 
851 00:39:53,300 --> 00:39:55,733 And I was like, "What you don't understand 852 00:39:55,733 --> 00:39:57,266 "is that I can't delete these accounts, 853 00:39:57,266 --> 00:40:00,533 because they are what keeps me financially stable." 854 00:40:00,533 --> 00:40:03,200 I pay all of my own bills. 855 00:40:03,200 --> 00:40:05,733 And in addition to that, I pay my mom's bills. 856 00:40:05,733 --> 00:40:08,466 And monetizing on social media 857 00:40:08,466 --> 00:40:12,900 has given me the opportunity to do that. 858 00:40:12,900 --> 00:40:17,166 When I'm being abused or harassed online, 859 00:40:17,166 --> 00:40:20,033 it's almost impossible for me to step away. 860 00:40:20,033 --> 00:40:22,133 And it's kind of like an abusive relationship 861 00:40:22,133 --> 00:40:23,333 in that regard. 862 00:40:23,333 --> 00:40:25,433 I have to open myself up to this hate, 863 00:40:25,433 --> 00:40:29,366 because this is what creates financial stability for me. 864 00:40:29,366 --> 00:40:34,133 ♪ ♪ 865 00:40:34,133 --> 00:40:38,266 [ticking noise] 866 00:40:38,266 --> 00:40:42,266 - What starts out as just a place to be creative 867 00:40:42,266 --> 00:40:47,800 and express yourself becomes this rat race for attention 868 00:40:47,800 --> 00:40:50,766 and this need to constantly chase like counts, 869 00:40:50,766 --> 00:40:52,700 follower counts, and view counts, 870 00:40:52,700 --> 00:40:54,000 a need to constantly perform 871 00:40:54,000 --> 00:40:57,766 in a way that can really break people down. 872 00:40:57,766 --> 00:41:02,766 Social media influencer was the fourth-highest aspiration 873 00:41:02,766 --> 00:41:05,933 among elementary school students. 874 00:41:05,933 --> 00:41:08,966 On the outside, the life of influencers 875 00:41:08,966 --> 00:41:12,133 looks really fun and glamorous. 876 00:41:12,133 --> 00:41:14,633 On the inside, a lot of those influencers, 877 00:41:14,633 --> 00:41:17,466 in addition to getting some external validation, 878 00:41:17,466 --> 00:41:20,866 they're getting a lot of harassment and hate. 879 00:41:20,866 --> 00:41:24,866 They have to perform happiness all of the time. 880 00:41:24,866 --> 00:41:28,566 Many of them are struggling with depression, 881 00:41:28,566 --> 00:41:31,433 anxiety, burnout. 882 00:41:31,433 --> 00:41:37,266 And that is having very real-world consequences. 883 00:41:37,266 --> 00:41:40,566 So that algorithm that's always trying to figure out 884 00:41:40,566 --> 00:41:42,233 what the hottest trends are, 885 00:41:42,233 --> 00:41:44,133 it's constantly lifting something up 886 00:41:44,133 --> 00:41:45,433 to the stratosphere 887 00:41:45,433 --> 00:41:47,533 and then taking it back down again. 888 00:41:47,533 --> 00:41:49,600 To me this is bigger than TikTok. 889 00:41:49,600 --> 00:41:52,700 It's about who in our society gets heard 890 00:41:52,700 --> 00:41:56,133 and what you have to do in our society to get heard. 891 00:41:56,133 --> 00:41:59,133 [light music] 892 00:41:59,133 --> 00:42:04,266 ♪ ♪ 893 00:42:04,266 --> 00:42:07,200 - After my first video about the Uighurs was taken down, 894 00:42:07,200 --> 00:42:09,966 I knew I had to disguise my video. 895 00:42:09,966 --> 00:42:12,400 So I grabbed my pink eyelash curler, 896 00:42:12,400 --> 00:42:15,066 and I start curling my lashes. 897 00:42:15,066 --> 00:42:17,833 This is the one that started it all. 898 00:42:17,833 --> 00:42:20,533 Hi, guys. I'm gonna teach you guys how to get long lashes. 
899 00:42:20,533 --> 00:42:22,866 So the first thing you need to do is grab your lash curler, 900 00:42:22,866 --> 00:42:24,600 curl your lashes, obviously. 901 00:42:24,600 --> 00:42:26,433 Then you're gonna put them down and use your phone 902 00:42:26,433 --> 00:42:27,900 that you're using right now 903 00:42:27,900 --> 00:42:29,500 to search up what's happening in China, 904 00:42:29,500 --> 00:42:31,233 how they're getting concentration camps, 905 00:42:31,233 --> 00:42:32,966 throwing innocent Muslims in there, 906 00:42:32,966 --> 00:42:34,600 separating their families from each other, 907 00:42:34,600 --> 00:42:36,166 kidnapping them, murdering them, 908 00:42:36,166 --> 00:42:37,500 raping them, forcing them to... 909 00:42:37,500 --> 00:42:40,433 I spoke about all of that in just, like, 40 seconds, 910 00:42:40,433 --> 00:42:43,100 and then I continued on to the eyelash tutorial. 911 00:42:43,100 --> 00:42:45,833 This is another Holocaust, yet no one is talking about it. 912 00:42:45,833 --> 00:42:49,933 Please be aware. Please spread awareness. 913 00:42:49,933 --> 00:42:51,300 And, yeah. 914 00:42:51,300 --> 00:42:53,466 So you can grab your lash curler again. 915 00:42:53,466 --> 00:42:57,166 It reached millions, and people were-- 916 00:42:57,166 --> 00:43:01,100 people were shocked in the comments. 917 00:43:01,100 --> 00:43:02,733 - You popped up on my For You page, and I was like, 918 00:43:02,733 --> 00:43:04,000 "Oh, my God, that's Feroza." 919 00:43:04,000 --> 00:43:05,333 I was like, "I sent this to her." 920 00:43:05,333 --> 00:43:06,633 And I was like, "Why are you on my For You page? 921 00:43:06,633 --> 00:43:08,766 And why do you have, like, so many likes?" 922 00:43:08,766 --> 00:43:10,366 It was, like, crazy. 923 00:43:10,366 --> 00:43:12,066 - I had to tell my mom right after that. 924 00:43:12,066 --> 00:43:13,933 I was like, "Ugh, people are sending it now. 925 00:43:13,933 --> 00:43:16,200 I should tell my mom I have TikTok." 926 00:43:16,200 --> 00:43:18,533 - My mom--like, we were watching the news 927 00:43:18,533 --> 00:43:20,700 and she was like, "Is that Feroza?" 928 00:43:20,700 --> 00:43:22,366 - Really? - I was like, "Oh, gosh. 929 00:43:22,366 --> 00:43:25,200 Yes, it's my friend." 930 00:43:25,200 --> 00:43:28,800 - A 40-second video going viral in just one day. 931 00:43:28,800 --> 00:43:32,966 That's, like, the power that TikTok holds. 932 00:43:32,966 --> 00:43:36,600 So I decided to post two more videos the following two days 933 00:43:36,600 --> 00:43:39,800 to just post more information on how to help. 934 00:43:39,800 --> 00:43:41,566 Hey, guys. You wanted a second part to the video 935 00:43:41,566 --> 00:43:43,500 on how to get longer lashes, so here it is. 936 00:43:43,500 --> 00:43:45,166 And by the way, I say that so TikTok 937 00:43:45,166 --> 00:43:46,666 doesn't take down my videos. 938 00:43:46,666 --> 00:43:50,533 I don't think TikTok noticed what I posted at first. 939 00:43:50,533 --> 00:43:52,333 And then the following day-- 940 00:43:52,333 --> 00:43:53,700 it was, like, a Monday-- 941 00:43:53,700 --> 00:43:55,233 I wake up at 5:00 a.m. for school, 942 00:43:55,233 --> 00:43:57,100 and I go on TikTok 943 00:43:57,100 --> 00:43:59,900 to see how many views the next two videos got. 944 00:43:59,900 --> 00:44:03,533 And I see that I can't even go on TikTok. 945 00:44:03,533 --> 00:44:04,700 My account's suspended. 
946 00:44:04,700 --> 00:44:06,833 ♪ ♪ 947 00:44:06,833 --> 00:44:09,400 "Your account is temporarily suspended, 948 00:44:09,400 --> 00:44:13,566 because it goes against Community Guidelines." 949 00:44:13,566 --> 00:44:16,600 Nothing from my post violates community guidelines. 950 00:44:16,600 --> 00:44:20,733 I show nothing of hate speech. I show no profanity. 951 00:44:20,733 --> 00:44:24,933 Everything I spoke about was factual evidence. 952 00:44:24,933 --> 00:44:27,900 My first thought after seeing this black screen 953 00:44:27,900 --> 00:44:30,133 on my phone was, 954 00:44:30,133 --> 00:44:31,966 "I'm not letting them silence me." 955 00:44:31,966 --> 00:44:34,700 I quickly made a Twitter account. 956 00:44:34,700 --> 00:44:38,066 I quickly posted on Instagram that, 957 00:44:38,066 --> 00:44:41,133 "Hey, I was silenced for speaking up. 958 00:44:41,133 --> 00:44:42,933 And I'm not gonna let them get away with that." 959 00:44:42,933 --> 00:44:47,933 And I asked people to continue sharing the video. 960 00:44:47,933 --> 00:44:49,933 I'm so grateful that people 961 00:44:49,933 --> 00:44:53,633 heard me saying that my voice was taken away. 962 00:44:53,633 --> 00:44:55,733 - 17-year-old Feroza Aziz. 963 00:44:55,733 --> 00:44:56,900 - [speaking in native language] 964 00:44:56,900 --> 00:44:59,900 - Joining us now is Feroza Aziz. 965 00:44:59,900 --> 00:45:05,133 - In less than a few days, I was on Al Jazeera, BBC, CNN. 966 00:45:05,133 --> 00:45:08,966 - More than 1.5 million people have watched it just on TikTok. 967 00:45:08,966 --> 00:45:12,733 What kind of responses have you had just from regular people? 968 00:45:12,733 --> 00:45:14,966 - Half of them is positive. And the other half is, 969 00:45:14,966 --> 00:45:17,133 "Wow, I did not know this is happening. 970 00:45:17,133 --> 00:45:19,800 Why am I hearing this on TikTok and why not on the news?" 971 00:45:19,800 --> 00:45:24,166 I did feel a little bit upset, though, because I felt 972 00:45:24,166 --> 00:45:25,933 as if more attention was brought to me 973 00:45:25,933 --> 00:45:29,533 being silenced than to what I was actually speaking about. 974 00:45:29,533 --> 00:45:31,333 I remember seeing headlines saying, 975 00:45:31,333 --> 00:45:34,633 "Oh, Beijing-owned app takes down video" 976 00:45:34,633 --> 00:45:38,633 and not, "Oh, like, there's a Uighur genocide happening." 977 00:45:38,633 --> 00:45:40,166 ♪ ♪ 978 00:45:40,166 --> 00:45:42,733 I felt very overwhelmed with the news coverage. 979 00:45:42,733 --> 00:45:45,966 One of my idols--I have, like, her picture on my wall-- 980 00:45:45,966 --> 00:45:50,466 AOC, retweeted an article about me. 981 00:45:50,466 --> 00:45:53,400 I didn't expect politicians from China 982 00:45:53,400 --> 00:45:54,700 to even comment on it. 983 00:45:54,700 --> 00:45:55,933 - China's foreign ministry 984 00:45:55,933 --> 00:45:58,466 said it had no specifics of this case. 985 00:45:58,466 --> 00:46:00,333 - [speaking Chinese] 986 00:46:00,333 --> 00:46:02,866 - You're saying the content is still on the TikTok account? 987 00:46:02,866 --> 00:46:05,000 I'm not aware of the situation. How could I know 988 00:46:05,000 --> 00:46:07,500 what's happening on the account of one individual? 989 00:46:07,500 --> 00:46:09,500 - I thought I had the freedom of speech, 990 00:46:09,500 --> 00:46:11,733 but I guess under TikTok 991 00:46:11,733 --> 00:46:15,700 that it's not possible for me to have that right. 
992 00:46:15,700 --> 00:46:19,366 ♪ ♪ 993 00:46:19,366 --> 00:46:24,633 [ticking noise] 994 00:46:24,633 --> 00:46:28,300 - Content moderation is a process of determining 995 00:46:28,300 --> 00:46:29,533 what's appropriate 996 00:46:29,533 --> 00:46:32,633 and what's not appropriate online. 997 00:46:32,633 --> 00:46:37,366 One of the natural tensions becomes if you have a company, 998 00:46:37,366 --> 00:46:39,233 but it's all throughout the globe, 999 00:46:39,233 --> 00:46:43,400 do you adjust to the cultural norms of another country? 1000 00:46:43,400 --> 00:46:44,900 I mean, a lot of people 1001 00:46:44,900 --> 00:46:47,800 when they're on Facebook or TikTok or Instagram, 1002 00:46:47,800 --> 00:46:50,133 they talk about it by using concepts, 1003 00:46:50,133 --> 00:46:51,700 like freedom of speech, 1004 00:46:51,700 --> 00:46:56,766 because all throughout society, specifically American society, 1005 00:46:56,766 --> 00:47:00,533 we have debated what's appropriate, 1006 00:47:00,533 --> 00:47:03,866 how you balance individual autonomy 1007 00:47:03,866 --> 00:47:06,466 and expression with the societal impact. 1008 00:47:06,466 --> 00:47:09,866 That used to reside in governmental bodies. 1009 00:47:09,866 --> 00:47:13,166 ♪ ♪ 1010 00:47:13,166 --> 00:47:16,900 With social media, the power of determining speech 1011 00:47:16,900 --> 00:47:19,100 has been far too consolidated. 1012 00:47:19,100 --> 00:47:21,533 Major tech companies, 1013 00:47:21,533 --> 00:47:23,933 if they have the power of deciding what's okay 1014 00:47:23,933 --> 00:47:27,600 and what's not okay with what I say, 1015 00:47:27,600 --> 00:47:30,200 if they have the power to de-platform, 1016 00:47:30,200 --> 00:47:36,533 that puts a tremendous level of power in an unelected official. 1017 00:47:36,533 --> 00:47:38,033 That's antidemocratic. 1018 00:47:38,033 --> 00:47:45,066 ♪ ♪ 1019 00:47:48,533 --> 00:47:50,833 - You know, I might go livestream 1020 00:47:50,833 --> 00:47:53,133 on Douyin a little bit. 1021 00:47:53,133 --> 00:47:56,900 That's what's hard on Douyin, but there's one problem, 1022 00:47:56,900 --> 00:48:01,466 'cause all the restrictions of my tattoos. 1023 00:48:01,466 --> 00:48:03,433 They might shut me down. 1024 00:48:03,433 --> 00:48:06,433 Here we go. We're live. 1025 00:48:06,433 --> 00:48:09,333 Tattoos? No. Also, the piercing? 1026 00:48:09,333 --> 00:48:10,366 No. 1027 00:48:25,066 --> 00:48:26,833 Ahh! 1028 00:48:26,833 --> 00:48:31,400 Sorry, sorry. Hey, you see this? 1029 00:48:31,400 --> 00:48:36,266 Because I have tattoos, I can't go livestream. 1030 00:48:39,566 --> 00:48:42,066 Sorry. 1031 00:48:45,266 --> 00:48:48,366 That's really messed up, you know? 1032 00:48:48,366 --> 00:48:52,033 Yeah, on Douyin, you have to watch everything that you say. 1033 00:48:52,033 --> 00:48:55,800 Just one word, one frame 1034 00:48:55,800 --> 00:48:57,666 can set your whole video off. 1035 00:48:57,666 --> 00:49:00,633 ♪ ♪ 1036 00:49:00,633 --> 00:49:03,066 - For the first time in history, 1037 00:49:03,066 --> 00:49:06,166 a person can write something or say something 1038 00:49:06,166 --> 00:49:10,300 and have it reach a large segment of the world. 1039 00:49:10,300 --> 00:49:12,733 So this brings up the topic of censorship, 1040 00:49:12,733 --> 00:49:14,333 which is really tricky. 1041 00:49:14,333 --> 00:49:17,466 The West, of course, is a society built on free speech. 
1042 00:49:17,466 --> 00:49:18,933 But now in the West, we're grappling 1043 00:49:18,933 --> 00:49:22,066 with our own form of debate around censorship. 1044 00:49:22,066 --> 00:49:23,500 Different government officials, 1045 00:49:23,500 --> 00:49:27,066 even the public are starting to feel great anger 1046 00:49:27,066 --> 00:49:29,366 about the types of misinformation 1047 00:49:29,366 --> 00:49:31,566 that's spread on social networks, 1048 00:49:31,566 --> 00:49:33,700 like Facebook, like Twitter. 1049 00:49:33,700 --> 00:49:37,000 And so now, you're starting to see, even in the U.S., 1050 00:49:37,000 --> 00:49:39,966 increasing rhetoric around passing laws 1051 00:49:39,966 --> 00:49:42,233 to curb their power. 1052 00:49:42,233 --> 00:49:45,866 China, with the Great Firewall and with government moderation, 1053 00:49:45,866 --> 00:49:48,066 has taken a very active hand 1054 00:49:48,066 --> 00:49:50,833 in controlling what topics are discussed, 1055 00:49:50,833 --> 00:49:53,866 what ideas are acceptable to discuss on the internet. 1056 00:49:53,866 --> 00:49:57,733 We've never had to grapple with questions around censorship 1057 00:49:57,733 --> 00:49:59,500 in an era where so many people 1058 00:49:59,500 --> 00:50:03,133 have a global megaphone now in their hands. 1059 00:50:03,133 --> 00:50:06,833 ♪ ♪ 1060 00:50:06,833 --> 00:50:09,633 - It was back in late December when Dr. Li Wenliang first 1061 00:50:09,633 --> 00:50:11,033 warned friends on WeChat 1062 00:50:11,033 --> 00:50:13,566 about a SARS-like disease going around. 1063 00:50:13,566 --> 00:50:15,900 Li sent a group message saying that a test result 1064 00:50:15,900 --> 00:50:17,766 from a patient quarantined at the hospital 1065 00:50:17,766 --> 00:50:21,266 where he worked showed the patient had a coronavirus. 1066 00:50:21,266 --> 00:50:23,066 But hours after hitting Send, 1067 00:50:23,066 --> 00:50:25,566 Wuhan City health officials tracked Li down 1068 00:50:25,566 --> 00:50:28,866 questioning where he got the information. 1069 00:50:28,866 --> 00:50:35,900 ♪ ♪ 1070 00:50:38,733 --> 00:50:41,533 - Dr. Li sounded the alarm early 1071 00:50:41,533 --> 00:50:44,300 in the COVID-19 outbreak. 1072 00:50:44,300 --> 00:50:47,266 He soon faced government intimidation 1073 00:50:47,266 --> 00:50:50,866 and then contracted the virus. 1074 00:50:50,866 --> 00:50:55,233 When he passed away, I was among many Chinese netizens 1075 00:50:55,233 --> 00:50:57,900 who expressed grief and outrage 1076 00:50:57,900 --> 00:51:00,633 at the events on Weibo 1077 00:51:00,633 --> 00:51:05,833 only to have my account deleted. 1078 00:51:05,833 --> 00:51:09,466 I felt guilt more than anger. 1079 00:51:09,466 --> 00:51:13,266 At the time, I was a tech worker at ByteDance, 1080 00:51:13,266 --> 00:51:16,533 where I helped develop tools and platforms 1081 00:51:16,533 --> 00:51:18,866 for content moderation. 1082 00:51:18,866 --> 00:51:22,800 In other words, I had helped build the system 1083 00:51:22,800 --> 00:51:26,200 that censored accounts like mine. 1084 00:51:26,200 --> 00:51:30,133 The technologies we created supported the entire company's 1085 00:51:30,133 --> 00:51:33,733 content moderation, including Douyin at home 1086 00:51:33,733 --> 00:51:39,766 and its international equivalent, TikTok. 1087 00:51:39,766 --> 00:51:43,433 There was a long, constantly updated list 1088 00:51:43,433 --> 00:51:47,733 of sensitive words, dates, and names. 
1089 00:51:47,733 --> 00:51:51,133 If a user mentioned a sensitive term, 1090 00:51:51,133 --> 00:51:54,500 they would shut down the ongoing livestreaming session 1091 00:51:54,500 --> 00:51:59,266 and even suspend or delete the account. 1092 00:51:59,266 --> 00:52:01,933 Many of my colleagues felt uneasy 1093 00:52:01,933 --> 00:52:04,733 about what we were doing. 1094 00:52:04,733 --> 00:52:09,966 But we all felt that there was nothing we could do. 1095 00:52:09,966 --> 00:52:12,466 Dr. Li warned his colleagues and friends 1096 00:52:12,466 --> 00:52:15,166 about an unknown virus. 1097 00:52:15,166 --> 00:52:18,133 He was punished for that. 1098 00:52:18,133 --> 00:52:22,800 Just imagine, had any social media platform been able 1099 00:52:22,800 --> 00:52:26,366 to reject the government's censorship directives, 1100 00:52:26,366 --> 00:52:31,200 perhaps millions of lives would have been saved today. 1101 00:52:31,200 --> 00:52:32,966 - I mean, the thing about TikTok is, 1102 00:52:32,966 --> 00:52:35,966 it says its Chinese owners are just that, owners. 1103 00:52:35,966 --> 00:52:37,033 They don't control it. 1104 00:52:37,033 --> 00:52:40,466 In fact, TikTok takes a much, 1105 00:52:40,466 --> 00:52:43,333 much stronger attitude against the sort of content 1106 00:52:43,333 --> 00:52:44,966 that, well, that the Chinese government 1107 00:52:44,966 --> 00:52:48,333 wouldn't like to see on a social media app. 1108 00:52:48,333 --> 00:52:50,400 There's no orders coming down from on high. 1109 00:52:50,400 --> 00:52:52,500 There's just the understanding 1110 00:52:52,500 --> 00:52:55,166 that you will do what Beijing wants, 1111 00:52:55,166 --> 00:52:56,866 and you'll try and guess what they want 1112 00:52:56,866 --> 00:52:59,033 and do it without being asked. 1113 00:52:59,033 --> 00:53:05,866 ♪ ♪ 1114 00:53:05,866 --> 00:53:10,933 - In 2019, we had someone contact us 1115 00:53:10,933 --> 00:53:14,666 claiming to have internal information and access 1116 00:53:14,666 --> 00:53:18,433 to internal TikTok moderation guidelines. 1117 00:53:18,433 --> 00:53:19,800 ♪ ♪ 1118 00:53:19,800 --> 00:53:24,366 And I don't think we realized at the time 1119 00:53:24,366 --> 00:53:27,300 how big the story would get. 1120 00:53:27,300 --> 00:53:31,533 [phone buzzing] 1121 00:53:31,533 --> 00:53:37,200 What we saw was that TikTok was very explicit 1122 00:53:37,200 --> 00:53:40,400 about what it wanted to have on the platform 1123 00:53:40,400 --> 00:53:44,200 and what it didn't want to show on the platform. 1124 00:53:44,200 --> 00:53:47,733 TikTok rarely deletes content. 1125 00:53:47,733 --> 00:53:50,200 They don't have to. They can just hide it. 1126 00:53:50,200 --> 00:53:51,733 ♪ ♪ 1127 00:53:51,733 --> 00:53:54,900 The guidelines were explicitly instructing moderators 1128 00:53:54,900 --> 00:53:56,566 to deal with people 1129 00:53:56,566 --> 00:54:01,233 who are LGBTQ or had disabilities 1130 00:54:01,233 --> 00:54:04,100 or for whatever reason TikTok felt 1131 00:54:04,100 --> 00:54:10,166 were vulnerable to bullying by hiding their content. 1132 00:54:12,100 --> 00:54:13,800 So it was in Mandarin 1133 00:54:13,800 --> 00:54:18,333 and underneath a fairly awkward English translation. 1134 00:54:18,333 --> 00:54:20,333 So it says, 1135 00:54:20,333 --> 00:54:23,400 "Subjects who is susceptible to bullying or harassment 1136 00:54:23,400 --> 00:54:26,866 "based on their physical or mental condition. 
1137 00:54:26,866 --> 00:54:32,466 "Example: facial disfigurement, autism, 1138 00:54:32,466 --> 00:54:34,200 "Down Syndrome, 1139 00:54:34,200 --> 00:54:36,500 "disabled people or people 1140 00:54:36,500 --> 00:54:39,533 with some facial problems," et cetera. 1141 00:54:39,533 --> 00:54:42,100 "Content of subjects likely to incite 1142 00:54:42,100 --> 00:54:48,366 cyberbullying will be allowed but marked with risk tag 4." 1143 00:54:48,366 --> 00:54:50,333 ♪ ♪ 1144 00:54:50,333 --> 00:54:52,466 Basically, like, different levels 1145 00:54:52,466 --> 00:54:54,900 of what we call algorithmic punishment 1146 00:54:54,900 --> 00:54:57,133 or algorithmic visibility. 1147 00:54:57,133 --> 00:55:00,233 So they were put in a category called Risk 4, 1148 00:55:00,233 --> 00:55:03,133 which means that as soon as their videos 1149 00:55:03,133 --> 00:55:06,566 would reach a certain threshold of views, 1150 00:55:06,566 --> 00:55:08,500 they would automatically also be taken 1151 00:55:08,500 --> 00:55:12,433 from the For You feed. 1152 00:55:12,433 --> 00:55:15,366 ♪ ♪ 1153 00:55:15,366 --> 00:55:19,433 Later on, other leaks surfaced. 1154 00:55:41,300 --> 00:55:44,600 - I actually have the Ugly Content Policy 1155 00:55:44,600 --> 00:55:46,066 right in front of me. 1156 00:55:46,066 --> 00:55:48,533 It's so crazy to read this, like, 1157 00:55:48,533 --> 00:55:53,300 "Abnormal body shape, chubby, ugly facial looks, 1158 00:55:53,300 --> 00:55:56,066 "not limited to disformatted face, 1159 00:55:56,066 --> 00:55:58,366 fangs, lack of front teeth, 1160 00:55:58,366 --> 00:56:00,966 senior people with too many wrinkles." 1161 00:56:00,966 --> 00:56:03,000 And it just goes on and on, right? 1162 00:56:05,100 --> 00:56:07,366 - Yeah. 1163 00:56:14,366 --> 00:56:16,333 - In a statement, TikTok said, 1164 00:56:16,333 --> 00:56:18,033 "Earlier, we took a blunt approach 1165 00:56:18,033 --> 00:56:20,733 "to minimizing conflict on the platform. 1166 00:56:20,733 --> 00:56:24,333 "Today we use local content moderation policies. 1167 00:56:24,333 --> 00:56:27,566 "We want TikTok to be a space where everyone can safely 1168 00:56:27,566 --> 00:56:31,333 and freely express themselves." 1169 00:56:31,333 --> 00:56:34,233 - It's just a lot of the move-fast-and-break-things 1170 00:56:34,233 --> 00:56:37,133 attitude that we've seen from other Silicon Valley companies. 1171 00:56:37,133 --> 00:56:38,666 ♪ ♪ 1172 00:56:38,666 --> 00:56:44,533 It's not like only TikTok was doing these things. 1173 00:56:45,666 --> 00:56:50,766 Obviously, the representation that we see in media 1174 00:56:50,766 --> 00:56:56,033 is not an accurate picture of society. 1175 00:56:56,033 --> 00:56:58,666 But I think there is a difference that, 1176 00:56:58,666 --> 00:57:02,266 you know, no TV station nor does Hollywood 1177 00:57:02,266 --> 00:57:06,433 pretend to be open access to everybody, 1178 00:57:06,433 --> 00:57:11,200 whereas this is a promise that social media platforms make. 1179 00:57:11,200 --> 00:57:18,100 ♪ ♪ 1180 00:57:18,100 --> 00:57:20,233 - TikTok faces government investigation 1181 00:57:20,233 --> 00:57:22,066 in over seven countries, 1182 00:57:22,066 --> 00:57:24,633 all citing concerns over national security 1183 00:57:24,633 --> 00:57:28,800 and content moderation. 1184 00:57:28,800 --> 00:57:33,666 [ticking noise] 1185 00:57:33,666 --> 00:57:35,133 - Am I the only one that has noticed 1186 00:57:35,133 --> 00:57:38,533 that Black creators get least favored by the algorithm? 
1187 00:57:38,533 --> 00:57:40,400 How is it that my followers are not seeing my video? 1188 00:57:40,400 --> 00:57:41,766 What's up with the algorithm? 1189 00:57:41,766 --> 00:57:45,100 - I've had some of my TikTok videos get zero views. 1190 00:57:45,100 --> 00:57:47,166 And I've been shadow-banned. 1191 00:57:47,166 --> 00:57:50,966 - Shadow-banning on TikTok is just when 1192 00:57:50,966 --> 00:57:52,733 there's something in the algorithm 1193 00:57:52,733 --> 00:57:55,400 that just kind of shuts you out completely. 1194 00:57:55,400 --> 00:57:58,200 They just, like, find a way to make it 1195 00:57:58,200 --> 00:58:00,600 so nobody sees any of your content. 1196 00:58:00,600 --> 00:58:03,766 ♪ ♪ 1197 00:58:03,766 --> 00:58:05,933 I am an apprenticing ocularist, 1198 00:58:05,933 --> 00:58:08,333 an artist who works in the medical field 1199 00:58:08,333 --> 00:58:11,633 making prosthetic eyes. 1200 00:58:11,633 --> 00:58:14,933 TikTok's algorithm is very good. 1201 00:58:14,933 --> 00:58:17,400 You know, you can create an account, 1202 00:58:17,400 --> 00:58:19,700 and within a couple of hours or a couple of days, 1203 00:58:19,700 --> 00:58:24,066 that algorithm knows who you are, you know? 1204 00:58:24,066 --> 00:58:26,566 So for that same algorithm to kind of 1205 00:58:26,566 --> 00:58:30,066 just rip the rug out from thousands of Black creators, 1206 00:58:30,066 --> 00:58:33,433 it kind of pulls you back for a second. 1207 00:58:33,433 --> 00:58:35,766 We know the history that this nation has with Black people. 1208 00:58:35,766 --> 00:58:37,933 We know the savagery that they had to endure 1209 00:58:37,933 --> 00:58:40,266 because of colonizers and the savagery 1210 00:58:40,266 --> 00:58:41,266 that they still have to endure. 1211 00:58:41,266 --> 00:58:42,966 I got on, and I had made a video 1212 00:58:42,966 --> 00:58:45,966 talking about how my For You page 1213 00:58:45,966 --> 00:58:48,400 was only white creators. 1214 00:58:48,400 --> 00:58:52,166 And by that point, I would say I had 1215 00:58:52,166 --> 00:58:56,000 150,000 to 200,000 followers. 1216 00:58:56,000 --> 00:58:59,066 And so I had a video sitting that I published 1217 00:58:59,066 --> 00:59:02,166 for three hours, and it said zero views. 1218 00:59:02,166 --> 00:59:04,633 That was the first time where I was like, 1219 00:59:04,633 --> 00:59:07,033 "This is blatant shadow-banning. 1220 00:59:07,033 --> 00:59:08,366 "Nobody's seeing this, 1221 00:59:08,366 --> 00:59:11,866 'cause they're ensuring that nobody can." 1222 00:59:11,866 --> 00:59:13,433 - Tech troubles. 1223 00:59:13,433 --> 00:59:16,166 TikTok says a technical glitch is making it appear 1224 00:59:16,166 --> 00:59:19,333 as if posts with the #BlackLivesMatter 1225 00:59:19,333 --> 00:59:21,800 and #GeorgeFloyd receive no views. 1226 00:59:21,800 --> 00:59:25,633 The video platform says it's dealing with a display issue 1227 00:59:25,633 --> 00:59:27,933 adding that videos featuring those tags 1228 00:59:27,933 --> 00:59:30,966 have amassed more than 2 billion views. 1229 00:59:30,966 --> 00:59:33,933 - Normally when you go to use a tag on TikTok, 1230 00:59:33,933 --> 00:59:39,200 it'll tell you how many views have been on that tag. 1231 00:59:39,200 --> 00:59:41,500 And you would go to write Black Lives Matter, 1232 00:59:41,500 --> 00:59:43,233 and it would say zero. 1233 00:59:43,233 --> 00:59:45,866 Or, you know, BLM, George Floyd, 1234 00:59:45,866 --> 00:59:49,933 Ahmaud Arbery, anything, it would tell you zero. 
1235 00:59:49,933 --> 00:59:51,966 - TikTok said in a statement which reads in part, 1236 00:59:51,966 --> 00:59:53,433 "First, to our Black community, 1237 00:59:53,433 --> 00:59:55,000 we want you to know that..." 1238 00:59:55,000 --> 00:59:56,033 - "Last week, a technical glitch 1239 00:59:56,033 --> 00:59:57,533 "made it temporarily appear 1240 00:59:57,533 --> 01:00:01,133 "as if posts uploaded using #BlackLivesMatter 1241 01:00:01,133 --> 01:00:04,400 "and #GeorgeFloyd would receive zero views. 1242 01:00:04,400 --> 01:00:06,033 "We understand that many assume this bug 1243 01:00:06,033 --> 01:00:08,366 "to be an intentional act to suppress experiences 1244 01:00:08,366 --> 01:00:11,033 "and invalidate the emotions felt by the Black community. 1245 01:00:11,033 --> 01:00:12,500 "And we know we have work to do 1246 01:00:12,500 --> 01:00:14,866 to regain and repair that trust." 1247 01:00:14,866 --> 01:00:16,933 Kind of, like, the normal checkpoints 1248 01:00:16,933 --> 01:00:18,466 that people go through. 1249 01:00:18,466 --> 01:00:19,500 "We're growing. We're learning. 1250 01:00:19,500 --> 01:00:23,033 We're trying to do better." 1251 01:00:23,033 --> 01:00:25,633 I would love to believe that it was a technical glitch, 1252 01:00:25,633 --> 01:00:28,666 'cause you're like, "That's absolutely possible, 100%." 1253 01:00:28,666 --> 01:00:32,733 But it's so oddly specific that I can't attribute 1254 01:00:32,733 --> 01:00:35,966 that to just being a glitch. 1255 01:00:35,966 --> 01:00:37,400 - TikTok has said that their content moderation 1256 01:00:37,400 --> 01:00:39,000 has changed. 1257 01:00:39,000 --> 01:00:42,033 Some of what you see on there backs that up in the sense 1258 01:00:42,033 --> 01:00:44,833 that you see a lot of activism there. 1259 01:00:44,833 --> 01:00:46,333 You saw Black Lives Matter content 1260 01:00:46,333 --> 01:00:48,833 eventually be up on there. 1261 01:00:48,833 --> 01:00:51,300 But it's constantly changing. 1262 01:00:51,300 --> 01:00:52,833 It's a constant black box. 1263 01:00:52,833 --> 01:00:56,300 We have no idea what's going into any of these algorithms. 1264 01:00:56,300 --> 01:00:59,333 And there's zero transparency. 1265 01:00:59,333 --> 01:01:02,600 [birds chirping] 1266 01:01:02,600 --> 01:01:05,233 - ByteDance, the Beijing-based owner of TikTok, 1267 01:01:05,233 --> 01:01:07,300 apologized for the suspension 1268 01:01:07,300 --> 01:01:09,666 blaming a human moderation error. 1269 01:01:09,666 --> 01:01:12,500 And TikTok says it doesn't apply Chinese moderation 1270 01:01:12,500 --> 01:01:17,333 principles to its product outside of Mainland China. 1271 01:01:17,333 --> 01:01:21,333 - After a few days, TikTok gave my account back. 1272 01:01:21,333 --> 01:01:23,366 People don't seem to understand what it feels like 1273 01:01:23,366 --> 01:01:26,300 to have someone try to take away your voice, 1274 01:01:26,300 --> 01:01:28,600 and then they give it back to you. 1275 01:01:28,600 --> 01:01:30,133 It's my voice. 1276 01:01:30,133 --> 01:01:32,566 And them deciding to give me back my account 1277 01:01:32,566 --> 01:01:35,000 after taking it away, it was as if they could control 1278 01:01:35,000 --> 01:01:37,800 what I could say and what I could do. 1279 01:01:37,800 --> 01:01:43,066 And it's just disgusting to see an app do that. 1280 01:01:43,066 --> 01:01:45,033 Till this day, my classmates will post 1281 01:01:45,033 --> 01:01:49,333 on my social media accounts leaving hate comments. 
1282 01:01:49,333 --> 01:01:50,666 I can delete the comment, 1283 01:01:50,666 --> 01:01:52,233 but I'm gonna go to class the next day, 1284 01:01:52,233 --> 01:01:54,733 and I'm gonna sit next to the person who hates my guts 1285 01:01:54,733 --> 01:01:56,566 for just speaking on issues 1286 01:01:56,566 --> 01:02:00,933 that I believe needs to be spoken about. 1287 01:02:00,933 --> 01:02:04,200 When you're so invested on apps like TikTok, 1288 01:02:04,200 --> 01:02:06,800 when something bad happens on social media, 1289 01:02:06,800 --> 01:02:09,733 your life is torn apart. 1290 01:02:09,733 --> 01:02:13,400 ♪ ♪ 1291 01:02:13,400 --> 01:02:18,500 [ticking noise] 1292 01:02:18,500 --> 01:02:21,966 - Storytime. We all worked on the Kamala Harris campaign 1293 01:02:21,966 --> 01:02:23,866 in the presidential primary. 1294 01:02:23,866 --> 01:02:25,966 - And this is your sign to get a tattoo 1295 01:02:25,966 --> 01:02:27,400 with your work besties. 1296 01:02:27,400 --> 01:02:29,266 - For the people. 1297 01:02:29,266 --> 01:02:30,766 - I was 19 years old 1298 01:02:30,766 --> 01:02:32,633 when I started on the Kamala Harris campaign. 1299 01:02:32,633 --> 01:02:35,300 I withdrew my sophomore year at Columbia. 1300 01:02:35,300 --> 01:02:37,533 And it was a huge move. 1301 01:02:37,533 --> 01:02:39,733 I think that our perspective as young people 1302 01:02:39,733 --> 01:02:42,333 is what led us to think TikTok is important. 1303 01:02:42,333 --> 01:02:43,633 There's a lot of young people there. 1304 01:02:43,633 --> 01:02:45,133 - We were the first campaign 1305 01:02:45,133 --> 01:02:48,800 that was putting content on TikTok directly. 1306 01:02:48,800 --> 01:02:51,466 - I feel like we have to take a moment for this. This. 1307 01:02:51,466 --> 01:02:53,200 - Oh, thank you. - Oh, yes. 1308 01:02:53,200 --> 01:02:55,666 - All of Yessica's iconic shots. 1309 01:02:55,666 --> 01:02:58,500 - So good. 1310 01:02:58,500 --> 01:03:01,700 - I mean, you pioneered vertical video. 1311 01:03:01,700 --> 01:03:03,533 - And it was interesting to see the progression. 1312 01:03:03,533 --> 01:03:06,033 Like, when different candidates started to have, 1313 01:03:06,033 --> 01:03:07,866 like, their official TikToks 1314 01:03:07,866 --> 01:03:09,466 and, like, what kind of content 1315 01:03:09,466 --> 01:03:11,666 did they make on TikTok? 1316 01:03:11,666 --> 01:03:13,666 I think we felt like we were kind of starting 1317 01:03:13,666 --> 01:03:15,066 to hit our stride with TikTok. - Yeah. 1318 01:03:15,066 --> 01:03:17,933 - And then we, like, had to stop. 1319 01:03:17,933 --> 01:03:20,133 - That email, that dreadful, dreadful email we got. 1320 01:03:20,133 --> 01:03:21,933 - The dark day when we were told 1321 01:03:21,933 --> 01:03:24,466 that we couldn't be on TikTok anymore. 1322 01:03:24,466 --> 01:03:28,133 - And I get this email that because of security reasons, 1323 01:03:28,133 --> 01:03:30,766 we're all being asked to delete TikTok 1324 01:03:30,766 --> 01:03:33,800 on government and military phones. 1325 01:03:33,800 --> 01:03:35,533 That was a sad email. That was a tough one. 1326 01:03:35,533 --> 01:03:36,966 ♪ ♪ 1327 01:03:36,966 --> 01:03:39,433 From a perspective of a young person, 1328 01:03:39,433 --> 01:03:42,600 I have pretty much no conceptions about privacy, 1329 01:03:42,600 --> 01:03:44,700 because in all honesty, 1330 01:03:44,700 --> 01:03:47,733 I've grown up with everyone seeing everything. 
1331 01:03:47,733 --> 01:03:50,166 The way that we use these social platforms, 1332 01:03:50,166 --> 01:03:52,466 we often don't think about the global impacts. 1333 01:03:52,466 --> 01:03:54,300 - TikTok's ownership by a Chinese 1334 01:03:54,300 --> 01:03:56,766 parent company subject to Chinese surveillance law 1335 01:03:56,766 --> 01:03:59,466 has made the app's popularity problematic, 1336 01:03:59,466 --> 01:04:01,900 causing concern from the U.S. army, the navy, 1337 01:04:01,900 --> 01:04:05,200 the TSA, the DNC, the RNC, and the Biden campaign, 1338 01:04:05,200 --> 01:04:07,400 all banning TikTok from their phones. 1339 01:04:07,400 --> 01:04:08,600 [helicopter blades whirring] 1340 01:04:08,600 --> 01:04:10,000 - There were many young soldiers 1341 01:04:10,000 --> 01:04:12,400 in the military who were using TikTok. 1342 01:04:12,400 --> 01:04:15,533 They were in all sorts of U.S. military bases 1343 01:04:15,533 --> 01:04:16,900 around the world. 1344 01:04:16,900 --> 01:04:18,066 And so they would go, and they would do 1345 01:04:18,066 --> 01:04:19,500 push-up contests. 1346 01:04:19,500 --> 01:04:21,800 They would, you know, do tours of the bases. 1347 01:04:21,800 --> 01:04:23,633 And, you know, they were really showing 1348 01:04:23,633 --> 01:04:26,800 some pretty top secret assets 1349 01:04:26,800 --> 01:04:29,466 to anyone in the world wanting to see them. 1350 01:04:29,466 --> 01:04:30,766 And this was at a moment 1351 01:04:30,766 --> 01:04:34,633 where people were not taking TikTok seriously. 1352 01:04:34,633 --> 01:04:38,200 But what they realized was this silly little kids' app 1353 01:04:38,200 --> 01:04:41,400 was collecting a ton of information on GPS 1354 01:04:41,400 --> 01:04:44,533 and location of all of these soldiers. 1355 01:04:44,533 --> 01:04:46,200 And all of that in the end 1356 01:04:46,200 --> 01:04:50,566 was heading back into a Chinese company. 1357 01:04:50,566 --> 01:04:52,166 - ♪ It's not too long, it's not too long ♪ 1358 01:04:52,166 --> 01:04:54,733 ♪ It's not too long for you to call back ♪ 1359 01:04:54,733 --> 01:04:57,066 ♪ And normally I would just forget that ♪ 1360 01:04:57,066 --> 01:04:59,166 - From a nation state's perspective, 1361 01:04:59,166 --> 01:05:01,833 well, data is the new oil. 1362 01:05:01,833 --> 01:05:04,700 If I can understand the connections between people, 1363 01:05:04,700 --> 01:05:07,000 I can start to target my misinformation 1364 01:05:07,000 --> 01:05:09,433 so that one person is likely to take actions 1365 01:05:09,433 --> 01:05:13,666 in the real world, like vote. 1366 01:05:13,666 --> 01:05:16,433 So the data that TikTok collects is on par 1367 01:05:16,433 --> 01:05:18,033 with what the other social media companies 1368 01:05:18,033 --> 01:05:20,533 are collecting. 1369 01:05:20,533 --> 01:05:24,833 So the question really becomes, why is TikTok being picked on? 1370 01:05:24,833 --> 01:05:27,500 Xenophobia should certainly be considered as part of this. 1371 01:05:27,500 --> 01:05:31,100 We've seen a rise in hate crimes against Asian Americans. 1372 01:05:31,100 --> 01:05:34,733 And so I think being very clear about the differences 1373 01:05:34,733 --> 01:05:38,933 of the practices of a government versus the people 1374 01:05:38,933 --> 01:05:41,966 that happen to reside inside of that nation state--after all, 1375 01:05:41,966 --> 01:05:45,366 I don't agree with 100% of the things our nation does. 
1376 01:05:45,366 --> 01:05:49,600 - ♪ I'm singing Trump 2020, Trump 2020 ♪ 1377 01:05:49,600 --> 01:05:52,466 ♪ Trump 2020, Trump 2020 ♪ 1378 01:05:52,466 --> 01:05:55,233 ♪ I'm singing Trump 2020, Trump 2020 ♪ 1379 01:05:55,233 --> 01:05:59,300 - When I first found Gen Z comedians online, 1380 01:05:59,300 --> 01:06:01,933 it was so inspiring to me as a comedian 1381 01:06:01,933 --> 01:06:03,933 in seeing how easy it is 1382 01:06:03,933 --> 01:06:07,333 to build traction on apps like TikTok. 1383 01:06:07,333 --> 01:06:09,300 ♪ ♪ 1384 01:06:09,300 --> 01:06:12,466 One of my friends had posted that Donald Trump's Tulsa 1385 01:06:12,466 --> 01:06:16,533 rally had free tickets, and my first thought was just, 1386 01:06:16,533 --> 01:06:20,000 "How easy is it to get a ticket?" 1387 01:06:20,000 --> 01:06:21,833 Guys, Donald Trump is having a rally. 1388 01:06:21,833 --> 01:06:23,900 All you have to do is give your phone number. 1389 01:06:23,900 --> 01:06:25,933 And so I got two tickets. 1390 01:06:25,933 --> 01:06:29,866 But I totally forgot that I have to pick 1391 01:06:29,866 --> 01:06:33,333 every individual piece of lint off of my floor 1392 01:06:33,333 --> 01:06:38,533 and then sort them by size, so I can't make it for Friday. 1393 01:06:38,533 --> 01:06:42,733 I had realized the potential of this. 1394 01:06:42,733 --> 01:06:45,566 You should be really careful going to do this. 1395 01:06:45,566 --> 01:06:48,433 You know, you don't want a bunch of empty seats. 1396 01:06:48,433 --> 01:06:53,166 And when I had posted it, I didn't think much of it. 1397 01:06:53,166 --> 01:06:56,366 But in two days, it just blew up. 1398 01:06:56,366 --> 01:06:59,700 - Oh, my God! I just registered for Trump's rally, 1399 01:06:59,700 --> 01:07:03,233 and I'm so excited to not go. Mm-hmm. 1400 01:07:03,233 --> 01:07:05,266 - We've never had an empty seat, 1401 01:07:05,266 --> 01:07:06,800 and we certainly won't in Oklahoma. 1402 01:07:06,800 --> 01:07:09,600 - TikTok users may well be President Trump's 1403 01:07:09,600 --> 01:07:12,466 latest adversary after thousands of people 1404 01:07:12,466 --> 01:07:15,166 who'd gotten tickets online didn't show up, 1405 01:07:15,166 --> 01:07:18,466 thanks to a secret campaign on TikTok. 1406 01:07:18,466 --> 01:07:22,100 - We had gotten over a million tickets sold, 1407 01:07:22,100 --> 01:07:24,000 and only 6,000 people showed up. 1408 01:07:24,000 --> 01:07:27,000 - President Trump was frustrated and angry. 1409 01:07:27,000 --> 01:07:28,666 - "He yelled at aides backstage 1410 01:07:28,666 --> 01:07:32,000 while looking at the endless rows of empty blue seats." 1411 01:07:32,000 --> 01:07:35,733 - TikTok is definitely giving teenagers new power. 1412 01:07:35,733 --> 01:07:38,200 - I think it's unbelievable 1413 01:07:38,200 --> 01:07:41,333 that I was able to prank an American president. 1414 01:07:41,333 --> 01:07:43,100 - Trump nemesis New York Congresswoman 1415 01:07:43,100 --> 01:07:46,066 Alexandria Ocasio-Cortez gloated... 1416 01:07:49,866 --> 01:07:52,300 - As soon as that rally happened, 1417 01:07:52,300 --> 01:07:56,266 that's when the rhetoric on TikTok rose to a level 1418 01:07:56,266 --> 01:07:57,933 that we hadn't seen before. 1419 01:07:57,933 --> 01:08:00,700 The real China hawks in his administration 1420 01:08:00,700 --> 01:08:02,366 were ready to go after this company, 1421 01:08:02,366 --> 01:08:04,500 and they were kind of waiting for the moment. 
1422 01:08:04,500 --> 01:08:07,833 And this rally and the pandemic came together 1423 01:08:07,833 --> 01:08:09,500 to give them that moment that they needed. 1424 01:08:09,500 --> 01:08:12,500 - The Pentagon, the Department of State, 1425 01:08:12,500 --> 01:08:14,300 the Department of Homeland Security, 1426 01:08:14,300 --> 01:08:17,633 and the TSA have all banned their employees 1427 01:08:17,633 --> 01:08:21,500 and service members from using TikTok on government devices. 1428 01:08:21,500 --> 01:08:23,700 And we know that it's a national security risk. 1429 01:08:23,700 --> 01:08:26,066 - People really pounced on this moment, 1430 01:08:26,066 --> 01:08:28,700 not only the China hawks in the U.S. government 1431 01:08:28,700 --> 01:08:32,633 but also the tech companies, particularly Facebook. 1432 01:08:32,633 --> 01:08:34,966 - Do you believe that the Chinese government 1433 01:08:34,966 --> 01:08:38,733 steals technology from U.S. companies? 1434 01:08:38,733 --> 01:08:40,333 - Congressman, I think it's well-documented 1435 01:08:40,333 --> 01:08:41,633 that the Chinese government 1436 01:08:41,633 --> 01:08:44,500 steals technology from American companies. 1437 01:08:44,500 --> 01:08:47,033 - And so Mark Zuckerberg saw this as a moment, 1438 01:08:47,033 --> 01:08:48,833 and Facebook pounced on this moment 1439 01:08:48,833 --> 01:08:51,433 where TikTok was coming under pressure. 1440 01:08:51,433 --> 01:08:53,833 And he said, "I'm gonna turn this up even more." 1441 01:08:53,833 --> 01:08:56,200 And so they started making their case 1442 01:08:56,200 --> 01:08:58,333 to the different people in Congress 1443 01:08:58,333 --> 01:09:00,500 who were really going after Facebook. 1444 01:09:00,500 --> 01:09:02,100 And they were saying, "You know what? 1445 01:09:02,100 --> 01:09:04,266 "You're looking at us as the boogeyman, 1446 01:09:04,266 --> 01:09:07,333 "but we're just a distraction from the real problem, 1447 01:09:07,333 --> 01:09:09,066 "which are the Chinese tech companies. 1448 01:09:09,066 --> 01:09:10,533 "And those are the companies 1449 01:09:10,533 --> 01:09:12,200 that you should be looking at." 1450 01:09:12,200 --> 01:09:14,066 - Now, "The Wall Street Journal" is reporting that 1451 01:09:14,066 --> 01:09:18,933 not only did Mark Zuckerberg publicly go against TikTok, 1452 01:09:18,933 --> 01:09:22,666 he also lobbied behind the scenes against the company 1453 01:09:22,666 --> 01:09:25,100 at a private dinner with the president. 1454 01:09:25,100 --> 01:09:26,833 - There was a moment during the pandemic 1455 01:09:26,833 --> 01:09:28,866 where cases were going up. 1456 01:09:28,866 --> 01:09:30,200 We didn't have a vaccine. 1457 01:09:30,200 --> 01:09:33,466 Donald Trump's campaign wasn't doing so well. 1458 01:09:33,466 --> 01:09:37,533 And so Donald Trump started really hammering this idea home 1459 01:09:37,533 --> 01:09:40,900 that we need to blame China for the coronavirus 1460 01:09:40,900 --> 01:09:42,100 and this pandemic. 1461 01:09:42,100 --> 01:09:45,900 - Kung flu, the Chinese virus. 1462 01:09:45,900 --> 01:09:47,300 - Why do you keep using this? - 'Cause it comes from China. 1463 01:09:47,300 --> 01:09:49,166 - A lot of people say it's racist. 1464 01:09:49,166 --> 01:09:50,400 - It's not racist at all. 1465 01:09:50,400 --> 01:09:53,233 No, not at all. It comes from China. 1466 01:09:53,233 --> 01:09:54,533 - Trump loved that. 
1467 01:09:54,533 --> 01:09:55,933 He wanted to play to that, 1468 01:09:55,933 --> 01:09:58,266 because it became this kind of rallying cry 1469 01:09:58,266 --> 01:10:01,166 in the U.S. to go after China. 1470 01:10:01,166 --> 01:10:05,666 And TikTok kind of became this symbol of China 1471 01:10:05,666 --> 01:10:07,600 at that moment. 1472 01:10:07,600 --> 01:10:09,900 - There have been more than 2,500 incidents 1473 01:10:09,900 --> 01:10:12,333 of anti-Asian hate crimes. 1474 01:10:12,333 --> 01:10:14,100 - It's not just in the U.S. 1475 01:10:14,100 --> 01:10:17,033 Asians around the world have reported discrimination 1476 01:10:17,033 --> 01:10:19,433 linked to coronavirus. 1477 01:10:19,433 --> 01:10:21,600 - Asian hate, it didn't just start now. 1478 01:10:21,600 --> 01:10:24,733 It was always there. It was always there. 1479 01:10:28,400 --> 01:10:32,266 It's really hard right now to be a Chinese-American. 1480 01:10:48,200 --> 01:10:50,333 - Suddenly, because of the pandemic, 1481 01:10:50,333 --> 01:10:54,733 TikTok became this symbol of this fight 1482 01:10:54,733 --> 01:10:56,433 between the U.S. and China 1483 01:10:56,433 --> 01:10:59,600 and a way for Donald Trump to kind of deflect blame. 1484 01:10:59,600 --> 01:11:01,466 - It all started last Friday 1485 01:11:01,466 --> 01:11:03,500 when President Trump sent shock waves 1486 01:11:03,500 --> 01:11:05,966 through social media after making this comment. 1487 01:11:05,966 --> 01:11:07,366 - We're looking at TikTok. 1488 01:11:07,366 --> 01:11:08,966 We may be banning TikTok. 1489 01:11:08,966 --> 01:11:11,300 - The president threatened to block the popular video app 1490 01:11:11,300 --> 01:11:13,633 citing national security concerns. 1491 01:11:13,633 --> 01:11:16,400 - No, we're not a national security threat. 1492 01:11:16,400 --> 01:11:18,466 And we've said that time and again. 1493 01:11:18,466 --> 01:11:21,500 We have very strict data access and controls. 1494 01:11:21,500 --> 01:11:23,233 - TikTok has said, "American user data 1495 01:11:23,233 --> 01:11:25,500 "is stored in the U.S. and backed up in Singapore, 1496 01:11:25,500 --> 01:11:26,866 not in China." 1497 01:11:26,866 --> 01:11:29,900 - We are at a time where we're seeing a-- 1498 01:11:29,900 --> 01:11:31,700 very much geopolitical tension, 1499 01:11:31,700 --> 01:11:34,033 as you know, between the U.S. and China. 1500 01:11:34,033 --> 01:11:35,600 And we are in the middle of that. 1501 01:11:35,600 --> 01:11:38,500 - In China, there is a cybersecurity law that states, 1502 01:11:38,500 --> 01:11:39,633 "If we ask you for information, 1503 01:11:39,633 --> 01:11:41,133 then you have to give it to us." 1504 01:11:41,133 --> 01:11:44,533 - A 2017 law mandates that Chinese-owned companies 1505 01:11:44,533 --> 01:11:46,666 have to cooperate with the Communist Party. 1506 01:11:46,666 --> 01:11:48,966 - And so that's kind of the heart of the problem. 1507 01:11:48,966 --> 01:11:50,566 TikTok can swear up and down 1508 01:11:50,566 --> 01:11:52,833 that they've never been asked to give information, 1509 01:11:52,833 --> 01:11:55,133 but that doesn't stop the Chinese government 1510 01:11:55,133 --> 01:11:57,666 from taking information in the future. 1511 01:11:59,133 --> 01:12:01,833 I remember this moment where we have the pandemic, 1512 01:12:01,833 --> 01:12:03,300 we have Black Lives Matter protests, 1513 01:12:03,300 --> 01:12:05,766 we have wildfires in California. 
1514 01:12:05,766 --> 01:12:08,066 Like, the world feels like it's falling apart, 1515 01:12:08,066 --> 01:12:09,900 and the only thing people on the news 1516 01:12:09,900 --> 01:12:12,766 are talking about is this ban of TikTok. 1517 01:12:12,766 --> 01:12:15,666 - Just the threat alone has already had a huge impact. 1518 01:12:15,666 --> 01:12:17,900 Advertisers have been hitting pause on campaigns 1519 01:12:17,900 --> 01:12:19,233 worth millions of dollars. 1520 01:12:19,233 --> 01:12:22,700 - It caused absolute chaos in the tech industry. 1521 01:12:22,700 --> 01:12:25,600 Like, Apple and Google and everyone was sort of struggling 1522 01:12:25,600 --> 01:12:26,900 to get a handle on it 1523 01:12:26,900 --> 01:12:29,766 and thinking, "Can a president even do this?" 1524 01:12:29,766 --> 01:12:31,400 - Yo, what's up guys? 1525 01:12:31,400 --> 01:12:34,100 I'm sure all of you guys heard the news. 1526 01:12:34,100 --> 01:12:35,566 TikTok's gettin' banned. 1527 01:12:35,566 --> 01:12:40,966 - I'm going across TikTok, and all my friends 1528 01:12:40,966 --> 01:12:42,733 are saying bye to TikTok 1529 01:12:42,733 --> 01:12:45,233 and, "It's so sad I have to leave you guys." 1530 01:12:45,233 --> 01:12:46,733 There's another one that's saying, 1531 01:12:46,733 --> 01:12:49,200 "Please follow me on all my other social media." 1532 01:12:49,200 --> 01:12:52,233 And I was fearing for my career. 1533 01:12:54,133 --> 01:12:56,233 - I have 5 million followers. How can this get banned? 1534 01:12:56,233 --> 01:12:58,466 This is my living. It's what I do. 1535 01:12:58,466 --> 01:13:01,933 - And I'm starting a video petition with #SaveTikTok. 1536 01:13:01,933 --> 01:13:05,100 - You all mean the world to me. 1537 01:13:05,100 --> 01:13:08,766 Thank you for everything. Thank you for the career. 1538 01:13:08,766 --> 01:13:13,466 Thank you for making all my beatbox dreams come true. 1539 01:13:13,466 --> 01:13:14,966 Yeah, it was tough. 1540 01:13:14,966 --> 01:13:17,433 I thought of a million different possibilities. 1541 01:13:17,433 --> 01:13:19,366 I'm like, "Maybe I gotta perform, 1542 01:13:19,366 --> 01:13:21,100 "or maybe I gotta go busk or something. 1543 01:13:21,100 --> 01:13:22,500 Like, what am I gonna do?" 1544 01:13:22,500 --> 01:13:25,233 Like, it was definitely tough for me to see, 1545 01:13:25,233 --> 01:13:27,933 because I didn't want that to happen. 1546 01:13:27,933 --> 01:13:29,400 TikTok. 1547 01:13:29,400 --> 01:13:30,933 - A few days after the executive order, 1548 01:13:30,933 --> 01:13:34,466 we hear that Microsoft is in deal talks to buy TikTok. 1549 01:13:34,466 --> 01:13:36,266 Then we started hearing, okay, well, 1550 01:13:36,266 --> 01:13:38,333 maybe Oracle wants to buy TikTok, 1551 01:13:38,333 --> 01:13:39,966 maybe all these other companies, you know, 1552 01:13:39,966 --> 01:13:41,533 because if they bought TikTok, 1553 01:13:41,533 --> 01:13:44,966 then it would no longer be owned by a Chinese company, 1554 01:13:44,966 --> 01:13:46,900 and suddenly that would be okay for Donald Trump. 1555 01:13:46,900 --> 01:13:48,133 And oh, by the way, 1556 01:13:48,133 --> 01:13:50,600 Donald Trump also wanted to take a finder's fee 1557 01:13:50,600 --> 01:13:53,600 and give some money to the Treasury, 1558 01:13:53,600 --> 01:13:55,700 which was probably the most bizarre part 1559 01:13:55,700 --> 01:13:59,133 of the entire storyline. 1560 01:13:59,133 --> 01:14:01,133 TikTok kept saying, "We're trying to find a deal. 
1561 01:14:01,133 --> 01:14:02,533 We're trying to find a deal." 1562 01:14:02,533 --> 01:14:07,166 But in the meantime, nothing was actually happening. 1563 01:14:07,166 --> 01:14:09,033 And then the Chinese government stepped in. 1564 01:14:09,033 --> 01:14:13,133 - Tonight, state media have been lashing out once again 1565 01:14:13,133 --> 01:14:14,866 saying that Beijing would 1566 01:14:14,866 --> 01:14:18,133 "undoubtedly prepare proportional countermeasures" 1567 01:14:18,133 --> 01:14:20,366 for what it says could become piracy 1568 01:14:20,366 --> 01:14:22,966 and looting by the United States. 1569 01:14:22,966 --> 01:14:26,766 - All of a sudden came this law that banned the export 1570 01:14:26,766 --> 01:14:30,800 or sale of any artificial intelligence from China. 1571 01:14:30,800 --> 01:14:34,733 ByteDance and TikTok at its core is an AI company. 1572 01:14:34,733 --> 01:14:36,866 And that was really what stopped the discussions 1573 01:14:36,866 --> 01:14:40,466 and could prevent the sale of TikTok. 1574 01:14:40,466 --> 01:14:42,733 - TikTok is one of the opening salvos 1575 01:14:42,733 --> 01:14:45,300 in an emerging battle of technology 1576 01:14:45,300 --> 01:14:49,900 between the world's two largest and most dynamic economies, 1577 01:14:49,900 --> 01:14:52,200 a new tech Cold War. 1578 01:14:52,200 --> 01:14:56,633 ♪ ♪ 1579 01:14:56,633 --> 01:14:58,833 - And then the November U.S. presidential election 1580 01:14:58,833 --> 01:15:00,466 started heating up. 1581 01:15:00,466 --> 01:15:03,700 And the story of TikTok became the biggest deal of the century 1582 01:15:03,700 --> 01:15:06,033 that never actually ended up happening, 1583 01:15:06,033 --> 01:15:09,266 because Donald Trump lost the presidency, 1584 01:15:09,266 --> 01:15:12,933 Biden took over, and we never revisited it. 1585 01:15:12,933 --> 01:15:15,566 - Well, it turns out the clock won't stop for TikTok. 1586 01:15:15,566 --> 01:15:17,933 President Biden has signed a new executive order 1587 01:15:17,933 --> 01:15:19,766 voiding the Trump-era decision 1588 01:15:19,766 --> 01:15:21,700 seeking to ban the social media app. 1589 01:15:21,700 --> 01:15:22,966 - Everybody, calm down. 1590 01:15:22,966 --> 01:15:25,233 Calm down. TikTok is not getting banned. 1591 01:15:25,233 --> 01:15:28,033 - I just love how Trump tried to ban TikTok, 1592 01:15:28,033 --> 01:15:33,366 and now TikTok has banned Trump. 1593 01:15:33,366 --> 01:15:37,466 - TikTok is just one app in what is going to be 1594 01:15:37,466 --> 01:15:41,566 a long line of new ways of communicating. 1595 01:15:41,566 --> 01:15:45,600 Can we have the reach without the vulnerability? 1596 01:15:45,600 --> 01:15:48,433 Can we have the financial independence 1597 01:15:48,433 --> 01:15:51,533 without being subject to this kind of hate? 1598 01:15:51,533 --> 01:15:55,566 And I think the solution to this lies in relationships, 1599 01:15:55,566 --> 01:15:59,533 that we need to redefine the relationships between creator, 1600 01:15:59,533 --> 01:16:02,333 consumer, tech, and government, 1601 01:16:02,333 --> 01:16:06,200 because regulations on all ends 1602 01:16:06,200 --> 01:16:09,100 are not keeping pace with the culture 1603 01:16:09,100 --> 01:16:10,933 that we're setting online. 1604 01:16:10,933 --> 01:16:12,766 ♪ ♪ 1605 01:16:12,766 --> 01:16:14,400 On the internet, we treat others 1606 01:16:14,400 --> 01:16:16,100 like they're disposable, 1607 01:16:16,100 --> 01:16:20,033 but we know that nobody is disposable. 
1608 01:16:20,033 --> 01:16:23,000 Okay, Mama, what do you think of my haircut? 1609 01:16:23,000 --> 01:16:24,766 - I love it. - You love it? 1610 01:16:24,766 --> 01:16:29,366 My mom has now been sober for a little more than four years. 1611 01:16:29,366 --> 01:16:32,333 And now we have a really great relationship. 1612 01:16:32,333 --> 01:16:34,500 I actively work to remind myself 1613 01:16:34,500 --> 01:16:37,833 that she's someone who's capable of change. 1614 01:16:41,166 --> 01:16:45,233 As a digital native, it's exhausting to grow up 1615 01:16:45,233 --> 01:16:48,366 and make mistakes in front of everyone. 1616 01:16:48,366 --> 01:16:52,066 The things you put on the internet are forever, but... 1617 01:16:52,066 --> 01:16:53,766 Hey, how's it going? 1618 01:16:53,766 --> 01:16:55,866 - I founded GenZ Girl Gang, 1619 01:16:55,866 --> 01:16:57,700 because social media can be used 1620 01:16:57,700 --> 01:17:00,400 as a community-building tool. 1621 01:17:00,400 --> 01:17:02,666 As my generation gets older 1622 01:17:02,666 --> 01:17:05,200 and we live more life documented, 1623 01:17:05,200 --> 01:17:09,100 I hope that we learn to live 1624 01:17:09,100 --> 01:17:11,166 with this technology 1625 01:17:11,166 --> 01:17:13,366 and really live with it, right? 1626 01:17:13,366 --> 01:17:18,233 Live full lives with it, live our mistakes through it, 1627 01:17:18,233 --> 01:17:20,000 that we can all create the space 1628 01:17:20,000 --> 01:17:23,600 for one another to--to change. 1629 01:17:23,600 --> 01:17:28,000 ♪ ♪ 1630 01:17:28,000 --> 01:17:31,600 - [beatboxing] 1631 01:17:31,600 --> 01:17:33,100 - I have a lot of followers. 1632 01:17:33,100 --> 01:17:37,133 Like, right now, I have 54 million as of yesterday. 1633 01:17:37,133 --> 01:17:39,100 Nice. 1634 01:17:39,100 --> 01:17:43,666 - [beatboxing] 1635 01:17:43,666 --> 01:17:44,833 - We follow you. 1636 01:17:44,833 --> 01:17:46,000 Can you take a picture with my kids? 1637 01:17:46,000 --> 01:17:48,500 - Of course I can. Hi. - Oh, my God. 1638 01:17:48,500 --> 01:17:50,533 - They're about to cry right now. 1639 01:17:50,533 --> 01:17:52,866 Aww, don't cry. 1640 01:17:52,866 --> 01:17:54,566 - Don't cry. - Don't cry. 1641 01:17:54,566 --> 01:17:57,500 - I'm happy. - Yeah, we're happy too. 1642 01:17:57,500 --> 01:17:59,766 - Aww. You're gonna make me cry. 1643 01:17:59,766 --> 01:18:01,300 - Let me take a picture really quick. 1644 01:18:01,300 --> 01:18:04,966 - [beatboxing] - [laughs] 1645 01:18:04,966 --> 01:18:07,066 My boy, Spencer! That's my boy. 1646 01:18:07,066 --> 01:18:08,066 - You guys have a great day, all right? 1647 01:18:08,066 --> 01:18:09,100 - Thank you. That was awesome. 1648 01:18:09,100 --> 01:18:10,633 - It was so great meeting you guys. 1649 01:18:10,633 --> 01:18:14,600 To me, I think fame is that support you give people 1650 01:18:14,600 --> 01:18:19,233 that didn't really have it before you existed. 1651 01:18:19,233 --> 01:18:21,100 - I follow you. - Aww. 1652 01:18:21,100 --> 01:18:22,466 Oh, let me follow you guys back. 1653 01:18:22,466 --> 01:18:23,700 - I follow you on TikTok. 1654 01:18:23,700 --> 01:18:25,633 - I just put out a music video, like-- 1655 01:18:25,633 --> 01:18:28,233 - Uh-huh, I saw that. - You did? Aww, thank you. 1656 01:18:28,233 --> 01:18:30,233 ♪ ♪ 1657 01:18:30,233 --> 01:18:34,233 I want kids to be like, "I know I can do that too. 1658 01:18:34,233 --> 01:18:37,766 I know there's a chance out there." 
1659 01:18:37,766 --> 01:18:40,366 I would have never dreamed in a million years 1660 01:18:40,366 --> 01:18:43,500 that it would happen like this. 1661 01:18:43,500 --> 01:18:47,300 TikTok has really changed my entire life. 1662 01:18:47,300 --> 01:18:51,633 ♪ ♪ 1663 01:18:51,633 --> 01:18:54,633 [percussive music] 1664 01:18:54,633 --> 01:18:56,933 ♪ ♪ 1665 01:18:56,933 --> 01:19:00,933 - It's graduation day. I'm definitely nervous. 1666 01:19:00,933 --> 01:19:05,600 I'm trying to wear my Afghan sash to graduation. 1667 01:19:05,600 --> 01:19:07,500 I was told that I can't wear the sash, 1668 01:19:07,500 --> 01:19:10,333 because it goes against dress code. 1669 01:19:10,333 --> 01:19:12,266 ♪ ♪ 1670 01:19:12,266 --> 01:19:16,300 I want to show that no matter what people say about me, 1671 01:19:16,300 --> 01:19:18,700 at the end of the day, I'm proud I'm Afghan, 1672 01:19:18,700 --> 01:19:22,700 and there's no other human being like me. 1673 01:19:22,700 --> 01:19:24,566 Take a lot. Just tap a lot. 1674 01:19:24,566 --> 01:19:28,300 And if it glitches, use your phone. 1675 01:19:28,300 --> 01:19:30,300 It feels a little embarrassing to see, like, 1676 01:19:30,300 --> 01:19:32,066 my mom and the whole family, like, 1677 01:19:32,066 --> 01:19:34,033 celebrating me graduating, 1678 01:19:34,033 --> 01:19:36,966 'cause I'm like, "Ugh, it's not that big of a deal." 1679 01:19:36,966 --> 01:19:38,833 But then I look back at it, and I'm like, 1680 01:19:38,833 --> 01:19:41,533 "Honestly, it is, 'cause I'm the first female 1681 01:19:41,533 --> 01:19:44,533 in my family to graduate high school." 1682 01:19:44,533 --> 01:19:47,333 My mother, she went to elementary school. 1683 01:19:47,333 --> 01:19:50,000 But then once the violence in Kabul, 1684 01:19:50,000 --> 01:19:51,933 Afghanistan, got too much, 1685 01:19:51,933 --> 01:19:54,500 she had to be taken out at third grade. 1686 01:19:54,500 --> 01:19:56,566 I'm the first anyone in my family 1687 01:19:56,566 --> 01:19:58,100 to even go to college now. 1688 01:19:59,100 --> 01:20:01,800 - Oh. 1689 01:20:01,800 --> 01:20:03,733 - Oh, my God. 1690 01:20:09,533 --> 01:20:11,333 [cheers and applause] 1691 01:20:11,333 --> 01:20:16,633 - Welcome to graduation for the class of 2021. 1692 01:20:16,633 --> 01:20:19,100 [cheers and applause] 1693 01:20:19,100 --> 01:20:24,400 - I didn't expect myself to go viral and be this activist. 1694 01:20:25,866 --> 01:20:28,333 What inspired me to speak up 1695 01:20:28,333 --> 01:20:31,566 was seeing those around me staying silent. 1696 01:20:31,566 --> 01:20:33,166 ♪ ♪ 1697 01:20:33,166 --> 01:20:34,933 [cheers and applause] 1698 01:20:34,933 --> 01:20:37,000 - Feroza Aziz. 1699 01:20:37,000 --> 01:20:38,666 [cheers and applause] 1700 01:20:38,666 --> 01:20:43,500 - I want to do more in the future on human rights issues. 1701 01:20:43,500 --> 01:20:46,033 And I want to do more than just speaking on social media. 1702 01:20:46,033 --> 01:20:49,900 I actually want to physically help. 1703 01:20:49,900 --> 01:20:52,000 - Congratulations. 1704 01:20:52,000 --> 01:20:56,233 [cheers and applause] 1705 01:20:57,900 --> 01:20:59,666 - Thank you. 
1706 01:20:59,666 --> 01:21:02,733 [light music] 1707 01:21:02,733 --> 01:21:09,733 ♪ ♪ 1708 01:21:09,733 --> 01:21:12,433 - TikTok has infiltrated American culture, 1709 01:21:12,433 --> 01:21:14,266 the Hollywood and entertainment system, 1710 01:21:14,266 --> 01:21:16,166 and politics, and all of these 1711 01:21:16,166 --> 01:21:18,366 different facets of American life 1712 01:21:18,366 --> 01:21:20,700 in such a deep way. 1713 01:21:20,700 --> 01:21:23,033 There's very legitimate reasons to think critically 1714 01:21:23,033 --> 01:21:25,466 about the impact that this massive 1715 01:21:25,466 --> 01:21:27,900 tech conglomerate is having on America. 1716 01:21:27,900 --> 01:21:29,700 And it's really important to think 1717 01:21:29,700 --> 01:21:31,133 about issues around data privacy 1718 01:21:31,133 --> 01:21:33,200 with all of these tech platforms. 1719 01:21:33,200 --> 01:21:34,666 - It's called the LOG OFF Movement, 1720 01:21:34,666 --> 01:21:36,166 and it's a nonprofit organization. 1721 01:21:36,166 --> 01:21:37,833 It's really been started by kids 1722 01:21:37,833 --> 01:21:41,733 for ways to promote healthy ways to exist on social media. 1723 01:21:41,733 --> 01:21:44,166 - I was really inspired by the LOG OFF Movement. 1724 01:21:44,166 --> 01:21:46,933 They're a group of high school students 1725 01:21:46,933 --> 01:21:48,566 from all over the planet. 1726 01:21:48,566 --> 01:21:50,700 They're not just telling people to spend 1727 01:21:50,700 --> 01:21:52,233 less time on the apps. 1728 01:21:52,233 --> 01:21:55,900 They're pushing back by talking to members of Congress, 1729 01:21:55,900 --> 01:21:58,633 by talking to people at the platforms themselves 1730 01:21:58,633 --> 01:22:01,366 to try to change how these systems are built. 1731 01:22:01,366 --> 01:22:04,366 Companies like TikTok need to be watched. 1732 01:22:04,366 --> 01:22:06,600 They need to be held accountable the same way 1733 01:22:06,600 --> 01:22:09,200 that we hold other institutions of power accountable. 1734 01:22:11,233 --> 01:22:14,133 - TikTok has tightened privacy measures. 1735 01:22:14,133 --> 01:22:16,433 Anybody under 15 will automatically have 1736 01:22:16,433 --> 01:22:17,766 a private account. 1737 01:22:17,766 --> 01:22:20,400 Federal regulators have already ordered the app 1738 01:22:20,400 --> 01:22:24,900 to disclose how its practices do affect young people. 1739 01:22:24,900 --> 01:22:27,100 - Personally, I don't think it's fair 1740 01:22:27,100 --> 01:22:29,033 to single out an individual company 1741 01:22:29,033 --> 01:22:31,400 just because it's popular. 1742 01:22:31,400 --> 01:22:33,200 Personally, I think it makes more sense 1743 01:22:33,200 --> 01:22:36,766 to pass cohesive laws against all companies 1744 01:22:36,766 --> 01:22:39,066 so that not only can TikTok not do some of this, 1745 01:22:39,066 --> 01:22:41,200 but neither can Facebook or Google or Amazon 1746 01:22:41,200 --> 01:22:43,700 or any of the companies regardless of nationality. 1747 01:22:43,700 --> 01:22:46,400 - [beatboxing] 1748 01:22:46,400 --> 01:22:47,966 - If the story ended today, 1749 01:22:47,966 --> 01:22:50,933 I would say, hands down, TikTok won. 1750 01:22:50,933 --> 01:22:53,400 [phone beeping] 1751 01:22:53,400 --> 01:22:56,600 All the Trump ban did was make TikTok even bigger, 1752 01:22:56,600 --> 01:22:58,400 because it caused people to download the app. 1753 01:22:58,400 --> 01:22:59,766 It caused people to talk about it.
1754 01:22:59,766 --> 01:23:01,733 And so all it did was create more growth 1755 01:23:01,733 --> 01:23:04,133 and more revenue for this Chinese company 1756 01:23:04,133 --> 01:23:05,833 that is even bigger in the U.S. 1757 01:23:05,833 --> 01:23:09,933 than it was when Trump first started going after it. 1758 01:23:09,933 --> 01:23:12,766 Now, the story is not over. 1759 01:23:12,766 --> 01:23:14,000 - [beatboxing] 1760 01:23:14,000 --> 01:23:17,433 ♪ Everybody wants to be somebody ♪ 1761 01:23:17,433 --> 01:23:19,800 ♪ Everybody wants to be somebody ♪ 1762 01:23:19,800 --> 01:23:21,100 ♪ ♪ 1763 01:23:21,100 --> 01:23:25,533 ♪ Everybody wants to be somebody ♪ 1764 01:23:25,533 --> 01:23:27,866 ♪ Everybody ♪ 1765 01:23:27,866 --> 01:23:29,033 ♪ ♪ 1766 01:23:29,033 --> 01:23:31,066 ♪ Be somebody ♪ 1767 01:23:31,066 --> 01:23:33,166 - Generations before us didn't have the same power 1768 01:23:33,166 --> 01:23:35,066 as we do now, and that's technology. 1769 01:23:35,066 --> 01:23:37,633 You have power. You can create change. 1770 01:23:37,633 --> 01:23:39,733 - ♪ Tik TikTok Boom ♪ 1771 01:23:39,733 --> 01:23:42,166 - ♪ We're the generation happened in moment ♪ 1772 01:23:42,166 --> 01:23:43,566 ♪ You thought you were in control ♪ 1773 01:23:43,566 --> 01:23:45,033 ♪ We put it into motion ♪ 1774 01:23:45,033 --> 01:23:46,500 ♪ And you can never own it ♪ 1775 01:23:46,500 --> 01:23:49,100 ♪ Look at how the sky falls down on ya ♪ 1776 01:23:49,100 --> 01:23:51,233 ♪ Under the weight of millions of our voices ♪ 1777 01:23:51,233 --> 01:23:55,500 ♪ Down on ya ♪ 1778 01:23:55,500 --> 01:23:57,966 ♪ If I knew what I know now ♪ 1779 01:23:57,966 --> 01:23:59,300 ♪ I wouldn't want to ♪ 1780 01:23:59,300 --> 01:24:01,833 ♪ Dive in with my head down ♪ 1781 01:24:01,833 --> 01:24:03,600 ♪ Oh, you'd rather stay quiet ♪ 1782 01:24:03,600 --> 01:24:06,566 ♪ Too late to go right now ♪ 1783 01:24:06,566 --> 01:24:09,266 ♪ So can you cry yourself while I ♪ 1784 01:24:09,266 --> 01:24:11,633 - ♪ Tik TikTok Boom ♪ 1785 01:24:11,633 --> 01:24:12,733 - ♪ Ohh-ahh ♪ 1786 01:24:12,733 --> 01:24:15,233 ♪ Ohh-ohh-ohh-ohh-ohh-ohh ♪ 1787 01:24:15,233 --> 01:24:16,633 ♪ Ohh-ahh ♪ 1788 01:24:16,633 --> 01:24:19,400 ♪ Ohh-ohh-ohh-ohh-ohh-ohh ♪ 1789 01:24:19,400 --> 01:24:20,633 ♪ Ohh-ahh ♪ 1790 01:24:20,633 --> 01:24:23,266 ♪ Ohh-ohh-ohh-ohh-ohh-ohh ♪ 1791 01:24:23,266 --> 01:24:25,233 ♪ ♪ 1792 01:24:25,233 --> 01:24:27,533 - ♪ Tik TikTok Boom ♪ 1793 01:24:27,533 --> 01:24:32,866 ♪ ♪ 1794 01:24:32,866 --> 01:24:35,300 ♪ Tik TikTok Boom ♪ 1795 01:24:35,300 --> 01:24:36,600 ♪ ♪ 1796 01:24:39,433 --> 01:24:48,400 ♪♪