1
00:00:00,020 --> 00:00:02,060
(THEME MUSIC PLAYS)
2
00:00:16,340 --> 00:00:19,460
If you were to choose three words
that sum up social media,
4
00:00:19,700 --> 00:00:20,700
what would they be?
5
00:00:20,700 --> 00:00:23,660
Unnecessarily, tragically harmful.
6
00:00:23,660 --> 00:00:27,860
After 20 years of unfettered growth,
7
00:00:27,860 --> 00:00:33,540
a realisation is dawning
about the true cost of social media.
8
00:00:33,540 --> 00:00:36,260
I don't think Instagram
wants to kill kids.
9
00:00:36,260 --> 00:00:38,540
I think Instagram
wants to make a ton of money,
10
00:00:38,540 --> 00:00:41,620
and they don't care if they happen
to kill some kids along the way.
11
00:00:42,580 --> 00:00:44,380
This social experiment
12
00:00:44,380 --> 00:00:48,300
has had the most profound
consequences for young people.
13
00:00:48,300 --> 00:00:51,940
LORI SCHOTT: We let a stranger
into our house every night
14
00:00:51,940 --> 00:00:54,660
that does the most horrific things
15
00:00:54,660 --> 00:00:57,140
and can manipulate a child's mind.
16
00:00:58,380 --> 00:01:00,620
ARTURO BEJAR:
This is an urgent crisis.
17
00:01:00,620 --> 00:01:03,820
Millions of teens are having
their mental health compromised.
18
00:01:04,780 --> 00:01:08,420
I think Mark is responsible
for what happened to these kids.
19
00:01:08,420 --> 00:01:12,500
Would you like to apologise for what
you've done to these good people?
20
00:01:12,500 --> 00:01:13,260
FAITH MURPHY: Fix it.
21
00:01:13,260 --> 00:01:19,060
Your algorithms are targeting
the mentally unwell and children.
22
00:01:19,060 --> 00:01:23,660
Those algorithms
are also spreading disinformation.
24
00:01:23,900 --> 00:01:25,260
(INDISTINCT CHANTING)
25
00:01:25,260 --> 00:01:27,740
On the eve of an American election,
26
00:01:27,740 --> 00:01:30,500
democracy is under threat.
27
00:01:30,500 --> 00:01:33,380
Insiders are sounding the alarm.
29
00:01:33,620 --> 00:01:35,860
EDWARD PEREZ: It's a feeling
like you are in a car
30
00:01:35,860 --> 00:01:39,660
hurtling toward a brick wall or
some danger that you know is coming.
31
00:01:41,820 --> 00:01:46,100
Online anger
is spilling onto the streets.
32
00:01:46,100 --> 00:01:48,780
These were
our first social-media riots.
33
00:01:49,740 --> 00:01:54,780
Inside these vast, soulless
business parks of Silicon Valley
34
00:01:54,780 --> 00:01:57,580
they're shaping
the way the rest of us live.
35
00:01:57,580 --> 00:02:00,100
But it's not all likes
and cat videos.
36
00:02:00,100 --> 00:02:02,740
Around the world
momentum is building
38
00:02:03,020 --> 00:02:06,940
to finally hold
social-media companies to account.
40
00:02:07,180 --> 00:02:09,100
In this episode of Four Corners
42
00:02:09,340 --> 00:02:14,420
we speak to some of the key players
in the fight to make that happen.
43
00:02:25,260 --> 00:02:29,340
(BIRD SQUALLS)
44
00:02:31,340 --> 00:02:38,860
(SOFT PRAYER ON LOUDSPEAKER)
45
00:02:40,140 --> 00:02:42,700
It's early morning at a mosque
47
00:02:42,940 --> 00:02:47,780
in the Merseyside town of Southport
in England's North West.
48
00:02:50,220 --> 00:02:54,860
Don't attack other people
for no reason.
49
00:02:54,860 --> 00:03:00,460
Don't harm
the community that you're living in.
50
00:03:00,460 --> 00:03:01,220
Don't harm...
51
00:03:01,220 --> 00:03:03,300
We've come to visit this community
52
00:03:03,300 --> 00:03:06,580
as it recovers
from a frightening attack.
53
00:03:13,100 --> 00:03:14,100
Allah! Allah!
54
00:03:14,100 --> 00:03:15,780
Who the fuck is Allah?!
56
00:03:15,980 --> 00:03:17,260
In July
57
00:03:17,260 --> 00:03:19,620
Southport imam, Ibrahim Hussein,
58
00:03:19,620 --> 00:03:22,220
was here with seven other worshippers
59
00:03:22,220 --> 00:03:24,100
when a mob descended.
60
00:03:30,340 --> 00:03:32,540
(LAUGHTER)
61
00:03:32,540 --> 00:03:36,500
(INDISTINCT YELLING)
62
00:03:36,500 --> 00:03:38,980
IBRAHIM HUSSEIN:
We are no match to them.
63
00:03:38,980 --> 00:03:41,740
We're feeling intimidated,
frightened.
64
00:03:41,740 --> 00:03:43,180
(INDISTINCT YELLING)
65
00:03:43,180 --> 00:03:46,500
And then, of course, they got vocal,
they got louder.
67
00:03:46,740 --> 00:03:50,060
(INDISTINCT YELLING)
68
00:03:51,020 --> 00:03:55,900
The imam and his men
barricaded themselves inside.
70
00:03:56,140 --> 00:03:59,420
We went to watch the CCTV
71
00:03:59,420 --> 00:04:01,860
and then very soon we discovered
73
00:04:02,100 --> 00:04:05,540
that the smoke and the fire
has been thrown at the window
75
00:04:05,780 --> 00:04:08,340
and all the smoke is coming inside.
76
00:04:08,340 --> 00:04:11,100
If we didn't die
in the hands of the mob,
77
00:04:11,100 --> 00:04:14,460
we definitely
going to suffocate with the smoke.
79
00:04:14,740 --> 00:04:18,060
There is young lads that were here
that have got young families,
80
00:04:18,060 --> 00:04:20,860
and they think
they're not going to see them again.
82
00:04:21,140 --> 00:04:22,740
They thought they were going to die?
Yeah.
83
00:04:23,700 --> 00:04:27,340
Outside, the chaos escalated.
84
00:04:27,340 --> 00:04:30,340
It looks like we were in a war zone.
85
00:04:30,340 --> 00:04:32,220
Because of a lie on social media?
86
00:04:32,220 --> 00:04:33,220
Because of a lie.
88
00:04:33,460 --> 00:04:37,460
Is social media to blame
for what happened at your mosque?
90
00:04:37,700 --> 00:04:38,620
Most definitely.
91
00:04:41,940 --> 00:04:44,780
The attack on the Southport Mosque
92
00:04:44,780 --> 00:04:49,620
was the culmination of a concerted
disinformation campaign.
93
00:04:49,620 --> 00:04:54,660
It began after three little girls
were stabbed in Southport
95
00:04:54,900 --> 00:04:57,140
at a Taylor Swift dance class.
97
00:04:57,380 --> 00:05:01,060
VOICEOVER: Three children who were
just dancing, and now they're gone.
98
00:05:01,060 --> 00:05:03,140
REPORTER: The Home Secretary
had earlier warned
99
00:05:03,140 --> 00:05:05,660
about disinformation
linked to the attack.
100
00:05:07,220 --> 00:05:11,140
Social media didn't create
the racial tensions in Britain,
101
00:05:11,140 --> 00:05:16,260
but it's given extremists
a powerful tool to spread hate.
102
00:05:17,220 --> 00:05:22,620
Nazir Afzal is a former chief
prosecutor for North West England.
103
00:05:22,620 --> 00:05:24,420
He brought to justice
104
00:05:24,420 --> 00:05:27,860
the perpetrators of violent protests
in the past,
106
00:05:28,100 --> 00:05:30,060
but he's seen nothing like this.
107
00:05:31,420 --> 00:05:34,820
NAZIR AFZAL: There was an enormous
amount of comment on social media
108
00:05:34,820 --> 00:05:37,460
as to the identification
of the perpetrator.
110
00:05:37,700 --> 00:05:38,980
"This person's a Muslim,"
111
00:05:38,980 --> 00:05:42,060
"This person's an asylum seeker,"
112
00:05:42,060 --> 00:05:43,580
language of that nature,
113
00:05:43,580 --> 00:05:46,220
which was wrong in all respects.
114
00:05:46,220 --> 00:05:49,780
You know, disinformation is not
freedom of speech, it's lying,
115
00:05:49,780 --> 00:05:51,700
and openly lying
116
00:05:51,700 --> 00:05:55,420
in order to encourage people
to do something nefarious.
117
00:05:55,420 --> 00:06:00,180
In fact, the alleged perpetrator,
who was ultimately arrested,
118
00:06:00,180 --> 00:06:04,740
was born in Britain
to a Christian family.
119
00:06:04,740 --> 00:06:08,260
Everybody forgot the three
little girls that had been murdered.
121
00:06:08,500 --> 00:06:09,740
You know, it really shocks me.
122
00:06:09,740 --> 00:06:13,260
That was their motivation,
allegedly.
124
00:06:13,500 --> 00:06:14,580
But they didn't care.
125
00:06:14,580 --> 00:06:17,980
It literally was simply an attack
on black and brown people
127
00:06:18,220 --> 00:06:19,460
in this country
128
00:06:19,460 --> 00:06:21,620
because they were black and brown.
129
00:06:21,620 --> 00:06:23,420
What I DO know
130
00:06:23,420 --> 00:06:27,700
is something is going horribly wrong
in our once beautiful country.
132
00:06:27,940 --> 00:06:28,940
Missile!
133
00:06:30,260 --> 00:06:35,980
It was the worst civil unrest
in the UK in more than a decade.
134
00:06:45,700 --> 00:06:48,460
These were our first
social-media riots.
136
00:06:48,700 --> 00:06:50,140
(INDISTINCT YELLING)
138
00:06:50,380 --> 00:06:52,100
People were using social media
to create fear.
139
00:06:54,540 --> 00:06:58,300
The disinformation
about the attacker's identity
140
00:06:58,300 --> 00:07:04,220
had spread after a series of posts
on X, formerly known as Twitter.
141
00:07:04,220 --> 00:07:08,780
The posts were amplified
by notorious influencers
142
00:07:08,780 --> 00:07:10,620
linked to the far right,
143
00:07:10,620 --> 00:07:12,580
people like Andrew Tate.
144
00:07:13,540 --> 00:07:15,500
So, an undocumented migrant
145
00:07:15,500 --> 00:07:19,620
decided to go into
a Taylor Swift dance class today.
146
00:07:19,620 --> 00:07:23,500
That's right, somebody arrived
in the UK on a boat...
148
00:07:23,780 --> 00:07:28,700
Shadowy extremist accounts
on Telegram and others on X
149
00:07:28,700 --> 00:07:31,620
with hundreds of thousands
of followers
151
00:07:31,860 --> 00:07:34,380
pushed the narrative further.
152
00:07:34,380 --> 00:07:37,540
X owner, Elon Musk, weighed in,
154
00:07:37,780 --> 00:07:41,340
responding directly
to the anti-immigration account
156
00:07:41,580 --> 00:07:43,820
Europe Invasion.
157
00:07:43,820 --> 00:07:46,100
JOE MULHALL:
The idea that Musk would allow
158
00:07:46,100 --> 00:07:47,540
an account like that on the platform
159
00:07:47,540 --> 00:07:49,940
is a major problem in and of itself.
160
00:07:49,940 --> 00:07:51,500
Musk, the owner of the platform,
161
00:07:51,500 --> 00:07:55,580
was amplifying a voice that was
an extremely dangerous voice
162
00:07:55,580 --> 00:07:56,820
that contributed to the violence
164
00:07:57,060 --> 00:07:58,900
we saw in the streets
over the coming days.
165
00:07:59,860 --> 00:08:02,940
Things really just started
in Southport, but it...
166
00:08:02,940 --> 00:08:05,580
Joe Mulhall monitors
the extreme right
167
00:08:05,580 --> 00:08:10,180
for anti-racism advocacy group
Hope Not Hate.
168
00:08:11,260 --> 00:08:14,420
So, you'd get some people
putting it on X like this,
169
00:08:14,420 --> 00:08:16,980
but you'd also get
live streaming happening on TikTok.
170
00:08:16,980 --> 00:08:22,420
He was aghast to see
Elon Musk post during the riots
171
00:08:22,420 --> 00:08:26,300
to his more
than 200 million followers
172
00:08:26,300 --> 00:08:29,420
that "Civil war is inevitable."
173
00:08:29,420 --> 00:08:32,740
What happened in the days
following that attack in Southport
174
00:08:32,740 --> 00:08:36,260
had disastrous effects on
our streets and in our communities,
175
00:08:36,260 --> 00:08:37,940
and it wasn't Musk
putting his hands up
176
00:08:37,940 --> 00:08:39,100
and saying, "This is free speech."
177
00:08:39,100 --> 00:08:41,500
He was engaging in it himself.
178
00:08:42,780 --> 00:08:44,580
Some of the biggest players
180
00:08:44,820 --> 00:08:47,580
spreading misinformation,
disinformation and hatred
181
00:08:47,580 --> 00:08:50,340
in the period
running up to and during the riots
182
00:08:50,340 --> 00:08:53,660
were individuals that had previously
been removed from Twitter
183
00:08:53,660 --> 00:08:56,180
but Elon Musk allowed back
onto the platform.
184
00:08:57,540 --> 00:08:59,780
As a social-media riot,
185
00:08:59,780 --> 00:09:01,780
we're going to have to look back
at this
186
00:09:01,780 --> 00:09:03,780
and see what we can learn from this.
187
00:09:03,780 --> 00:09:05,980
Platforms themselves,
social-media platforms,
189
00:09:06,220 --> 00:09:08,460
have a responsibility,
190
00:09:08,460 --> 00:09:11,380
and we didn't see it
being exercised.
191
00:09:11,380 --> 00:09:13,660
You say these were the first
social-media riots.
193
00:09:13,900 --> 00:09:15,060
Will they be the last?
194
00:09:15,060 --> 00:09:17,620
I have no doubt that there
will be other social-media riots.
195
00:09:19,460 --> 00:09:23,180
To understand
why Britain erupted in this way,
196
00:09:23,180 --> 00:09:26,340
it's instructive
to meet a far-right figure,
197
00:09:26,340 --> 00:09:30,020
like Patriotic Alternative's
Mark Collett.
198
00:09:30,980 --> 00:09:36,660
Mr Collett uses social media
to disseminate his xenophobic agenda.
199
00:09:37,620 --> 00:09:40,020
His anti-immigration rhetoric
201
00:09:40,260 --> 00:09:43,420
has seen him banned
from Facebook and X,
202
00:09:43,420 --> 00:09:47,780
but he supports where Elon Musk
is taking social media.
204
00:09:48,020 --> 00:09:49,380
MARK COLLETT: I think social media
206
00:09:49,580 --> 00:09:51,180
IS important.
207
00:09:51,180 --> 00:09:52,740
It does give you a greater reach.
208
00:09:52,740 --> 00:09:55,780
And since Elon Musk
has taken over on Twitter
210
00:09:56,060 --> 00:10:00,700
there have been smaller accounts
that have grown very large,
212
00:10:00,940 --> 00:10:02,420
and they are saying things
214
00:10:02,660 --> 00:10:04,180
that under
the old Twitter establishment
215
00:10:04,180 --> 00:10:05,700
they wouldn't have been allowed
to say,
217
00:10:05,940 --> 00:10:07,180
and I think that's something
218
00:10:07,180 --> 00:10:09,500
that the British government
is very scared about
219
00:10:09,500 --> 00:10:11,260
because they want to do something
220
00:10:11,260 --> 00:10:16,140
to curtail Elon Musk's drive
for free speech.
221
00:10:18,420 --> 00:10:21,300
As the Southport riots broke out,
222
00:10:21,300 --> 00:10:24,820
Mark Collett took to social media.
223
00:10:24,820 --> 00:10:29,620
He linked the attacks
on the little girls to immigration,
225
00:10:29,860 --> 00:10:32,900
telling his followers:
226
00:10:38,620 --> 00:10:41,260
That was a false claim.
227
00:10:41,260 --> 00:10:44,020
He also wrote:
228
00:10:52,060 --> 00:10:53,540
So, you stand by those words?
229
00:10:53,540 --> 00:10:54,540
Oh, of course.
230
00:10:54,540 --> 00:10:57,660
I mean, I'm not blaming
the immigrant community here.
231
00:10:57,660 --> 00:10:59,340
People are saying,
"We've had enough,
232
00:10:59,340 --> 00:11:00,540
"we've had enough,
we've had enough."
233
00:11:00,540 --> 00:11:02,020
Another thing that you've said
in the past
234
00:11:02,020 --> 00:11:04,780
is "Diversity is a hate crime
against white people."
235
00:11:04,780 --> 00:11:07,900
Did I specifically say that?
You were quoted as saying that.
236
00:11:07,900 --> 00:11:09,460
Do you...
I mean, it's a reasonable quote.
238
00:11:09,700 --> 00:11:12,060
I mean, I'll take it if you want.
239
00:11:12,060 --> 00:11:16,340
Diversity is something
that is bad for white people
240
00:11:16,340 --> 00:11:20,140
because every time somebody says,
"This has become more diverse,"
241
00:11:20,140 --> 00:11:21,620
it means fewer white people.
243
00:11:21,860 --> 00:11:24,140
This is a predominantly
white country
244
00:11:24,140 --> 00:11:26,380
with predominantly
white institutions,
245
00:11:26,380 --> 00:11:29,300
with enormous amounts
of institutional racism
247
00:11:29,540 --> 00:11:30,940
in our system.
248
00:11:30,940 --> 00:11:35,460
So, the idea that somehow
this is against him and his people
249
00:11:35,460 --> 00:11:37,100
is...it's just nonsense.
250
00:11:38,060 --> 00:11:41,300
At Southport there was someone there
wearing a T-shirt
251
00:11:41,300 --> 00:11:43,380
that was very clearly linked
to Patriotic Alternative.
253
00:11:43,620 --> 00:11:45,500
So, that's a dead giveaway.
254
00:11:45,500 --> 00:11:46,860
Hope Not Hate
255
00:11:46,860 --> 00:11:52,580
discovered Patriotic Alternative
figures attended the Southport riots.
256
00:11:52,580 --> 00:11:55,500
We've proved
by analysing footage of the riots
258
00:11:55,780 --> 00:11:58,100
that individuals linked
to Patriotic Alternative,
259
00:11:58,100 --> 00:12:00,780
that have been active
with Patriotic Alternative,
260
00:12:00,780 --> 00:12:02,580
engaged in the violence
on those days,
262
00:12:02,820 --> 00:12:04,820
that were present at those riots.
263
00:12:04,820 --> 00:12:06,860
And that comes as no surprise.
264
00:12:06,860 --> 00:12:09,340
Mark Collett paints himself
265
00:12:09,340 --> 00:12:11,740
as an advocate for, as he puts it,
266
00:12:11,740 --> 00:12:14,820
the 'indigenous white people'
of Britain.
268
00:12:15,060 --> 00:12:17,740
But his views are far more sinister.
270
00:12:17,980 --> 00:12:20,860
So, you still think
that Hitler wasn't such a bad guy?
272
00:12:21,100 --> 00:12:23,380
I don't know him, I never met him,
273
00:12:23,380 --> 00:12:26,500
so, I don't have
firsthand experience.
274
00:12:26,500 --> 00:12:29,820
He did order the extermination
of six million Jews.
276
00:12:30,060 --> 00:12:31,340
Did he?
277
00:12:31,340 --> 00:12:32,940
You don't think he did?
278
00:12:32,940 --> 00:12:37,180
I don't know. I mean, I think
it's an interesting question to ask.
279
00:12:40,460 --> 00:12:44,420
Holocaust denial is not
just historically inaccurate,
281
00:12:44,660 --> 00:12:47,260
it's deeply hurtful.
282
00:12:47,260 --> 00:12:51,260
We've grappled with whether
to include these comments at all.
283
00:12:53,420 --> 00:12:56,180
Joe Mulhall believes it's important
284
00:12:56,180 --> 00:13:02,300
to never sanitise the darkest beliefs
of these far-right figures.
285
00:13:04,500 --> 00:13:07,860
Mark Collett is a long-standing
Holocaust denier.
287
00:13:08,100 --> 00:13:09,460
He has done for many years.
288
00:13:09,460 --> 00:13:11,940
He's an extraordinarily
extreme figure.
289
00:13:11,940 --> 00:13:15,340
He might present himself sometimes
as a moderate figure,
291
00:13:15,580 --> 00:13:17,260
but there is so much evidence
292
00:13:17,260 --> 00:13:19,900
about just how extreme
and how pro-Nazi
294
00:13:20,140 --> 00:13:21,700
he actually is.
295
00:13:22,660 --> 00:13:25,180
This anti-immigration rhetoric
296
00:13:25,180 --> 00:13:27,780
has had serious consequences
297
00:13:27,780 --> 00:13:30,580
for families like Naveed Mukhtar's.
298
00:13:31,540 --> 00:13:34,460
Never imagined in my life
that that day will come
300
00:13:34,700 --> 00:13:36,300
when I will have to leave my home.
301
00:13:38,300 --> 00:13:40,860
Mr Mukhtar is a migration lawyer
302
00:13:40,860 --> 00:13:43,980
and he runs his business
out of his family home.
304
00:13:44,220 --> 00:13:46,020
(INDISTINCT CHATTER)
305
00:13:46,020 --> 00:13:47,300
During the riots
306
00:13:47,300 --> 00:13:49,380
his business was named on a list
307
00:13:49,380 --> 00:13:53,020
shared on the social-media platform
Telegram,
308
00:13:53,020 --> 00:13:56,420
forcing the family
to leave their home.
309
00:13:59,260 --> 00:14:02,260
There was racist people.
310
00:14:03,700 --> 00:14:05,700
What do you know about racism?
311
00:14:06,660 --> 00:14:09,900
That they're...
312
00:14:09,900 --> 00:14:16,020
..not respecting
other people's religions and faith.
313
00:14:16,980 --> 00:14:19,220
And how do you feel about that?
314
00:14:21,220 --> 00:14:23,940
Offended and angry.
316
00:14:24,180 --> 00:14:26,420
Do you feel safe
when you go outside?
317
00:14:26,420 --> 00:14:28,660
Are you able to go outside now?
318
00:14:28,660 --> 00:14:30,260
Not much.
319
00:14:30,260 --> 00:14:33,820
I still have the fear
that something is going to happen
320
00:14:33,820 --> 00:14:35,860
because these people
can just do anything.
322
00:14:36,100 --> 00:14:38,100
They have so many people with them.
323
00:14:39,620 --> 00:14:42,940
The family lives in Stoke-on-Trent,
324
00:14:42,940 --> 00:14:45,220
just over an hour from Southport.
325
00:14:45,220 --> 00:14:48,580
Mr Mukhtar says, since the riots,
326
00:14:48,580 --> 00:14:54,460
the already simmering racial tensions
in this area have worsened.
327
00:14:54,460 --> 00:14:58,140
Since these riots
started in the country,
328
00:14:58,140 --> 00:15:01,300
racism has increased multiple times.
330
00:15:01,540 --> 00:15:04,980
We can't leave our home
thinking that we are safe,
332
00:15:05,260 --> 00:15:09,220
and I reported to the police many
times and to the local city council,
333
00:15:09,220 --> 00:15:12,500
but, unfortunately,
haven't received any help at all.
334
00:15:13,620 --> 00:15:17,660
My children cannot go
and play in the street anymore.
336
00:15:17,900 --> 00:15:20,780
We are just living in a cage.
338
00:15:21,020 --> 00:15:24,220
That hate which has been spread
in the whole country
340
00:15:24,460 --> 00:15:25,980
is affecting families like us
341
00:15:25,980 --> 00:15:28,100
who just want to work.
342
00:15:28,100 --> 00:15:31,140
What would you say
to social-media proprietors,
343
00:15:31,140 --> 00:15:33,020
you know, the likes of Elon Musk?
344
00:15:33,020 --> 00:15:36,660
If you want to make profits,
take more responsibility.
345
00:15:37,620 --> 00:15:39,500
JOE MULHALL: The social media
347
00:15:39,740 --> 00:15:42,460
are fundamentally culpable
for what's happening.
348
00:15:43,620 --> 00:15:47,420
These are vast international
companies with huge resources,
349
00:15:47,420 --> 00:15:48,500
and as yet
351
00:15:48,740 --> 00:15:50,620
they refuse to,
or they are doing it too slowly,
352
00:15:50,620 --> 00:15:52,660
to deal with the issues
on these platforms
353
00:15:52,660 --> 00:15:55,660
that are having real-world effects
on people on our streets.
354
00:16:03,140 --> 00:16:05,860
Andrew Kaung saw this up close
355
00:16:05,860 --> 00:16:11,540
when he worked as a planning analyst
for the social-media company TikTok.
356
00:16:11,540 --> 00:16:14,780
ANDREW KAUNG:
I have seen so many videos,
357
00:16:14,780 --> 00:16:17,900
people dying, people, you know,
committing suicide,
359
00:16:18,140 --> 00:16:20,100
people getting beheaded
361
00:16:20,340 --> 00:16:22,860
on a platform that is targeted
towards younger generation
363
00:16:23,100 --> 00:16:25,140
and younger people.
364
00:16:25,140 --> 00:16:26,820
It's horrible.
366
00:16:27,020 --> 00:16:30,220
Go away! We don't want you here!
368
00:16:30,420 --> 00:16:32,020
"We are far too soft."
370
00:16:32,260 --> 00:16:33,740
Pakistan!
(INDISTINCT CHANT)
371
00:16:33,740 --> 00:16:38,340
TikTok says it has strict controls
for those under 18.
372
00:16:38,340 --> 00:16:43,420
We set up an account
for an adult male TikTok user
373
00:16:43,420 --> 00:16:47,460
and entered the search term
'immigration UK'
374
00:16:47,460 --> 00:16:49,700
to see where the algorithm took us.
375
00:16:49,700 --> 00:16:51,620
They make all these mosques and...
376
00:16:51,620 --> 00:16:56,700
Within minutes the suggested content
turned xenophobic and dark.
378
00:16:56,940 --> 00:16:59,820
"Do Muslims have to pay council tax?"
379
00:16:59,820 --> 00:17:01,140
Congress are going to be Muslims.
380
00:17:01,140 --> 00:17:03,700
Soon England won't be England,
it will be Islam.
381
00:17:06,260 --> 00:17:08,220
The more time you spend on it,
383
00:17:08,460 --> 00:17:10,860
you're going to go down further
into the rabbit hole.
384
00:17:10,860 --> 00:17:16,700
That's kind of how they keep you
addicted, you know, entertained.
385
00:17:16,700 --> 00:17:20,580
CHOIR: (SINGS)
# There is a name I love to hear
386
00:17:20,580 --> 00:17:24,980
# I love to sing its worth
387
00:17:24,980 --> 00:17:28,620
# It sounds like music in my ear
388
00:17:28,620 --> 00:17:32,340
# The sweetest name on earth... #
389
00:17:32,340 --> 00:17:36,460
Thousands of kilometres
across the North Atlantic,
390
00:17:36,460 --> 00:17:40,740
the United States
is on the cusp of an election
391
00:17:40,740 --> 00:17:45,940
where social media
is more pivotal than ever before.
392
00:17:48,620 --> 00:17:51,260
One of the biggest challenges
that I face
393
00:17:51,260 --> 00:17:52,420
is mis- and disinformation.
395
00:17:52,620 --> 00:17:54,500
Joe taught me rule number one -
396
00:17:54,500 --> 00:17:56,980
carefully hide
your total incompetence.
397
00:18:03,620 --> 00:18:07,300
We tracked down
the tech company insiders
398
00:18:07,300 --> 00:18:11,260
once responsible
for managing this minefield,
399
00:18:11,260 --> 00:18:15,140
and they believe
social-media disinformation
400
00:18:15,140 --> 00:18:19,980
is tearing at the very heart
of American democracy.
401
00:18:19,980 --> 00:18:21,900
EDWARD PEREZ:
My name is Edward Perez.
403
00:18:22,140 --> 00:18:23,700
Most people call me Eddie.
404
00:18:23,700 --> 00:18:28,220
I used to be the director of product
management for information integrity
405
00:18:28,220 --> 00:18:29,580
at Twitter.
406
00:18:29,580 --> 00:18:31,700
I don't think
it's an exaggeration at all
407
00:18:31,700 --> 00:18:36,540
to say that this is potentially the
most important election in the US,
408
00:18:36,540 --> 00:18:39,980
certainly since
our civil war last century.
409
00:18:39,980 --> 00:18:43,420
It's a feeling like you are in a car
hurtling toward a brick wall
410
00:18:43,420 --> 00:18:46,460
or some danger
that you know is coming,
411
00:18:46,460 --> 00:18:49,300
and there's not a lot
that can be done to stop it.
412
00:18:49,300 --> 00:18:52,300
(SIREN WAILS)
(DUCK QUACKS)
413
00:18:53,260 --> 00:18:54,580
At Twitter
414
00:18:54,580 --> 00:18:59,580
Eddie Perez led the team tasked
with tackling false information
415
00:18:59,580 --> 00:19:01,620
on politics and public health
416
00:19:01,620 --> 00:19:04,940
and monitoring extremism.
417
00:19:04,940 --> 00:19:07,340
He left in 2022,
418
00:19:07,340 --> 00:19:12,940
just before Elon Musk took over
and slashed thousands of jobs.
419
00:19:14,220 --> 00:19:18,180
This is now a company that, instead
of having roughly 8,000 people,
421
00:19:18,420 --> 00:19:20,940
it has approximately 2,000, I think.
422
00:19:20,940 --> 00:19:24,060
So, there's a tremendous amount
of institutional knowledge
423
00:19:24,060 --> 00:19:25,020
that has been lost.
424
00:19:25,020 --> 00:19:28,380
In terms of infrastructure
and elections in particular,
425
00:19:28,380 --> 00:19:33,260
even the attempt to try
to achieve accuracy of information,
427
00:19:33,500 --> 00:19:35,420
those standards are now gone.
429
00:19:35,660 --> 00:19:38,260
The attempt to balance
freedom of speech
430
00:19:38,260 --> 00:19:41,140
with quality of information
and reduction of harm
431
00:19:41,140 --> 00:19:43,740
was a very, very serious part
of our mission.
433
00:19:44,020 --> 00:19:47,180
And in its place
Twitter today is...
434
00:19:47,180 --> 00:19:50,980
..the most generous thing
you can say is a wild, wild west.
435
00:19:50,980 --> 00:19:53,500
FRANCES HAUGEN: They know
they do far too little about it...
436
00:19:53,500 --> 00:19:57,540
The concerns about democracy
aren't confined to X.
437
00:19:57,540 --> 00:20:00,460
Facebook is a company that has paid
for its immense profits
439
00:20:00,700 --> 00:20:01,940
with our safety.
440
00:20:01,940 --> 00:20:06,820
Frances Haugen was
a product manager at what's now Meta.
442
00:20:07,100 --> 00:20:09,420
CROWD: (CHANTS)
Freedom! Freedom! Freedom!
443
00:20:09,420 --> 00:20:15,100
After the chaos of the 2020 election
and its violent aftermath,
445
00:20:15,380 --> 00:20:19,740
she leaked thousands of pages
of company documents.
446
00:20:19,740 --> 00:20:20,620
(INDISTINCT YELLING)
447
00:20:20,620 --> 00:20:22,100
So, at Facebook,
449
00:20:22,340 --> 00:20:24,860
there was one area of the company,
Civic Integrity,
450
00:20:24,860 --> 00:20:26,700
that was responsible
for trying to make sure
451
00:20:26,700 --> 00:20:29,380
that Facebook
was a positive force in the world.
453
00:20:29,660 --> 00:20:32,100
When they dissolved that team
right after the election,
454
00:20:32,100 --> 00:20:35,220
less than 30 days after people voted
in the United States,
456
00:20:35,460 --> 00:20:36,580
I knew that Facebook
457
00:20:36,580 --> 00:20:38,860
wasn't going to be able
to heal itself on its own.
458
00:20:38,860 --> 00:20:42,380
The woman was talking about a change
that has been going on Instagram...
459
00:20:42,380 --> 00:20:46,300
Frances Haugen says
social-media companies like Meta
461
00:20:46,540 --> 00:20:48,580
are taking Elon Musk's lead
462
00:20:48,580 --> 00:20:50,660
in slashing staff
464
00:20:50,900 --> 00:20:56,020
hired to stop disinformation
from being disseminated.
465
00:20:56,020 --> 00:20:58,780
Seeing that you could fire
huge numbers of people
466
00:20:58,780 --> 00:21:01,340
and not face
really any meaningful consequences,
468
00:21:01,580 --> 00:21:03,260
Mark Zuckerberg followed
469
00:21:03,260 --> 00:21:05,220
and fired 20,000 employees.
470
00:21:05,220 --> 00:21:08,140
They closed the only mechanism
for transparency
471
00:21:08,140 --> 00:21:11,540
that existed for either Facebook
or Instagram.
472
00:21:11,540 --> 00:21:12,860
During the US election season
473
00:21:12,860 --> 00:21:15,340
there will be no tool available
from Facebook
474
00:21:15,340 --> 00:21:17,140
to be able to see
what's going on on the platform
476
00:21:17,380 --> 00:21:18,460
even in a moderate way.
478
00:21:18,700 --> 00:21:21,700
So, I would say
Musk has done a very dangerous thing
479
00:21:21,700 --> 00:21:23,740
in terms
of establishing a precedent.
480
00:21:24,700 --> 00:21:28,780
Meta told us
it now has 40,000 people
481
00:21:28,780 --> 00:21:31,180
working in safety and security,
482
00:21:31,180 --> 00:21:34,740
more than it did in 2020.
483
00:21:34,740 --> 00:21:38,100
DONALD TRUMP: Where is...
Where is he? Come on up here, Elon!
485
00:21:38,340 --> 00:21:40,980
(CHEERING)
487
00:21:41,220 --> 00:21:44,260
ELON MUSK: We had one president who
couldn't climb a flight of stairs...
489
00:21:44,500 --> 00:21:45,260
(LAUGHTER)
490
00:21:45,260 --> 00:21:49,260
..and another who was fist-pumping
after getting shot.
491
00:21:49,260 --> 00:21:51,300
(CHEERING)
492
00:21:51,300 --> 00:21:52,860
Fight! Fight! Fight!
493
00:21:52,860 --> 00:21:55,260
One of the things that has most
494
00:21:55,260 --> 00:21:59,420
turned Eddie Perez off
the platform he once worked for
495
00:21:59,420 --> 00:22:02,940
is how
X has been politically weaponised
497
00:22:03,180 --> 00:22:05,580
by its new owner, Elon Musk.
498
00:22:06,540 --> 00:22:12,740
Mr Musk's posts have become
ardently supportive of Donald Trump
499
00:22:12,740 --> 00:22:17,020
and sometimes disturbing in tone.
500
00:22:17,020 --> 00:22:19,260
I made a joke which I realised...
I deleted,
501
00:22:19,260 --> 00:22:22,300
which is like, "Nobody's even
BOTHERING to try to kill Kamala"
502
00:22:22,300 --> 00:22:24,820
because it's pointless.
(BOTH LAUGH)
503
00:22:24,820 --> 00:22:27,180
What do you achieve? Nothing.
Just buy another puppet.
504
00:22:27,180 --> 00:22:29,060
Elon Musk, thank you very much.
You're welcome.
505
00:22:31,740 --> 00:22:35,060
Would the posts of Elon Musk
506
00:22:35,060 --> 00:22:37,340
violate the sort of rules
507
00:22:37,340 --> 00:22:40,260
that you and your colleagues
were putting in place?
508
00:22:40,260 --> 00:22:42,300
Certainly
some of them definitely would.
510
00:22:42,540 --> 00:22:44,020
It disturbs me the most
512
00:22:44,260 --> 00:22:47,860
because, to me,
what it is actually illustrating
513
00:22:47,860 --> 00:22:51,260
are the really, really
pernicious ends
515
00:22:51,500 --> 00:22:53,220
that come with tremendous wealth
516
00:22:53,220 --> 00:22:56,660
and with tremendous power
and with influence,
517
00:22:56,660 --> 00:23:00,500
and when it is paired
with absolute recklessness.
519
00:23:00,780 --> 00:23:07,180
X has a very harmful place
in the current media ecosystem.
520
00:23:07,180 --> 00:23:10,460
Whether people are actively
participating in it or not,
521
00:23:10,460 --> 00:23:13,300
the ripple effects from that
will be felt.
522
00:23:13,300 --> 00:23:14,900
NEWSREADER: Unverified claims
523
00:23:14,900 --> 00:23:18,300
about Haitian immigrants
killing and eating people's pets.
524
00:23:18,300 --> 00:23:23,500
Eddie Perez says one of the most
powerful illustrations of this
525
00:23:23,500 --> 00:23:28,700
was the false claim whipped up
on social media, posted by Elon Musk,
526
00:23:28,700 --> 00:23:33,900
that migrants in Springfield, Ohio,
were eating pets...
528
00:23:34,140 --> 00:23:35,020
See, I'm a different kind of...
529
00:23:35,020 --> 00:23:39,140
..a false claim that was repeated
in the presidential debate
530
00:23:39,140 --> 00:23:42,620
by Republican candidate Donald Trump.
531
00:23:42,620 --> 00:23:44,540
In Springfield
532
00:23:44,540 --> 00:23:46,340
they're eating the dogs.
533
00:23:46,340 --> 00:23:49,220
The people that came in,
they're eating the cats.
534
00:23:49,220 --> 00:23:50,260
They're eating...
535
00:23:50,260 --> 00:23:52,340
They're eating the pets.
537
00:23:52,540 --> 00:23:53,340
(DOG WHINES)
538
00:23:53,340 --> 00:23:54,940
# They're eating the dogs
They're eating... #
540
00:23:55,180 --> 00:23:56,940
It flooded social media
541
00:23:56,940 --> 00:24:00,540
with AI-generated Trump cat memes.
542
00:24:00,540 --> 00:24:02,300
# Eat the cat, eat-eat the cat
544
00:24:02,540 --> 00:24:03,940
# Eat the cat
They're eating the cats
546
00:24:04,180 --> 00:24:05,460
# They're eating the dogs... #
547
00:24:05,460 --> 00:24:07,740
EDWARD PEREZ: I mean, it would be
laughable if it weren't so harmful.
549
00:24:07,980 --> 00:24:09,260
# Talk about extreme! #
550
00:24:10,660 --> 00:24:12,940
X is fundamentally operating
551
00:24:12,940 --> 00:24:18,140
like an amplifier
and disinformation machine,
553
00:24:18,380 --> 00:24:20,260
almost like a factory
555
00:24:20,500 --> 00:24:23,500
for these inflammatory
right-wing messages.
556
00:24:23,500 --> 00:24:28,300
The bottom line here is
that there is very real human harm.
557
00:24:28,300 --> 00:24:33,420
What happened to that small town
of Ohio was terrible.
559
00:24:33,660 --> 00:24:36,620
They ended up with bomb threats.
561
00:24:36,860 --> 00:24:39,620
They ended up
having to evacuate schools.
562
00:24:39,620 --> 00:24:42,740
It was absolute chaos
for those poor people
563
00:24:42,740 --> 00:24:45,180
that, frankly,
were used as political pawns.
564
00:24:45,180 --> 00:24:48,100
I think absolutely
the United States has gone backwards
566
00:24:48,340 --> 00:24:50,580
in its response to disinformation.
567
00:24:51,860 --> 00:24:54,420
COMMENTATOR: The US Department
of Homeland Security
568
00:24:54,420 --> 00:24:56,700
recently tried
to create an advisory board,
569
00:24:56,700 --> 00:24:59,420
what it called
a Disinformation Governance Board.
571
00:24:59,660 --> 00:25:01,540
But that board quickly collapsed.
572
00:25:03,260 --> 00:25:05,020
In 2022
573
00:25:05,020 --> 00:25:08,980
Nina Jankowicz
was appointed executive director
574
00:25:08,980 --> 00:25:12,980
of a new
Disinformation Governance Board.
575
00:25:12,980 --> 00:25:18,220
But this government attempt
to rein in the lies on social media
576
00:25:18,220 --> 00:25:22,940
was a disastrously short-lived
experiment.
577
00:25:22,940 --> 00:25:26,380
So, today, to herald the coming
of the new Soviet America,
578
00:25:26,380 --> 00:25:29,700
the administration announced
its own Ministry of Truth.
579
00:25:29,700 --> 00:25:33,980
Right-wing influencers started
calling it a Ministry of Truth,
580
00:25:33,980 --> 00:25:36,340
and within 24 hours
581
00:25:36,340 --> 00:25:43,660
the board and I were the subject
of many Fox News panels and reports
583
00:25:43,940 --> 00:25:47,100
saying that I was going
to be censoring Americans,
584
00:25:47,100 --> 00:25:50,020
Tucker Carlson said
that I would have the power
586
00:25:50,300 --> 00:25:53,820
to send men with guns to the homes
of Americans with whom I disagreed,
588
00:25:54,060 --> 00:25:55,540
which absolutely was not true.
590
00:25:55,780 --> 00:25:58,980
This lady is clearly totally unfit
for the task.
592
00:25:59,220 --> 00:26:01,140
(SPEAKS INDISTINCTLY)
593
00:26:02,580 --> 00:26:04,620
We're talking about
the Ministry of Truth here?
595
00:26:04,860 --> 00:26:07,220
Are we talking about Orwell's 1984?
597
00:26:07,460 --> 00:26:09,300
Because it sounds
an awful lot like that to me.
598
00:26:10,260 --> 00:26:14,140
Flooded with threats
and the most toxic of online hate...
600
00:26:14,380 --> 00:26:16,660
She's wildly inadequate...
601
00:26:16,660 --> 00:26:21,500
..Nina Jankowicz resigned
from her position after three weeks.
603
00:26:21,740 --> 00:26:24,180
No. Nada. Nunca.
604
00:26:25,140 --> 00:26:27,900
The worst part was that
they went after me and my family.
606
00:26:28,140 --> 00:26:31,140
It led people to dox me,
607
00:26:31,140 --> 00:26:33,180
so, to release my home address
and phone number
609
00:26:33,420 --> 00:26:36,060
and that of my close family,
610
00:26:36,060 --> 00:26:37,620
to threaten me credibly.
611
00:26:37,620 --> 00:26:40,740
I, you know, had to consult
with the Federal Protective Service.
613
00:26:40,980 --> 00:26:42,020
And at the time I was pregnant,
614
00:26:42,020 --> 00:26:44,860
I was just a couple of weeks away
from becoming a mom,
615
00:26:44,860 --> 00:26:47,380
and they were threatening
me and my unborn child.
617
00:26:47,620 --> 00:26:48,700
(QUACKS)
618
00:26:51,540 --> 00:26:53,020
In America
619
00:26:53,020 --> 00:26:58,540
children are now at the centre
of the battle against Big Tech.
620
00:27:03,740 --> 00:27:07,700
American lawmakers, educators
and parents
621
00:27:07,700 --> 00:27:11,140
are fighting back
against the social-media companies
623
00:27:11,380 --> 00:27:13,220
with a huge legal case
624
00:27:13,220 --> 00:27:16,020
where 37 state attorneys-general
625
00:27:16,020 --> 00:27:19,260
are joining together
with school districts
626
00:27:19,260 --> 00:27:22,180
and thousands of families
across the country
628
00:27:22,420 --> 00:27:24,660
to say that these platforms
629
00:27:24,660 --> 00:27:28,460
are causing profound harm
to young people.
630
00:27:30,180 --> 00:27:33,700
Previn Warren is one of the lawyers
running the case
632
00:27:33,940 --> 00:27:36,660
against the social-media companies.
633
00:27:36,660 --> 00:27:41,300
Five other state attorneys-general
are running their own cases.
634
00:27:42,460 --> 00:27:45,220
A lot of these parents
have lost their kids,
635
00:27:45,220 --> 00:27:47,780
and they've lost their kids
needlessly
636
00:27:47,780 --> 00:27:53,700
through the absolute wanton
negligence of these platforms.
637
00:27:53,700 --> 00:27:59,300
Mr Warren was at Harvard when
Mark Zuckerberg launched Facebook,
639
00:27:59,540 --> 00:28:02,060
and was one of its first users.
641
00:28:02,300 --> 00:28:05,060
What was FaceMash?
And is it still up and running?
642
00:28:05,060 --> 00:28:09,620
Facebook's forerunner
was a website called FaceMash.
643
00:28:09,620 --> 00:28:11,060
You put up pictures of two women
644
00:28:11,060 --> 00:28:13,540
and decide which one was the better,
more attractive of the two?
645
00:28:13,540 --> 00:28:14,420
Is that right?
646
00:28:15,380 --> 00:28:17,780
Congressman,
that is an accurate description
647
00:28:17,780 --> 00:28:19,860
of the prank website
that I made when I was...
649
00:28:20,140 --> 00:28:24,780
That's the origin story
of Mr Zuckerberg's entire empire.
650
00:28:24,780 --> 00:28:26,820
And fast forward
to where we are now.
651
00:28:26,820 --> 00:28:28,860
And, fast forwarding
to where we are now,
653
00:28:29,100 --> 00:28:32,780
what is the impact on teenage girls?
654
00:28:32,780 --> 00:28:35,500
Its algorithm is designed
655
00:28:35,500 --> 00:28:37,700
to push content at young girls
656
00:28:37,700 --> 00:28:40,540
that they will find engaging,
658
00:28:40,740 --> 00:28:42,580
and it sucks them down a rabbit hole
659
00:28:42,580 --> 00:28:45,500
of dieting tips, beauty content.
661
00:28:45,700 --> 00:28:47,580
We should not be living in a society
662
00:28:47,580 --> 00:28:50,300
where young women are made to feel
663
00:28:50,300 --> 00:28:52,700
by powerful, rich tech companies
664
00:28:52,700 --> 00:28:55,180
that they are not good enough.
665
00:28:59,700 --> 00:29:02,300
Many of Previn Warren's clients
666
00:29:02,300 --> 00:29:05,820
are parents of teenage users
of these platforms
667
00:29:05,820 --> 00:29:10,860
who allege social media contributed
to their children's deaths.
668
00:29:13,820 --> 00:29:17,180
I don't think Instagram
wants to kill kids.
669
00:29:17,180 --> 00:29:20,140
I think Instagram
wants to make a ton of money,
670
00:29:20,140 --> 00:29:23,460
and they don't care if they happen
to kill some kids along the way.
671
00:29:24,420 --> 00:29:28,820
And what does that say about
the people running that organisation?
673
00:29:29,060 --> 00:29:30,180
I think they don't care.
675
00:29:30,420 --> 00:29:33,140
I think, at the end of the day, they
think that they're above the law
676
00:29:33,140 --> 00:29:36,700
and if they don't HAVE to
change their design
678
00:29:36,940 --> 00:29:39,180
in a way that maximises safety,
679
00:29:39,180 --> 00:29:40,940
they won't.
680
00:29:40,940 --> 00:29:43,900
They will put their profits first.
681
00:29:43,900 --> 00:29:48,180
That is an incredibly
dangerous thing for our society.
682
00:29:52,820 --> 00:29:55,140
(EAGLE CALLS)
683
00:30:04,180 --> 00:30:08,460
We went to a tiny town
in rural Colorado
684
00:30:08,460 --> 00:30:11,780
to hear just how dangerous
it can be.
685
00:30:15,300 --> 00:30:19,020
It's corn harvest time in Merino,
686
00:30:19,020 --> 00:30:22,900
population 281.
687
00:30:23,900 --> 00:30:26,340
One of the young people
from this town
689
00:30:26,580 --> 00:30:29,020
is not here to tell her story,
690
00:30:29,020 --> 00:30:31,900
but her mother, Lori Schott,
691
00:30:31,900 --> 00:30:34,300
wants the world to know it.
692
00:30:36,460 --> 00:30:39,300
Her name was Annalee Schott.
693
00:30:40,260 --> 00:30:42,700
Everyone called her Anna.
694
00:30:43,860 --> 00:30:48,460
LORI SCHOTT: She was an adorable
little country girl.
696
00:30:48,740 --> 00:30:51,700
She was about five foot tall.
She weighed 100 pounds.
697
00:30:51,700 --> 00:30:53,660
She was my shadow all growing up.
699
00:30:53,900 --> 00:30:59,540
Lori Schott is one of the thousands
of parents across America
701
00:30:59,780 --> 00:31:02,140
suing the social-media companies.
702
00:31:02,140 --> 00:31:04,540
Why are you speaking out, Lori?
703
00:31:07,780 --> 00:31:08,820
Sorry.
704
00:31:12,860 --> 00:31:15,660
I've never felt such a passion...
705
00:31:17,540 --> 00:31:19,940
..to make a wrong a right.
706
00:31:21,420 --> 00:31:24,260
Losing our daughter
at the age of 18,
708
00:31:24,500 --> 00:31:26,860
knowing that it was preventable,
709
00:31:26,860 --> 00:31:28,580
I had a choice, you know,
710
00:31:28,580 --> 00:31:32,980
either to curl up
in my own sorrow and pity
711
00:31:32,980 --> 00:31:36,060
or, after what I found out
about social media,
712
00:31:36,060 --> 00:31:38,740
to share Anna's story,
to be her voice.
713
00:31:41,620 --> 00:31:45,380
Anna's best friend was Faith Murphy.
714
00:31:46,340 --> 00:31:48,540
She was kind.
715
00:31:49,420 --> 00:31:51,380
Like, undoubtedly kind.
716
00:31:51,380 --> 00:31:52,460
Always.
717
00:31:54,060 --> 00:31:55,940
Mm... Jesus Christ.
718
00:32:02,180 --> 00:32:05,020
To anybody and everybody.
All the time.
719
00:32:06,140 --> 00:32:10,100
Faith says they would spend
about 10 hours a day
720
00:32:10,100 --> 00:32:13,340
on Instagram, Snapchat, TikTok.
721
00:32:14,300 --> 00:32:17,300
Tell me about her use
of social media.
723
00:32:17,540 --> 00:32:20,180
It was gnarly. It was bad.
724
00:32:21,820 --> 00:32:23,100
Gnarly.
725
00:32:24,660 --> 00:32:26,260
Non-stop.
726
00:32:26,260 --> 00:32:27,540
Continuous.
727
00:32:28,500 --> 00:32:32,220
She was so desperate
to be on those platforms,
728
00:32:32,220 --> 00:32:34,580
if you'd take her phone,
she'd find an iPad.
729
00:32:34,580 --> 00:32:37,220
Then I tried to figure out
if she was on her school computer
731
00:32:37,460 --> 00:32:38,860
or her private computer.
732
00:32:38,860 --> 00:32:42,420
She would be doing her chores,
the phone was in her hand.
733
00:32:42,420 --> 00:32:44,900
She would be riding in the arena
with her horse,
735
00:32:45,140 --> 00:32:46,700
her phone would be in her pocket.
736
00:32:46,700 --> 00:32:50,620
And it was her constant companion.
737
00:32:50,620 --> 00:32:52,540
And she said she needed it
for school work.
738
00:32:52,540 --> 00:32:55,020
Well, I don't think
that was what was going on.
739
00:32:55,020 --> 00:32:57,900
She needed it because she was
on these social-media platforms
741
00:32:58,140 --> 00:32:59,500
and did not want to give it up.
742
00:32:59,500 --> 00:33:02,740
Because she was addicted?
Because she was addicted to it.
743
00:33:02,740 --> 00:33:06,700
Yeah. I think everyone at that age
is addicted to social media.
744
00:33:26,300 --> 00:33:29,580
Anna's friends and family
began to notice
745
00:33:29,580 --> 00:33:34,020
that by age 16
she was losing her spark.
746
00:33:34,980 --> 00:33:36,740
She was depressed.
747
00:33:36,740 --> 00:33:39,700
Social media made it worse.
748
00:33:40,660 --> 00:33:47,420
Journal entries show her comparing
herself to images she saw online,
750
00:33:47,660 --> 00:33:49,100
writing:
751
00:33:53,700 --> 00:33:55,900
Like, she would go through phases
752
00:33:55,900 --> 00:33:58,100
where she was like,
"Hey, just text me,"
753
00:33:58,100 --> 00:34:01,220
like, "I'm deleting Snapchat,
I'm deleting Instagram."
755
00:34:01,460 --> 00:34:03,500
And it would last,
757
00:34:03,700 --> 00:34:05,420
like, a month max,
758
00:34:05,420 --> 00:34:08,940
and then it was redownloaded and...
759
00:34:08,940 --> 00:34:10,220
Why was she wanting
761
00:34:10,420 --> 00:34:12,660
to delete these platforms?
762
00:34:12,660 --> 00:34:14,860
'Cause it was toxic.
763
00:34:14,860 --> 00:34:17,140
If you could see her social media,
764
00:34:17,140 --> 00:34:20,540
it destroyed her with constant feeds
765
00:34:20,540 --> 00:34:23,220
about how she looked, who she was,
766
00:34:23,220 --> 00:34:24,540
anxiety, depression.
767
00:34:24,540 --> 00:34:26,020
She wasn't sleeping
768
00:34:26,020 --> 00:34:28,260
because, you know,
when we let her have her phone,
770
00:34:28,500 --> 00:34:29,620
she was constantly on it.
772
00:34:29,860 --> 00:34:31,380
I think she was sleeping
four hours a night.
774
00:34:31,620 --> 00:34:34,580
But Instagram fed her...
775
00:34:36,740 --> 00:34:38,540
..the worst content.
777
00:34:38,780 --> 00:34:40,740
I feel like
it is a heat-detecting missile
778
00:34:40,740 --> 00:34:44,820
that finds these weaknesses in these
children and just zones in on them.
780
00:34:45,060 --> 00:34:46,620
So, they knew that Anna's weakness
782
00:34:46,860 --> 00:34:48,660
was, you know,
a little anxiety or depression -
783
00:34:48,660 --> 00:34:51,220
"I'm going to a counsellor,"
looking these things up,
785
00:34:51,460 --> 00:34:53,780
and it just magnified it.
786
00:34:53,780 --> 00:34:57,100
And it did not stop.
787
00:34:57,100 --> 00:34:58,740
And, as a parent,
788
00:34:58,740 --> 00:35:02,660
you say you don't think it's going
to happen to you in your family.
790
00:35:02,900 --> 00:35:03,660
It does.
791
00:35:03,660 --> 00:35:06,260
Those algorithms change so quickly,
793
00:35:06,500 --> 00:35:10,380
and no parent
can outsmart those algorithms.
794
00:35:15,460 --> 00:35:19,740
The TikTok algorithm
pushed deeply disturbing videos
795
00:35:19,740 --> 00:35:21,980
on Anna's For You feed.
796
00:35:23,420 --> 00:35:28,940
It curated content that constantly
referenced death and suicide.
797
00:35:30,100 --> 00:35:34,900
Lori Schott only discovered this
after Anna died.
798
00:35:37,060 --> 00:35:42,180
She found a page in Anna's journal
headed 'TikToks',
800
00:35:42,420 --> 00:35:45,180
with the handwritten words:
801
00:35:54,820 --> 00:35:56,540
It brought me to my knees.
802
00:35:56,540 --> 00:35:57,900
It crippled me.
803
00:35:57,900 --> 00:36:02,980
I mean, to this day
I can't unsee what I saw.
804
00:36:02,980 --> 00:36:06,300
How does that make you feel
towards the social-media companies?
806
00:36:06,540 --> 00:36:08,100
I think it's disgusting.
807
00:36:09,060 --> 00:36:10,380
Fix it.
808
00:36:11,700 --> 00:36:18,100
Your algorithms are targeting
the mentally unwell and children,
809
00:36:18,100 --> 00:36:22,340
and worsening all of the shit
they're already dealing with.
810
00:36:24,780 --> 00:36:29,060
TikTok says it has strict
community guidelines
811
00:36:29,060 --> 00:36:34,020
and doesn't allow content depicting
extreme violence, self-harm
813
00:36:34,260 --> 00:36:37,020
or promoting eating disorders.
814
00:36:37,980 --> 00:36:40,540
To find out
what the tech giants knew
815
00:36:40,540 --> 00:36:43,980
about the dangers to young people
like Anna Schott,
817
00:36:44,220 --> 00:36:47,060
we've travelled here to San Francisco
818
00:36:47,060 --> 00:36:49,500
to meet Silicon Valley insiders
819
00:36:49,500 --> 00:36:52,100
who were alarmed at what they saw
820
00:36:52,100 --> 00:36:56,940
when advising the social-media
companies on trust and safety.
821
00:37:00,700 --> 00:37:03,660
My name is Arturo Bejar.
822
00:37:03,660 --> 00:37:09,620
For me, I mean, the reason I'm here
talking with you, right, is...
823
00:37:10,580 --> 00:37:12,540
..this shouldn't be happening.
824
00:37:13,740 --> 00:37:18,900
And every parent should know
825
00:37:18,900 --> 00:37:20,420
how bad it is.
826
00:37:23,100 --> 00:37:25,700
In October 2021
827
00:37:25,700 --> 00:37:30,780
Arturo Bejar emailed Facebook's CEO,
Mark Zuckerberg,
828
00:37:30,780 --> 00:37:35,340
to warn him
of the dangers to young people.
829
00:37:35,340 --> 00:37:38,820
He raised internal company research
he'd worked on
831
00:37:39,060 --> 00:37:40,420
that showed:
832
00:37:50,740 --> 00:37:55,020
More than a quarter
of 13- to 15-year-olds surveyed
833
00:37:55,020 --> 00:37:59,660
said they'd received
unwanted sexual advances.
834
00:37:59,660 --> 00:37:59,900
"Don't we run the risk
said they'd received
unwanted sexual advances.
835
00:37:59,900 --> 00:38:01,540
"Don't we run the risk
836
00:38:01,540 --> 00:38:06,940
"of normalising bad behaviour?"
Mr Bejar asked his boss.
837
00:38:06,940 --> 00:38:09,820
On many of the measures
examined by his team,
838
00:38:09,820 --> 00:38:13,540
13- to 15-year-olds
were the most likely
840
00:38:13,780 --> 00:38:16,820
to encounter dangerous content,
841
00:38:16,820 --> 00:38:21,740
including posts about self-harm
on Instagram.
842
00:38:24,540 --> 00:38:28,100
And to this day -
I mean, it's insane -
844
00:38:28,380 --> 00:38:30,380
there's no way
for a kid to press a button
845
00:38:30,380 --> 00:38:32,900
to let Instagram know
that that happened.
846
00:38:33,780 --> 00:38:35,020
Why?
847
00:38:35,020 --> 00:38:36,780
Because they don't want to know.
848
00:38:36,780 --> 00:38:38,340
Why don't they want to know?
850
00:38:38,580 --> 00:38:40,860
Because I think that if they knew,
they might be...
852
00:38:41,100 --> 00:38:42,420
..they might think they're liable.
853
00:38:42,420 --> 00:38:44,780
I think, if they knew,
they might feel
854
00:38:44,780 --> 00:38:46,260
they have a responsibility
to deal with it.
856
00:38:46,500 --> 00:38:47,900
I mean, I think that the company
857
00:38:47,900 --> 00:38:53,260
cares more about protecting its
reputation and the risk of a leak
858
00:38:53,260 --> 00:38:57,820
than it does about understanding
the actual harm and reducing it.
859
00:38:57,820 --> 00:39:02,500
You brought your concerns
to the highest levels of Meta,
861
00:39:02,740 --> 00:39:05,140
right up to Mark Zuckerberg,
862
00:39:05,140 --> 00:39:06,860
and what was his response?
863
00:39:07,820 --> 00:39:09,580
He didn't even write back.
864
00:39:10,540 --> 00:39:16,900
Arturo Bejar worked at Facebook
for six years until 2015.
865
00:39:16,900 --> 00:39:20,500
When he returned to the company
in 2019
866
00:39:20,500 --> 00:39:21,780
he was horrified to see
867
00:39:21,780 --> 00:39:25,180
most of the safety measures
to protect young people
869
00:39:25,420 --> 00:39:27,660
that his team had put in place
870
00:39:27,660 --> 00:39:30,740
had been removed from the platform.
871
00:39:30,740 --> 00:39:34,700
He also wrote to the head
of Instagram, Adam Mosseri,
873
00:39:34,940 --> 00:39:36,900
about his concerns.
874
00:39:36,900 --> 00:39:38,620
And I had my meeting with him,
876
00:39:38,860 --> 00:39:41,140
and he understood
everything I was talking about,
878
00:39:41,380 --> 00:39:43,420
he didn't disagree with any of it.
879
00:39:43,420 --> 00:39:45,460
So, that was 2021.
880
00:39:45,460 --> 00:39:47,180
What did he do?
881
00:39:47,180 --> 00:39:48,100
Nothing.
882
00:39:50,700 --> 00:39:53,020
How does that make you feel?
883
00:39:53,020 --> 00:39:55,620
Well, that's why I whistleblow.
884
00:39:55,620 --> 00:39:56,420
Right?
885
00:39:56,420 --> 00:39:58,340
How many kids, right?
886
00:39:59,380 --> 00:40:01,420
What... What...
887
00:40:01,420 --> 00:40:03,340
What is his legacy?
888
00:40:03,340 --> 00:40:06,500
What is Mark's legacy
when it comes to teenagers?
889
00:40:06,500 --> 00:40:08,700
Right?
What IS his legacy?
890
00:40:09,780 --> 00:40:11,340
Eating disorders.
891
00:40:11,340 --> 00:40:14,540
Teens committing suicide
because of bullying and harassment.
893
00:40:14,780 --> 00:40:16,220
So much harm
894
00:40:16,220 --> 00:40:19,460
to the people that deserve
to be protected the most.
895
00:40:20,500 --> 00:40:21,700
Well...
They're here.
897
00:40:21,940 --> 00:40:23,180
You're on national television.
898
00:40:23,180 --> 00:40:25,260
Mark Zuckerberg testified
900
00:40:25,500 --> 00:40:29,020
before the United States
Senate Judiciary Committee...
901
00:40:29,020 --> 00:40:32,180
Would you like to apologise for what
you've done to these good people?
902
00:40:32,180 --> 00:40:37,700
..apologising for the harms his
company had caused to young people.
903
00:40:37,700 --> 00:40:40,020
Arturo Bejar was there
904
00:40:40,020 --> 00:40:43,700
with the parents
of teenagers who had died.
905
00:40:43,700 --> 00:40:48,260
I think Mark is responsible
for what happened to these kids
906
00:40:48,260 --> 00:40:53,860
because if Mark woke up
tomorrow morning and said,
907
00:40:53,860 --> 00:40:56,420
"We no longer are going to show
908
00:40:56,420 --> 00:41:00,380
"anything that has remotely to do
with self-harm to 13-year-olds,"
909
00:41:00,380 --> 00:41:02,900
it would take a few months for
the company to be able to do that,
910
00:41:02,900 --> 00:41:05,540
they have the infrastructure
and the capacity to do that.
911
00:41:05,540 --> 00:41:09,500
I've seen what the company can do
when Mark deems it a priority,
913
00:41:09,740 --> 00:41:11,100
and I've seen how hard it is
914
00:41:11,100 --> 00:41:14,580
for a company to do something
if Mark doesn't deem it a priority.
915
00:41:14,580 --> 00:41:17,940
So, I think he has
such responsibility
916
00:41:17,940 --> 00:41:20,300
when it comes
to everything we're talking about.
917
00:41:22,220 --> 00:41:28,900
Meta disputes Arturo Bejar's
assertion that no action was taken.
918
00:41:28,900 --> 00:41:31,940
It says it's implemented
safety features
919
00:41:31,940 --> 00:41:36,740
such as "comment warnings
and kindness reminders."
920
00:41:36,740 --> 00:41:40,860
The company has also introduced
Teen Accounts,
921
00:41:40,860 --> 00:41:45,300
which it says provides teens
"with built-in protections"
922
00:41:45,300 --> 00:41:47,540
and gives parents "more oversight,
923
00:41:47,540 --> 00:41:50,580
"including seeing
who their teens are chatting with
924
00:41:50,580 --> 00:41:50,820
"including seeing
"and setting time limits."
925
00:41:50,820 --> 00:41:52,860
"and setting time limits."
926
00:41:52,860 --> 00:41:57,700
Whistleblowers question
whether these measures go far enough.
927
00:41:59,700 --> 00:42:02,700
If you were to choose three words
that sum up social media,
928
00:42:02,700 --> 00:42:02,940
If you were to choose three words
what would they be?
929
00:42:02,940 --> 00:42:04,020
what would they be?
930
00:42:04,020 --> 00:42:07,420
Unnecessarily, tragically harmful.
931
00:42:18,580 --> 00:42:21,900
This tiny community in Colorado
932
00:42:21,900 --> 00:42:26,700
is coming to terms
with the unnecessary, tragic loss
933
00:42:26,700 --> 00:42:28,220
of a country teenager.
934
00:42:34,860 --> 00:42:37,340
LORI SCHOTT:
She was a beautiful little rider.
935
00:42:37,340 --> 00:42:40,460
I still remember her
backing into the box.
936
00:42:41,500 --> 00:42:43,540
She just loved it.
937
00:42:48,380 --> 00:42:51,780
I felt like social media
just stripped her confidence.
938
00:42:51,780 --> 00:42:55,580
But when she was on a horse
with these guys in this environment,
939
00:42:55,580 --> 00:42:55,820
But when she was on a horse
she just blossomed.
940
00:42:55,820 --> 00:42:57,820
she just blossomed.
941
00:42:57,820 --> 00:42:59,780
What's that?
Call it what you need to.
942
00:42:59,780 --> 00:43:01,980
These people
were her cheering squad,
943
00:43:01,980 --> 00:43:03,980
so, it just crippled everybody,
944
00:43:03,980 --> 00:43:08,060
and to come back to the arena
without her here is hard.
945
00:43:10,780 --> 00:43:15,780
Court documents revealed that in the
months before Anna Schott's death
946
00:43:15,780 --> 00:43:19,420
Mark Zuckerberg was warned
by his own staff
947
00:43:19,420 --> 00:43:19,660
Mark Zuckerberg was warned
that photo filters used in Instagram
948
00:43:19,660 --> 00:43:22,500
that photo filters used in Instagram
949
00:43:22,500 --> 00:43:28,180
were "actively encouraging
young girls into body dysmorphia."
950
00:43:28,180 --> 00:43:32,660
Mr Zuckerberg dismissed the concerns
as 'paternalistic'.
951
00:43:35,300 --> 00:43:37,500
They knew what was happening.
952
00:43:37,500 --> 00:43:39,980
If you follow
my daughter's journals,
953
00:43:39,980 --> 00:43:42,540
2019 - she was harmed
954
00:43:42,540 --> 00:43:47,060
for the exact same things
they warned Mr Zuckerberg about.
955
00:43:47,060 --> 00:43:48,820
And they knew it.
956
00:43:48,820 --> 00:43:50,700
How could this happen
957
00:43:50,700 --> 00:43:50,940
and they're not accountable
How could this happen
958
00:43:50,940 --> 00:43:53,580
and they're not accountable
to this day?
959
00:43:53,580 --> 00:43:56,100
It hurts me every single day
because my daughter would be alive,
960
00:43:56,100 --> 00:43:58,780
I really think
she'd be alive today...
961
00:44:01,300 --> 00:44:02,340
Sorry.
962
00:44:03,300 --> 00:44:05,500
..if he would have taken action.
963
00:44:05,500 --> 00:44:09,500
How do you feel towards
Mark Zuckerberg in light of this?
964
00:44:09,500 --> 00:44:09,740
How do you feel towards
(SCOFFS)
965
00:44:09,740 --> 00:44:10,700
(SCOFFS)
966
00:44:11,940 --> 00:44:13,660
He's responsible.
967
00:44:13,660 --> 00:44:15,300
I want him held accountable.
968
00:44:15,300 --> 00:44:19,500
So, he needs to be held
legally liable
969
00:44:19,500 --> 00:44:22,140
for what's happened
to these children.
970
00:44:22,140 --> 00:44:24,540
And that's a hill
I'm willing to die on
971
00:44:24,540 --> 00:44:27,420
if I have to tell Anna's story
10,000 times.
972
00:44:27,420 --> 00:44:30,900
Because he had the ability
to stop it, change it,
973
00:44:30,900 --> 00:44:31,140
Because he had the ability
make it for the good of the kids.
974
00:44:31,140 --> 00:44:32,940
make it for the good of the kids.
975
00:44:32,940 --> 00:44:33,180
Let's make this a better place.
make it for the good of the kids.
976
00:44:33,180 --> 00:44:36,620
Let's make this a better place.
Let's get to a better yes.
977
00:44:36,620 --> 00:44:38,940
You can have
your billions of dollars, I guess.
978
00:44:38,940 --> 00:44:41,940
But trying to improve it
was not on his agenda,
979
00:44:41,940 --> 00:44:45,260
and he's going to have to live
with that for the rest of his life.
980
00:44:45,260 --> 00:44:47,020
He does, he has blood on his hands.
981
00:44:49,260 --> 00:44:52,940
Meta says
when all the evidence comes out
982
00:44:52,940 --> 00:44:58,220
it will demonstrate that neither the
company nor any of its executives
983
00:44:58,220 --> 00:45:01,100
will have any liability
984
00:45:01,100 --> 00:45:03,500
and it has strict rules
985
00:45:03,500 --> 00:45:09,820
against content that's graphic
or glorifies suicide or self-harm.
986
00:45:10,660 --> 00:45:13,980
(INSECTS CHIRP)
987
00:45:13,980 --> 00:45:18,700
Anna Schott died by suicide
in her own home.
988
00:45:20,020 --> 00:45:23,140
She's buried on the family ranch.
989
00:45:24,700 --> 00:45:29,060
It is like a slow drip of pain
990
00:45:29,060 --> 00:45:32,380
that has totally altered
our family.
991
00:45:32,380 --> 00:45:34,660
But I want some good
to come out of this.
992
00:45:39,220 --> 00:45:43,500
Each day, as the light fades
across these plains,
993
00:45:43,500 --> 00:45:48,900
Lori Schott looks out to see
Anna's memorial cross light up.
994
00:45:50,900 --> 00:45:53,620
You know, for us,
it's not about the money.
995
00:45:53,620 --> 00:45:53,860
You know, for us,
We just want our daughter back.
996
00:45:53,860 --> 00:45:56,260
We just want our daughter back.
997
00:45:56,260 --> 00:45:58,300
But if the lawsuits
998
00:45:58,300 --> 00:45:58,540
make social media stand up
But if the lawsuits
999
00:45:58,540 --> 00:46:04,900
make social media stand up
and take responsibility and change,
1000
00:46:04,900 --> 00:46:05,140
make social media stand up
that's where I'm pushing.
1001
00:46:05,140 --> 00:46:07,860
that's where I'm pushing.
1002
00:46:07,860 --> 00:46:09,820
Make the world a better place,
Mark Zuckerberg.
1003
00:46:09,820 --> 00:46:12,700
Make the world a better place,
TikTok.
1004
00:46:12,700 --> 00:46:15,940
Every harm, every single harm,
1005
00:46:15,940 --> 00:46:17,380
I can see a child's face.
1006
00:46:17,380 --> 00:46:22,100
You know, we let a stranger
into our house every night
1007
00:46:22,100 --> 00:46:22,340
You know, we let a stranger
that does the most horrific things
1008
00:46:22,340 --> 00:46:26,620
that does the most horrific things
1009
00:46:26,620 --> 00:46:26,860
and can manipulate a child's mind
that does the most horrific things
1010
00:46:26,860 --> 00:46:31,140
and can manipulate a child's mind
to that extent
1011
00:46:31,140 --> 00:46:34,260
that we need to say
enough is enough.
1012
00:46:45,500 --> 00:46:47,540
(THEME MUSIC PLAYS)