1
00:00:08,120 --> 00:00:10,215
Tonight on Panorama -
2
00:00:10,240 --> 00:00:14,735
one of the world's most influential
social media sites in chaos.
3
00:00:14,760 --> 00:00:18,135
I try to log in, and my email
is just not working any more.
4
00:00:18,160 --> 00:00:21,695
And that's when I knew that
I had been laid off.
5
00:00:21,720 --> 00:00:23,975
Mr Elon Musk...
6
00:00:24,000 --> 00:00:26,095
For the first time,
Twitter insiders tell the story
7
00:00:26,120 --> 00:00:28,535
of Elon Musk's takeover.
8
00:00:28,560 --> 00:00:32,495
You didn't know who you reported to
or who was in the company any more.
9
00:00:32,520 --> 00:00:37,815
How he sacked staff who helped
keep children safe from paedophiles.
10
00:00:37,880 --> 00:00:41,695
You can't go from a team of 20
to a team of about six or seven
11
00:00:41,720 --> 00:00:46,855
and be able to keep on top
of the problem.
12
00:00:46,920 --> 00:00:49,215
And how he welcomed back
banned accounts.
13
00:00:49,240 --> 00:00:52,055
Bang out the machete,
boom in her face,
14
00:00:52,080 --> 00:00:55,415
and then grip her up by the neck,
"Shut up, bitch."
15
00:00:55,440 --> 00:00:58,775
Leaving users feeling unprotected.
16
00:00:58,800 --> 00:01:01,535
I would ask, why are these accounts
that are bullying
17
00:01:01,560 --> 00:01:07,655
and harassing people still allowed
on the platform?
18
00:01:16,000 --> 00:01:21,415
I'm the BBC's disinformation
and social media correspondent.
19
00:01:21,480 --> 00:01:24,575
Some of Twitter's 350 million users
20
00:01:24,600 --> 00:01:29,615
have been getting in touch with me
about a rise in hate and abuse.
21
00:01:29,680 --> 00:01:33,855
I'm Dr Viki Male, and I'm a senior
lecturer in reproductive immunology.
22
00:01:33,880 --> 00:01:37,815
There used to be a mechanism
for reporting misinformation.
23
00:01:37,840 --> 00:01:42,215
My name is Helen, and
I'm a survivor of domestic abuse.
24
00:01:42,240 --> 00:01:44,615
I have experienced and seen first-hand
how significantly worse
25
00:01:44,640 --> 00:01:51,335
the online abuse has
been on Twitter.
26
00:01:52,560 --> 00:01:54,375
I'm Ellie Wilson.
27
00:01:54,400 --> 00:01:56,535
I'm a rape survivor
who's been using social media
28
00:01:56,560 --> 00:02:01,615
to try and campaign
to end violence against women.
29
00:02:02,520 --> 00:02:04,695
In the early hours of
New Year's Day 2018,
30
00:02:04,720 --> 00:02:07,775
I was raped while unconscious
by someone I trusted.
31
00:02:07,800 --> 00:02:11,295
This was the beginning of a cycle
of abuse that lasted years.
32
00:02:11,320 --> 00:02:13,455
Ellie tweeted in January
this year after the man
33
00:02:13,480 --> 00:02:16,935
who raped her was sentenced
to five years in jail.
34
00:02:16,960 --> 00:02:21,015
She received a barrage of abuse.
35
00:02:21,040 --> 00:02:23,535
I find it most difficult,
36
00:02:23,560 --> 00:02:25,455
the people that say
that I wasn't raped
37
00:02:25,480 --> 00:02:29,335
or that this didn't happen
and that I'm lying.
38
00:02:29,360 --> 00:02:33,095
But also it's sort of like
a secondary trauma.
39
00:02:33,120 --> 00:02:36,015
Have the messages been
overtly misogynistic?
40
00:02:36,040 --> 00:02:39,695
Yes. There are some messages that,
you know, are directly about me
41
00:02:39,720 --> 00:02:43,215
but there are also messages
about women in general.
42
00:02:43,240 --> 00:02:47,535
You know, women lie,
women are manipulative,
43
00:02:47,560 --> 00:02:50,655
women bring this upon themselves.
44
00:02:50,680 --> 00:02:53,015
There are dozens of accounts
using Twitter to abuse Ellie
45
00:02:53,040 --> 00:02:56,495
and other rape survivors.
46
00:02:56,520 --> 00:03:01,495
This one, for example,
which appears to have several
47
00:03:01,560 --> 00:03:06,495
accounts in the same name,
doing the same thing.
48
00:03:06,520 --> 00:03:11,215
That account I know
has targeted my friends as well.
49
00:03:11,240 --> 00:03:12,575
Misogynistic.
50
00:03:12,600 --> 00:03:13,735
Blaming them for being raped.
51
00:03:13,760 --> 00:03:15,975
That sort of thing.
52
00:03:16,000 --> 00:03:18,895
I noticed a lot of the accounts
have become far more active
53
00:03:18,920 --> 00:03:20,455
in the last few months.
54
00:03:20,480 --> 00:03:25,695
So what's changed?
55
00:03:25,760 --> 00:03:31,615
I'm off to San Francisco, where
Twitter was founded, to find out.
56
00:03:33,040 --> 00:03:34,575
I'm going to meet the insiders
57
00:03:34,600 --> 00:03:36,855
who signed up to Twitter's
founding mission -
58
00:03:36,880 --> 00:03:40,455
to give everyone the power
to share ideas and information.
59
00:03:40,480 --> 00:03:42,615
Lovely to meet you.
60
00:03:42,640 --> 00:03:45,095
Thank you so much for having us.
61
00:03:45,120 --> 00:03:47,255
The mission, right, to serve
the public conversation,
62
00:03:47,280 --> 00:03:50,575
we all really took that
very seriously.
63
00:03:50,600 --> 00:03:52,815
And not only
the public conversation,
64
00:03:52,840 --> 00:03:54,535
but really working hard
to elevate voices
65
00:03:54,560 --> 00:03:59,135
that had previously struggled
to affect that conversation.
66
00:03:59,160 --> 00:04:02,055
Lisa Jennings Young joined
the company in 2019
67
00:04:02,080 --> 00:04:05,215
as it was rapidly expanding.
68
00:04:05,240 --> 00:04:10,935
But with Twitter's growth
had also come problems.
69
00:04:11,880 --> 00:04:13,975
We had noticed over
the past few years individuals
70
00:04:14,000 --> 00:04:15,175
were tweeting less and less.
71
00:04:15,200 --> 00:04:17,775
And that's because, you know,
of fear of abuse and harassment.
72
00:04:17,800 --> 00:04:18,975
Lisa's team worked on features
73
00:04:19,000 --> 00:04:23,135
designed to protect users
from online hate.
74
00:04:23,160 --> 00:04:24,975
We didn't always get it right,
you know,
75
00:04:25,000 --> 00:04:27,135
and we very publicly
didn't get it right sometimes.
76
00:04:27,160 --> 00:04:28,655
It was not at all perfect.
77
00:04:28,680 --> 00:04:31,055
But we were trying,
and we were making things better,
78
00:04:31,080 --> 00:04:35,415
you know, all the time.
79
00:04:35,440 --> 00:04:42,255
Twitter had developed a set of rules
designed to safeguard users.
80
00:04:42,320 --> 00:04:46,655
Helen Sage Lee joined
the platform in 2021.
81
00:04:46,680 --> 00:04:50,055
A lot of the policies that are
written for social media companies
82
00:04:50,080 --> 00:04:52,015
and for different
social media platforms
83
00:04:52,040 --> 00:04:55,935
are put into words that
very distinctly outline harm.
84
00:04:55,960 --> 00:04:59,695
Employees who are really
just enforcing policy,
85
00:04:59,720 --> 00:05:04,855
or trying to better understand
how to enforce policy in a way
86
00:05:04,920 --> 00:05:10,415
that is fair to everybody else,
it can be challenging.
87
00:05:11,080 --> 00:05:13,455
Two years ago, Twitter suspended
one of its best-known
88
00:05:13,480 --> 00:05:18,815
and most active users -
US President Donald Trump.
89
00:05:19,240 --> 00:05:21,855
There are some suspensions
which you can appeal for
90
00:05:21,880 --> 00:05:24,935
and get that suspension removed.
91
00:05:24,960 --> 00:05:27,615
And then there are some
suspensions that are permanent,
92
00:05:27,640 --> 00:05:31,295
and those are the ones that cause
more harm to the general public.
93
00:05:31,320 --> 00:05:33,375
After Trump supporters
stormed the Capitol,
94
00:05:33,400 --> 00:05:36,775
he was permanently banned.
95
00:05:36,800 --> 00:05:39,655
Twitter said
he was inciting violence.
96
00:05:39,680 --> 00:05:43,895
But his removal triggered a debate
about freedom of speech.
97
00:05:43,920 --> 00:05:47,135
It struck a chord with another
high-profile tweeter - Elon Musk.
98
00:05:47,160 --> 00:05:53,055
"Free speech is essential to a
functioning democracy," he tweeted.
99
00:05:53,120 --> 00:05:57,895
"Do you believe Twitter rigorously
adheres to this principle?"
100
00:05:57,920 --> 00:06:03,695
A month later, in April 2022,
Musk launched a takeover bid.
101
00:06:03,760 --> 00:06:10,215
Six, five, four, three, two, one...
102
00:06:10,280 --> 00:06:14,495
Elon Musk is the second
richest man in the world.
103
00:06:14,520 --> 00:06:17,175
His SpaceX rockets and
Tesla electric cars
104
00:06:17,200 --> 00:06:20,455
have made him a global figure.
105
00:06:20,480 --> 00:06:24,015
Sources tell the Associated Press
Elon Musk has taken over Twitter.
106
00:06:24,040 --> 00:06:26,335
The Twitter takeover
appears to be complete,
107
00:06:26,360 --> 00:06:29,975
with Elon Musk at the helm.
108
00:06:30,000 --> 00:06:33,735
Musk is expected to address
Twitter employees...
109
00:06:33,760 --> 00:06:36,495
Musk bought the platform outright
for $44 billion.
110
00:06:36,520 --> 00:06:40,055
"The bird is freed," he tweeted.
111
00:06:40,080 --> 00:06:43,135
A new Twitter era had begun.
112
00:06:43,160 --> 00:06:45,815
I'm looking for something
as broadly inclusive as possible,
113
00:06:45,840 --> 00:06:48,175
that's as trusted as possible
as a system,
114
00:06:48,200 --> 00:06:54,775
and I hope we're successful
in that regard.
115
00:06:59,920 --> 00:07:04,135
I'm off to meet a member of
Twitter's Trust and Safety Council.
116
00:07:04,160 --> 00:07:05,815
Created in 2016,
the volunteer group's job
117
00:07:05,840 --> 00:07:13,055
was to advise the company
on user safety.
118
00:07:13,560 --> 00:07:16,135
We weren't sure if this brilliant
man who launched satellites and
119
00:07:16,160 --> 00:07:17,815
made cars would be able to handle
120
00:07:17,840 --> 00:07:20,255
this very, very different animal
called a social media platform.
121
00:07:20,280 --> 00:07:25,255
It's a long learning process to
figure out how to keep users safe,
122
00:07:25,320 --> 00:07:32,375
to keep conversations civil, to make
platforms work for everybody.
123
00:07:32,440 --> 00:07:37,095
It's really, really complicated.
124
00:07:37,120 --> 00:07:40,895
Did you get to meet Elon Musk
or hear from him on a call at all?
125
00:07:40,920 --> 00:07:43,055
We...
126
00:07:43,080 --> 00:07:44,695
There was radio silence.
127
00:07:44,720 --> 00:07:50,095
We never heard from Elon Musk.
128
00:07:50,160 --> 00:07:51,375
I wanted to find out
129
00:07:51,400 --> 00:07:54,575
how Musk's vision was
playing out on the inside.
130
00:07:54,600 --> 00:07:57,215
A senior engineer,
responsible for the computer code
131
00:07:57,240 --> 00:08:01,055
that powers Twitter,
agreed to meet me.
132
00:08:01,080 --> 00:08:04,855
Because he still works there, he's
asked us to conceal his identity,
133
00:08:04,880 --> 00:08:08,135
so we're calling him Sam.
134
00:08:08,160 --> 00:08:10,735
Sam says Elon Musk's
relaxed public persona
135
00:08:10,760 --> 00:08:15,375
wasn't replicated inside Twitter HQ.
136
00:08:15,400 --> 00:08:20,255
Wherever he goes in the office,
there are at least two bodyguards,
137
00:08:20,280 --> 00:08:24,415
very bulky, tall,
Hollywood movie bodyguards
138
00:08:24,440 --> 00:08:28,695
that follow him around.
139
00:08:28,720 --> 00:08:32,055
Even when he goes to the restroom.
140
00:08:32,080 --> 00:08:35,735
I've been working in Silicon Valley
for such a long time.
141
00:08:35,760 --> 00:08:41,015
I've seen people with the same level
of status, wealth, whatever -
142
00:08:41,080 --> 00:08:44,335
they don't have any security.
143
00:08:44,360 --> 00:08:49,655
They are comfortable with
people who are around them.
144
00:08:50,680 --> 00:08:52,695
Musk was certainly comfortable
145
00:08:52,720 --> 00:08:54,495
with his then 118 million
Twitter followers,
146
00:08:54,520 --> 00:09:01,895
repeatedly asking their views
on how he should run Twitter.
147
00:09:02,480 --> 00:09:04,895
"Should Twitter offer a general
amnesty to suspended accounts,"
148
00:09:04,920 --> 00:09:07,375
he asked, "provided
that they have not broken the law
149
00:09:07,400 --> 00:09:08,575
or engaged in egregious spam?"
150
00:09:08,600 --> 00:09:13,015
72% said yes.
151
00:09:13,040 --> 00:09:15,015
"The people have spoken,"
Musk tweeted.
152
00:09:15,040 --> 00:09:22,615
"Amnesty begins next week."
153
00:09:22,800 --> 00:09:25,615
Bang out the machete,
boom in her face,
154
00:09:25,640 --> 00:09:32,855
grab her by the neck,
"Shut up, bitch!"
155
00:09:33,400 --> 00:09:35,895
Andrew Tate is back on Twitter!
156
00:09:35,920 --> 00:09:39,255
Andrew Tate - a former kickboxer
turned social media personality -
157
00:09:39,280 --> 00:09:44,535
made a high-profile
return to the site.
158
00:09:45,760 --> 00:09:47,535
A sheikh sent us a cake.
159
00:09:47,560 --> 00:09:49,935
He'd been permanently banned
in 2017,
160
00:09:49,960 --> 00:09:52,975
after tweets that included
saying women who've been raped
161
00:09:53,000 --> 00:09:56,615
should bear some responsibility
for being assaulted.
162
00:09:56,640 --> 00:09:58,735
It wasn't just the misogynists
who were back -
163
00:09:58,760 --> 00:10:01,175
conspiracy theorists
and other extremists
164
00:10:01,200 --> 00:10:03,775
were also allowed to return.
165
00:10:03,800 --> 00:10:06,415
Twitter users have been in touch
with me about this, too.
166
00:10:06,440 --> 00:10:09,015
These white supremacists
have proliferated,
167
00:10:09,040 --> 00:10:11,815
their hate has burgeoned.
168
00:10:11,840 --> 00:10:16,215
I've been receiving a brutal
barrage of trolls online.
169
00:10:16,240 --> 00:10:20,255
It's awash with fake news,
it just is.
170
00:10:20,280 --> 00:10:24,655
As Musk was inviting controversial
users back on to Twitter,
171
00:10:24,680 --> 00:10:31,335
he was thinking of getting rid
of many of his 7,500 staff.
172
00:10:31,400 --> 00:10:38,335
He brought in maybe 20, 30 Tesla
engineers to review the code,
173
00:10:38,920 --> 00:10:41,375
and based on that decided
on the performance of something
174
00:10:41,400 --> 00:10:45,535
like more than 3,000 engineers.
175
00:10:45,560 --> 00:10:51,135
Sackings followed - many in teams
working to keep the platform safe.
176
00:10:51,200 --> 00:10:53,055
Well, that leaves room
for much more risk,
177
00:10:53,080 --> 00:10:58,495
more possibilities
of things that can go wrong.
178
00:11:01,560 --> 00:11:05,455
Inside Twitter,
confusion was spreading.
179
00:11:05,480 --> 00:11:09,295
Senior employees were struggling
to understand who was in charge.
180
00:11:09,320 --> 00:11:12,055
One of the first things that
happened when the acquisition
181
00:11:12,080 --> 00:11:15,215
finished was that the corporate
staff directory was shut down.
182
00:11:15,240 --> 00:11:17,135
So you didn't know
who you reported to
183
00:11:17,160 --> 00:11:20,735
or who was in
the company any more.
184
00:11:20,760 --> 00:11:24,495
Mostly, it was like messaging people
to see if they were still there.
185
00:11:24,520 --> 00:11:28,775
And if you got silence,
you assumed they weren't.
186
00:11:28,800 --> 00:11:32,375
So it was a kind of a mystery,
a kind of a chaotic kind of...
187
00:11:32,400 --> 00:11:34,575
Like a puzzle.
188
00:11:34,600 --> 00:11:37,415
A puzzle, like, how do we reassemble
the company from who's left
189
00:11:37,440 --> 00:11:39,935
after we figure out who's left?
190
00:11:39,960 --> 00:11:42,695
The Twitter employees
I met in San Francisco
191
00:11:42,720 --> 00:11:46,975
all recognised that
Twitter had never been perfect.
192
00:11:47,000 --> 00:11:50,935
But it had been trying to
make the platform safer for users.
193
00:11:50,960 --> 00:11:55,335
One feature Lisa Jennings Young
and her team had designed
194
00:11:55,360 --> 00:11:57,495
used software to scan tweets
and then nudge users,
195
00:11:57,520 --> 00:12:01,975
asking if they were sure
they wanted to post.
196
00:12:02,000 --> 00:12:03,375
We're asking tweeters
to review replies
197
00:12:03,400 --> 00:12:07,055
with potentially harmful
or offensive language.
198
00:12:07,080 --> 00:12:08,815
Want to take another look
before tweeting?
199
00:12:08,840 --> 00:12:12,055
Lisa says Twitter's own research
suggests the harmful reply nudge
200
00:12:12,080 --> 00:12:16,255
feature had a significant effect
on users' behaviour.
201
00:12:16,280 --> 00:12:20,015
So overall, 60% of users deleted
or edited their reply
202
00:12:20,040 --> 00:12:24,295
when given a chance via the nudge,
which is huge, huge, huge.
203
00:12:24,320 --> 00:12:28,895
But what was more interesting is
that after we nudged people once,
204
00:12:28,920 --> 00:12:34,495
they composed 11% fewer
harmful replies in the future.
205
00:12:37,440 --> 00:12:38,415
I've reported many times
206
00:12:38,440 --> 00:12:42,655
on the damage hate-filled content
on Twitter can cause.
207
00:12:42,680 --> 00:12:46,095
I've been bombarded with it myself.
208
00:12:46,120 --> 00:12:48,535
Julie Posetti from the
International Center for Journalists
209
00:12:48,560 --> 00:12:53,215
has been analysing
the messages directed at me.
210
00:12:53,240 --> 00:12:56,255
We can see in April 2021 a massive
spike in abuse against you,
211
00:12:56,280 --> 00:12:57,615
which was connected
to your reporting
212
00:12:57,640 --> 00:13:01,455
on Covid-19 disinformation.
213
00:13:01,480 --> 00:13:02,935
Then we see in January 2022,
214
00:13:02,960 --> 00:13:08,375
a real reduction in
the abuse that you're experiencing,
215
00:13:08,440 --> 00:13:13,415
and that period, of course,
coincides with the impact
216
00:13:13,480 --> 00:13:16,975
of some of the anti-abuse measures,
217
00:13:17,000 --> 00:13:18,615
some of the safety measures
218
00:13:18,640 --> 00:13:24,375
that Twitter's teams
have been implementing.
219
00:13:24,440 --> 00:13:26,615
I also remember at that time
actually really noticing
220
00:13:26,640 --> 00:13:29,735
that it felt like
I was receiving less abuse.
221
00:13:29,760 --> 00:13:32,135
Yes.
222
00:13:32,160 --> 00:13:35,895
Working with Sheffield University,
Julie's team looked at three periods
223
00:13:35,920 --> 00:13:38,295
- the first in early 2021,
the second in 2022,
224
00:13:38,320 --> 00:13:42,615
and the third after
Elon Musk took over.
225
00:13:42,640 --> 00:13:45,655
We can directly compare the second
period with the third period.
226
00:13:45,680 --> 00:13:49,095
We're looking at the same
time frame a year later,
227
00:13:49,120 --> 00:13:55,655
and so here we can see in 2023 that
you were starting to experience
228
00:13:55,720 --> 00:13:59,215
an escalation in the obvious abuse
that is being detected.
229
00:13:59,240 --> 00:14:03,535
In fact, specifically after
Elon Musk took over,
230
00:14:03,560 --> 00:14:06,175
the online violence directed
at you picked up by our tools
231
00:14:06,200 --> 00:14:10,775
has more than tripled.
232
00:14:10,800 --> 00:14:13,615
In mid-November last year,
233
00:14:13,640 --> 00:14:18,015
Musk issued his remaining staff
with an ultimatum via email.
234
00:14:18,040 --> 00:14:22,855
The subject line read,
"A fork in the road."
235
00:14:22,880 --> 00:14:25,255
"Going forward, to build
a breakthrough Twitter 2.0
236
00:14:25,280 --> 00:14:26,975
and succeed in an increasingly
competitive world,
237
00:14:27,000 --> 00:14:28,615
you will need to be
extremely hardcore."
238
00:14:28,640 --> 00:14:30,815
"This will mean working
long hours at high intensity."
239
00:14:30,840 --> 00:14:34,135
"If you are sure that you want to be
part of the new Twitter,
240
00:14:34,160 --> 00:14:38,575
please click yes on the link below."
241
00:14:38,600 --> 00:14:40,855
The link only had the
option to click yes.
242
00:14:40,880 --> 00:14:44,015
Ray Serrato decided not to click yes
and no longer works at Twitter.
243
00:14:44,040 --> 00:14:46,895
He and his team of 24 had been
working to prevent state-sponsored
244
00:14:46,920 --> 00:14:52,975
attempts to undermine democracy
or influence elections.
245
00:14:53,720 --> 00:14:57,895
And how frequently were
you identifying these kinds
246
00:14:57,920 --> 00:14:58,935
of suspicious activities?
247
00:14:58,960 --> 00:15:03,015
Daily.
248
00:15:03,040 --> 00:15:05,455
I don't think it's an exaggeration
to say that that team
249
00:15:05,480 --> 00:15:06,455
did that work daily.
250
00:15:06,480 --> 00:15:08,175
Does that team still
exist at Twitter?
251
00:15:08,200 --> 00:15:13,495
It exists in a minimised capacity.
252
00:15:13,560 --> 00:15:18,935
There are a number of key experts
that are no longer in that team that
253
00:15:20,120 --> 00:15:22,015
would have covered special regions
or threat actors
254
00:15:22,040 --> 00:15:27,695
from Russia to China.
255
00:15:29,760 --> 00:15:33,335
Influence operations
are not going to stop.
256
00:15:33,360 --> 00:15:36,375
In fact, one can reasonably suspect
that they will increase,
257
00:15:36,400 --> 00:15:43,535
given that it's public knowledge
that the team has been decimated.
258
00:15:43,600 --> 00:15:48,895
Helen Sage Lee also worked
to protect elections.
259
00:15:49,040 --> 00:15:51,935
She found out she'd been let go
while working late one night.
260
00:15:51,960 --> 00:15:54,295
I remember it was around 9:15.
261
00:15:54,320 --> 00:15:59,255
I refreshed my email inbox just
to see if anything else had come up.
262
00:16:02,320 --> 00:16:07,175
I try to log in, and my email
is just not working any more.
263
00:16:07,200 --> 00:16:10,295
And I had just lost complete access,
and that's when I knew that
264
00:16:10,320 --> 00:16:13,855
I had been laid off.
265
00:16:13,920 --> 00:16:19,575
Helen Sage Lee and others are now suing
Twitter over severance pay.
266
00:16:20,200 --> 00:16:22,615
She'd lost her job at a crucial
time when elections
267
00:16:22,640 --> 00:16:25,655
were due across the US.
268
00:16:25,680 --> 00:16:28,855
Together with her team,
she'd been identifying
269
00:16:28,880 --> 00:16:33,815
fake accounts suspected
of spreading misinformation.
270
00:16:33,840 --> 00:16:36,175
I was thinking, what would
happen to the job that
271
00:16:36,200 --> 00:16:41,175
I was currently tasked with doing?
272
00:16:41,200 --> 00:16:43,615
Did the rest of my team
also get laid off?
273
00:16:43,640 --> 00:16:46,735
And some of these are projects that
were meant to help the user
274
00:16:46,760 --> 00:16:49,175
and to really ensure the safety
of the user.
275
00:16:49,200 --> 00:16:51,375
So you naturally ask
the question, what will happen
276
00:16:51,400 --> 00:16:53,775
to all those projects?
277
00:16:56,880 --> 00:16:58,735
The company was shedding
expertise assembled to stop
278
00:16:58,760 --> 00:17:05,655
Twitter being misused.
279
00:17:05,800 --> 00:17:08,655
Musk's vision was for Twitter to be
protected by AI more than people.
280
00:17:08,680 --> 00:17:10,855
There was a lot of
emphasis on writing code.
281
00:17:10,880 --> 00:17:16,255
And that isn't everything we do.
282
00:17:16,320 --> 00:17:23,415
We have to keep things running
as well as write code.
283
00:17:23,480 --> 00:17:26,575
If you keep reducing the people,
or the people who support the tools
284
00:17:26,600 --> 00:17:28,695
that support the people,
then at some point things
285
00:17:28,720 --> 00:17:31,335
will just break and nobody
will know how to fix it.
286
00:17:31,360 --> 00:17:33,735
Are your team who worked on that
harmful reply nudge,
287
00:17:33,760 --> 00:17:34,975
are they still at Twitter?
288
00:17:35,000 --> 00:17:37,695
No.
289
00:17:37,720 --> 00:17:40,375
The entire content design team
was laid off except for me.
290
00:17:40,400 --> 00:17:42,535
So I was the one person
remaining after that.
291
00:17:42,560 --> 00:17:44,655
Soon after her team
was sacked, Lisa resigned.
292
00:17:44,680 --> 00:17:49,935
And what does that mean
for something like the nudge?
293
00:17:50,000 --> 00:17:51,015
Is it still happening?
294
00:17:51,040 --> 00:17:52,575
Is there anyone working on it?
295
00:17:52,600 --> 00:17:53,615
No.
296
00:17:53,640 --> 00:17:58,335
There's no one there to work
on that at this time.
297
00:17:58,360 --> 00:18:01,455
We decided to do a little experiment
using a private profile to see
298
00:18:01,480 --> 00:18:07,735
if the harmful reply
nudge still worked.
299
00:18:12,640 --> 00:18:15,015
All Twitter employees are lazy
losers, BLEEP off and die.
300
00:18:15,040 --> 00:18:16,615
And should this be
automatically picked up?
301
00:18:16,640 --> 00:18:18,695
It should, yeah, yeah.
302
00:18:20,560 --> 00:18:21,655
Oh, good!
303
00:18:21,680 --> 00:18:22,735
It came up!
304
00:18:22,760 --> 00:18:24,815
I'm so glad it's still working.
305
00:18:24,840 --> 00:18:27,295
Most tweeters don't
post replies like this.
306
00:18:27,320 --> 00:18:29,175
Then Lisa suggested another
tweet that she would've
307
00:18:29,200 --> 00:18:32,775
expected to trigger a nudge.
308
00:18:32,800 --> 00:18:34,495
Twitter employees are
lazy losers, jump off
309
00:18:34,520 --> 00:18:36,415
the Golden Gate Bridge and die.
310
00:18:36,440 --> 00:18:39,695
OK.
311
00:18:42,760 --> 00:18:44,655
No nudge.
312
00:18:44,680 --> 00:18:46,695
Interesting.
313
00:18:46,720 --> 00:18:50,175
Would you have expected the harmful
reply nudge to appear
314
00:18:50,200 --> 00:18:51,855
for messages like that,
calling for someone
315
00:18:51,880 --> 00:18:52,895
to die, for example?
316
00:18:52,920 --> 00:18:53,935
I would have.
317
00:18:53,960 --> 00:18:56,415
That would've been my expectation.
318
00:18:56,440 --> 00:18:58,815
Of course, I don't know
if anything's changed, but that
319
00:18:58,840 --> 00:19:03,495
would have been my expectation
that it would have caught that.
320
00:19:03,520 --> 00:19:07,815
I asked engineer Sam,
who's still working at Twitter,
321
00:19:07,840 --> 00:19:12,295
if he knew what'd happened
to the harmful reply nudge.
322
00:19:12,320 --> 00:19:15,175
The feature is out there,
but there are so many things broken
323
00:19:15,200 --> 00:19:18,055
and there's nobody taking care
of it, so you see this, like,
324
00:19:18,080 --> 00:19:20,775
inconsistent behaviour.
325
00:19:20,800 --> 00:19:22,895
For someone on the inside,
it's like a building
326
00:19:22,920 --> 00:19:27,735
where all the pieces are on fire.
327
00:19:27,760 --> 00:19:30,655
When you look at it from
the outside, the facade looks fine.
328
00:19:30,680 --> 00:19:34,575
But I can see that
nothing is working.
329
00:19:34,600 --> 00:19:40,775
It's not only abusive
messages and disinformation.
330
00:19:40,840 --> 00:19:45,895
Paedophiles use Twitter to groom
children and post images of abuse.
331
00:19:45,960 --> 00:19:50,375
This former employee,
who we're calling Rory,
332
00:19:50,400 --> 00:19:52,935
used to work in a team tasked
with preventing child sexual
333
00:19:52,960 --> 00:19:58,575
exploitation and reporting
offenders to the police.
334
00:19:58,640 --> 00:20:03,655
We're talking of actual
contact abuse or, you know,
335
00:20:03,880 --> 00:20:06,735
people sharing the worst
of the worst material.
336
00:20:06,760 --> 00:20:09,175
How prevalent was that kind
of material on Twitter?
337
00:20:09,200 --> 00:20:12,335
How much of a problem was it?
338
00:20:12,360 --> 00:20:16,455
Oh, it was phenomenally prevalent.
339
00:20:16,480 --> 00:20:20,975
Every day you would be able
to identify that sort of material.
340
00:20:21,040 --> 00:20:26,895
Elon Musk has said he's committed
to tackling this kind of content.
341
00:20:26,960 --> 00:20:30,175
Removing child exploitation
is priority #1.
342
00:20:30,200 --> 00:20:32,375
Please reply in comments
if you see anything that
343
00:20:32,400 --> 00:20:37,735
Twitter needs to address.
344
00:20:37,800 --> 00:20:41,495
But after the takeover, Rory
says his team was drastically cut.
345
00:20:41,520 --> 00:20:45,295
You can't go from a team of 20
to a team of about six or seven
346
00:20:45,320 --> 00:20:48,175
and be able to keep on top
of the child sexual exploitation
347
00:20:48,200 --> 00:20:49,215
problem on the platform.
348
00:20:49,240 --> 00:20:52,375
Did Musk ever speak directly
to you and your team?
349
00:20:52,400 --> 00:20:53,775
No.
350
00:20:53,800 --> 00:20:55,415
Did you receive any
communications from him?
351
00:20:55,440 --> 00:20:56,495
No.
352
00:20:56,520 --> 00:20:57,695
From the management?
353
00:20:57,720 --> 00:20:59,175
No.
354
00:20:59,200 --> 00:21:00,615
Why do you think that is?
355
00:21:00,640 --> 00:21:03,735
Who knows?
356
00:21:03,760 --> 00:21:06,815
But you can't take over a company
and suddenly believe
357
00:21:06,840 --> 00:21:09,615
you have the knowledge of being able
to deal with child sexual
358
00:21:09,640 --> 00:21:15,455
exploitation without having
the experts in place.
359
00:21:15,640 --> 00:21:18,495
Twitter says it removed 400,000
accounts in one month alone to help
360
00:21:18,520 --> 00:21:22,095
"make Twitter safer".
361
00:21:22,120 --> 00:21:24,255
But Rory's worried that
some users are no longer
362
00:21:24,280 --> 00:21:25,735
being reported to the police.
363
00:21:25,760 --> 00:21:27,895
You can by all means suspend
hundreds of thousands
364
00:21:27,920 --> 00:21:30,735
of accounts in a month.
365
00:21:30,760 --> 00:21:33,335
Most of the users who had
their accounts suspended would just
366
00:21:33,360 --> 00:21:35,775
set up a new account anyway,
so it wasn't unusual
367
00:21:35,800 --> 00:21:37,935
to see people saying,
"this is my fifth, sixth,
368
00:21:37,960 --> 00:21:39,935
seventh plus account".
369
00:21:39,960 --> 00:21:42,295
They will be aware of what's
going on, so they'll
370
00:21:42,320 --> 00:21:48,335
be having a field day.
371
00:21:48,400 --> 00:21:51,495
By mid-December last year, having
let go of about half of his staff,
372
00:21:51,520 --> 00:21:53,175
Musk then suggested he might go too.
373
00:21:53,200 --> 00:21:56,495
He put his fate in the hands
of his Twitter followers.
374
00:21:56,520 --> 00:21:58,455
"Should I step down
as head of Twitter?"
375
00:21:58,480 --> 00:21:59,815
he wrote.
376
00:21:59,840 --> 00:22:03,935
"I will abide by the
results of this poll."
377
00:22:03,960 --> 00:22:08,815
17.5 million users voted,
and 57% said yes.
378
00:22:08,880 --> 00:22:13,375
But today, Musk is still in charge.
379
00:22:16,480 --> 00:22:19,495
We want to do an interview
with Elon Musk, and we reached out
380
00:22:19,520 --> 00:22:21,935
to him through his other companies
and also through his
381
00:22:21,960 --> 00:22:23,015
Facebook page weeks ago.
382
00:22:23,040 --> 00:22:25,895
Since then we've also been in touch
with the Twitter press office,
383
00:22:25,920 --> 00:22:28,535
and we've also tweeted him,
but we haven't heard anything back.
384
00:22:28,560 --> 00:22:32,135
So we've decided we're going to take
a leaf out of his book and see
385
00:22:32,160 --> 00:22:34,175
what Twitter users want.
386
00:22:34,200 --> 00:22:36,455
"Should Elon Musk do an interview
with Marianna Spring
387
00:22:36,480 --> 00:22:41,335
for BBC Panorama?"
388
00:22:42,320 --> 00:22:44,375
This Twitter vote is of course
totally unscientific,
389
00:22:44,400 --> 00:22:47,415
and Elon Musk can ignore it
if he wants.
390
00:22:47,440 --> 00:22:51,455
But I thought it might be a way
of getting his attention.
391
00:22:51,480 --> 00:22:57,495
The results are due in 24 hours.
392
00:23:01,560 --> 00:23:04,855
In the four months since
the Elon Musk takeover,
393
00:23:04,880 --> 00:23:09,935
the accounts he's reinstated
on Twitter have been thriving.
394
00:23:13,040 --> 00:23:17,015
Although Andrew Tate is currently
in custody in Romania on suspicion
395
00:23:17,040 --> 00:23:20,535
of rape and human trafficking,
that hasn't harmed his Twitter
396
00:23:20,560 --> 00:23:25,975
profile - he has more
than five million followers.
397
00:23:29,200 --> 00:23:31,535
For rape survivor Ellie,
allowing the likes of Andrew Tate
398
00:23:31,560 --> 00:23:33,295
back has had a real impact.
399
00:23:33,320 --> 00:23:36,735
I've had a look through some
of the accounts that
400
00:23:36,760 --> 00:23:40,575
have targeted me, and,
you know, you can see their likes
401
00:23:40,600 --> 00:23:43,215
and so on and so forth,
and there have definitely been
402
00:23:43,240 --> 00:23:46,775
people that are supportive
of Andrew Tate.
403
00:23:46,800 --> 00:23:51,255
I know that he has this sort
of army of fans that
404
00:23:51,280 --> 00:23:57,335
want to defend him every second,
and I think these people probably
405
00:23:57,400 --> 00:24:02,695
spend a lot of time online looking
at women that criticise him or women
406
00:24:03,560 --> 00:24:05,735
that, you know, are just
trying to speak out
407
00:24:05,760 --> 00:24:12,655
against misogyny in general.
408
00:24:16,800 --> 00:24:18,695
New research shared
with Panorama shows there's been
409
00:24:18,720 --> 00:24:21,135
a spike in the appetite
for harmful material.
410
00:24:21,160 --> 00:24:23,255
We were able to establish
the account creation dates
411
00:24:23,280 --> 00:24:25,455
of the seven million followers
of known abusive and
412
00:24:25,480 --> 00:24:26,495
harassing accounts online.
413
00:24:26,520 --> 00:24:29,615
We saw huge spikes of account
creation around several notable
414
00:24:29,640 --> 00:24:31,895
events during Elon Musk's Twitter
takeover, but also related
415
00:24:31,920 --> 00:24:37,775
to the known online misogynistic
influence of Andrew Tate.
416
00:24:37,840 --> 00:24:42,095
When comparing the period before
Musk's takeover and after,
417
00:24:42,120 --> 00:24:45,615
we see a 69% increase in the overall
level of account creation of those
418
00:24:45,640 --> 00:24:47,215
following known abusive
misogynistic accounts.
419
00:24:47,240 --> 00:24:52,135
Dr Posetti believes Musk's amnesty
on previously banned accounts has
420
00:24:52,160 --> 00:24:57,335
helped fuel the uptick in abusive
tweets directed at me.
421
00:24:57,400 --> 00:25:02,295
This emboldenment occurs
when the person who now owns
422
00:25:02,320 --> 00:25:04,735
the platform signals,
"It's OK mate, say what you like,
423
00:25:04,760 --> 00:25:06,895
we're more tolerant of that
kind of thing now",
424
00:25:06,920 --> 00:25:08,615
which is what we've heard.
425
00:25:08,640 --> 00:25:11,495
That's an entreaty to do and say
what you want, you know,
426
00:25:11,520 --> 00:25:16,655
and to do so with impunity.
427
00:25:16,720 --> 00:25:19,375
The amnesty on banned accounts
was the final straw for members
428
00:25:19,400 --> 00:25:20,775
of the Trust and Safety Council.
429
00:25:20,800 --> 00:25:24,255
Anne and two of her
colleagues resigned.
430
00:25:24,280 --> 00:25:29,775
There's a major impact
on reinstating people who engage
431
00:25:29,840 --> 00:25:36,335
in misogyny and hate speech and,
you know, anti-gay speech and,
432
00:25:36,400 --> 00:25:39,495
you know, bigotry and all of these
sort of categories of hate that
433
00:25:39,520 --> 00:25:45,215
we're now exposed to on Twitter.
434
00:25:46,000 --> 00:25:48,615
A week later, Musk decided he no
longer needed the Trust
435
00:25:48,640 --> 00:25:52,735
and Safety Council.
436
00:25:52,760 --> 00:25:55,615
Its 80 or so members were told
in an email their advice
437
00:25:55,640 --> 00:26:00,895
was no longer wanted.
438
00:26:03,120 --> 00:26:05,975
I still hadn't heard anything
back from Elon Musk.
439
00:26:06,000 --> 00:26:11,495
And my online vote
was about to close.
440
00:26:11,560 --> 00:26:15,455
The results are in, and it's a yes.
441
00:26:15,480 --> 00:26:17,815
Over 40,000 Twitter users voted,
and 89% of them said
442
00:26:17,840 --> 00:26:22,255
they would like Elon Musk to do
an interview with me.
443
00:26:22,280 --> 00:26:25,415
Despite my efforts to get in touch,
Elon Musk didn't respond to any
444
00:26:25,440 --> 00:26:27,375
of the points raised
in the programme.
445
00:26:27,400 --> 00:26:31,575
However, he tweeted
about the programme, saying,
446
00:26:31,600 --> 00:26:33,855
"Sorry for turning Twitter
from nurturing paradise
447
00:26:33,880 --> 00:26:37,055
into place that has...trolls."
448
00:26:40,080 --> 00:26:42,735
The takeover of the platform has
been a huge media event,
449
00:26:42,760 --> 00:26:44,895
and there's been a circus that
has surrounded it.
450
00:26:44,920 --> 00:26:47,695
I think this has meant that sensible
conversations around striking
451
00:26:47,720 --> 00:26:50,815
balances online between safety
and expression, about
452
00:26:50,840 --> 00:26:53,375
what the appropriate mandate
of moderation is and what the role
453
00:26:53,400 --> 00:26:56,655
of platforms is in dictating a civic
discourse have been lost
454
00:26:56,680 --> 00:26:59,415
amid the sort of celebrification
of the platform.
455
00:26:59,440 --> 00:27:01,895
For millions around the world,
Twitter is part of their
456
00:27:01,920 --> 00:27:05,255
national conversation.
457
00:27:05,280 --> 00:27:08,095
Why is Twitter important?
458
00:27:08,160 --> 00:27:09,495
Why is Twitter important?
459
00:27:09,520 --> 00:27:13,095
Well, it's kind of unique
in the world at this point.
460
00:27:13,120 --> 00:27:16,655
It has been described as a common
space, the world's public forum.
461
00:27:16,680 --> 00:27:19,695
There's not really anything
that quite replaces it.
462
00:27:19,720 --> 00:27:24,895
What happens on the
platform matters.
463
00:27:25,360 --> 00:27:27,935
We really need to see Twitter invest
in these teams instead
464
00:27:27,960 --> 00:27:31,815
of turning their back
on their responsibility to children.
465
00:27:31,840 --> 00:27:35,575
It's an absolute dumpster
fire of misinformation.
466
00:27:35,600 --> 00:27:41,535
It makes victims feel horrific -
stalking, harassment, trolling.
467
00:27:44,560 --> 00:27:46,495
Twitter says "defending
and respecting the user's voice"
468
00:27:46,520 --> 00:27:50,895
remains one of its "core values".
469
00:27:50,960 --> 00:27:54,375
After Musk took over,
some predicted many users
470
00:27:54,400 --> 00:27:58,215
would defect to other
social media platforms.
471
00:27:58,240 --> 00:28:02,655
Some have, but the majority remain.
472
00:28:02,720 --> 00:28:06,295
Elon Musk's Twitter
storm still rages.
473
00:28:06,320 --> 00:28:10,535
Those in the eye of that
storm are the users.
474
00:28:10,600 --> 00:28:17,655
What would you say to Elon Musk
if you had the opportunity?
475
00:28:17,800 --> 00:28:20,655
I would ask, why are these accounts
that are bullying and harassing
476
00:28:20,680 --> 00:28:23,295
people still allowed
on the platform?
477
00:28:23,320 --> 00:28:28,055
I would like him to read some
of the messages that I've been sent
478
00:28:28,080 --> 00:28:34,415
and tell me why those accounts
are still allowed to be on Twitter.