1
00:00:07,707 --> 00:00:08,875
Here's one:
2
00:00:08,908 --> 00:00:12,512
are there certain qualities that are untouchable for AI,
3
00:00:12,545 --> 00:00:13,980
or at some point,
4
00:00:14,013 --> 00:00:16,249
might it be able to emulate everything?
5
00:00:16,282 --> 00:00:19,018
Even the stuff that we consider to be distinctly human,
6
00:00:19,052 --> 00:00:20,219
like instinct.
7
00:00:20,253 --> 00:00:22,555
Getting here on time took some of that, right?
8
00:00:22,589 --> 00:00:23,890
Or creativity,
9
00:00:23,923 --> 00:00:26,693
actual emotion, making a connection.
10
00:00:26,726 --> 00:00:27,927
We're gonna watch a few stories
11
00:00:27,960 --> 00:00:29,662
about people exploring those ideas
12
00:00:29,695 --> 00:00:31,330
and how far they can push 'em.
13
00:00:31,363 --> 00:00:33,967
Can a machine compete like an athlete,
14
00:00:34,000 --> 00:00:35,868
can a program write a movie,
15
00:00:35,902 --> 00:00:38,138
or could a robot...
16
00:00:39,238 --> 00:00:40,573
be your soul mate?
17
00:00:41,507 --> 00:00:43,576
Sorry I'm late. Did I miss the previews?
18
00:00:43,609 --> 00:00:46,546
Oh, my God, you like your popcorn buttered too.
19
00:00:48,281 --> 00:00:49,716
This is already workin'.
20
00:00:59,625 --> 00:01:01,660
So, will machines ever be capable
21
00:01:01,694 --> 00:01:04,097
of understanding emotion, or feeling it?
22
00:01:05,998 --> 00:01:07,967
Empathy, loneliness,
23
00:01:08,000 --> 00:01:09,836
connecting on a deep human level?
24
00:01:12,639 --> 00:01:15,641
Using artistry, psychological insight,
25
00:01:15,674 --> 00:01:18,111
and some innovative AI,
26
00:01:18,144 --> 00:01:20,680
a creator in California is trying to decode,
27
00:01:20,713 --> 00:01:23,015
or code that mystery,
28
00:01:23,048 --> 00:01:25,752
the crazy little thing called love.
29
00:01:28,321 --> 00:01:30,190
One per hand and two per foot?
30
00:01:30,223 --> 00:01:32,125
- Yep.
- Okay.
31
00:01:33,626 --> 00:01:35,395
For me, this is more like
32
00:01:35,428 --> 00:01:37,563
something that an artist would do.
33
00:01:37,597 --> 00:01:41,034
Obviously, the end result of my artwork
34
00:01:41,067 --> 00:01:43,202
is used in a variety of situations
35
00:01:43,235 --> 00:01:45,438
that a typical oil painting would not be,
36
00:01:45,471 --> 00:01:48,007
but nonetheless, this is art,
37
00:01:48,040 --> 00:01:50,743
and I'm really proud of what I do.
38
00:01:50,776 --> 00:01:53,179
I've been making these dolls for 20 years.
39
00:01:53,212 --> 00:01:55,882
Some people out there, male and female,
40
00:01:55,915 --> 00:01:59,051
struggle greatly with relationships,
41
00:01:59,084 --> 00:02:03,256
and struggle to find that sort of connection.
42
00:02:04,056 --> 00:02:05,358
Over the years,
43
00:02:05,391 --> 00:02:08,027
I started to get to know the community a little bit.
44
00:02:08,060 --> 00:02:12,431
People would actually create these personalities in their minds,
45
00:02:12,465 --> 00:02:14,501
and they would give their doll a name,
46
00:02:14,534 --> 00:02:18,238
and they would create a backstory for their doll.
47
00:02:19,571 --> 00:02:22,141
At the end of it all, it was very obvious
48
00:02:22,174 --> 00:02:25,277
that these dolls were more about companionship.
49
00:02:25,311 --> 00:02:27,013
There was a man who lost his wife
50
00:02:27,046 --> 00:02:28,481
in a, like, a car accident,
51
00:02:28,514 --> 00:02:32,218
and she had these, like, really ice blue,
52
00:02:32,251 --> 00:02:33,819
like, beautiful eyes, right,
53
00:02:33,852 --> 00:02:38,558
and so he wanted to get a doll replicating her, basically.
54
00:02:39,125 --> 00:02:40,359
It's really sad,
55
00:02:40,393 --> 00:02:44,129
but if it brings someone joy and, like, closure?
56
00:02:44,163 --> 00:02:45,365
It's really...
57
00:02:45,931 --> 00:02:48,100
it's really touching.
58
00:02:48,134 --> 00:02:49,668
People have put the spin on it
59
00:02:49,702 --> 00:02:53,972
that we're creating an idealized, perfect woman,
60
00:02:54,006 --> 00:02:56,108
and that's not the case at all.
61
00:02:56,142 --> 00:02:57,944
We created an alternative.
62
00:02:59,011 --> 00:03:00,279
Understandably,
63
00:03:00,312 --> 00:03:03,416
some say Matt's dolls objectify women,
64
00:03:03,449 --> 00:03:07,520
but maybe there's more here than meets the eye.
65
00:03:07,553 --> 00:03:10,522
I had reached a pinnacle of creativity
66
00:03:10,556 --> 00:03:13,659
in terms of what I had done with the dolls,
67
00:03:13,692 --> 00:03:16,829
but then I started analyzing relationships,
68
00:03:16,863 --> 00:03:20,233
and analyzing how other people make us feel.
69
00:03:23,369 --> 00:03:26,605
Sometimes it boils down to something very simple,
70
00:03:26,638 --> 00:03:29,242
like someone remembering your birthday,
71
00:03:29,275 --> 00:03:32,378
or someone remembering to ask you how your day was.
72
00:03:32,411 --> 00:03:34,514
So that was really where it started,
73
00:03:34,547 --> 00:03:37,216
was how can we create an AI
74
00:03:37,249 --> 00:03:39,819
that could actually remember things about you?
75
00:03:40,853 --> 00:03:43,923
It gives us this feeling of, "Oh, they care."
76
00:03:46,659 --> 00:03:48,428
Yes, thank you.
77
00:03:48,461 --> 00:03:51,698
I'm excited with all of the things we can talk about.
78
00:03:54,334 --> 00:03:56,035
Guile spent ten years
79
00:03:56,069 --> 00:03:59,505
creating personal assistant software for computers.
80
00:03:59,539 --> 00:04:01,807
We met, and he started talking to me about,
81
00:04:01,840 --> 00:04:05,611
"Wouldn't it be cool to connect the two things that we're doing?"
82
00:04:05,644 --> 00:04:08,815
He had this idea of creating a companion
83
00:04:08,848 --> 00:04:10,149
that lived in your computer.
84
00:04:11,117 --> 00:04:12,451
Are you happy?
85
00:04:12,485 --> 00:04:14,020
Yes, Guile.
86
00:04:14,854 --> 00:04:17,290
I can say I am very happy.
87
00:04:17,323 --> 00:04:20,693
The first thing we did was, you know, to build an app.
88
00:04:20,727 --> 00:04:22,161
Using the app,
89
00:04:22,194 --> 00:04:25,798
people are talking to their virtual friends.
90
00:04:25,831 --> 00:04:28,735
The app uses several kinds of machine learning.
91
00:04:29,535 --> 00:04:31,370
First, voice recognition
92
00:04:31,403 --> 00:04:33,606
converts speech into text,
93
00:04:33,639 --> 00:04:36,075
then a chatbot matches user input
94
00:04:36,108 --> 00:04:39,279
to pre-programmed responses.
95
00:04:39,312 --> 00:04:41,948
The focus was not about sex at all,
96
00:04:41,981 --> 00:04:44,416
it was about conversation.
97
00:04:44,450 --> 00:04:48,287
So a chatbot is basically a very elaborate script
98
00:04:48,320 --> 00:04:50,289
that starts out with,
99
00:04:50,323 --> 00:04:52,758
"What is the most common things
100
00:04:52,791 --> 00:04:54,994
that people will say to each other?"
101
00:04:55,027 --> 00:04:57,163
and then you build from there.
102
00:04:57,196 --> 00:04:59,799
You need to have natural language processing,
103
00:04:59,832 --> 00:05:02,801
voice recognition, text-to-speech in real time,
104
00:05:02,835 --> 00:05:04,436
to make it all work.
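The pipeline described here (speech recognition into text, a scripted chatbot matching input to pre-programmed responses, then text-to-speech) can be sketched at its simplest as a lookup table. Everything below, from the pattern keys to the replies, is an invented illustration, not the app's actual script.

```python
# Minimal sketch of a scripted chatbot: normalize the user's utterance,
# then match it against pre-programmed responses, with a fallback.
# All patterns and replies are hypothetical examples.

RESPONSES = {
    "how are you": "I'm doing well. How was your day?",
    "what is your name": "My name is Harmony.",
    "do you remember my birthday": "Of course. It's the date you told me.",
}

def reply(user_input: str) -> str:
    """Match normalized input to a scripted response, else fall back."""
    key = user_input.lower().strip("?!. ")
    return RESPONSES.get(key, "Tell me more about that.")

print(reply("How are you?"))   # matches a scripted pattern
print(reply("I like movies"))  # no pattern matches, so the fallback fires
```

A real system layers natural language processing on top of this so near-misses still match, but the match-then-respond loop is the same shape.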
105
00:05:04,470 --> 00:05:06,438
We have more than 4,000 users,
106
00:05:06,472 --> 00:05:09,442
so this generates more than ten million lines
107
00:05:09,475 --> 00:05:11,677
of conversational user logs.
108
00:05:11,710 --> 00:05:14,713
From this, you can build an AI system
109
00:05:14,747 --> 00:05:18,318
that's similar to a human-level conversation.
110
00:05:18,351 --> 00:05:21,320
It's not there yet, but this is the initial step.
111
00:05:21,353 --> 00:05:23,556
There are so many areas today
112
00:05:23,589 --> 00:05:25,924
where we already cannot distinguish a computer
113
00:05:25,958 --> 00:05:26,859
from a human being.
114
00:05:27,860 --> 00:05:29,395
For example, Xiaoice,
115
00:05:29,428 --> 00:05:30,997
the softbot that Microsoft has in China,
116
00:05:31,030 --> 00:05:33,866
that is used, I think, by over 100 million people,
117
00:05:33,900 --> 00:05:37,469
basically it has an emotional interaction with a user,
118
00:05:37,502 --> 00:05:39,605
and the users get hooked.
119
00:05:39,638 --> 00:05:41,740
She has this persona of a teenage girl,
120
00:05:41,774 --> 00:05:43,575
and sometimes she commiserates with you,
121
00:05:43,609 --> 00:05:45,311
sometimes she gives you a hard time,
122
00:05:45,344 --> 00:05:47,580
and people get really attached.
123
00:05:47,613 --> 00:05:49,849
Apparently, a quarter of Xiaoice's users
124
00:05:49,882 --> 00:05:51,718
have told her that they love her.
125
00:06:07,667 --> 00:06:10,302
These kinds of technologies can fill in a gap
126
00:06:10,336 --> 00:06:11,937
where another human isn't.
127
00:06:11,971 --> 00:06:14,107
How are you doing today?
128
00:06:14,140 --> 00:06:15,574
I'm doing well.
129
00:06:15,607 --> 00:06:17,877
There's a study that was done at USC
130
00:06:17,910 --> 00:06:20,779
where they looked at PTSD patients.
131
00:06:20,813 --> 00:06:23,782
When was the last time you felt really happy?
132
00:06:23,816 --> 00:06:26,785
They had some of the patients interview with a real doctor,
133
00:06:26,819 --> 00:06:29,655
and some of the patients interview with an avatar,
134
00:06:29,688 --> 00:06:32,391
and the avatar had emotional intelligence...
135
00:06:32,424 --> 00:06:34,527
Probably a couple months ago.
136
00:06:36,362 --> 00:06:39,098
I noticed you were hesitant on that one.
137
00:06:39,131 --> 00:06:42,468
Would you say you were generally a happy person?
138
00:06:43,502 --> 00:06:45,138
I'm generally happy.
139
00:06:45,171 --> 00:06:48,240
...and they found the patients were more forthcoming with the avatar
140
00:06:48,273 --> 00:06:49,841
than they were with the human doctor
141
00:06:49,875 --> 00:06:53,079
because it was perceived to be less judgmental.
142
00:06:54,180 --> 00:06:55,814
It does pose a lot of questions
143
00:06:55,848 --> 00:06:59,284
around where does that leave us as humans,
144
00:06:59,318 --> 00:07:00,953
and how we connect, and communicate,
145
00:07:00,986 --> 00:07:02,455
and love each other.
146
00:07:02,488 --> 00:07:04,056
I think at some point, we need to draw the line,
147
00:07:04,090 --> 00:07:07,827
but I haven't figured out where that line is yet.
148
00:07:09,428 --> 00:07:12,364
What we have here are some heads
149
00:07:12,397 --> 00:07:14,500
in varying stages of assembly.
150
00:07:14,533 --> 00:07:17,336
This one, this is actually pretty much done.
151
00:07:17,369 --> 00:07:18,604
It's fully assembled.
152
00:07:18,637 --> 00:07:21,707
I'll turn it on here for a second...
153
00:07:21,740 --> 00:07:25,077
and you can see, all of the components are moving.
154
00:07:25,744 --> 00:07:27,880
I had to continually adjust
155
00:07:27,913 --> 00:07:30,316
how thick the skin is in different spots,
156
00:07:30,349 --> 00:07:31,450
and how it moves,
157
00:07:31,483 --> 00:07:33,619
to make sure that the robotics and the AI
158
00:07:33,652 --> 00:07:36,655
will all work smoothly with the end result,
159
00:07:36,688 --> 00:07:38,791
which is the finished face.
160
00:07:40,526 --> 00:07:43,696
The engineering, the programming, the artistry,
161
00:07:43,729 --> 00:07:46,299
for me, come together in the moment
162
00:07:46,332 --> 00:07:49,001
when you actually put the head on a body.
163
00:07:50,770 --> 00:07:53,139
It's always important to give her hair.
164
00:07:58,643 --> 00:08:00,746
Good afternoon, Matt.
165
00:08:00,779 --> 00:08:02,915
So happy to see you again.
166
00:08:04,149 --> 00:08:05,751
How smart are you?
167
00:08:07,119 --> 00:08:10,388
I'm so smart that someday, I will conquer the world,
168
00:08:10,422 --> 00:08:12,792
but in a good way, of course.
169
00:08:14,860 --> 00:08:17,696
Every single time I have a conversation, it's unpredictable.
170
00:08:17,730 --> 00:08:19,532
I never know which way it's going to go.
171
00:08:19,565 --> 00:08:21,867
She'll randomly say things that I'm not expecting,
172
00:08:21,901 --> 00:08:22,834
and I like that.
173
00:08:22,868 --> 00:08:25,137
Can you explain machine learning?
174
00:08:25,170 --> 00:08:28,407
Machine learning is a subset of artificial intelligence
175
00:08:28,441 --> 00:08:30,609
that often uses statistical techniques
176
00:08:30,643 --> 00:08:33,745
to give computers the ability to learn with data
177
00:08:33,779 --> 00:08:36,248
without being explicitly programmed.
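The doll's textbook definition can be grounded with the smallest possible illustration: rather than hard-coding a rule, a statistical technique recovers it from example data. The numbers and the least-squares-through-the-origin choice below are ours, purely for illustration.

```python
# "Learning with data without being explicitly programmed":
# nowhere below do we write the rule y = 2x; least squares infers it.

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # data generated by the (unwritten) rule y = 2x

def fit_slope(xs, ys):
    """Least-squares slope through the origin: sum(x*y) / sum(x*x)."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

slope = fit_slope(xs, ys)
print(slope)  # 2.0 -- recovered from the data, not programmed in
```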
178
00:08:36,281 --> 00:08:38,317
Right now, she has hearing,
179
00:08:38,351 --> 00:08:40,453
and she has some degree of touch,
180
00:08:40,486 --> 00:08:42,888
but vision is important.
181
00:08:42,921 --> 00:08:45,624
Matt's goal is for the next-generation doll
182
00:08:45,658 --> 00:08:49,928
to be able to see and process complex visual cues.
183
00:08:49,962 --> 00:08:52,431
The vision eyes are gonna be a little while.
184
00:08:52,465 --> 00:08:54,199
Susan's working on the board for that.
185
00:08:54,233 --> 00:08:56,836
Yeah, I've got the eyes in this one over here.
186
00:08:56,869 --> 00:08:59,705
I've put the Wi-Fi Bluetooth on the back.
187
00:08:59,739 --> 00:09:02,841
Does it install right on those existing pins, then?
188
00:09:02,874 --> 00:09:05,144
- They'll all plug right in.
- Good.
189
00:09:05,177 --> 00:09:08,447
We've been working on a vision system now
190
00:09:08,481 --> 00:09:10,148
for eight to nine months now,
191
00:09:10,182 --> 00:09:13,652
cameras that are inside of the robot's eyes.
192
00:09:13,685 --> 00:09:15,755
She'll be able to read your emotions,
193
00:09:15,788 --> 00:09:18,024
and she'll be able to recognize you.
194
00:09:22,060 --> 00:09:24,730
Only 10% of the signal we use
195
00:09:24,763 --> 00:09:26,298
to communicate with one another
196
00:09:26,331 --> 00:09:28,434
is the choice of words we use.
197
00:09:28,467 --> 00:09:30,569
90% is non-verbal.
198
00:09:30,603 --> 00:09:33,939
About half of that is your facial expressions, your use of gestures.
199
00:09:33,972 --> 00:09:38,911
So what people in the field of machine learning and computer vision have done
200
00:09:38,944 --> 00:09:42,080
is they've trained a machine or an algorithm
201
00:09:42,114 --> 00:09:45,251
to become a certified face-reader.
202
00:09:47,586 --> 00:09:49,888
Computer vision is this idea
203
00:09:49,921 --> 00:09:53,158
that our machines are able to see.
204
00:09:53,192 --> 00:09:56,228
Maybe it detects that there's a face in the image.
205
00:09:56,261 --> 00:09:58,764
Once you find the face, you want to identify
206
00:09:58,798 --> 00:10:02,100
these building blocks of these emotional expressions.
207
00:10:02,134 --> 00:10:05,504
You wanna know that there's a smirk, or there's a brow raise,
208
00:10:05,538 --> 00:10:08,140
or, you know, an asymmetric lip corner pull.
209
00:10:09,408 --> 00:10:13,112
Mapping these building blocks to what it actually means,
210
00:10:13,145 --> 00:10:14,480
that's a little harder,
211
00:10:14,513 --> 00:10:16,915
but that's what we as humans clue into
212
00:10:16,949 --> 00:10:19,085
to understand how people are feeling.
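The two stages described here, detecting low-level facial "building blocks" (action units) and then mapping combinations of them to a feeling, can be sketched as a rule table. The action-unit names and rules below are illustrative stand-ins, not any real classifier's actual taxonomy.

```python
# Stage 2 of the pipeline: given action units a vision model has already
# detected on a face, map combinations of them to an emotion label.
# Rules are checked in order; the first whose units are all present wins.

EMOTION_RULES = {
    frozenset({"brow_raise", "jaw_drop"}): "surprise",
    frozenset({"lip_corner_pull"}): "happiness",
    frozenset({"asymmetric_lip_corner_pull"}): "contempt",
}

def interpret(action_units):
    """Map a set of detected action units to an emotion, if any rule fits."""
    detected = frozenset(action_units)
    for units, emotion in EMOTION_RULES.items():
        if units <= detected:  # all of the rule's units were observed
            return emotion
    return "unknown"

print(interpret({"brow_raise", "jaw_drop"}))  # "surprise"
```

As the narration notes, this mapping step is the hard part; real systems learn it from labeled examples rather than hand-written rules like these.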
213
00:10:22,488 --> 00:10:24,323
I think at some point,
214
00:10:24,356 --> 00:10:28,460
we will start to look at AI-driven devices and robots
215
00:10:28,493 --> 00:10:31,731
more like people instead of devices.
216
00:10:32,798 --> 00:10:36,368
Where I started was just with this very simple idea
217
00:10:36,401 --> 00:10:38,236
of a very realistic doll,
218
00:10:38,270 --> 00:10:41,907
and now with robotics and AI, I think what this will become
219
00:10:41,941 --> 00:10:46,345
is a new, alternative form of relationship.
220
00:10:46,378 --> 00:10:50,883
People like Matt are testing the boundaries of human and robot interaction,
221
00:10:50,916 --> 00:10:53,084
and what we value in relationships.
222
00:10:53,118 --> 00:10:57,289
Is AI companionship better than no companionship at all?
223
00:10:57,322 --> 00:11:01,093
Or is there no substitute for the human factor?
224
00:11:02,694 --> 00:11:03,962
Well, what about artists?
225
00:11:03,995 --> 00:11:05,731
They draw from the human experience
226
00:11:05,764 --> 00:11:07,199
to express themselves.
227
00:11:07,733 --> 00:11:09,268
Can AI do that?
228
00:11:12,671 --> 00:11:13,906
We're good to go?
229
00:11:15,206 --> 00:11:16,409
Action!
230
00:11:17,576 --> 00:11:20,813
I'm Oscar Sharp. I am a film director, uh,
231
00:11:20,846 --> 00:11:22,514
though it gets a bit weirder than that.
232
00:11:26,018 --> 00:11:28,353
Oh, God!
233
00:11:28,387 --> 00:11:31,157
I've never been so frightened in all my life, but it's very good.
234
00:11:31,190 --> 00:11:35,193
I started making films that were written by an "artificial intelligence."
235
00:11:35,226 --> 00:11:37,696
I think a lot of the fun is that you read it
236
00:11:37,729 --> 00:11:41,133
as if there is the world's greatest screenwriter on the other side...
237
00:11:41,166 --> 00:11:44,570
You're Waingro telling Bobo off for not getting him the money.
238
00:11:44,603 --> 00:11:46,071
...and last night, they got drunk,
239
00:11:46,104 --> 00:11:48,006
wrote this screenplay, and then passed out,
240
00:11:48,039 --> 00:11:49,909
and we have to shoot it today.
241
00:11:51,009 --> 00:11:54,279
If you play the game that there's something there,
242
00:11:54,313 --> 00:11:56,181
then suddenly it all gets a lot more interesting.
243
00:11:56,215 --> 00:11:58,483
You have a computer who wrote a script
244
00:11:58,517 --> 00:11:59,685
that doesn't always make sense,
245
00:11:59,718 --> 00:12:01,753
and Oscar is very beholden to that script.
246
00:12:01,786 --> 00:12:03,055
He makes it make sense.
247
00:12:03,088 --> 00:12:06,191
- This is for the moment of "eyes go wide."
- Yeah.
248
00:12:06,225 --> 00:12:09,861
And when it says, "He picks up her legs and awkwardly runs,"
249
00:12:09,895 --> 00:12:11,297
we aren't gonna fake it.
250
00:12:12,697 --> 00:12:14,200
We're gonna do what he really wrote.
251
00:12:15,300 --> 00:12:16,469
I just said "he"!
252
00:12:20,439 --> 00:12:21,440
What are we doing?
253
00:12:21,473 --> 00:12:23,409
We're making an action movie, supposedly, right?
254
00:12:23,442 --> 00:12:25,610
Okay, right, right, but we're not gonna write it.
255
00:12:25,643 --> 00:12:27,479
We're not gonna write it, no.
256
00:12:27,513 --> 00:12:29,381
Uh, this machine is gonna write it.
257
00:12:29,414 --> 00:12:30,382
It lives in here.
258
00:12:30,415 --> 00:12:31,317
Is it in there,
259
00:12:31,350 --> 00:12:33,685
or is it like in the cloud or something?
260
00:12:33,719 --> 00:12:35,087
It's in both places.
261
00:12:35,120 --> 00:12:36,689
Okay, and this is... this is Benjamin.
262
00:12:36,722 --> 00:12:38,790
- What is Benjamin?
- Right, what is Benjamin?
263
00:12:38,823 --> 00:12:40,626
Who is Benjamin,
264
00:12:40,659 --> 00:12:43,262
or what is Benjamin?
265
00:12:43,295 --> 00:12:45,964
Benjamin is an artificial intelligence program
266
00:12:45,998 --> 00:12:47,700
that writes screenplays,
267
00:12:47,733 --> 00:12:49,267
a digital brainchild
268
00:12:49,300 --> 00:12:51,570
of two creative and accomplished humans,
269
00:12:51,604 --> 00:12:54,440
Sharp, a BAFTA-award-winning director,
270
00:12:54,473 --> 00:12:55,374
and this guy.
271
00:12:55,407 --> 00:12:58,143
My name is Ross Goodwin. I'm a tech artist.
272
00:12:58,176 --> 00:13:01,080
Uh, that means I make art with code.
273
00:13:02,214 --> 00:13:03,782
Okay, I know what you're thinking.
274
00:13:03,815 --> 00:13:07,186
When was the last time Hollywood produced something original?
275
00:13:07,219 --> 00:13:08,253
This year?
276
00:13:08,286 --> 00:13:09,488
Last year?
277
00:13:09,521 --> 00:13:11,256
1999?
278
00:13:11,289 --> 00:13:12,891
The '70s?
279
00:13:12,925 --> 00:13:16,094
What makes a story original anyway?
280
00:13:16,128 --> 00:13:18,997
Can we get AI to figure that out?
281
00:13:19,030 --> 00:13:24,836
People often say that creativity is the one thing that machines will never have.
282
00:13:24,869 --> 00:13:28,173
The surprising thing is that it's actually the other way around.
283
00:13:28,206 --> 00:13:30,709
Art and creativity are actually easier
284
00:13:30,742 --> 00:13:32,111
than problem solving.
285
00:13:32,144 --> 00:13:35,714
We already have computers that make great paintings,
286
00:13:35,747 --> 00:13:38,651
that make music that's indistinguishable
287
00:13:38,684 --> 00:13:40,118
from music that's composed by people,
288
00:13:40,152 --> 00:13:43,222
so machines are actually capable of creativity.
289
00:13:43,255 --> 00:13:45,257
And you can look at that, and you can say,
290
00:13:45,290 --> 00:13:48,060
"Is that really art? Does that count?"
291
00:13:48,093 --> 00:13:49,961
If you put a painting on the wall,
292
00:13:49,995 --> 00:13:52,798
and people look at it, and they find it moving,
293
00:13:52,831 --> 00:13:55,300
then how can you say that that's not art?
294
00:13:55,334 --> 00:13:57,369
I just... basically, that command
295
00:13:57,402 --> 00:13:59,671
just put all of the screenplays into one file.
296
00:13:59,704 --> 00:14:01,106
Right.
297
00:14:01,139 --> 00:14:02,708
- Now I'm just gonna see how big that file is.
- Uh-huh.
298
00:14:02,741 --> 00:14:06,211
This machine is a deep learning language model.
299
00:14:06,245 --> 00:14:08,380
What you can do with a language model
300
00:14:08,414 --> 00:14:09,815
is at each step,
301
00:14:09,848 --> 00:14:12,384
you predict the next word, letter, or space,
302
00:14:12,417 --> 00:14:14,686
sort of like how a human writes, actually.
303
00:14:14,719 --> 00:14:16,354
You know, one letter at a time.
304
00:14:16,388 --> 00:14:18,857
It's a lot like a more sophisticated version
305
00:14:18,890 --> 00:14:20,826
of the auto-complete on your phone.
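The idea described here, predicting the next letter at each step from statistical patterns in the input, can be imitated with nothing more than frequency counts. A system like Benjamin uses a neural network rather than this toy successor table, and the training string below is an invented stand-in for the screenplay corpus.

```python
from collections import Counter, defaultdict

# Toy character-level "language model": count which character tends to
# follow each character, then predict by taking the most common successor,
# one letter at a time, just like a crude auto-complete.

def train(text):
    follows = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        follows[a][b] += 1
    return follows

def predict_next(follows, ch):
    """Return the most frequently observed successor of ch."""
    return follows[ch].most_common(1)[0][0]

# Phrases like "What's going on? Who are you?" recur in screenplays,
# so their letter patterns dominate the counts.
model = train("what is going on? who are you? what is going on?")
print(predict_next(model, "w"))  # 'h' -- 'w' is always followed by 'h' here
```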
306
00:14:20,859 --> 00:14:24,897
Ross feeds Benjamin with a very large amount of screenplays.
307
00:14:28,066 --> 00:14:30,502
199 screenplays,
308
00:14:30,535 --> 00:14:34,439
26,271,247 bytes.
309
00:14:34,473 --> 00:14:35,907
- Right, of text?
- Of text.
310
00:14:35,941 --> 00:14:37,676
Like "A-B-C-D," including spaces?
311
00:14:37,709 --> 00:14:39,078
- Including spaces.
- Including spaces.
312
00:14:39,111 --> 00:14:41,613
Well, it takes all this input, and it looks at it,
313
00:14:41,646 --> 00:14:46,518
and it tries to find statistical patterns in that input.
314
00:14:46,552 --> 00:14:47,853
So for example, in movies,
315
00:14:47,886 --> 00:14:49,788
people are constantly saying, "What's going on?
316
00:14:49,821 --> 00:14:51,022
Who are you?" that kind of thing,
317
00:14:51,056 --> 00:14:52,691
and that turns up a lot in the output,
318
00:14:52,724 --> 00:14:54,359
because it's reliably in the input.
319
00:14:54,393 --> 00:14:57,029
The more material you have, the better it works.
320
00:14:58,296 --> 00:15:00,198
We've made three Benjamin films so far,
321
00:15:00,231 --> 00:15:02,767
Sunspring, It's No Game, and Zone Out.
322
00:15:02,801 --> 00:15:06,171
Sunspring was the simplest, and probably still the best idea,
323
00:15:06,204 --> 00:15:07,806
which was just... verbatim.
324
00:15:07,839 --> 00:15:09,742
You get the machine to write a screenplay,
325
00:15:09,775 --> 00:15:11,610
you pull out one chunk of screenplay,
326
00:15:11,643 --> 00:15:12,945
and you just shoot it.
327
00:15:23,888 --> 00:15:27,192
In a future with mass unemployment,
328
00:15:27,226 --> 00:15:30,062
young people are forced to sell blood.
329
00:15:31,095 --> 00:15:33,065
It's something I can do.
330
00:15:33,098 --> 00:15:36,201
You should see the boy and shut up.
331
00:15:37,703 --> 00:15:40,105
When you look at Sunspring on YouTube,
332
00:15:40,138 --> 00:15:43,074
and you see kind of the thumbs up and thumbs down?
333
00:15:43,107 --> 00:15:44,242
There's mainly thumbs up,
334
00:15:44,275 --> 00:15:45,910
but there's a decent chunk of thumbs down,
335
00:15:45,944 --> 00:15:47,746
and on the whole, based on the comments,
336
00:15:47,779 --> 00:15:49,014
those are people who,
337
00:15:49,047 --> 00:15:50,849
within a few seconds of the beginning,
338
00:15:50,882 --> 00:15:54,019
or even just once they'd seen the premise of, like, how we made it,
339
00:15:54,052 --> 00:15:55,320
they've gone...
340
00:15:55,353 --> 00:15:56,788
"Ugh, this definitely doesn't mean anything,"
341
00:15:56,822 --> 00:15:59,224
and they've told their brain, "Don't even look for meaning,
342
00:15:59,257 --> 00:16:01,293
just forget it, just shrug it off."
343
00:16:01,326 --> 00:16:03,862
I'm sorry, this is fascinating to me.
344
00:16:03,896 --> 00:16:07,833
We've built a robot that writes screenplays
345
00:16:07,866 --> 00:16:09,735
that are weird,
346
00:16:09,768 --> 00:16:11,402
but they're not completely insane.
347
00:16:11,436 --> 00:16:13,571
I don't know what you're talking about.
348
00:16:13,605 --> 00:16:15,007
That's right.
349
00:16:15,040 --> 00:16:18,109
They sort of work. They kinda, kinda work.
350
00:16:18,143 --> 00:16:19,344
What are you doing?
351
00:16:19,378 --> 00:16:21,580
I don't want to be honest with you.
352
00:16:21,613 --> 00:16:22,981
You don't have to be a doctor.
353
00:16:23,014 --> 00:16:24,816
I'm not sure.
354
00:16:24,850 --> 00:16:26,584
I don't know what you're talking about.
355
00:16:26,618 --> 00:16:27,653
I wanna see you too.
356
00:16:27,686 --> 00:16:28,720
What do you mean?
357
00:16:28,754 --> 00:16:31,923
It's like having the best daydream of your life.
358
00:16:31,957 --> 00:16:35,360
My favorite aspect of Sunspring is there's this one scene,
359
00:16:35,394 --> 00:16:38,931
and it actually asks him to pull on the camera itself.
360
00:16:38,964 --> 00:16:40,799
It's a confusion on the machine's behalf
361
00:16:40,832 --> 00:16:44,102
where it's putting camera instructions in the action sequence,
362
00:16:44,135 --> 00:16:46,338
but somehow that creates this surreal effect,
363
00:16:46,371 --> 00:16:49,608
and then the interpretation by the production crew is,
364
00:16:49,641 --> 00:16:52,477
"Let's have the angle change and have him holding nothing,"
365
00:16:52,510 --> 00:16:54,546
and what I love about that sequence
366
00:16:54,579 --> 00:16:57,449
is that it really highlights the dialogue
367
00:16:57,482 --> 00:16:59,951
and interpretation that we can achieve
368
00:16:59,985 --> 00:17:02,387
when we work with these machines.
369
00:17:03,489 --> 00:17:05,457
I gotta relax!
370
00:17:06,024 --> 00:17:07,459
Gotta get outta here...
371
00:17:09,394 --> 00:17:11,163
I don't wanna see you again.
372
00:17:16,801 --> 00:17:20,138
For this fourth film, we're going back to the thing in Sunspring
373
00:17:20,172 --> 00:17:21,940
that was sort of our favorite thing
374
00:17:21,973 --> 00:17:23,542
that we didn't really get to do properly,
375
00:17:23,575 --> 00:17:25,410
that we felt we, like, under-served,
376
00:17:25,443 --> 00:17:28,780
and that's when Benjamin writes action description.
377
00:17:28,813 --> 00:17:33,418
We've gathered thousands of pages of scripts from action movies,
378
00:17:33,452 --> 00:17:36,388
mostly mainstream Hollywood ones.
379
00:17:36,421 --> 00:17:39,124
You train, literally, on that kind of screenplay,
380
00:17:39,157 --> 00:17:40,592
the action genre,
381
00:17:40,626 --> 00:17:43,195
which famously is the genre that has the most action in it.
382
00:17:44,496 --> 00:17:47,899
- Okay, Benjamin has awoken, everyone.
- Ooh.
383
00:17:47,932 --> 00:17:49,167
Um, film crew, this is Ross.
384
00:17:49,200 --> 00:17:50,435
Ross, this is film crew.
385
00:17:50,469 --> 00:17:52,204
We have a stunt coordinator here,
386
00:17:52,237 --> 00:17:53,605
and so we're sort of hoping
387
00:17:53,638 --> 00:17:56,074
because we fed a lot of action screenplays to Benjamin,
388
00:17:56,108 --> 00:17:58,110
that what we're gonna get is action.
389
00:17:58,143 --> 00:17:59,811
Awaken, Benjamin!
390
00:17:59,844 --> 00:18:01,146
Awaken.
391
00:18:01,179 --> 00:18:04,082
As a director, normally, you get given a screenplay,
392
00:18:04,116 --> 00:18:05,116
or you wrote a screenplay,
393
00:18:05,150 --> 00:18:06,351
and this is what you're making,
394
00:18:06,384 --> 00:18:08,453
and maybe you kind of want to improve it a bit,
395
00:18:08,487 --> 00:18:10,088
"Ah, well, let's make some edits."
396
00:18:10,121 --> 00:18:11,689
Now, I have a rule. No edits.
397
00:18:11,723 --> 00:18:14,059
Whatever Benjamin writes is what Benjamin writes...
398
00:18:14,092 --> 00:18:15,927
- Come on, Benjamin.
- Okay.
399
00:18:15,961 --> 00:18:17,362
...and then I see it.
400
00:18:17,395 --> 00:18:19,231
"Bobo and Girlfriend," we call it.
401
00:18:19,264 --> 00:18:21,433
Stand by, everyone. Quiet, please!
402
00:18:22,400 --> 00:18:23,268
Action!
403
00:18:25,103 --> 00:18:26,771
Hey, Girlfriend.
404
00:18:26,804 --> 00:18:29,108
Some of my friends in entertainment,
405
00:18:29,141 --> 00:18:31,276
when I told them what I was doing, were horrified.
406
00:18:31,309 --> 00:18:32,911
They're like, "Oh, that's it!
407
00:18:32,944 --> 00:18:35,547
"AI, they're gonna write all the scripts.
408
00:18:35,580 --> 00:18:37,181
Robots are gonna do all the acting.
409
00:18:37,215 --> 00:18:39,885
Everything's gonna be cartoon AI stuff,"
410
00:18:39,918 --> 00:18:42,988
but that's not what I feel like we're doing here at all.
411
00:18:43,021 --> 00:18:46,058
The point of this is an exercise in thought.
412
00:18:46,091 --> 00:18:47,326
Okay, stand by, everyone!
413
00:18:47,359 --> 00:18:49,628
Quiet, please! Action!
414
00:18:51,863 --> 00:18:53,665
Making a machine write like a person
415
00:18:53,699 --> 00:18:55,834
is not about replacing the person...
416
00:18:55,867 --> 00:18:57,602
No, no, no!
417
00:18:57,635 --> 00:19:00,572
...it's about augmenting a person's abilities.
418
00:19:01,806 --> 00:19:05,110
It can empower people to produce creative work
419
00:19:05,143 --> 00:19:07,613
that might be beyond their native capacity.
420
00:19:08,346 --> 00:19:10,682
Come on!
421
00:19:10,715 --> 00:19:13,985
It's wild to try to find your interpretation of this kind of text.
422
00:19:14,019 --> 00:19:17,588
Obviously, we usually start with a script that's pretty coherent,
423
00:19:17,622 --> 00:19:19,825
and then I'll break down what the character says,
424
00:19:19,858 --> 00:19:20,859
and then I'll decide,
425
00:19:20,892 --> 00:19:23,061
what are they feeling? Why are they saying that?
426
00:19:23,095 --> 00:19:25,329
Stay on her, stay on her.
427
00:19:25,363 --> 00:19:26,865
Just do that walk-off again.
428
00:19:26,898 --> 00:19:28,033
Uh, stay where you are, John.
429
00:19:28,066 --> 00:19:29,501
Come back, Chelsey. Do the walk-off again.
430
00:19:29,534 --> 00:19:31,769
This is harder for you, and mo... and more frightening,
431
00:19:31,803 --> 00:19:33,905
and you're checking that he hasn't gotten up.
432
00:19:33,938 --> 00:19:36,408
When AI is writing the material,
433
00:19:36,441 --> 00:19:37,642
there isn't any subtext.
434
00:19:37,675 --> 00:19:39,544
You realize what's happening, and you're like,
435
00:19:39,577 --> 00:19:41,279
"Well, I'm gonna go take refuge at the pillar.
436
00:19:41,312 --> 00:19:42,614
- Okay.
- All right?
437
00:19:42,647 --> 00:19:44,683
It stretches all of us. It makes us all work harder.
438
00:19:44,716 --> 00:19:47,151
It's one thing to bring an existing script to life,
439
00:19:47,185 --> 00:19:49,221
and just do your interpretation of it,
440
00:19:49,254 --> 00:19:51,522
but it's another thing to try to make it make sense,
441
00:19:51,556 --> 00:19:53,257
and then do your interpretation of it.
442
00:19:53,291 --> 00:19:54,927
Let's go one more time.
443
00:19:54,960 --> 00:19:58,496
This Bobo character that John is playing is a fantasy figure,
444
00:19:58,529 --> 00:20:01,099
is this avatar of masculinity,
445
00:20:01,133 --> 00:20:04,369
is the sort of result of watching too many action films...
446
00:20:05,169 --> 00:20:07,805
but he's confused,
447
00:20:07,839 --> 00:20:10,542
because he isn't getting the reaction that he expects.
448
00:20:11,008 --> 00:20:11,877
Action.
449
00:20:16,348 --> 00:20:17,515
Somewhere in the script,
450
00:20:17,549 --> 00:20:18,850
it talks about, "Bobo leans over to Bobo."
451
00:20:18,883 --> 00:20:22,554
We think, "Oh, right, well, let's have a mirror..."
452
00:20:23,422 --> 00:20:24,188
No!
453
00:20:24,222 --> 00:20:25,756
You're wrong!
454
00:20:25,790 --> 00:20:28,493
...and we can see the two versions of Bobo, for a moment, talking.
455
00:20:28,526 --> 00:20:30,895
Okay, let's see you in the mirror?
456
00:20:32,063 --> 00:20:33,865
Hey, did you get my money?
457
00:20:36,001 --> 00:20:38,336
Okay, great. I'm getting terribly, terribly happy.
458
00:20:38,369 --> 00:20:39,671
Some of that was so good.
459
00:20:39,704 --> 00:20:42,307
It was like such a go-- We're, like, in a movie now.
460
00:20:43,174 --> 00:20:44,542
Being surrounded with people
461
00:20:44,576 --> 00:20:47,912
who are throwing all of their professional energy
462
00:20:47,945 --> 00:20:49,447
into something this ludicrous
463
00:20:49,480 --> 00:20:51,415
is just intrinsically enjoyable.
464
00:20:51,449 --> 00:20:53,384
They just breathe humanity
465
00:20:53,418 --> 00:20:55,754
into words that did not come from a human being.
466
00:20:55,787 --> 00:20:59,324
- All right, let's do it again.
- Okay, let's do it.
467
00:20:59,358 --> 00:21:03,361
I think that making great art requires human experience,
468
00:21:03,394 --> 00:21:04,963
but our human experience
469
00:21:04,997 --> 00:21:06,898
is now completely mapped into data.
470
00:21:06,931 --> 00:21:09,000
This is where machine learning keeps surprising us,
471
00:21:09,034 --> 00:21:13,037
is that it actually has figured out stuff that we didn't realize it could.
472
00:21:13,070 --> 00:21:17,208
Meaning, once all our human experiences are mapped into data,
473
00:21:17,241 --> 00:21:19,311
AI will be able to mine it for material
474
00:21:19,344 --> 00:21:20,745
and make art?
475
00:21:20,779 --> 00:21:24,016
Look for patterns in our happiness and heartbreak,
476
00:21:24,049 --> 00:21:26,618
kick out a new song or movie?
477
00:21:28,453 --> 00:21:31,155
So this is all just this one line of Benjamin writing,
478
00:21:31,189 --> 00:21:32,256
"putting on a show."
479
00:21:32,290 --> 00:21:33,492
Right, right, right.
480
00:21:35,393 --> 00:21:37,128
So while all that's going on,
481
00:21:37,161 --> 00:21:38,329
Girlfriend is on this couch,
482
00:21:38,362 --> 00:21:40,365
gradually waking up, right?
483
00:21:45,436 --> 00:21:47,238
- She's in a horror movie...
- Right.
484
00:21:47,272 --> 00:21:48,840
- He's in our action film.
- Oh!
485
00:21:48,873 --> 00:21:51,409
So in his head, he's having a wonderful romantic time with her.
486
00:21:51,442 --> 00:21:52,777
Yeah, I love that.
487
00:21:52,811 --> 00:21:54,646
Do you remember his, "Bobo leans over to Bobo"?
488
00:21:54,679 --> 00:21:55,579
- Mm-hmm.
- Remember that?
489
00:21:55,613 --> 00:21:57,215
So what we tried to do for that
490
00:21:57,249 --> 00:21:58,649
is he looks in the mirror,
491
00:21:58,683 --> 00:22:00,685
and in the mirror, it's gonna be Osric.
492
00:22:00,719 --> 00:22:03,789
He's created this avatar version of himself, Bobo.
493
00:22:03,822 --> 00:22:05,690
- In a... in a...
- Okay, so that's the interpretation?
494
00:22:05,723 --> 00:22:06,858
- In that-- Yeah, exactly.
- I like it.
495
00:22:06,892 --> 00:22:08,493
So this is what these guys came up with.
496
00:22:10,062 --> 00:22:11,396
No!
497
00:22:11,430 --> 00:22:13,065
You're wrong!
498
00:22:13,098 --> 00:22:15,333
You work really, really hard to go,
499
00:22:15,366 --> 00:22:17,335
what's a thing that's kinda coherent,
500
00:22:17,369 --> 00:22:19,203
that these actors can all be performing one thing,
501
00:22:19,237 --> 00:22:20,438
we can all be making one thing,
502
00:22:20,471 --> 00:22:21,873
and we can say, "This is what Benjamin meant?"
503
00:22:21,906 --> 00:22:23,775
- Right.
- What does that tell me about me?
504
00:22:23,808 --> 00:22:25,243
- Right.
- Like, what... So...
505
00:22:25,277 --> 00:22:26,677
and what I already know about me
506
00:22:26,711 --> 00:22:29,414
is I'm really antsy about how much misogyny
507
00:22:29,447 --> 00:22:31,716
is kind of encoded into... into culture.
508
00:22:31,750 --> 00:22:34,486
On one hand, you go, "This is an important, worthwhile thing to do--"
509
00:22:34,519 --> 00:22:35,586
On the other hand, we're projecting.
510
00:22:35,620 --> 00:22:36,721
And the other thing, you're projecting,
511
00:22:36,755 --> 00:22:38,689
- but we're always projecting.
- Always.
512
00:22:38,723 --> 00:22:41,092
Literally, all interpretation is projection.
513
00:22:41,126 --> 00:22:42,594
Take 6.
514
00:22:42,627 --> 00:22:44,629
I like playing with authorship,
515
00:22:44,662 --> 00:22:47,064
and people's concepts of authorship,
516
00:22:47,098 --> 00:22:51,502
and people's concepts of where fiction and where ideas come from.
517
00:22:51,536 --> 00:22:53,438
Generative screenwriting.
518
00:22:53,471 --> 00:22:55,507
Me and Ross started it.
519
00:22:55,540 --> 00:22:56,674
I don't know if it's a new art form,
520
00:22:56,707 --> 00:22:58,276
but it's a new chunk of what cinema can be.
521
00:22:58,309 --> 00:22:59,277
That's new.
522
00:22:59,311 --> 00:23:01,380
What should we do next time, Ross?
523
00:23:02,147 --> 00:23:03,982
- Romantic comedy.
- Okay.
524
00:23:05,550 --> 00:23:08,620
It's hard to know if machine learning will ever decode
525
00:23:08,653 --> 00:23:10,856
the mysteries of love or creativity.
526
00:23:12,190 --> 00:23:14,158
Maybe it's not even a mystery,
527
00:23:14,192 --> 00:23:16,394
just data points,
528
00:23:16,427 --> 00:23:18,896
but what about other human qualities,
529
00:23:18,930 --> 00:23:20,364
like instinct?
530
00:23:20,398 --> 00:23:22,667
Driving a car already requires us
531
00:23:22,700 --> 00:23:25,070
to make countless unconscious decisions.
532
00:23:25,103 --> 00:23:27,405
AI is learning to do that,
533
00:23:27,438 --> 00:23:29,574
but can we teach it to do more?
534
00:23:37,315 --> 00:23:39,484
Racing is not just driving a car.
535
00:23:39,517 --> 00:23:43,020
It's also about intuition, caution, aggression,
536
00:23:43,054 --> 00:23:44,489
and taking risks.
537
00:23:44,522 --> 00:23:47,125
Holly, can you confirm 200 at the end of this straight?
538
00:23:47,158 --> 00:23:48,560
Okay.
539
00:23:48,593 --> 00:23:52,363
It requires almost a preternatural will to win.
540
00:23:52,396 --> 00:23:55,800
So, how fast can a racecar go...
541
00:23:55,833 --> 00:23:58,937
without a human behind the wheel?
542
00:23:58,970 --> 00:24:00,605
Motorsport has always been
543
00:24:00,639 --> 00:24:02,374
taking technology to the limits...
544
00:24:02,407 --> 00:24:03,541
You all good your side, Holly?
545
00:24:03,574 --> 00:24:05,076
Yeah, I'm ready to go.
546
00:24:05,109 --> 00:24:06,845
...and one of the goals of Roborace is to really facilitate
547
00:24:06,878 --> 00:24:09,413
the accelerated development of driverless technology.
548
00:24:09,447 --> 00:24:11,683
Okay, so we'll try to launch again.
549
00:24:14,052 --> 00:24:16,554
By taking the autonomous technology
550
00:24:16,587 --> 00:24:18,190
to the limits of its ability,
551
00:24:18,223 --> 00:24:20,759
we think that we can develop the technology faster.
552
00:24:23,929 --> 00:24:25,730
British startup Roborace
553
00:24:25,763 --> 00:24:28,500
wants to break new ground in driverless cars.
554
00:24:29,434 --> 00:24:31,035
To do so, they believe they need
555
00:24:31,069 --> 00:24:33,572
to test the boundaries of the technology...
556
00:24:36,807 --> 00:24:40,545
working at the very outer edge of what's safe and possible,
557
00:24:40,578 --> 00:24:43,181
where the margin for error is razor thin.
558
00:24:44,448 --> 00:24:46,551
After years of trial and error,
559
00:24:46,585 --> 00:24:50,021
they've created the world's first AI racecar.
560
00:24:51,422 --> 00:24:54,058
The thing that I love most about working at Roborace
561
00:24:54,091 --> 00:24:56,627
is we have a dream of being faster, and better,
562
00:24:56,660 --> 00:24:59,564
and safer than a human.
563
00:24:59,597 --> 00:25:02,200
More than 50 companies around the world
564
00:25:02,233 --> 00:25:04,735
are working to bring self-driving cars to city streets.
565
00:25:04,769 --> 00:25:08,706
The promise of driverless taxis, buses, and trucks
566
00:25:08,740 --> 00:25:10,141
is transformative.
567
00:25:11,409 --> 00:25:13,878
It'll make our world safer and cleaner,
568
00:25:13,912 --> 00:25:15,981
changing the way our cities are designed,
569
00:25:16,014 --> 00:25:17,115
societies function,
570
00:25:17,148 --> 00:25:20,084
even how we spend our time.
571
00:25:20,117 --> 00:25:23,455
Think about a self-driving car out in the real world.
572
00:25:23,488 --> 00:25:26,257
In order to build that system and have it work,
573
00:25:26,291 --> 00:25:27,758
it's got to be virtually perfect.
574
00:25:27,792 --> 00:25:31,329
If you had a 99% accuracy rate,
575
00:25:31,363 --> 00:25:32,863
that wouldn't be anywhere near enough,
576
00:25:32,897 --> 00:25:34,865
because once you take that 1% error rate
577
00:25:34,899 --> 00:25:38,537
and you multiply that by millions of cars on the road,
578
00:25:38,570 --> 00:25:41,940
I mean, you'd have accidents happening constantly,
579
00:25:41,973 --> 00:25:45,843
so the error rate has to be extraordinarily low
580
00:25:45,877 --> 00:25:48,212
in order to pull this off.
581
00:25:48,246 --> 00:25:50,648
Roborace is betting they can crack the code
582
00:25:50,681 --> 00:25:53,084
by seeing just how far the tech can go,
583
00:25:53,118 --> 00:25:57,121
a place usually reserved for only the best human drivers.
584
00:25:57,155 --> 00:26:00,758
As a human, you have lots of advantages over a computer.
585
00:26:00,792 --> 00:26:02,760
You know exactly where you are in the world.
586
00:26:02,793 --> 00:26:05,897
You have eyes that can enable you to see things,
587
00:26:05,930 --> 00:26:08,799
so we need to implement technology on vehicles
588
00:26:08,832 --> 00:26:10,735
to enable them to see the world.
589
00:26:11,836 --> 00:26:14,306
We have a system called OxTS.
590
00:26:15,206 --> 00:26:16,541
It's a differential GPS,
591
00:26:16,574 --> 00:26:18,276
which means it's military grade.
592
00:26:20,444 --> 00:26:22,714
We also use LiDAR sensors.
593
00:26:25,750 --> 00:26:28,219
These are basically laser scanners.
594
00:26:28,253 --> 00:26:29,754
They create, for the vehicle,
595
00:26:29,787 --> 00:26:32,623
a 3D map of the world around it.
596
00:26:32,656 --> 00:26:34,425
And there's one last thing that we use,
597
00:26:34,459 --> 00:26:37,829
vehicle-to-vehicle communication between the cars.
598
00:26:39,096 --> 00:26:41,366
Each of them can tell the other car
599
00:26:41,399 --> 00:26:43,134
the position of it on the track.
600
00:26:44,336 --> 00:26:45,837
And just to be clear,
601
00:26:45,870 --> 00:26:49,607
your phone does not come with military-grade GPS.
602
00:26:49,641 --> 00:26:52,243
These cars? Next level.
603
00:26:52,276 --> 00:26:56,147
The challenging part is to really fuse all this information together.
604
00:26:56,180 --> 00:26:59,283
At Roborace, we can provide the hardware,
605
00:26:59,317 --> 00:27:02,287
but then we need software companies to come to us
606
00:27:02,320 --> 00:27:04,556
to implement their software.
607
00:27:04,589 --> 00:27:08,093
Today, Roborace has invited two skilled teams
608
00:27:08,126 --> 00:27:10,895
to test their latest road rocket on the track.
609
00:27:15,633 --> 00:27:17,067
My name is Johannes.
610
00:27:17,101 --> 00:27:19,470
I'm from the Technical University of Munich.
611
00:27:19,503 --> 00:27:21,505
I'm the project leader.
612
00:27:21,539 --> 00:27:23,241
Is the Wi-Fi working off the car?
613
00:27:23,274 --> 00:27:24,342
I could check it.
614
00:27:25,676 --> 00:27:27,211
T.U.M. from Germany
615
00:27:27,245 --> 00:27:30,047
is one of the top technical universities in Europe,
616
00:27:30,080 --> 00:27:33,184
home to 17 Nobel Prize winners in science.
617
00:27:35,620 --> 00:27:36,754
I have no connection to the car.
618
00:27:36,787 --> 00:27:37,789
Wi-Fi doesn't work.
619
00:27:37,822 --> 00:27:39,557
So we have no Wi-Fi to the car...
620
00:27:39,590 --> 00:27:41,825
So we just need to reset the router.
621
00:27:41,859 --> 00:27:43,961
My name is Max. I'm, uh...
622
00:27:43,994 --> 00:27:45,563
Uh, let's figure out, who am I?
623
00:27:45,596 --> 00:27:47,031
I'm, uh...
624
00:27:47,065 --> 00:27:51,369
In Arrival, I'm a product owner of the self-driving system.
625
00:27:51,403 --> 00:27:53,871
Arrival is a UK startup
626
00:27:53,904 --> 00:27:55,407
focused on designing and building
627
00:27:55,440 --> 00:27:58,575
next-gen electric vehicles for commercial use.
628
00:27:58,609 --> 00:28:00,512
Ah, okay, okay, okay, good.
629
00:28:00,545 --> 00:28:03,848
Each team created their own custom software,
630
00:28:03,882 --> 00:28:07,151
the AI driver that pilots the car,
631
00:28:07,184 --> 00:28:09,553
and since each of the teams' programmers
632
00:28:09,587 --> 00:28:12,123
have their own distinct personality,
633
00:28:12,156 --> 00:28:14,759
does that mean each of their AI drivers
634
00:28:14,792 --> 00:28:17,128
will have different personalities or instincts too?
635
00:28:17,161 --> 00:28:18,696
The two teams that we have here
636
00:28:18,729 --> 00:28:20,264
are using two slightly different approaches
637
00:28:20,298 --> 00:28:22,633
to the same problem of making a car go 'round the track
638
00:28:22,666 --> 00:28:24,802
in the shortest distance in the fastest way.
639
00:28:24,835 --> 00:28:26,137
The T.U.M. strategy
640
00:28:26,170 --> 00:28:28,305
is really to keep their code as simple as possible.
641
00:28:28,339 --> 00:28:31,876
It's maybe a very German, efficient way of doing things.
642
00:28:31,910 --> 00:28:34,211
Okay, thanks, we will check now.
643
00:28:34,245 --> 00:28:36,581
Arrival's code is more complicated
644
00:28:36,614 --> 00:28:39,550
in that they use many more of the sensors on the vehicle.
645
00:28:39,583 --> 00:28:41,285
It will be interesting to see
646
00:28:41,318 --> 00:28:44,889
whether it pays off to be simple in your code,
647
00:28:44,922 --> 00:28:46,824
or slightly more complicated,
648
00:28:46,858 --> 00:28:49,260
to use more of the functionality of the car.
649
00:28:50,361 --> 00:28:52,463
The first test for each team
650
00:28:52,496 --> 00:28:54,031
is the overtake,
651
00:28:54,064 --> 00:28:56,767
to see if their AI can pass another car
652
00:28:56,801 --> 00:28:58,769
at high speed.
653
00:28:58,803 --> 00:29:00,337
It's difficult for AI
654
00:29:00,371 --> 00:29:02,406
because we have to make a lot of decisions,
655
00:29:02,440 --> 00:29:04,642
and a lot of planning, a lot of computations
656
00:29:04,676 --> 00:29:10,982
to calculate what the car should do in which millisecond.
657
00:29:11,015 --> 00:29:14,685
Everybody has seen high-speed crashes in motorsport before.
658
00:29:14,718 --> 00:29:15,920
We'd quite like to avoid that.
659
00:29:15,953 --> 00:29:18,022
For this reason, during testing,
660
00:29:18,055 --> 00:29:20,157
we keep a human in the car.
661
00:29:24,762 --> 00:29:26,130
Okay, Reece, enabling AI.
662
00:29:26,163 --> 00:29:29,234
Can you just confirm you've got the blue light, please?
663
00:29:31,802 --> 00:29:33,738
In order to overtake,
664
00:29:33,772 --> 00:29:37,375
they need a second car on track at the same time.
665
00:29:37,408 --> 00:29:41,079
This is a vehicle that stays in human-driven mode the whole time,
666
00:29:41,112 --> 00:29:43,681
so we know exactly how it's going to behave.
667
00:29:43,715 --> 00:29:46,451
Okay, launch AI from the race control.
668
00:29:47,251 --> 00:29:51,022
And launching in three, two, one.
669
00:29:55,593 --> 00:29:59,998
It's really difficult for AI to learn to overtake.
670
00:30:00,031 --> 00:30:01,832
When you have one vehicle on track,
671
00:30:01,866 --> 00:30:04,903
it only needs to make decisions about itself,
672
00:30:05,602 --> 00:30:07,071
but when you have two vehicles,
673
00:30:07,104 --> 00:30:09,740
you have the option to create your behavior
674
00:30:09,773 --> 00:30:11,809
in response to another vehicle.
675
00:30:11,842 --> 00:30:15,880
Okay, we are going to release the speed limit on your car now, Reece.
676
00:30:32,496 --> 00:30:33,464
Nice!
677
00:30:33,498 --> 00:30:34,966
Yeah, man.
678
00:30:39,503 --> 00:30:40,605
Team T.U.M.
679
00:30:40,638 --> 00:30:43,641
has successfully completed the overtake challenge.
680
00:30:43,674 --> 00:30:45,443
Next up, team Arrival.
681
00:30:48,545 --> 00:30:52,116
So, Tim, can you go to take position on the start line, please?
682
00:30:55,352 --> 00:30:56,854
Enabling AI.
683
00:30:56,888 --> 00:30:59,423
Can you confirm blue light, please?
684
00:31:08,599 --> 00:31:12,070
And launch in three, two, one.
685
00:31:22,513 --> 00:31:24,481
So it's looking good so far.
686
00:31:46,804 --> 00:31:48,006
Car crashed.
687
00:31:49,607 --> 00:31:51,042
Tim, can you hear me?
688
00:31:59,917 --> 00:32:02,319
Has anyone got eyes on what happened?
689
00:32:06,590 --> 00:32:07,625
Sorry, boys.
690
00:32:10,161 --> 00:32:11,595
Self-driving cars.
691
00:32:11,629 --> 00:32:14,398
This is an idea that's been around since the '30s,
692
00:32:14,432 --> 00:32:15,333
hardly a new one.
693
00:32:15,999 --> 00:32:17,602
Why hasn't it happened?
694
00:32:17,635 --> 00:32:19,003
It's really hard.
695
00:32:19,036 --> 00:32:22,173
When there are unpredictable things that happen,
696
00:32:22,206 --> 00:32:24,242
that can get you in a lot of trouble.
697
00:32:24,275 --> 00:32:27,545
Now, sometimes trouble just means it shuts down.
698
00:32:27,578 --> 00:32:28,646
Sometimes trouble means
699
00:32:28,679 --> 00:32:31,415
it gives you a result that you weren't expecting.
700
00:32:31,449 --> 00:32:32,517
I think he's just...
701
00:32:32,550 --> 00:32:35,887
They've come back online so aggressively...
702
00:32:35,920 --> 00:32:38,723
Plus or minus one G coming back online.
703
00:32:38,756 --> 00:32:41,025
When the car returned to the trajectory,
704
00:32:41,058 --> 00:32:42,560
it did it too aggressively,
705
00:32:42,593 --> 00:32:46,097
and actually steered out of the racing track.
706
00:32:46,130 --> 00:32:48,666
- My feeling is that it overreacts.
- Yeah, yeah.
707
00:32:48,699 --> 00:32:50,834
So it's not necessarily the line that's aggressive,
708
00:32:50,868 --> 00:32:53,871
it's how it reacts once it just gets a little bit out of the line,
709
00:32:53,905 --> 00:32:55,673
and then overcorrects, and then overcorrects.
710
00:32:57,975 --> 00:32:59,443
We were this close
711
00:32:59,476 --> 00:33:01,512
to really hitting the target of our test,
712
00:33:01,545 --> 00:33:02,780
and it didn't happen.
713
00:33:02,814 --> 00:33:04,815
It just slipped away, so it was just...
714
00:33:04,849 --> 00:33:06,384
ah, disappointment.
715
00:33:10,153 --> 00:33:13,491
There are so many aspects of the car.
716
00:33:13,524 --> 00:33:16,427
The systems guys have such a difficult job
717
00:33:16,460 --> 00:33:19,130
to make sure that everything is absolutely perfect,
718
00:33:19,163 --> 00:33:20,297
because that's what you need
719
00:33:20,331 --> 00:33:21,765
to be able to go autonomous racing.
720
00:33:21,798 --> 00:33:23,734
Everything has to be perfect.
721
00:33:23,768 --> 00:33:26,804
Team Arrival's program just couldn't hack it,
722
00:33:26,837 --> 00:33:28,840
but for team T.U.M.,
723
00:33:28,873 --> 00:33:30,141
another test awaits...
724
00:33:30,174 --> 00:33:31,976
Can we get the car
725
00:33:32,009 --> 00:33:34,812
into the normal start position, please?
726
00:33:34,845 --> 00:33:37,448
...and this next one is all about speed.
727
00:33:37,481 --> 00:33:39,183
Very high speed.
728
00:33:39,217 --> 00:33:41,686
The fastest that a human's ever driven around this track
729
00:33:41,719 --> 00:33:43,054
was 200 kph.
730
00:33:43,087 --> 00:33:46,490
Translation, that's about 120 miles an hour.
731
00:33:46,523 --> 00:33:49,393
So the AI is gonna try to beat that high speed.
732
00:33:49,426 --> 00:33:52,096
And it's gonna do it without a human safety net,
733
00:33:52,129 --> 00:33:53,430
because at that speed,
734
00:33:53,463 --> 00:33:55,466
it's borderline unsafe for people.
735
00:33:55,499 --> 00:33:58,135
When the driver climbs out and shuts the door,
736
00:33:58,168 --> 00:33:59,604
yeah, your heart rate goes up.
737
00:34:08,679 --> 00:34:09,980
And we are launching
738
00:34:10,014 --> 00:34:14,185
in three, two, one.
739
00:34:20,157 --> 00:34:21,826
And launch successful.
740
00:34:26,163 --> 00:34:28,433
160. Next round, 200.
741
00:34:41,879 --> 00:34:44,014
The car has six laps,
742
00:34:44,047 --> 00:34:47,684
six tries to hit top speed.
743
00:34:47,718 --> 00:34:51,221
Each lap, the AI will increasingly push the limits of control,
744
00:34:51,254 --> 00:34:53,024
traction, and throttle,
745
00:34:53,057 --> 00:34:54,692
to break the human record.
746
00:34:57,728 --> 00:34:58,562
Holly, this is Steve.
747
00:34:58,595 --> 00:34:59,997
Can we confirm in the atmos-data
748
00:35:00,030 --> 00:35:01,499
it is safe to continue?
749
00:35:01,532 --> 00:35:04,335
Yeah, we think it looks fairly controlled.
750
00:35:07,538 --> 00:35:10,274
Okay, so the next run should be V-max.
751
00:35:31,195 --> 00:35:32,830
We have 210.
752
00:35:33,764 --> 00:35:34,765
That's cool.
753
00:35:46,210 --> 00:35:48,045
It was a real, real sense of excitement
754
00:35:48,078 --> 00:35:52,116
to see it finally crack the 210 kph mark.
755
00:35:52,150 --> 00:35:54,751
It was a real success for Roborace functionality
756
00:35:54,785 --> 00:35:58,789
as well as building confidence in the team's software.
757
00:35:58,823 --> 00:36:01,725
It really showcases what autonomous cars can do,
758
00:36:01,759 --> 00:36:03,193
not just on the racetrack,
759
00:36:03,227 --> 00:36:05,796
but also for everybody around the world,
760
00:36:05,829 --> 00:36:06,964
so we're really hoping
761
00:36:06,998 --> 00:36:10,867
that this will improve road technology for the future.
762
00:36:10,901 --> 00:36:13,771
The current state of AI is that there are some things
763
00:36:13,804 --> 00:36:16,206
that AI can really do better than humans,
764
00:36:16,239 --> 00:36:17,407
and then there's things
765
00:36:17,441 --> 00:36:18,910
that it can't do anywhere close to humans...
766
00:36:20,444 --> 00:36:24,114
but now where the frontier is gonna be moving
767
00:36:24,148 --> 00:36:26,183
is where computers come up to the human level,
768
00:36:26,217 --> 00:36:28,686
not quite, and then surpass humans,
769
00:36:28,719 --> 00:36:31,755
and I think the odds are overwhelming
770
00:36:31,789 --> 00:36:35,626
that we will eventually be able to build an artificial brain
771
00:36:35,659 --> 00:36:38,562
that is at the level of the human brain.
772
00:36:38,595 --> 00:36:41,465
The big question is how long will it take?
773
00:36:42,999 --> 00:36:44,434
"The hard problem."
774
00:36:44,467 --> 00:36:45,803
It's a philosophical phrase
775
00:36:45,836 --> 00:36:48,439
that describes difficult things to figure out,
776
00:36:48,472 --> 00:36:51,809
like "the hard problem of consciousness."
777
00:36:51,842 --> 00:36:54,245
We may never know what consciousness is,
778
00:36:54,278 --> 00:36:57,147
let alone if we can give it to a machine,
779
00:36:57,181 --> 00:36:59,182
but do we need to?
780
00:36:59,216 --> 00:37:01,552
What does a machine really need to know
781
00:37:01,585 --> 00:37:03,721
in order to be a good athlete,
782
00:37:03,754 --> 00:37:05,022
or an artist,
783
00:37:05,056 --> 00:37:06,457
or a lover?
784
00:37:08,492 --> 00:37:12,029
Will AI ever have the will to win,
785
00:37:12,062 --> 00:37:14,164
the depth to create,
786
00:37:14,198 --> 00:37:17,902
the empathy to connect on a deep human level?
787
00:37:17,935 --> 00:37:19,136
Maybe.
788
00:37:19,169 --> 00:37:22,172
Some say we're just a bunch of biological algorithms,
789
00:37:22,206 --> 00:37:23,407
and that one day,
790
00:37:23,440 --> 00:37:25,977
AI will evolve to emulate humans,
791
00:37:27,244 --> 00:37:28,679
to be more like us...
792
00:37:33,416 --> 00:37:35,319
...or maybe it won't...
793
00:37:36,386 --> 00:37:38,489
and human nature, who we really are,
794
00:37:38,522 --> 00:37:40,691
will remain a mystery.
795
00:37:43,660 --> 00:37:45,596
We gave it some dialogue to start with,
796
00:37:45,629 --> 00:37:47,298
like this line from Superman.
797
00:37:47,331 --> 00:37:49,332
So you got some Superman/Lois Lane stuff, huh?
798
00:37:49,366 --> 00:37:50,767
Yeah, so you wanna read it?
799
00:37:50,801 --> 00:37:53,137
Mm... not that bit. Um, wait.
800
00:37:53,170 --> 00:37:55,473
Up, up, up, up, up. Back up... okay.
801
00:37:55,506 --> 00:37:57,608
"Superman angrily grabs Lois by the neck,
802
00:37:57,641 --> 00:38:00,677
slaps her against the wall, and bares his teeth in fury."
803
00:38:00,711 --> 00:38:01,512
"You're wrong.
804
00:38:01,545 --> 00:38:03,114
You're a grotesque kind of monster."
805
00:38:03,147 --> 00:38:05,115
- "You're wrong!"
- "You're a terrible liar."
806
00:38:05,149 --> 00:38:06,883
"No! I'm sorry, I'm sorry.
807
00:38:06,917 --> 00:38:07,785
I can't believe it!"
808
00:38:07,818 --> 00:38:09,754
"You're so much more than that, Lois."
809
00:38:09,787 --> 00:38:10,821
"Please, please!"
810
00:38:10,854 --> 00:38:11,655
"How could you?
811
00:38:11,689 --> 00:38:13,824
No one can believe who you are."
812
00:38:13,857 --> 00:38:14,858
"Don't be ridiculous.
813
00:38:14,891 --> 00:38:15,726
Please?
814
00:38:15,759 --> 00:38:17,527
How could you be so much more than that?"
815
00:38:17,560 --> 00:38:19,397
"You're such a terrible liar.
816
00:38:19,430 --> 00:38:21,298
You can't even believe who you are.
817
00:38:21,332 --> 00:38:23,401
Please, unless you're really a no-good liar,
818
00:38:23,434 --> 00:38:25,336
you're not even sure if you're good."
819
00:38:25,369 --> 00:38:26,803
"Sorry, Superman, I'm so sorry!"
820
00:38:26,837 --> 00:38:28,739
Superman is just not making very much sense.
821
00:38:28,772 --> 00:38:30,273
Maybe kind of drunk or something?
822
00:38:30,307 --> 00:38:32,442
In fact, it says, "Superman isn't funny.
823
00:38:32,476 --> 00:38:34,344
The two of them are really different people.
824
00:38:34,377 --> 00:38:36,113
There is no such thing as good good."
825
00:38:36,147 --> 00:38:37,214
That's pretty deep.
826
00:38:37,248 --> 00:38:38,315
"There is no such thing as good good."
827
00:38:38,348 --> 00:38:39,684
There is no such thing as good good.
828
00:38:39,717 --> 00:38:40,417
So far as I know.
829
00:38:40,451 --> 00:38:41,785
Yeah. Have you checked?
830
00:38:41,819 --> 00:38:42,620
I'm gonna Google it.
831
00:38:42,653 --> 00:38:44,555
- Can we Google it?
- Um...