OpenAI Sora Breaks Internet! Society At Risk? (Ep. 110)
OpenAI broke the internet with the release of Sora, its video AI model. Sora is not the first video AI model, but will video AI break society and be another cautionary tale like social media platforms?
#sora #artificialintelligence #openai #counterthought #podcast #news #society #goodvsevil
----
Remember to Like, Comment and Share this video.
Remember to SUBSCRIBE to the Counter Thought channel (@counterthoughtpodcast).
----
AUDIO versions of podcast episodes are available on your podcast app.
VIDEO versions of podcast episodes are available on YouTube and Rumble.
- YouTube: https://youtu.be/vLIpNiS-MVQ
- Rumble: https://rumble.com/v4ew6oo-openai-sora-breaks-internet-society-at-risk-ep.-110.html
----
FOLLOW Counter Thought on social media:
Instagram: @counter_thought
Instagram: @counterthoughtceo
X (Twitter): @counterceo
TikTok: @counterthought
Facebook: "Counter Thought Podcast" page
Transcript
Welcome to Counter Thought, a podcast conserving America's freedom, culture and values. This is Brian Kletter, the creator and host of the podcast. You can engage with the podcast on Instagram @counter_thought or @counterthoughtceo, and on our Facebook page “Counter Thought Podcast”. For audio versions of the podcast, you can find us on Apple Podcasts, Google Podcasts, Spotify, and more. And for video versions of the podcast, join us on YouTube at the Counter Thought channel. Let's go!
OpenAI, the creators of GPT, released their video AI model called Sora, and it broke the Internet. And while its capabilities are incredible, and while the intentions are good, I can't help but wonder: even though it's breaking the Internet now, is a video AI model going to break society?
Welcome to Counter Thought. OpenAI, the creators of GPT, released a testing version of their video AI model called Sora. And when it did so, all of social media, YouTube, Rumble, other streaming platforms, just went berserk over the capabilities that Sora has.
Now, while Sora is not the first AI video model, there are others out there. There are many, actually, that can create clips of podcasts just like this one, and of other YouTube videos. Just text-to-video editing. Text-to-video creation. So OpenAI and Sora are not necessarily breaking new ground, but what they are doing, like what they do with their GPT model, is, I guess, just going above where everyone else already is, or at least that is the expectation.
Now, you might be wondering, okay, what is Sora? Yes, it is a video model, but what does that actually mean? Well, here is what OpenAI says about Sora, and this is from their website, quote: "We're teaching AI to understand and simulate the physical world in motion, with the goal of training models that help people solve problems that require real-world interaction. Introducing Sora, our text-to-video model. Sora can generate videos," as of right now, "up to a minute long while maintaining visual quality and adherence to the user's prompt."
So there's a lot there. Basically, what they're saying is that they're teaching it to understand and simulate the physical world in motion, and they have multiple videos demonstrating the current capabilities. I'll share one or two of those with you here in a few minutes. But with the goal of training models to help solve problems that require real-world interaction, you could essentially say, okay, I need a video of horses running across an open prairie or something like that. And instead of having to go hunt down some stock footage on the Internet and pay for that footage, or get your own production crew to go out and film that footage, and the horses and the land and everything else, you would just type it into Sora: "Video of horses running across a prairie." Video generated, right?
Let me show you exactly what I'm talking about. Isn't that incredible? Man, it is. Honestly, it is incredible. The tech person in me just loves that this is even possible. And I'm thinking, okay, for this podcast, how can I use AI video editing? And Sora is not the first AI video tool. There are others out there.
One of them being Synthesia, I believe, is how you pronounce it. And that is a tool where you could take this long-form video of the full episode, and it would just automatically generate clips to be used in vertical format on TikTok and Instagram and Facebook. Or it could create smaller clips that stay in this landscape, horizontal orientation, which I could then upload as, you know, future feature clips for the channel here on YouTube or on Rumble. So the capabilities are very impressive.
And one of the things OpenAI says is that it will give you the efficiency, the speed, and the cost-saving capability of not having to get your own production crew, or contract work out to a production crew, to get exactly what you are trying to get, like a plane flying over snow-capped mountains or something like that, against a crystal-clear blue winter sky. Right. We won't have to go and outsource that work or pay people in-house to go do that, which could save hundreds of thousands to millions of dollars, depending on what exactly you're looking for.
:And this is only going to be refined.
100
:Open air says that themselves.
101
:They talk about how,
you know this is a read testing
102
:red team red teamers are testing
to see the capabilities and yes right now
103
:not everything is going to be perfect
like you may say something
104
:to text in there
have a woman biting a cookie.
105
:Well, at this point,
the cookie may not show bite marks after
106
:the woman bites into it.
107
:But you know that will be improved
upon and then you will see
108
:the bite mark in the cookie.
109
:Right.
110
:It'll look like a real woman or the woman
that was used to create this video
111
:who signed some type of consent
form to use her likeness
112
:eating a cookie.
113
:But that didn't necessarily happen.
114
:That didn't necessarily happen.
So there are definitely good uses for video AI tools. I just mentioned one, Synthesia. I could record myself, and then I think I could take the same video and automatically make myself speak in Spanish. Right. Reach a larger audience with this podcast. There are others out there that can do automatic video clips. They will go through and read through the full podcast episode, right? Just give it the time to read through the full length, say it's a 20-minute video, and then it'll determine what the wow moments are, you know, the eye-catching moments or takes within the video, and go ahead and produce and recommend clips for me to then utilize, like I said, on other social media platforms, or even just shorter clips here on YouTube or Rumble. Efficient video production, right? Cost-saving capabilities.
Now, that also rubs people the wrong way. Right. And a lot of people are worried about AI, not just video AI tools, but other AI tools. People are using AI right now to go through resumes and create better resumes, and also using AI to analyze data and spit out, you know, an analysis, and so on and so forth, to synthesize all of this information. And I've heard and been told, right, that we as humans should not necessarily be scared of AI; what we should do is embrace AI, embrace the tools, and use them to our advantage. And that eventually there's going to be a separation, because AI is going to become so prevalent across society and across businesses that there will be a divide between those who adopted AI and those who didn't, those individuals who know how to use AI and those who don't. So there's going to be this separation: either you're going to move forward with the rest of us, or you are going to get left behind.
Now, specifically for video AI, when you think about it, you're like, okay, well, if I don't have to get a production crew to go create this content, is that just going to replace everything? Well, not entirely. And OpenAI and Synthesia say similar things. They're like, okay, well, you would still need the production crew initially to go create the stock footage. Now, will that always be the case? As more and more stock footage is created, and as AI is able to generate more scenes and everything by itself, utilizing the information in the stock footage it has, will that eventually eliminate an individual actually going out and filming something to then be used, replicated, or iterated from? That remains to be seen. And that is what scares people in the workforce. Are you trying to take my job? Is AI going to replace humans, or is AI going to improve the lives of humans? That is the question.
So on the flip side of the good is obviously the bad. The bad would be the usage for disinformation, creating social hermits, because people are scared that something is going to be created of them. Or they will take down their social media profiles in hopes that, you know, the public profile that they already had, the images that were already out there, the content that was already shared on the social media platforms, will not be manipulated and used to create disinformation campaigns, you know, used to smear someone. We recently heard about Taylor Swift and the pictures and videos that were created of her, which she came out and vehemently denied, saying those are not real. Those are fake. Deepfakes have been going on for years now, in the form of video and pictures. And what are we going to do to fight against it? You know, we don't have millions of dollars, right, to hire lawyers and go after companies or individuals who could use these tools, these video AI tools, to create disinformation, to make it seem as if I said something that I never said.
One of the videos on the Synthesia site was with the CEO, and he was on Bloomberg UK, I think. The video segment for that broadcast started with one of the hosts of the program talking, and then at the end of the video, it cut back to the actual host. And he's like, that wasn't even me. That was an AI-generated video. I didn't even say those things, but it made it look exactly like me, and like me saying those exact things.
So that is what we have to wrestle with. And OpenAI, in the release on their website for Sora, addresses a lot of these things. They address and categorize it into three different categories: safety, capabilities, and research. And under safety, they state, quote: "We'll be taking several important safety steps ahead of making Sora available in OpenAI's products." Right, again, they're going through red-team testing. "We are working with red teamers, domain experts in areas like misinformation, hateful content, and bias, who will be adversarially testing the model." And then it continues, quote: "We're also building tools to help detect misleading content, such as a detection classifier that can tell when a video was generated by Sora. We plan to include C2PA metadata in the future if we deploy the model in an OpenAI product." And it goes on.
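The C2PA provenance idea OpenAI mentions can be sketched minimally. Real C2PA manifests are cryptographically signed structures embedded inside the media file, not plain dictionaries; the `make_manifest` and `verify` helpers below are hypothetical stand-ins that only illustrate the core idea, which is that the provenance claim is bound to the exact bytes of the file, so any edit or re-encode breaks the binding.

```python
import hashlib

def make_manifest(video_bytes: bytes, generator: str) -> dict:
    # Hypothetical stand-in for a C2PA manifest: records which tool generated
    # the content, plus a hash bound to the exact bytes of the file.
    return {
        "generator": generator,
        "sha256": hashlib.sha256(video_bytes).hexdigest(),
    }

def verify(video_bytes: bytes, manifest: dict) -> bool:
    # The provenance claim only holds while the hash still matches the file;
    # any edit or re-encode invalidates it.
    return manifest["sha256"] == hashlib.sha256(video_bytes).hexdigest()

clip = b"\x00fake video bytes\x00"
manifest = make_manifest(clip, "sora")
print(verify(clip, manifest))              # True: file untouched
print(verify(clip + b"tamper", manifest))  # False: binding broken
```

In the real scheme the manifest is also signed with the generator's key, so a verifier checks the signature chain as well as the hash; and as the host notes below, none of this helps once an unlabeled copy has already gone viral.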
That's great. But I'm not sure if you've noticed or not: things tend to go viral before any type of fact-checking occurs, right? That is one of the things that came from what was maybe originally intended to be good about social media platforms, the sharing of information, and that is now seen as a negative. Right. That's the definition of viral. Viruses, right, just spread. They go everywhere. And it's hard to contain them; you most likely can't even contain them. And if you did try, think of the amount of resources that it's going to take, which goes back to my point that we're not all millionaires who can hire lawyers to go after people who, you know, commit libel and smears and things like that, defamation. How are we actually going to contain a video, a deepfake video, something created through OpenAI? Even if it does have these parameters, even if OpenAI is able to infuse this within their tools, and the same with other video companies, who's to say that others are not going to use it for malicious intent?
Right. Where there's good, there's also going to be bad. That's true across all of society, with everything. Right. So what exactly is going to happen when a video tool like Sora is used to do harm to another individual? I mean, think about it. We're in peak election season right now, right? About to get through the primaries and into the general election, looking ahead to November, where we're going to be deciding between, you know, Biden or Trump. Who's to say that the campaigns won't create a video that depicts a candidate saying something that they never said, in an environment they were never in? And how quickly can that go viral, especially in today's society, right, in today's media, where we see that those in the left media, and this is media as a whole, pick and choose what information it is that you get to know about.
You turn on the news, and you only hear about the stories that they tell you about. And we have already seen nefarious acts taking place in the media, especially towards Trump. But beyond Trump, and going again five, ten, 15, 20 years into the future, who's to say that campaigns aren't just going to be deepfakes? We already have the commercials that come out attacking the other candidate, right? Those have some truth to them, you know, more truth than what a deepfake would have, which is just, again, placing a candidate in an environment, saying things that they never said. And then what, you and me are going to sit there and try to analyze the video and inspect its metadata, its markers and indicators, to see if this is actually a real video? No, it's going to go viral. It's going to contribute to myths and disinformation.
Not only that, but we already have issues on social media platforms of girls being depicted doing sexual acts, just smearing them. Right. These are high schoolers we're talking about here, making it seem like a girl has posted nude videos and everything, and then just, you know, trying to completely wreck her life, present day and in the future.
So call me cynical, call me negative, call me pessimistic, but I am very concerned about where this technology could lead. And the CEOs of these companies, especially the one from Synthesia, said that we're just going to have to be vigilant. Yes, that is true. I agree with that. We will have to be vigilant. But how exactly is our vigilance going to stack up against what would be required to actually combat the disinformation? And again, the definition of viral is that it spreads. Right. And you're not going to be able to correct whatever disinformation, whatever harm was caused, because you're not going to be able to get everybody who saw the original malicious content to see the update, right, the one that says that was actually disinformation, that was misinformation, that was false, that was not real.
And going back to social media: social media right now should be, I believe, a cautionary tale. When Facebook began, and I mean, I was there. It started in 2004; I started college in the summer of 2004, when I first joined Facebook. You know the story; if you haven't, go see the movie The Social Network. Facebook began, you know, off of the existence of the yearbook, so to speak, at Harvard. And once it was released to the public, it started with only being open to college students, because you had to supply a .edu email; you had to have a .edu domain for your email. And I was at the University of Florida and I had that, right, so I could join. And it stayed like that for, I want to say, at least the first two years. And then Facebook saw the potential, and it said, okay, we're going to expand this. We're going to go from the .edu domain and open it up to, like, community colleges. Right. So it wasn't just the four-year schools, the universities; it was then also open to two-year colleges. And then that expanded to just the general public. And now all hell broke loose.
Right. It was originally intended to let people chat with each other, to post pictures, to interact, you know, the social aspect of social media. But as well intended as Facebook was, and then Instagram and then Twitter and then others, we have seen the negative effects of social media. We have seen everything from misinformation to disinformation to the smearing of individuals, to what is said to be, you could argue, strongly, causation, but definitely correlation, between social media and mental health issues in today's youth, youth ranging from, you know, age ten through the early twenties, or you can stop it at 18 and then you get into early adulthood. Right? We have seen the negative aspect of social media, and social media should be, I think, a cautionary tale for this video AI and the other AI tools. The intentions were great for Facebook when it started, but as it expanded, and as more and more people got onto it, the good intentions also brought bad intentions. And I fear that this is going to be the same for these different platforms, these different AI tools, especially video.
I'm already getting comments, right, on some of our most recent videos, because I've gotten larger reach, saying that because I'm using a green screen, my background looks fake, which technically it is. But they're saying that I'm AI, right, that I don't look real sitting here in front of the green screen. Maybe I should use an AI video tool to actually give me better lighting and enhance my background, so it's not just a flat image, so it has a 3D aspect to it. But I'm already being told that I'm AI, just spouting things off like a robot, just fake, right? Not even real. I'm real. I'm real. And yeah, those people are being stupid and all of that.
But I can't help but see, in five, ten, 15, 20 years, the cautionary tale: that we should have taken the lessons learned from social media and applied them to OpenAI and the other video AI tools. And much to their credit, OpenAI and the other companies talk about ethics and safety, because I think they do see what has happened with social media, and they do see the negative things that these tools, these models, could be used for. So I give them credit for that. But much like a virus, I think once the bad actors get involved, there is going to be no stopping those bad actions. And that is why I think, maybe in five, ten, 15, 20 years, we may not be saying that OpenAI and the other AI video platforms like Sora and Synthesia, and those others out there that can be utilized for good, were a net good for society; instead, they will have broken society, they will have caused irreparable harm.
But maybe I'm just a negative Nancy. Maybe I am just being cynical. Maybe I am being shortsighted. Maybe I do not have enough faith in the creators and the engineers and the geniuses that are building these different tools within the AI space. But tell me what you think. Leave a comment on this video. Educate me; I am new to the AI space. Am I being too negative? Am I not seeing the big picture? Is my view of the big picture, you know, looking through the negative lens when I should be looking through the positive lens? Does the positive outweigh the negative? Let me know, and I will revisit this topic at a later date.
But as of right now, as of right now, Sora, incredible as it is, and the other AI video tools, helpful as they are: if the technology is there for it to be used for good, the technology is also there for it to be used for bad. And I am going to err on the side of caution and say that I think, looking back in five, ten, 15 years, as much good as these tools brought to society, we will be seeing that there was much more negative, and that it has actually broken society.
Thank you for listening to Counter Thought, a podcast conserving America's freedom, culture and values. Remember to subscribe and like or rate the podcast on your podcast app or on YouTube, and engage with the podcast on Instagram @counter_thought, @counterthoughtceo, or on Facebook at “Counter Thought Podcast”.