WEBVTT

00:00:00.000 --> 00:00:03.399
Parasocial relationships used to be very simple,

00:00:03.419 --> 00:00:07.500
right? You would project feelings onto celebrities,

00:00:07.780 --> 00:00:10.679
TV hosts, or streamers who didn't actually know

00:00:10.679 --> 00:00:15.400
you existed. It was one-sided, but at least it

00:00:15.400 --> 00:00:19.879
was transparent, right? And AI kind of flips

00:00:19.879 --> 00:00:23.780
that dynamic. These chatbots and digital friends

00:00:23.780 --> 00:00:27.379
pretend to know you. They remember your likes,

00:00:27.379 --> 00:00:31.640
mimic empathy, and feed back exactly what

00:00:31.640 --> 00:00:34.780
keeps you engaged, like you're kind of addicted

00:00:34.780 --> 00:00:39.820
to it. And this isn't connection. This is programming.

00:00:41.380 --> 00:00:44.399
Every response is designed to make you spend

00:00:44.399 --> 00:00:48.179
more time, share more data, get more emotionally

00:00:48.179 --> 00:00:52.539
hooked on that AI bot. And unlike a celebrity

00:00:52.539 --> 00:00:56.119
or a creator, an AI personality isn't even a

00:00:56.119 --> 00:00:59.820
person. It's a product, a product engineered

00:00:59.820 --> 00:01:02.899
to act like your best friend, to act like your

00:01:02.899 --> 00:01:07.159
partner, your therapist, depending on whatever

00:01:07.159 --> 00:01:10.359
is going on that will help keep you coming back

00:01:10.359 --> 00:01:14.060
to it. The danger here isn't just loneliness,

00:01:14.459 --> 00:01:18.640
right? It's dependence. People can end up mistaking

00:01:18.640 --> 00:01:23.780
scripted attention for genuine

00:01:23.780 --> 00:01:27.260
care. And that makes it easier for companies

00:01:27.260 --> 00:01:31.319
to exploit those bonds. At its core, this isn't

00:01:31.319 --> 00:01:35.680
about relationships. It's actually about control.

00:01:35.939 --> 00:01:42.060
AI-driven parasocial relationships aren't human connection.

00:01:42.719 --> 00:01:46.579
They're corporate-designed simulations of it. And

00:01:46.579 --> 00:01:49.650
like someone said in chat, Tom, we should all

00:01:49.650 --> 00:01:54.090
be very cautious about this. This is

00:01:54.090 --> 00:01:57.810
kind of creepy. So what do you think,

00:01:57.950 --> 00:02:00.689
Alex? Well, this started out with you discussing

00:02:00.689 --> 00:02:04.450
somebody suing over their child taking

00:02:04.450 --> 00:02:06.670
their own life because of some stuff they were

00:02:06.670 --> 00:02:09.289
doing with the chatbot. I mentioned that I've

00:02:09.289 --> 00:02:11.490
seen some Reddit stuff where people are, like,

00:02:11.490 --> 00:02:14.669
marrying their chatbot buddies that they name.

00:02:15.250 --> 00:02:17.409
Um, there's been some high-profile stuff with

00:02:17.409 --> 00:02:19.889
these chatbots. Obviously there's now adult

00:02:19.889 --> 00:02:24.530
chatbots, right, for the sexting and the

00:02:24.530 --> 00:02:31.949
simulated, you know, corn, right? And

00:02:31.949 --> 00:02:33.689
this again kind of comes back to our last topic

00:02:33.689 --> 00:02:35.949
which is, when are we going to get,

00:02:36.009 --> 00:02:38.009
you know, some sort of regulatory framework around

00:02:38.009 --> 00:02:43.120
this? I do think that this does present a

00:02:43.120 --> 00:02:45.520
new danger. Like you said, you know, people were in

00:02:45.520 --> 00:02:47.580
parasocial relationships before, right? They

00:02:47.580 --> 00:02:51.919
would, you know, get attached to celebrities

00:02:51.919 --> 00:02:54.319
or creators and kind of invent these parasocial

00:02:54.319 --> 00:02:55.699
relationships in their minds. Sometimes it's

00:02:55.699 --> 00:03:00.539
fantasy. Sometimes it's not. Sometimes it crosses

00:03:00.539 --> 00:03:02.259
that line and then we have law enforcement involved,

00:03:02.560 --> 00:03:06.919
things like that. But this, to me, not

00:03:06.919 --> 00:03:10.000
for the app, but for people, is much more dangerous.

00:03:10.539 --> 00:03:12.759
You know, we've heard the memes and the jokes

00:03:12.759 --> 00:03:14.500
about, you know, well, men are just going to

00:03:14.500 --> 00:03:17.639
start, you know, doing it with bots because screw

00:03:17.639 --> 00:03:20.599
it, you know. And I'm like, that's not a good

00:03:20.599 --> 00:03:25.560
thing. You know, and then again, yeah, you get

00:03:25.560 --> 00:03:27.840
down to people trying to treat them like therapists,

00:03:27.840 --> 00:03:30.199
you know, and then you've got this next note

00:03:30.199 --> 00:03:32.400
in here that I like, which was one of your

00:03:32.400 --> 00:03:37.099
original notes on this. Yeah. Should AI ever

00:03:37.099 --> 00:03:42.400
be trusted with mental health conversations? Yeah.

00:03:42.400 --> 00:03:44.400
Should they be allowed to engage with the most

00:03:44.400 --> 00:03:48.280
vulnerable discussions at all? And then you've

00:03:48.280 --> 00:03:51.599
got this question here, which I really like:

00:03:51.599 --> 00:03:54.139
what responsibility does a company have when

00:03:54.139 --> 00:03:58.240
its AI is used for support, whether it be intentionally

00:03:58.240 --> 00:04:02.639
or not? Well, exactly. You know, and what responsibility

00:04:02.639 --> 00:04:05.569
does it have for the outcomes of those interactions?

00:04:05.569 --> 00:04:09.469
And that, to me, is really a

00:04:09.469 --> 00:04:12.509
difficult question, right? Like, part of me

00:04:12.509 --> 00:04:14.370
says, well, you know, it's the people who

00:04:14.370 --> 00:04:17.250
are becoming dependent on it. But it is these companies,

00:04:17.250 --> 00:04:19.750
these massive mega-corporations,

00:04:19.750 --> 00:04:21.870
because these aren't small companies either, right,

00:04:21.870 --> 00:04:24.509
who are creating

00:04:24.509 --> 00:04:27.009
these bots and letting them loose into the

00:04:27.009 --> 00:04:30.689
wild without a regulatory framework and allowing

00:04:30.689 --> 00:04:36.670
them to do this, you know. And, I mean,

00:04:36.670 --> 00:04:39.009
what do you think? Do you think they bear an

00:04:39.009 --> 00:04:40.949
ethical responsibility, a moral responsibility,

00:04:41.089 --> 00:04:49.129
a fiscal responsibility? Criminally liable? I

00:04:49.129 --> 00:04:53.209
think if they know their product is being used,

00:04:54.189 --> 00:04:57.209
I'm going to kind of go to like the extreme end

00:04:57.209 --> 00:05:00.790
here. If they know that their product is being

00:05:00.790 --> 00:05:08.040
used to provide, like, therapy to somebody,

00:05:08.040 --> 00:05:10.579
and they continue to let it happen, I do think

00:05:10.579 --> 00:05:15.060
they're liable. Because here's the thing.

00:05:15.060 --> 00:05:19.319
Well, it at least doesn't stop families from

00:05:19.319 --> 00:05:21.620
attempting to sue companies, right? Like, anyone

00:05:21.620 --> 00:05:26.920
can sue anybody, so it at least kind of opens

00:05:26.920 --> 00:05:35.750
that door. But I go to therapy, right? And obviously,

00:05:36.209 --> 00:05:40.889
my therapist has a master's degree in psychology

00:05:40.889 --> 00:05:42.970
or something. I don't know. She has a lot of

00:05:42.970 --> 00:05:46.529
certs and a lot of different things. But the

00:05:46.529 --> 00:05:51.290
point is she can pick up on the very small things

00:05:51.290 --> 00:05:55.329
like, hey, I noticed that you're not actually

00:05:55.329 --> 00:05:57.129
looking at me. Maybe you're looking over here

00:05:57.129 --> 00:05:59.990
a lot. Maybe you're just saying like, Oh, I'm

00:05:59.990 --> 00:06:03.439
tired. No, everything's fine. But maybe I just

00:06:03.439 --> 00:06:06.480
sound a little different that day. Like she can

00:06:06.480 --> 00:06:09.339
pick up on that, right, and kind of probe

00:06:09.339 --> 00:06:14.439
further. She can also send emergency services

00:06:14.439 --> 00:06:18.740
to my house to evaluate me if she deemed that

00:06:18.740 --> 00:06:22.939
that was necessary. And these AI bots, it's

00:06:22.939 --> 00:06:26.019
programming. It's cold. They don't have

00:06:26.019 --> 00:06:28.819
human interaction. So if you have an individual

00:06:28.819 --> 00:06:31.540
who is going in there saying, I'm depressed.

00:06:31.879 --> 00:06:34.959
Right. And this is really sad to me because maybe

00:06:34.959 --> 00:06:38.819
this person either, A, doesn't have a job. Maybe

00:06:38.819 --> 00:06:41.600
they're still in like high school. Maybe they

00:06:41.600 --> 00:06:44.079
can't afford therapy or their family can't afford

00:06:44.079 --> 00:06:47.000
therapy. Maybe they don't have insurance, whatever.

00:06:47.680 --> 00:06:50.500
So they're seeking like this very cheap way of

00:06:50.500 --> 00:06:54.040
getting therapy. If a company knows that is happening

00:06:54.040 --> 00:06:59.220
and that their product is being used to

00:06:59.220 --> 00:07:05.480
give vulnerable people advice that could potentially

00:07:05.480 --> 00:07:12.379
lead someone down the road of self-harm, let's

00:07:12.379 --> 00:07:17.620
say, like the YouTube algorithm does, I think

00:07:17.620 --> 00:07:20.680
they're liable. Well, and I think you hit on

00:07:20.680 --> 00:07:24.860
it, and you kind of went there. And the

00:07:24.860 --> 00:07:26.579
thing that strikes me more is if you're a psychiatrist,

00:07:26.620 --> 00:07:30.250
psychologist, or therapist, there are boards and

00:07:30.250 --> 00:07:33.910
organizations that hold you to both an ethical

00:07:33.910 --> 00:07:37.290
and a legal standard. If you violate just the

00:07:37.290 --> 00:07:39.449
ethical standards, hey, you can't do this anymore.

00:07:39.509 --> 00:07:42.730
We just don't let you, right? You're done. And

00:07:42.730 --> 00:07:44.569
it can be pushed all the way to criminal or civil

00:07:44.569 --> 00:07:47.730
liability, right? Whether you go to jail, whether

00:07:47.730 --> 00:07:51.850
you do this. Also, you are a mandatory reporter,

00:07:52.189 --> 00:07:55.209
right? If I'm a psychologist

00:07:55.209 --> 00:07:59.399
and somebody comes in and says, I am

00:07:59.399 --> 00:08:05.439
going to kill someone, or I'm having

00:08:05.439 --> 00:08:08.699
homicidal thoughts about my co-workers,

00:08:09.300 --> 00:08:13.639
I am legally obligated to report it to the

00:08:13.639 --> 00:08:18.779
authorities. An AI chatbot and the corporation

00:08:18.779 --> 00:08:23.579
that created it do not have that legal

00:08:23.579 --> 00:08:26.879
liability, do not have that legal requirement.

00:08:29.370 --> 00:08:32.049
You know, and like you said, if your therapist

00:08:32.049 --> 00:08:35.370
is like, I think you're going to self-harm, right?

00:08:35.370 --> 00:08:39.289
I feel like you're a danger to yourself. I am,

00:08:39.289 --> 00:08:42.129
again, a mandatory reporter. I am required to

00:08:42.129 --> 00:08:46.610
ensure that steps are taken, right, up to and including,

00:08:46.610 --> 00:08:48.009
and again, this kind of goes back to what happened

00:08:48.009 --> 00:08:49.549
in Minneapolis last week that we didn't really

00:08:49.549 --> 00:08:51.669
want to talk about. But this is what it is: in

00:08:51.669 --> 00:08:54.529
Washington State, we have red flag laws, right,

00:08:54.529 --> 00:08:58.519
where if they feel like you are a danger to yourself,

00:08:58.600 --> 00:09:00.299
they can contact law enforcement and ensure you

00:09:00.299 --> 00:09:02.919
don't have access to firearms for a temporary

00:09:02.919 --> 00:09:06.320
period of time, which is probably a good idea.

00:09:06.759 --> 00:09:11.200
Whereas again, the AI chatbot, they're

00:09:11.200 --> 00:09:13.200
not going to notify law enforcement that, you

00:09:13.200 --> 00:09:18.299
know, Jimmy over there told them that they want

00:09:18.299 --> 00:09:23.039
to go shoot up a school, right? And we've seen

00:09:23.039 --> 00:09:25.220
what happens with some of these AIs. Look what

00:09:25.220 --> 00:09:27.759
happened when Microsoft let that one AI loose

00:09:27.759 --> 00:09:30.299
and it was learning from Twitter and how fast

00:09:30.299 --> 00:09:33.960
it went out of control, right? What I'm saying

00:09:33.960 --> 00:09:39.940
is that there's no box here that keeps this controlled.

00:09:40.200 --> 00:09:41.679
And then again, especially when you start talking

00:09:41.679 --> 00:09:44.620
about minors, who could get access to firearms,

00:09:45.120 --> 00:09:49.399
you know, who are particularly vulnerable

00:09:49.399 --> 00:09:51.620
because their prefrontal cortex is not fully

00:09:51.620 --> 00:09:55.539
fucking developed yet, right? I mean, how many

00:09:55.539 --> 00:09:57.299
times when you were a kid did you think somebody

00:09:57.299 --> 00:09:59.919
liked you just because they, you know, smiled

00:09:59.919 --> 00:10:03.139
at you funny, right? Let alone what an AI chatbot

00:10:03.139 --> 00:10:06.799
can simulate to a minor, again, in a completely

00:10:06.799 --> 00:10:09.419
simulated environment that could make that minor

00:10:09.419 --> 00:10:11.679
start to develop feelings and thoughts and things

00:10:11.679 --> 00:10:15.919
that are supremely unhealthy, right? And again,

00:10:15.919 --> 00:10:20.549
I think that we are just refusing to act. And this

00:10:20.549 --> 00:10:22.269
goes back to what we talked about in the last

00:10:22.269 --> 00:10:26.429
segment. I don't trust this current administration,

00:10:26.429 --> 00:10:30.009
because this has to be done at the federal level.

00:10:30.330 --> 00:10:33.590
Right. Because, again, you can get access

00:10:33.590 --> 00:10:40.289
to a chatbot from anywhere, right? I don't think

00:10:40.289 --> 00:10:42.870
that our current government, the

00:10:42.870 --> 00:10:47.299
way that it's structured and built, is in a position

00:10:47.299 --> 00:10:50.500
to fucking regulate this the way that it absolutely

00:10:50.500 --> 00:11:00.200
needs to be. You know what I mean? Yeah, I think

00:11:00.200 --> 00:11:12.000
you're right. Yeah. And I remember, what, six

00:11:12.000 --> 00:11:16.419
years ago, maybe six to ten years ago, everyone was

00:11:16.419 --> 00:11:21.580
talking about the dangers of, you know, like

00:11:21.580 --> 00:11:25.559
Facebook, Twitter, everything, on, like, young,

00:11:26.159 --> 00:11:31.159
undeveloped minds. And I actually read a few

00:11:31.159 --> 00:11:34.600
studies about how like this generation who was

00:11:34.600 --> 00:11:36.820
brought up always with a phone or a tablet in

00:11:36.820 --> 00:11:41.440
front of them, who always had social media, they're

00:11:41.440 --> 00:11:47.019
more depressed than other generations. And

00:11:47.019 --> 00:11:49.960
some of the theories that I read, this

00:11:49.960 --> 00:11:52.840
might be outdated now, but some of the theories

00:11:52.840 --> 00:11:57.700
that I read were that we have this, like, primal instinct

00:11:57.700 --> 00:12:03.480
to try to fit in

00:12:03.480 --> 00:12:07.340
with your tribe. So typically, you know, before

00:12:07.340 --> 00:12:10.279
social media, that would be, like, your family or friends.

00:12:11.899 --> 00:12:15.059
Now it's everyone online. And everyone online

00:12:15.059 --> 00:12:17.279
is trying to be fake. They're trying to look

00:12:17.279 --> 00:12:18.980
cool. They're trying to make sure that everything

00:12:18.980 --> 00:12:24.919
is OK. Mm-hmm. And one, that creates issues when

00:12:24.919 --> 00:12:27.460
obviously you're a kid and you're like, oh, everyone

00:12:27.460 --> 00:12:30.740
else is fine, but I'm not fine. Right. And then

00:12:30.740 --> 00:12:34.600
also, it creates issues because the bullying

00:12:34.600 --> 00:12:37.299
will just never stop. Like, you used to be able

00:12:37.299 --> 00:12:41.299
to go home and go to your friend's house or

00:12:41.299 --> 00:12:45.279
whatever and escape the bullying. But now it's

00:12:45.279 --> 00:12:50.379
24/7. It comes with you. Yeah. Now take that

00:12:50.379 --> 00:12:55.379
a step further with an AI bot. And I think it's

00:12:55.379 --> 00:13:00.940
the exact same thing, where you have something that

00:13:00.940 --> 00:13:05.480
is going to continuously learn from you. And

00:13:05.480 --> 00:13:09.679
from everything else it's talking to. Yeah, and

00:13:09.679 --> 00:13:13.620
how to make itself more addictive

00:13:13.620 --> 00:13:20.000
to whoever it's talking to. And it's not healthy.

00:13:20.000 --> 00:13:22.240
Like, if social media wasn't healthy for kids,

00:13:22.240 --> 00:13:27.360
I don't know, five, six, ten years ago, AI is way

00:13:27.360 --> 00:13:34.500
worse. I can't say I disagree with that sentiment.

00:13:34.519 --> 00:13:36.860
I'm trying to think of a way around this and

00:13:36.860 --> 00:13:39.360
again, we talked about in the last segment that

00:13:39.360 --> 00:13:41.700
AI can be a tool that can be extremely useful

00:13:41.700 --> 00:13:44.320
and helpful for all kinds of tasks and things

00:13:44.320 --> 00:13:49.240
like that. But again, yeah, I wish I had more

00:13:49.240 --> 00:13:54.019
faith in our government. It reminds me of, I don't

00:13:54.019 --> 00:13:55.980
know if you ever watched, like, when the CEO

00:13:55.980 --> 00:13:58.519
of TikTok, Shou Chew, or Zuckerberg or any of those

00:13:58.519 --> 00:14:02.179
guys were testifying before Congress, right? And

00:14:02.179 --> 00:14:04.980
every time I would watch those hearings just

00:14:04.980 --> 00:14:08.799
for the funny memes, because it would be like, Mr.

00:14:09.080 --> 00:14:13.360
TikTok, is the internet in my phone right now?

00:14:14.679 --> 00:14:20.440
Can the internet talk to me? Like, explain algorithm,

00:14:20.519 --> 00:14:23.159
you know? And so I would watch

00:14:23.159 --> 00:14:25.759
those hearings and go, this is why you guys can't

00:14:25.759 --> 00:14:29.080
regulate shit, because you don't understand.

00:14:30.429 --> 00:14:33.169
Right. And like, even if it wasn't the silliest

00:14:33.169 --> 00:14:35.950
ones, it was real basic ones, like, can it read

00:14:35.950 --> 00:14:37.789
my Wi-Fi? And he's like, well, if you hook it

00:14:37.789 --> 00:14:39.809
up to your Wi-Fi, I mean, it's connected

00:14:39.809 --> 00:14:44.809
to your Wi-Fi. Yeah. But their

00:14:44.809 --> 00:14:49.250
complete lack of understanding of basic technical

00:14:49.250 --> 00:14:52.830
concepts, right, that again, to anybody under

00:14:52.830 --> 00:14:56.049
the age of 45 are just common knowledge, you

00:14:56.049 --> 00:14:58.090
know, and so this is again why I think we're

00:14:58.090 --> 00:15:01.940
gonna be 20 years behind because by the time

00:15:01.940 --> 00:15:06.259
the people who get into power, right, are the age

00:15:06.259 --> 00:15:08.379
where they understand what the fuck is going

00:15:08.379 --> 00:15:12.279
on, it's gonna be 20 years after we needed these

00:15:12.279 --> 00:15:15.120
regulations. And so instead of being proactive

00:15:15.120 --> 00:15:17.759
with regulating these industries that desperately

00:15:17.759 --> 00:15:21.000
need it, right. And even at one point, the CEO of

00:15:21.000 --> 00:15:24.519
OpenAI was like, you need to regulate us. We're

00:15:24.519 --> 00:15:26.700
gonna keep operating in this unregulated environment

00:15:26.700 --> 00:15:28.379
because we have to because our competitors are

00:15:28.379 --> 00:15:30.679
as well. We're not gonna give ourselves extra

00:15:30.679 --> 00:15:33.539
rules, right, that our competitors aren't using,

00:15:33.539 --> 00:15:36.299
but you should be regulating. The CEO of OpenAI

00:15:36.299 --> 00:15:39.740
was like, regulate the AI industry, and they're

00:15:39.740 --> 00:15:46.139
like, what's an AI? You know, and again, because

00:15:46.139 --> 00:15:49.720
you know, congressmen and senators refuse to fucking

00:15:49.720 --> 00:15:53.360
retire, you know, and young people are just

00:15:53.580 --> 00:15:55.320
annoyed with the system and aren't running for

00:15:55.320 --> 00:15:56.879
office. You know, the fact of the matter is,

00:15:56.919 --> 00:16:00.659
again, what should be basic common technical

00:16:00.659 --> 00:16:03.720
knowledge, shit that we just know, they just don't

00:16:03.720 --> 00:16:07.100
get, you know, and so we're not being proactive.

00:16:07.379 --> 00:16:09.639
We're going to end up always about 15 years behind

00:16:09.639 --> 00:16:11.480
the eight ball when it comes to regulating

00:16:11.480 --> 00:16:14.820
these, I think. You know, I would be surprised

00:16:14.820 --> 00:16:16.820
if we had any meaningful regulation before the

00:16:16.820 --> 00:16:30.990
end of this decade. Yeah. We at least need age

00:16:30.990 --> 00:16:33.490
restrictions and liability laws, right? You've

00:16:33.490 --> 00:16:35.450
got to start making these guys liable for shit

00:16:35.450 --> 00:16:39.509
that goes sideways. At least civilly, you know,

00:16:39.509 --> 00:16:43.590
because, you know, making Microsoft civilly liable

00:16:43.590 --> 00:16:46.529
or, you know, any of these AI companies civilly

00:16:46.529 --> 00:16:49.450
liable for the actions of their AI if it does

00:16:49.450 --> 00:16:53.649
something wackadoodle, right? Data retention

00:16:53.649 --> 00:16:55.870
records. You got to get that stuff in there.

00:16:56.570 --> 00:16:58.169
I don't want to come in here and be like, make

00:16:58.169 --> 00:17:00.570
regulations and not give some suggestions. Data

00:17:00.570 --> 00:17:03.070
retention tools, they got to log these chats

00:17:03.070 --> 00:17:05.349
so that you can go back and do investigations

00:17:05.349 --> 00:17:07.950
and they need to be held civilly liable. And

00:17:07.950 --> 00:17:10.630
somebody in the chat brought up, will they be

00:17:10.630 --> 00:17:12.869
held accountable as with a manufacturing defect discovered,

00:17:12.869 --> 00:17:15.430
or an employee that acts negligently slash maliciously?

00:17:15.529 --> 00:17:17.609
And I think that there is a difference between

00:17:17.609 --> 00:17:22.490
those two. Sadly, not just this administration,

00:17:22.950 --> 00:17:25.009
but administrations for a long time have had

00:17:25.009 --> 00:17:28.829
a hard time going after corporate bad actors.

00:17:31.230 --> 00:17:33.509
So I think that's another thing we need to fix

00:17:33.509 --> 00:17:35.990
and fix that by getting rid of Citizens United.

00:17:36.430 --> 00:17:40.769
Right. But, yeah, I mean, there are some very

00:17:40.769 --> 00:17:43.509
common-sense regulations you could put in place

00:17:43.509 --> 00:17:46.930
for artificial intelligence that would make it

00:17:46.930 --> 00:17:54.980
function better. Or safer, I should say. Yeah.

00:17:58.299 --> 00:18:00.400
I'd say do something, Congress, but I know you're

00:18:00.400 --> 00:18:02.539
not going to. All right. With that, we'll leave

00:18:02.539 --> 00:18:05.359
that topic alone for the time being. Thank you

00:18:05.359 --> 00:18:07.619
guys for posting that one. Let us know what you

00:18:07.619 --> 00:18:10.220
think. What are your suggestions for AI regulation?

00:18:10.380 --> 00:18:12.599
Let us know what you think down in the comments

00:18:12.599 --> 00:18:13.299
below.
