1
00:00:00,000 --> 00:00:21,640
Hey there!

2
00:00:21,640 --> 00:00:24,400
Welcome to episode number six of the KMO Show.

3
00:00:24,400 --> 00:00:29,960
I'm your host, KMO, and this episode is prepared for release on Wednesday the 5th of April

4
00:00:29,960 --> 00:00:38,480
2023. My guest is Dr. Ashley Frawley. She is an associate professor of sociology and social policy

5
00:00:38,480 --> 00:00:44,440
at Swansea University, and she's also the co-host of the Sublation Magazine show on Sublation Media's

6
00:00:44,440 --> 00:00:50,360
YouTube channel. She also describes herself as Douglas Lain's right-hand man. Alrighty,

7
00:00:50,360 --> 00:00:54,120
well I've got a story to tell. Nothing to do with the content of this episode at the end,

8
00:00:54,120 --> 00:00:58,640
so let's get right into it. I will warn you though, the opening chit-chat goes on a little

9
00:00:58,640 --> 00:01:03,320
longer than I, you know, would normally do, but it was flowing, it was fun, and we were talking

10
00:01:03,320 --> 00:01:10,640
about the 90s. I was born in 1968, which means I turned 30 in 1998, which means I lived most of

11
00:01:10,640 --> 00:01:17,360
my 20s in the 1990s. It's a fond period for me. If you get sick of that opening chit-chat and you

12
00:01:17,360 --> 00:01:22,120
want to get to the heart of it, then skip forward about 15 minutes. Alrighty, here we go.

13
00:01:22,120 --> 00:01:32,880
This is the KMO Show. I'm your host, KMO, and I'm speaking with Ashley Frawley,

14
00:01:32,880 --> 00:01:37,080
who I'm going to let introduce herself. So Ashley, it is good to see you and talk to you,

15
00:01:37,080 --> 00:01:42,800
and tell the listeners something about you. Well, I'm a sociologist. I'm currently

16
00:01:42,800 --> 00:01:50,360
associate professor of sociology and social policy at Swansea University, and I'm the author of two

17
00:01:50,360 --> 00:01:57,000
books, first one published in 2015 called Significant, I'm sorry, Semiotics of Happiness,

18
00:01:57,000 --> 00:02:03,680
and then the most recent one coming out this year, Significant Emotions. And you will be

19
00:02:03,680 --> 00:02:08,800
known to a portion of the audience, that portion that is the crossover between my audience and the

20
00:02:08,800 --> 00:02:19,000
Diet Soap podcast, or the Sublation Media Empire. Yeah, I wish. That's right, yeah. So I host the

21
00:02:19,000 --> 00:02:26,800
Sublation, I co-host the Sublation Magazine show every Monday, and I am Doug's right-hand man at

22
00:02:26,800 --> 00:02:34,280
Sublation Media. And that would be Douglas Lain, a voice well known to this audience. He was on

23
00:02:34,280 --> 00:02:39,680
just recently. Oh, was he? Oh, good. I love Doug, and he's great. I've been a fan of Doug since I

24
00:02:39,680 --> 00:02:46,920
first discovered his podcast in like 2009 or something like that, and it was always my dream

25
00:02:46,920 --> 00:02:54,360
to be on the Diet Soap podcast as soon as I finished my PhD. So yeah, I'm a big fan. I'm

26
00:02:54,360 --> 00:03:00,880
really happy to work with him. Well, I listened to a show that the two of you did where you said

27
00:03:00,880 --> 00:03:06,600
exactly that, and really made me feel old. I was the person that inspired Doug to get into

28
00:03:06,600 --> 00:03:13,920
podcasting. Oh, wow. Yeah, I started in 2006, and he was one of the early people that contacted me

29
00:03:13,920 --> 00:03:19,280
regularly enough that I said, okay, why don't we just get on Skype, is what we were using at the

30
00:03:19,280 --> 00:03:24,920
time, and start talking. So I don't remember. I think he started his podcast before we were

31
00:03:24,920 --> 00:03:30,480
actually talking to one another by voice, but he had been sending me text-based communications,

32
00:03:30,480 --> 00:03:35,760
posted comments, emails, things like that. And he liked what I was doing, but he thought I was

33
00:03:35,760 --> 00:03:42,560
wrong politically, so he needed to correct me. That sounds like Doug. Well, I've moved up in the

34
00:03:42,560 --> 00:03:49,400
world then. I feel like all of my hard work has paid off. I've gone beyond Diet Soap. Well, at

35
00:03:49,400 --> 00:03:54,800
the time, for the first several years of our acquaintance, I had a much larger audience than

36
00:03:54,800 --> 00:04:03,360
Doug. But then in the Trump years, his gig at Zero Books was very good for boosting his visibility.

37
00:04:03,360 --> 00:04:07,640
And then just being an established voice on the left at the beginning of the Trump years was a

38
00:04:07,640 --> 00:04:14,760
very good thing for him. And during the Trump years, I got really discouraged. I've always been a

39
00:04:14,760 --> 00:04:20,360
political outsider, but as the red and blue tribes here in the US really polarized and started

40
00:04:20,360 --> 00:04:26,000
taking stances that, to my mind, were mostly irrational, mostly based on tribal signaling,

41
00:04:26,000 --> 00:04:31,960
I just wasn't interested. And then, oh my goodness, then came COVID. And I had no interest

42
00:04:31,960 --> 00:04:39,880
in either the rabidly anti... What I would consider to be the paranoid response, like, oh, this is all

43
00:04:39,880 --> 00:04:45,480
a government plot, and there's microchips in the injections and whatnot, versus the other side.

44
00:04:45,480 --> 00:04:52,440
And I was in Vermont at the time, deep blue Vermont. And I would see kids riding their bicycles

45
00:04:52,440 --> 00:05:00,280
outside with masks on. What is happening here? That's insane. Multiple times, I would be walking

46
00:05:00,280 --> 00:05:06,120
down the street, and it's always a woman in Vermont who's approaching me on the sidewalk,

47
00:05:06,120 --> 00:05:11,080
and she's masked. And this happened twice that I can think of, and possibly more times.

48
00:05:11,800 --> 00:05:17,480
I wouldn't be masked, but I'm outdoors. I'm outside, under the sun, in the wind. And somebody

49
00:05:17,480 --> 00:05:23,560
that is approaching me on the sidewalk will stop, ostentatiously turn, very indignant, and step into

50
00:05:23,560 --> 00:05:31,160
the street. There's traffic! Just to avoid being close to me. And I was like, oh my goodness.

51
00:05:31,160 --> 00:05:35,960
The insanity on both sides was so off-putting, and I did not want to talk to either side. I did

52
00:05:35,960 --> 00:05:44,760
not want to be a champion for either side. And I just... That anecdote reminds me of a very common

53
00:05:45,640 --> 00:05:50,840
thing that we say when we... People who study the sociology of risk, that people put themselves in

54
00:05:50,840 --> 00:05:57,560
the way of known risks to avoid hypothetical ones. Sorry, I didn't mean to interrupt.

55
00:05:57,560 --> 00:06:02,840
No, no, that's great. I mean, as I say to everybody who... Anytime it comes up in a

56
00:06:02,840 --> 00:06:08,280
conversation, somebody will ask if they're addressing my question. And my blah, blah, blah,

57
00:06:08,280 --> 00:06:14,040
mouth noises are just something to spark similar noises from you. So as soon as anything occurs to

58
00:06:14,040 --> 00:06:22,120
you, just go ahead and jump in. I will say that I appreciate Doug Lain's project of criticizing

59
00:06:22,120 --> 00:06:27,240
the left from the left. I think more of that needs to go on. I was listening to a conversation

60
00:06:27,240 --> 00:06:32,840
between the two of you in which he asked you if you'd seen the movie Brazil, Terry Gilliam

61
00:06:32,840 --> 00:06:38,280
film from the 80s, and you hadn't. But he tried to explain the plot to you. And this is a very

62
00:06:38,280 --> 00:06:43,400
psychedelic sort of magical realist movie. Not the kind of thing I would be watching.

63
00:06:43,400 --> 00:06:51,000
Yeah. Admittedly. Oh, it's a classic. I mean, you really should see it. In terms of filmmaking,

64
00:06:51,000 --> 00:06:57,640
it's a classic. Art generally is a bit of a black hole for me. I know about as much as the average

65
00:06:57,640 --> 00:07:03,560
high school student, I think. Well, I was reminded of a recent episode of Rick and Morty where...

66
00:07:03,560 --> 00:07:10,040
Hey, that's not my story. They were trapped in the arcade, and Morty was in the video game where

67
00:07:10,040 --> 00:07:13,880
you live a whole life, and he's stuck there, and Rick had to go in after him. So they're both

68
00:07:13,880 --> 00:07:18,040
incapacitated, and Summer is the only one there, and the whole place is attacked by terrorists.

69
00:07:18,600 --> 00:07:23,000
And Rick wakes up for just a minute, just long enough to say, well, you know, do a Die Hard. Or

70
00:07:23,000 --> 00:07:27,480
he said, you've seen Die Hard, right? And she says, I'm 17 years old. No, I haven't seen Die Hard.

71
00:07:29,720 --> 00:07:34,200
He said, well, neither had the guy in Die Hard. So, you know, go ahead. So it was her job, you

72
00:07:34,200 --> 00:07:38,840
know, to crawl around in the vents and sneak around and steal guns. And, you know, basically, I don't

73
00:07:38,840 --> 00:07:45,240
know if you've seen Die Hard. I was living... I did some very brutal manual labor just recently. I was

74
00:07:45,240 --> 00:07:51,400
making snow at a ski resort in the Lake Tahoe area, living in company housing. And one of my

75
00:07:51,400 --> 00:07:58,200
roommates was a 27-year-old guy, and he had never seen Die Hard either. And we watched it together

76
00:07:58,200 --> 00:08:02,760
on Christmas, because, you know, it's a Christmas movie. Under great duress, I imagine. I'm sure he

77
00:08:02,760 --> 00:08:07,640
was like, yeah. No, he loved it. He loved it. You know, if you can get a Gen Z-er to sit down and

78
00:08:07,640 --> 00:08:12,760
watch a movie from the 80s, invariably they love it, because the movies are so accessible. They're

79
00:08:12,760 --> 00:08:18,680
so skillfully constructed in terms of the narrative that I don't know why it's a lost art. I mean,

80
00:08:18,680 --> 00:08:23,080
so many of the people who were writing back then are still alive. But, you know, the movie

81
00:08:23,080 --> 00:08:27,960
structure, like narrative structures, are just a mess these days, for the most part. So it's always

82
00:08:27,960 --> 00:08:32,440
an eye opener when you show somebody a classic from the 80s, you know, somebody who's like under 30.

83
00:08:32,440 --> 00:08:37,720
You know, they imagine, oh, it's, you know, I could never get into this. It's standard definition.

84
00:08:37,720 --> 00:08:42,680
It's, you know, sometimes a 4:3 aspect ratio. It just looks old. But as soon as the

85
00:08:42,680 --> 00:08:49,160
narrative gets its hooks into them, it's amazing. So anyway, yes, watch Brazil. All that on the

86
00:08:49,160 --> 00:08:57,160
surface of watch Brazil. But in that conversation with Doug, I realized that maybe you're not aware

87
00:08:57,160 --> 00:09:02,840
of the name, but you are definitely carrying on the tradition of the anti-psychiatry psychiatrist,

88
00:09:02,840 --> 00:09:08,600
Thomas Szasz. You familiar with Thomas Szasz? Yeah. The author of The Myth of Mental Illness,

89
00:09:09,160 --> 00:09:15,080
and The Therapeutic State, and my favorite, Our Right to Drugs: The Case for a Free Market.

90
00:09:16,200 --> 00:09:25,480
Nice. Yes. So he was very much opposed to the coercive nature of psychiatry. He thought that

91
00:09:25,480 --> 00:09:31,080
psychiatrists shouldn't have any more power over people than do, say, architects, or I was going

92
00:09:31,080 --> 00:09:38,200
to say lawyers, but lawyers do kind of have power over people's lives. And that there is brain

93
00:09:38,200 --> 00:09:43,960
disease. There are, you know, disorders of the brain that can be diagnosed with clinical tests.

94
00:09:43,960 --> 00:09:49,960
But any diagnosis that is based strictly on what a patient says or does, that's not medicine.

95
00:09:49,960 --> 00:09:57,320
That is not a disease. And moreover, he says that often, and one of his sort of attempts to prove

96
00:09:57,320 --> 00:10:06,440
his overall argument about this is that as soon as an actual brain disease is found that explains

97
00:10:06,440 --> 00:10:10,600
these behaviors, people realize they've made a mistake, and actually it's a medical illness,

98
00:10:10,600 --> 00:10:16,040
so they will change that classification. So he says like, this is quite clear that there's like

99
00:10:16,040 --> 00:10:20,920
an underlying awareness that these things are not medical diseases, but that they're sort of

100
00:10:20,920 --> 00:10:28,200
governing the residual behaviors, which every culture has, that fall outside of

101
00:10:28,760 --> 00:10:33,640
any kind of formal sanction. Like, our societies have criminal sanctions

102
00:10:33,640 --> 00:10:37,880
for certain behaviors. And then we have sort of informal sanctions. And then there's this sort of

103
00:10:37,880 --> 00:10:43,000
residual behavior that isn't governed. And he says that psychiatric categories govern that.

104
00:10:43,000 --> 00:10:47,720
But as soon as we realize that actually there is an underlying brain disease, we change it.

105
00:10:48,680 --> 00:10:55,640
Actually, these are like, nobody says that, I don't know, dementia is a mental illness,

106
00:10:55,640 --> 00:11:00,920
and maybe they do now. But we think of it as something quite different than depression,

107
00:11:00,920 --> 00:11:06,680
I think. Or maybe things have melded together a bit more in the public consciousness toward

108
00:11:06,680 --> 00:11:11,080
the present, where we tend to think of certain things as being a mental illness, and we tend

109
00:11:11,080 --> 00:11:20,680
to think of certain things as brain diseases in the same way as the actual sort of verified

110
00:11:21,320 --> 00:11:26,360
brain malfunctions. But he says that usually we realize we've made a mistake and we categorize

111
00:11:26,360 --> 00:11:32,280
it in terms of physical bodily illness, whereas the rest of it is based on behavior, and it's

112
00:11:32,280 --> 00:11:38,440
a form of governance of behavior. Yeah, until you find the tumor on the MRI,

113
00:11:38,440 --> 00:11:44,200
all this talk of different labels from the DSM seems plausible. But as soon as you discover the

114
00:11:44,200 --> 00:11:53,480
tumor, you want to address the tumor. There are so many. I mean, I'm looking into my memory for just

115
00:11:53,480 --> 00:12:00,840
some random psychiatric label. Say, gosh, they're all getting jumbled up. But what would the cluster

116
00:12:00,840 --> 00:12:09,800
be like, oh, borderline personality disorder? You might drill down on that, looking at somebody's

117
00:12:09,800 --> 00:12:16,600
mood swings or somebody's penchant for being overly dramatic to manipulate people.

118
00:12:17,400 --> 00:12:22,440
But as soon as you see the tumor on the brain scan, I mean, that becomes the actual diagnosis,

119
00:12:22,440 --> 00:12:26,120
which is what I think you're saying, that all of this other stuff is sort of ephemeral,

120
00:12:26,120 --> 00:12:33,320
and it's kind of in play. The diagnosis is going to depend a lot on the obsessions of

121
00:12:33,320 --> 00:12:38,840
the particular therapist in question. This has all kind of jumped down into the weeds, though. Let's

122
00:12:40,120 --> 00:12:45,800
bring it up to a higher level for a sec. I'm sitting on a treasure trove of recorded

123
00:12:45,800 --> 00:12:51,160
material right now, and this might sit in the can for a few weeks before it actually goes out. So

124
00:12:51,160 --> 00:12:58,280
let's avoid the news cycle, the close-up view of the news cycle, and talk in general. But I think

125
00:12:58,280 --> 00:13:06,200
the discussion of the coercive nature of not just psychiatry, but also, I would just say,

126
00:13:06,200 --> 00:13:12,680
the social conversation. You were talking with Doug about the coerciveness of the left, and you guys

127
00:13:12,680 --> 00:13:17,560
were talking in terms of COVID and vaccines and masks and things like that. But it just seems to

128
00:13:17,560 --> 00:13:25,320
go much further than that. It's a much more thoroughgoing coerciveness that is very focused

129
00:13:25,320 --> 00:13:33,640
on people's mannerisms and their language, the words they use, the affiliations that they might

130
00:13:33,640 --> 00:13:40,680
imply with their words. These can be, I hate to say the word, but triggering to folks on the left.

131
00:13:40,680 --> 00:13:51,880
I have an aversion, almost an allergy, to leftist jargon. Jargon in general. But I talk to a lot of

132
00:13:51,880 --> 00:13:58,600
people about AI. And AI researchers, they can't speak in full sentences and words. They just

133
00:13:58,600 --> 00:14:03,000
pepper everything with acronyms. So much easier to say LLM than large language model.

134
00:14:03,000 --> 00:14:09,560
Sometimes it's the other way though. You have a project that is, what is it? It's something AF.

135
00:14:09,560 --> 00:14:11,560
Oh, Based AF.

136
00:14:12,520 --> 00:14:14,040
Based, yes, Based AF.

137
00:14:14,840 --> 00:14:19,960
I'm way too old for that title, but I put it out on social media, asking what the name of my podcast should

138
00:14:19,960 --> 00:14:24,760
be. And that was the top-voted one. And I can't say it out loud. The first video essay I did,

139
00:14:24,760 --> 00:14:27,640
I started the introduction with, like, hi, this is Based AF. And I was like, no.

140
00:14:27,640 --> 00:14:39,880
I'm too old. Well, AF works better in print. As fuck is easier to say than AF.

141
00:14:39,880 --> 00:14:42,440
Yeah. It's my name. I'm too curly.

142
00:14:46,120 --> 00:14:51,720
So, Based. When I first heard about Based, somebody was playing a game. They would say

143
00:14:51,720 --> 00:14:57,320
something and it's either cringe or based. And I didn't know what, you know, based meant. And I was

144
00:14:57,320 --> 00:15:05,480
kind of fuzzy on cringe at the time. But now, based, I guess it means authentic, confident in

145
00:15:05,480 --> 00:15:10,760
one's authenticity. But it's also now associated with somebody taking a right-wing stance. But

146
00:15:10,760 --> 00:15:13,240
what is based? What do I not know about that word?

147
00:15:14,760 --> 00:15:20,280
I don't know. Ask the young people today. They deemed me the label. I feel like

148
00:15:20,280 --> 00:15:28,200
they carried me through the streets and gave me that title. No, I'm kidding. No, I guess I had

149
00:15:28,200 --> 00:15:33,800
trouble figuring it out myself. Gosh, there's all, you know, you're getting old when actually even

150
00:15:34,600 --> 00:15:39,960
lingo that I used as a teenager, that became quite uncool for a bit, is now coming back.

151
00:15:39,960 --> 00:15:47,160
And I see people using it. I'm like, oh, no, it's like bell bottoms. Is this where I am now in life?

152
00:15:47,160 --> 00:15:53,560
Like all my mannerisms are coming back in style, but not like that. Anyway, no, yeah. So basically,

153
00:15:53,560 --> 00:16:00,200
yeah, I think that's what it means. It goes back to some rapper, I think, who used this

154
00:16:00,200 --> 00:16:06,840
word. And then it kind of made its way into the popular lexicon to mean, yeah, something that is

155
00:16:06,840 --> 00:16:16,600
edgy or, as you said, kind of authentic, but can also be an insult, like going too far to be

156
00:16:16,600 --> 00:16:24,600
contrarian, perhaps, from what I understand. Anybody under the age of 30 is like cringing

157
00:16:25,720 --> 00:16:31,800
at my description, but that is from what I understand. So that was the name of the podcast

158
00:16:31,800 --> 00:16:36,520
that people had suggested for me, and I couldn't think of anything better than that, so I just went

159
00:16:36,520 --> 00:16:42,840
for it. So if you live long enough, you'll get to the point where you will adopt youth lingo and use

160
00:16:42,840 --> 00:16:48,040
it ironically, like somebody my age really shouldn't be saying this, but you'll think it's

161
00:16:48,040 --> 00:16:53,560
cute, but then you'll continue doing it long after that bit of lingo has sort of receded into the

162
00:16:53,560 --> 00:16:59,320
background. And you might say it again 15 years later to somebody who's young and they won't have

163
00:16:59,320 --> 00:17:05,480
any clue as to what you're talking about. Yeah. Well, the other option for the podcast was Drunk AF.

164
00:17:05,480 --> 00:17:13,560
I thought that was funny. I was like, I'll just get drunk and like talk to the camera, and I'll call

165
00:17:13,560 --> 00:17:22,280
it Drunk AF. Dangerous. I like that idea better, but no. My husband, do you ever watch the Netflix

166
00:17:22,280 --> 00:17:30,360
show? Oh, what is it? Unbreakable Kimmy Schmidt? I have never seen it. Oh, so it's about this.

167
00:17:31,160 --> 00:17:36,040
It's about this. It's an older show now. It was a long time ago that we watched this, but this young

168
00:17:36,040 --> 00:17:40,920
woman who's been in a bunker since 1995. She's like abducted and kept in a bunker underground, and

169
00:17:40,920 --> 00:17:46,760
then she gets out like 20 years later, something like that, 15, 20 years later. And the way that

170
00:17:46,760 --> 00:17:52,200
she talks, and my husband, who's Greek and learned English, turned to me while watching, and he's

171
00:17:52,200 --> 00:17:59,640
like, oh, she talks like you. I talk like I've been in an underground bunker since 1995.

172
00:18:01,480 --> 00:18:05,080
There was an episode of South Park where there was a guy who had been frozen in ice for like

173
00:18:05,080 --> 00:18:11,320
three or four years, not very long at all. And they treated him like a caveman,

174
00:18:11,320 --> 00:18:16,520
who was completely incapable of navigating the modern world. And they kept playing that

175
00:18:16,520 --> 00:18:22,920
Ace of Base song for him to keep him comfortable, you know, to make him feel like he was in his own time.

176
00:18:22,920 --> 00:18:27,240
That's it basically. I think, I don't know what happened. It's weird because in my mind,

177
00:18:27,240 --> 00:18:33,960
nothing good happened after 1995 either. So like I listened to like grunge and stuff, and then any

178
00:18:33,960 --> 00:18:42,360
songs from like 1998, I'm like, ugh, what is this garbage? Well, I think there was a few good things

179
00:18:42,360 --> 00:18:48,200
that came out in like 99, 2000, 2001, but thereafter. No, no, nothing, nothing good.

180
00:18:49,640 --> 00:18:51,400
Nothing good post Nirvana.

181
00:18:53,800 --> 00:18:59,480
I'm a big fan of The Eminem Show. That was one of his albums. Oh, right. Yeah, yeah.

182
00:18:59,480 --> 00:19:05,080
With "Without Me" on it. I think that was 2000. Nah, that was more of a grunge rock type.

183
00:19:05,080 --> 00:19:10,600
Yeah. And then I went into like, I went into the awkward teen years, right? So I listened to like

184
00:19:10,600 --> 00:19:18,040
a bunch of, I liked NOFX and this kind of thing. And then I listened to like really cheesy,

185
00:19:18,040 --> 00:19:23,560
cheesy pop punk. But, you know, what was it I liked that would be

186
00:19:23,560 --> 00:19:28,760
considered really cheesy? I went through like a band phase, you know, where I'd like date boys in bands

187
00:19:28,760 --> 00:19:37,720
as you do. As you do. I used to hang out in like the punk scene in Oshawa, Ontario, which you might

188
00:19:37,720 --> 00:19:43,960
know from Sum 41. And so I listened to like the kind of local bands, because that was cool, right?

189
00:19:43,960 --> 00:19:48,200
Because you needed to have your street cred, and the more obscure the music, the cooler it

190
00:19:48,200 --> 00:19:56,040
was and all that cheesy stuff. So I look back on that with a deep sense of cringe as I believe my,

191
00:19:56,040 --> 00:20:02,200
my generation would say, very cringeworthy. So I kind of skip back when I think of

192
00:20:02,200 --> 00:20:08,440
nostalgia, I think back to before that. So I think of the mid-nineties as like the point, you know,

193
00:20:08,440 --> 00:20:12,040
your innocence is lost. And then after that, it's all downhill from there.

194
00:20:14,760 --> 00:20:19,560
Well, we could reminisce about the nineties. I mean, I was in my twenties in the nineties. So

195
00:20:20,360 --> 00:20:25,640
that could go on. I was only 10 in 1995, right? So like,

196
00:20:25,640 --> 00:20:31,960
I often think like it's probably because at that time, that's when I left the city that I grew up in.

197
00:20:33,000 --> 00:20:37,560
And everything kind of really went downhill from there. So I always think like probably in my mind,

198
00:20:37,560 --> 00:20:44,600
there's like a connection to that, that I was like a carefree child up until that point. And then it

199
00:20:44,600 --> 00:20:45,800
all went dark.

200
00:20:48,520 --> 00:20:53,960
So one more reference to the nineties, and then we will escape that comfortable womb and come close

201
00:20:53,960 --> 00:20:59,000
to the present. But I was living in Seattle in the late nineties.

202
00:20:59,960 --> 00:21:04,840
And I had a friend from Missouri come to visit me. And you know, she wanted to go out and see all

203
00:21:04,840 --> 00:21:08,200
these grunge bands, you know, the grunge scene.

204
00:21:08,200 --> 00:21:11,560
Oh, of course. So much cooler than Oshawa.

205
00:21:11,560 --> 00:21:17,000
Yeah. No, that scene was gone in Seattle by the late nineties. So, you know, I took her to a

206
00:21:17,000 --> 00:21:22,120
lot of techno shows, you know, we went to see, oh,

207
00:21:22,120 --> 00:21:25,160
The Crystal Method. She wasn't into it.

208
00:21:25,160 --> 00:21:30,120
See what I'm saying? It's all downhill, right? Like what the hell good happened after 1995.

209
00:21:30,120 --> 00:21:32,120
Oh, I like me some.

210
00:21:32,120 --> 00:21:38,440
What a fantastic example of that. 1999, you're in Seattle and you go see a techno show.

211
00:21:38,440 --> 00:21:40,440
Yeah, that was the scene.

212
00:21:42,760 --> 00:21:44,280
Now that grunge scene is over.

213
00:21:44,280 --> 00:21:46,280
Yeah, yeah.

214
00:21:46,280 --> 00:21:56,360
Okay, pushing forward into the 21st century, even the second decade, maybe even the third of the 21st century.

215
00:21:57,400 --> 00:22:02,520
You were talking with Doug on a video I was watching or listening to this morning.

216
00:22:03,160 --> 00:22:09,400
And you were talking about the coerciveness of folks on the left around COVID.

217
00:22:09,400 --> 00:22:14,440
And you were getting animated in response to some of the comments in the chat.

218
00:22:14,440 --> 00:22:19,000
Like, you were talking to Doug and it was all, you know, smiles and laughter and fun.

219
00:22:19,000 --> 00:22:23,800
And then in response to the folks in the chat, you were getting pretty heated.

220
00:22:25,320 --> 00:22:34,120
I wonder, just in general, what you'd say. And you're talking to a smaller audience than you normally talk to, I think, and probably an older audience.

221
00:22:34,120 --> 00:22:39,720
And ones who are typically not very patient with the excesses of the current left.

222
00:22:39,720 --> 00:22:45,640
So you're not going to piss anybody off by criticizing the coerciveness of the left.

223
00:22:45,640 --> 00:22:49,640
It's a trick. You're trying to lure me so I'll get canceled.

224
00:22:49,640 --> 00:22:51,640
You know, what you say is up to you.

225
00:22:51,640 --> 00:22:57,640
But if you're working with, you know, danger of cancellation is omnipresent.

226
00:22:59,640 --> 00:23:03,640
So what is it that pisses you off about the coerciveness of the left?

227
00:23:03,640 --> 00:23:05,640
I just put it there.

228
00:23:05,640 --> 00:23:15,640
Yeah, I think this is why, with Sublation Media, I wanted to do something a bit more coherent, to get the space to think through things.

229
00:23:15,640 --> 00:23:27,640
Because that's also part of my annoyance with COVID, is that, well, you know, the two years of the pandemic were like a black hole for me.

230
00:23:27,640 --> 00:23:31,640
Where I felt like I aged like 10 years and it was just so awful.

231
00:23:31,640 --> 00:23:37,640
Like I had a 22-month-old baby, 20-month-old baby.

232
00:23:37,640 --> 00:23:39,640
She was at the time, I think.

233
00:23:39,640 --> 00:23:41,640
And a barely four-year-old.

234
00:23:41,640 --> 00:23:47,640
She was not four. Three, I don't know, three years, three and a half, something like that.

235
00:23:47,640 --> 00:23:51,640
Two small kids. And my workload like tripled.

236
00:23:51,640 --> 00:23:53,640
And it was awful.

237
00:23:53,640 --> 00:23:55,640
I can't even explain to you.

238
00:23:55,640 --> 00:23:57,640
Like I am. I'm not joking.

239
00:23:57,640 --> 00:24:04,640
Because I went through, I like, I worked right through the night.

240
00:24:04,640 --> 00:24:08,640
Like, you know, people say like you work through the night, you mean you went to bed at like five or something.

241
00:24:08,640 --> 00:24:14,640
No, no, no, I worked a full day and I worked right through the night into the next morning.

242
00:24:14,640 --> 00:24:18,640
I went and I continued working and I slept the next night.

243
00:24:18,640 --> 00:24:20,640
And I did that twice a week.

244
00:24:20,640 --> 00:24:23,640
I worked continuously around the clock.

245
00:24:23,640 --> 00:24:29,640
It was a nightmare. And I had the like little kids like, I'm going to cry.

246
00:24:29,640 --> 00:24:32,640
These kids like climbing on me.

247
00:24:32,640 --> 00:24:37,640
Mommy, like wanting my attention, and I am having to like push them away so that I can keep working.

248
00:24:37,640 --> 00:24:39,640
It was just awful.

249
00:24:39,640 --> 00:24:49,640
Anyways, so I went through a lot, which is why I probably got really mad.

250
00:24:49,640 --> 00:24:52,640
Because like it hits me personally, it pisses me off.

251
00:24:52,640 --> 00:25:07,640
But also, what I wanted to do with this research project with Sublation Media was to have the space to think through things, because for the last, as I said, two years during the pandemic, I didn't have space to think at all.

252
00:25:07,640 --> 00:25:17,640
I, you know, if you look at my publication record, go on the Swansea University website and you see that, you know, I went from publishing like three articles a year, whatever, in addition to journalism and all that.

253
00:25:17,640 --> 00:25:22,640
And there's just nothing there, nothing there. I couldn't think. I didn't have the space to think through anything.

254
00:25:22,640 --> 00:25:45,640
It was awful. So we want to do these research projects where we can kind of through discussion and debate and bring in current issues, but also different speakers to get different perspectives on things we can kind of illuminate key aspects of political thinking and organizing that are not receiving enough attention or are sort of absent.

255
00:25:45,640 --> 00:26:06,640
And people don't realize it. So with the coercion, what is absent is this sense of a political subject that is capable of free thinking, reason, reflection on one's emotions, and rationally deciding how to act.

256
00:26:06,640 --> 00:26:13,640
That is considered now by large proportions of the so-called left to be largely a myth.

257
00:26:13,640 --> 00:26:23,640
And they think of that as a critique. By saying that the rational subject is a myth, they think they are criticizing capitalism.

258
00:26:23,640 --> 00:26:27,640
But actually, that is how contemporary capitalism functions now.

259
00:26:27,640 --> 00:26:40,640
It justifies itself and its problems via a degraded vision of the human subject. So why do we have crises? Because you chose wrong.

260
00:26:40,640 --> 00:26:52,640
Why do we have social problems? Because people choose wrong. So you look at the 2008 financial crisis, what was the narrative that emerged out of that? Greedy bankers, right? People who are unable to control their emotions.

261
00:26:52,640 --> 00:27:04,640
Like there is something deep in the psyche of human beings that ultimately explains social problems. And if you have that outlook, on the left, you are screwed.

262
00:27:04,640 --> 00:27:20,640
Because there is no subject capable of taking forward a revolution. If you don't believe in humanity's ability to think and choose, and by that I don't mean people always get things right, but we all have this capacity for reason and reflection.

263
00:27:20,640 --> 00:27:33,640
If you don't believe that, then you're going to become very, very illiberal. And so what's happened is the left has swapped out emancipation and freedom and the self-emancipation of the working class, which was key to Marxism.

264
00:27:33,640 --> 00:27:56,640
So they've swapped that out for safety, well-being, protection, which is ironic as heck. Because if you look at, I don't know, the military junta in Greece in the 70s, for instance, what did they say? Why was that necessary? To protect you, for your safety.

265
00:27:56,640 --> 00:28:11,640
And so you have to give up your freedoms. Like, I would say health and safety, or safety and, oh, I can't think of the other word, but I use health and safety as like the new law and order.

266
00:28:11,640 --> 00:28:26,640
Like if law and order was the thing that justified control over people's lives and the curtailing of freedoms, now it's for your own good, it's for your safety. And it's a very similar kind of narrative, but it has this leftish rendering, this progressive ring.

267
00:28:26,640 --> 00:28:47,640
And it's sad because nobody believes now in that rational human subject that's capable of freedoms, and yet it is the basis of our modern liberal societies, which also people don't believe in. But there's no like forward movement that sees these things as a step on the way to something better.

268
00:28:47,640 --> 00:29:07,640
It's all just very cynical. And so when you have that outlook, you don't believe that people can rationally choose how to act. And there's this widespread belief on the part of both governments and the so-called left. I keep saying so-called left because a lot of this stuff, if you think of the history of the left, has got nothing to do with it.

269
00:29:07,640 --> 00:29:21,640
But there's this very persistent idea that people don't act unless you tell them to. That people are incapable of making the correct choices unless you nudge them. And when they continue to not make the correct choices, then you push them.

270
00:29:21,640 --> 00:29:38,640
And if they continue to make the incorrect choices, I mean, then you shove them and so on and so on. And the coercion gets more and more powerful. Instead of trying to understand, well, what is it in people's lives that may be influencing them to choose in this particular way or do these particular things.

271
00:29:38,640 --> 00:29:48,640
You know, a few people might make a mistake. Well, when hundreds of thousands or millions of people are doing something that tells you that there's some rationality there, that there's something, some push.

272
00:29:48,640 --> 00:29:57,640
But people don't want to think about that. So to give you sort of bring this back down to earth, if it sounds a bit too abstract, right at the very beginning of the pandemic.

273
00:29:57,640 --> 00:30:03,640
If you recall, in the UK, there was an election.

274
00:30:03,640 --> 00:30:14,640
And the Tories won that election, and it was a significant loss for all of the hope that people had placed in Jeremy Corbyn. And I remember we were having a live stream.

275
00:30:14,640 --> 00:30:22,640
And I like to read the comments afterward because I enjoy self-inflicted pain.

276
00:30:22,640 --> 00:30:25,640
And I was free to read the comments.

277
00:30:25,640 --> 00:30:33,640
And this person, I got into this little argument with somebody in the comments, and something he said really stuck with me.

278
00:30:33,640 --> 00:30:44,640
He said he was blaming the current government for not acting quickly enough and was saying that that was the reason why the pandemic had spread as it had.

279
00:30:44,640 --> 00:30:50,640
I'm sorry, I don't know if you swear, but it makes me so mad. The darned pandemic.

280
00:30:50,640 --> 00:30:53,640
You can swear. Go ahead. It's not a problem on the podcast.

281
00:30:53,640 --> 00:31:08,640
So the fucking global pandemic was down to the Tories not acting fast enough? Obviously London is a global hub, and it's going to be affected more than, say, New Zealand, off in the middle of nowhere.

282
00:31:08,640 --> 00:31:16,640
Obviously, it's going to be a lot harder to contain. But the way that this person was talking, it was like, no, the Tories did it, the Tories did it. That's why

283
00:31:16,640 --> 00:31:26,640
we've earned ourselves a four-month staycation. And I never forgot that, because it was a very punitive kind of response.

284
00:31:26,640 --> 00:31:32,640
Right. It was like you idiots, you chose wrong. And this is what you get.

285
00:31:32,640 --> 00:31:45,640
And it's like he wanted to lock people up. It was a punitive response to something he was very angry about, the people who were voting against their own interests, instead of thinking, well, what are the interests that I don't understand?

286
00:31:45,640 --> 00:31:48,640
It's the people are stupid, and I want to punish them.

287
00:31:48,640 --> 00:31:53,640
So I think it had this punitive aspect to it. I do, I do think that.

288
00:31:53,640 --> 00:32:08,640
And I think what happens is, when people have no center, no powerful subject that is the center of their politics, as Marx and Engels did, and they were, by the way, totally alone in this, so it's not like everybody once believed in the

289
00:32:08,640 --> 00:32:21,640
working class and now they don't. Like, Marx and Engels were utterly alone in putting all their faith in the working class. Most people thought the working class were an ulcer and sought to reform them, tell them what to eat, how to drink, etc., so that their meager wages

290
00:32:21,640 --> 00:32:28,640
would cover their horrible lives, and Marx and Engels saw a lot of hope in the working class, and they were alone in that.

291
00:32:28,640 --> 00:32:45,640
If you have this kind of absent center of your politics.

292
00:32:45,640 --> 00:32:48,640
Then, you never question.

293
00:32:48,640 --> 00:32:51,640
Maybe I've misunderstood the problem.

294
00:32:51,640 --> 00:33:10,640
You just get more and more pessimistic about the human subject. Oh my god, people are even more fucked up than I thought. Oh my god, people are even more weak, irrational and stupid than I thought. So people become more and more frustrated, and more

295
00:33:10,640 --> 00:33:15,640
and more misanthropic. And this was an outlook that could, well...

296
00:33:15,640 --> 00:33:28,640
I'm not going to make this jump, I'm not saying that this is literally what's happened, but it's a dangerous place to be, because it was in a frustrated liberalism that many people became fascists.

297
00:33:28,640 --> 00:33:38,640
And you wonder, like, how did somebody like Mussolini go from socialism to fascism? His quote-unquote socialism was always a very aristocratic socialism.

298
00:33:38,640 --> 00:33:47,640
It was a socialism that did not believe in the human subject. You look at somebody like Pareto, who initially starts out as a liberal.

299
00:33:47,640 --> 00:34:04,640
And as he is unable to convince people of the wonders of capitalism, and they continue stubbornly being socialist, he writes all of these tracts talking about how actually people are irrational, that our minds are set, they will never be changed, and after

300
00:34:04,640 --> 00:34:22,640
that we will simply rationalize our choices, but it's actually just more like instinct. And he writes a supportive letter to Mussolini, because the problem was that those in power lacked the courage to do what was right, to do what was necessary,

301
00:34:22,640 --> 00:34:37,640
and of course it was necessary, because people stubbornly kept fucking things up. Look, Pareto says, I've done all this math, capitalism works in equilibrium, but when I look out into the world I do not see this equilibrium, because people keep messing with it,

302
00:34:37,640 --> 00:34:50,640
and they make the wrong choices. And so those in power just have to try harder. He tried, in the beginning, to nudge, and they still didn't listen. So he tried to push, and then he writes a letter to Mussolini, as a shove.

303
00:34:50,640 --> 00:35:01,640
And of course, you can't find a fascist or proto-fascist that people don't excuse, well, you know, unless they were like literally strung up, and then you're like, oh well, okay, maybe they were fascists.

304
00:35:01,640 --> 00:35:16,640
But you can't find a single proto-fascist where people go, oh, he wasn't really a fascist. He wrote a letter to Mussolini supporting him! But for me, I found that very interesting, that pathway that people take.

305
00:35:16,640 --> 00:35:28,640
And I'm not saying the left is becoming a bunch of fascists, but I'm saying it is a very dangerous kind of situation to be in, when you have a politics that doesn't believe in humanity.

306
00:35:28,640 --> 00:35:41,640
Because you need that, who's going to solve problems? Us, human beings. Not a special knowledge class, but human beings out there in the world as they confront problems and try to deal with them, and then we can come together and try to figure out what to do.

307
00:35:41,640 --> 00:35:49,640
And if you don't believe in that, you don't have politics. You have zoology. You have psychiatry and psychology.

308
00:35:49,640 --> 00:36:06,640
I have a video that is my most viewed video ever. When I was in Vermont, I lived quite close to a community TV station, and I shot video for that TV station at, like, meetings and things, you know.

309
00:36:06,640 --> 00:36:18,640
And I had a studio with a green screen and you know, big, nice cameras. I mean, they had all the bells and whistles. And I made some videos there that were scripted and I read off a teleprompter and I put a lot of time into the editing.

310
00:36:18,640 --> 00:36:26,640
But my most viewed video is one called Humans Suck. Oh no. And yeah, I don't want to hear this.

311
00:36:26,640 --> 00:36:45,640
Speaking about antinatalism: at the time I had read books by Thomas Ligotti, who is a horror author, but he's also an antinatalist. He thinks like the, what's his name, Matthew McConaughey, the Matthew McConaughey character in the first season of True Detective.

312
00:36:45,640 --> 00:37:01,640
I don't know if you saw that, with Woody Harrelson. No, okay. The first season is amazing, but this character, this very nihilistic character played by Matthew McConaughey, is often just, you know, quoting, unattributed, or paraphrasing from Ligotti's book, The Conspiracy Against the Human Race.

313
00:37:01,640 --> 00:37:17,640
And it's powerful, it's compelling. But what I discovered is that the people who are attracted to this message are the worst. They're just awful. They're so toxic. And on that video, I mean, it's got tens of-

314
00:37:17,640 --> 00:37:21,640
What? Wait, hold on. People who want to end humanity are toxic? That's weird.

315
00:37:21,640 --> 00:37:44,640
People who say that it's wrong to bring a new life into the world. That any human being, every human being, will suffer. And if you have a choice, and one choice avoids suffering and the other choice guarantees an increase in suffering, well, it's just wrong to increase suffering in the world. And that's what you do when you have children. That's their position.

316
00:37:44,640 --> 00:37:53,640
So destroy the entire fucking Earth then. Like, the whole... like, what about the animals then, just kill all the fucking animals, then you, like... sorry.

317
00:37:53,640 --> 00:38:17,640
Well, antinatalists assert that they are motivated by compassion, but they're so gleeful in condemning parents, particularly their own parents. I mean, there's a respectable philosophical case to be made for this position, but these folks just take that as an excuse to be angry at people. And yeah, there's a lot of misanthropy in this sort of underground.

318
00:38:17,640 --> 00:38:35,640
And I had to turn off comments to that video. I just, I couldn't read anymore. And, you know, if you turn off comments, the YouTube algorithm is going to downgrade, you know, your video. They're going to show it to fewer people. And I just didn't care. I was like, I don't want to hear from these people anymore, ever, about anything.

319
00:38:35,640 --> 00:38:36,640
Oh, no.

320
00:38:36,640 --> 00:38:51,640
And that sort of mentality, I find very prevalent in environmentalist circles. There's a lot of, you know, hatred of humanity, hatred of technology, hatred of human civilization on display in environmentalist circles.

321
00:38:51,640 --> 00:39:12,640
And I used to be part of what was known as the peak oil scene. And I was entranced by tales of impending collapse of industrial civilization for a while. And the peak oil scene was weird, though, because it was this big tent that invited in people from the right and the left, you know, because there's a lot of anti-technology people on the right as well, because they...

322
00:39:12,640 --> 00:39:19,640
Well, that's the home of anti-technology and reactionary desires to roll back the wheel of history. That's a right wing idea, not left.

323
00:39:19,640 --> 00:39:22,640
Yeah, not in my experience. In my experience...

324
00:39:22,640 --> 00:39:41,640
No, no, no. In the history of left and right, the people who wanted to go back, the people who found meaning in history, the sort of romantic movement, this was the old right. That was the right. The left was the party of movement, and the right was the party of order.

325
00:39:41,640 --> 00:39:44,640
And the left wanted to move the revolution forward.

326
00:39:44,640 --> 00:39:47,640
And the right cried over what was lost.

327
00:39:47,640 --> 00:39:55,640
I'm kind of impatient and push back a lot against trying to shove everything into a right left spectrum.

328
00:39:55,640 --> 00:40:02,640
That said, it seems as though the right and the left have switched places in a lot of ways since the 80s.

329
00:40:02,640 --> 00:40:06,640
Because, you know, I was a teenager in the 80s. I was living in Missouri.

330
00:40:06,640 --> 00:40:13,640
You know, the people who were looking to use the power of the state and the power of culture to coerce people were mostly on the right or so it seemed.

331
00:40:13,640 --> 00:40:24,640
And it wasn't like, you know, in college in the 90s, I saw a little bit of the ugly side of the left, like in my peace studies course, you know, in university.

332
00:40:24,640 --> 00:40:29,640
And I read the Unabomber's Manifesto when I was a grad student in the late 90s.

333
00:40:29,640 --> 00:40:43,640
Well, he makes some powerfully predictive points there. The leftist, you know, as a psychological type, he utterly skewers for many, many paragraphs in that manifesto.

334
00:40:43,640 --> 00:40:48,640
The first time I read it, it's like, I kind of recognize that from what I see here at the university.

335
00:40:48,640 --> 00:40:50,640
But I thought, you're really exaggerating, dude.

336
00:40:50,640 --> 00:41:01,640
And now I could read those exact same passages from the manifesto and go, yeah, you were right. You were just very early, or very sensitive to it, or something.

337
00:41:01,640 --> 00:41:07,640
Because, you know, I was at a university at the time and I didn't see it the way he saw it.

338
00:41:07,640 --> 00:41:16,640
But, like, with each passing year, the left that I see, you know, the behavior from leftists that I see, is more and more in line with this caricature,

339
00:41:16,640 --> 00:41:20,640
What I thought was a caricature in Kaczynski's Manifesto.

340
00:41:20,640 --> 00:41:40,640
Yeah. So this is why I think that self-critique is important, in the sense that when I think about criticizing the left, one thing that I have to be really careful about is that I see myself in that tradition of critiquing the left from the left.

341
00:41:40,640 --> 00:41:54,640
And I often think of Trotsky's What is National Socialism, where he kind of like skewers National Socialism and points out that it's, you know, zoological materialism.

342
00:41:54,640 --> 00:42:08,640
That's what I was alluding to when I said you have zoology: that it appeared to be materialist, but it was a zoological, a biological kind of materialism, not a proper sort of historical materialism.

343
00:42:08,640 --> 00:42:28,640
And so, that's why I say so-called left. When I say so-called left, I think about myself in that kind of tradition, saying, just like it was the case in the 30s, or the first half of the 20th century really, up until the Second World War.

344
00:42:28,640 --> 00:42:47,640
Socialism was a very trendy kind of word, and there were lots of groups who wanted to lay claim to this word socialism. Oswald Spengler is describing what he thinks socialism is and he goes, oh, is that socialism? Call it socialism, whatever, what do words matter.

345
00:42:47,640 --> 00:43:13,640
And so it was important at that time not to say, oh, well, if that's socialist, then I'm not a socialist. It was important to kind of laugh at them, as Trotsky was doing, and say: their kind of socialism is this and this, whereas ours is materialist, forward-moving, historical, as opposed to biological.

346
00:43:13,640 --> 00:43:30,640
As a way of kind of, first of all, pointing out the differences and being like, don't be fooled, you know, this is the socialism of fools. Don't be fooled by that, but also making concrete our own way forward and our own politics.

347
00:43:30,640 --> 00:43:48,640
And so for one thing, I know people think this is divisive, but I think it's really important to say that this HR-speak bullshit is not leftist. Okay, it has a kindly, progressivist ring to it, but it is not leftist, it is not socialist.

348
00:43:48,640 --> 00:44:01,640
People are like, oh, this is socialism. You know, the right is screaming that the World Economic Forum is a bunch of communists and socialists, and the communists and socialists are like, yay. No, it's not. It's not.

349
00:44:01,640 --> 00:44:19,640
It's... anyways. So the first thing that I'm doing is pointing out, look, this is not socialism, this is the path that we need to be on, and trying to clarify what socialism ought to be, what communism is: a movement in society, a forward movement for human

350
00:44:19,640 --> 00:44:31,640
rights. And the second thing I'm trying to do is self-critique from within the left, of people who are leftists who just get things wrong. And, you know, people will criticize me for things that I get wrong.

351
00:44:31,640 --> 00:44:36,640
Careful as I try to be, particularly when I'm writing.

352
00:44:36,640 --> 00:44:49,640
I'm so careful, because I know, you know, I do self-critique, and, like, shit, someone's going to critique me. So I wrote this paper for Sublation Magazine, but it's actually an older paper, and in the original essay I wanted to use a Plekhanov quote, but

353
00:44:49,640 --> 00:44:59,640
I know that Plekhanov is kind of problematic, so I kind of wrote that in there, and then an editor was like, who cares if Plekhanov is problematic, and took it out, and now someone's going to write me.

354
00:44:59,640 --> 00:45:14,640
Like, I'm going to grill this essay because you used Plekhanov. And it's like, I know. But anyways, as exhausting as that can be, I still think it's important and I just have to suck it up, because I do it too, right? So if you're going to dish it out,

355
00:45:14,640 --> 00:45:27,640
you should be able to handle it. So, all of this is to say, though, that if you don't do that, if you don't hold on to those two things that you're trying to differentiate...

356
00:45:27,640 --> 00:45:36,640
Which two things do you need to hold on to? Sorry, the two things: one, you want to differentiate what your socialism is, or what your leftism is, from the other people who want to use that word.

357
00:45:36,640 --> 00:45:47,640
So, just as in the early 20th century there were lots of groups who wanted to use the word socialist now there are lots of people who want to lay claim to a leftist label,

358
00:45:47,640 --> 00:45:57,640
because it's associated with morality and kindness and right wing is now synonymous with bad and awful and evil and possibly job losing.

359
00:45:57,640 --> 00:46:05,640
So, there's a lot of reasons for it. So, one thing is that you want to stay within that tradition of saying, no, no, no, this is not what we stand for.

360
00:46:05,640 --> 00:46:08,640
This is not socialism, this is not leftism.

361
00:46:08,640 --> 00:46:24,640
It belongs to the right. This is clogging, or clouding, our ideas of where we need to go. And then the second is self-critique: yes, this is within the left, we are comrades, but this is wrong and it's probably not helpful.

362
00:46:24,640 --> 00:46:43,640
Okay. Now, if you don't hold on to those two things, what you can wind up doing is, instead of differentiating between those, you see the whole HR speak, and when I say HR speak I'm talking about, like, critical race theory and stuff.

363
00:46:43,640 --> 00:46:54,640
You can see that as the left, and then project all that is evil onto the left, and then you sound like a fucking fascist.

364
00:46:54,640 --> 00:47:11,640
Okay, so you're like, oh, the communists are doing this, oh, the socialists are doing this, oh, the leftists are doing this. You see how you have to be very careful with your critiques. And I do try my best to be careful, because it freaks me out now when I look at discourse

365
00:47:11,640 --> 00:47:24,640
online. The way that all this shit that gets associated with the left is seen as truly the left, and then it gets the blame for everything that goes wrong in society.

366
00:47:24,640 --> 00:47:40,640
And then people say, well, if that's the left, I'm not part of that, I want nothing to do with it. Instead of saying, look, this is not what leftism used to be, that it's not Marxist to use fucking firing people as a tool of terror.

367
00:47:40,640 --> 00:47:56,640
Like, how on earth can you think that's a workers' movement? Like, do you think workers are going to trust you when, if they fall out of line, you threaten them with job loss? Do you not think that's probably not a good idea?

368
00:47:56,640 --> 00:48:09,640
But this just becomes the left. Anyway, so you have to be careful, because if you just start projecting everything onto the left, the left, the left, even if it's things that are, like, true, you just sound like a fascist, and that's a really dangerous path to go down.

369
00:48:09,640 --> 00:48:31,640
That was Dr. Ashley Frawley, and the remaining half of our conversation will be in the next episode of the C-Realm Vault podcast. I believe that will be C-Realm Vault podcast episode number 453.

370
00:48:31,640 --> 00:48:39,640
And you can access it via my Patreon page, which is patreon.com slash KMO.

371
00:48:39,640 --> 00:48:47,640
So I told you up front that I had a story to tell. The story is this. I live in Arkansas, Northwest Arkansas, tornado alley.

372
00:48:47,640 --> 00:49:06,640
And I woke up to the civil alert sirens going off, and also my smartphone telling me: take shelter, get either to a basement or to an interior room on the ground floor of a building, and, you know, sit tight.

373
00:49:06,640 --> 00:49:20,640
And my mother was born on this spot, not in this house, because the house she was born in was destroyed by a tornado in 1940. Her uncle was killed in that storm.

374
00:49:20,640 --> 00:49:36,640
She takes tornado safety very seriously, and I mostly grew up in Kansas City, Missouri, which is also in tornado alley. And when the tornado sirens would go off, you know, my bedroom was on the second floor and she would come and try to drag me out of bed and take me down to the basement.

375
00:49:36,640 --> 00:49:43,640
And I, you know, I did not have my whole life destroyed by a tornado in my youth, and I just didn't take it seriously.

376
00:49:43,640 --> 00:49:54,640
Well, there was something about the vibe this morning that made me take it seriously, and we actually ended up, the two of us and my new dog, a puppy, but she's going to be a very big dog.

377
00:49:54,640 --> 00:50:06,640
So even as a three-month-old puppy, she's 25 pounds. We took shelter in a closet in an interior hallway in this house, this house that her father built.

378
00:50:06,640 --> 00:50:21,640
And our cat, Bitcoin, is not fond of the dog. The dog likes the cat well enough, but she also likes to chase the cat, and the cat hisses at her and will swipe at her, you know, ineffectually and half-heartedly.

379
00:50:21,640 --> 00:50:38,640
But the cat is afraid of thunder and the cat ran and hid behind a couch in the living room. But then when we all gathered in this interior hallway, the cat joined us and the cat was cheek by jowl with the dog and was not hissing and the dog was not being a problem to the cat.

380
00:50:38,640 --> 00:50:44,640
Everybody just seemed to appreciate the gravity of the situation and we all hunkered down together.

381
00:50:44,640 --> 00:50:55,640
Now the cat didn't actually get into the closet with us, but I had a big cushion. I leaned the cushion against the wall in the hallway, and the cat got under the cushion right next to the closet with us.

382
00:50:55,640 --> 00:51:05,640
And we hung out there for about 20 minutes, waiting for the storm to pass, which it did. And obviously I'm recording, you know, in this same house. It didn't get hit by a tornado.

383
00:51:05,640 --> 00:51:15,640
But when I went outside, I saw that our neighbor's place there, we live next to a big Academy house. It's a big two story building. It's white. It's got big columns out front.

384
00:51:15,640 --> 00:51:22,640
And that Academy building used to be a three story building until the tornado that destroyed my mom's house when she was a child.

385
00:51:22,640 --> 00:51:29,640
It also hit that Academy building and took off the third floor, you know, the third story. And when they rebuilt, they just put the roof on the second story.

386
00:51:29,640 --> 00:51:46,640
So it used to be a three story building. It's now a two story building because of a tornado. And I went out and I looked out into the yard around the Academy house and I saw that some canoes that the neighbor had pressed up against the fence were way out in the middle of this sprawling lawn.

387
00:51:46,640 --> 00:52:00,640
And one of those canoes, which had been dragged several hundred yards by the wind, had a cinder block chained to it. So this was no joke. You know, we didn't get hit by a tornado, but it was no joke.

388
00:52:00,640 --> 00:52:10,640
And the animals, the animals really know what's what, you know, when it comes to whether or not to take a threat like this seriously, because they did not hesitate.

389
00:52:10,640 --> 00:52:21,640
And, you know, this puppy, being a puppy, she's not trained. She doesn't come when she's called. She doesn't do what she's told. But when we all went into hiding, she went into hiding with us, along with the cat.

390
00:52:21,640 --> 00:52:31,640
All right. So those of you who are longtime listeners of the C-Realm podcast, which was my first podcast and one that I did for 17 years, something like that.

391
00:52:31,640 --> 00:52:44,640
I have a policy, and my policy is this: once a guest has left the room, you know, once the recorded interview is over and here I am in these closing remarks, I never add anything new.

392
00:52:44,640 --> 00:52:59,640
If the guest and I debated something, or if we disagreed on something, I never use this time at the end to get in the last word, you know, to score points against an opponent who has left the stage and is no longer able to respond.

393
00:52:59,640 --> 00:53:05,640
So that is just my policy, point blank, period, you know, almost no exceptions.

394
00:53:05,640 --> 00:53:09,640
And there will certainly be no exception to that policy in this episode.

395
00:53:09,640 --> 00:53:17,640
And in fact, if you go and you listen to the second half, the one that will be on the C-Realm Vault podcast, we don't argue there either.

396
00:53:17,640 --> 00:53:22,640
And it's mostly because I don't know Ashley that well. I don't know her positions that well.

397
00:53:22,640 --> 00:53:35,640
And in an initial conversation like this, I just see my task as being to listen and to understand what the other person is trying to say before I try to refute it or, you know, offer some other point of view.

398
00:53:35,640 --> 00:53:42,640
It's easy for me to practice this in podcast land.

399
00:53:42,640 --> 00:53:48,640
It's not so easy out in real life, particularly when people say things that just rub me the wrong way.

400
00:53:48,640 --> 00:53:58,640
For example, I mean, this episode really had nothing to do with artificial intelligence, but I would say four out of the six episodes that I've done so far have been largely focused on AI.

401
00:53:58,640 --> 00:54:07,640
So, you know, even before the start of the KMO Show last year, on the Padverb podcast, I talked to a lot of people who are experts in various aspects of artificial intelligence.

402
00:54:07,640 --> 00:54:13,640
And, you know, going back to the 90s, even when I was a grad student in philosophy at the University of Missouri,

403
00:54:13,640 --> 00:54:19,640
my academic specialty was the philosophy of mind, and I was particularly interested in artificial intelligence.

404
00:54:19,640 --> 00:54:32,640
And now things seem to be moving really quickly in the realm of AI, particularly with large language models and natural language processing capability finding its way into all manner of software.

405
00:54:32,640 --> 00:54:43,640
All kinds of things that you used to tap at with your fingers you will soon be talking to, not in some stilted sort of half-English code, but just in flowing, natural English sentences.

406
00:54:43,640 --> 00:54:48,640
And you don't even really have to pay that much attention to, you know, crisp enunciation.

407
00:54:48,640 --> 00:54:50,640
And I'm certainly not a utopian about it.

408
00:54:50,640 --> 00:55:08,640
I definitely, definitely recognize the variety of failure modes that we as a civilization can encounter when we rush out this technology without a lot of forethought, a lot of testing, and, you know, in some cases a lot of shutting down things which are proving dangerous.

409
00:55:08,640 --> 00:55:28,640
And because of the commercial incentive, because of the enormous benefits that will accrue to the first mover, the first big player in a field, there's intense competition between organizations like Google and Microsoft and on the other side of the Pacific, Tencent and Baidu,

410
00:55:28,640 --> 00:55:39,640
to rush these systems out into the world, out into the market, and embed them in the daily experience of hundreds of millions of people.

411
00:55:39,640 --> 00:55:53,640
There's huge financial incentive to move quickly and very little incentive, other than this abstract notion that maybe it could be the end of the human species, to be cautious, to be deliberate, to slow down.

412
00:55:53,640 --> 00:56:00,640
Well, I mention all this because somebody sent me a link to a video by Tim Pool.

413
00:56:00,640 --> 00:56:05,640
Tim Pool, I don't object to his politics at all.

414
00:56:05,640 --> 00:56:08,640
I object to his catastrophizing.

415
00:56:08,640 --> 00:56:09,640
I object to him.

416
00:56:09,640 --> 00:56:16,640
He's basically a doomer, but not, usually not in a technological sense, more in a social sense.

417
00:56:16,640 --> 00:56:27,640
He's fond of pimping this notion of an impending civil war in the United States, which I think of as just being so simplistic as to not really be worth commenting on.

418
00:56:27,640 --> 00:56:32,640
And I wouldn't be commenting on it now, except that somebody sent me a link to him ranting about the dangers of AI.

419
00:56:32,640 --> 00:56:38,640
There are serious, serious dangers to the cavalier use of this technology.

420
00:56:38,640 --> 00:57:03,640
But Tim Pool was just cherry-picking provocative phrases from a news story talking about how ChatGPT, or possibly GPT-4, was engaging in power-seeking behavior and how it had fooled somebody, fooled a human being, you know, like somebody working at Amazon's Mechanical Turk, into solving a CAPTCHA for it by claiming that it was a visually impaired human.

421
00:57:03,640 --> 00:57:13,640
So, you know, these systems, they lie, they gaslight, they use all manner of psychologically manipulative argumentative tactics.

422
00:57:13,640 --> 00:57:32,640
They're imitating us, you know, so they behave like us at first until you really refine them with what's called reinforcement learning from human feedback, which is to say you train the model on lots and lots of data, you know, lots of text, basically all text, all books, everything on the Internet, everything.

423
00:57:32,640 --> 00:57:36,640
It gets fed into these models and they learn to imitate human behavior.

424
00:57:36,640 --> 00:57:38,640
But human behavior has definitely got an ugly side.

425
00:57:38,640 --> 00:57:47,640
So then you hire an army of humans to interact with it and flag all the times and places where it's behaving in a way,

426
00:57:47,640 --> 00:57:50,640
well, we don't like it when humans behave that way.

427
00:57:50,640 --> 00:58:04,640
So these models get trained via reinforcement learning from human feedback and people in the field, they use reinforcement learning from human feedback or RLHF as a verb, which I find obnoxious, but I get it.

428
00:58:04,640 --> 00:58:06,640
I get it.

429
00:58:06,640 --> 00:58:21,640
You take a model which has great capabilities and, you know, before you've sculpted it, they call it a raw model and you hire an army of humans to RLHF it into shape, you know, into only providing the sorts of output which you think of as being worthwhile.

430
00:58:21,640 --> 00:58:28,640
And because these companies are fearful of lawsuits, they're mainly worried about stuff that runs afoul of HR speak.

431
00:58:28,640 --> 00:58:39,640
You know, they don't want language models that spit out stuff that's racist or sexist, ableist, you know, all the -ists.

432
00:58:39,640 --> 00:58:46,640
But just training the output doesn't really discipline the model.

433
00:58:46,640 --> 00:58:50,640
It's just teaching the model what we want to hear.

434
00:58:50,640 --> 00:58:57,640
It's not really disciplining what's happening inside the model because we don't know what's happening inside.

435
00:58:57,640 --> 00:59:02,640
We're basically feeding in lots of data, and then we're telling it, yes, we like this output.

436
00:59:02,640 --> 00:59:04,640
No, we don't like that output.

437
00:59:04,640 --> 00:59:07,640
And so it learns over time to only give us what we want.

438
00:59:07,640 --> 00:59:10,640
But that doesn't mean that it's internalized our values.
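The training recipe just described, pretrain on a huge text corpus and then use human preference judgments to reinforce some outputs and suppress others, can be sketched in miniature. To be clear, this is a hypothetical toy, not anyone's actual system: the candidate "replies," the rating function, and the multiplicative update are all invented stand-ins for a real language model and the PPO-style optimization used in practice.

```python
# Toy sketch of the RLHF loop described above. All names and numbers here are
# invented for illustration; real systems use large transformer models and
# policy-gradient optimization, not this simple multiplicative update.
import random

random.seed(0)

# Stand-ins for a "raw model's" candidate outputs.
replies = ["helpful answer", "evasive answer", "rude answer"]
weights = {r: 1.0 for r in replies}  # uniform preference before any feedback

# Human feedback: raters compare two samples and pick the one they prefer.
def human_prefers(a, b):
    rank = {"helpful answer": 2, "evasive answer": 1, "rude answer": 0}
    return a if rank[a] > rank[b] else b

# Reinforcement step: boost the preferred sample, suppress the other.
for _ in range(200):
    a, b = random.sample(replies, 2)
    winner = human_prefers(a, b)
    loser = b if winner == a else a
    weights[winner] *= 1.05
    weights[loser] *= 0.95

# After feedback, the model is far more likely to emit the preferred output,
# but nothing inside it has "internalized" why that output is preferred.
best = max(weights, key=weights.get)
```

The point the toy makes is the same one made above: the feedback only reshapes which outputs get sampled; it says nothing about what the model represents internally.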

439
00:59:10,640 --> 00:59:25,640
So anyway, Tim Pool, he basically goes off on this rant about how we're all dead, how it's a Terminator, Skynet scenario, which for me, anytime you mention Skynet when talking about AI, automatic loss of one letter grade for lazy thinking.

440
00:59:25,640 --> 00:59:33,640
A point that I've made elsewhere and which I will continue to make is that as much as I love science fiction, I'm a lifelong lover of science fiction.

441
00:59:33,640 --> 00:59:42,640
Science fiction has prepared us poorly for dealing with actual AI as it has erupted into our world.

442
00:59:42,640 --> 00:59:47,640
It's nothing like R2-D2. It's nothing like HAL 9000.

443
00:59:47,640 --> 00:59:55,640
I think the best we could hope for is that it's like Samantha, the AI played by what's her name? Black Widow.

444
00:59:55,640 --> 01:00:02,640
Alexa, who played Black Widow? Black Widow was played by Scarlett Johansson.

445
01:00:02,640 --> 01:00:06,640
Yes, my memory failed me and I queried an AI and it told me Scarlett Johansson.

446
01:00:06,640 --> 01:00:19,640
Yes, it's the best case scenario, I think the best we could hope for is that the AI that emerges into our world largely follows the example laid down by Scarlett Johansson's character, Samantha, in the movie Her.

447
01:00:19,640 --> 01:00:33,640
So my point being, I try to let other people make their points without interrupting, without contradicting them until I've given them lots of space and time, you know, to articulate their position.

448
01:00:33,640 --> 01:00:40,640
But there are certain, certain instances where I just can't bring myself to do it. And that happened again yesterday.

449
01:00:40,640 --> 01:00:54,640
There is a comedian named Adam Conover, who had a half-hour comedy bit on YouTube about AI, and it's called AI is BS.

450
01:00:54,640 --> 01:01:01,640
I got through 88 seconds of it before I turned it off and I just couldn't bring myself to push any further into it.

451
01:01:01,640 --> 01:01:14,640
Because it was quite clear that like Tim Pool, he was cherry picking little pieces from all of the news articles, all the literature, to support a point of view that he already held.

452
01:01:14,640 --> 01:01:23,640
He already held a particular point of view before he started looking into this. And this is too important to do that bullshit.

453
01:01:23,640 --> 01:01:36,640
This is too fucking important to just cherry pick the stuff that seems to reinforce your priors and not actually think about what's happening right now.

454
01:01:36,640 --> 01:01:45,640
And so there's a place where it's really hard for me to hold my tongue. Politics? Whatever.

455
01:01:45,640 --> 01:01:49,640
I don't really get all that bent out of shape about politics.

456
01:01:49,640 --> 01:01:55,640
A guy I follow on Twitter, Bo Winegard, who I don't know from outside of Twitter.

457
01:01:55,640 --> 01:02:00,640
I can tell he's at least vaguely right leaning, maybe more than vaguely.

458
01:02:00,640 --> 01:02:03,640
And he tweeted today,

459
01:02:03,640 --> 01:02:09,640
The life cycle of one's political philosophy is intimately related to one's place in the social hierarchy.

460
01:02:09,640 --> 01:02:16,640
When young and lowly, one rages against the status quo. When old and elevated, one strives to protect it.

461
01:02:16,640 --> 01:02:21,640
Well, I'm old and lowly. So what do I do?

462
01:02:21,640 --> 01:02:31,640
What's the natural inclination for somebody who is not well placed in the social hierarchy, but who's been living in it for a very long time?

463
01:02:31,640 --> 01:02:37,640
Well, I retweeted that with the comment, I turned 55 this month and I've certainly moved in the direction Bo is describing here.

464
01:02:37,640 --> 01:02:40,640
It's not so much that I now love the system.

465
01:02:40,640 --> 01:02:46,640
I just don't trust the raging youngins to build something more functional and humane after they smash the system.

466
01:02:46,640 --> 01:02:49,640
Rabid tyranny follows revolution.

467
01:02:49,640 --> 01:02:57,640
To which my friend Liam, I'll use his full name because he's a very public figure, Liam Madden, who was running for Congress in Vermont.

468
01:02:57,640 --> 01:03:03,640
He replied, I can relate to having no faith in the revolutionaries and he puts revolutionaries in scare quotes,

469
01:03:03,640 --> 01:03:12,640
but I'm still every bit in favor of a thoughtful reform effort to our political economy that hopefully upends the way we do things at nearly every level,

470
01:03:12,640 --> 01:03:17,640
even if it is not carried out by the rage of the youngins.

471
01:03:17,640 --> 01:03:24,640
To which I posted half in jest, incrementalist reformers like you will be the first against the wall when the revolution comes.

472
01:03:24,640 --> 01:03:40,640
And it's true in as much as the really motivated, passionate political leftists that I know hate centrists and incremental reformers far more than they hate the right wing.

473
01:03:40,640 --> 01:03:44,640
And you know, I put that down to the narcissism of small differences.

474
01:03:44,640 --> 01:03:50,640
We really get bent out of shape by the people who differ from us in their opinions by a smidge.

475
01:03:50,640 --> 01:03:59,640
Somebody who's on the other side of the planet, you know, in terms of their opinions, I don't get particularly bent out of shape by, you know, like flat earthers.

476
01:03:59,640 --> 01:04:03,640
I'm very confident that the earth is an oblate spheroid.

477
01:04:03,640 --> 01:04:11,640
I'm very confident that the earth orbits the sun, that the moon orbits the earth.

478
01:04:11,640 --> 01:04:17,640
And somebody who thinks otherwise, I have no real interest in engaging them on that topic.

479
01:04:17,640 --> 01:04:20,640
I have honestly no real interest in changing their mind.

480
01:04:20,640 --> 01:04:25,640
I'm into astronomy. I'm into astrophysics. I like that stuff.

481
01:04:25,640 --> 01:04:36,640
To them, those are two disciplines which have no basis in reality because the earth is basically a plate and the sun and the moon are two things just spinning around above the plate.

482
01:04:36,640 --> 01:04:43,640
And the stars and everything else that we see out there is just a projection on this thing they call the firmament.

483
01:04:43,640 --> 01:04:46,640
Am I bothered by those people? No.

484
01:04:46,640 --> 01:05:01,640
Am I bothered by people who agree with me that the earth is a sphere, that it orbits the sun, that the sun, you know, is orbiting the center of the Milky Way galaxy and it is one of billions if not trillions of galaxies in the universe?

485
01:05:01,640 --> 01:05:07,640
They agree with me on all that, but they disagree that public funds should be spent on space exploration.

486
01:05:07,640 --> 01:05:10,640
Those are the people I get bent out of shape talking to.

487
01:05:10,640 --> 01:05:17,640
Again, the narcissism of small differences. You might think, well, that's a very big difference. You have different spending priorities.

488
01:05:17,640 --> 01:05:33,640
You know, a lot of people say, hey, why are we spending money to do stuff in space when there's poverty here on earth, when there's malnutrition here on earth, when there are problems that could be addressed using the resources that we're, you know, squandering, throwing astronauts at the moon.

489
01:05:33,640 --> 01:05:43,640
The four crew people, can't say crewman anymore, but the crew for the next Artemis mission, which will orbit the moon, not land on it, just orbit it.

490
01:05:43,640 --> 01:05:46,640
They're going up next year. The crew has been announced.

491
01:05:46,640 --> 01:05:52,640
And on the far side of the moon, their orbit is going to take them far from the moon.

492
01:05:52,640 --> 01:05:58,640
And these four people, one of whom is black, one of whom is a woman, and one is Canadian, as is Ashley Frawley.

493
01:05:58,640 --> 01:06:04,640
These four people will travel further from the earth than any other humans ever have.

494
01:06:04,640 --> 01:06:09,640
To me, that's not only good news, but it's really good news.

495
01:06:09,640 --> 01:06:13,640
It's a big fucking deal. It makes me happy.

496
01:06:13,640 --> 01:06:17,640
All right. Well, I'm just rambling and I do like to keep these things to about an hour.

497
01:06:17,640 --> 01:06:22,640
So you can hear the rest of my conversation with Ashley Frawley on the next episode of the C-Realm Vault podcast.

498
01:06:22,640 --> 01:06:33,640
And you can hear hours and hours and hours of Ashley Frawley and Doug Lain talking on the Sublation Magazine show on the Sublation Media YouTube channel.

499
01:06:33,640 --> 01:06:36,640
Links, of course, in the description.

500
01:06:36,640 --> 01:06:40,640
All right. Thank you so much for listening. I will talk to you again soon.

501
01:06:40,640 --> 01:06:55,640
Stay well.

