1
00:00:00,000 --> 00:00:05,000
I feel your eyes burning

2
00:00:05,000 --> 00:00:10,000
Down to my bones

3
00:00:10,000 --> 00:00:15,000
But I see the truth now

4
00:00:15,000 --> 00:00:20,000
A little bit more alone

5
00:00:20,000 --> 00:00:23,000
I run through the trees

6
00:00:23,000 --> 00:00:25,000
As fast as I can

7
00:00:25,000 --> 00:00:27,000
And I fall to my knees

8
00:00:27,000 --> 00:00:30,000
And I find a place to land

9
00:00:30,000 --> 00:00:32,000
Whisper in the moonlight

10
00:00:32,000 --> 00:00:35,000
Stopping at my feet

11
00:00:35,000 --> 00:00:37,000
See into the shadows

12
00:00:37,000 --> 00:00:40,000
Taking over me

13
00:00:43,000 --> 00:00:44,000
Anybody need an answer?

14
00:00:49,000 --> 00:00:51,000
Good evening, people of the Bone Club.

15
00:00:51,000 --> 00:00:53,000
We're coming to you live from Baltimore

16
00:00:53,000 --> 00:00:56,000
and the American Academy of Forensic Sciences conference.

17
00:00:56,000 --> 00:00:59,000
I'm here with Stephanie.

18
00:00:59,000 --> 00:01:00,000
Hey, everyone.

19
00:01:00,000 --> 00:01:03,000
And a whole bunch of friends we've gathered.

20
00:01:03,000 --> 00:01:05,000
Hello.

21
00:01:05,000 --> 00:01:06,000
Yes.

22
00:01:06,000 --> 00:01:09,000
Do you all want to go ahead and introduce yourselves?

23
00:01:09,000 --> 00:01:11,000
Sure, I can start.

24
00:01:11,000 --> 00:01:13,000
Hi, my name is Taylor Flaherty.

25
00:01:13,000 --> 00:01:17,000
I am a doctoral candidate at the University of Nevada, Las Vegas.

26
00:01:17,000 --> 00:01:22,000
And I specialize specifically in biocultural forensic anthropology.

27
00:01:22,000 --> 00:01:26,000
I'm Thomas Delgado.

28
00:01:26,000 --> 00:01:28,000
I'm a...

29
00:01:28,000 --> 00:01:29,000
Wow.

30
00:01:29,000 --> 00:01:31,000
Who am I?

31
00:01:31,000 --> 00:01:35,000
I am a doctoral student at the University of Utah.

32
00:01:35,000 --> 00:01:38,000
I focus on the isotopic analysis of human tissues

33
00:01:38,000 --> 00:01:42,000
and structural vulnerability in marginalized populations.

34
00:01:42,000 --> 00:01:44,000
Hello, I am Khalil Mahul.

35
00:01:44,000 --> 00:01:48,000
I am a lecturer of forensic anthropology

36
00:01:48,000 --> 00:01:51,000
at the American University of Science and Technology.

37
00:01:51,000 --> 00:01:55,000
My research focus is on human bone remains.

38
00:01:55,000 --> 00:01:57,000
My name is Skylar.

39
00:01:57,000 --> 00:02:01,000
I am currently a PhD student at Leiden University in the Netherlands.

40
00:02:01,000 --> 00:02:06,000
My PhD topic covers the missing and deceased migration crisis

41
00:02:06,000 --> 00:02:10,000
in the Mediterranean and specifically how the lack of state accountability

42
00:02:10,000 --> 00:02:13,000
impacts the search process.

43
00:02:13,000 --> 00:02:15,000
Hi, my name is Hannah Carson.

44
00:02:15,000 --> 00:02:18,000
I'm a PhD student at the University of Montana.

45
00:02:18,000 --> 00:02:22,000
My research focuses more on combining skeletal and molecular techniques

46
00:02:22,000 --> 00:02:26,000
on ancient remains, but my background is mostly in forensic anthropology,

47
00:02:26,000 --> 00:02:29,000
so that's my passion.

48
00:02:29,000 --> 00:02:30,000
I guess that's me.

49
00:02:30,000 --> 00:02:31,000
I'm Sam Blatt.

50
00:02:31,000 --> 00:02:35,000
I'm an associate professor at Idaho State University.

51
00:02:35,000 --> 00:02:40,000
And I dabble in forensics and bioarchaeology in the forensic world.

52
00:02:40,000 --> 00:02:46,000
I do novel identification and biocultural practices.

53
00:02:46,000 --> 00:02:49,000
I'm a professor at the University of Montana.

54
00:02:49,000 --> 00:02:51,000
I do research on different species of animals,

55
00:02:51,000 --> 00:02:54,000
but I also love teeth and really old bones.

56
00:02:54,000 --> 00:02:56,000
I travel around doing that whenever I can, too.

57
00:02:56,000 --> 00:02:57,000
You can keep the teeth.

58
00:02:57,000 --> 00:02:59,000
No, I'm totally cool.

59
00:02:59,000 --> 00:03:00,000
You can have the teeth.

60
00:03:00,000 --> 00:03:01,000
You don't want to deal with them.

61
00:03:01,000 --> 00:03:02,000
I am your girl.

62
00:03:02,000 --> 00:03:03,000
Third molars.

63
00:03:03,000 --> 00:03:04,000
I'll take it.

64
00:03:04,000 --> 00:03:10,000
I'll be in Idaho soon.

65
00:03:10,000 --> 00:03:15,000
A good place to start the conversation about AAFS, and kind of what's going on at the conference, is to talk about the location.

66
00:03:15,000 --> 00:03:20,000
As Ashley said, we're in Baltimore. How'd everyone feel about Baltimore this week?

67
00:03:20,000 --> 00:03:26,000
I know for myself, coming from Hawaii, I was frozen. So I am so cold.

68
00:03:26,000 --> 00:03:31,000
I mean, I'll say that for me, coming from Idaho, it's still very cold.

69
00:03:31,000 --> 00:03:35,000
Yeah, yeah. So it's freezing.

70
00:03:35,000 --> 00:03:40,000
We didn't even get the snow we were promised. So that's a little bit.

71
00:03:40,000 --> 00:03:43,000
We got some flakes.

72
00:03:43,000 --> 00:03:47,000
I'm in Salt Lake. I'm in the snow.

73
00:03:47,000 --> 00:03:53,000
I mean, since I got here on Saturday, it's snowed, I think, close to two feet in Montana.

74
00:03:53,000 --> 00:03:56,000
So I'm not missing that.

75
00:03:56,000 --> 00:04:06,000
Well, I know the big thing about this year's AAFS was that every year there's a theme that kind of helps focus what research gets presented.

76
00:04:06,000 --> 00:04:14,000
And this year's theme was about technology and AI. And I know there are a lot of really good presentations on all of that.

77
00:04:14,000 --> 00:04:19,000
There are definitely some people here on this podcast that have even presented on it.

78
00:04:19,000 --> 00:04:28,000
So if any of you all want to, like, briefly talk about what you thought about the topic, and, yeah, whatever.

79
00:04:28,000 --> 00:04:35,000
You know, yeah, we can talk about AI. So yeah, this year's theme was something, something technology.

80
00:04:35,000 --> 00:04:38,000
A tool for something or tyranny.

81
00:04:38,000 --> 00:04:40,000
Or tyranny.

82
00:04:40,000 --> 00:04:43,000
Like there's a something or tyranny.

83
00:04:43,000 --> 00:04:47,000
Tools or tyranny.

84
00:04:47,000 --> 00:04:57,000
Yeah. Anyway, but the theme was technology and how we interact with technology, and our relationships with it, and why.

85
00:04:57,000 --> 00:05:00,000
Transformation.

86
00:05:00,000 --> 00:05:04,000
You only remember the tyranny. That's the important part of this discussion.

87
00:05:04,000 --> 00:05:09,000
I love transformer neural networks. They're very cool.

88
00:05:09,000 --> 00:05:20,000
So yeah, so the theme I thought was very interesting and particularly relevant. You know, forensic anthropologists have been integrating artificial intelligence technologies into their research for the better part of a decade.

89
00:05:20,000 --> 00:05:29,000
But we haven't yet started really critically examining our relationships with these techniques, the ethics, the bioethics and the politics that come alongside of them.

90
00:05:29,000 --> 00:05:44,000
And also understanding them at a higher-than-anthropology, but still semi-rudimentary, computer science level, so we can make these really critical decisions when we're actually utilizing them and presenting novel research with them.

91
00:05:44,000 --> 00:05:57,000
So I loved the theme. Taylor Flaherty and I both organized a symposium that centered around how we use artificial intelligence in forensic anthropology, focusing specifically on methodological applications.

92
00:05:57,000 --> 00:06:08,000
So, this is how we do the work, these are clear steps for how we did it, and how can we make this reproducible, accessible, and fair across the wider field.

93
00:06:08,000 --> 00:06:19,000
Yeah, and I think Ashley said something really good earlier about the symposium, if you wanted to repeat it.

94
00:06:19,000 --> 00:06:30,000
Well, it depends on what the thing was, I've said a lot about this. I thought it was very important. I think it is the direction of where forensic anthropology is going.

95
00:06:30,000 --> 00:06:38,000
I think it's the direction of where forensic science as an overarching field is going.

96
00:06:38,000 --> 00:06:52,000
But I do have reservations and questions. I asked Thomas this yesterday during the symposium, and I'll pose it to you again for everybody to hear, and I want everybody to jump in on this if they can.

97
00:06:52,000 --> 00:06:58,000
When we look at what we've done, I'll put it this way.

98
00:06:58,000 --> 00:07:11,000
So, years ago I wrote a paper on standardization in osteometrics, and I noted that, in a list of osteometric measurements,

99
00:07:11,000 --> 00:07:16,000
our field, in our study, couldn't get 80% on any one.

100
00:07:16,000 --> 00:07:22,000
It was all 80% or below, well, not even 80%, and it was just, we couldn't reach that 80% threshold.

101
00:07:22,000 --> 00:07:39,000
So, are those errors then replicated in databases, or do you have that same replication in any sort of measurements out there? If you have that error, either in databases or in user input within AI,

102
00:07:39,000 --> 00:07:53,000
would AI replicate that error in the future, or even magnify it, or can it serve a corrective function and say, hey, wait a second,

103
00:07:53,000 --> 00:08:02,000
this just seems like it might be off, you need to fix this? I mean, that was something that really sparked the discussion for planning the symposium.

104
00:08:02,000 --> 00:08:10,000
You know, I came to Thomas as a friend and a colleague who knows a lot about AI and computers, and I said, I have some thoughts.

105
00:08:10,000 --> 00:08:23,000
I have some questions. A lot of thoughts. And the biggest thing is that we have to remember that the biases that we have in ourselves and in our samples then get put into these algorithms, right?

106
00:08:23,000 --> 00:08:40,000
I do think that we have to be cautious about what data we use, what populations we apply data to, and things like that, so that we don't have our biases reflected in these computer methods, right?

107
00:08:40,000 --> 00:08:48,000
At the same time, it comes with its own ethics that we have to consider. And so that's fine too.

108
00:08:48,000 --> 00:09:00,000
I had lots of opinions yesterday on questions similar to yours, Ashley, on why or how we most ethically implement these tools.

109
00:09:00,000 --> 00:09:12,000
And I do think there's a risk and I do think we need to go forward and see how we can best remove our bias as much as possible from our data collection and from our methods that we're building.

110
00:09:12,000 --> 00:09:22,000
Can we remove our biases? Even using a computer algorithm, even using machine learning and neural networks and things like that,

111
00:09:22,000 --> 00:09:29,000
Can we ultimately, at the end result, remove our biases?

112
00:09:29,000 --> 00:09:35,000
Yeah, you know I will say, I would correct myself. Yeah, I don't think we can remove it fully.

113
00:09:35,000 --> 00:09:51,000
I don't think we can. I think even things like if you've ever seen the very basic example of Googling different types of professional attire for different like cultural groups of people, the bias is so built into that, where if you look at like professional

114
00:09:51,000 --> 00:10:09,000
attire for Black women, their natural hair very infrequently shows up in a Google Images search. And so we are seeing our biases reflected in these, and so I guess you're absolutely correct. I wouldn't say we remove it. We have to see how we can accommodate it and how we can challenge it.

115
00:10:09,000 --> 00:10:23,000
And I think that also just like comes down to, can the field even agree on what's an okay bias? Because bias isn't necessarily inherently negative, but as a field we tend to not even agree on what's okay.

116
00:10:23,000 --> 00:10:41,000
Like, what types of sample bias are maybe good for, you know, a certain study, or what is actually negatively affecting how we're implementing this on a broader spectrum. And there doesn't seem to be consensus. So I know for myself, going into this,

117
00:10:41,000 --> 00:10:59,000
like, this conference, I was cautiously optimistic, I guess, because in my job, we don't really use the technology in that same way. I feel like we're a little stuck, a little bit more old school, just because of, you know, who we're working with and what populations

118
00:10:59,000 --> 00:11:25,000
we're trying to identify. And so the methods somewhat work, and so they're kind of not changing too, too much. And so I was just like, if we can't even agree as a field what's okay, as far as what's standard and what we're willing to have in our science, how are we going to progress forward with technologies and AI and other things like that?

119
00:11:25,000 --> 00:11:33,000
And so we can't even agree on how to do an age assessment by itself, using what we have already. So, et cetera.

120
00:11:33,000 --> 00:11:41,000
And then, I guess for me, just to go back, because I have a lot of thoughts on that, but I do want to address your first question as well, Ashley.

121
00:11:41,000 --> 00:11:58,000
Just to reiterate what I said in the symposium, I think that we have to take a critical look at how we interact with data and how we interact with this observer error that we see throughout our measurements. AI will not correctively look for that.

122
00:11:58,000 --> 00:12:12,000
I guess I shouldn't say that. Could you code it, could you develop a tool that does it? Yes. Do I think it would be really useful and applicable to what we do? No, because you need to be correcting that error on the back end instead of just saying there is error here.

123
00:12:12,000 --> 00:12:29,000
And I think I saw a lot of really good posters and presentations these last couple days about measuring inter and intra observer error and how we approach most of our biological assessment techniques. So I think right now we're having some of these really critical conversations.

124
00:12:29,000 --> 00:12:39,000
AI is not going to be able to learn to fix our problems for us. That has to be fixed in the data and how we handle the data.

125
00:12:39,000 --> 00:12:43,000
I don't think we should imagine that it could do that either. I agree.

126
00:12:43,000 --> 00:12:47,000
Then we're, we're still the critical thinkers.

127
00:12:47,000 --> 00:13:03,000
We need to maintain that level. We can't just put things in the black box and expect, oh, now this is correct. Like, the validation of AI needs to continue. Yes. And on a selfish level, I'm not trying to work myself out of a job.

128
00:13:03,000 --> 00:13:18,000
Well, I think that was a big takeaway yesterday, right? Multiple presenters said AI is not here to take our jobs. It's not at that level. We have to be concerned about fixing our own error via these methods.

129
00:13:18,000 --> 00:13:47,000
But to be a contrarian, and I believe, Skylar, you were at the talk today. I believe you were there. There was a talk this afternoon about using AI programs such as ChatGPT and DOC3, where you can enter into the program a three-dimensional photograph,

130
00:13:47,000 --> 00:14:03,000
a three-dimensional image, and ask it to do, in this case, a sex and age assessment, and would it give you an accurate result with reasoning behind it.

131
00:14:03,000 --> 00:14:23,000
And their result was yes, only they used two images. I have a severe problem with an N of two. Yeah. But that seems to be just me. But I also have a problem with the simplicity of just that. Yes.

132
00:14:23,000 --> 00:14:38,000
And I think that got really well addressed by Dr. Carolyn Isaacs, who was a presenter at our symposium. She's a professor at Michigan State University. And I think she did a really good job at showing that actually no.

133
00:14:38,000 --> 00:14:55,000
It's like, we can feed ChatGPT, these large language models, these chest radiographs, an antemortem to postmortem comparison, and say, can you do it? And it might do it. Well, yeah. But if you ask for real technical reasoning, it does not and cannot provide that.

134
00:14:55,000 --> 00:15:09,000
Yeah. And I think a large part of that is because we do lack some standardized language. If we trained an LLM on a set of standardized definitions that it could then regurgitate at us, then yes, I think it could do that, or simulate it.

135
00:15:09,000 --> 00:15:24,000
But I don't think we're at that point. And I think Dr. Isaacs definitely showed this as well, where large language models in their raw base form could come close to approximating, you know, forensic practitioner vocabulary.

136
00:15:24,000 --> 00:15:28,000
Could it in the future? I don't know.

137
00:15:28,000 --> 00:15:33,000
Peering into the future, and I know us peering into the future is...

138
00:15:33,000 --> 00:15:38,000
Yes, it's also.

139
00:15:38,000 --> 00:15:44,000
It's also everything that happened.

140
00:15:44,000 --> 00:15:58,000
Yeah, it's just, anything and everything can happen in the future, but when you look at things like what Dr. Isaacs was saying, I look at chest X-rays, I look at chest CTs.

141
00:15:58,000 --> 00:16:18,000
And yes, you can do identifications and have our standards standardized, but we don't have the right language on how to do identifications. And you could train a large language model on trying to do those identifications, but one of the

142
00:16:18,000 --> 00:16:31,000
reasons why we see those CTs in the first place, why we see those X-rays in the first place, is because of pathologies, and pathologies do not lend themselves to standardization.

143
00:16:31,000 --> 00:16:35,000
They've never lent themselves to standardization.

144
00:16:35,000 --> 00:16:54,000
So how can a large language model, or machine learning, or any other sort of AI algorithm pick up on the non-standardization and irregularities of pathologies and use them for identification?

145
00:16:54,000 --> 00:17:06,000
So the standardization of, and the lack of standardization of, biomedical terminology, which is a whole other rabbit hole. And I can definitely talk... yeah, you actually probably could, maybe not with large language models.

146
00:17:06,000 --> 00:17:14,000
But with convolutional neural networks or transfer learning, which we also saw usage of in our symposium yesterday.

147
00:17:14,000 --> 00:17:28,000
Yeah, you probably could. But I think there's a huge caveat here. It would have to be only learning about traditionally expressed forms of pathology. I think that there's some level of pathological variability.

148
00:17:28,000 --> 00:17:37,000
I mean, diffuse idiopathic skeletal hyperostosis, I mean, that is the most wild thing.

149
00:17:37,000 --> 00:17:50,000
It looks like dripping candle wax, actually. So, those can take on such varying forms of complexity, and then if you're only getting, like, this much of the vertebral column,

150
00:17:50,000 --> 00:17:54,000
okay, is it DISH or is it just some really bad osteoarthritic lipping?

151
00:17:54,000 --> 00:18:07,000
So I think it could do something, but I don't think it's going to be able to account for those varied pathologies, which I actually think are more common than the textbook ones.

152
00:18:07,000 --> 00:18:27,000
So I think it's going to come back to, which will be, you know, the topic of next year's back to basics, what anthropology is, and it's really looking at the vast variability of human existence and trying to categorize it to a certain extent

153
00:18:27,000 --> 00:18:41,000
and trying to use it in a cultural setting. So, like, you can get something from a computer that will help maybe guide where you're going with it, but at the end of the day, it's still going to have to be a human on the other end, looking at what

154
00:18:41,000 --> 00:18:58,000
you're doing with your notebooks, and then saying, does that make sense to begin with, and then how do we use that in our practice? I mean, I know I use ChatGPT to create my grocery lists and stuff because I'm lazy and I don't know how to cook, but at the end of the day,

155
00:18:58,000 --> 00:19:10,000
I might have to look at it and be like, you know, maybe I'm not feeling like Mexican food tonight, and I want something else. Like, you know, that's a very simplified version, whatever, but I think that is the answer, you still have to have that human aspect to it.

156
00:19:10,000 --> 00:19:25,000
I think that's essential. I definitely don't think we're ever going to be in an environment where there's not a human on the outside. I personally, 50 years down the road, could be completely wrong. I really don't think it's at that level.

157
00:19:25,000 --> 00:19:29,000
Yeah, but by that time I'll be retired.

158
00:19:29,000 --> 00:19:45,000
And again, this is the same thing that the original AI founders said in 1943, but, like, it stagnated for 20 years, nothing happened, and then we got more powerful computers. So it might be soon, it might be a little while.

159
00:19:45,000 --> 00:19:50,000
Is it theoretically possible? Yes, because it's just the replication of neurons.

160
00:19:50,000 --> 00:20:01,000
So theoretically it shouldn't be too challenging, but I just, I don't think we can teach a machine all of that. Yeah, like, I think the contextual information, unless you can feed it all those variables...

161
00:20:01,000 --> 00:20:11,000
And in that case, you'd literally need everything published, and we're not there. I don't know if that's possible. Watson's not there yet. Not yet, but maybe. I do have a question.

162
00:20:11,000 --> 00:20:15,000
To change it up a little bit.

163
00:20:15,000 --> 00:20:20,000
And for me, for my research.

164
00:20:20,000 --> 00:20:33,000
I've had to switch from using ImageJ to a different software package because my, my images were just so large and it crashes.

165
00:20:33,000 --> 00:20:48,000
When you're looking at sort of these large machine learning programs, neural network programs, and things like that, when you're dealing with computational images,

166
00:20:48,000 --> 00:21:04,000
3D is obviously better than 2D. The greater the point cloud, the better the image, the better the outcome. But is there a plateau at which, you know, beyond this point,

167
00:21:04,000 --> 00:21:16,000
you're just getting nothing better from it, and just crashing the system, or just not receiving what you need out of it?

168
00:21:16,000 --> 00:21:19,000
The literal answer. Yes.

169
00:21:19,000 --> 00:21:35,000
I'm sorry I'm talking the most about this, I don't want to drone on. But the literal answer is yes, there is a point. There is such a thing as data saliency, where you hit a point at which more data do not necessarily result in better results.

170
00:21:35,000 --> 00:21:47,000
For forensic anthropological sample sizes, we don't reach that level of saliency. But when we're talking, like, raw resolution? Yes, especially when it comes to human perception.

171
00:21:47,000 --> 00:21:56,000
We can only perceive so much. We can actually replicate a lot more pixels than we can perceive, and a lot more colors than we can perceive, with our technology.

172
00:21:56,000 --> 00:22:06,000
We do it because we can, I guess. So yeah, you do hit a point of diminished saliency in basically any aspect of technology.

173
00:22:06,000 --> 00:22:19,000
No, I think with, even just with data sets, trying to get the number of individuals into the system, I don't think we're ever going to get to a point where we can reach that, because we just don't have the files.

174
00:22:19,000 --> 00:22:30,000
And we won't have the bodies. I think the only way that we could get close to that point is if we start routinely collecting medical data from the living.

175
00:22:30,000 --> 00:22:39,000
I think we could get enough patients, at some very far point in the future, that we could have a big enough data set to have to worry about that.

176
00:22:39,000 --> 00:22:52,000
I think we could get living CT data. The problem that I see is the ethics related to getting living CT data.

177
00:22:52,000 --> 00:23:04,000
Or even deceased CT data, because, particularly with living individuals, they may consent to their data being used for project X.

178
00:23:04,000 --> 00:23:09,000
What about Y, Z, or A?

179
00:23:09,000 --> 00:23:24,000
At what point do we say, hey, we want to take your CT and use it for research, and is that ethical to even ask?

180
00:23:24,000 --> 00:23:40,000
I think it goes back to our next year's theme of back to basics, because we're just collecting data, and then nobody knows about it, and they don't, again, have their say, their consent, which they can't give generations later.

181
00:23:40,000 --> 00:23:44,000
And this is where we started.

182
00:23:44,000 --> 00:24:01,000
Are we just doing it again in just a different way? And there are very valid reasons why people do not want their information out there, on many different levels, not just medical, you know, from history too.

183
00:24:01,000 --> 00:24:21,000
And so that hasn't been established, how that should work. Nobody agrees yet how that should work. So I think we need to take this moment that we're sitting in now to try to understand how to build something that's larger than us to work with.

184
00:24:21,000 --> 00:24:23,000
And I don't know the answer. Yeah.

185
00:24:23,000 --> 00:24:43,000
I think a lot of it is, like, I feel like a good portion of our field believes that, I wouldn't say all, but we have to just keep in mind that yes, while this might be data, or this might be a collection that we're trying to learn from, and we're trying to do our best to progress science in a way that's better for the future,

186
00:24:43,000 --> 00:24:58,000
it's still people that we're working on, and they had lived experiences and lives and families that cared about them, and, like, you can't ever forget that aspect when doing any research moving forward.

187
00:24:58,000 --> 00:25:14,000
Even as a practical measure, there are so many variables, and everybody is so different. At some point you'd be like, I don't know the variables, so I can't even use this, and then you've collected and stored things, just like today in museums, and you don't know what's the

188
00:25:14,000 --> 00:25:30,000
answer to that. Yeah.

189
00:25:30,000 --> 00:25:48,000
In case there was a data breach. Yeah, you know, it's all information.

190
00:25:48,000 --> 00:26:04,000
Because we're collecting, collecting, collecting. And even though we have all the bioethics behind us, collecting... where is all that secure, or maybe someone uses it for different purposes that we're still not aware of, you know, at this point.

191
00:26:04,000 --> 00:26:19,000
The problem I have, and I'm shooting myself in the foot knowing I'm shooting myself in the foot, is the NMDID, the New Mexico Decedent Image Database.

192
00:26:19,000 --> 00:26:38,000
There was a session a few years ago, and they were presenting the database, and someone asked if these decedents had given their authorization to be used for research.

193
00:26:38,000 --> 00:26:54,000
And the response was, in the state of New Mexico, it's publicly accessible data because, there, all decedent, all medical examiner records are public data.

194
00:26:54,000 --> 00:27:19,000
So it's out there. And I had at the time, and still have, a real ethical problem with that database, because these people did not give consent for their CT data, for their autopsy data, to be used for research, and a lot of them are, yes, Native American

195
00:27:19,000 --> 00:27:27,000
individuals, indigenous individuals, who our field has historically not been the nicest to.

196
00:27:27,000 --> 00:27:42,000
And this is where like ethics and like legal frameworks don't necessarily line up, because there is a massive objectification of deceased individuals where people just forget that the body used to be a human being.

197
00:27:42,000 --> 00:27:50,000
And a skeleton is not just like a bunch of random objects in the ground. It's like, that was a person, that is a person, that person has a family.

198
00:27:50,000 --> 00:28:09,000
It doesn't matter, you know, if it's legally, you know, allowed. We should be thinking more deeply into those ethical ramifications of how that person, and the data around that person, can impact people who are still living in very big ways.

199
00:28:09,000 --> 00:28:25,000
And I mean, I saw this on, like, Facebook or something, some site like that, but it was something that essentially said, just because it's legal doesn't mean that it's right. And that really, like, in your talk, you had said there are no rights for the dead.

200
00:28:25,000 --> 00:28:40,000
And it really, you know, hit home, because I was like, yeah, this is the legal applicability, which is what forensic science is supposed to be, you know, applying anthropology to a legal question.

201
00:28:40,000 --> 00:28:57,000
So, is the legal question really all that we should be worried about? Are we looking at the full person, not just for the purpose of answering that question?

202
00:28:57,000 --> 00:29:04,000
I have used NMDID a bit.

203
00:29:04,000 --> 00:29:14,000
I don't know what my full answers are to all of these thoughts. I've heard Ashley specifically express these concerns before.

204
00:29:14,000 --> 00:29:33,000
And it's really hard because on one hand, we do get valuable information. Yes, we do get valuable, valuable, valuable information that can actually stand a chance at representing our actual populations, which we have never done.

205
00:29:33,000 --> 00:29:45,000
We have never done that in anthropology. We have never accurately represented our populations, applying methods accurately to marginalized populations. We've never done that.

206
00:29:45,000 --> 00:29:50,000
And at the same time, there are all of these ethical questions.

207
00:29:50,000 --> 00:29:58,000
How far past the date of your death do you get consent over your body? When do you stop having long-term autonomy?

208
00:29:58,000 --> 00:30:09,000
I think this is a broader question for biological anthropology. Sorry to burst the bubble. But like, we do the same thing with archaeological remains.

209
00:30:09,000 --> 00:30:18,000
You know, we don't, we don't contact descendant communities and get consent from the descendants to excavate their potential loved ones.

210
00:30:18,000 --> 00:30:29,000
We see a burial in the ground and we excavate it, and we store it somewhere, most often in a lab, and it very infrequently gets repatriated to who it belongs to.

211
00:30:29,000 --> 00:30:34,000
So, do we bring into question our entire field?

212
00:30:34,000 --> 00:30:58,000
I think we do. I think we do. But that is the purpose of being an anthropologist, and that is the purpose of being young anthropologists: we always have to not only honor and respect what those that have gone before have done, but we have to challenge it and find better ways.

213
00:30:58,000 --> 00:31:04,000
And that's kind of why, I know, like, don't get me wrong, I love listening to everyone's talks in the anthropology section.

214
00:31:04,000 --> 00:31:18,000
But because I know most of y'all are like, oh, I'll hear about your research later, I like to go to the other section talks, because most other sections within the American Academy of Forensic Sciences are not humanitarian workers.

215
00:31:18,000 --> 00:31:35,000
You know, they are service members, like, they give service to a community in some regard, but it's not necessarily from a humanitarian background, with all the cultural adaptations that are necessary in order to do it properly for any given population.

216
00:31:35,000 --> 00:31:54,000
And so I like to go and just see, like, okay, what can I potentially bring, or not just me, because, like, I'm no one, but what can this question bring to the forefront of other fields within the forensic sciences, to try to help the field of forensic science

217
00:31:54,000 --> 00:32:12,000
to be more humanitarian, more human, because we're answering these legal questions, but about people, and I think more than just anthropology needs that. That's why, when Ashley asked me to do this podcast to begin with, I was like, yes, because I want more people to not,

218
00:32:12,000 --> 00:32:28,000
you know, not just talk to anthropologists about anthropology, because we can go around and around in circles about the ethics of anthropology, and I think we should, but it has to be more than just us. And so that's why I like coming to conferences like this, to go to those other talks

219
00:32:28,000 --> 00:32:47,000
and really see, okay, what do the other fields care about, and what are they doing research on, and in what ways? Where you spend all day looking at drugs but never see the human side of things. And coming from Massachusetts, we had two chemists, Annie

220
00:32:47,000 --> 00:33:02,000
Dookhan and Sonja Farak, who did not see the human side of things, and ended up, at the end of the day, with close to 60,000 drug convictions.

221
00:33:02,000 --> 00:33:21,000
It was, like, last year, maybe the year before, I'm not 100% sure, my memory is rough, but they were talking about, just, what is objectivity in science, and it's like, yes, you're trying to be as objective as possible in order to make good science, and I think

222
00:33:21,000 --> 00:33:41,000
that would probably be something that we talk about next year for back to basics, but people aren't objective and people are the ones doing science and so like trying to just get other fields to just slap them across the face a little bit to be like, you're the one doing it, you're the one doing science, you're not objective, think through what you're doing.

223
00:33:41,000 --> 00:33:58,000
I try to drive that point home so frequently. I did a poster presentation yesterday on part of my dissertation, and I am talking about biocultural approaches to forensics all the time, because I fully recognize that we cannot be completely objective and that we have to recognize

224
00:33:58,000 --> 00:34:02,000
our biases, so we can understand how to challenge them.

225
00:34:02,000 --> 00:34:12,000
I had a poster yesterday and someone said well what is the key takeaway, and I said the key takeaway is that we have to look at these deceased individuals as a whole person.

226
00:34:12,000 --> 00:34:22,000
We can't just do biology, we can't just do culture, we can't just do toxicology, we have to use our whole anthropology brains to do what we need to do.

227
00:34:22,000 --> 00:34:26,000
And I think, yes, indeed.

228
00:34:26,000 --> 00:34:38,000
I think you're more than just the sum of your parts, you know, you're more than just your biology, you're more than just your culture. You know, like, you are all of those things all at the same time, and that doesn't change when you die, and they play roles together.

229
00:34:38,000 --> 00:34:51,000
you know what I'm constantly screaming at the top of my lungs to whoever will read my papers is that biology and culture interact and you can embody your culture and your biology can affect your culture and vice versa.

230
00:34:51,000 --> 00:35:08,000
And so we really have to critically examine how those things interact with each other to best represent individuals, to identify them, obviously, the first priority is to identify and repatriate, and also to build better, you know, stronger databases that can be

231
00:35:08,000 --> 00:35:10,000
used for research.

232
00:35:10,000 --> 00:35:28,000
And I think it's also not just about the decedent, even though our job is to not worry about who did what and to focus on the missing. We also have to keep in mind what the end results of our analyses are used for.

233
00:35:28,000 --> 00:35:54,000
And with that forum that we held earlier, that was the purpose of it, having individuals from the Innocence Project, showing that forensic science has this incredible power to wrongfully convict if misused, and when you convict someone, you upend not just a life, you upend lives and communities.

234
00:35:54,000 --> 00:35:57,000
And we have to recognize that.

235
00:35:57,000 --> 00:36:01,000
We were talking about this earlier today.

236
00:36:01,000 --> 00:36:23,000
And something that I've thought about in recent years, and that I try to challenge people to think about too, is why forensic anthropologists who do casework, right, who actively try to help in the forensic system, are only ever considered to be experts for the prosecution.

237
00:36:23,000 --> 00:36:34,000
And it's very infrequent that defense attorneys are able to get access to us because we typically work in medical examiner's offices and state and federal institutions.

238
00:36:34,000 --> 00:36:44,000
And so we have to think to ourselves, is our science going to put someone into a system that's so, so, so...

239
00:36:44,000 --> 00:37:06,000
And how do we challenge these cases where there have been issues, where there has been an inaccurate application of the methods? How could we start integrating ourselves into that approach to make sure that people are not getting incarcerated, particularly when they are not guilty of what they're being charged with?

240
00:37:06,000 --> 00:37:17,000
And going through a trial alone has a huge impact on someone's life. Even if they're not convicted, going through the system, the financial and social and emotional strain put on someone is really, really huge.

241
00:37:17,000 --> 00:37:30,000
Not even the trial. Just the mere indictment. Even before you get to the indictment, a defense can cost hundreds of thousands of dollars.

242
00:37:30,000 --> 00:37:40,000
And then, I mean, I don't know who's applied for a job recently, but it doesn't ask were you convicted of a crime, it asks were you charged with one.

243
00:37:40,000 --> 00:37:48,000
And that's even before the science happens really, you know, to a certain extent.

244
00:37:48,000 --> 00:38:07,000
I say, not to necessarily, like, change up the topic, but to let other people talk, I know I talk a lot too. I figure we can go around, and maybe everyone can say what your favorite part of the conference was, kind of bring it back to, like, what part of the academy, sorry.

245
00:38:07,000 --> 00:38:23,000
Like, what was maybe your favorite talk, maybe your favorite presentation, or just something you felt like you learned a lot from at this conference. I mean, honestly, my favorite presentation, still, at the end, was the speakers that Ashley brought in from the Innocence Project.

246
00:38:23,000 --> 00:38:39,000
Because I think, like, I would have loved that to have been the main plenary session. I would have loved the entire Academy to have heard it, instead of however many people were in the room, and to have it not be scheduled at the end of the day when everybody is exhausted.

247
00:38:39,000 --> 00:38:47,000
Because I do think those voices, particularly from Marvin, the person that was... it was Marvin, right? Yes, it was Marvin.

248
00:38:47,000 --> 00:39:08,000
It was Marvin, who was the wrongfully convicted individual. Like, more people in our field need to hear the consequences of the science and the trials going wrong, so that they aren't just in a prosecution-only mindset, which, while maybe we are not, too many people

249
00:39:08,000 --> 00:39:11,000
in forensics are.

250
00:39:11,000 --> 00:39:16,000
So thank you, Ashley.

251
00:39:16,000 --> 00:39:19,000
Hannah, what about you, what was your favorite?

252
00:39:19,000 --> 00:39:26,000
I really liked.

253
00:39:26,000 --> 00:39:50,000
A lot of the talks about looking at some of our metrics, and are they actually as accurate as the original studies said they were. There was a really awesome poster yesterday about an estimation method, and

254
00:39:50,000 --> 00:40:09,000
the original study claimed, like, 80% accuracy or higher, and then the student tested it and it was, oh well, you know, it's actually more like 50-something percent, and I'm like, okay, you know, let's think about lives, and, you know, using these brand-new

255
00:40:09,000 --> 00:40:13,000
technologies to assess.

256
00:40:13,000 --> 00:40:20,000
And that makes me excited for next year as well, I think, just getting back to basics about

257
00:40:20,000 --> 00:40:24,000
what our methods are about and stuff.

258
00:40:24,000 --> 00:40:30,000
And, like, the violence and violent...

259
00:40:30,000 --> 00:40:45,000
From my side, it's more about, as we discussed, the multi, let's say, approach, the approach that we have in the US, and then we can jump to another perspective, and see other things, see what others

260
00:40:45,000 --> 00:41:01,000
are doing. I think that's the highlight.

261
00:41:01,000 --> 00:41:19,000
And at the same time, there were other talks that made it seem too simple to believe, and I think that's the tricky part, that it's too easy to work with the machine learning or the LLMs and so on. So, it's not easy, I think, you know, well, it's not the way that it was presented.

262
00:41:19,000 --> 00:41:35,000
It was like, you do this, you have this, you do this, and you have this, you know. And that's the contrast that you have: from one side, it's simple for someone to understand how complex it is, and from the other, it's too simple to be used.

263
00:41:35,000 --> 00:41:41,000
But at the end, there is a question, my...

264
00:41:41,000 --> 00:41:56,000
Sometimes, I, like, I get it, no one wants the talks to last too long, because we're all exhausted by the end of the day. But sometimes I'm like, could this have been a 30-minute talk? I just need a little more detail, please.

265
00:41:56,000 --> 00:42:11,000
I think it's so true, though, you know, everyone has to rush, and then you're like, oh, no time for questions, and then people run off. Like, okay, I guess I'll try and find this person to ask my question.

266
00:42:11,000 --> 00:42:17,000
But yeah, no, it's always the hardest thing. It's like, everything's happening for the next few days, Thursday, Friday.

267
00:42:17,000 --> 00:42:30,000
This year, in the hall, we only had posters Friday, so there's no time to, like, fit everything in. You're, like, overwhelmed, trying to, like, do everything.

268
00:42:30,000 --> 00:42:34,000
Oh, I'm going to read that paper.

269
00:42:34,000 --> 00:42:38,000
Download it.

270
00:42:38,000 --> 00:42:53,000
My favorite is, you'll see them up there, and they'll use a software package and they'll do something with it, and, like, I tried that and couldn't get it to do squat. What extra thing are you doing?

271
00:42:53,000 --> 00:43:10,000
I think that's, I'll get back on my soapbox, that's one of the big issues I have and why we organized our symposium focused on methodology and making the methodology accessible and understandable. I think it is completely inadequate to say, I plugged some things into this program,

272
00:43:10,000 --> 00:43:25,000
some stuff happened, and then I had a result. Or even if you do that at the level of interacting with your computer scientists that way, where, I just gave the stuff to the computer scientists, they did their thing to it, and I have a program.

273
00:43:25,000 --> 00:43:35,000
I think we do have to critically evaluate how we interact with that data and how we apply these tools, and then actually say how we did it.

274
00:43:35,000 --> 00:43:40,000
That was not my favorite part of the conference. Those are my big opinions.

275
00:43:40,000 --> 00:43:55,000
I mean, that also, just, like, I think probably most of us here got into forensic science having watched a lot of crime shows growing up, and you're like, 24 hours and I got an answer to everything, and I think sometimes these conferences make me be like, no way you could do that in 15 minutes.

276
00:43:55,000 --> 00:44:06,000
You cannot. You know, you can't explain AI technology to me, who can barely get my computer to open Word, in 15 minutes. I'm sorry, you can't. I need more time.

277
00:44:06,000 --> 00:44:09,000
Sorry, go ahead.

278
00:44:09,000 --> 00:44:34,000
I guess from the professor's perspective, just as a general conference theme, like, talking about AI can be really important, because it gives accessible research to different students with different needs in different places that don't have what you might think we're going to have in every program,

279
00:44:34,000 --> 00:45:03,000
a huge storehouse of skeletons in the back, and that's just not the reality of those programs for students now. And so, something that, you know, you can pull, like, metadata from and look at in a different way, and use different technologies to do that, is increasingly important from the perspective of having graduate students and undergraduates who want to get into this field, and there are limitations

280
00:45:03,000 --> 00:45:12,000
in many ways for them, and so I think that can be important for students to see.

281
00:45:12,000 --> 00:45:14,000
For all you students out there.

282
00:45:14,000 --> 00:45:30,000
I did my thesis using entirely open-access data. I did not generate novel data. And I think that's important for us. Yeah, there's a discussion of whether we should be digging things up and storing them and maybe using them sometimes.

283
00:45:30,000 --> 00:45:44,000
I think that's the second thing, that's, yes, please. I mean, I love the data. Yeah, there's so much there, so, like, please use it. Not to belittle that there is a complexity in how we distribute and interact with human remains research data.

284
00:45:44,000 --> 00:45:49,000
But I've said this a lot, the biomedical field does it.

285
00:45:49,000 --> 00:45:53,000
They do human remains research, they do living human research.

286
00:45:53,000 --> 00:46:05,000
There are things that they do, and us, we don't have to just make it up from scratch. I mean, starting to make us go through proper IRB protocols would be a great first step.

287
00:46:05,000 --> 00:46:23,000
For those of you who don't know, any anthropologist or researcher who is doing a research study with humans or human remains has to apply for ethics approval through their university. And when you are using deceased remains, they are automatically not considered human

288
00:46:23,000 --> 00:46:39,000
subjects, and it's therefore not considered human subjects research. So that would be a huge first step: not just waving our hands away at things like the IRB, but actually going through the process to ensure that we are doing ethical research.

289
00:46:39,000 --> 00:46:44,000
And for those that don't know, IRB is Institutional Review Board.

290
00:46:44,000 --> 00:46:55,000
Research ethics. There you go. And I had to go through one. Yeah, I did too, but only because I'm using living samples. You should do it anyway. Yeah, yeah, even if you're using dead samples.

291
00:46:55,000 --> 00:47:06,000
Yeah, even if they exempt you. Yes, it's a very important process that you should be aware of. Yeah, so you know what the questions are. Yeah, and other people should be reviewing your work before you do it.

292
00:47:06,000 --> 00:47:22,000
Yeah, but it is an area where different jurisdictions differ. Like, in Canada, you have to file with an REB if you're using deceased humans; there is, you know, a box to check about it.

293
00:47:22,000 --> 00:47:40,000
And that kind of plays into something that Taylor and Thomas brought up earlier about the disclaimers that we started seeing on the first slide of presentations, which seems newish, but not standardized, and not on every presentation.

294
00:47:40,000 --> 00:47:57,000
Where they didn't include human remains, or had, I'm not including images of human remains because it's not necessary for what I'm talking about, or even mentioned, you know, land acknowledgments before we discuss things. Things like that are becoming

295
00:47:57,000 --> 00:48:04,000
more standard in other fields and other areas of biological anthropology, and here it was very clear that that's not standardized.

296
00:48:04,000 --> 00:48:19,000
Yeah, for those of you who weren't here, there were many presentations this year that started with some sort of statement that says, these are my opinions, and not the opinions of my employer or my funder or whoever it is, and we

297
00:48:19,000 --> 00:48:37,000
didn't have that specifically. The Academy required financial disclosures.

298
00:48:37,000 --> 00:48:50,000
There was something else that came out, and it was in quotes, you have to have this exact wording on your slide, and also have a financial disclosure that says, do you have any commercial financial disclosures.

299
00:48:50,000 --> 00:49:03,000
I did not. And I think that might be an interesting decision for the Academy to have made, particularly in our political climate, because of the political climate.

300
00:49:03,000 --> 00:49:08,000
Because,

301
00:49:08,000 --> 00:49:19,000
well, we only have five minutes, because of the current political climate, and the fact that the Academy does receive federal grants,

302
00:49:19,000 --> 00:49:33,000
they do not want, they're trying their best not to have the administration come down hard, and it literally can come down to our section. We're all in the anthropology section, or most of us are.

303
00:49:33,000 --> 00:49:39,000
The anthropology section has diversity initiatives that may get hurt.

304
00:49:39,000 --> 00:49:49,000
And if they come after the Academy and say, you can't have that, it'll hurt us on a grander scale.

305
00:49:49,000 --> 00:50:09,000
This podcast reflects our views and our views only, not those of our employers, the American Academy of Forensic Sciences, or anybody else under the sun.

306
00:50:09,000 --> 00:50:21,000
And on that note, yeah, I think we've talked a good amount, and if anything we've talked about today really resonated with you and you want us to go into more detail, we can have further podcasts about it. And to try not to, you know, make this podcast too long,

307
00:50:21,000 --> 00:50:31,000
maybe just, you know, you can listen to it on your ride home from work or school or whatever, but if there's anything you found really interesting and want us to talk about more, you can always reach out to some of our little guests here.

308
00:50:31,000 --> 00:50:35,000
I don't know if I said little.

309
00:50:35,000 --> 00:50:39,000
I'm Skylar.

310
00:50:39,000 --> 00:50:44,000
Are fabulous.

311
00:50:44,000 --> 00:50:50,000
Randy Santic.

312
00:50:50,000 --> 00:51:05,000
So, we may have them come back and chat with us a little bit more about any of the topics, or you might even get some of them to come and talk about their research in particular. So thanks for hanging out with us, and we'll probably see you.

313
00:51:05,000 --> 00:51:27,000
And on that, good evening to all. We'll have, you know, the normal emails and everything up there for you all.

314
00:51:27,000 --> 00:51:42,000
Bye to all. Oh, and we, we apologize, Jenna could not make it this year, she couldn't make it to AAFS. She had a personal commitment this evening, and our thoughts and prayers are with her as she's doing that.

315
00:51:42,000 --> 00:51:44,000
Thank you.

316
00:51:44,000 --> 00:51:52,000
I wave to myself.

