1
00:00:00,000 --> 00:00:14,320
And it used to start at 10 o'clock, but now they start at 9:30.

2
00:00:14,320 --> 00:00:18,360
Well that's good.

3
00:00:18,360 --> 00:00:25,060
There's also a Flutter Engage seminar at 9:30 today and I was going to share that with the

4
00:00:25,060 --> 00:00:30,720
group, and I was going to conclude around 9:25 today.

5
00:00:30,720 --> 00:00:35,920
That way people can make their way to their next event.

6
00:00:35,920 --> 00:00:37,960
Okay, great.

7
00:00:37,960 --> 00:00:38,960
Awesome.

8
00:00:38,960 --> 00:00:43,360
So we'll just give it about five minutes for people to dig in here.

9
00:00:43,360 --> 00:00:55,040
But I saw your GitHub repository where you've started to work on some Python, Jupyter work

10
00:00:55,040 --> 00:00:58,200
books.

11
00:00:58,200 --> 00:01:03,400
Would you be happy to talk a bit about the work you've done there?

12
00:01:03,400 --> 00:01:09,280
Yeah, actually I've added a bunch of starter notebooks now.

13
00:01:09,280 --> 00:01:11,080
Welcome Jake.

14
00:01:11,080 --> 00:01:13,520
Hi Jake.

15
00:01:13,520 --> 00:01:17,120
Hello.

16
00:01:17,120 --> 00:01:21,720
So I did figure out how to read in the inventories and the plants.

17
00:01:21,720 --> 00:01:32,920
I was able to read those in to pandas by just specifying the valid columns and not reading

18
00:01:32,920 --> 00:01:34,440
in the invalid columns.

19
00:01:34,440 --> 00:01:35,800
And so that works now.

20
00:01:35,800 --> 00:01:38,180
That was a sticking point.

21
00:01:38,180 --> 00:01:42,800
And so I put those up in the GitHub repository.

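The column-filtering approach just described, reading in only the valid columns, might look like this in pandas. The file layout and column names here are invented for illustration; they are not the actual LCB schema.

```python
import io
import pandas as pd

# Hypothetical extract from an inventories file: two well-formed columns
# plus a malformed trailing column we want to skip entirely.
raw = io.StringIO(
    "inventory_id,inventory_type,bad_column\n"
    "1,flower,###\n"
    "2,oil,###\n"
)

# usecols tells pandas to parse only the listed columns,
# sidestepping any problems in the columns we leave out.
df = pd.read_csv(raw, usecols=["inventory_id", "inventory_type"])
```

The same `usecols` argument works when pointed at the real data files on disk.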
22
00:01:42,800 --> 00:01:50,800
And then did you maybe see what the common inventory types are and the not so

23
00:01:50,800 --> 00:01:54,880
common ones by chance?

24
00:01:54,880 --> 00:02:00,360
I don't remember that right now.

25
00:02:00,360 --> 00:02:03,680
You can download that workbook and look.

26
00:02:03,680 --> 00:02:08,640
You can maybe dig into that next time.

27
00:02:08,640 --> 00:02:10,520
But just in this opening bit,

28
00:02:10,520 --> 00:02:15,200
I'm going to go ahead and share some links with you here just while we're waiting for

29
00:02:15,200 --> 00:02:16,880
some people.

30
00:02:16,880 --> 00:02:22,680
So welcome Nick.

31
00:02:22,680 --> 00:02:27,760
Hi Nick.

32
00:02:27,760 --> 00:02:34,560
Let's see here.

33
00:02:34,560 --> 00:02:46,880
So David Busby over at OpenTHC put together this SQLite file for business to consumer

34
00:02:46,880 --> 00:02:47,880
sales.

35
00:02:47,880 --> 00:02:57,800
So if you were having trouble working with the large sales file, this is essentially

36
00:02:57,800 --> 00:03:05,320
the sales file in a large SQL database if that's easier for you to work with.

37
00:03:05,320 --> 00:03:09,680
And so I just received that in the past couple of days.

38
00:03:09,680 --> 00:03:12,600
So I'm going to be taking a look at it myself.

39
00:03:12,600 --> 00:03:20,160
And perhaps next week we can do some work with SQL, taking a poke at the sales data.

40
00:03:20,160 --> 00:03:24,440
But just keep in mind, was there a question or something?

41
00:03:24,440 --> 00:03:30,120
I was wondering if we could use Dask to deal with those really large files.

42
00:03:30,120 --> 00:03:34,040
Can you tell us about Dask please?

43
00:03:34,040 --> 00:03:37,280
So, Dask.

44
00:03:37,280 --> 00:03:42,360
Dask allows you to multitask, like multi-thread things.

45
00:03:42,360 --> 00:03:48,640
And then it also reads files like pandas does, but it chunks them up.

46
00:03:48,640 --> 00:03:54,840
And if there's not enough room in memory, it stores them on disk and so it saves them

47
00:03:54,840 --> 00:03:57,320
like in pages.

48
00:03:57,320 --> 00:04:00,160
And then you can just use them.

49
00:04:00,160 --> 00:04:07,880
It integrates with pandas, scikit-learn, most of the Python ecosystem.

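Dask's out-of-core trick, splitting a file into chunks so only one slice sits in memory at a time, can be illustrated with plain pandas and `chunksize`. This sketch shows only the chunking idea, not Dask's scheduler or multi-machine features, and the price column is made up.

```python
import io
import pandas as pd

# Stand-in for a sales file too large to load at once.
raw = io.StringIO("price\n" + "\n".join(str(i) for i in range(1000)))

# Reading in chunks keeps only 100 rows in memory at a time --
# the same idea Dask applies automatically across many partitions.
total = 0.0
count = 0
for chunk in pd.read_csv(raw, chunksize=100):
    total += chunk["price"].sum()
    count += len(chunk)

mean_price = total / count
```

With Dask itself, `dask.dataframe.read_csv` handles the partitioning and runs operations across threads or machines for you.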
50
00:04:07,880 --> 00:04:10,400
And it's just spelled Dask?

51
00:04:10,400 --> 00:04:11,400
Yeah.

52
00:04:11,400 --> 00:04:15,060
Well, it's definitely worth checking out.

53
00:04:15,060 --> 00:04:19,240
So I'll put that on the agenda for next week.

54
00:04:19,240 --> 00:04:26,600
So would that be useful for the raw files or?

55
00:04:26,600 --> 00:04:29,760
I haven't used it that much.

56
00:04:29,760 --> 00:04:34,720
I know it works well with pandas and scikit-learn.

57
00:04:34,720 --> 00:04:39,840
And also if you have multiple machines, you can run Dask on each one of the machines and

58
00:04:39,840 --> 00:04:47,040
it'll use the memory and processor on each one of the machines.

59
00:04:47,040 --> 00:04:51,400
And that's, you may need a nice setup if you're working with this data.

60
00:04:51,400 --> 00:04:58,880
So like David was telling me, he's basically, he's running this locally on his server.

61
00:04:58,880 --> 00:05:07,360
So he's just got this giant SQL database just sitting there locally.

62
00:05:07,360 --> 00:05:12,120
So it'd be kind of expensive to put this in the cloud.

63
00:05:12,120 --> 00:05:24,680
But actually, I want to check out the SQL because there's a lot of things I feel like

64
00:05:24,680 --> 00:05:30,000
you could do; a lot of queries with the SQL would be great.

65
00:05:30,000 --> 00:05:31,000
Exactly.

66
00:05:31,000 --> 00:05:37,040
So we can get our hands into that for next week.

67
00:05:37,040 --> 00:05:42,040
And then I guess just go ahead and kick it off here.

68
00:05:42,040 --> 00:05:47,080
Just for the people who are new here or who just got here in the past like five minutes,

69
00:05:47,080 --> 00:05:53,320
I think we're going to go to about 9:25 today because Charles is heading to an event.

70
00:05:53,320 --> 00:05:58,600
And then I'm going to be listening in to this Flutter Engage at 9:30.

71
00:05:58,600 --> 00:06:05,980
I'm not sure if any of you use Flutter, but it's sort of a fun tool.

72
00:06:05,980 --> 00:06:10,760
So it's going to be interesting to see.

73
00:06:10,760 --> 00:06:17,920
I would like to see that get pretty popular because it's still pretty new.

74
00:06:17,920 --> 00:06:22,640
Anywho, just sharing that.

75
00:06:22,640 --> 00:06:31,520
All right.

76
00:06:31,520 --> 00:06:36,960
So I guess I can just go ahead and share my screen and just show you some of the things

77
00:06:36,960 --> 00:06:50,760
I'm doing and then we can talk about it along the way if that sounds good to you.

78
00:06:50,760 --> 00:06:58,560
Great.

79
00:06:58,560 --> 00:07:04,280
So essentially we were starting to look at the different sample types last week.

80
00:07:04,280 --> 00:07:12,800
And there's probably going to be different types of cannabinoids for the different sample

81
00:07:12,800 --> 00:07:13,800
types.

82
00:07:13,800 --> 00:07:17,960
So oils, of course, are going to have a higher distribution than flower.

83
00:07:17,960 --> 00:07:23,600
So I just thought this would be an interesting thing to look at.

84
00:07:23,600 --> 00:07:36,760
So over in Spyder, or you could use your favorite text editor and your favorite programming

85
00:07:36,760 --> 00:07:37,760
language.

86
00:07:37,760 --> 00:07:48,600
But once again, I'm just working with the same LCB data dump just because the variables

87
00:07:48,600 --> 00:07:50,000
are relatively the same.

88
00:07:50,000 --> 00:07:58,760
So I figured if I can master it for one month, then it should be fairly easy to generalize

89
00:07:58,760 --> 00:08:03,920
to the full data set.

90
00:08:03,920 --> 00:08:19,760
So just go ahead and I'm not sure if this is actually going to be in the right location.

91
00:08:19,760 --> 00:08:42,400
So never mind this.

92
00:08:42,400 --> 00:08:57,600
Okay.

93
00:08:57,600 --> 00:09:04,240
So essentially we're just going to read this data in here and we're going to start looking

94
00:09:04,240 --> 00:09:09,320
at some of these sample types that we looked at last week.

95
00:09:09,320 --> 00:09:21,640
So real quick here, I'm actually just going to go ahead and just commit everything over

96
00:09:21,640 --> 00:09:33,200
to GitHub just so that everybody can follow along.

97
00:09:33,200 --> 00:09:53,040
And then just so everybody can follow along with all the material.

98
00:09:53,040 --> 00:10:10,160
You know, I'm trying to keep most things here in the cannabis data science repository.

99
00:10:10,160 --> 00:10:16,320
So that'll need to get cleaned up a little bit, but so far just sort of archiving things

100
00:10:16,320 --> 00:10:20,400
there.

101
00:10:20,400 --> 00:10:31,240
And then, okay, so we've got our data read in.

102
00:10:31,240 --> 00:10:40,000
And so the tool that I'll be using today is just Seaborn, which is sort of an extension

103
00:10:40,000 --> 00:10:41,720
of Matplotlib.

104
00:10:41,720 --> 00:10:53,320
So Seaborn requires Matplotlib, so you can see it as more of an extension or an enhancement.

105
00:10:53,320 --> 00:10:57,040
We'll go ahead and grab those packages.

106
00:10:57,040 --> 00:11:10,120
And essentially, you know, let's find out what columns we even have here.

107
00:11:10,120 --> 00:11:21,360
So it's sort of a handful, but essentially, we want to look at some of these cannabinoids

108
00:11:21,360 --> 00:11:25,780
that may be of interest to people.

109
00:11:25,780 --> 00:11:33,120
So we could maybe create a total cannabinoid calculation, but at the moment, let's just

110
00:11:33,120 --> 00:11:44,440
pick a, you know, this first cannabinoid here, THCA.

111
00:11:44,440 --> 00:11:49,120
And so, you know, in the lab, you could call this an analyte.

112
00:11:49,120 --> 00:11:55,480
So you could say your analyte is THCA.

113
00:11:55,480 --> 00:12:06,160
And so, you know, now let's just basically look at the distribution of THCA in the lab

114
00:12:06,160 --> 00:12:07,160
data.

115
00:12:07,160 --> 00:12:10,880
And let's see what happens.

116
00:12:10,880 --> 00:12:20,360
Okay, so that's what happened.

117
00:12:20,360 --> 00:12:35,640
I think we want...

118
00:12:35,640 --> 00:13:03,040
So we'll just look at the kernel density of this analyte.

119
00:13:03,040 --> 00:13:24,440
So this may take a minute since there's a lot of data.

120
00:13:24,440 --> 00:13:47,080
And essentially, you know, you can add the hue, which is the actual variable type.

121
00:13:47,080 --> 00:13:58,000
So the actual, you know, inventory type, remember, is the intermediate type.

122
00:13:58,000 --> 00:14:10,040
So we'll eventually check out the hue with intermediate type.

123
00:14:10,040 --> 00:14:22,760
So we may have to get just a subset of our data here.

124
00:14:22,760 --> 00:14:45,400
I wonder if we can just do a sample of it.

125
00:14:45,400 --> 00:14:58,120
Okay, so now we've just got a random sample of the data.

126
00:14:58,120 --> 00:15:17,280
So now maybe it'll be a bit more approachable.

127
00:15:17,280 --> 00:15:20,320
So we've at least got a chart now.

128
00:15:20,320 --> 00:15:27,440
And so this is just an estimated distribution of THC.

129
00:15:27,440 --> 00:15:37,760
So not the most informative chart ever.

130
00:15:37,760 --> 00:15:41,640
It looks like there's a grouping lower down towards zero.

131
00:15:41,640 --> 00:15:48,600
You know, perhaps that's edibles or things of that sort, or miscoding, or it's hard to

132
00:15:48,600 --> 00:15:50,120
say at this point.

133
00:15:50,120 --> 00:15:52,800
And then there's a grouping up here.

134
00:15:52,800 --> 00:15:57,480
And you can see there's a slight bump up here, which must be the oils.

135
00:15:57,480 --> 00:16:06,080
But you really need to separate this by the actual intermediate type for it to be informative.

136
00:16:06,080 --> 00:16:16,080
So let's look at how we can adapt this plot here.

137
00:16:16,080 --> 00:16:45,720
I'd really like to use this plot, but it looks like my Seaborn version may be outdated.

138
00:16:45,720 --> 00:17:05,760
You could just add hue.

139
00:17:05,760 --> 00:17:09,400
So it looks like my Seaborn version may be outdated.

140
00:17:09,400 --> 00:17:15,280
So we may have to just come back to these at another point.

141
00:17:15,280 --> 00:17:21,800
However, we can maybe now change gears into something else interesting that I was going

142
00:17:21,800 --> 00:17:23,840
to show you today.

143
00:17:23,840 --> 00:17:34,120
So we've been talking about mocking up the LEAF API.

144
00:17:34,120 --> 00:17:42,920
And I thought, OK, the first step is we're going to need a database just to store some

145
00:17:42,920 --> 00:17:43,920
things.

146
00:17:43,920 --> 00:17:50,360
So I just thought, OK, we'll just go ahead and use the database that essentially I'm

147
00:17:50,360 --> 00:17:58,760
already using for some of this open source material that I'm putting out through Cannlytics.

148
00:17:58,760 --> 00:18:08,440
So some of the tools I'll show you today that essentially I'm using to mock up the API are

149
00:18:08,440 --> 00:18:23,280
essentially the Cannlytics engine and then the Cannlytics API.

150
00:18:23,280 --> 00:18:37,880
And I'll go ahead and commit the latest work here, which has these two mock endpoints.

151
00:18:37,880 --> 00:18:44,120
So I've gone ahead and mocked the lab results endpoint and the MMEs endpoint.

152
00:18:44,120 --> 00:18:50,720
And then we can go ahead and I'll gradually mock all of the LEAF API endpoints.

153
00:18:50,720 --> 00:18:57,760
And then you can use them as you please for testing and whatnot.

154
00:18:57,760 --> 00:19:02,320
But they'll be fairly limited with a limited amount of data.

155
00:19:02,320 --> 00:19:20,760
So

156
00:19:20,760 --> 00:19:26,840
and then I'll go ahead and share these links with you here in the chat.

157
00:19:26,840 --> 00:19:31,120
The API and the engine.

158
00:19:31,120 --> 00:19:38,160
And so essentially, I'll just go ahead and just kind of show you these.

159
00:19:38,160 --> 00:19:39,880
Nothing set in stone here.

160
00:19:39,880 --> 00:19:47,920
These are just sort of general tools that I was putting together just to help with cannabis

161
00:19:47,920 --> 00:19:49,840
analytics.

162
00:19:49,840 --> 00:19:53,380
So we're doing cannabis data science here.

163
00:19:53,380 --> 00:20:05,480
So these are just sort of open-ended packages that you can use and contribute to, to help your

164
00:20:05,480 --> 00:20:07,560
workflow.

165
00:20:07,560 --> 00:20:21,400
So the engine is just a Python package, which you can just install with your command prompt.

166
00:20:21,400 --> 00:20:31,800
So you can just, you know, pip install cannlytics to get the latest version.

167
00:20:31,800 --> 00:20:35,920
And we're at the moment.

168
00:20:35,920 --> 00:20:40,600
So you've now got Cannlytics installed.

169
00:20:40,600 --> 00:20:51,160
And then let me actually restart Spyder and I'll show you a bit more.

170
00:20:51,160 --> 00:21:05,120
But essentially, at the moment, it just leverages Firebase for a database.

171
00:21:05,120 --> 00:21:11,880
So if you want to learn about Firebase, I'll put that in the chat.

172
00:21:11,880 --> 00:21:22,800
So Firebase is sort of a fork of MongoDB and I also like MongoDB.

173
00:21:22,800 --> 00:21:24,280
So that's worth checking out.

174
00:21:24,280 --> 00:21:29,880
And so these are what you'd call NoSQL databases.

175
00:21:29,880 --> 00:21:37,920
So it doesn't have quite the same connective structure as the SQL database.

176
00:21:37,920 --> 00:21:44,440
And just to kind of just give you a look at what the data looks like here.

177
00:21:44,440 --> 00:22:02,080
So here is just some collections that I've made with some of the MME data.

178
00:22:02,080 --> 00:22:06,600
So you have essentially a directory.

179
00:22:06,600 --> 00:22:13,320
So you can think about it as a directory of collections and documents.

180
00:22:13,320 --> 00:22:18,920
And then all documents are essentially a JSON file.

181
00:22:18,920 --> 00:22:23,160
So you can think about these as key value pairs.

182
00:22:23,160 --> 00:22:29,680
So you essentially just have a giant JSON database.

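The directory-of-collections-and-documents layout just described can be pictured with plain Python dicts, since every document is essentially JSON key-value pairs. The collection name, document ID, and fields here are invented for illustration.

```python
import json

# Sketch of the Firestore layout: a collection is like a directory
# of documents, and each document is a set of key-value pairs.
database = {
    "labs": {                      # collection
        "lab-1": {                 # document ID
            "name": "Example Lab",
            "license": "LAB-0001",
        },
    },
}

# Fetching a document by path is a direct lookup -- fast when you
# know exactly what you're looking for.
doc = database["labs"]["lab-1"]
serialized = json.dumps(doc)       # every document round-trips as JSON
```

This directness is the speed advantage mentioned above; the trade-off is that cross-collection queries and aggregations take more work than in SQL.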
183
00:22:29,680 --> 00:22:32,640
So it has its pros and its cons.

184
00:22:32,640 --> 00:22:37,400
The cons right off the bat are going to be it's much harder.

185
00:22:37,400 --> 00:22:39,480
Well, it depends on your perspective.

186
00:22:39,480 --> 00:22:47,400
But it can be harder to do queries and aggregations than it is in an SQL database.

187
00:22:47,400 --> 00:22:53,160
And then it has some advantages in that you're just going JSON to JSON.

188
00:22:53,160 --> 00:22:59,960
And sometimes the loose structure can be beneficial.

189
00:22:59,960 --> 00:23:05,560
And it can be fast if you know what you're looking for.

190
00:23:05,560 --> 00:23:08,480
So once again, nothing set in stone.

191
00:23:08,480 --> 00:23:12,680
But I essentially just tossed the MME data.

192
00:23:12,680 --> 00:23:26,240
And I tossed about 1,000 lab results here just so that way you can just pull these to

193
00:23:26,240 --> 00:23:28,720
start doing some practice.

194
00:23:28,720 --> 00:23:56,080
So how did I get those there?

195
00:23:56,080 --> 00:24:01,560
I used the Cannlytics engine.

196
00:24:01,560 --> 00:24:03,480
I think I put that in the chat.

197
00:24:03,480 --> 00:24:07,720
Yeah, so that's the Cannlytics engine.

198
00:24:07,720 --> 00:24:14,640
And so this is just essentially a wrapper around Firebase.

199
00:24:14,640 --> 00:24:18,320
So Firebase already has a Python SDK.

200
00:24:18,320 --> 00:24:27,880
But this is an even more expedient way to interact with the data.

201
00:24:27,880 --> 00:24:33,840
And so just put some useful tools here.

202
00:24:33,840 --> 00:24:41,480
So you can initialize Firebase.

203
00:24:41,480 --> 00:24:48,520
Use a bunch of wrappers to get documents, get collections.

204
00:24:48,520 --> 00:24:50,720
You can import data.

205
00:24:50,720 --> 00:25:01,000
So this handles importing data into Firebase from CSVs and Excel.

206
00:25:01,000 --> 00:25:03,920
So a fairly easy way to get your data in.

207
00:25:03,920 --> 00:25:10,960
And then, so that's essentially one thing that Firebase was lacking: easy imports and

208
00:25:10,960 --> 00:25:12,640
easy exports.

209
00:25:12,640 --> 00:25:15,080
So there you go.

210
00:25:15,080 --> 00:25:24,760
Now you've got an easy way to import data and an easy way to export data from Firebase.

211
00:25:24,760 --> 00:25:41,720
And if you look in the tests, you can see how you can go about spinning up your Firebase

212
00:25:41,720 --> 00:25:43,880
instance.

213
00:25:43,880 --> 00:26:02,280
And then I actually use that code right there to populate these two collections.

214
00:26:02,280 --> 00:26:04,360
So it's all open source.

215
00:26:04,360 --> 00:26:11,560
So you can use it yourself and populate your own Firebase database essentially for free

216
00:26:11,560 --> 00:26:16,120
if that's something you're looking to do.

217
00:26:16,120 --> 00:26:19,160
You do have to pay for Firebase.

218
00:26:19,160 --> 00:26:22,240
But at the moment, the cost is really low.

219
00:26:22,240 --> 00:26:28,080
So as long as you keep your usage low, then it's essentially free.

220
00:26:28,080 --> 00:26:31,080
So it's a nifty tool.

221
00:26:31,080 --> 00:26:36,080
But I'm not set in stone to any one particular tool.

222
00:26:36,080 --> 00:26:40,280
So use your favorite tool there.

223
00:26:40,280 --> 00:26:46,720
And so then we've got our data there with the Cannlytics engine.

224
00:26:46,720 --> 00:26:54,680
And so now we can retrieve it with the Cannlytics API.

225
00:26:54,680 --> 00:27:03,360
And so once again, you can find the code here on GitHub.

226
00:27:03,360 --> 00:27:12,520
But this is essentially going to be the interface to all the actions that you can do in Cannlytics.

227
00:27:12,520 --> 00:27:24,720
So if you want to interact with any of this data, then you can use the Cannlytics API,

228
00:27:24,720 --> 00:27:30,400
which comes batteries included.

229
00:27:30,400 --> 00:27:32,940
It includes the Cannlytics engine.

230
00:27:32,940 --> 00:27:39,480
So this tool is bundled up inside the API.

231
00:27:39,480 --> 00:27:42,280
So once again, nothing is set in stone.

232
00:27:42,280 --> 00:27:50,220
But it's just a classic example of dog fooding, where you eat your own dog food.

233
00:27:50,220 --> 00:27:53,760
So you use your own tools.

234
00:27:53,760 --> 00:28:02,720
So that's the idea here is just building a set of tooling to perform cannabis analysis.

235
00:28:02,720 --> 00:28:07,560
That way, we can control everything from the ground up.

236
00:28:07,560 --> 00:28:17,880
We can control database interactions so you can get pretty low level.

237
00:28:17,880 --> 00:28:22,680
And then there's actually a lot more here than just database.

238
00:28:22,680 --> 00:28:35,640
So for example, you can work with storage, so you can put files into storage.

239
00:28:35,640 --> 00:28:42,760
You can download files, rename files, delete files.

240
00:28:42,760 --> 00:28:48,480
There's also authentication, so you can create users.

241
00:28:48,480 --> 00:28:52,000
You can give the users custom claims.

242
00:28:52,000 --> 00:28:55,400
You can give them custom tokens.

243
00:28:55,400 --> 00:29:00,960
And so the token would be for if you want to log in on your front end.

244
00:29:00,960 --> 00:29:06,760
So this is essentially a backend system that can talk with your front end.

245
00:29:06,760 --> 00:29:15,360
And then there's utility functions to get your users, update a user, then delete a user

246
00:29:15,360 --> 00:29:17,720
if need be.

247
00:29:17,720 --> 00:29:25,040
So it's just a little backend service wrapper around Firebase.

248
00:29:25,040 --> 00:29:36,520
So if you need those services, then this may be an easy interface for you.

249
00:29:36,520 --> 00:29:44,020
So let's go ahead and test some of these things out.

250
00:29:44,020 --> 00:29:49,080
So you can essentially clone the repository.

251
00:29:49,080 --> 00:29:53,920
And then you'll have this repo here.

252
00:29:53,920 --> 00:30:05,960
Looks like everything went through.

253
00:30:05,960 --> 00:30:13,120
And then just to kind of walk you through, so first things first, you'd clone the repository.

254
00:30:13,120 --> 00:30:20,240
And then you can actually run the server locally.

255
00:30:20,240 --> 00:30:26,440
So this is sort of to give you an example of how you could go about doing that.

256
00:30:26,440 --> 00:30:36,600
So you'd go to the folder where you downloaded, where you cloned the API.

257
00:30:36,600 --> 00:30:41,720
You can open the command console.

258
00:30:41,720 --> 00:31:02,120
And then you can just run the server.

259
00:31:02,120 --> 00:31:06,940
It looks like one of my endpoints is not 100% correct here.

260
00:31:06,940 --> 00:31:08,280
So we can just jump in there.

261
00:31:08,280 --> 00:31:12,440
And so just to kind of show you a little bit about the API and tell you a little bit about

262
00:31:12,440 --> 00:31:13,440
it.

263
00:31:13,440 --> 00:31:20,520
So you've learned now that, OK, the database is Firebase Firestore.

264
00:31:20,520 --> 00:31:26,200
And so the API is Python Django.

265
00:31:26,200 --> 00:31:36,760
And so the typical structure here of a Django file, so in the root directory, you have your

266
00:31:36,760 --> 00:31:39,480
requirements.

267
00:31:39,480 --> 00:31:44,040
You have your read me.

268
00:31:44,040 --> 00:31:50,640
I'm just using Node just to kind of help run some helper functions.

269
00:31:50,640 --> 00:31:55,760
There's no actual node dependencies.

270
00:31:55,760 --> 00:32:05,020
And then Docker, just to get it deployed. It's actually running in Google

271
00:32:05,020 --> 00:32:07,000
Cloud App Engine.

272
00:32:07,000 --> 00:32:14,120
But you can run it anywhere that you desire.

273
00:32:14,120 --> 00:32:22,800
And so just to give you a quick structure, so all the code is in the Cannlytics API.

274
00:32:22,800 --> 00:32:30,600
And so just to give you an idea of what a Django project looks like, you'll have a settings

275
00:32:30,600 --> 00:32:33,720
file.

276
00:32:33,720 --> 00:32:37,160
And this is typically fairly long.

277
00:32:37,160 --> 00:32:39,600
You kind of want to try to keep it short.

278
00:32:39,600 --> 00:32:47,200
And this is just where you define all your key parameters towards how everything's going

279
00:32:47,200 --> 00:32:51,960
to be functioning.

280
00:32:51,960 --> 00:32:59,020
So this is where you define any third party apps you're using.

281
00:32:59,020 --> 00:33:06,880
And we're going to be using the rest framework, since we're going to be building an API here.

282
00:33:06,880 --> 00:33:12,020
And then everything else is relatively just boilerplate.

283
00:33:12,020 --> 00:33:19,000
So the rest of this is essentially just boilerplate to make everything else run.

284
00:33:19,000 --> 00:33:27,120
And you'll need to set up essentially your example credentials.

285
00:33:27,120 --> 00:33:43,440
So if you walk through the read me, I may have to flesh that out a bit about how to

286
00:33:43,440 --> 00:33:45,920
get set up with credentials.

287
00:33:45,920 --> 00:33:56,160
But essentially, you'll need to get these credentials set to really use this in development.

288
00:33:56,160 --> 00:34:01,840
So you've got your settings.

289
00:34:01,840 --> 00:34:09,040
And then where all the beauty happens, well, so you've got your URLs.

290
00:34:09,040 --> 00:34:13,220
And so these are going to be your actual endpoints.

291
00:34:13,220 --> 00:34:19,200
So for example, your endpoint here.

292
00:34:19,200 --> 00:34:23,040
And so it didn't like that these began with a slash.

293
00:34:23,040 --> 00:34:30,520
So your endpoint here is just going to be test/leaf/mmes.

294
00:34:30,520 --> 00:34:36,960
And so I'm pointing this to essentially a Python function.

295
00:34:36,960 --> 00:34:43,280
So I just put these in this file over here.

296
00:34:43,280 --> 00:34:54,720
And so then this can give you some ideas about what Django endpoints may look like.

297
00:34:54,720 --> 00:35:06,240
So for a REST API, it's just a Python function that takes the request as an argument.

298
00:35:06,240 --> 00:35:10,600
You check the method of the request.

299
00:35:10,600 --> 00:35:18,720
So say you wanted to add functionality here for labs to actually create results.

300
00:35:18,720 --> 00:35:24,560
Well, then you would add a post.

301
00:35:24,560 --> 00:35:49,000
And then you would say, you know, right?

302
00:35:49,000 --> 00:36:00,320
And then you may even just, you know, you could return a not implemented error.

303
00:36:00,320 --> 00:36:07,440
So eventually, we'll want to go ahead and build out that functionality.

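The view pattern just described, a Python function that takes the request, checks its method, and returns a response or a not-implemented error, can be sketched without Django so it runs standalone. The endpoint name, payloads, and the simple dict responses are invented; in actual Django you would return `JsonResponse` objects instead.

```python
from types import SimpleNamespace


def lab_results(request):
    """Mock endpoint: GET returns sample data, POST is not implemented yet."""
    if request.method == "GET":
        # Hypothetical payload standing in for real lab results.
        return {"status": 200, "data": [{"sample_id": "test-1"}]}
    if request.method == "POST":
        # Labs creating results isn't built out yet: 501 Not Implemented.
        return {"status": 501, "error": "Creating results is not implemented."}
    return {"status": 405, "error": "Method not allowed."}


# SimpleNamespace mimics the request object's .method attribute.
get_response = lab_results(SimpleNamespace(method="GET"))
post_response = lab_results(SimpleNamespace(method="POST"))
```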
304
00:36:07,440 --> 00:36:13,400
Since we're essentially, remember, let's not lose sight of the original goal.

305
00:36:13,400 --> 00:36:29,920
Our original goal is we're essentially just creating a mock LEAF API

306
00:36:29,920 --> 00:36:42,800
so we can just basically mock some of these calls ourselves.

307
00:36:42,800 --> 00:36:53,560
So let's go ahead and see, you know, see this in action.

308
00:36:53,560 --> 00:36:57,240
So you know, we can come over here.

309
00:36:57,240 --> 00:37:07,240
You know, we'll just create a blank file here.

310
00:37:07,240 --> 00:37:17,440
Great, and so just since we're working in Python, you know, we can use the requests package

311
00:37:17,440 --> 00:37:25,160
and, you know, the base URL.

312
00:37:25,160 --> 00:37:32,280
So the production base, if you want to test out what's in production, it's going to be

313
00:37:32,280 --> 00:37:43,320
so it's going to be api.cannlytics.com.

314
00:37:43,320 --> 00:37:52,440
But you know, for right now, we're just going to be using the development base.

315
00:37:52,440 --> 00:37:57,360
So that way, you know, I can make rapid changes here, just

316
00:37:57,360 --> 00:38:02,960
to kind of show what we can do in development.

317
00:38:02,960 --> 00:38:08,320
And that way you can learn how to do development yourself if you're interested.

318
00:38:08,320 --> 00:38:11,360
So we've got it set up.

319
00:38:11,360 --> 00:38:16,460
Remember, we're going to be hitting these endpoints.

320
00:38:16,460 --> 00:38:33,440
So let's try to get the MMEs.

321
00:38:33,440 --> 00:38:46,600
So we just, you know, we're just going to make a get request, so requests.

322
00:38:46,600 --> 00:39:05,320
Took a second, but we got a 200 status code.

323
00:39:05,320 --> 00:39:13,600
So I guess let's just do a data dump and see what we got.

324
00:39:13,600 --> 00:39:14,760
No promises here.

325
00:39:14,760 --> 00:39:17,360
This is a live test.

326
00:39:17,360 --> 00:39:21,840
Oh, sure enough, look at that.

327
00:39:21,840 --> 00:39:25,800
So that's a little much to work with.

328
00:39:25,800 --> 00:39:29,840
So let's just look at the first observation.

329
00:39:29,840 --> 00:39:33,240
Oh, so it's not a list.

330
00:39:33,240 --> 00:39:36,920
So let's see what we're even working with here.

331
00:39:36,920 --> 00:39:40,440
So we've got a dictionary.

332
00:39:40,440 --> 00:39:50,520
Ah, so it looks like at the moment the Cannlytics API, let's actually go, and so this is what's

333
00:39:50,520 --> 00:39:56,560
nice from building it from the ground up is you can actually now come over here and we

334
00:39:56,560 --> 00:40:03,720
can actually look at the endpoint, so we can see what the endpoint's actually doing.

335
00:40:03,720 --> 00:40:08,200
So that way it's not a black box, right?

336
00:40:08,200 --> 00:40:13,200
So now we control the data import, right?

337
00:40:13,200 --> 00:40:22,240
So we've now imported data here into Firebase.

338
00:40:22,240 --> 00:40:25,800
We've got the records here, right?

339
00:40:25,800 --> 00:40:34,320
So here we have somebody at 555 Brown Street, LCB Grow.

340
00:40:34,320 --> 00:40:35,960
Okay.

341
00:40:35,960 --> 00:40:44,360
So we want to basically be able to query that with our API.

342
00:40:44,360 --> 00:40:48,640
So let's start tinkering.

343
00:40:48,640 --> 00:40:54,680
So we're getting our MME data.

344
00:40:54,680 --> 00:40:59,560
So if we make our request, it'll check to see if there's a limit.

345
00:40:59,560 --> 00:41:02,180
Otherwise there won't be one.

346
00:41:02,180 --> 00:41:13,080
So let's try, let's test this limit out.

347
00:41:13,080 --> 00:41:24,760
So the way you could do that is you're just going to do Q or question mark, limit.

348
00:41:24,760 --> 00:41:38,280
Let's just try doing a limit of two.

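The `?limit=2` query being typed here can also be built programmatically. The development base URL below is Django's usual local default, and the endpoint path is inferred from the "test, leaf, MMEs" route mentioned earlier; both are assumptions for illustration.

```python
from urllib.parse import urlencode

# Typical local Django development server address (an assumption here).
base = "http://127.0.0.1:8000"
endpoint = "/test/leaf/mmes"

# urlencode turns a dict of parameters into a query string,
# so ?limit=2 asks the endpoint for just two documents.
params = {"limit": 2}
url = f"{base}{endpoint}?{urlencode(params)}"
```

With the requests package, the equivalent is `requests.get(base + endpoint, params={"limit": 2})`, which builds the same query string for you.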
349
00:41:38,280 --> 00:41:56,560
Yeah, that's right.

350
00:41:56,560 --> 00:42:05,560
So now let's, it looks like we may have gotten an error.

351
00:42:05,560 --> 00:42:09,600
So we got a 500 error code.

352
00:42:09,600 --> 00:42:19,920
And what's nice is, you know, we're in development here, so we can actually see the debug error.

353
00:42:19,920 --> 00:42:23,400
So you know, so this is how you would do development.

354
00:42:23,400 --> 00:42:31,960
So you would see, okay, there was a GET request to the LEAF MMEs.

355
00:42:31,960 --> 00:42:40,000
Ah, and look, here's the error.

356
00:42:40,000 --> 00:42:45,920
"2" has class string, but we expected an integer.

357
00:42:45,920 --> 00:42:47,000
Okay.

358
00:42:47,000 --> 00:42:52,560
So we can go and tinker on this MMEs endpoint.

359
00:42:52,560 --> 00:42:57,720
So we now know this limit has to be an integer.

360
00:42:57,720 --> 00:43:05,400
So just a hot fix here, we could do better logic later on, but you know, you could just

361
00:43:05,400 --> 00:43:10,000
do something silly like that, you know.

362
00:43:10,000 --> 00:43:17,280
So if there's a limit, you know, make sure the limit's an integer.

363
00:43:17,280 --> 00:43:23,600
You know, and so go ahead and save that.

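(A minimal sketch of that hot fix: if a limit was passed, make sure it's an integer before it reaches the query. The parameter handling below is an illustration, not the repository's exact code.)

```python
# Coerce the limit query parameter to an int before querying. Query-string
# values arrive as text, e.g. "2", which is what caused the 500 error.
def parse_limit(params, default=None):
    """Return the limit parameter as an int, or a default if absent/invalid."""
    limit = params.get("limit")
    if limit is None:
        return default
    try:
        return int(limit)
    except (TypeError, ValueError):
        return default

parse_limit({"limit": "2"})  # → 2
```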
364
00:43:23,600 --> 00:43:30,560
This should, I think this should just refresh.

365
00:43:30,560 --> 00:43:38,160
I'm not sure if I've got the live server rerunning.

366
00:43:38,160 --> 00:43:57,720
Interesting…

367
00:43:57,720 --> 00:43:58,560
So.

368
00:43:58,560 --> 00:43:59,380
60 seconds for you.

369
00:44:28,560 --> 00:44:31,040
little bit.

370
00:44:32,880 --> 00:44:38,040
And actually we may even just want to change this just to return the docs just

371
00:44:38,040 --> 00:44:45,400
outright. I think that's just a bit better. And let's actually just go

372
00:44:45,400 --> 00:44:54,840
ahead and re-spin this server. When you're actually doing

373
00:44:54,840 --> 00:45:05,840
development, there is a live-reload package which makes things a lot better.

374
00:45:05,960 --> 00:45:08,960
Okay.

375
00:45:11,160 --> 00:45:19,120
Okay let's try this again. Okay so now we've got a fast response. We've got our

376
00:45:19,120 --> 00:45:30,000
200 status code and we've got two documents here. So now this

377
00:45:30,000 --> 00:45:37,320
little fix here to the limit seemed to do the trick. And if we

378
00:45:37,320 --> 00:45:45,840
look at our data we've got 555 Brown Street, the LCB grow.

379
00:45:45,840 --> 00:45:54,800
And so the next step is really to set this up to actually handle queries. And

380
00:45:54,800 --> 00:46:09,840
so the nifty thing is the get collection routine is set up to handle

381
00:46:09,840 --> 00:46:18,680
filters. So we can basically pass filters of the sort.

382
00:46:18,680 --> 00:46:38,360
A key, so say we wanted the key to be name, then an operation, and a value. Right?

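(Those filters are (key, operation, value) triples, the shape the get collection routine is described as accepting. The field names, values, and in-memory evaluation below are illustrative assumptions.)

```python
# Apply simple (key, operation, value) filters to a list of record dicts,
# mimicking what the server-side get_collection routine would do.
def apply_filters(docs, filters):
    """Filter record dicts by a list of (key, op, value) triples."""
    ops = {
        "==": lambda a, b: a == b,
        ">=": lambda a, b: a >= b,
        "<=": lambda a, b: a <= b,
    }
    for key, op, value in filters:
        docs = [d for d in docs if key in d and ops[op](d[key], value)]
    return docs

licensees = [  # hypothetical records for illustration
    {"name": "LCB Grow", "city": "Olympia"},
    {"name": "Another Grow", "city": "Seattle"},
]
apply_filters(licensees, [("name", "==", "LCB Grow")])
```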
383
00:46:39,480 --> 00:46:48,600
So on my to-do list is to actually set up these endpoints to be able to handle

384
00:46:48,600 --> 00:46:55,320
queries. Because we could actually set this up to really

385
00:46:55,320 --> 00:47:01,040
handle any sort of query. Because it would be cool to set it up so that way

386
00:47:01,040 --> 00:47:08,480
you could query by phone number, address. You could even do something that matches

387
00:47:08,480 --> 00:47:11,880
close to the address.

388
00:47:11,880 --> 00:47:18,000
You could of course do global ID, or whether anybody's suspended.

389
00:47:18,000 --> 00:47:26,400
You could do type, city. So all of these are fields that we could set up to

390
00:47:26,400 --> 00:47:33,440
query over here. Okay? So now let's go ahead and hit this lab

391
00:47:33,440 --> 00:47:41,400
results endpoint to kind of see if it works as well. So we'll probably have to

392
00:47:41,400 --> 00:47:47,640
do a similar thing where we fix the limit.

393
00:47:49,960 --> 00:47:58,840
And this is also not set up for queries. But okay, so we went ahead and got a

394
00:47:58,840 --> 00:48:02,000
live reload there.

395
00:48:02,000 --> 00:48:23,360
And so now let's check it out. We're just going to hit the lab results endpoint

396
00:48:23,360 --> 00:48:41,720
and see what happens. Okay it looks like we may have gotten an error. Okay so we

397
00:48:41,720 --> 00:48:48,600
got a 404. So that means this endpoint was not found. So we probably want to

398
00:48:48,600 --> 00:49:00,920
also set up our URLs to better handle 404s.

399
00:49:00,920 --> 00:49:06,120
Because the idea of any API is you want

400
00:49:06,120 --> 00:49:11,560
to handle all endpoints elegantly. So if somebody hits your base

401
00:49:11,560 --> 00:49:17,960
endpoint, you want to handle it gracefully. So here's just a look at some of the other

402
00:49:17,960 --> 00:49:25,520
endpoints of the Cannlytics API. So if somebody hits the base

403
00:49:25,520 --> 00:49:34,320
endpoint, let's just say you hit the base here, you still get a

404
00:49:34,320 --> 00:49:42,560
status code, and then you actually get a message. So it actually

405
00:49:42,560 --> 00:49:48,360
says, okay, welcome to the Cannlytics API. We

406
00:49:48,360 --> 00:49:52,000
probably want to touch this up; instead of data we may actually want to

407
00:49:52,000 --> 00:49:59,080
change that to message. But you're welcome to come in here and

408
00:49:59,080 --> 00:50:05,160
tinker on this, because this is just a completely open-source, sort of fun

409
00:50:05,160 --> 00:50:11,160
API that is just set up to tinker around with cannabis

410
00:50:11,160 --> 00:50:17,880
data. And like I said, I'm not set in stone on any technology, so if

411
00:50:17,880 --> 00:50:23,120
anybody has recommendations, throw them out;

412
00:50:23,120 --> 00:50:30,200
I'm not stuck in stone on anything. And so just to give you a look here, we

413
00:50:30,200 --> 00:50:39,840
just hit the base endpoint, we just got this message, and so, not to get too off

414
00:50:39,840 --> 00:50:46,120
track here, but the idea is to handle all endpoints elegantly. Okay, so

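(Handling every endpoint elegantly, sketched in code: known routes return data, the base route returns a welcome message, and anything else gets a helpful 404 body instead of a bare error page. The route names and messages here are assumptions.)

```python
# Route requests so every endpoint, known or not, returns a status code
# and a friendly body. Routes and response shapes are hypothetical.
def handle(endpoint):
    """Return (status_code, body) for a requested endpoint."""
    routes = {"mmes", "lab_results"}  # assumed known endpoints
    if endpoint in ("", "/"):
        return 200, {"success": True, "message": "Welcome to the Cannlytics API."}
    if endpoint.strip("/") in routes:
        return 200, {"success": True, "data": []}
    return 404, {
        "success": False,
        "message": f"Endpoint '{endpoint}' not found.",
    }
```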
415
00:50:46,120 --> 00:50:52,480
the reason this one failed is we actually want to hit the

416
00:50:52,480 --> 00:51:01,040
test endpoint, the test Leaf endpoint. So let's go ahead and hit that endpoint and

417
00:51:01,040 --> 00:51:03,760
see what happens.

418
00:51:03,760 --> 00:51:10,680
And I'll publish all of this afterwards. Actually, I'll do that right now,

419
00:51:10,680 --> 00:51:19,880
that way you can actually hit the live API. Okay, so now we've got

420
00:51:19,880 --> 00:51:34,880
our response, we've got our status code, and hopefully we have two lab results.

421
00:51:41,360 --> 00:51:48,840
Okay, so we're still returning... let's actually fix that real quick, because that

422
00:51:48,840 --> 00:51:54,160
is kind of annoying me, so it may annoy other people. I think it's just

423
00:51:54,160 --> 00:51:58,240
better if this just returns

424
00:52:02,400 --> 00:52:10,080
I think it's just better if it just returns the documents as a list.

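(A before/after sketch of that response-shape tweak: instead of wrapping the documents in an object, just return them as a list clients can iterate over directly. The wrapped shape shown here is an assumption.)

```python
# Before: documents nested inside a response object (assumed shape).
def wrapped_response(docs):
    return {"success": True, "data": {"documents": docs}}

# After: just the documents, as a plain list.
def list_response(docs):
    return list(docs)

list_response([{"id": 1}, {"id": 2}])  # → [{'id': 1}, {'id': 2}]
```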
425
00:52:10,080 --> 00:52:20,440
Got our live reload, and see, this is what's nice about cooking up

426
00:52:20,440 --> 00:52:27,200
your own little API here, because now, with just a couple lines of code,

427
00:52:27,200 --> 00:52:32,160
we've now got this as a list, and if you prefer it as something else,

428
00:52:32,160 --> 00:52:37,440
then get in there and tinker and change it. So now we've

429
00:52:37,440 --> 00:52:50,760
got our list here, we've got two, and we've got these lab results, and so

430
00:52:50,760 --> 00:53:01,400
these are the exact same lab results,

431
00:53:01,400 --> 00:53:13,000
the same lab results from that LCB data dump. So

432
00:53:13,000 --> 00:53:16,840
here's the link to the data dump.

433
00:53:22,560 --> 00:53:27,200
Okay, but long story short, we've come a long way now. We've basically gone

434
00:53:27,200 --> 00:53:35,640
from this data dump here; we've gotten the data into Firebase. We did that with

435
00:53:35,640 --> 00:53:40,960
the help of the Cannlytics engine, and so you're welcome to go and explore over

436
00:53:40,960 --> 00:53:49,200
here and check out the Firebase module. We've used

437
00:53:49,200 --> 00:53:55,800
the import data function, and then if you want examples of that, go check out

438
00:53:55,800 --> 00:54:02,680
the tests, because the tests can get you started: that's how

439
00:54:02,680 --> 00:54:09,040
you initialize Firebase, create some documents, and then import your data.

440
00:54:09,040 --> 00:54:17,000
And now that we've got our data in Firebase, we need a way to interact with

441
00:54:17,000 --> 00:54:29,600
it, so we've got the Cannlytics API, and that's as simple to use as just

442
00:54:29,600 --> 00:54:35,680
making some GET requests to these various endpoints, and then you've

443
00:54:35,680 --> 00:54:44,120
got this full suite of data. And just to polish it off, I'll go

444
00:54:44,120 --> 00:54:53,360
ahead and round it out by showing you: so you've done the install,

445
00:54:53,360 --> 00:54:55,400
it

446
00:54:58,200 --> 00:55:03,800
All right, so you've got the Cannlytics API. You've done the installation, we've

447
00:55:03,800 --> 00:55:11,680
done the development, we've done the testing, and so now it's time to publish.

448
00:55:11,680 --> 00:55:20,560
Well, that's easy to do; it's just NPM, and that's why I

449
00:55:20,560 --> 00:55:31,280
have Node: to expedite the publishing process. So what's going on here? If

450
00:55:31,280 --> 00:55:39,760
you go and look over here at publish, basically we're containerizing the API.

451
00:55:39,760 --> 00:55:52,640
We create a container of the API, then we push the container up to Google Cloud,

452
00:55:52,640 --> 00:56:04,440
and then we redirect all hosting from Firebase. So

453
00:56:04,440 --> 00:56:13,200
over in Firebase we've actually got hosting and you can set up multiple

454
00:56:13,200 --> 00:56:20,760
sites here, and we've actually got the Cannlytics API up

455
00:56:20,760 --> 00:56:30,640
here. And so right now we're containerizing everything, and

456
00:56:30,640 --> 00:56:38,560
it's going to be a little bit of a build process here, so we're publishing that.

457
00:56:40,920 --> 00:56:46,760
So I'll just minimize this and then bring it all home here, so

458
00:56:46,760 --> 00:57:02,520
that way we can get people out of here in time for their next seminar.

459
00:57:03,720 --> 00:57:13,000
All right, so sorry to go on that long spiel there, but

460
00:57:13,000 --> 00:57:19,280
what's your input? Too complicated of a way to get data, or

461
00:57:19,280 --> 00:57:30,960
a manageable way? I mean, it's really cool. It's probably a little more than what I

462
00:57:30,960 --> 00:57:41,200
would need, but it's really interesting. Well, this just got posted over here, so

463
00:57:41,200 --> 00:57:44,760
here in this last couple of minutes

464
00:57:52,520 --> 00:57:57,360
We'll see. If I wanted to do more advanced things, this would

465
00:57:57,360 --> 00:58:02,440
be really useful. Yes, so now it's published. So in this last minute,

466
00:58:02,440 --> 00:58:14,720
let's see if we can't move over to the production endpoint and get the same results. It

467
00:58:14,720 --> 00:58:17,640
looks like

468
00:58:17,640 --> 00:58:24,640
there was an error along the way. That's interesting. Oh, I need a...

469
00:58:31,640 --> 00:58:34,640
but it's interesting

470
00:58:34,640 --> 00:58:53,320
Oh, I need a... but we got 200 status codes, and so if we check out our data here,

471
00:58:53,320 --> 00:59:04,440
we've got the two MMEs and we have our two lab results. So thank you for bearing

472
00:59:04,440 --> 00:59:13,200
with it, but that's essentially an MVP, a minimum viable

473
00:59:13,200 --> 00:59:21,640
product, right? So we've basically taken our data, we've gotten it into

474
00:59:21,640 --> 00:59:39,040
Firebase storage anyways. We've got it there using the engine, and now we've

475
00:59:39,040 --> 00:59:45,040
retrieved it from the Cannlytics API. So you're welcome

476
00:59:45,040 --> 00:59:51,840
to go tinker on that and

477
00:59:56,240 --> 01:00:01,240
see what happens.

478
01:00:02,800 --> 01:00:05,800
but

479
01:00:05,800 --> 01:00:23,280
anyways, it looks like there's still some debugging to do, so I'll go ahead

480
01:00:23,280 --> 01:00:26,280
and wrap it up there

481
01:00:26,280 --> 01:00:33,200
But thank you, thank you for attending, and I'll fill you in and let you know

482
01:00:33,200 --> 01:00:37,920
what's going on next week. Next week we'll be checking out that SQL database.

483
01:00:37,920 --> 01:00:42,120
See you then. Awesome, thanks for attending, Nick.

484
01:00:42,120 --> 01:00:56,960
Michael

