0:28
Hi and welcome everyone to the Star Wars edition of AI42
0:35
I want to start by saying hi to my amazing team members, Håkon and Gosia
0:40
and our speaker today, Pete. Hello everyone, how are you? Oh, we all answered at once. Yeah, good, thank you
0:49
We were all being polite and waiting for each other. You don't need to wait
0:54
All right, so it's so good that we are here together today again. I'm very happy that so
1:01
many of you are joining us. I'll start by saying a few words about our speaker, then
1:07
Håkon and Gosia will share a few thoughts about AI42, and then we'll let Pete take the stage. Is
1:14
that all right? So, Pete is a senior solutions architect at Octopus Deploy. He is the owner
1:22
of PJG Creations, a Microsoft Certified Trainer, an Azure MVP, a Pluralsight author
1:30
an experienced public speaker, and a meetup organizer. Wow, Pete, I have so many questions
1:36
For example, what does your role involve at Octopus Deploy? Yeah, you're right
1:42
People read that stuff out and they just have to stop because there's too many words
1:47
I like to keep myself busy. So I recently joined Octopus Deploy
1:51
That was about three months ago; I think I've been there three months. And my role is in
1:56
community, really. So I'm trying to help people understand how to use Octopus better and how to
2:01
integrate their own processes with Octopus, as well as perhaps the tools that they're using or would
2:06
like to use with Octopus, as well as going out into the community and meeting people
2:11
and getting feedback that way. So yes, it's a really varied role. It takes me all over the
2:17
place. That sounds really interesting. And you also mentioned in your intro that you're an experienced
2:24
public speaker and organizer. What topics do you usually present at conferences and meetups?
2:31
Yeah, I mean, I've run my own company for 12 years, mainly doing IoT. So of course
2:37
IoT is close to my heart, and most of my talks will be about IoT, to be fair. But now
2:43
having joined Octopus, I'm picking up more and more of the DevOps side of things. So I've got a few conferences coming up where I'll be doing some DevOps, but I'll be over
2:52
at NDC Melbourne talking about a Raspberry Pi robot arm as well
2:58
So yeah, I definitely won't be losing the IoT stuff because that's good fun as well
3:03
So hopefully I'll find some time to combine the two. That sounds really interesting
3:07
And we also heard that you are helping in creating a certificate as well
3:12
Could you say a few words about that? Yeah. So way back in the day, sort of, I think, 2019
3:18
I was invited to help create the AZ-220 certification, the Microsoft Azure IoT Developer certification
3:24
And that was a really interesting process. And so I'm still involved today in refreshing that exam
3:30
and keeping it up to date, creating new questions that are relevant and knocking off ones that aren't
3:35
And I also helped create the Pluralsight content around teaching people how to pass that exam
3:42
So, yeah, reasonably embedded into that system there. That sounds really interesting as well
3:50
I think we will talk a little more before you start the session
3:53
but let me hand it over to Gosia and Håkon to say a few words about AI42
3:57
Yes, so let us talk a little bit here about the motivation here for starting AI42
4:16
So the motivation comes from the recognition that there is really no good starting material for getting into machine learning and AI
4:23
So here at AI42, we are a strong team consisting of three Microsoft AI MVPs who strive to provide you with a valuable series of lectures that will help you to jumpstart your career in data science and artificial intelligence
4:40
And we aim to provide you with the necessary know-how so that it will help you to land your dream job, as long as it's related to data science or machine learning
4:50
And this concept is quite simple. It involves professionals from all around the globe who will explain to you the underlying mathematics, statistics, probability calculations, data science, and also machine learning techniques
5:03
And that may sound like a lot, but don't worry, because we will guide you through it all. All you have to do is follow our channel and enjoy the content every second week
5:13
It will be filled with real-life cases and also expert experience
5:18
And we know that this is a lot because we have all started from scratch
5:22
but we're very happy to help you to build it up from there. And you can always stop and rewind the videos
5:28
or ask for clarification in the comment section. And we hope to assist you on this wonderful journey
5:34
and have you as a speaker one day. We believe that by creating cross-collaborations with other organizations
5:41
we can give the best opportunities to broaden your network in the AI and data science communities
5:47
And with the combination of our offered services, we will be able to support less fortunate people
5:53
and organizations that are not that recognized yet, even though they deserve it
6:00
And our organization is sponsored by Microsoft and MICE, and we are humbled by all the support we get from our contributors as well
6:08
and thank you to Levent Pongor for all the beautiful graphic content and to Minard Marie
6:13
for the cool intro before our event. We are also in close collaboration with C# Corner
6:20
and the Global AI Community, so our lectures are going to be there on their YouTube channel as
6:25
well as on our social media. Nicolet Tor creates and reviews all the text content that we use on
6:32
our website and during our sessions. You can follow us on Facebook, Instagram and Twitter
6:41
to become a part of our growing community where we are sharing knowledge and having fun
6:46
You'll find all the information that will bring you to an advanced level in the field of AI and data
6:52
science. You can watch our recorded sessions on our YouTube channel and find our upcoming sessions on
6:59
our meetup page. And then we also have our code of conduct. So the code of conduct will outline
7:06
the expectations for participation in our community, as well as the steps for reporting
7:12
unacceptable behavior. And we are committed to providing a welcoming and inspiring community
7:17
for everyone. So be friendly and patient, be welcoming, and also be respectful of each other
7:24
And you can find our code of conduct at this link here. With that said, let us go back to the studio
7:46
Hi, welcome back, everyone. I'm so happy to be back with you; all four of us are here
7:53
to hear what we are going to talk about today. So, Pete, what are you going to cover?
8:01
Yeah, what was interesting is that my Percept was busy listening in the background there and decided to trigger
8:07
I thought I'll just go and have a look at what it's doing. And it was listening to almost all of that conversation
8:11
and it was like, oh, how am I going to stop that? So I had to mute you all and let it time out and then refresh pages
8:17
So that's probably going to be a theme of this particular talk
8:21
So I'm going to be talking today about how you can, along a journey, control your home using Azure Percept
8:29
So yeah, it's going to be quite an interesting talk. It's demo heavy, and I've got lots and lots of virtual desktops to go through and lots that can go wrong
8:37
So we're going to be saying prayers to, I don't know, Boba Fett or something like that to make sure that my demos work
8:45
That sounds really interesting, and I think I will let you take the stage
8:51
and I wish you the best of luck, and may the Fourth be with you
8:57
Very good. So, yeah, thank you for coming along to this talk
9:12
I'm going to be talking about the Azure Percept and how we can use that, as I say
9:16
to automate your home. So first, a little bit about me. We've talked about this briefly
9:22
so I won't go into a great deal of detail, but I'm a senior solutions architect at Octopus Deploy
9:27
and I'm also a Microsoft Certified Trainer, an Azure MVP, and a Pluralsight author
9:33
I've been doing this for, I don't know, well over 20 years now
9:37
And so, yeah, IoT, pretty close to my heart now. I'm also a code club organizer, a STEM ambassador
9:45
and I've got two small children at home, two young girls. And so making sure that we can democratize this information
9:51
is really close to my heart. It's something that I strive to do
9:56
And I'm looking forward to when schools are fully reopened so that they can run code clubs again
10:02
Certainly the ones in our area are still a little bit shut down for that, sadly. So, yeah, that's about me
10:08
So diving straight in, what is the Azure Percept? Well, this is Microsoft's venture into AI at the edge, and it's underpinned by some hardware
10:19
So the center of all of this is the carrier board, and that's the brains of the operation
10:24
This is what's going to link everything together and do the processing. So onwards from that, we then plug in a vision board, and that allows us to be able to do vision processing at the edge
10:38
And then the pertinent part of this talk is the audio module
10:42
And this is actually an optional module in the Percept bundle. But this is the part that's going to recognize voice and be able to then translate that back out into an audio format
10:53
So, yeah, it's pretty much just those and a power supply, but I'm not going to put that on the screen
11:00
Some specs, there's no need to go into too much detail about this. But what it's trying to do is give you enough power to be able to do AI processing down here at the edge and then control all of those components
11:12
So it's a reasonably powerful IMX 8M processor. And it's got TPM built on board as well, which helps us with our security, as well as Wi-Fi and Bluetooth and things like that
11:24
It's actually running Microsoft's own distribution of Linux, CBL-Mariner. So that's just an interesting piece to be aware of
11:33
And then we're going to be running IoT Edge on this device. And then Percept Studio is going to underpin our cloud services as well
11:42
Moving on, we've got the Vision module. I'm going to do a bit about Vision in this talk, but not a huge amount
11:47
because it's sort of tangential to what we're talking about. But this is an 8-megapixel camera capable of capturing at 30 frames a second
11:54
And it's got a nice wide 120-degree field of view. And then that's hooking into Azure Custom Vision, which gives you another no-code solution to how this is working
12:06
And then the part that we'll be using mostly today will be the audio module
12:11
Again, this is an optional module that you add to the Percept bundle if you want to do the audio processing side of things
12:18
It's got four microphones in there and a three and a half mil socket to plug some speakers into, which is what I've got plugged in here, which you'll see shortly
12:28
And then that's going to use speech services and Azure Speech Studio. And again, it's no code
12:33
So to get a lot of this working, you're just going to be configuring stuff in the cloud on the web
12:40
So pretty straightforward overall, really. So moving on, these hardware components themselves underpinned by some software
12:50
On that carrier board, we're using IoT Hub and IoT Edge. So this allows you to be able to have that Edge processing capability and then communicate up to Azure through the Edge gateway paradigm of the IoT Hub
13:05
And of course, this is a highly available service, allowing us to be able to connect millions of devices together
13:10
without the need for a busy tone at any point. On the speech side of things, we've got speech services
13:19
including LUIS and things like that. So that's the Language Understanding Intelligent Service
13:23
and Speech Studio. So that's going to be doing our no-code speech projects
13:30
And then for the vision, we're using custom vision, which I'll dive into and show you these bits as we go
13:37
So that's enough of those bits of slides because, let's be honest, you're here for the demos
13:42
So what I've got is many, many, many, many desktops set up
13:46
So if I flick actually to this page just here, that was a spoiler you saw there
13:51
This is the set of kit and this is live. Hello. So we've got the carrier board just here
13:56
and then we've got the vision module and then the audio module with some speakers plugged into it
14:02
and then Chewbacca just making sure that we're doing the right things. And if we're not, then he's going to get angry and start firing arrows at me
14:09
He won't, trust me. So as a little preface, I actually wanted to add way more Star Wars
14:15
into this talk, it being May the 4th. But sadly, some of the libraries that I wanted to use
14:23
to get a particular project to work have recently been updated and broken
14:27
So, yeah, that made me a bit sad. But don't worry, there's bound to be some puns as we go along
14:35
So those are the various different bits. And as I mentioned, we're going to be spending some time in something called Azure Percept Studio
14:42
So when you get your Percept, you'll come along into here. And this is where you're going to get started
14:50
What's great is there's a bunch of demos and tutorials. And you can try out things like some sample vision modules here if you wanted to, or create your own vision prototype, or some sample applications for people counting and vision on the edge and things like that. So that works quite nicely, to
15:07
be able to do that. But what you can do is you can create a custom vision project to be able to
15:12
recognize objects. And I've done just that. So hopefully this will actually still be working
15:18
So there we are, there's my hand and you can sort of see that working there. So what I've done is
15:22
built a really simple and obvious sort of demo here. And don't worry, I've got a hammer
15:31
But it's OK because the Azure Percept knows it's a hammer look
15:35
So it's recognizing that hammer in real time, which is great. And it can also tell the difference between that
15:41
and perhaps a screwdriver. So there's a screwdriver look. So I've trained it to recognize those particular objects
15:48
which actually this came from a demo that I gave way back in the day
15:56
And I think Gosia was there. We co-organized an event around AI. And it was from an AI42 sort of a project where they put up a bunch of images
16:09
that were shown out of context, like the end of a hammer chopped off and the handle of a screwdriver chopped off, just to prove that there's a lot of bias
16:18
when you're training stuff like this. But we can see this. So if I go to my custom vision project
16:24
so this is called Percept Vision 1, and what you have is a whole heap of training images
16:29
And if I go into one of the training images, you can see I've trained a screwdriver there
16:33
So it recognizes that. And I've tried to be reasonably good about it
16:38
So if not all of the screwdriver is there, like this one perhaps, look, I'm only labeling the screwdriver part as the screwdriver
16:44
just to kind of try and give it a bit more context because it could easily think that if I'm tagging my hand in there
16:50
that my hand is part of the screwdriver, which we know it's not. So when you've done that, then you're able to train
16:56
the whole AI package then, the model, and then deploy that down onto the device
17:04
You can get more information about it in here. I've actually gone through six iterations of this training
17:11
Actually, I trained it on iterations four and five and made the whole thing worse when I started to do some more training
17:16
It doesn't always work. Certainly, you can overtrain your model as well on certain things
17:21
So just bear that in mind. You can actually see some of this if you go to the vision side of things
17:28
We can see there's the Percept Vision 1. This is an older project on image classification
17:34
But you've got this Percept Vision 1. And clicking into that actually gives us a bit more information about that
17:40
So there's the project and some information about that particular iteration and onwards from that
17:47
So, yeah, again, that's pretty straightforward. It's not particularly difficult. And once you've trained that
17:54
you just go to evaluate and deploy, and then you can deploy it down onto your Percept
17:59
So in my case, that would be this Percept 1, and you just deploy it
18:02
And it takes, I don't know, maybe three, four minutes to deploy because there's a fair amount in that model
18:08
And then once it's on there, then you go back to your web stream and you can test it out
18:13
So, yeah, it works quite nicely. And I mean, I wanted to give you an idea of what that vision module does and how you could use it
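As an aside for readers following along: once a project like Percept Vision 1 has a published iteration, you can also call it from code with the Custom Vision prediction SDK instead of the web test page. A minimal sketch; the keys, endpoint, project ID, and iteration name are all hypothetical placeholders:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.CognitiveServices.Vision.CustomVision.Prediction;

class Program
{
    static async Task Main()
    {
        // Hypothetical values; yours come from the Custom Vision portal
        // (Settings page and the Performance tab's "Publish" details).
        var predictionKey = "<prediction-key>";
        var endpoint = "https://<your-resource>.cognitiveservices.azure.com/";
        var projectId = Guid.Parse("<project-id>");
        var publishedName = "Iteration6"; // the published iteration name

        var client = new CustomVisionPredictionClient(
            new ApiKeyServiceClientCredentials(predictionKey))
        {
            Endpoint = endpoint
        };

        // Send an image to the object detection endpoint and list what it found.
        using var image = File.OpenRead("hammer.jpg");
        var result = await client.DetectImageAsync(projectId, publishedName, image);

        foreach (var prediction in result.Predictions)
        {
            Console.WriteLine($"{prediction.TagName}: {prediction.Probability:P1}");
        }
    }
}
```

That hits the cloud prediction endpoint, of course; the point of the Percept is that the same trained model is deployed down to the device and runs locally.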
18:21
And there's way more interesting things you can do with that. I know my friend Cliff Agius has created something that recognizes all of the parts of an airplane he's going to build
18:29
Yeah, he's a pilot and he's going to build an actual airplane. And if you imagine there's quite a lot of parts that you don't know quite what they look like
18:36
and you pick them up and you've not got the manual. And so he can hold it in front of his percept and it'll actually tell him what that part is, which I thought was quite cool
18:43
I also saw a demo that came out of the Microsoft Hackathon where they had a Percept camera pointing at the tray table in an operating theater
18:54
And it would have all of the instruments that the doctors and nurses were using there
19:01
And it would recognize if any of them were missing and which ones were actually there
19:05
So that was quite a good real world example of what you could do with this. So I quite like that
19:10
So moving on to the sound element of things. Again, if you go across to our Percept and we go back to Percept Studio
19:20
and we go back to Overview, in Demos and Tutorials, there's a few speech tutorials and demos
19:27
And one of them is this voice assistant. And so you can see some ideas of what Microsoft thinks
19:34
you could use this for. And obviously, we're not limited to these ideas
19:38
but it gives you a way to be able to test it and understand how this works under the hood
19:44
So we see this hospitality, healthcare, inventory, and automotive as an example there
19:49
I've actually deployed that hospitality demo over here. And you can see why I'm using this one, because actually what I'm going to be demonstrating here is a demo
20:00
if you will, of home automation using the Percept. And this does actually work in theory
20:07
So let's see what happens. Computer, computer, turn the television on
20:17
Okay, turning the TV on. Let me see, the television's on. Computer, turn the lights on. Sorry,
20:26
I didn't quite catch that. Computer, turn the lights on. Okay, turning all the lights on. And it's contextual as well, so it knows. So computer, computer,
20:42
turn the lights on. All lights already on. So it already knows that the lights are on, which, you
20:51
know, so we can feed back that sort of information. We can see we've got a thermostat there as
20:56
well. So computer, set the thermostat to 68 degrees. Set temperature to 68. There we go
21:06
And we can just turn bits on and off as well. So computer,
21:11
computer, turn the bathroom lights off. Okay, turning off the bathroom light. So yeah, I mean,
21:21
That gives us an idea of how you'd use the sound elements of this
21:26
And obviously, I'm talking and then expecting it to be able to pick me up straight away without leaving enough of a gap
21:31
So really, you can blame me for that. But you can see that it's quite reactive
21:37
And it gives you a good idea about how you could go about building this sort of a system up from scratch
21:44
Under the hood, this is using a bunch of speech services. So we're able to convert our spoken word into text, and then we can use LUIS to understand the intent behind our words there and then translate that into an action afterwards
22:06
And then we feed back from that action a sentence, if you will, a contextual sentence to say what's actually happened
22:13
And then the speech module there will then convert that from text back into speech
22:17
So end to end, it's using a fair amount of different services here to hook that all back up together again
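None of that needs code on the Percept, since Speech Studio wires it up for you, but for intuition, here is roughly the same loop driven by hand with the Speech SDK and a LUIS app. A minimal sketch; the keys, region, app ID, and intent names are all hypothetical:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.CognitiveServices.Speech;
using Microsoft.CognitiveServices.Speech.Intent;

class Program
{
    static async Task Main()
    {
        // Hypothetical placeholders; for intent recognition the config uses
        // the LUIS prediction key and region rather than a plain speech key.
        var config = SpeechConfig.FromSubscription("<luis-prediction-key>", "<region>");

        // Speech-to-text plus intent: recognize one utterance from the microphone.
        using var recognizer = new IntentRecognizer(config);
        var model = LanguageUnderstandingModel.FromAppId("<luis-app-id>");
        recognizer.AddIntent(model, "TurnOnLight", "light-on"); // hypothetical intent

        var result = await recognizer.RecognizeOnceAsync();
        if (result.Reason == ResultReason.RecognizedIntent)
        {
            Console.WriteLine($"Heard: {result.Text} -> intent: {result.IntentId}");

            // Act on the intent here, then feed a contextual sentence back out
            // through text-to-speech, just like the Percept's spoken replies.
            using var synthesizer = new SpeechSynthesizer(config);
            await synthesizer.SpeakTextAsync("Okay, turning the light on.");
        }
    }
}
```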
22:24
So it seems to work quite well. I quite like it. But that kind of just gives you a flavor and doesn't really give us exactly what we want. So if I switch back to my slides now and we go across,
22:41
Home automation is what we really want to talk about. So what I've set up on my desk here is a system that sort of jumps through quite a few hoops
22:54
And so I want to show you this, and then I'll show you the demo afterwards. And we also got to hope the demo works
22:59
But that aside, I'm starting with the Percept audio module, and I'm going to be able to give that commands
23:06
And then that's going to use all those services in the web to be able to then convert that, the intent of what we want it to do
23:13
and out through those speech services to an Azure function. So you'll see there that once we've understood the command
23:22
And we can then, using a web endpoint, call out to an Azure function to be able to do something
23:28
Sadly, there's no direct integration with that and an IoT hub, which is the next on the steps there
23:34
But the Azure function, that gives us the control after that point. If you can get out to an Azure function, you can do pretty much anything you want
23:40
I am looking forward to the day where we get direct IoT hub integration. I think that would be pretty powerful when we get that
23:46
So from the speech service, the speech project, out through the Azure function, I deliver what it is that I want it to do, the command
23:56
And then that goes out to an IoT hub and then onto a Raspberry Pi
24:00
So really, I'm using the IoT hub as a broker, if you will, a gateway for us to be able to communicate with the Pi
24:07
So I call into the IoT Hub and actually invoke a method on the Pi, which then, at that point, will connect to a mains relay and to a desk lamp to turn the desk lamp on
24:20
And obviously, you can see there's quite a few hops there. So we're fraught with danger, but it should be fine
24:25
It should all work. It's all good. So let's go see what that looks like, shall we
24:31
Just to give you an idea of what it is we're doing here
24:36
I've got the speech studio on the screen here. And what we have is we have the custom command
24:45
and I've just got one in this case, so turn the light on or off. And we have a bunch of example sentences
24:51
So in here, if I zoom in a bit, you'll be able to see a little bit better. You can see that I've spelt out the number of different ways
24:58
we can actually turn a lamp or a light on and off. And, yeah, we could go on and on and on at that point
25:03
But it needs to understand that you could say this in many different ways
25:07
And this is like any home assistant, if you will. So in our case, we've got turn the light
25:14
And then we've got some green text here. And that's actually a parameter that we can see on the left-hand side
25:19
You could have a turn light on command and a turn light off command, but that's a bit wasteful
25:24
So we can actually accept parameters in here. So if I go to the parameter, you can see that it's a type of a string
25:31
which makes sense. We could have a default value, but I mean, which are you going to default to, on or off
25:36
So we're not going to use that. And the values we can have for that are either on or off, so pretty straightforward
25:43
And then at the end of that, we've got what's called a completion rule. And I've created one of those that's just called done
25:49
And that calls a web endpoint. And if I go into the web endpoints here, then we have a URL that it calls into
25:56
and I'm using post to be able to call into that. So it understands that command, and then it calls out to that web endpoint
26:05
Actually, if I go back to turn light on and off and back to done, if I click on this edit here, I'm just passing a state of that parameter that we had of on or off
26:16
And it's going to call into an endpoint called light control. And you'll see then when I show you the code that the Azure function is then initiating that process through the IoT hub and down to the Raspberry Pi
26:28
So that gives you some idea of what that looks like. So let's see if we can get it to work, shall we
26:32
So the first thing I need to do, though, is I need to tell my device that it's going to be using that particular speech project
26:42
I think we'll do it from here. Go to commands. There we are
26:47
You can actually see I can assign different keywords as well. But the one I want in this case is this Percept IoT Hub speech
26:54
and if I click assign, if I go back there, I'll need to assign it to my percept and then click save
27:03
And if I'm fast enough, you see the LEDs on the sound module
27:08
flick on and off. That's it assigned. So it should be down on our device now
27:15
So the code is actually running on my Pi. So I've SSH'd into my Pi and this is all in .NET
27:23
and I'll show you the code for it shortly. But let's see if it works. So computer, computer,
27:32
turn the light on. Good. So if I go back and then
27:42
should be able to see now that we are lit up over there in our shadow. So computer
27:49
turn the light off. The light has been turned off. I'm going to do this one more time, shaking my hands just so you
27:57
know I'm not cheating over here. There we are. Computer, turn the light on. There's your fallback message there,
28:06
thank you. Computer, turn the light on. So yeah, that works. So I'm really happy with that, actually. That's
28:18
pretty impressive at that point. And so, yeah, that's going through quite a few different hops
28:24
So if I come across here, and the first thing it's doing is going to the Azure function
28:30
So if you remember from the slides, that way there, you can see that it comes out of Speech Studio
28:35
and I showed you how that gets out through that web endpoint, and then out to an Azure function
28:41
And this is the Azure function. So all this is an HTTP-triggered Azure function
28:48
So I'm not doing anything particularly special. And in fact, most of the code that you'll see here
28:54
you get when you create a new HTTP triggered Azure function. But what I have done is I've added a couple of extra bits onto the bottom here
29:03
And mainly it's this Microsoft.Azure.Devices NuGet package. And that's allowing me to be able to then hook up to the IoT hub
29:14
Scrolling down a little bit further, we have a connection string that I'm storing in my environment variables
29:21
So I pull that in here. And then there's a bunch of scaffolded stuff here
29:27
that we've not needed to worry too much about. That you'll get automatically
29:33
And then I log out what that state was that I've passed in as part of the function
29:38
So I can grab that out from the data and then stick it out onto the console
29:43
And then I just say that my HTTP triggered function was executed successfully
29:49
And then I create a new service client. So just to give you a little bit of background
29:54
when you are trying to control devices that are connected to an IoT hub
29:59
you need to connect with a new service client at what's called the service-level permission. If you're a device,
30:04
then you're going to be connecting at the device-level permission. And the reason is that we don't want devices
30:10
to be able to control other devices. So the service level permission allows you
30:13
to be able to control devices from that point. I then invoke this method down here
30:22
passing in that state. And there's not a huge amount to this
30:26
So I say I want to create a new cloud-to-device method. So the IoT hub is in the cloud, and the device sits there waiting, as you'll see in the code, for this particular method to be invoked on it
30:38
And so I create one of those. I say timeout after 30 seconds because, you know, the device could be offline
30:44
And I set the payload to be pretty much exactly the same JSON that I'm passing in with the state
30:51
And that's either going to be on or off, if you remember. And then we wait for a response when we invoke that on the control device
31:01
That's the name of my device there. And then that's the method invocation that we're creating above
31:06
And then we just write out to the console what's happened at that point. And we can actually see some of that in action if I go across to my Azure function
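To recap that in one place for readers, here is a minimal sketch of what a function along those lines might look like. The app-setting name, device ID, and method name are stand-ins for whatever you registered, so treat this as an assumption-laden outline rather than the exact code on screen:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.Devices;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static class PerceptIoT
{
    // Service-level connection string, pulled from app settings
    // (the setting name here is a hypothetical placeholder).
    private static readonly string HubConnectionString =
        Environment.GetEnvironmentVariable("IotHubServiceConnectionString");

    [FunctionName("PerceptIoT")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        // Speech Studio's completion rule posts a small JSON body like {"state":"on"}.
        var body = await new StreamReader(req.Body).ReadToEndAsync();
        dynamic data = JsonConvert.DeserializeObject(body);
        string state = data?.state;
        log.LogInformation($"State requested: {state}");

        // Service-level client: this permission level may invoke methods on devices.
        using var serviceClient = ServiceClient.CreateFromConnectionString(HubConnectionString);

        // Cloud-to-device direct method; give up if the Pi doesn't answer in 30 seconds.
        var method = new CloudToDeviceMethod("ControlLight")
        {
            ResponseTimeout = TimeSpan.FromSeconds(30)
        };
        method.SetPayloadJson(body);

        // "controldevice" stands in for the device id registered in the hub.
        var result = await serviceClient.InvokeDeviceMethodAsync("controldevice", method);
        log.LogInformation($"Device responded with status {result.Status}");

        return new OkObjectResult($"Light state '{state}' requested");
    }
}
```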
31:17
Then if I can go into one of these, this is an old one, but you can see that this was executing
31:23
If I zoom in a little bit further, we'll be able to see that. We're executing that PerceptIoT function
31:27
We've passed in a state of off. The state was off, and then we executed the PerceptIoT function successfully
31:34
So that's just a bit of logging that we get coming out of that. So going across a bit further, we come out of the Azure function
31:43
and then go through the IoT hub and down to our device. So all of this code is all .NET
31:50
.NET 5 in this case, actually, rather than .NET 6, just because I've not updated it
31:55
And here we're also using the Azure Devices NuGet package, but we're using the client side of things here for devices
32:05
And then I'm also pulling in this System.Device.Gpio NuGet package. So Microsoft.Azure.Devices.Client is going to allow me to connect up to the IoT hub
32:15
so that I can listen to that method invocation from the Azure function
32:19
And then System.Device.Gpio, that NuGet package, allows us to be able to communicate
32:24
with the pins that are sitting on the side of our Raspberry Pi
32:29
I've got another talk that goes into a bit more detail about how that all works, but GPIO is general purpose input output
32:36
And so that's a collection of pins that give us, you know, things to be able to turn LEDs off
32:40
or read buttons or control serial ports or pulse width modulation for robot arms
32:48
and things like that. So that particular library is hugely important, because we couldn't do any of this without it
32:56
So next, I create a device connection string, which I've not put in here. And then I set aside a pin number
33:03
that I've got that relay connected to. And in this case, it's pin 32
33:08
It's worth understanding that relays work a bit backwards. So you don't send a pin high
33:15
You don't write a true value to a pin to turn a relay on. You have to send a low value, a false value to it
33:21
So you'll see in a bit that it seems a bit backwards, but that'll make some sense
33:26
Next up, we create this GPIO controller, which is coming from that device.gpio NuGet package
33:33
We're passing in a pin numbering scheme of Board. And it's worth knowing the difference here,
33:39
because you can also have BCM. And BCM is referring to a chip on the Raspberry Pi
33:44
that controls the GPIO. But if you go ahead and use that numbering scheme
33:48
then you have to convert from the BCM numbering scheme to the board numbering scheme in your head
33:51
or use some form of a little board that you put on there
33:55
like I've got. It just seems like a waste. Just stick to board. That's my tip of the day
34:00
Next up, we open that light pin up as an output because we're going to be controlling that relay
34:06
And then, as I mentioned earlier, we set that value to high because that turns the light off by default
34:12
I think that's the safest option. Next up, we create our device client
34:18
And this part here is the part that's going to connect us out to Azure, to the IoT hub
34:23
So we pass in our device connection string, and then we're using something called MQTT
34:28
which is just a lightweight messaging protocol that you can use to communicate with the IoT hub
34:32
Next up, then you can see here is the part that's waiting for that control light
34:38
And by the way, do ask your questions in the chat or wherever you need to
34:44
We're more than happy to answer those. I'll answer them as we go along or at the end
34:51
So, yeah, don't be shy. So, yeah, so we've got that control light method that we're waiting for
34:59
And so I'm awaiting that there. And then if that's received, then I just say it's been received
35:04
And I spit out the payload as JSON. So that's our state and on or off, if you remember
35:11
And then I just convert that to an object. And then if the state is on, then I set the value of that light pin to low
35:20
So again, remember, it's backwards. Setting the relay to be low will turn the light on
35:26
And setting it to be high will turn the light off. So that makes some sense
35:31
Then I'd just say that this is the response message of OK
35:35
When you invoke a method from an IoT hub to a device, it expects a response
35:40
If you remember in the Azure function there, we had a timeout of 30 seconds, and that's part of it
35:46
So it's waiting for that response to happen. You can, of course, say that it's not OK, and it's still fine with that, but then you need to handle it
35:54
and then we just return that response message with a 200, which is the HTTP code for okay
36:02
And then I say, I'm waiting for the command, and then I do nothing at all
36:06
I just sit there and wait, and that's all I'm doing in this particular code. And we could see that
36:11
So that's what it was doing there, so waiting for command, and then the IoT hub invoked the control light method
36:17
and the payload was that state of on, and then it went onwards and upwards from there
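For reference, here is a minimal sketch of that device-side program, assuming the same active-low relay on board pin 32 and a "ControlLight" method name; the connection string is a placeholder, exactly as in the talk:

```csharp
using System;
using System.Device.Gpio;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;
using Newtonsoft.Json;

class Program
{
    // Hypothetical placeholder; the real value comes from the IoT Hub device registry.
    private const string DeviceConnectionString = "<device-connection-string>";

    // Physical (board-numbered) pin wired to the relay's input.
    private const int LightPin = 32;

    private static GpioController _gpio;

    static async Task Main()
    {
        _gpio = new GpioController(PinNumberingScheme.Board);
        _gpio.OpenPin(LightPin, PinMode.Output);

        // The relay is active-low: high means off, so default to off.
        _gpio.Write(LightPin, PinValue.High);

        using var deviceClient = DeviceClient.CreateFromConnectionString(
            DeviceConnectionString, TransportType.Mqtt);

        // Register the direct-method handler that the Azure function invokes.
        await deviceClient.SetMethodHandlerAsync("ControlLight", ControlLight, null);

        Console.WriteLine("Waiting for command...");
        await Task.Delay(-1); // Sit and wait forever.
    }

    private static Task<MethodResponse> ControlLight(MethodRequest request, object userContext)
    {
        Console.WriteLine($"ControlLight invoked: {request.DataAsJson}");

        var command = JsonConvert.DeserializeObject<LightCommand>(request.DataAsJson);

        // Active-low relay: low turns the light on, high turns it off.
        _gpio.Write(LightPin, command.State == "on" ? PinValue.Low : PinValue.High);

        // The IoT hub expects a response; 200 is the HTTP code for OK.
        var payload = Encoding.UTF8.GetBytes("{\"result\":\"OK\"}");
        return Task.FromResult(new MethodResponse(payload, 200));
    }

    private class LightCommand
    {
        [JsonProperty("state")]
        public string State { get; set; }
    }
}
```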
36:24
So, I mean, that gives you some good idea about how a lot of that is hanging together
36:32
And actually, I'm really sad. I did want to demonstrate a little board down here
36:36
And if you kind of put me back to full screen rather than sharing my desktop, then what I've got here is an ESP32 with some LEDs and a speaker attached to it
36:47
And my plan was to, at this point, switch across to this demo
36:53
and invoke that method by turning on the light, and what it would have done is flash those LEDs
36:59
and played the Star Wars theme tune. But unfortunately, because of how development works
37:06
I'm using something called the .NET nanoFramework, which is fantastic. José Simões and a load of other people, including Laurent Ellerbach over at Microsoft, are involved in this open-source project
37:18
to bring .NET to microcontrollers. So these are teeny tiny little memory limited
37:24
CPU limited devices. And we're able to run .NET directly on these devices
37:29
which is hugely influential, actually, because ordinarily it'd be in something like C or C++
37:35
which is historically very difficult to get used to, certainly when it comes to memory
37:40
It can be very complicated. You can absolutely do a lot of this in C
37:44
There's nothing stopping you from doing that. But C Sharp is a lot easier. But if we flick back to my screen share
37:50
I can actually show you some of the code that I would have run. Here's what you could have had
37:55
So here's some Star Wars code. So I've got almost exactly the same sorts of things
38:01
So I've got a GPIO controller, and I'm setting aside some pins, blue and red pins, and some variables
38:09
And all of these constants here are the notes that would have come out of my speaker
38:14
And then a bit further down, we set up our connection to the IoT hub
38:20
And then I set up a GPIO controller, and then I set up my pins
38:26
So I've got the blue and the red LEDs here, and I'm setting those both to be outputs
38:31
and then I just default them to be off to begin with. And then the ESP32 that I'm using down here
38:37
has got Wi-Fi built in, so it's really cool. So we can connect that to the Wi-Fi
38:42
and then hook up some handlers and then we open the connection
38:46
This is the part that's broken at the moment, sadly, is when you open your connection, you get a socket error
38:51
And I set Jose off earlier fixing it. It's possible he's actually fixed it
38:54
while I've been on this talk, but I'm nowhere near brave enough
38:59
to try and test this out live on air because anything can happen. But hopefully, my plan is that I'll finish off this and go and check
39:07
and hopefully you've fixed it. And what I'll do is I'll record a little video of it
39:11
and then I'll tweet that out. So keep your eyes open on my Twitter there for that to happen
39:18
And you can see then we can do things with device twins if we want to. But a little bit further down, we've got a beep
39:25
and that's what's controlling our beep. And then we've got some music that happens down there
39:29
blinking LEDs, connecting to the Wi-Fi. And then at the bottom, we've got our control lights
39:35
And this is the exact same type of routine that we had over on the Raspberry Pi
39:40
So I still get the payload and I spit that out into a debug line and deserialize it
39:46
And then I would have started the Imperial March at that point and started the LEDs blinking
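Since it's the IoT Hub connection that's broken, here is just the GPIO shape of what that control-lights routine would have done on the ESP32. The .NET nanoFramework reuses the System.Device.Gpio API, and the pin numbers here are made up for illustration:

```csharp
using System.Device.Gpio;
using System.Threading;

public class BlinkDemo
{
    // Hypothetical ESP32 GPIO numbers for the two LEDs on the breadboard.
    private const int BluePin = 18;
    private const int RedPin = 19;

    public static void Main()
    {
        var gpio = new GpioController();
        gpio.OpenPin(BluePin, PinMode.Output);
        gpio.OpenPin(RedPin, PinMode.Output);

        // Alternate the LEDs, roughly what the handler would do while the
        // Imperial March plays on the speaker (the notes are driven separately).
        while (true)
        {
            gpio.Write(BluePin, PinValue.High);
            gpio.Write(RedPin, PinValue.Low);
            Thread.Sleep(250);

            gpio.Write(BluePin, PinValue.Low);
            gpio.Write(RedPin, PinValue.High);
            Thread.Sleep(250);
        }
    }
}
```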
39:51
But I'm very sad, shed a tear that that's not working. So all you can do is just, you know, be happy that we've got Chewbacca in here
39:59
But yeah, that's how that would have worked. What's great about it, by the way, is that you can do
40:06
all of this in Visual Studio. There's a fantastic extension for the .NET
40:10
nanoFramework that allows you to actually debug this live on that ESP32. So
40:16
if you're only just starting your journey, if you start looking at the ESP32,
40:19
debugging these things is ordinarily really difficult, but with the .NET nanoFramework
40:24
extension, and using .NET nanoFramework, once you've flashed the thing with the firmware,
40:30
you can then actually debug it live, step through your code, set breakpoints,
40:35
get values out of variables. Yeah, it's super powerful. And I'm very sad that I can't actually demonstrate that to you
40:45
But at least you've been able to see how that's working, albeit not working under the hood
40:53
So if I go all the way back to my slides, that is the end of the demos
40:58
So you saw how that all worked together in there with, if I go back a slide, the Percept audio
41:06
going through Speech Studio, out through an Azure Function, to the IoT Hub and on to a Raspberry Pi
41:10
So quite a few hops. And as I say, one day I'd like to see that Azure Function negated
41:16
and just being able to go directly to an IoT Hub, that would be quite powerful, I think
41:20
There's some links here that I know they're going to get either tweeted out or put on the screen here
41:26
but I've got a blog post actually that you can go and check out
41:30
So over here is everything you need to be able to get this all set up
41:36
So it talks about custom commands and that, you know, that looks familiar. Yeah. That particular diagram
41:41
creating the Azure function and then using that in VS code and step-by-step
41:48
on everything. It's quite a long post, but don't be deterred. I've tried to make it as step-by-step and simple as possible
41:53
There's a question: was that the Raspberry Pi pulling from the IoT Hub, or the IoT Hub pushing to the Raspberry Pi?
42:01
Yeah, essentially it's being pushed. There's a socket connection that's opened up
42:06
and it just waits while it's connected for that command to be invoked
42:11
And so, yeah, the Azure function will call into the IoT Hub, and then the IoT Hub is then coordinating
42:17
that communication down to the device at that point. So, yeah, I hope that answers that particular question
42:23
Sorry, you've disappeared off the screen. Not quite sure what your name was there, so apologies. So, yeah, so go and check that out
42:29
I think the link will be put up there for you to be able to see
42:33
but I'm going the wrong way. But, yeah, if you go to that bit.ly there, then that'll take you to that particular piece of code
42:41
The GitHub repo for everything I've shown you, aside from the Star Wars stuff, because I didn't want to put that in there until it was working,
42:46
you can find at that bit.ly there. So the Azure function and the Raspberry Pi code
42:53
are both in there. So you can go and check that out. If you need some more information
42:57
about the Azure Percept, then check that bit.ly out. And then onwards
43:02
you can look at speech services and custom vision there as well. So that's those bits
43:08
I'm also, we're on a break at the moment, which sounds very much like Friends
43:12
but it's not like Friends. I'm part of a Twitch channel called Azureish Live
43:17
And I have a show on there along with Cliff Agius and Maria Anastasia Mastaka
43:21
where we just do a weekly roundup of IoT news and get some interesting guests on to talk about IoT tech
43:29
and stuff like that. So do go and check that out. You'll be able to find the Azureish Live YouTube channel
43:34
with our historic shows on there as well. So you can go and check that out
43:40
It was mentioned at the start, Eve mentioned it, that I helped create the AZ-220 certification
43:46
and I've also helped create the Pluralsight content to help you pass that course
43:50
So if you're interested in taking your IoT knowledge to a level that might perhaps get you a job, then this is definitely the place to start
43:58
That certification is pretty broad, let me tell you. So if you pass it, then you're in a good place
44:04
And that set of Pluralsight courses will definitely help you to do that
44:09
So you can see the Pluralsight link at the top. You can hit that up and go and check that out
44:15
There's a couple of things coming up that aren't IoT-based. So today is actually super busy for me, because it's a webinar day at Octopus Deploy. And we give this webinar three times during the day in three different time zones
44:30
So we've already done it twice for APAC and for the EU
44:34
But if you're around 8 p.m. tonight, then the final session of the day will go live then
44:41
And you can come along and listen to how to do Config as Code with Octopus Deploy
44:46
so that's quite a powerful feature. If you're back on the .NET side
44:51
I also run Notts IoT and I was mentioning .NET Nano Framework there
44:57
Well, Laurent Ellerbach, a Microsoft employee there, a principal software engineer, he's involved with that project as well
45:03
and he's going to come along next Thursday to talk to us about the .NET Nano Framework
45:07
so definitely go and check that out on Meetup. You'll be able to come along to that virtually
45:13
and see what that's all about. And finally, myself and Chris Reddington, also from Microsoft
45:19
on the 18th of May, we'll be talking about how we can use GitHub Actions with Octopus Deploy
45:24
So, yeah, lots of things coming up. If you need to contact me, then at Pete underscore codes
45:31
is definitely the place to do that on Twitter. That's the easiest. You can email me, peter.gallagher@octopus.com
45:38
And if you want to check out all of my blogs, then petecodes.co.uk is the place to do that
45:42
nottsiot.net for Notts IoT and its workshops. I'm also involved with a Loughborough-based networking group called LATi,
45:49
the Agile Engineering Podcast, which is DevOps and agile engineering, you can go along and check that out,
45:55
and then the Azureish Live Twitch channel. And the slides will be available at that bit.ly
45:59
at some point in the next day or so, once I upload them. So yeah, thank you so
46:05
much for coming along to the talk. I hope it was interesting and gave you everything you wanted
46:09
And yeah, may the Force be with you, as they say. Thank you, Pete, for the great session
46:28
it was really cool. I loved how you were having a conversation with Chewbacca
46:34
on the way. Yeah, he doesn't say much these days, you know
46:39
He's a bit quiet. He needs to speak up a bit. I was wondering, how can someone get started with all this cool stuff
46:49
without any knowledge, basically? So, for example, I might have some stuff that I could use as an IoT Edge device
47:02
but I am not sure, other than the links you just shared, where to start with this
47:09
Yeah, there's quite a few things. I mean, if you're talking about something like a Raspberry Pi
47:13
if you've got a Raspberry Pi, then if you go along to the Pete Codes website
47:18
then if you search for something like "Raspberry Pi .NET 5", Google will help you there
47:27
And so I show you how to install and use .NET 6 with the Raspberry Pi. This is quite easy these days, actually
47:34
I didn't go into any detail with it, but there's actually just one single command that I've created
47:38
that'll get the .NET framework installed on your Raspberry Pi directly, and then you're off and running with .NET
47:46
And then under that, there's a whole heap of stuff that will just get you through, like simple circuitry to be able to read a button and flash an LED
47:54
And so some of this is in my Raspberry Pi robot ARM talk, but there's a heap of different blog posts that you can go through there
48:02
And, yeah, it's all backwards compatible, which is a nice thing. So, you know, I started messing around with .NET on the Pi when it was 2.1
48:11
But when it was 3.1, then I made a bunch of articles on here that you can read
48:15
And so you can then move up to actually adding the I to IoT there as well, all the way through to remote deployment and debugging
48:23
So there's that. It's obviously self-promotion. That's where I'm going to start that
48:27
But there's a fantastic resource that the makers of AutoCAD, Autodesk, have created, called Tinkercad
48:37
And I can't show you that because I'd have to log in. It's not a great problem. But what that does is that gives you a design surface that you can drag things like Arduinos and micro bits onto
48:48
And then you can play virtually in the browser with this IoT stuff and program it directly
48:54
It's super powerful, all free. You just need to register. And you can drag all of the components that you want to
49:00
And you can drag things like motors and stuff on there and make things virtually move
49:05
And the great thing about that is that if you're doing Arduino stuff, you don't have to do it in C directly
49:10
You can drag programming blocks like Scratch, if you've ever used Scratch
49:16
You can just drag that on there. So for a beginner who will often get given something like an Arduino, to go and pick up C can be quite daunting
49:23
If you're not a developer, or even if you are, C can be quite daunting
49:27
So to be able to do it in that block-based paradigm, a bit like Logic Apps, if you will, is super powerful
49:36
I like that. And you can export it out of that environment and program it onto the real device, and it'll just work as well
49:43
So that's always my recommendation. Go and check that out. There's also Jim Bennett, who has created a fantastic series of IoT tutorials
49:53
I've not got the link for that. But if you go and search for Jim Bennett IoT classes, you'll find that on GitHub
50:01
That's really good. He's spent a lot of time on that and making it really good
50:06
So, yeah, congrats to Jim for that one. And congrats to you, too
50:11
You have some amazing blog posts here. I was just checking it out in the meantime
50:17
I also have a couple of questions here, Pete. So one of the questions is, because you were mentioning here
50:23
the .NET nano framework that you're using here on the Raspberry Pi
50:27
but is it a special version of the Raspberry Pi that is supported, and what other types of microcontrollers support that type of framework?
50:36
That's it, yeah. So just to be clear, I'm running full .NET Core for ARM or .NET 5
50:43
In my case, actually .NET 6 on my Pi these days on there. So this is just the ARM distribution of the SDKs and everything on the Pi
50:51
But on the ESP32, this little device down here, that uses this nanoFramework
50:55
And I guess the clue's in the name a little bit, with nano being so small
51:02
These devices, they just can't handle full .NET. But there is another device called a Meadow F7 that Wilderness Labs have created
51:11
And that's fantastic as well. And I'd be remiss if I didn't mention that. And that is a microcontroller with an ARM chip and an ESP32 on there
51:20
And that uses full fat .NET framework on there. So they've managed to squeeze that in there
51:25
So that's a remarkable project that Bryan Costanich and his team are working on over there at Wilderness Labs. So it's worth checking that out. But yeah, specifically nanoFramework: this is an implementation that they're spending a lot of time making
51:42
But it's pretty much full C#. Pretty much anything you want to do
51:47
is going to be included in that. And they just have sort of almost interfaces
51:53
to the actual full .NET assemblies in there. So they've just worked around the memory limitations to make that work
52:02
Yeah, so go to the nanoframework.net website to find out a bit more about that
52:09
Nice. And the second question is here, because you were also demonstrating that you were using the Azure IoT Hub
52:17
And can you say something about the cost associated with using that, for example, what sort of Azure costs there are?
52:26
Yeah, mega-low. In fact, if you're just playing, there's a free tier, and that allows you something like 8,000 messages a day
52:34
which actually sounds like a lot, but it's not a lot when you start doing the maths: a single device reporting every 30 seconds already uses 2,880 of those a day
52:40
But it's great for just getting you going, and that will cost you nothing
52:44
You can sign up for an Azure subscription, and you can get $200 free, I think it is these days
52:51
or the equivalent in pounds or euros, to be able to play. A lot of the services you'll get as part of that
52:56
will also be free for 12 months. And then after that, you convert to a regular pay as you go and pay
53:01
But even after that, I think it's for the standard tier, I think it's about 15 quid a month
53:06
And then that gets you, I think, something like 400,000 messages a day
53:12
And then you're off and running at that point. There's a basic tier as well
53:16
The basic tier cuts out things like cloud-to-device messaging. So it's no good me using the basic tier for this one,
53:22
but if all you want is to ingress into the cloud or egress out of your device to the cloud
53:29
then the basic tier can sometimes work and that's half the price which sort of makes sense
53:33
But yeah, super cheap. And even in the standard tier, with millions of devices,
53:37
you're really only limited on the number of messages per day, and then you can step up
53:42
If you start going over that, then you're probably making money, so you can afford to pay for your IoT hubs
53:48
at that point. And also, speaking here of messages, you got a message here from
53:54
Ben Watto, who was asking a question here previously in our session. Yeah, that's fantastic
54:00
Definitely do, Ben, follow me on Twitter. I've reached the 5,000 follow limit on Twitter
54:06
I follow a lot of people. Yeah, there's a limit, okay. Yeah, so I think
54:12
you'll see it's either 5,000 or 5,001 I'm on at the moment, so I keep having to go
54:16
through a list and unfollow people that aren't posting anything, and then I can follow all the people
54:20
It's really annoying. What I need is that little blue tick, and then I can do what I want at that point
54:26
But, yeah, absolutely do. I'd be keen to find out what it is you're doing
54:30
And if you've got a GitHub repo or blogs, then absolutely send them out
54:35
And if nothing else, I'd love to get you on Notts IoT if you'd be interested in speaking
54:39
So thank you, Ben. Really appreciate that. And I have a final question here
54:44
You were also mentioning the AZ-220 certification exam. And can I ask you, have you taken it yourself,
54:52
and how was it? I tried to take it, you know, they won't let me
54:57
Because I created it, they automatically award you the exam result. It would be very unfair
55:03
However, what I have done is the refresh. To keep it updated,
55:10
you have to go through this little refresh exam and I did that, but I also wrote that refresh
55:14
So I think I got 99. I didn't actually get all of them right because actually some of them are quite tricky
55:21
And I didn't actually write all of those refresh questions. It was a group exercise, but I'd seen them all, of course
55:27
But yeah, actually that exam is hard. And every time I have to create new questions
55:32
and that happens quarterly, A, I find it hard to write new questions
55:36
because I've been involved in writing all of them so far. So that's hundreds and hundreds of questions
55:41
a lot of which aren't even in the exam anymore. But also there's so much new stuff being added
55:46
all the time to that whole Azure stack. And the IoT stack is broad
55:51
So you've got the IoT Hub, but it also touches Stream Analytics and the now-defunct Time Series Insights
55:56
and machine learning and storage and permissions. And I mean, it's not as bad as AZ-400 though
56:04
which is the DevOps exam. Oh, that's, I mean, that even asks you questions
56:08
on services that aren't Microsoft owned. It's that broad. So yeah, that exam is difficult
56:14
but with AZ-220, I don't know, I'm part of it, so I'm always
56:20
pushing to make sure that if you're doing the job or you've done your research
56:26
and you've done your learning, then you should be able to pass it. There shouldn't be trick questions in there, is the point. Yes
56:33
Yes, I think that brings us to the conclusion of the questions. Cool, good
56:41
Thank you very much for having me. Thanks, everybody, for watching. Thank you a lot for your session
56:47
And Peter, please stay a little bit here on the line while we are finishing off the stream here
56:55
Of course, more than happy to. Thanks again. So, yes. So we have just a little bit of information here from us
57:05
The first thing is, if you're interested in speaking here at our show, you can always submit a session
57:18
through our call for papers, or call for speakers. You can see that on our slide over here
57:26
And we can also say something about our next session. So the next session, in two weeks, will be on Cognitive Services,
57:35
and it will be Priyanka Shah who will take that session here for us
57:41
So anything else, Eve? I don't have much more, but maybe we can tell a little bit about our plans as well,
57:55
because, as we did last year, at the end of this part of the year, let's say it like that,
58:04
before we go away for summer vacation, we would like to bring back all the speakers we've had since
58:11
January and talk with them about a very important topic in the field, which we are going to tell you
58:19
probably at the next session. So you should stay tuned. Yes. So have a happy May 4th
58:30
May the 4th be with you