Hi everyone, and thank you so much for joining us today. Welcome to our webinar on using Google Cloud products to build your IoT solutions. It looks like we have a lot of people online, so we're going to go ahead and get started.
My name is Isabel, and I'm the director of marketing here at Leverage. I'll be helping to moderate the webinar today. I'll first note that as we've been testing things out on our end, we've noticed there might be a lag once in a while, so just hang tight as we switch slides, and remember to reconnect or refresh if you haven't seen a change in a while. With that said, we're very excited to have you join us today. Again, my name is Isabel; I'll be helping to moderate and answer questions in the chat. Now I'll pass it off to Ryan to introduce our colleagues.
My name is Ryan, VP of Business Development and Marketing. My name is Yitaek Hwang; I'm the director of R&D. Hi, I'm Sameer Srivastava, and I'm a product engineer. And I'm Eric, the CEO of Leverage.
So today we're going to be sharing what we know and what we've learned about building and scaling IoT solutions with Google Cloud products. For those of you who aren't familiar with us, Leverage is an IoT software company and systems integrator that makes a solutions development platform. We work with companies and clients from startups all the way up to Fortune 500 companies, enabling them to create their own IoT solutions.
You'll be hearing from all of us throughout this webinar, and we'll have plenty of time at the end to answer questions. If we don't get to your question in the time allotted, we'll send a follow-up email so you don't miss anything. So now we'll quickly go through the agenda for the webinar.
We already went through the welcome and introductions, so we're going to start with how to choose the right IoT cloud platform and the key considerations for doing so, then why our team chose GCP and an intro to the GCP architecture. After that we'll go through a case study using GCP, which is our Siren Marine smart boating application, and we'll finish up with questions and answers from the audience, plus some questions we received ahead of time. With that, I'm going to pass it over to Yitaek, who will talk about cloud platform considerations.
All right, before we get started, can I get a quick poll on the screen to see which cloud platform everyone is currently using? I think it would be interesting to hear the pain points and the success stories with each platform later on during Q&A. Now, the most important consideration for choosing a cloud provider is reviewing your IoT use case and matching your business, technical, and operational needs to that use case. Too often we've seen companies focus too much on the technical IoT offerings and neglect to review the other components. On the business side, cost and compliance requirements can help you narrow down the cloud providers to those that meet the regulatory requirements at a price point that makes sense for you.
One thing to note here is how sustained use discounts are determined, so you can get an accurate measure of the total cost. Secondly, it's important to consider what managed services and APIs are available. All major providers offer at least one IoT-specific feature that may be critical for your use case if you decide not to build your own. For example, AWS IoT has a feature called the Device Shadow, which represents the last known state of a device. This can be really useful for getting a device's values without having to store them yourself or rebuild them from a historical data store.
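To make the shadow concept concrete, here is a minimal Python sketch of the idea: merging each reported update into a stored last-known-state document. The field names and the "None clears a field" convention are assumptions for illustration, not the actual AWS IoT Device Shadow API.

```python
# Illustrative sketch of the "device shadow" idea: keep the last known
# state of each device server-side by merging every reported update
# into a stored document. A simplified stand-in, not the AWS API.

def merge_shadow(shadow: dict, reported: dict) -> dict:
    """Merge a partial 'reported' update into the stored shadow.

    A key reported as None clears that field, mirroring how shadow
    documents let devices delete attributes.
    """
    merged = dict(shadow)
    for key, value in reported.items():
        if value is None:
            merged.pop(key, None)
        else:
            merged[key] = value
    return merged

shadow = {"battery_v": 12.6, "bilge_on": False}
shadow = merge_shadow(shadow, {"battery_v": 12.1})  # partial update
shadow = merge_shadow(shadow, {"bilge_on": None})   # clear a field
```

The payoff is exactly what the speaker describes: readers query the merged document instead of replaying the device's message history.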
Lastly, operational requirements can drive your cloud provider decision. If your company is already using a Microsoft stack, for example .NET or C#, sticking with Azure can make the most sense. Also, technical and business support is often overlooked, but an existing relationship with these companies can really accelerate growth. Now I'll pass it off to Eric, who can talk about why we chose GCP.
This will be unique to your business, but these were the guidelines we followed as we ended up selecting GCP. The first one was cost structure, which of course is always very important to any company trying to minimize costs. We're a start-up ourselves, and we have many customers in different stages of deployment: some doing minimum viable products or rapid prototyping, others doing live testing or paid pilots in a region, and some doing nationwide or even potentially global scaling. So we had to look at costs across the whole spectrum of use cases, and for us Google made a lot of sense. Some of the things I'll mention are actually common with other cloud providers: there are no upfront costs and no termination fees, though most major providers don't have those anyway. Beyond that, Google has a very interesting pay-as-you-go model where you pay per minute, which is a great billing resolution to have, and we also like their sustained use discounts, where you can get up to 30 percent off your workloads per month automatically.
So if you, for instance, left a virtual machine running and forgot to turn it off, or a part of your business was running longer than you expected, they will automatically apply up to a 30 percent discount at the end of the month when they do your billing.
You also have the ability to use committed use discounts, where you can lock in discounts upfront, and you can build custom machine types to meet your specific needs. And finally, their price discounts will automatically kick in if you hit certain thresholds of spending per year or per month, and it's nice to know that they apply those automatically.
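The sustained-use math above can be sketched in a few lines. The tier schedule below follows the incremental-rate structure GCP published around the time of this webinar (each additional quarter of the month billed at a lower rate, compounding to 30 percent for a full month); treat it as illustrative and verify current pricing before relying on it.

```python
# Sketch of how GCP-style sustained use discounts compound: each
# quarter of the month a VM keeps running is billed at a lower
# incremental rate. Tiers are illustrative of the schedule at the time.

TIERS = [(0.25, 1.0), (0.25, 0.8), (0.25, 0.6), (0.25, 0.4)]

def effective_cost(base_monthly_rate: float, fraction_of_month: float) -> float:
    """Cost after sustained-use discount for a VM that ran
    `fraction_of_month` (0.0 to 1.0) of the month."""
    cost = 0.0
    remaining = min(max(fraction_of_month, 0.0), 1.0)
    for width, rate in TIERS:
        used = min(remaining, width)
        cost += used * rate * base_monthly_rate
        remaining -= used
        if remaining <= 0:
            break
    return cost

full = effective_cost(100.0, 1.0)   # full month -> 30% discount
half = effective_cost(100.0, 0.5)   # half month -> 10% discount
```

Running a $100/month machine the whole month costs $70 under this schedule, which is where the "up to 30 percent" figure comes from.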
So the second main aspect was security. No cloud platform out there is impervious to hacking or security breaches, and none of them can cover all the bases, but Google is probably the closest in our consideration for out-of-the-box security. They provide complete encryption for data both in motion, as in transit, and at rest, which is a very nice feature. The other thing is that they run GCP on the same exact fiber backbone and infrastructure that Google itself runs on, and everyone is used to relying on Google. It's the one website you go to when you're trying to check your Wi-Fi connection, so that type of reliability, built up over many years, makes it really nice to know your applications are sitting on the exact same infrastructure.
Another thing is regulatory compliance in particular. They have quite a few certifications, such as ISO and SOC (Service Organization Controls), and we've actually come across a few use cases where SOC compliance especially was really important to the end user.
And finally, they have roughly a dozen security products that provide all different types of point security solutions covering endpoints, networks, data, and even infrastructure. Another thing we use is their container vulnerability analysis service, which automatically detects security flaws in our Docker images. The third point is strategy, and this is really a business-specific thing.
For us, partnering with Google was a very nice fit strategically. In addition to becoming a technology partner, with all the co-marketing opportunities and lead generation that go with that, we like the fact that Google is very hungry as a cloud provider. They're essentially in third place right now, which is kind of a well-known thing, behind AWS and Microsoft, but they're very aggressive and committed to becoming one of the top two players in cloud. As a start-up ourselves, we like that kind of "go for it" mentality, and because they're hungry they tend to give startups that work with them more opportunities to shine. So that was more of a strategic decision on our part.
The final thing is Google's focus as a company on artificial intelligence, or AI, and machine learning. Google just announced this week at Google I/O that they're renaming their research department to Google AI, so they are one hundred percent committed to AI, and as most of you probably know, AI and ML are key drivers of automation and efficiency for IoT applications. Those types of features are very critical to us, and we believe Google has a lead in AI over the other major players in the market; we regularly use a lot of their APIs for image recognition, Cloud ML, and other things to accomplish work for our clients. As a partner, we also get early access to private betas of advanced solutions they're working on, some of them very specific to certain industries. So for those four reasons we ended up selecting Google, and we're very happy with our choice.
At this point I'm going to hand it back to Yitaek to walk through a typical architecture within GCP as applied to an IoT application. Over to you, Yitaek.
Thank you, Eric. So what you see on the screen is taken from the Google Cloud Platform website: it's a notional architecture Google has put together for a typical IoT deployment. Over on the very left you should see devices, and these could be a single device, several devices grouped together, or even devices from a different connectivity provider. The three products I want to highlight next are Cloud IoT Core, Cloud Pub/Sub, and Cloud Functions.
So these three take care of device management, data ingestion, and the related business logic. Cloud IoT Core is Google's managed service to securely connect and manage devices, so if you don't already have a service provided by your connectivity partner, this can be a very good tool for securely connecting devices online. It also natively publishes data via Pub/Sub, which is used downstream.
These next three products deal with message processing and storage. Dataflow is a managed instance of Apache Beam; for those of you who don't know what that is, it's a really nice tool to handle stream and batch processing of messages at once. You can also pipe those messages to Bigtable for NoSQL options, or to BigQuery for long-term structured data storage and analysis.
And the next product I want to show is Cloud ML - this is probably the most exciting product listed here. It's a managed service where you can host your own TensorFlow models and let Google handle all the server setup. This is a really attractive offering for data analysts and machine learning engineers, because they can focus solely on training the model and not have to worry about deploying, auto-scaling, and setting up the server to handle requests.
Finally, on the very right you have Cloud Datalab and Data Studio. Datalab is a managed instance of Jupyter, so if you're familiar with Python analysis this can be a good way to collaborate online, and Data Studio is similar to Microsoft Power BI in that you can easily make dashboards that can be shared across different organizations. Now I'll hand it back to Eric, who can talk about our Siren Marine use case.
Great, thanks Yitaek. So we want to take some of this technology we've been talking about in components and actually bring it to life with a real use case, something most of you will understand, especially if there happen to be any boaters on the line. We want to feature one of our great customers, Siren Marine. They have made a market-defining product that allows you to stay in touch with your boat; essentially, you can think of it as OnStar for your boat. It's an aftermarket product right now - you can have it installed or install it yourself - and it allows you to remotely monitor, manage, and track your boat, which is very important. As you know, boats are very expensive, they're complex systems of systems, and they're unattended most of the time, and that's where the big issue comes in.
So owners want to maximize their enjoyment on the water, but they don't have insight into the state of their boat before they arrive, or into what's happening when they're not present, and that causes a lot of emotional and potentially financial distress. The solution is to have what Siren Marine likes to call a connected boat: a boat you can always reach out and touch, and that will alert you if something happens to it based on rules you set. That's exactly what Siren Marine, in concert with Leverage, built for the market.
So it provides the boat's current location, battery levels, bilge status, and other readings, and even security alerts, through a combination of wired and wireless sensors that can be installed either in the off-season or right at the pier, since it's a fairly simple installation. You can also set up role-based or user-defined rules for security and boat operations - say the boat starts to drift, or you lose shore power due to a storm, or someone tries to steal your boat - and you can be notified via text messages and other channels. So it's very powerful peace of mind for a boater, at a really good price.
So what I'd like to do now is hand it off to Sameer, who helped build this application with the Siren Marine staff. He'll walk you through how the system works and also tie it into the specific Google products and how we accomplished the various features you'll see. Over to you, Sameer.
Thank you Eric...
Hi everyone, I'm Sameer, and I'm a product engineer at Leverage. I'm going to go over a high-level system overview of the Siren Marine implementation. Starting on the left, we have a boat with the Siren Marine solution installed. On the boat there are various critical components to monitor, such as the battery, the high-water sensor, and the security sensors. These are really critical because your boat could be sinking, or someone could have broken into it, so it's important that a boat owner knows this information.
To the right of the boat is the Siren Marine MTC. In summary, the MTC is the brains of the Siren Marine solution: it receives inputs from the various sensors on the boat, and then it sends UDP messages over cellular to the cellular provider containing information about the boat. The cellular provider passes those UDP messages to our platform, and we unpack them, as I'll show you in a demo of the Siren Marine solution. Okay, so let me get started setting that up.
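To give a feel for what "unpacking UDP messages" involves, here is a small Python sketch with a hypothetical binary packet layout; the real CalAmp message spec used by the MTC is different and more involved.

```python
# Hypothetical sketch of unpacking a binary UDP telemetry packet. The
# field layout is invented for illustration - the real device message
# spec differs.
import struct

# layout: device id (4 bytes), battery millivolts (2), temperature in
# tenths of a degree C (2, signed), status flags (1) - big-endian
PACKET_FMT = ">IHhB"

def unpack_telemetry(payload: bytes) -> dict:
    device_id, battery_mv, temp_tenths, flags = struct.unpack(PACKET_FMT, payload)
    return {
        "device_id": device_id,
        "battery_v": battery_mv / 1000.0,
        "temp_c": temp_tenths / 10.0,
        "high_water": bool(flags & 0x01),   # bit 0: high-water sensor
        "security": bool(flags & 0x02),     # bit 1: security sensor
    }

sample = struct.pack(PACKET_FMT, 42, 12600, 215, 0b10)
msg = unpack_telemetry(sample)
```

Once decoded into a plain dictionary like this, the reading can be fanned out to the real-time and historical stores described later in the talk.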
Isabel here - as Sameer sets that up, I just want to remind everyone that if you're experiencing a bit of a lag every once in a while, just reconnect and you should be back up in a moment. And if you have any questions while we're going through this demo, or any questions that have come up so far, remember to just ask away in the chat and we'll answer them live.
So this is a screen share of an actual iPhone showing the Siren Marine app. We've made native Android and iOS apps using React Native. The first thing we see here is the map at the top of the screen showing where the boat is located, and this is really important for boat owners: it gives them the real-time location of the boat and lets them feel comfortable that their boat is where they think it is, especially because they usually aren't close to their boat during the off-season or during the week - it's really on the weekends that they go to their boat - so it gives them peace of mind to know where it is.
Additionally, with this GPS feature we have a position tile, which tells us where the boat is, and a really cool feature is the geofence. We have a geofence on the boat - as you can see here, it says "enable geofence" - so if the boat gets a certain distance away from set coordinates, the boat owner will get an alert. And the amazing thing about the Siren Marine solution is that there's actually edge alerting: even if the cloud doesn't detect anything, if the device itself detects that it's outside the geofence, it sends an alert directly over cellular, we receive it, and a text message, an email, or a notification through the phone application goes to the user telling them their boat has left the geofenced area, which is really amazing.
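The geofence check itself can be sketched in a few lines of Python: compute the great-circle distance from the mooring point and compare it to a radius. The coordinates and the 100 m radius below are made up for illustration.

```python
# Minimal sketch of the geofence check: alert when the boat's GPS fix
# falls more than a set radius from its anchor point. Uses the
# haversine formula for great-circle distance.
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def outside_geofence(anchor, fix, radius_m):
    """True when the fix is farther than radius_m from the anchor."""
    return haversine_m(anchor[0], anchor[1], fix[0], fix[1]) > radius_m

mooring = (41.45, -71.45)                                     # hypothetical mooring
at_dock = outside_geofence(mooring, (41.4501, -71.4501), 100) # ~14 m away
adrift = outside_geofence(mooring, (41.46, -71.45), 100)      # ~1.1 km away
```

The same comparison can run both in the cloud and on the device for the edge alerting the speaker describes.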
The second thing I want to show you all is the temperature chart, which shows data that we're storing using Pub/Sub as well. As the temperature loads in, the chart also uses thresholds that can trigger alerts if the temperature crosses a max or min, and the chart itself is backed by Cloud SQL and BigQuery storing the data, so we have hourly, daily, and weekly temperature data to show.
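The threshold alerting just mentioned boils down to detecting when a new reading crosses outside a configured band. Here is a hedged Python sketch of that logic; the alert-on-transition behavior (rather than alerting on every out-of-band sample) is an assumption about how such alerting is usually built.

```python
# Sketch of min/max threshold alerting: fire only when a reading
# crosses from inside the band to outside it, so a boat that is
# already too hot doesn't re-alert on every sample.

def check_threshold(prev, curr, lo, hi):
    """Return an alert string when `curr` crosses out of [lo, hi]."""
    was_ok = lo <= prev <= hi
    if was_ok and curr > hi:
        return f"temperature high: {curr}"
    if was_ok and curr < lo:
        return f"temperature low: {curr}"
    return None

normal = check_threshold(72.0, 68.5, 40.0, 90.0)   # stays in band
alert = check_threshold(72.0, 95.0, 40.0, 90.0)    # crosses the max
```

In production the alert string would be handed to the notification path described later (text, email, push).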
So now I'm going to change screens again and show real-time data updates in Firebase. All right, just hang tight while we pull that up. We're getting some good questions in the chat that we will definitely get to during the Q&A session.
Okay, so on the left here I have our Firebase console, and on the right I have the screen share of the live app again. Right now the temperature is about 72.2 degrees Fahrenheit; I'm going to go ahead and change that to 68.5 degrees Fahrenheit, and as you can see, the live app changed instantaneously. The really cool thing is that for a boat owner, it means they get instantaneous updates.
That concludes the live demo of the Siren Marine app, and now Sameer is going to walk us through some of the specific products we used for the solution, the first being Firebase.
Thanks, Isabel. So I'm going to start with the first Google product we're using in the Siren Marine solution, and the main one is Firebase. Our boaters need to see the current status of their boat, so we use Firebase's real-time database: we get pretty much instantaneous data storage once we receive an update from a device on the boat, and our React Native app listens to Firebase for these real-time updates.
So on the right there's a diagram showing a back-door security sensor; you can imagine this would be useful if someone were breaking into their boat. The first state on the left says the security is okay; then, when Firebase receives the alert from the device saying the security sensor has gone off, it changes to an alert state. It's pretty simple: the security value, as you can see below, changes from false to true on the right side.
I'm now going to move on to the next two products we use in our solution. It's important for both maintenance and monitoring to store historical information such as engine hours, temperature changes, and security alerts, so in our platform we store data in two different ways: with both BigQuery and Cloud SQL. Cloud SQL is really useful for 30-day historical data, and we use it to get information pretty quickly, so the chart mostly uses Cloud SQL. BigQuery is useful for all-time historical data, and it allows us to do analytics at a large scale.
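The split just described is essentially a routing rule on how far back a query reaches. Here is a small Python sketch of that rule; the 30-day cutoff matches the talk, while the store labels are placeholders standing in for Cloud SQL and BigQuery clients.

```python
# Sketch of the hot/cold storage split: recent-window queries hit the
# fast relational store, all-time analytics go to the warehouse. Store
# labels are placeholders, not a real client API.
from datetime import datetime, timedelta, timezone
from typing import Optional

HOT_WINDOW = timedelta(days=30)

def pick_store(query_start: datetime, now: Optional[datetime] = None) -> str:
    """Route a historical query by how far back it reaches."""
    now = now or datetime.now(timezone.utc)
    return "cloud-sql" if now - query_start <= HOT_WINDOW else "bigquery"

now = datetime(2018, 5, 14, tzinfo=timezone.utc)
weekly_chart = pick_store(now - timedelta(days=7), now)    # recent window
yearly_stats = pick_store(now - timedelta(days=365), now)  # all-time analytics
```

The design point is that the app's charts stay snappy on the hot store while the warehouse absorbs unbounded history cheaply.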
There are other Google products, such as Bigtable, that can also store a lot of data, but BigQuery's low cost and ability to store a lot of data steered us toward this product.
And then finally, I'm going to talk about Google Cloud Functions. To give a bit of background: Cloud Functions are serverless, so most of the application code - the server-side logic - is written by the developer, but unlike with a traditional server, the developer does not do the DevOps. Instead, Google manages it: these functions run in stateless compute containers that are event-triggered and scale behind the scenes, which is great, and for an application like this with inconsistent request volume, scaling is something Google does really well.
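The event-triggered pattern can be illustrated with a handler shaped like a database-write trigger, written as a pure function so the alerting decision is testable without deploying anything. The event shape and channel names below are invented for illustration, not the exact Cloud Functions payload or the Firebase Cloud Messaging API.

```python
# Sketch of an event-triggered handler: given a before/after snapshot
# of a security flag, decide which notifications should go out. In
# production these would be sent via FCM, SMS, and email.

def on_security_write(event: dict) -> list:
    """Return notifications for a write event {'before': bool, 'after': bool}."""
    before, after = event.get("before"), event.get("after")
    if not before and after:   # alarm flipped false -> true
        return ["fcm:security-alert", "sms:security-alert", "email:security-alert"]
    return []                  # no transition, nothing to send

alerts = on_security_write({"before": False, "after": True})
```

Keeping the decision logic pure like this is what makes the serverless model pleasant: the platform owns scaling and retries, while the function stays a small, testable unit.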
So suppose a security alert goes off when someone breaks in - the cloud function can handle that load. And the really great thing about this solution for us is that, because we use Firebase, Cloud Functions can trigger on a Firebase write. So when one of our wireless sensors - which we don't have installed just yet - detects, say, a temperature crossing its threshold and writes to Firebase, we can do cloud alerting: detect the write to Firebase, send a message to the user, and even use Google's Firebase Cloud Messaging for native notifications on both Android and iOS.
And the final thing I wanted to do is talk about Siren Marine as a company and as an application. I've been working closely with Siren Marine for many months, and it's a really great company, so if you know any boaters or are just interested in their application, please visit Sirenmarine.com to learn more about their product and their offering. I'm going to pass it back to Yitaek now to talk about the Kubernetes architecture for Siren Marine.
Thank you, Sameer. So here we have the overall system architecture of the Siren Marine use case, which includes the products Sameer highlighted before, but also some other system-level products, like Kubernetes, that help with the production deployment.
The high-level takeaway here is that by using Google services, you can quickly build on and augment your core software to create a more scalable system.
One thing to note is that Google components are shown with their corresponding icons, while Leverage-specific components, such as the message processor and transponder, are not. I'll walk through each of those components now. Over on the very left, we have the sensors on the boat communicating with our platform via a cellular connection.
The first thing a message hits is the message processor, which is something Leverage has written. What it does is map external IDs, such as "Dan's boat" or "Sameer's boat," to internal IDs, and it routes those messages to the specific business logic that performs the geofencing and other alerting mechanisms Sameer demonstrated earlier.
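The message processor's two jobs - ID translation and routing - can be sketched like this. All of the IDs, route names, and handler outputs here are made up for illustration; the real Leverage component is of course more involved.

```python
# Sketch of the "message processor" role: map an external device ID to
# an internal one, then dispatch the message to the business logic that
# handles its kind. IDs and routes are hypothetical.

ID_MAP = {"dans-boat": "dev-001", "sameers-boat": "dev-002"}

ROUTES = {
    "gps": lambda m: "geofence-check:" + m["device"],
    "temp": lambda m: "threshold-check:" + m["device"],
}

def process(external_id, kind, payload):
    """Translate the ID, then dispatch to the route for this message kind."""
    device = ID_MAP.get(external_id)
    if device is None:
        return "dropped:unknown-device"
    handler = ROUTES.get(kind)
    if handler is None:
        return "dropped:unknown-kind"
    return handler({"device": device, **payload})

routed = process("dans-boat", "gps", {"lat": 41.45, "lon": -71.45})
```

Centralizing the external-to-internal mapping at the front door means every downstream service works only with stable internal IDs.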
Again, we use Cloud Functions there to spin up those resources only when needed, and we have wrapped that service in something we like to call Reasoners internally.
Next, all the messages go through something called the transponder - also something Leverage has written - which writes to all the databases. Again, the real-time database is Firebase, which updates the native app and can also support voice interfaces like Alexa in real time, while all the historical data goes to Cloud SQL and BigQuery, so when there's a need for historical data queries, such as graphing, that data can be pushed to the app as well.
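The transponder's fan-out can be sketched as one write hitting two kinds of sinks: a latest-state store and an append-only history. The plain Python containers below are stand-ins for Firebase, Cloud SQL, and BigQuery, which is an illustrative simplification.

```python
# Sketch of the "transponder" fan-out: each incoming reading updates
# the real-time store (for the app and voice interfaces) and is
# appended to the historical store (for charts and analytics).

class Transponder:
    def __init__(self):
        self.realtime = {}   # stands in for Firebase: latest state per device
        self.history = []    # stands in for Cloud SQL / BigQuery rows

    def write(self, device, reading):
        """Latest state overwrites; history appends."""
        self.realtime[device] = reading
        self.history.append({"device": device, **reading})

t = Transponder()
t.write("dev-001", {"temp_c": 22.1})
t.write("dev-001", {"temp_c": 21.5})
# realtime now holds only the latest reading; history holds both
```

Separating "current state" from "everything that ever happened" is what lets the app stay instantaneous while analytics queries still have the full record.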
Finally, as you can see, most of these run as various microservices, and we use Google's managed instance of Kubernetes, which is a container orchestrator that really helps with auto-scaling and self-healing. This greatly reduces the burden of your DevOps and site reliability needs, because Google's software handles much of that for you.
On the next slide, I want to quickly go over something we've thought about to increase security with Cloud IoT Core. Currently, what's missing in the system is a way to prevent someone from spoofing the device. What I mean by that is: say a bad actor gets access to your boat and wants to swap out the device you have installed with a fake device so they can disable alerts or disable location updates.
With the system we have currently, it's really hard to put that hardware security in place. What you get with Cloud IoT Core is the ability to pair the security updates that come with Google with a secure TPM element - for example, the Microchip ATECC series - so you can generate keys on the device itself.
The big advantage here is that there are no external communications, so at no point does a provisioner have to go in and insert the certificates and keys. To put it simply, the hardware device itself makes the keys, and those are then passed to Cloud IoT Core, so you have both hardware security and software security, and peace of mind that there's no way your device has been compromised. We're looking into this solution for the future to help beef up security on both the hardware and the software side.
And now I'll pass it back to Ryan who can help moderate questions.
Now we're going to open it up and go through some of the questions that were asked in the chat, as well as some questions that came in ahead of time, and pass them around for answers. One question that just came in, from Riley, asked if we could give a detailed breakdown of the cost differences between AWS, Microsoft Azure, and GCP. I think it'd be good to also touch on the differences between the three in general, so I believe Yitaek can jump into that.
Yeah, so I think the differences between the three cloud providers kind of stem from their business models. Microsoft has a huge influence in the enterprise sector, so if you're already using their Office suite or have a .NET or C# architecture, it really makes sense to just use them. We actually did initially begin building Siren Marine on the Azure architecture; however, we found some of the support to be lacking - for example, Kubernetes wasn't as ready, and some of the AI services we were likely to use for Siren Marine weren't ready - and finally, we really like Firebase, so that was one of the big reasons we switched over to Google.
Speaking of Google, we really respect their AI focus, so for some of the implementations we're looking into for the future, using their latest deep learning technologies or image recognition capabilities, Google really made sense as a services company. And finally, AWS is a really great platform company; in fact, they offer the most flexibility in terms of the VMs and services you want to use. So if you have an architecture in mind for which you don't really want a managed service, AWS can be a really great offering - you can customize and really fine-tune for the cost model that makes sense for you.
I certainly can't speak for Google, but I think we ran into the same problem. What we've been doing is using their gcloud libraries, or their Node-specific libraries - we like to use the Node.js library in the back end - but I believe you can also make direct REST calls for some of the functionality you might be looking for, so that could be a workaround until Google decides to add support in the future.
Thanks. A question came in from Terry: Did you use MQTT for device comms, and if not, why? I saw Ming reply a little bit to that, but I think we'll add a little on our end as well.
Sameer: For the Siren Marine use case, MQTT with IoT Core specifically just didn't exist when we did our implementation, though MQTT has existed for a while as a messaging protocol. We didn't use MQTT because the hardware uses UDP as its message spec, so that's what we were limited to. The hardware was made by another company called CalAmp and really gave Siren Marine everything they needed in terms of GPS and other functionality. So that's why we ended up using UDP.
Great, thanks. We also wanted to share some of the questions that came in prior to the webinar around this topic. One of them was: Did you find any pains in migrating? I think Sameer can answer that one as well.
So in terms of the migration process, there were definitely things we had to watch out for in terms of getting our platform ready for Kubernetes and scaling. I think the biggest pain was dealing with the UDP messages and load-balancing them, but otherwise it was a pretty painless process to get set up on Kubernetes.
Great, and the final question we have at the moment is: Can you provide more depth on the data encryption feature?
Yeah, so Google has decided that all data that goes into their platform is encrypted by default, and this is a difference from Azure and AWS, at least from what I know, the last time I checked. Google also likes to consider all their networks as one big computer, so whatever region you write to is the same: writing to a database in the central region is the same as writing to a database in the eastern region. They encrypt all the data that gets stored, and everything communicated between regions, and that's turned on by default, which is a really nice security feature.
Great. We also had one more question just come through: How much data cleaning was required upon ingest?
There is a good amount of data cleaning, but it really depends on the sensor and what data is coming in. For GPS there is some cleaning that depends on how many satellites we're getting: if we're not getting a good signal, we won't update the device and app; if we are, we will. Otherwise there isn't too much data cleaning on our end - there may be some in the firmware - but in terms of what we received, there wasn't a lot of cleaning for the other types of parameters.
Thank you! That pretty much sums up the majority of our questions. If you have any more, feel free to send them over right now. Other than that, the answers, as well as the slides and the full presentation, will be sent to everyone once we've wrapped up here today. I also have a completely free five-day email course on building solutions with GCP, which I highly recommend you sign up for. You get a new email every day, taking you through the process and all the benefits we talked about a little here today, as well as a deeper dive into the tech. I'll pass it over to Isabel to wrap things up.
I'll just say thank you again for joining us today, and thank you for engaging and asking questions in the chat. If any questions come up after this webinar, or if you're not watching live and have questions, feel free to send them over - there will be a way to ask questions after the webinar, and we've shared our email with all of you for that as well.
I really hope we've provided some valuable information as you build and scale your own IoT solutions. I hope you enjoyed the webinar, and like Ryan said, if you're interested in diving a little deeper into these products, you can sign up for the email course right there - I'll share it on the right-hand side of the screen - and it goes into the products in more detail. So that pretty much wraps it up for today. Thank you again for joining us, and reach out if you have any questions!