The Internet of Things opens a new world of experiences and deep connectivity for both consumers and businesses. Always-on sensor streams, big data, and machine learning are dramatically changing the kinds of experiences software developers are building. These changes require those of us who build software to think differently about what software is and how it is composed. Software is no longer just a set of simple, database-centric CRUD interfaces. Multi-Sensory Applications are systems that connect all the things: IoT, business processes, data analytics, and immersive user experiences.
Before we dive into a very simple MSA that connects IoT data to a back-office business process, go read about the Principles of Multi-Sensory Applications so that you understand the basic ideas.
OK, now let's walk through an MSA that uses location sensor data and wraps a business process around it. We will start with the basics and add more senses and transducers in future blog posts. This example focuses on a rental bike business that needs to address a problem with bikes periodically going missing. Here is the business process:
- A customer rents a bike and GPS tracking is enabled
- Throughout the rental, the bike transmits its current location to the cloud
- If no information about the bike’s location has been received for 30 minutes, a support case is created, which includes the last known location of the bike
- An employee can then go investigate the disappearance
Before we go into the technical architecture, here is a short video demonstrating this MSA:
The senses and transducers needed to build this MSA are:
- Identity Sense: Bike shop associate
- Device Input Sense: Bike GPS location
- User Input Senses: Rental begins & ends
- Storage Transducers: Rental bikes, reservations, & support cases
- Integration Transducer: Associate bike location with rental bike & reservation
- Learning Transducer: Creates support cases for lost bikes
- Notification Transducer: Emails bike shop associate when a support case is created
Breaking this down into individual components and microservices:
- Back-office services & data on Salesforce:
- Bike shop associate identity
- Rental business data (inventory, reservations, & support cases)
- Periodic lost bike checker
- Rental bike tracker:
- Raspberry Pi with GPS
- Network connectivity
- Location data ingest microservice:
- Web endpoint on Heroku
- Connected to Salesforce via REST APIs
Putting it all together, here is what happens:
- A bike shop associate creates a reservation record in Salesforce
- When a rental begins, the Raspberry Pi feeds its location every minute to the ingest microservice
- The ingest microservice on Heroku updates the latest location on the reservation in Salesforce
- A scheduled job on Salesforce periodically checks for active reservations that have not received location data in the past 30 minutes
- A support case is created for each lost bike and assigned to a bike shop associate
Let's walk through the code to put this all together.
On Salesforce I created two new custom objects that hold the bike rental inventory and reservations. I also added an association between the Case object and the rental bike reservation, which allows support cases to be created and linked to a specific reservation. The bike rental reservation also has fields that hold the last known location. Here is the full schema:
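Roughly, the data model boils down to something like this (the names below are illustrative; the actual field API names are defined in the Salesforce schema):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Illustrative mirror of the custom objects; not the real API names.

@dataclass
class RentalBike:
    name: str                                 # e.g. "Bike 42"

@dataclass
class BikeReservation:
    rental_bike_id: str                       # lookup to the Rental Bike record
    status: str                               # e.g. "Active" or "Complete"
    last_latitude: Optional[float] = None     # last known location from the GPS pings
    last_longitude: Optional[float] = None
    last_location_time: Optional[datetime] = None

@dataclass
class SupportCase:
    bike_reservation_id: str                  # lookup added to the standard Case object
    subject: str
```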
I also created a custom Visualforce page that renders a bike's last known location on Google Maps:
Finally, for the back-office side, I created a schedulable Apex class that creates support cases for active bike rentals which have not had a new location update in the past 30 minutes:
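The class itself is Apex, but the core logic is easy to follow. Here is a rough sketch of the same query-and-create flow in Python using the simple_salesforce library (the credentials and the custom object and field API names are placeholders, and the real check runs inside Salesforce as scheduled Apex rather than as an external script):

```python
from datetime import datetime, timedelta, timezone

from simple_salesforce import Salesforce

# Placeholder credentials for illustration only
sf = Salesforce(username="associate@example.com",
                password="secret",
                security_token="token")

# A reservation is "lost" if no location ping has arrived in the last 30 minutes
cutoff = (datetime.now(timezone.utc) - timedelta(minutes=30)).strftime("%Y-%m-%dT%H:%M:%SZ")

soql = ("SELECT Id FROM Bike_Reservation__c "
        "WHERE Status__c = 'Active' "
        f"AND Last_Location_Time__c < {cutoff}")

for reservation in sf.query_all(soql)["records"]:
    # Open a support case tied to the reservation so an associate can investigate
    sf.Case.create({
        "Subject": "Possible lost bike",
        "Bike_Reservation__c": reservation["Id"],
    })
```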
For the microservice on Heroku, I used Play Framework and Scala to create a web endpoint that takes a location ping and updates the rental bike's location on Salesforce. Here is the controller code that handles the POST request:
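If Scala is not your thing, the endpoint's behavior boils down to something like this Python/Flask sketch (the route, payload fields, environment variables, and Salesforce field API names are illustrative; a real deployment would obtain the Salesforce credentials via OAuth):

```python
import os
from datetime import datetime, timezone

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

# Placeholder configuration for the Salesforce REST API
SF_INSTANCE_URL = os.environ["SALESFORCE_INSTANCE_URL"]
SF_ACCESS_TOKEN = os.environ["SALESFORCE_ACCESS_TOKEN"]
API = f"{SF_INSTANCE_URL}/services/data/v37.0"
HEADERS = {"Authorization": f"Bearer {SF_ACCESS_TOKEN}"}


@app.route("/location", methods=["POST"])
def location():
    ping = request.get_json()  # e.g. {"bikeId": "...", "lat": 37.79, "lon": -122.39}

    # Find the active reservation for this bike
    soql = ("SELECT Id FROM Bike_Reservation__c "
            f"WHERE Rental_Bike__c = '{ping['bikeId']}' AND Status__c = 'Active' LIMIT 1")
    result = requests.get(f"{API}/query/", params={"q": soql}, headers=HEADERS).json()
    if not result.get("records"):
        return jsonify({"error": "no active reservation for bike"}), 404

    # Update the reservation's last known location via the Salesforce REST API
    reservation_id = result["records"][0]["Id"]
    resp = requests.patch(
        f"{API}/sobjects/Bike_Reservation__c/{reservation_id}",
        headers=HEADERS,
        json={
            "Last_Latitude__c": ping["lat"],
            "Last_Longitude__c": ping["lon"],
            "Last_Location_Time__c": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
        },
    )
    resp.raise_for_status()
    return jsonify({"status": "ok"})
```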
Finally there is a small Python app that runs on the Raspberry Pi and feeds the data to the microservice on Heroku:
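A stripped-down version of that script looks something like this (assuming the GPS breakout shows up as a serial device and the pyserial, pynmea2, and requests libraries are installed; the bike ID and endpoint URL below are placeholders):

```python
import time

import pynmea2   # NMEA sentence parser
import requests
import serial    # pyserial

BIKE_ID = "a01300000000001"                                # placeholder rental bike record ID
INGEST_URL = "https://bike-ingest.example.com/location"    # placeholder Heroku endpoint


def read_fix(port):
    """Block until an NMEA sentence with a usable fix arrives and return (lat, lon)."""
    while True:
        line = port.readline().decode("ascii", errors="ignore").strip()
        if line.startswith("$GPGGA") or line.startswith("$GPRMC"):
            try:
                msg = pynmea2.parse(line)
            except pynmea2.ParseError:
                continue
            if getattr(msg, "latitude", 0.0) and getattr(msg, "longitude", 0.0):
                return msg.latitude, msg.longitude


def main():
    port = serial.Serial("/dev/ttyUSB0", 9600, timeout=5)
    while True:
        lat, lon = read_fix(port)
        # POST the ping to the ingest microservice on Heroku
        requests.post(INGEST_URL, json={"bikeId": BIKE_ID, "lat": lat, "lon": lon})
        time.sleep(60)  # one ping per minute


if __name__ == "__main__":
    main()
```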
The Python script has a hard-coded ID for the rental bike and a hard-coded HTTP endpoint. This could be designed differently using one of the many IoT gateway products out there, but I chose to keep things simple for this example.
Here is what the Raspberry Pi hardware looks like:
I used the Adafruit Ultimate GPS Breakout via USB, following the Adafruit instructions to set everything up. For networking I used an Edimax USB WiFi adapter. If I were building this for the real world, I would use a GPRS modem with a prepaid SIM card.
That’s it for the code! You can build and deploy all of this stuff on your own! Check out the development & deployment instructions.
Going Further
This example Multi-Sensory Application is just the start of what we can do when we connect IoT devices to business processes and customer experiences. In future articles I will add an Analysis Transducer that could use the aggregate location data to determine the most popular routes. I will also show how customer experiences can be built on top of this foundation to deliver things like social sharing, so that bike renters can share their routes with their friends.
As software evolves from being CRUD-centric to being Multi-Sensory, developers will be faced with new paradigms and architectures for connecting everything. Stay tuned as we further explore how to build modern, Multi-Sensory Applications using technologies like IoT, microservices, big data, and machine learning.