Yesterday, we released the Salesforce Wear Developer Pack – a set of open-source starter apps that let you quickly design and build wearable apps that connect to the Salesforce1 Platform. The pack includes sample/reference apps for six wearable devices – Google Glass, Android Wear, Samsung Gear 2, Myo, Nymi and Pebble Watch. In each case, we’ve built a fully functional end-to-end app that connects securely to the Salesforce1 Platform to query and/or update customer data. Behind every one of these devices is a customer, and the Wear Pack makes it easier for developers to build apps that interact with that customer.

It’s not often that you get to build apps for six different wearable devices (trying my hand at the art of British understatement). We had a lot of fun building these apps. James (Android Wear), Raja (Samsung Gear), Wade (Nymi), Brian and Ammar (Myo) and Dan (Pebble) got to play with cool geek-toys and built innovative apps that push the boundaries of “what’s possible”. It wasn’t all fun and games though. Generally speaking, wearables are very much on the bleeding edge when it comes to development platforms. There are many challenges unique to a wearable context, and the “plumbing” required to connect one of these devices to a cloud platform like Salesforce is not trivial. That was in fact the primary intent of the Wear Pack – give developers the underlying plumbing (via the open-source reference apps) so that they can focus on building innovative enterprise solutions for these wearable devices. I thought I would distill and share some of our learnings from developing the Salesforce Wear Developer Pack. But first, here is the tl;dr-friendly version of what these six devices support in terms of a development environment.

So what did we learn from developing apps across such a diverse landscape? Thought you’d never ask.

  • Security/privacy: There are security and privacy issues unique to certain wearable devices and scenarios. For example, notifications sent from an Android phone to an Android Wear watch have no built-in security layer. Anyone with access to the watch can see the notification and take action (e.g., approve/reject a quote discount). This is unlike mobile devices, where you can use features like screen locks, MDM, etc. to secure the phone/tablet. Developers are therefore responsible for implementing the appropriate security and privacy controls in their wearable app.
  • Identity: One of the most challenging parts of developing for any wearable device is figuring out how user identity and authentication work. The way users authenticate with the wearable app varies widely across devices and is often challenging to set up. For example, in the case of Glass, a user authenticates via OAuth as part of installing the app (vs. when they launch the app). As shown in the above flow, user authentication is initiated when a user installs a Glass app (referred to as Glassware) on their device via the MyGlass portal. This installation step is done on the user’s phone or desktop (not on the Glass device). In the case of our Field Inspection app, this redirects the user to the Salesforce login page, where they authenticate themselves. We then make a callout to the Google Mirror API (via Apex) to send over the user’s access and refresh tokens. Finally, Google pushes those tokens to the app running on the Glass device, and the app is able to make API calls on behalf of the user.

In the case of Android Wear and Samsung Gear, the user authenticates on the companion phone app (via the built-in OAuth support in the Salesforce Android SDK). Since Nymi is a biometric identity device, it naturally has its own proprietary identity flow.
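
Here is roughly what that companion-app pattern looks like with the Salesforce Mobile SDK for Android. This is a minimal sketch, not the code from the Wear Pack: the activity name, the SOQL query and the sendToWearable() helper are illustrative, and package paths vary across SDK versions. The SDK drives the OAuth login itself and hands an authenticated RestClient to onResume().

```java
import java.io.UnsupportedEncodingException;

import org.json.JSONObject;

import android.util.Log;

import com.salesforce.androidsdk.rest.RestClient;
import com.salesforce.androidsdk.rest.RestRequest;
import com.salesforce.androidsdk.rest.RestResponse;
import com.salesforce.androidsdk.ui.sfnative.SalesforceActivity;

// Companion phone app: SalesforceActivity triggers the OAuth login flow
// if needed, then calls onResume() with an authenticated RestClient.
public class QuoteActivity extends SalesforceActivity {

    @Override
    public void onResume(RestClient client) {
        try {
            // Illustrative SOQL: fetch the records the wearable app needs.
            RestRequest request = RestRequest.getRequestForQuery(
                    "v30.0", "SELECT Id, Name, Discount FROM Quote LIMIT 10");
            client.sendAsync(request, new RestClient.AsyncRequestCallback() {
                @Override
                public void onSuccess(RestRequest request, RestResponse response) {
                    try {
                        JSONObject json = response.asJSONObject();
                        // Hypothetical helper: relay a trimmed-down payload
                        // to the watch (see the Data Layer sketch further down).
                        sendToWearable(json.toString());
                    } catch (Exception e) {
                        Log.e("QuoteActivity", "Failed to parse response", e);
                    }
                }

                @Override
                public void onError(Exception exception) {
                    Log.e("QuoteActivity", "REST call failed", exception);
                }
            });
        } catch (UnsupportedEncodingException e) {
            Log.e("QuoteActivity", "Could not encode query", e);
        }
    }

    private void sendToWearable(String payload) {
        // Placeholder: push the payload to the watch over BLE/Data Layer.
    }
}
```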

  • UI/UX: The old mantra for building mobile apps was that mobile UI/UX != desktop UI/UX. Similarly, wearable UI/UX != mobile UI/UX. New UX patterns and best practices have to be developed for engaging with these new wearable devices. For example, Google Glass uses a card-based timeline UI where information is presented to the user through “cards,” and users navigate different sections of the timeline to reveal cards in the past, present, and future. Android Wear uses a card-based UI pattern similar to Google Now: users swipe vertically to navigate from card to card and can also swipe horizontally on a card to reveal additional information or actions (called “pages”). In addition to new UI/UX patterns, wearables also require more targeted and subtle user-interaction patterns than mobile apps. For example:
    • Only the most critical notifications sent to a phone/tablet should also be sent to a wearable device. Wearable notifications should for the most part be actionable, not merely FYI.
    • Apps for wearable devices like Glass, Samsung Gear and Android Wear should accept voice input wherever possible. Keypads are either tiny or non-existent, so voice input trumps text input. In the Android Wear app, for example, a user can dictate comments when rejecting a Quote discount (see the sketch after this list).
    • Navigation within an app can be challenging in some cases. Glass users can navigate back and forth by swiping up/down or left/right, but it’s not as simple or intuitive as hitting a “Back” button in a mobile app. Developers should therefore try to keep navigation within their app to a minimum. The more targeted the use case, the better.
    • In the case of wearable devices that pair with a phone (e.g., Android Wear), apps can include an “Open on Phone” option to let users see additional information/context. This UX pattern is equivalent to how mobile apps can direct users to the desktop app for an additional level of drill-down/detail.
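
To make the voice-dictation example concrete, here is a minimal sketch of an actionable Android Wear notification using the v4 support library. The icon resources, the EXTRA_VOICE_REPLY key, the notification text and the rejectIntent are all illustrative; in a real app the PendingIntent would route the decision (and the dictated comment) back to Salesforce.

```java
import android.app.Notification;
import android.app.PendingIntent;
import android.content.Context;
import android.support.v4.app.NotificationCompat;
import android.support.v4.app.NotificationManagerCompat;
import android.support.v4.app.RemoteInput;

public class DiscountNotifier {
    // Key under which the dictated text is returned (illustrative).
    public static final String EXTRA_VOICE_REPLY = "extra_voice_reply";

    // rejectIntent (built elsewhere) posts the decision, plus the
    // dictated comment, back to Salesforce via the phone app.
    public static void notifyDiscountRequest(Context context, PendingIntent rejectIntent) {
        // Attach voice input to the "Reject" action so the watch
        // prompts the user to dictate a comment instead of typing one.
        RemoteInput remoteInput = new RemoteInput.Builder(EXTRA_VOICE_REPLY)
                .setLabel("Reason for rejection")
                .build();

        // R.drawable.* below are app resources (illustrative).
        NotificationCompat.Action rejectAction =
                new NotificationCompat.Action.Builder(
                        R.drawable.ic_reject, "Reject", rejectIntent)
                        .addRemoteInput(remoteInput)
                        .build();

        Notification notification = new NotificationCompat.Builder(context)
                .setSmallIcon(R.drawable.ic_quote)
                .setContentTitle("Discount approval requested")
                .setContentText("Quote Q-00042: 25% discount")
                // Actions added via the WearableExtender appear on the watch.
                .extend(new NotificationCompat.WearableExtender()
                        .addAction(rejectAction))
                .build();

        NotificationManagerCompat.from(context).notify(1, notification);
    }
}
```

When the user dictates a reply, it comes back in the action’s intent and can be read with RemoteInput.getResultsFromIntent(intent).getCharSequence(EXTRA_VOICE_REPLY).
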
  • Smartwatches: Not every smartwatch is equally “smart”. Android Wear and Samsung Gear 2 provide relatively full-featured development environments where apps can have visually complex UIs, multiple screens, etc. Pebble, on the other hand, supports a very basic, text-based UI.
  • Data flow: Some devices, like Android Wear and Glass, support two-way data flow to and from the device. Others, like Myo and Nymi, only support one-way data flow from the device to the app. The potential use cases/scenarios for a wearable device therefore depend on the supported direction(s) of data flow.
  • Testing: Testing apps for wearable devices is more challenging than testing mobile apps. Emulators/simulators don’t exist for some devices (e.g., Glass, Nymi and Myo), while others, like Android Wear, have emulators but require tethering to a physical phone in order to test.
  • Connectivity: Some devices like Glass can connect to Wi-Fi directly, but most devices currently need to be paired with a phone app via Bluetooth Low Energy (BLE) for internet connectivity. For developers, this necessitates a unique app architecture whereby the phone app is responsible for aggregating multiple API calls and returning a consolidated, minimal data stream (often in JSON format) to the wearable device. Business logic for the most part does not reside on the wearable device; instead, it typically resides server-side in the cloud or in the phone app. The figure above describes this high-level architecture, and the sketch below shows the phone-to-watch handoff.
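
On Android Wear, that phone-to-watch handoff can use the Wearable Data Layer API in Google Play services. A minimal sketch, assuming the phone app has already assembled the consolidated JSON string; the /quotes path and payload key are illustrative:

```java
import java.util.concurrent.TimeUnit;

import android.content.Context;

import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.wearable.PutDataMapRequest;
import com.google.android.gms.wearable.Wearable;

public class WearableRelay {
    // Phone side: push the consolidated JSON payload to the watch.
    public static void sendToWearable(Context context, String consolidatedJson) {
        GoogleApiClient client = new GoogleApiClient.Builder(context)
                .addApi(Wearable.API)
                .build();
        // Sketch only: a real app would connect asynchronously via callbacks.
        client.blockingConnect(10, TimeUnit.SECONDS);

        // Data items are synced to all connected wearable nodes.
        PutDataMapRequest request = PutDataMapRequest.create("/quotes");
        request.getDataMap().putString("payload", consolidatedJson);
        Wearable.DataApi.putDataItem(client, request.asPutDataRequest());
    }
}
```

On the watch side, a WearableListenerService (or a DataApi listener) receives the data item and renders it.
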
  • APIs: API availability is uneven across wearable devices. Glass supports the Mirror API (REST) to trigger user notifications, and Fitbit supports APIs to retrieve a user’s activity data, but the other devices that we worked with for the Wear Pack don’t provide a public API (yet). Developing apps on the respective device platform is currently the only way to customize these devices. Inserting a card into a user’s Glass timeline, for example, is a single authenticated REST call (sketched below).
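
In our Glass app this callout happens in Apex, as described earlier; the equivalent HTTP request is sketched here in Java for illustration. The card text is illustrative, and accessToken is assumed to be a valid OAuth token authorized for the Mirror API:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class MirrorApiClient {
    // Insert a timeline card into Glass via the Mirror API (REST).
    public static void insertTimelineCard(String accessToken) throws Exception {
        URL url = new URL("https://www.googleapis.com/mirror/v1/timeline");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Authorization", "Bearer " + accessToken);
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        // Illustrative card: Glass renders this as a new timeline item.
        String card = "{\"text\": \"Field inspection assigned: Site #42\"}";
        try (OutputStream out = conn.getOutputStream()) {
            out.write(card.getBytes("UTF-8"));
        }

        int status = conn.getResponseCode(); // expect 2xx on success
        conn.disconnect();
    }
}
```
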
  • Gaming environment: In the case of devices like Myo, Oculus and Meta Space Glasses, the actual device is merely a brush (i.e., hardware). You need a canvas (i.e., a development platform) like the Unity3D gaming engine to develop in. Gaming developers/enthusiasts have a bright future in wearables! And if you happen to be a .NET developer, the best thing about a platform like Unity is that it’s all based on C#. You can even embed web pages inside a Unity application (similar to how WebViews can be embedded in native iOS/Android applications). In fact, the Myo app included in the Wear Pack includes some Visualforce pages that are rendered directly in the Unity app.
  • Combining wearable devices: The true power of wearables will be realized by combining multiple devices to provide unique user experiences. For example, by combining a gesture-controlled device like Myo with a virtual reality device like Oculus, developers can provide their users with a fully immersive and interactive experience. We didn’t get a chance to build an app that combines multiple wearables for the Wear Pack, but that’s where you come in! We’d love to see you build something compelling and unique with one, preferably more, of these devices and showcase it at Dreamforce.
  • Platform proliferation: Today, the wearable space is very similar to the early days of smartphones (circa 2007-08). There are lots of development platforms (Android, Tizen, C/C++, Unity3D, etc.) and no clear leader(s). While some form of consolidation is inevitable, for now developers have to build apps in multiple, diverse programming environments.
  • Bleeding edge: For the most part, wearables are on the bleeding edge. Developers need a lot of patience, perseverance and trial-and-error to develop apps for these nascent platforms. This will of course improve over time, but for now, get ready to do a lot of Googling!
  • It’s all kinds of fun: Despite the technical challenges, wearable development is a ton of fun! Geeks get to play with cool toys, build things that have never been attempted before and hack stuff together. What’s not to love about that?
