News From Google I/O: Android Everywhere
Sitting in the Google I/O keynote yesterday, one thing became apparent to me: Android is everywhere. I was well aware that Android is used all over the world, but very soon Android will not just be on many people’s phones in many diverse locations — it will be on many of the devices a single person interacts with every day.
Watches and Cars and Televisions, Oh My!
New modalities of interaction with Android are appearing at a rapid pace. Google has announced that you will now (or soon) be able to use Android in the following new ways in addition to using Android-powered phones, tablets or Glass:
On Your Wrist
Three smart watch models are available (two are ready for purchase now and one is available for preorder). Each is powered by Android Wear. (For more info about wearables, see Kurt Nelson’s post on the state of the Weariverse.)
In Your Car
According to the keynote, more than 40 new members have joined the Open Automotive Alliance, and “the rubber will hit the road” when the Android Auto SDK becomes available with the public “L” release later this year.
On Your Television
Chromecast was announced last year, allowing users to stream content from a device to their television. Yesterday, Google announced Android TV (not to be confused with Google TV from 2010), which supports streaming content from your phone or tablet, along with an integrated experience for playing content directly from the television itself and interacting with apps developed expressly for TV using the Android TV SDK.
On Your Laptop
While it seems farther off, Sundar Pichai announced in the keynote that Google is working on supporting interaction between Android apps and Chrome OS. In the short term, your Chromebook will be able to surface information from your phone, such as a low-battery warning or an incoming call.
Unifying the User Experience
In response to the growing number of Android interaction options available, a common theme emerged in the keynote and again in the various talks I attended yesterday: unifying a user’s experience as they transition between modes or devices.
Of course, converging data using the cloud is old news. The focus here was on syncing a user’s interactions across all their devices. For example:
I heard in three different talks that, with the new Android Wear APIs, developers will be able to easily offer users unified Android notifications, managed primarily by the phone and presented through Android Wear on auxiliary devices. A user sees notifications from their phone apps on their watch, Glass or even their Chromebook, and if the user dismisses a notification in one place, it is also dismissed on the others. Key for me was that “no extra work is needed on the developer’s part.” Also of great interest is that developers can customize these notifications in an “enhanced” fashion (e.g., we can display player controls for an audio player).
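From what was shown, this bridging happens through the existing support-library notification classes. The sketch below is my reading of it, not code from the talks: the class name, title and icon are placeholders, and it assumes an Android project with the v4 support library. Posting a notification this way is all that’s needed for it to appear on a paired Wear device, with the `WearableExtender` carrying any watch-only customizations.

```java
import android.app.Notification;
import android.content.Context;
import android.support.v4.app.NotificationCompat;
import android.support.v4.app.NotificationManagerCompat;

public class WearNotificationSketch {

    // Hypothetical helper: posts one notification on the phone.
    // The system bridges it to a paired Wear device automatically;
    // dismissing it on either device dismisses it on both.
    static void postBridgedNotification(Context context) {
        Notification notification = new NotificationCompat.Builder(context)
                .setSmallIcon(android.R.drawable.ic_media_play)
                .setContentTitle("Now playing")
                .setContentText("Example track")
                // Wear-only enhancements live in the WearableExtender and
                // are ignored on the phone's own notification shade.
                .extend(new NotificationCompat.WearableExtender()
                        .setHintHideIcon(true))
                .build();

        // NotificationManagerCompat handles the bridging to the watch.
        NotificationManagerCompat.from(context).notify(1, notification);
    }
}
```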
Apps installed on your phone are automatically synced with your Wear device, which installs its own compatible version of each app. A demo of the Allthecooks app showed that you can open a recipe on your phone and then view a single step at a time on your Wear-powered watch.
Controlling One Device from Another
Several demos (including the Allthecooks demo) highlighted the importance of allowing users to control what’s happening on one device through another when it’s more convenient. You can use your watch to control music playing on your phone, use your phone to control Android TV content selection or use your Chromebook to take over your Evernote interactions.
Leveraging a User’s Multiple Devices for Context Awareness
One neat feature discussed in the keynote was using devices to provide authentication validation. For example, if your watch is on your wrist or your phone is in your hand, your Android-powered devices can assume they are in a “safe” context and allow you to interact with them without a PIN verification. I’m excited to see other ideas that grow from the premise that a single user has multiple Android devices.
To create this unified experience, Google is adding API support for the following in Google Play Services 5.0:
- Message API: Send unidirectional messages between a wearable and a phone (for example, telling your phone to skip to the next song).
- Asset API: Transfer binary blobs (such as images) between a wearable and another device. For example, your phone could download an image, resize it to match the wearable’s resolution and send it over as an asset.
- DataItem API: Keep data in sync between the wearable and another device, so that a change on one side becomes available on the other.
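To make the three APIs above concrete, here is a rough sketch of how I expect they will be used together. This is an illustration, not code from the sessions: it assumes a connected `GoogleApiClient` with the Wearable API enabled, and the paths (`/skip-track`, `/now-playing`), node ID and payloads are made up.

```java
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.wearable.Asset;
import com.google.android.gms.wearable.PutDataMapRequest;
import com.google.android.gms.wearable.PutDataRequest;
import com.google.android.gms.wearable.Wearable;

public class WearDataLayerSketch {

    // Message API: fire a one-way message at a connected node (here, the
    // phone). nodeId would come from a node lookup; "/skip-track" is a
    // path invented for this example.
    static void sendSkipTrack(GoogleApiClient client, String nodeId) {
        Wearable.MessageApi.sendMessage(client, nodeId, "/skip-track", null);
    }

    // DataItem + Asset APIs: put a data item, with an image attached as an
    // asset, into the shared data layer. Play Services syncs it to the
    // wearable; a change on one side becomes available on the other.
    static void syncCoverArt(GoogleApiClient client, byte[] resizedImage) {
        Asset coverArt = Asset.createFromBytes(resizedImage);
        PutDataMapRequest dataMap = PutDataMapRequest.create("/now-playing");
        dataMap.getDataMap().putString("title", "Example track");
        dataMap.getDataMap().putAsset("cover", coverArt);
        PutDataRequest request = dataMap.asPutDataRequest();
        Wearable.DataApi.putDataItem(client, request);
    }
}
```

Note the division of labor this implies: messages are fire-and-forget commands, while data items are persistent, synced state with assets riding along for large binaries.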
I’m looking forward to learning more about these new platforms and incorporating them into our Android bootcamps. I hope you will join us—solid Android knowledge will make developing for any of these new platforms easy!