Beyond Music, AirPods, and UX

We’ve had some time to think about the event, and we’ve noticed something interesting about Apple’s new focus. We already know they’re working to create a seamless user experience across their suite of products, but now there are new opportunities to engage with users when they are not actively interacting with a device.


A Strategist’s Perspective on Apple’s September Event

Although there were few surprises at Apple's annual, hardware-focused September event, the announcement of AirPods may signal a new form of UX. The AirPods have a brain (the W1 chip) and sensors (an infrared sensor and an accelerometer) that operate independently of the phone. Today, those sensors are only used to detect whether the earbuds are being worn or to register a tap, but the future potential is undeniable. AirPods will one day be much more than earbuds that simply deliver audio.
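
There is no public API for the AirPods' sensors themselves, but an iOS app can already react when wireless earbuds become the active audio route. Here is a minimal Swift sketch, assuming a standard AVAudioSession setup; it treats any Bluetooth A2DP output as "earbuds in use."

```swift
import AVFoundation

final class RouteObserver: NSObject {
    override init() {
        super.init()
        // Route changes fire when earbuds connect, disconnect, or the system
        // reroutes audio (e.g. ear detection pausing playback).
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(routeChanged),
            name: AVAudioSession.routeChangeNotification,
            object: nil
        )
    }

    @objc private func routeChanged(_ note: Notification) {
        let outputs = AVAudioSession.sharedInstance().currentRoute.outputs
        // bluetoothA2DP covers AirPods and most other wireless earbuds.
        let earbudsActive = outputs.contains { $0.portType == .bluetoothA2DP }
        print(earbudsActive ? "Audio is routed to wireless earbuds"
                            : "Audio is routed elsewhere")
    }
}
```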

Imagine the Maps app whispering in your ear to turn left, or your calendar app reminding you of your next meeting and who will be in it. Push notifications could be read to you. We often assume augmented reality involves visuals; what would an audio augmented-reality experience be? As we move toward more screenless interactions, what other interactions might migrate to earbuds?
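
Reading a notification aloud is already possible with the system text-to-speech engine. Below is a minimal Swift sketch using AVSpeechSynthesizer; the notificationText string is a hypothetical stand-in for whatever payload your app actually receives.

```swift
import AVFoundation

// Hypothetical payload; in practice this would come from your push handler.
let notificationText = "Your three o'clock meeting starts in ten minutes."

let utterance = AVSpeechUtterance(string: notificationText)
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
utterance.rate = AVSpeechUtteranceDefaultSpeechRate

// Keep a strong reference to the synthesizer in a real app; speech stops
// if it is deallocated mid-utterance.
let synthesizer = AVSpeechSynthesizer()
synthesizer.speak(utterance)
```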

Most interesting is what this means for the user. There will not be a single killer wearable; users will choose the wearable that fits the situation and environment. As developers (and brands), we must support the devices our users choose. As we expand delivery to new devices and platforms, will the functionality and content you currently provide to your users (your APIs) be flexible enough to support AirPods, smartwatches, bracelets, glasses, and whatever comes next? One way to prepare is sketched below.
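
One way to prepare is to keep the content model device-neutral and let each device decide how to render it. This Swift sketch is purely illustrative; ContentItem, ScreenRenderer, and AudioRenderer are hypothetical names, not a published API.

```swift
// A neutral payload the API can serve to any device.
struct ContentItem {
    let title: String
    let body: String
    let spokenSummary: String   // short form for audio-only devices
}

protocol ContentRenderer {
    func present(_ item: ContentItem)
}

struct ScreenRenderer: ContentRenderer {
    func present(_ item: ContentItem) {
        print("[screen] \(item.title)\n\(item.body)")
    }
}

struct AudioRenderer: ContentRenderer {
    func present(_ item: ContentItem) {
        // On a real device this would hand spokenSummary to a TTS engine.
        print("[audio] \(item.spokenSummary)")
    }
}

// The API serves one payload; each device renders what it can support.
let item = ContentItem(
    title: "Standup",
    body: "Daily standup at 9:30 with Dana and Lee.",
    spokenSummary: "Standup at nine thirty with Dana and Lee."
)
let renderers: [ContentRenderer] = [ScreenRenderer(), AudioRenderer()]
renderers.forEach { $0.present(item) }
```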

Contact us to find out more about these new points of interaction and how to leverage them to provide a better user experience.
