
Apple is Betting on Contextual Experiences

Users don’t interact with a mobile experience in a vacuum. Every interaction carries markers that can indicate what the user is trying to accomplish, such as location, time of day, preferences, and even the user’s historical usage. We refer to each of these as a facet of the user’s context. Applications that take advantage of the user’s context will soon dominate the iOS landscape. For evidence, you need look no further than Apple’s announcements of the iPhone 5s and the public release of iOS 7.

In short, Apple is betting big on contextual experiences, and a large part of its push on both the software and hardware sides is devoted to enabling these kinds of experiences.

The User Context

The more your phone knows about you and your environment, the more the experience can be tailored to your current needs. This phenomenon across the digital ecosystem is generally referred to as contextual design. So far we have seen it integrated into certain applications in small ways. One early experience that capitalized on the user’s location context was Apple’s own Apple Store app, which provided a different experience inside a store than it did when you were not near one. Most of the effective examples to date leverage the location context, but there are many more facets of the user’s context, including the activity context, temporal context, proximity context (which is distinct from the location context), preferential context, and historical context.

Apple is not new to the contextual experience space. Its inclusion of Passbook in iOS 6 was an initial piece of an overall strategy for implementing contextual experiences at the operating system level. Passbook only leverages the location and temporal contexts and doesn’t fit every situation, but it was a step in the right direction. And while it is an easy implementation for third-party developers who need to distribute tickets, coupons, and loyalty cards, it represents only a small piece of what is possible with today’s hardware and OS integration on iOS.

There are three key pieces of functionality from Apple’s recent announcements that showcase the emphasis Apple is putting on enabling developers to create these contextual experiences: Bluetooth iBeacons, the M7 Motion Coprocessor, and the new A7 chip.

Bluetooth iBeacons

The integration of tracking for Bluetooth 4.0 LE beacons in iOS 7 is perhaps the single most important new piece of functionality in iOS. As organizations strive to bridge the physical and digital channels, they have learned that GPS often cannot determine precisely where a user is within a physical environment. From a contextual design perspective, this is referred to as the user’s proximity context: the goal isn’t to determine exactly where a user is (as with the location context) but rather how close the user is to another object.
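In iOS 7 this capability surfaces through Core Location’s beacon region monitoring and ranging APIs. Here is a minimal sketch, in modern Swift with a hypothetical beacon UUID and region identifier, of how an app might watch for nearby beacons:

```swift
import CoreLocation

// Minimal sketch of beacon ranging with Core Location. The UUID and
// identifier below are hypothetical placeholders for your own beacons.
final class BeaconProximityMonitor: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let beaconRegion = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "com.example.dealership")

    func start() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        // Ranging reports every matching beacon in range, ordered by proximity.
        locationManager.startRangingBeacons(in: beaconRegion)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        // CLProximity (.immediate, .near, .far) is the proximity context:
        // not where the user is, but how close they are to the beacon.
        guard let nearest = beacons.first else { return }
        if nearest.proximity == .immediate || nearest.proximity == .near {
            print("Near beacon major=\(nearest.major) minor=\(nearest.minor)")
        }
    }
}
```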

What does this mean for application experiences?

It means we can connect the physical and digital in ways that simply weren’t possible before iOS 7. Imagine walking into a car dealership and approaching a car you were interested in. When you pulled out your phone, it would instantly display the information for the car you were standing next to. You could quickly see whether other models were available in a different color, save that car’s details, and even digitally signal that you need a salesperson at that specific car. Walking away from the dealership, you would have the same kind of information that is currently available only through online research. And all of this is seamless for the end user because the user’s proximity context was fully leveraged.

Major League Baseball already has a working implementation of this at certain stadiums, and I expect iBeacon implementations to explode in early 2014.

M7 Motion Coprocessor

Wearable technology is becoming an essential element of our culture. Devices like the Fitbit, Jawbone UP, and Nike+ FuelBand enable a new category of mobile experiences. With the introduction of the M7 Motion Coprocessor in the iPhone 5s, Apple has built this type of experience into the phone itself, along with direct integration for third-party applications. Application developers can now determine whether the user is standing, walking, running, or even riding in an automobile. Better yet, developers no longer have to work this out themselves with software algorithms; the heavy lifting is handled by the M7 chip.
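Core Motion exposes the M7’s classification through CMMotionActivityManager. A minimal sketch of subscribing to live activity updates:

```swift
import CoreMotion

// Minimal sketch of reading the M7's activity classification via
// Core Motion. Only M7-equipped hardware (iPhone 5s and later)
// reports activity data, hence the availability check.
let activityManager = CMMotionActivityManager()

func startObservingActivity() {
    guard CMMotionActivityManager.isActivityAvailable() else { return }
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity = activity else { return }
        if activity.running {
            print("User is running")
        } else if activity.walking {
            print("User is walking")
        } else if activity.automotive {
            print("User is in a vehicle")
        } else if activity.stationary {
            print("User is stationary")
        }
    }
}
```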

What does this mean for application experiences?

Applications that take advantage of the user’s activity context can now provide a level of precision not previously possible. Instead of having to build complex detection logic into a fitness tracking application, developers can seamlessly determine when the user starts to run and begin tracking this alongside the user’s location context. While the user is running, the application can provide information such as current speed and distance traveled. When the user becomes stationary or gets into an automobile, the application can prompt the user to complete the workout and store the data.
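Putting the two contexts together, that flow might look something like the following sketch. WorkoutTracker and finishWorkout() are illustrative names of my own, not Apple APIs:

```swift
import CoreLocation
import CoreMotion

// Sketch of the workout flow described above: begin GPS tracking when
// the M7 reports running, and wrap up when the user stops or drives.
final class WorkoutTracker: NSObject, CLLocationManagerDelegate {
    private let activityManager = CMMotionActivityManager()
    private let locationManager = CLLocationManager()
    private var isTracking = false

    func start() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        activityManager.startActivityUpdates(to: .main) { [weak self] activity in
            guard let self = self, let activity = activity else { return }
            if activity.running && !self.isTracking {
                // Activity context says a run began: start gathering
                // the location context alongside it.
                self.isTracking = true
                self.locationManager.startUpdatingLocation()
            } else if (activity.stationary || activity.automotive) && self.isTracking {
                self.isTracking = false
                self.locationManager.stopUpdatingLocation()
                self.finishWorkout()
            }
        }
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        // Accumulate distance and current speed from incoming fixes.
    }

    private func finishWorkout() {
        // Prompt the user to save the completed workout and store the data.
    }
}
```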

A7 Chip and an Improved Battery

For those not closely tied to the technology behind the device, the addition of a 64-bit chip may seem like a natural incremental improvement. However, the change to a 64-bit chip is also tied to one of the most crucial pieces of contextual applications: battery life. The Apple A7 chip is much more efficient than its predecessor. There are many reasons for this, but one main factor is the ARMv8 architecture with its A64 instruction set, which jettisoned the platform’s accumulated technical debt and was designed from scratch for 64-bit. The result is greater performance at a much lower rate of power consumption.

The future of contextual experiences is tethered to efficient power consumption and overall battery life. We now have the capability to track location with extremely accurate GPS, detect continuous motion, and even continually track the location of objects near us. On most modern devices, however, that continual activity would drain the battery in an hour or two. This is why power consumption is so important: we can’t fully capitalize on this data until we can do all of this and still have our phones last all day.
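Until then, developers have to ration power deliberately. One common pattern, sketched below, is to prefer Core Location’s significant-change monitoring over continuous GPS whenever coarse location is enough:

```swift
import CoreLocation

// Sketch: trading precision for battery life. Significant-change
// monitoring wakes the app only on large movements (roughly
// cell-tower granularity) instead of running the GPS continuously.
let manager = CLLocationManager()
manager.requestAlwaysAuthorization()
if CLLocationManager.significantLocationChangeMonitoringAvailable() {
    manager.startMonitoringSignificantLocationChanges()
} else {
    // Fall back to standard updates at coarse accuracy, which still
    // draws far less power than kCLLocationAccuracyBest.
    manager.desiredAccuracy = kCLLocationAccuracyHundredMeters
    manager.startUpdatingLocation()
}
```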

What does this mean for application experiences?

The addition of the A7 chip should mean that developers can create experiences that consider the user’s context more often. While we still haven’t reached the promised land of never having to worry about battery life, this is a step in the right direction.

Conclusion

Anyone developing mobile experiences should be thinking about contextual experiences. All signs point toward a major push on contextualization across native mobile platforms as well as the mobile web. With Apple’s current emphasis, it is clear that now is the time for organizations and third-party developers to begin capitalizing on the user’s context to deliver customized, relevant experiences for the end user.