Design For Sensors, Not Screens
We’ve all heard the statement “Design Mobile First”. Maybe it’s time to say “Design for Sensors First”. Sensors are going to become more and more incorporated into what we design, scattered everywhere even more than they are now, from wearable devices to button-sized, low-powered beacons that transmit signals. We need to think about how we can start capturing these signals to help create that perfect, just-right, contextual experience for our users.
The blending of physical and digital experiences will become extremely powerful, even more so than it already is. Sensors are already invading retail spaces, pushing out massive amounts of data. If we layer customer buying habits on top of these sensor signals, we get a very good picture of why customers might be in the store and what they might be looking for. The ability to serve up “just in time” loyalty coupons or discounts will be frighteningly accurate.
When does all the noise become just too much? Imagine walking through the mall and having every store shout some sort of notification at you; that is not the experience any shopper wants. We need to be aware of the user’s context before we send them an overload of information. We can’t just throw everything at the wall and see what sticks: users will start to visually dismiss these notifications, just as they do most ads on websites today. As designers, we need to understand all the conditions in order to serve up the perfect “Goldilocks” experience for our users and customers.
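One way to think about the “Goldilocks” idea in practice is to gate every push notification on both relevance and frequency, so shoppers are not shouted at by every store they pass. The sketch below is purely illustrative; the class name, thresholds, and relevance score are all invented for this example, not part of any real notification platform.

```python
import time


class NotificationGate:
    """Hypothetical gate: only push an offer when it is relevant
    enough and the user hasn't been pinged too recently."""

    def __init__(self, min_interval_s=600, min_relevance=0.7):
        self.min_interval_s = min_interval_s  # quiet period between pushes
        self.min_relevance = min_relevance    # e.g. a match score against buying habits
        self.last_sent = {}                   # user_id -> timestamp of last push

    def should_send(self, user_id, relevance, now=None):
        """Return True only if the offer is relevant and the user isn't fatigued."""
        now = time.time() if now is None else now
        if relevance < self.min_relevance:
            return False  # too much noise: not worth interrupting the shopper
        last = self.last_sent.get(user_id)
        if last is not None and now - last < self.min_interval_s:
            return False  # too soon after the previous notification
        self.last_sent[user_id] = now
        return True
```

Even a simple frequency cap like this captures the core point: silence is a valid design decision, and most candidate notifications should never be sent.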
Privacy is something we as designers will face in every design decision we make moving forward. Good design has to strike a balance: users give up a piece of their privacy only in exchange for a genuinely better experience. That balance is essential to making users comfortable enough to share. The best thing to do is be honest with your users and let them know the benefits they receive in exchange for a piece of their privacy.
Self-awareness has sparked a huge surge of interest in wearable devices. What better way to capture this information than to attach a sensor to our bodies so we can track every step, every wink, and every pulse we take? Having this information at our fingertips when designing experiences is a huge asset in forming that contextual experience. Understanding the user in specific contextual situations is something we will learn to leverage more and more.
Are you trying to find something, or do you want something to find you? Geolocation finds the position of an object or place; proximity tells you whether you are close to an object. Together they allow you to listen for and serve up contextual information based on how close a user is to something, which is very powerful when creating a contextual experience. Take mobile checkout, for example: retailers now allow customers to scan products and check out in the store using their own mobile devices. Some low-margin grocery chains have started contextually changing their apps so checkout is only allowed at certain locations in the store, to help reduce losses.
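The proximity side of this can be sketched with the kind of distance zones that beacon frameworks such as Apple's iBeacon expose (immediate/near/far). Everything below is a hypothetical illustration: the thresholds, function names, and messages are invented, and a real app would get its distance estimates from the platform's beacon-ranging API rather than compute them itself.

```python
def proximity_zone(distance_m):
    """Map an estimated beacon distance (meters) to a proximity zone.
    Thresholds are illustrative, loosely echoing iBeacon-style zones."""
    if distance_m < 0:
        raise ValueError("distance cannot be negative")
    if distance_m < 0.5:
        return "immediate"  # e.g. standing at the shelf
    if distance_m < 3.0:
        return "near"       # e.g. somewhere in the aisle
    return "far"            # e.g. elsewhere in the store


def content_for(zone):
    """Serve contextual content only when the shopper is close enough;
    'far' deliberately returns None -- no notification at all."""
    return {
        "immediate": "Scan this item to check out on your phone",
        "near": "Products in this aisle are 10% off today",
    }.get(zone)
```

Keying the app's behavior off these zones is also how location-restricted checkout could work: the scan-and-pay flow is only enabled in the "immediate" zone near a designated checkout beacon.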