Apps for Everyone Part 2: Strategies for Accessibility in iOS

As user experience designers and developers, it is our job to create experiences that are easy and enjoyable for our users, including users with impairments. To create iOS experiences that are inclusive of all audiences, you’ll need to follow these steps:

  1. Understand what accessibility features the platform supports and experience them
  2. Prepare for accessibility features by identifying them early in your design process
  3. If your app is already live or in development, do an accessibility audit

In the first part of the Apps for Everyone series, we outlined the accessibility features of iOS. In this article, we will share strategies for accessibility in iOS.

Accessibility Audit

Now that you have a basic understanding of what is available to you as a designer, you’ll need to understand how the accessibility features of iOS ‘see’ your app in its current state. This will show you where you’ll need to focus to make your app accessible to the widest audience. There are two things to consider as you begin your audit. The first is visual accommodation of the user interface: the features that make the on-screen elements of the interface more readable for users with visual challenges. The second is semantic accessibility: how VoiceOver, Siri, and Dictation interact with your app, and how those interactions can best serve people with visual or physical challenges.


For the iOS developer, a simple API is available to leverage the visual accommodation features. You can determine whether the user has chosen settings that make their screen easier to read by checking these four functions:

  1. (BOOL) UIAccessibilityIsBoldTextEnabled();
  2. (BOOL) UIAccessibilityIsReduceTransparencyEnabled();
  3. (BOOL) UIAccessibilityDarkerSystemColorsEnabled();
  4. (BOOL) UIAccessibilityIsReduceMotionEnabled();

These functions report whether the user has enabled each of these four features. From a designer’s perspective, you’ll need to consider how the styles of your interface elements should be rendered when these settings are turned on. These alternate styles can then be added to the styling code contained in your application.

In terms of coding strategies, it makes sense not to code your styles in your individual classes. If possible, abstract them to a support class where all your styling can be adjusted and drawn from. If this is done efficiently, adding alternate cases for the four API calls above is a simple adjustment in one styling class. This is the same strategy as keeping your styles in a separate CSS file rather than hard-coding inline styles in the HTML. Many of us have experienced changing instances of a style at the HTML tag level, and it’s not fun. The same approach applies here for iOS.
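As a rough sketch of that approach (the AppStyles class and its methods are illustrative, not part of UIKit), a single styling helper might consult the calls above in exactly one place:

#import <UIKit/UIKit.h>

// Illustrative styling helper: every view controller asks this class for its
// styles, so the accessibility checks live in one place.
@interface AppStyles : NSObject
+ (UIFont *)bodyFont;
+ (UIColor *)panelBackgroundColor;
@end

@implementation AppStyles

+ (UIFont *)bodyFont {
    // Use a heavier weight when the user has turned on Bold Text.
    return UIAccessibilityIsBoldTextEnabled()
        ? [UIFont boldSystemFontOfSize:17.0]
        : [UIFont systemFontOfSize:17.0];
}

+ (UIColor *)panelBackgroundColor {
    // Fall back to an opaque background when Reduce Transparency is on.
    return UIAccessibilityIsReduceTransparencyEnabled()
        ? [UIColor whiteColor]
        : [[UIColor whiteColor] colorWithAlphaComponent:0.85];
}

@end

If your styles can change while the app is running, UIKit also posts notifications such as UIAccessibilityBoldTextStatusDidChangeNotification when the user changes these settings, so the helper’s results can be re-applied on the fly.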

Semantic accessibility is what allows assistive technology to get information from your app and to control the app through its features. For example, a user has VoiceOver turned on and taps a button on the screen. The OS asks the app what is located at the x and y position the user tapped. The application returns information about what is in that location, and the OS then speaks and highlights the selected element. At that point, the user can double-tap anywhere on the screen to select and interact with that element. To continue your audit through semantic accessibility, ask yourself the following questions:

  1. Can VoiceOver speak everything that is contained within your app?
  2. Can VoiceOver do everything within your app?

By examining how VoiceOver navigates your application, you’ll be able to understand where accommodations need to be made. Start your audit by enabling an accessibility shortcut so that a triple-click of the home button turns VoiceOver on and off. This can be set in the Settings app under Settings > General > Accessibility > Accessibility Shortcut by checking VoiceOver or any of the other settings you’d like to test easily.

For the uninitiated, VoiceOver can be a bit confusing. Essentially, VoiceOver highlights and describes anything that is touched in the UI. Once an element is highlighted, it can be activated by double-tapping anywhere on the screen. If the selected element has other capabilities, like a sliding bar or a scrolling list, VoiceOver gives instructions for interacting with it as soon as it describes it to the user. Swiping left or right across the screen moves the VoiceOver selection forward or back through the list of items on the screen.


If no accommodations for accessibility were planned into your project up to this point, you may see some odd behavior as VoiceOver attempts to describe each element. Incorrect or missing names on page elements will be common. This is because VoiceOver hasn’t been explicitly told the proper names of your UI elements; it is using built-in descriptions for them. I’m always amazed at how close to the mark VoiceOver gets without any adjustments, but for a quality experience for everyone, we’ll need to tighten things up.

The Accessibility API starts with a basic property that identifies elements as accessible.

@property(nonatomic) BOOL isAccessibilityElement;

Set this property to YES to make an element visible to an assistive technology like VoiceOver. It is set to YES by default on standard controls, labels, and text fields. For custom components and visual elements like images, you can set this property in code or in Interface Builder.
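For example, a custom-drawn view (the chartView and decorativeImageView names here are hypothetical, purely for illustration) is not an accessibility element by default and can be exposed, or hidden, with a single line:

// A plain UIView subclass is invisible to VoiceOver until we say otherwise.
chartView.isAccessibilityElement = YES;

// A purely decorative image can be taken out of the element list entirely.
decorativeImageView.isAccessibilityElement = NO;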

The other basic accessibility property is the one that supplies the label for the element.

@property(nonatomic, copy) NSString *accessibilityLabel;

This property returns a text description of the selected accessibility-enabled element. It is the description spoken by VoiceOver when an element is highlighted, and it is the property that typically needs some extra care. To make elements of your user interface contextually relevant, consider writing descriptions that help the user understand what the items are even if they cannot see them or have challenges selecting them. A good example is a text element in the UI that displays a person’s age. Without an appropriate accessibilityLabel, VoiceOver will probably say something like “text label… 27” as opposed to the more accurate “age… 27”. By simply labeling your interface elements through the accessibilityLabel property, you’ll be creating a more inclusive and understandable experience.
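To continue the age example, a minimal sketch (the ageLabel outlet is hypothetical) looks like this:

// Without an accessibilityLabel, VoiceOver falls back to the visible text: "27".
ageLabel.text = @"27";

// With a label, VoiceOver adds the missing context: "Age: 27".
ageLabel.accessibilityLabel = [NSString stringWithFormat:@"Age: %@", ageLabel.text];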

VoiceOver describes the element the user touches on the display, and it can also move through a list of the elements in your application via left and right swipe gestures. So, as part of your audit, check that the elements on your display present the information in a natural order. Normally, a left-to-right, top-to-bottom pattern is appropriate for the order of selectable elements, but that may not be ideal depending on the layout of your application. If you have rows and columns of data displayed in your app, make sure the order in which VoiceOver describes your data makes sense.
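If the default order isn’t right, the accessibilityElements container property lets you spell out the reading order yourself. As a sketch (the label names are hypothetical), pairing each header with its value might look like this:

// VoiceOver will move through these elements in exactly this order when the
// user swipes left or right, keeping each header next to its value.
self.view.accessibilityElements = @[nameHeaderLabel, nameValueLabel,
                                    ageHeaderLabel,  ageValueLabel,
                                    cityHeaderLabel, cityValueLabel];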

Custom Components

To this point, we’ve focused on the features available to the designer and developer for standard components included in iOS. Most applications include standard and custom components within the experience. Fortunately, iOS provides a mechanism to describe custom components to the assistive technologies.


Apple engineers describe the assistive technologies as a conversation between the OS and your application. The user highlights an element within your app, and the assistive technology asks your application six questions (through six items in the Accessibility API) to determine how it should react. For built-in components, these questions are mostly answered already. For your custom interface elements, you’ll need to consider them and provide the answers yourself.

  1. Does this element serve a purpose?
  2. What’s the name of this element?
  3. What’s the value of this element?
  4. What’s the personality of this element?
  5. How should people interact with this element?
  6. Where is this element?

As you can see, some of these have already been discussed and are handled by the built-in components in iOS. Questions like ‘Does it serve a purpose?’ let you bypass purely graphic elements that exist only for visual style; if they serve no purpose for a person with special needs, they should not be included in the accessible element list. The personality of an element digs deeper into the Accessibility API, which considers what the element’s behavior should be if selected. These behaviors are defined by traits, and traits can be combined to give a more detailed description of the personality and the interaction model. The following traits are available for elements in your UI through code or through Interface Builder.

  1. Button
  2. Link
  3. Search Field
  4. Keyboard Key
  5. Static Text
  6. Image
  7. Plays Sound
  8. Selected
  9. Summary Element
  10. Updates Frequently
  11. Not Enabled
  12. None

By defining and/or combining traits (an image that, when pressed, acts as a link to a web page), a designer and developer can clearly define what an accessibility element’s personality is and how a user can interact with it. Work with your development team to allocate time in your effort to answer those six questions for custom objects.
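That image-that-acts-as-a-link example might look like the following sketch (the logoImageView and its label are illustrative):

// Combine traits so VoiceOver announces both what the element is and how it
// behaves: "Company logo, image, link."
logoImageView.isAccessibilityElement = YES;
logoImageView.accessibilityLabel = @"Company logo";
logoImageView.accessibilityTraits = UIAccessibilityTraitImage | UIAccessibilityTraitLink;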


Making an application accessible is not painful and doesn’t require heavy lifting. It does require consideration, empathy, and some time to test and adjust to accommodate users of all abilities. Think through the interactions in your app and consider how they could be performed with the features available in the Accessibility API. Gestures and multi-touch interactions require some creative thinking to accommodate, but they are worth the effort.
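One tool worth knowing about here is UIAccessibilityCustomAction, which lets a gesture-driven interaction be offered as a named action that VoiceOver users can pick from a list. A hedged sketch, assuming a table view cell and a deleteItem method of your own:

// Mirror a swipe-to-delete gesture as a custom action; VoiceOver users can
// swipe up or down on the element to choose it, then double-tap to perform it.
UIAccessibilityCustomAction *deleteAction =
    [[UIAccessibilityCustomAction alloc] initWithName:@"Delete item"
                                               target:self
                                             selector:@selector(deleteItem)];
cell.accessibilityCustomActions = @[deleteAction];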

Remember, you are the designer. Design with the intent to create experiences for as many users as possible within your time and budget constraints. With an understanding of accessibility features and a simple focused approach to apply those features, you can open your application to a whole new user base that may have otherwise been excluded.