New Accessibility Feature Enhances Open Exhibits Experience
Video tour of the enhanced Solar System Exhibit
Ideum (the lead organization of Open Exhibits) has made significant progress in multitouch accessibility while developing three prototypes for the Creating Museum Media for Everyone (CMME) project, funded by the National Science Foundation. The third prototype, a new version of our Open Exhibits Solar System Exhibit, incorporates improvements based on usability test results and suggestions from the Museum of Science, Boston, the National Center for Interactive Learning, WGBH, and advisor Sina Bahram. The major new feature in the current version is an accessibility layer designed for visually impaired users on large touchscreen devices. This new CMME software will be released on February 6, 2015.
Enhanced Solar System Exhibit with accessibility layer
Accessibility Layer
The main component of the accessibility layer is the information menu browser. To activate the menu browser, a user holds down three fingers for two seconds; this activation gesture is configurable and can draw on most of the hundreds of gestures supported by the Open Exhibits framework. During the hold, the user receives audio feedback letting them know the accessibility layer is activating. Once the menu is active, the user can swipe left or right to move between choices on the menu, in this case the different planets in the solar system. The text that normally appears on screen when an item is chosen from the visual menu is automatically narrated aloud. Using this simple set of gestures, the user can control both the menu and the content to be read.
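As a rough illustration of this interaction, the sketch below implements a hold-to-activate and swipe-to-browse flow using standard browser touch events and speech synthesis. The element id, planet list, and timing constants are placeholders, and the production exhibit is built with the Open Exhibits framework rather than raw DOM APIs, so treat this as a minimal analogue of the behavior described above rather than the exhibit's actual code.

```typescript
// Minimal sketch of the accessibility layer's gesture flow:
// hold three fingers for two seconds to activate, then swipe
// left or right to browse and narrate menu items.
// Element id, planet list, and thresholds are illustrative only.

const HOLD_FINGERS = 3;          // fingers required to activate the layer
const HOLD_MS = 2000;            // hold duration before activation
const SWIPE_THRESHOLD_PX = 60;   // minimum horizontal travel to count as a swipe

const planets = ["Mercury", "Venus", "Earth", "Mars",
                 "Jupiter", "Saturn", "Uranus", "Neptune"];

let layerActive = false;
let holdTimer: number | null = null;
let selected = 0;
let swipeStartX = 0;

function speak(text: string): void {
  // Narrate text with the browser's built-in speech synthesis.
  window.speechSynthesis.cancel();
  window.speechSynthesis.speak(new SpeechSynthesisUtterance(text));
}

const surface = document.getElementById("exhibit-surface")!;

surface.addEventListener("touchstart", (e: TouchEvent) => {
  if (!layerActive && e.touches.length === HOLD_FINGERS && holdTimer === null) {
    // Audio feedback while the three-finger hold is in progress.
    speak("Hold to activate the accessibility menu.");
    holdTimer = window.setTimeout(() => {
      layerActive = true;
      holdTimer = null;
      speak(`Accessibility menu active. ${planets[selected]} selected. ` +
            "Swipe left or right to browse.");
    }, HOLD_MS);
  }
  swipeStartX = e.touches[0].clientX;
});

surface.addEventListener("touchend", (e: TouchEvent) => {
  // Cancel activation if the fingers lift before the hold completes.
  if (holdTimer !== null && e.touches.length < HOLD_FINGERS) {
    window.clearTimeout(holdTimer);
    holdTimer = null;
  }
  if (!layerActive) return;

  // Swipe left/right to move through the menu and narrate the selection.
  const dx = e.changedTouches[0].clientX - swipeStartX;
  if (Math.abs(dx) >= SWIPE_THRESHOLD_PX) {
    selected = (selected + (dx > 0 ? 1 : planets.length - 1)) % planets.length;
    speak(planets[selected]);
  }
});
```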
User enables accessibility layer
Future Steps
In the current version, the accessibility layer is intended for a single user, who controls what content is active for the entire screen. We are currently working on a multi-user version that will incorporate multiple “spheres of influence,” allowing each user to control content within a bounded region of the screen (see the sketch below). Using these “spheres of influence,” multiple visually impaired and/or sighted users can interact with the exhibit simultaneously. The multi-user version’s audio will be multidirectional; that is, it can be split so that users on different sides of the table listen to different parts of the content at the same time. Our next step is to develop visual elements that play along with the audio narration for visitors who have low vision, are hard of hearing, or are learning English.
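Since this work is still in progress, the following is only a hypothetical sketch of how such “spheres of influence” might be modeled: each user claims a circular region of the table, touches are routed to the region that contains them, and narration is panned toward that region’s side of the table with the Web Audio API. The region names, coordinates, and panning rule are all assumptions for illustration, not the project’s actual design.

```typescript
// Hypothetical model of "spheres of influence" with directional audio.

interface Sphere {
  id: string;
  cx: number;      // center x in pixels
  cy: number;      // center y in pixels
  radius: number;  // extent of the region this user controls
}

// Two illustrative regions, one on each side of the table.
const spheres: Sphere[] = [
  { id: "left-user",  cx: 400,  cy: 540, radius: 350 },
  { id: "right-user", cx: 1520, cy: 540, radius: 350 },
];

// Route a touch point to the sphere that contains it, if any.
function sphereAt(x: number, y: number): Sphere | undefined {
  return spheres.find(s => Math.hypot(x - s.cx, y - s.cy) <= s.radius);
}

// Pan narration toward the side of the table where the sphere sits,
// so users on different sides can follow different content at once.
const audioCtx = new AudioContext();

function playNarrationFor(sphere: Sphere, buffer: AudioBuffer,
                          tableWidth: number): void {
  const source = audioCtx.createBufferSource();
  source.buffer = buffer;

  const panner = audioCtx.createStereoPanner();
  panner.pan.value = (sphere.cx / tableWidth) * 2 - 1; // -1 = left, +1 = right

  source.connect(panner).connect(audioCtx.destination);
  source.start();
}
```

A circle-per-user model is only one way to bound each user’s control; the actual multi-user version may divide the screen differently, and the panning rule here simply maps horizontal position to the stereo field.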
by Stacy Hasselbacher on January 6, 2015