BATS - Blind Audio Tactile Mapping System

Current Work



Developing Spatial Understanding

[Image: a portion of a BATS UNC campus map]

Our main goal for BATS is to give students with visual impairments an avenue for exploring maps that puts them on equal footing with their sighted peers. We are developing a variety of techniques, each to be evaluated on its ability to build a user's spatial awareness of map content without relying on the sense of sight. Some of these methods are the following:

3D auditory icons
Auditory icons representing important regions on maps (cities, rivers, lakes, forests, roads, buildings, etc.) are placed in three dimensions around a user to help her understand relationships between landmarks. Comparing sounds at different positions and volumes gives a rough understanding of the relative location of one landmark to another. These sounds update in real time as the user's cursor moves on the map.
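The core arithmetic behind positioning a sound relative to the cursor can be sketched as follows. This is only an illustration, not the actual BATS audio pipeline: the function name, the linear falloff, and the 200-unit audible range are all assumptions, and a real 3D audio library would also use the front/back axis.

```python
import math

def spatialize(cursor, landmark, max_range=200.0):
    """Map a landmark's position relative to the cursor to a stereo pan
    (-1 = hard left, +1 = hard right) and a volume in [0, 1] that falls
    off linearly with distance.  Illustrative only; a real 3D audio
    library would handle elevation and front/back cues as well."""
    dx = landmark[0] - cursor[0]
    dy = landmark[1] - cursor[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return 0.0, 1.0        # on top of the landmark: centered, full volume
    if dist >= max_range:
        return 0.0, 0.0        # out of earshot: silent
    return dx / dist, 1.0 - dist / max_range

# A landmark 50 units due east of the cursor: hard-right pan, volume 0.75.
pan, vol = spatialize((100, 100), (150, 100))
```

Recomputing this pair for every audible landmark on each cursor move is what makes the sounds track the user's position in real time.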

Callouts

While auditory icons indicate the types of landmarks nearby, they do not identify the specific names of these places. A user needs to query for landmark names to further refine his mental model of a map. Callouts complement a map's auditory icons by reporting landmark names in order of increasing distance in a given compass direction from the user's position. For example, a user exploring near Durham on a map of North Carolina can learn that Chapel Hill is southwest of his location and that Raleigh is to the east. Finding a way to compare distances to landmarks, such as Chapel Hill and Raleigh, across callouts is a topic of current discussion. One possibility is to report the measured distance between the current point and each landmark.
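A minimal sketch of the callout query: keep only the landmarks whose bearing falls inside a sector around the requested compass direction, then sort by distance. The 90-degree sector width, the coordinate convention (0 = east, angles counterclockwise), and the town coordinates are all assumptions for illustration.

```python
import math

def callouts(cursor, landmarks, direction, half_angle=45.0):
    """Return landmark names in order of increasing distance from the
    cursor, restricted to a sector of +/- half_angle degrees around
    `direction` (degrees, 0 = east, 90 = north)."""
    cx, cy = cursor
    dir_rad = math.radians(direction)
    hits = []
    for name, (x, y) in landmarks.items():
        dx, dy = x - cx, y - cy
        dist = math.hypot(dx, dy)
        if dist == 0:
            continue                      # skip the landmark we stand on
        bearing = math.atan2(dy, dx)
        # smallest angular difference between bearing and direction
        diff = abs(math.degrees(math.atan2(math.sin(bearing - dir_rad),
                                           math.cos(bearing - dir_rad))))
        if diff <= half_angle:
            hits.append((dist, name))
    return [name for _, name in sorted(hits)]

# From Durham (placed at the origin; coordinates are illustrative),
# report places to the east in order of distance.
towns = {"Chapel Hill": (-10, -5), "Raleigh": (20, -8), "Wilson": (60, -10)}
east = callouts((0, 0), towns, direction=0)
```

The sorted distances computed here are also exactly what the "report the measured distance" option would speak alongside each name.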

Touring

Shapes of regions on a map, especially roads and rivers, are difficult to explore using sound and tactile feedback alone. Positioning a cursor over a small region of interest is also difficult when relying only on a sound's direction and volume. We are developing a touring feature that moves users along extended bodies on a map, or drives them directly toward regions with associated sounds. During a tour, the environment responds just as it would if the user were moving the cursor herself--sounds and textures update, and queries can be performed. We believe this type of guidance may be useful for map exploration that involves tracing regions to understand their shapes or following paths from place to place.
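One way such a tour could drive the cursor is to sample evenly spaced positions along a region's polyline, feeding each position to the same update code the user's own cursor movements trigger. This is a hypothetical sketch; the function name and the fixed step size are assumptions.

```python
import math

def tour_steps(path, step=5.0):
    """Return evenly spaced cursor positions along a polyline, so the
    environment (sounds, textures, queries) can update at each stop
    exactly as if the user were moving the cursor herself."""
    positions = [path[0]]
    leftover = 0.0                       # distance already consumed on entry
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        d = step - leftover
        while d <= seg:
            t = d / seg
            positions.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            d += step
        leftover = seg - (d - step)      # carry remainder into next segment
    return positions

# Trace a right-angled "road": two 10-unit segments, sampled every 5 units.
stops = tour_steps([(0, 0), (10, 0), (10, 10)], step=5.0)
```

Playing the stops on a timer traces the road's shape; jumping straight to the final stop corresponds to driving the user directly toward a region.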

Recursive zooming
[Image: a BATS map of counties, cities, rivers, and lakes surrounding Chapel Hill, NC]

A sighted user can most likely explain what it means to zoom into and out of a map. Such a description might read something like the following: "Zooming means focusing your view on a smaller, more detailed region of a map. Unzooming is exactly the opposite." Explanations of zooming such as this one are almost always laden with concepts related to vision (e.g. focus), making them next to useless for blind users. A zooming strategy for maps that is intuitive to users with visual impairments is an interesting research topic. Recursive zooming [1] is one such strategy that might allow a BATS user to navigate into and out of map data without losing context.
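The grid-based recursive zoom of [1] can be pictured as repeated subdivision: the current map bounds are split into a grid, choosing a cell makes that cell the whole map, and a stack of previous bounds lets the user back out without losing context. The 3x3 grid size and this exact bookkeeping are assumptions for illustration, not the BATS design.

```python
def zoom(bounds, cell, rows=3, cols=3):
    """Given map bounds (xmin, ymin, xmax, ymax) and a grid cell
    (row, col), return the bounds of that cell -- the new, smaller map.
    Applying this repeatedly is the recursive zoom."""
    xmin, ymin, xmax, ymax = bounds
    w = (xmax - xmin) / cols
    h = (ymax - ymin) / rows
    r, c = cell
    return (xmin + c * w, ymin + r * h,
            xmin + (c + 1) * w, ymin + (r + 1) * h)

stack = [(0.0, 0.0, 90.0, 90.0)]       # full map
stack.append(zoom(stack[-1], (0, 0)))  # zoom into a corner cell
stack.append(zoom(stack[-1], (2, 2)))  # zoom again, recursively
stack.pop()                            # unzoom: previous view restored intact
```

Because unzooming simply pops the stack, the user always returns to exactly the view she came from, which is what preserves context.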

Exploratory User Interfaces

Exploring maps in BATS is only possible once a user understands how to use the software. An exploratory user interface gives a user the chance to ask BATS what it is doing at the present time, why it is doing it, and what it would do if the user performed a certain action. This question-and-answer interface moves away from traditional visual and auditory menus, helping users explore the features of BATS and build a model of its interactions while exploring and learning about maps.

Consider the following example. A student is exploring a map of the United States. He hears the sound of the ocean, but does not know what it means. He queries BATS with the press of a button to ask "what are you doing?" The software responds by explaining that the cursor is in the western part of the map, over the state of California, and that the sound of the ocean can be heard because the cursor is near the Pacific Ocean. The student now understands what the sound of an ocean means.

Consider a second example. The same student accidentally hits a key on the keyboard. The software responds by announcing "Durham" and proceeds to spell it. The student is confused as to why the software spelled 'Durham,' and wants to know what button makes the software spell landmark names. He queries BATS, this time with another press of a button, asking "why did you do that?" The software explains that the student pressed the S key, and that pressing the S key causes BATS to spell the name of the landmark under the cursor.
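Both examples reduce to the program keeping a log of its actions paired with their reasons, so "what are you doing?" and "why did you do that?" can be answered from the most recent entry. The class below is a toy sketch of that idea; the method names and event wording are assumptions, not the actual BATS design.

```python
class ExploratoryUI:
    """Records (action, reason) pairs as the program acts, so the user
    can query them at any time with dedicated buttons."""

    def __init__(self):
        self.log = []                       # newest event last

    def do(self, action, reason):
        self.log.append((action, reason))

    def what_are_you_doing(self):
        return self.log[-1][0] if self.log else "Nothing yet."

    def why_did_you_do_that(self):
        return self.log[-1][1] if self.log else "Nothing has happened yet."

ui = ExploratoryUI()
ui.do("Playing an ocean sound", "the cursor is near the Pacific Ocean")
ui.do("Spelling 'Durham'",
      "you pressed the S key, which spells the landmark under the cursor")
```

The third question from the text, "what will you do if I press this?", would query a key-to-action table the same way instead of the event log.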

Applying the concept of an exploratory user interface to software beyond BATS is a part of this research.

Audio-Based User Interfaces

The most commonly encountered audio user interface is akin to a telephone menu system: a user navigates a hierarchy of options until reaching the one she wishes to select. As the complexity of a piece of software grows, so does the complexity of its menu hierarchy. The system's rate of speech can be increased, but a bottleneck is ultimately reached when the user can no longer speed up the speech without losing the ability to understand the menu options. This problem is usually addressed by creating hotkeys that access common menu items without selecting them from the audio menu.

We are investigating an alternative approach to handling a large number of options without relying on a complex menu hierarchy or numerous hotkeys. The audio user interface in BATS promotes the idea of selecting an object and its related options within a certain time period after an event relating to that object has occurred. The map itself, therefore, becomes the menu system, and properties of map regions and events can be adjusted by exploration.

For example, consider a student exploring a map of North Carolina, complete with regions and associated sounds for cities, forests, roads, and rivers. The student quickly notices that the river sounds are too loud. To turn down the volume of the river sounds alone, she first finds a river region and makes its sound play. She then presses a selection key to select that object and adjusts its volume accordingly. Along the same lines, she could also turn off the entire river layer, choose to visit all the rivers on the map one by one, attach a constant sound to this river as a beacon, and so on.
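The selection mechanism described above amounts to remembering the last region whose sound played, and accepting a selection keypress only within a short window after that event. The sketch below illustrates the timing logic; the two-second window, class name, and region names are assumptions for illustration.

```python
SELECT_WINDOW = 2.0    # seconds; an assumed value, not from BATS

class AudioMenu:
    """Sketch of the map-as-menu idea: the selection key selects the
    region whose sound played most recently, if pressed soon enough.
    Once selected, the region's properties (volume, layer visibility,
    beacons) could then be adjusted."""

    def __init__(self):
        self.last_sound = None             # (region, timestamp) or None

    def sound_played(self, region, now):
        self.last_sound = (region, now)

    def select(self, now):
        if self.last_sound and now - self.last_sound[1] <= SELECT_WINDOW:
            return self.last_sound[0]
        return None                        # too late: nothing selected

menu = AudioMenu()
menu.sound_played("Haw River", now=10.0)
picked = menu.select(now=11.5)             # 1.5 s later: inside the window
```

Because the triggering event is hearing the map itself, the user never leaves exploration to open a separate menu hierarchy.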

Interface Bootstrapping

The needs of BATS users cannot be simply grouped into two categories--sighted and blind. Every individual has special needs when using our software. Some users have some sense of vision, and can benefit from high-contrast visual maps as well as audio and tactile feedback. Other users have mobility impairments in addition to their blindness, and can only use some of the keys on a keyboard. Still others are partially deaf in one ear, and cannot benefit from spatial sound.

BATS must provide a means for users to customize the software to meet their own needs and preferences. But before users can customize their BATS experience, they must first be able to interact with the software and understand its features. We plan to implement a bootstrapping system [2] in BATS that allows the software to learn some basic information about the user as he tries to log in to the system. When first using BATS, the user is required to answer some simple questions about his vision, hearing, mobility, and age. The responses to these questions are used to give BATS an initial batch of settings that meets the user's basic needs.
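Conceptually, bootstrapping is a mapping from questionnaire answers to an initial settings profile. The sketch below shows the shape of that mapping; every question, answer value, and setting name here is hypothetical, and the real mapping in BATS would be richer.

```python
def bootstrap_settings(answers):
    """Turn answers to the first-login questionnaire into an initial
    batch of settings.  All names are illustrative assumptions."""
    settings = {
        "high_contrast": False,   # for users with some vision
        "spatial_sound": True,    # stereo/3D audio cues
        "single_key_mode": False, # for users with mobility impairments
    }
    if answers.get("vision") == "low":
        settings["high_contrast"] = True
    if answers.get("hearing") in ("one ear", "none"):
        settings["spatial_sound"] = False   # no benefit from stereo cues
    if answers.get("mobility") == "limited":
        settings["single_key_mode"] = True
    return settings

# A low-vision user who is partially deaf in one ear but has full mobility.
profile = bootstrap_settings({"vision": "low", "hearing": "one ear",
                              "mobility": "full", "age": 10})
```

The returned settings are only a starting point; the user refines them later through his profile.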

User-Matched Stylesheets

[Image: an example of how stylesheets can change the appearance of a map]

BATS maps can be packaged with multiple stylesheets, each defined to match a general class of BATS users. For instance, a stylesheet for low-vision users provides high-contrast colors and textures, while a stylesheet for deaf users specifies that map data be presented visually instead of audibly. These stylesheets can be matched to user needs in BATS using the bootstrapping idea described above, and then further customized in a user's profile. We are also interested in building some intelligence into our map maker to help would-be map makers create stylesheets whose defaults suit the intended map users.
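Matching a packaged stylesheet to a user could be as simple as a lookup keyed on the bootstrapped profile. The table and the matching rules below are hypothetical, intended only to show how stylesheets and bootstrapping fit together.

```python
# Hypothetical packaged stylesheets: each maps a user class to rendering rules.
STYLESHEETS = {
    "default":    {"colors": "standard",      "audio": True},
    "low_vision": {"colors": "high_contrast", "audio": True},
    "deaf":       {"colors": "standard",      "audio": False},  # visual instead
}

def match_stylesheet(profile):
    """Pick a packaged stylesheet from a user profile, such as one
    produced by the bootstrapping questionnaire.  Rules are assumed."""
    if not profile.get("hearing", True):
        return STYLESHEETS["deaf"]
    if profile.get("low_vision"):
        return STYLESHEETS["low_vision"]
    return STYLESHEETS["default"]

style = match_stylesheet({"low_vision": True, "hearing": True})
```

Further customization would then copy the matched stylesheet into the user's profile and let the user edit individual values.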


North Carolina Curriculum

BATS maps relevant to students and their studies are necessary for our software to find mainstream use. We are working with local teachers in an attempt to garner interest in our project. We hope to form a community of teachers interested in teaching their students using BATS and in building maps to share with other teachers. Presently, we are developing some trial maps that would be of interest to 4th grade social studies teachers in North Carolina.

Service Learning

Many K-12 students learn about GIS in their technology education classes by creating maps. We believe it would be an excellent idea to have these students create maps that can be put to use in real-world classrooms by students with visual impairments. Students making the maps benefit by learning how GIS software works, while students using the maps benefit by exploring and learning information provided by their peers.

Various high schools in the Chapel Hill area also have service learning requirements for graduation. Students are required to complete a certain number of hours of community service. Spending these hours making BATS maps could be another excellent way of getting students involved with technology while helping their peers.


Finally, we wish to perform a formal classroom evaluation of our software in the near future. We have worked with a number of students and teachers to gather feedback on initial BATS prototypes, but we are also interested in seeing whether, and how, BATS improves students' understanding of spatial information.

  1. Kamel, H. and Landay, J. "Sketching images eyes-free: a grid-based dynamic drawing tool for the blind." In Proceedings of the ACM SIGCAPH Conference on Assistive Technologies (ASSETS), 2002, pp. 33-40.
  2. Fairweather, P., et al. "From assistive technology to a web accessibility service." In Proceedings of the ACM SIGCAPH Conference on Assistive Technologies (ASSETS), 2002, pp. 4-8.