Graduation Year
2022
Document Type
Dissertation
Degree
Ph.D.
Degree Name
Doctor of Philosophy (Ph.D.)
Degree Granting Department
Electrical Engineering
Major Professor
Anna Pyayt, Ph.D.
Co-Major Professor
Andrew Hoff, Ph.D.
Committee Member
Mark Jaroszeski, Ph.D.
Committee Member
Venkat Bhethanabotla, Ph.D.
Committee Member
Arash Takshi, Ph.D.
Committee Member
Sanghoon Park, Ph.D.
Keywords
3D Printing, Maps, Visually Impaired, Haptic, Tactile
Abstract
The ability to navigate from location to location is essential to daily life. Most people rely on sight as their primary means of determining how to move through their environment. However, a large population of people is blind or living with low vision. Based on a 2017 National Health Survey, approximately 27 million adult Americans have experienced vision loss (National Center for Health Statistics, 2017). According to the Eye Diseases Prevalence Research Group, 3.4 million Americans aged 40 years and older are legally blind or visually impaired (EDPRG, 2004). Resources for this population are very limited, especially when it comes to tools for the spatial and navigational understanding of environments.
Maps and building layouts are generally available as two-dimensional images, which presents problems for blind and visually impaired persons. For example, maps such as those provided on placards and kiosks at various locations are typically presented only as a two-dimensional schematic of the area. In some instances, these two-dimensional maps may also include critical information such as evacuation routes in case of an emergency. Furthermore, while text on these maps may be presented in braille or raised lettering, this provides only textual information to the user and does not fully and effectively describe the physical environment or navigational routes. Accordingly, there is a need for a resource that enables people with blindness or low vision to obtain the same mapping information as sighted individuals. One solution that addresses this need is based on tactile maps. However, it is important to create tactile maps with appropriate representations and indicators to allow easier access to, and understanding of, spatial information for individuals living with blindness or low vision.
The wide availability of 3D printers has made the simple creation of tactile maps possible. However, typical tactile maps are often produced by direct translation of 2D maps made for sighted individuals. These maps are not designed for readability and functionality and do not consider the requirements of a person with blindness or low vision. By studying how people with blindness or visual impairments perceive and use tactile sensation to learn about their surroundings, we can provide more effective approaches to the development and design of tactile maps. We also leverage newer production techniques using consumer-grade 3D printers, which allow for the development and testing of three-dimensional tactile elements to be incorporated into a map. 3D-printing technology also allows for a greater number of tactile variants than other traditional methods and enables rapid prototyping for quicker user testing and iterative development.
The work in this dissertation presents a user-based iterative process for the development of new encoding rules that optimize map creation and functionality. In terms of user perception and comprehension, this work demonstrates that the 3D-printed tactile maps developed enable people with blindness and low vision to obtain an improved understanding of environments and an increase in mobility and independence. Additionally, this work presents a deeper understanding of the spatial and tactual perception of blind and visually impaired people.
In this dissertation, we first describe the various types of maps that were developed and tested by blind and low-vision users. Then we explain how the user testing and feedback led to the creation of a novel tactile encoding system. We further evaluate the effectiveness of the encoding system for communicating spatial and navigational information and how it provides a valuable resource for users. We also present a tactile encoding system focused specifically on interior maps, along with an analysis of the numerous encodings and encoding parameters tested. The final encoding system provides a novel approach to the creation of optimized interior tactile maps.
Scholar Commons Citation
Kaplan, Howard, "Assistive Technologies for Independent Navigation for People with Blindness" (2022). USF Tampa Graduate Theses and Dissertations.
https://digitalcommons.usf.edu/etd/9385
Included in
Biomedical Engineering and Bioengineering Commons, Electrical and Computer Engineering Commons