Privacy Concerns of Visually Impaired People

My Role

Lead Researcher -- designed the study, engaged stakeholders, conducted surveys and interviews, performed qualitative and quantitative data analysis, and published a full paper.
Year: 2017-2018 | Collaborators: Prof. Apu Kapadia, Prof. Bryan Semaan, Bryan Dosono, Tousif Ahmed | Indiana University, Syracuse University.


Description

The emergence of camera-based assistive technologies has empowered people with visual impairments (VIPs) to gain independence in their daily lives. Popular services feature volunteers who answer questions about photos or videos (e.g., to identify a medical prescription). However, VIPs can (inadvertently) reveal sensitive information to these volunteers. To better understand privacy concerns about the disclosure of background objects to different types of human assistants (friends, family, and others), we conducted an online survey with 155 visually impaired participants. In general, our participants' concerns varied with the type of assistant and the kind of information. We found that participants were more concerned about the privacy of bystanders than about their own when capturing people in images. We also found that participants were concerned about self-presentation and were more comfortable sharing embarrassing information with family than with friends. Our findings suggest directions for future work on human-assisted question-answering systems. Specifically, we discuss how humanizing these systems can give people a greater sense of personal security.


Research Method

Controlled experiment; Online survey


Participants

155 participants with visual impairments, recruited through organizations of blind and visually impaired people (e.g., the National Federation of the Blind (NFB) and the American Council of the Blind (ACB))


Independent Variable

3 real-life scenarios (home, office, and restaurant)
3 human assistants (family members, friends, and crowd workers)
10 background objects that can be disclosed along with foreground objects (e.g., credit card, medical prescription, face or body part); see the sketch after this list
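To make the factorial structure of the conditions concrete, here is a minimal sketch in Python that enumerates the scenario x assistant x background-object combinations; the object list is abbreviated to three of the ten items and is purely illustrative.

```python
# Minimal sketch: enumerate scenario x assistant x background-object
# combinations of the study design. The object list is abbreviated
# (3 of the 10 objects) and illustrative only.
from itertools import product

scenarios = ["home", "office", "restaurant"]
assistants = ["family member", "friend", "crowd worker"]
background_objects = ["credit card", "medical prescription", "face or body part"]

conditions = list(product(scenarios, assistants, background_objects))
print(len(conditions))  # 3 scenarios x 3 assistants x 3 example objects = 27
for scenario, assistant, obj in conditions[:3]:
    print(f"{scenario} | {assistant} | {obj}")
```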


Dependent Variable

Comfort level of sharing background information with human assistants (5-point Likert scale)


Quantitative Data Analysis Method

Non-parametric methods (illustrated in the sketch after this list):
Kruskal-Wallis test (for multiple groups, between subjects)
Wilcoxon rank-sum test (for two groups, between subjects)
Friedman rank-sum test (for multiple groups, within subjects)
Wilcoxon signed-rank test (for two groups, within subjects)
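As a rough illustration of how these four tests can be run, here is a minimal sketch using SciPy; the comfort ratings below are randomly generated placeholders, not the study's data.

```python
# Minimal sketch of the four non-parametric tests on hypothetical
# 5-point comfort ratings (not the study's data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Between subjects: ratings from three independent assistant groups
family = rng.integers(1, 6, size=50)
friends = rng.integers(1, 6, size=50)
crowd = rng.integers(1, 6, size=50)

print(stats.kruskal(family, friends, crowd))  # multiple groups, between subjects
print(stats.mannwhitneyu(family, crowd))      # two groups, between subjects (rank-sum)

# Within subjects: the same participants rated three scenarios
home, office, restaurant = rng.integers(1, 6, size=(3, 50))

print(stats.friedmanchisquare(home, office, restaurant))  # multiple conditions, within subjects
print(stats.wilcoxon(home, restaurant))                   # two conditions, within subjects (signed-rank)
```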


Qualitative Data Analysis Method

Bottom-up (inductive) approach for coding
Inter-rater reliability measured with Cohen's kappa (see the sketch below)
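For the inter-rater agreement step, a minimal sketch using scikit-learn's cohen_kappa_score follows; the two coders' label assignments are hypothetical and do not reflect the study's codebook.

```python
# Minimal sketch: Cohen's kappa for agreement between two coders.
# The codes and label assignments below are hypothetical.
from sklearn.metrics import cohen_kappa_score

coder_a = ["bystander", "self", "bystander", "object", "self", "object"]
coder_b = ["bystander", "self", "object", "object", "self", "object"]

print(f"Cohen's kappa: {cohen_kappa_score(coder_a, coder_b):.2f}")
```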


For more details, please download the paper at: https://www.usenix.org/conference/usenixsecurity20/presentation/akter

Contact

Taslima Akter

700 N Woodlawn Ave

Bloomington, Indiana, 47408

Email: takter@iu.edu
