
Vaishnav Kameswaran
Postdoctoral Researcher, University of Maryland
Area of Expertise: AI & Accessibility
Vaishnav Kameswaran is a postdoctoral researcher with the Value-Centered Artificial Intelligence Initiative at the University of Maryland. Kameswaran is the lead mentor for the TRAILS Summer Undergraduate Research Fellowship (TRAILS-SURF) program. His research focuses on the impact of high-stakes algorithmic decision-making systems on people with disabilities and on designing novel accessible technologies using generative AI.
-
Kameswaran, V., Robinson, J., Sambasivan, N., Aggarwal, G., & Morris, M. R. (2024). Help and the Social Construction of Access: A Case-Study from India. In ASSETS ’24: Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility (pp. 1–12).
Abstract: A goal of accessible technology (AT) design is often to increase independence, i.e., to enable people with disabilities to accomplish tasks on their own without help. Recent work challenges this view by recognizing the role of ‘help’ in addressing the access needs of people with disabilities. However, empirical evidence examining help is limited to the Global North; we address this gap using a case study of how people with visual impairments (PVI) navigate indoor environments in India. Using interviews with PVI and their companions and a video-diary study, we find that help is a key practice that PVI use to navigate indoor environments. We uncover how help is a situated phenomenon shaped by socio-material and cultural factors unique to the Indian context. We discuss the value of help in the context of the broader HCI and accessibility literature on mixed-ability and collaborative interactions. We also discuss the implications of our findings on help for AT design.
-
Sum, C. M., Spektor, F., Alharbi, R., Baltaxe-Admony, L. B., Devine, E., Dixon, H. A., Duval, J., Eagle, T., Elavsky, F., Fernandes, K., Guedes, L. S., Hillman, S., Kameswaran, V., Kirabo, L., Motahar, T., Ringland, K. E., Schaadhardt, A., Scheepmaker, L., & Williamson, A. (2024). Challenging Ableism: A Critical Turn toward Disability Justice in HCI. XRDS: Crossroads, The ACM Magazine for Students, 30(4), 50–55.
Abstract: A reflection on our learnings from the CHI 2022 "Dreaming Disability Justice in HCI" workshop, and why we continue to call for disability justice, despite the limitations of how we practice it within academia and industry.
-
Fernandes, K., Alharbi, R., Sum, C., Kameswaran, V., Spektor, F., Thuppilikkat, A. A., Petterson, A., Marathe, M., Hamidi, F., & Chandra, P. (2024). Organizing for More Just and Inclusive Futures: A community discussion. Companion Publication of the 2024 Conference on Computer-Supported Cooperative Work and Social Computing, 689–692.
Abstract: This Special Interest Group brings together researchers and practitioners to examine the critical questions, innovative methods and emerging possibilities that arise from an orientation toward disability justice within CSCW research particularly and HCI research more broadly. We will focus on how digital technologies influence the ways disabled people organize and advocate for their rights, and how disabled people influence and configure technologies as well. By attending to the intersections of technology, disability justice, and social movements, we aim to explore how HCI and CSCW research can support the organizing efforts of disabled communities. This SIG emphasizes the ways in which disabled people and communities have been organizing and are continuing to organize in response to various forms of oppression. The SIG will provide a platform for scholars and activists to engage in conversations around technologies, disability justice, and social movements. By centering disability justice as a framework, we hope to foster a deeper understanding of how HCI and CSCW research can support and amplify the efforts of disabled communities. Participants will share their insights, collaborate on research ideas, and contribute to a collective vision of a more inclusive and justice-oriented HCI and CSCW. Through these discussions, we aim to generate actionable strategies for future research and practice in supporting organizing efforts.
-
Kameswaran, V., Y, V., & Marathe, M. (2023). Advocacy as Access Work: How People with Visual Impairments Gain Access to Digital Banking in India. Proceedings of the ACM on Human-Computer Interaction, 7(CSCW1), 1–23.
Abstract: Research in accessibility and assistive technology often assumes that technology is within easy reach, that is, people with disabilities are able to obtain technologies so long as they are accessible. As a result, less is understood about the challenges that people with disabilities face in obtaining technology in the first place and how they work around these challenges. We reduce this gap by examining the technology access challenges of people with visual impairments in India in the context of digital banking. Through a qualitative study consisting of 30 interviews, we find that participants routinely encountered social and technical challenges that made it difficult to access and use digital banking. To address these challenges, people with visual impairments engaged in advocacy work which consisted of five dimensions: 1) creating awareness, 2) demonstrating competence, 3) escalation, 4) gathering support, and 5) seeking sighted help. We expand on the idea of advocacy as a form of access work performed by people with visual impairments to secure and maintain access to digital banking.
-
Ramesh, D., Kameswaran, V., Wang, D., & Sambasivan, N. (2022). How Platform-User Power Relations Shape Algorithmic Accountability. Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency, 1917–1928.
Abstract: Accountability, a requisite for responsible AI, can be facilitated through transparency mechanisms such as audits and explainability. However, prior work suggests that the success of these mechanisms may be limited to Global North contexts; understanding the limitations of current interventions in varied socio-political conditions is crucial to help policymakers facilitate wider accountability. To do so, we examined the mediation of accountability in the existing interactions between vulnerable users and a ‘high-risk’ AI system in a Global South setting. We report on a qualitative study with 29 financially-stressed users of instant loan platforms in India. We found that users experienced intense feelings of indebtedness for the ‘boon’ of instant loans, and perceived huge obligations towards loan platforms. Users fulfilled obligations by accepting harsh terms and conditions, over-sharing sensitive data, and paying high fees to unknown and unverified lenders. Users demonstrated a dependence on loan platforms by persisting with such behaviors despite risks of harms to them, such as abuse, recurring debts, discrimination, privacy harms, and self-harm. Instead of being enraged with loan platforms, users assumed responsibility for their negative experiences, thus releasing the high-powered loan platforms from accountability obligations. We argue that accountability is shaped by platform-user power relations, and urge policymakers to exercise caution in adopting a purely technical approach to fostering algorithmic accountability. Instead, we call for situated interventions that enhance the agency of users, enable meaningful transparency, reconfigure designer-user relations, and prompt a critical reflection in practitioners towards wider accountability. We conclude with implications for responsibly deploying AI in FinTech applications in India and beyond.
-
Pandey, M., Bondre, S., Kameswaran, V., Rao, H., O’Modhrain, S., & Oney, S. (2024). UI Development Experiences of Programmers with Visual Impairments in Product Teams. In Apress eBooks (pp. 121–133).
Abstract: The tools and techniques that software engineers use to collaborate are critical in deciding who can contribute to software projects and the roles they can play within those teams. The steady growth of UI developer roles has led many programmers to seek UI engineering jobs, making it important to understand the accessibility of the profession and to identify ways to make it more inclusive. We conducted two qualitative studies to better understand the strategies that mixed-ability teams – specifically teams where some team members identify as having a visual impairment and some do not – use to collaborate on user interface (UI) development. In this chapter, we summarize and synthesize the findings from our prior studies to highlight the challenges programmers with visual impairments encounter in collaborative UI programming. The chapter concludes with recommendations for building more inclusive software engineering teams by fostering communication and help-seeking interactions, which we hope product teams will find valuable. We also derive implications for UI frameworks that aim to support accessible application development; these implications can inform the engineering choices of product teams as well as the efforts of researchers and developers building such frameworks.