Huaishu Peng
Assistant Professor of Computer Science
University of Maryland
Area of Expertise: Human-Computer Interaction
Huaishu Peng is an assistant professor of computer science with an appointment in the University of Maryland Institute for Advanced Computer Studies, where he leads the Small Artifacts Lab. Peng is also an affiliate assistant professor in the College of Information and a member of the Human-Computer Interaction Lab and the Maryland Robotics Center. His research interests lie in the technical aspects of human-computer interaction, with a focus on personal fabrication.
-
Li, J., Yan, Z., Jarjue, E., Shetty, A., & Peng, H. (2022). TangibleGrid: Tangible Web Layout Design for Blind Users. Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, 1–12.
Abstract: We present TangibleGrid, a novel device that allows blind users to understand and design the layout of a web page with real-time tangible feedback. We conducted semi-structured interviews and a series of co-design sessions with blind users to elicit insights that guided the design of TangibleGrid. Our final prototype contains shape-changing brackets representing the web elements and a baseboard representing the web page canvas. Blind users can design a web page layout by creating and editing web elements, snapping or adjusting tangible brackets on top of the baseboard. The baseboard senses the brackets’ type, size, and location, verbalizes the information, and renders the web page on the client browser. Through a formative user study, we found that blind users could understand a web page layout through TangibleGrid. They were also able to design a new web layout from scratch without the help of sighted people.
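As a rough illustration of the sensing-to-rendering loop the abstract describes, the Python sketch below turns a list of sensed brackets into a spoken-style summary and a minimal CSS-grid page. The Bracket fields and grid geometry are hypothetical assumptions for illustration; this is not the authors' code.

# Illustrative sketch (not the TangibleGrid implementation): from sensed
# bracket data to a verbal description and an HTML skeleton.
from dataclasses import dataclass

@dataclass
class Bracket:
    kind: str    # e.g., "image", "text", "button" (assumed element types)
    col: int     # leftmost grid column occupied on the baseboard
    row: int     # topmost grid row
    width: int   # size in grid cells, as sensed from the bracket shape
    height: int

def verbalize(brackets):
    """Compose the kind of per-element description a screen reader could speak."""
    return [f"{b.kind} block, {b.width} by {b.height} cells, "
            f"at column {b.col}, row {b.row}" for b in brackets]

def render_html(brackets):
    """Emit a minimal CSS-grid page mirroring the tangible layout."""
    items = "\n".join(
        f'  <div style="grid-area: {b.row} / {b.col} / span {b.height} '
        f'/ span {b.width}">{b.kind}</div>' for b in brackets)
    return f'<div style="display: grid">\n{items}\n</div>'

layout = [Bracket("image", 1, 1, 4, 2), Bracket("text", 1, 3, 4, 1)]
print("\n".join(verbalize(layout)))
print(render_html(layout))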
-
Li, J., Yan, Z., Shah, A., Lazar, J., & Peng, H. (2023). Toucha11y: Making Inaccessible Public Touchscreens Accessible. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–13.
Abstract: Despite their growing popularity, many public kiosks with touchscreens are inaccessible to blind people. Toucha11y is a working prototype that allows blind users to operate existing inaccessible touchscreen kiosks independently and with little effort. Toucha11y consists of a mechanical bot that can be instrumented to an arbitrary touchscreen kiosk by a blind user and a companion app on their smartphone. The bot, once attached to a touchscreen, will recognize its content, retrieve the corresponding information from a database, and render it on the user’s smartphone. As a result, a blind person can use the smartphone’s built-in accessibility features to access content and make selections; the mechanical bot will then detect and activate the corresponding interface elements on the physical touchscreen. We present the system design of Toucha11y along with a series of technical evaluations. Through a user study, we found that Toucha11y could help blind users operate inaccessible touchscreen devices.
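The sketch below is a hedged, minimal reading of that flow: match the photographed kiosk screen against a database, hand the item labels to the phone's screen reader, and resolve a selection to tap coordinates for the bot. All names, the database layout, and the coordinates are hypothetical, not Toucha11y's actual design.

# Illustrative sketch (not the Toucha11y implementation).
KIOSK_DB = {  # hypothetical database of known kiosk screens
    "ticket_menu_v1": {
        "Buy ticket":  (120, 300),   # on-screen (x, y) in pixels
        "Reload card": (120, 420),
        "Help":        (120, 540),
    },
}

def recognize_screen(photo_features):
    """Stand-in for the bot's screen matching; returns a database key."""
    return "ticket_menu_v1"

def accessible_items(screen_id):
    """Labels handed to the smartphone app for the built-in screen reader."""
    return list(KIOSK_DB[screen_id])

def tap(screen_id, label):
    """Resolve a selected label to coordinates for the mechanical bot."""
    x, y = KIOSK_DB[screen_id][label]
    print(f"bot: move to ({x}, {y}) and tap")

screen = recognize_screen(photo_features=None)
print(accessible_items(screen))
tap(screen, "Buy ticket")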
-
Peng, H., Briggs, J., Wang, C.-Y., Guo, K., Kider, J., Mueller, S., Baudisch, P., & Guimbretière, F. (2018). RoMA: Interactive Fabrication with Augmented Reality and a Robotic 3D Printer. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 1–12.
Abstract: We present the Robotic Modeling Assistant (RoMA), an interactive fabrication system providing a fast, precise, hands-on, and in-situ modeling experience. As a designer creates a new model using the RoMA AR CAD editor, features are constructed concurrently by a 3D printing robotic arm sharing the same design volume. The partially printed physical model then serves as a tangible reference for the designer as she adds new elements to her design. RoMA's proxemics-inspired handshake mechanism between the designer and the 3D printing robotic arm allows the designer to quickly interrupt printing to access a printed area, or to indicate that the robot can take full control of the model to finish printing. RoMA lets users integrate real-world constraints into a design rapidly, allowing them to create well-proportioned tangible artifacts or to extend existing objects. We conclude by presenting the strengths and limitations of our current design.
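A minimal sketch of the proxemics-inspired handshake, assuming hypothetical distance thresholds and control states; the paper's actual arbitration mechanism may differ.

# Illustrative sketch (not RoMA's controller): deciding who controls the
# shared design volume based on the designer's proximity and grasp.
NEAR, AWAY = 0.5, 1.5  # hypothetical thresholds in meters

def arbitrate(designer_distance_m, designer_grabs_platform):
    """Return a control decision for the shared design volume."""
    if designer_grabs_platform:
        return "pause printing; yield full model access to the designer"
    if designer_distance_m < NEAR:
        return "retreat; print only features away from the designer's hands"
    if designer_distance_m > AWAY:
        return "take full control and finish printing the model"
    return "continue printing concurrently while the designer edits in AR"

for d, grab in [(0.3, False), (1.0, False), (2.0, False), (0.3, True)]:
    print(f"{d} m, grab={grab}: {arbitrate(d, grab)}")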
-
Rahman, T., Adams, A. T., Zhang, M., Cherry, E., Zhou, B., Peng, H., & Choudhury, T. (2014). BodyBeat: A Mobile System for Sensing Non-Speech Body Sounds. Proceedings of the 12th Annual International Conference on Mobile Systems, Applications, and Services, 2–13.
Abstract: In this paper, we propose BodyBeat, a novel mobile sensing system for capturing and recognizing a diverse range of non-speech body sounds in real-life scenarios. Non-speech body sounds, such as sounds of food intake, breath, laughter, and cough, contain invaluable information about our dietary behavior, respiratory physiology, and affect. The BodyBeat mobile sensing system consists of a custom-built piezoelectric microphone and a distributed computational framework that utilizes an ARM microcontroller and an Android smartphone. The custom-built microphone is designed to capture subtle body vibrations directly from the body surface without being perturbed by external sounds. The microphone is attached to a 3D printed neckpiece with a suspension mechanism. The ARM embedded system and the Android smartphone process the acoustic signal from the microphone and identify non-speech body sounds. We have extensively evaluated the BodyBeat mobile sensing system. Our results show that BodyBeat outperforms other existing solutions in capturing and recognizing different types of important non-speech body sounds.
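To make the division of labor concrete, here is a hedged sketch of a frame-based pipeline in that spirit: cheap per-frame features an embedded front end could compute, followed by a stand-in classifier on the phone. The sample rate, features, and thresholds are illustrative assumptions, not the paper's.

# Illustrative sketch (not the BodyBeat pipeline).
import numpy as np

FS = 8000          # assumed sample rate (Hz)
FRAME = FS // 50   # 20 ms frames

def frames(signal):
    """Split a 1-D signal into non-overlapping fixed-size frames."""
    n = len(signal) // FRAME
    return signal[:n * FRAME].reshape(n, FRAME)

def features(frame):
    """Cheap per-frame features an embedded front end could compute."""
    energy = float(np.mean(frame ** 2))
    zero_crossings = int(np.sum(np.diff(np.sign(frame)) != 0))
    return np.array([energy, zero_crossings])

def classify(feat):
    """Stand-in for the trained classifier; thresholds are made up."""
    return "cough" if feat[0] > 0.1 and feat[1] < 40 else "other"

signal = np.random.randn(FS)  # one second of stand-in microphone data
labels = [classify(features(f)) for f in frames(signal)]
print(labels[:5])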
-
Sathya, A., Li, J., Rahman, T., Gao, G., & Peng, H. (2022). Calico: Relocatable On-Cloth Wearables with Fast, Reliable, and Precise Locomotion. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 6(3), 1–32.
Abstract: We explore Calico, a miniature relocatable wearable system with fast and precise locomotion for on-body interaction, actuation, and sensing. Calico consists of a two-wheel robot and an on-cloth track mechanism or "railway," on which the robot travels. The robot is self-contained, small in size, and has additional sensor expansion options. The track system allows the robot to move along the user's body and reach any predetermined location. It also includes rotational switches to enable complex routing options when diverging tracks are present. We report the design and implementation of Calico with a series of technical evaluations of system performance. We then present several application scenarios and user studies that examine Calico's potential as a dance trainer, and we explore participants' qualitative perceptions of our scenarios to inform future research in this space.
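One way to picture the routing problem is as shortest-path search over a graph whose nodes are rotational switches and named body locations. The sketch below assumes that model; the node names, costs, and graph are made up, and this is not Calico's firmware.

# Illustrative sketch (not Calico's firmware): route planning over the
# on-cloth track, modeled as a weighted graph.
from heapq import heappush, heappop

TRACK = {  # edge lists: neighbor -> travel cost (e.g., cm of track)
    "chest":    {"switch_1": 12},
    "switch_1": {"chest": 12, "shoulder": 18, "switch_2": 10},
    "switch_2": {"switch_1": 10, "wrist": 25},
    "shoulder": {"switch_1": 18},
    "wrist":    {"switch_2": 25},
}

def route(start, goal):
    """Dijkstra over the track graph; returns (cost, junction sequence)."""
    queue, seen = [(0, start, [start])], set()
    while queue:
        cost, node, path = heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, step in TRACK[node].items():
            if nxt not in seen:
                heappush(queue, (cost + step, nxt, path + [nxt]))
    return None

print(route("chest", "wrist"))  # switches along the path rotate to match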
-
Tasnim Oshim, M. F., Killingback, J., Follette, D., Peng, H., & Rahman, T. (2020). MechanoBeat: Monitoring Interactions with Everyday Objects using 3D Printed Harmonic Oscillators and Ultra-Wideband Radar. Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology, 430–444.
Abstract: In this paper, we present MechanoBeat, a 3D printed mechanical tag that oscillates at a unique frequency upon user interaction. With the help of an ultra-wideband (UWB) radar array, MechanoBeat can unobtrusively monitor interactions with both stationary and mobile objects. MechanoBeat consists of small, scalable, and easy-to-install tags that do not require any batteries, silicon chips, or electronic components. Tags can be produced using commodity desktop 3D printers with cheap materials. We develop an efficient signal processing and deep learning method to locate and identify tags using only the signals reflected from the tag vibrations. MechanoBeat is capable of detecting simultaneous interactions with high accuracy, even in noisy environments. We leverage the high penetration property of UWB radar signals to sense interactions behind walls in a non-line-of-sight (NLOS) scenario. We explore a number of applications of MechanoBeat and present the results in the paper.
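A hedged sketch of the core identification idea: match the dominant oscillation frequency in a radar return against known tag tones via an FFT peak. The frame rate, tag frequencies, and synthetic echo are illustrative assumptions rather than the paper's method.

# Illustrative sketch (not the MechanoBeat pipeline): identify which tag
# was touched from the strongest spectral peak of a radar return.
import numpy as np

FS = 200.0  # assumed radar frame rate (Hz)
TAGS = {"door": 8.0, "drawer": 11.0, "bottle": 14.0}  # assumed tag tones (Hz)

def identify(displacement):
    """Match the strongest spectral peak to the nearest tag frequency."""
    window = np.hanning(len(displacement))
    spectrum = np.abs(np.fft.rfft(displacement * window))
    freqs = np.fft.rfftfreq(len(displacement), d=1.0 / FS)
    peak = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    return min(TAGS, key=lambda t: abs(TAGS[t] - peak)), peak

t = np.arange(0, 2.0, 1.0 / FS)
echo = np.sin(2 * np.pi * 11.0 * t) + 0.3 * np.random.randn(len(t))
print(identify(echo))  # expected: ('drawer', ~11 Hz)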