Hi! I am Ruolin Wang (Violynne), a Postdoctoral Fellow in the Murty Lab, part of the Cognition and Brain Science Unit of the Department of Psychology at Georgia Tech. I received my PhD in Human-Computer Interaction from UCLA, where I was advised by Prof. Xiang 'Anthony' Chen in the UCLA HCI Lab. I began doing research in the Tsinghua Pervasive HCI Group, advised by Profs. Yuanchun Shi and Chun Yu. I hold an M.Sc. in Computer Science from Tsinghua University and a B.Eng. in Microelectronics from Tianjin University.
My mission is to break the cycle of exclusion through the lens of human interaction.
EarTouch: Facilitating Smartphone Use for Visually Impaired People in Public and Mobile Scenarios
Ruolin Wang, Chun Yu, Xing-Dong Yang, Weijie He, Yuanchun Shi (CHI 2019, Best Paper Honorable Mention 🏅)
Have you ever been troubled by unintentionally hanging up a call because your ear touched the screen? We turn this accident into an input technique that supports one-handed interaction for blind or low-vision people in public and mobile scenarios. Users hold the smartphone in the usual talking position and listen to speech output privately through the ear speaker. Eight ear gestures support seven common tasks, including answering a phone call, sending a message, and navigating a map. EarTouch also brings us a step closer to inclusive smartphone design for all users, including those who may experience situational disabilities.
Revamp: Enhancing Accessible Information Seeking Ability of Online Shopping for Blind or Low Vision Users
Ruolin Wang, Zixuan Chen, Mingrui 'Ray' Zhang, Zhaoheng Li, Zhixiu Liu, Zihan Dang, Chun Yu, Xiang 'Anthony' Chen (CHI 2021)
So much of the language we use to describe a product is centered on vision, which poses a barrier to people who don't experience the world visually. Inspired by observing how sighted friends help blind people shop online, we propose Revamp, a system that leverages customer reviews for interactive information retrieval. We identified four aspects (color, logo, shape, and size) that are vital for blind and low-vision users to understand a product's visual appearance, and formulated syntactic rules to extract review snippets, which we used to generate image descriptions and answers to users' queries. Revamp also points to several exciting future directions in accessible information seeking: (i) simplifying and reconstructing web pages according to the user's current task; (ii) providing a coordinated experience of active querying and passive reading to support flexible information seeking; and (iii) leveraging related text resources on the page, such as reviews, to fill information gaps.