I am currently a first-year Ph.D. student at the Human-Computer Interaction Institute at Carnegie Mellon University, working with Dr. Jeff Bigham and Dr. Amy Pavel. I am also an Adobe Research Ph.D. Fellow. Previously, I completed my B.S. in Computer Science at National Taiwan University, where I worked with Dr. Mike Chen and Dr. Lung-Pan Cheng.

My research lies at the intersection of inclusive and immersive experiences, with a recent focus on building systems that facilitate accessible content consumption and creation. I have also worked in computational interaction, virtual and augmented reality, haptics, and perception.


Profile photo taken when Yi-Hao presented at CHI 2019.

| Email: yihaop [at] cs.cmu.edu | Curriculum Vitæ | LinkedIn |

Selected Research

Say It All: Feedback for Improving Non-Visual Presentation Accessibility


An authoring system that visualizes real-time and post-hoc accessibility feedback, helping speakers deliver more accessible presentations.

Presented at ACM CHI 2021, Yokohama, Japan


StrengthGaming: Enabling Dynamic Repetition Tempo in Strength Training-based Exergame Design


A novel approach that introduces tempo variation into strength training, enabling the design of more entertaining strength training-based exergames.

Presented at ACM MobileHCI 2020, Oldenburg, Germany


WalkingVibe: Reducing Virtual Reality Sickness and Improving Realism while Walking in VR using Unobtrusive Head-mounted Vibrotactile Feedback


Unobtrusive head-mounted vibrotactile stimulation that reduces VR sickness and improves realism during VR navigation.

Presented at ACM CHI 2020, Honolulu, Hawaii, USA


PersonalTouch: Improving Touchscreen Usability by Personalizing Accessibility Settings based on Individual User’s Touchscreen Interaction


A recommendation pipeline that suggests personalized, optimal touchscreen accessibility settings based on an individual’s touchscreen behavior.

Presented at ACM CHI 2019, Glasgow, Scotland


PhantomLegs: Reducing Virtual Reality Sickness using Head-Worn Haptic Devices


A head-mounted haptic device that applies alternating haptic cues synchronized with users’ footsteps in VR.

Presented at IEEE VR 2019, Osaka, Japan


PeriText: Utilizing Peripheral Vision for Reading Text on Augmented Reality Smart Glasses


A multi-word rapid serial visual presentation (RSVP) AR interface that allows users to read virtual information with their peripheral vision while observing the physical environment with their central vision.

Presented at IEEE VR 2019, Osaka, Japan


SpeechBubbles: Enhancing Captioning Experiences for Deaf and Hard-of-Hearing People in Group Conversations


A real-time speech-recognition AR interface that provides desirable caption visualizations for deaf and hard-of-hearing (DHH) users during group conversations with hearing people.

Presented at ACM CHI 2018, Montréal, Canada