I am currently a research assistant in the Human-Computer Interaction (HCI) Lab at National Taiwan University (NTU), working with Prof. Mike Y. Chen and Prof. Lung-Pan Cheng.

I received my B.S. in Computer Science and Information Engineering from NTU in 2019. During my undergraduate studies, I participated in and led several interdisciplinary projects focusing on accessibility, virtual and augmented reality, and haptics, which led to multiple publications during my last two years (see the publication page).

I am passionate about conducting research in HCI, and I am especially interested in creating enabling and inclusive interactive systems that improve user experiences in both digital and physical contexts.



| E-mail : olivehci [at] gmail.com |
| Curriculum Vitæ | Google Scholar | LinkedIn |

Selected Research

StrengthGaming: Enabling Dynamic Repetition Tempo in Strength Training-based Exergame Design


A novel approach that provides tempo variation in strength training, enabling the design of more entertaining strength-training-based exergames.

Full paper accepted to ACM MobileHCI 2020, Oldenburg, Germany


WalkingVibe: Reducing Virtual Reality Sickness and Improving Realism while Walking in VR using Unobtrusive Head-mounted Vibrotactile Feedback


Unobtrusive head-mounted vibrotactile stimulation that reduces VR sickness and improves realism during VR navigation.

Full paper presented at ACM CHI 2020, Hawaii, USA


PersonalTouch: Improving Touchscreen Usability by Personalizing Accessibility Settings based on Individual User’s Touchscreen Interaction


A recommendation pipeline that suggests personalized, optimal touchscreen accessibility settings based on an individual user's touchscreen behavior.

Full paper presented at ACM CHI 2019, Glasgow, Scotland, UK


PhantomLegs: Reducing Virtual Reality Sickness using Head-Worn Haptic Devices


A head-mounted haptic device which applies alternating haptic cues synchronized to users’ footsteps in virtual reality.

Full paper presented at IEEE VR 2019, Osaka, Japan


PeriText: Utilizing Peripheral Vision for Reading Text on Augmented Reality Smart Glasses


A multi-word rapid serial visual presentation (RSVP) AR interface that allows users to read virtual content with their peripheral vision while observing the physical environment with their central vision.

Full paper presented at IEEE VR 2019, Osaka, Japan


SpeechBubbles: Enhancing Captioning Experiences for Deaf and Hard-of-Hearing People in Group Conversations


A real-time speech recognition AR interface that provides desirable caption visualizations for deaf and hard-of-hearing (DHH) users during group conversations with hearing people.

Full paper presented at ACM CHI 2018, Montréal, Canada


GestureTag: Combining Gaze with Touch Gestures to Acquire Targets for People with Motor Impairments


A novel selection technique that helps users with motor impairments select targets on desktops by combining gaze pointing with simple touch swipe gestures for triggering.

Collaborative project with CSE student Tony Tung and Prof. Jacob Wobbrock from the University of Washington