VR sickness remains one of the major user-experience barriers to virtual reality. To reduce it, prior work has explored several mitigation techniques, including dynamic field-of-view (FOV) modification, which reduces peripheral optical flow, and galvanic vestibular stimulation (GVS), which directly recouples the visual and vestibular systems.
In this project, we propose a new approach to reducing VR sickness, called PhantomLegs, which applies alternating haptic cues synchronized with users' footsteps in VR. Our idea was inspired by the concept of bio-mechanical mitigation of VR sickness proposed in previous work.
We describe the implementation of each type of feedback provided by our system, which periodically generates step-synchronized visuo-haptic stimulation for VR walking experiences.
The haptic feedback device is driven by an Arduino board (Uno for the first iteration, which was used in the experiment; Nano for the second iteration) fixed onto the back of the Vive HMD. A pair of SG90 180-degree servo motors are attached to the sides of the HMD and connected to the Arduino board via digital ports. The Arduino board communicates with the PC via serial over the additional USB port on the HMD. A joint made of hard iron is attached to each servo motor. One tip of the joint is bound to a detachable horn that comes with the motor, and the other tip, which contacts the user when engaged, is bent 180° backwards and covered with a sponge to minimize discomfort. The wire is bent 90° inwards relative to the user at roughly one third of its length from the contacting tip. The contacting tip is designed to strike slightly in front of the lower part of the ears, a location both close to the vestibular system and easy to reach from the HMD. Two haptic states are provided during the VR walking experiences: a) Idle and b) Engaged.
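The alternating Idle/Engaged logic on the host side can be sketched as follows. This is a minimal illustration, not the authors' actual firmware interface: the single-character command protocol, the `PhantomLegsController` class, and the servo angles are all assumptions.

```python
# Hypothetical host-side controller for the PhantomLegs servos.
# Assumptions (not from the paper): single-byte commands 'L'/'R' sent
# over serial, and 0°/90° as the Idle/Engaged servo angles.

IDLE_ANGLE = 0       # joint lifted away from the head (Idle state)
ENGAGED_ANGLE = 90   # joint strikes in front of the lower ear (Engaged)

class PhantomLegsController:
    def __init__(self, port):
        self.port = port            # e.g. a serial.Serial instance
        self.angles = {"L": IDLE_ANGLE, "R": IDLE_ANGLE}

    def on_step(self, side):
        """Engage the servo on `side` ('L' or 'R') and idle the other,
        so the taps alternate with the user's footsteps."""
        other = "R" if side == "L" else "L"
        self.angles[side] = ENGAGED_ANGLE
        self.angles[other] = IDLE_ANGLE
        self.port.write(side.encode())  # firmware moves that servo
        return dict(self.angles)
```

In practice `port` would be a pySerial connection to the Arduino; here it only needs a `write` method.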
We modelled a computed head-bobbing pattern using a sine wave, based on an analysis of the pattern found in the walking data. The visualization below shows a sequence of transforms recorded by a participant, with the horizontal axis representing the time frame. During recording, participants were asked to walk along the Z-axis of the virtual environment. The orange line represents the height of the left controller attached to the left leg, and the blue line that of the right controller attached to the right leg. A "step" is recognized by the sudden stop in the change of height of either leg, marked with grey lines and L/R indicators. The yellow line represents the X position of the HMD, with positive values toward the left. It can be observed that the head moved back to center every time a step occurred.
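The step-recognition rule described above (a sudden stop in the change of a leg's height) can be sketched as a simple velocity test. The `dt` and `threshold` values below are illustrative assumptions, not parameters from the study.

```python
# Minimal sketch of the step-detection rule: a step is registered when
# a descending leg's height stops changing. The velocity threshold is
# an assumption chosen for illustration.

def detect_steps(heights, dt=1/90, threshold=0.05):
    """Return frame indices where a descending leg comes to rest.

    heights:   per-frame controller height (m) for one leg.
    dt:        seconds per frame.
    threshold: |velocity| (m/s) below which the leg counts as stopped.
    """
    steps = []
    for i in range(1, len(heights) - 1):
        v_prev = (heights[i] - heights[i - 1]) / dt      # incoming velocity
        v_next = (heights[i + 1] - heights[i]) / dt      # outgoing velocity
        if v_prev < -threshold and abs(v_next) <= threshold:
            steps.append(i)  # was descending, now stopped: a step
    return steps
```

Running the same test on both legs and tagging each detected frame with its leg yields the L/R step sequence marked in the visualization.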
We used an angle value a, increased at a rate of 320°/s while the user is moving, to drive the bobbing (head-oscillation) effect. Upon releasing the trigger, the value snaps back to 0° or 180°, whichever is closer. sin is used for the horizontal movement instead of cos so that the movement starts from the center when a = 0° or 180°. The formula for the current normalized horizontal offset is as follows:
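The phase-driving behaviour described above can be sketched as follows; `advance`, `snap`, and `dx` are illustrative helper names, not code from the project.

```python
import math

# Sketch of the bobbing-phase driver: `a` advances at 320°/s while the
# user moves, and snaps to the nearer of 0° or 180° on trigger release.

RATE = 320.0  # degrees per second, as stated in the text

def advance(a, dt):
    """Advance the bobbing angle by RATE * dt, wrapping at 360°."""
    return (a + RATE * dt) % 360.0

def snap(a):
    """On release, snap to the nearer of 0° or 180° (360° wraps to 0°)."""
    nearest = min([0.0, 180.0, 360.0], key=lambda t: abs(a - t))
    return nearest % 360.0

def dx(a):
    """Normalized horizontal offset: sin starts from center at 0°/180°."""
    return math.sin(math.radians(a))
```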
From the analysis of the data, we discovered that when the left or right leg is raised to its highest point, the head offsets toward the opposite direction. Therefore, when a is 90°, dx is at its rightmost, implying the left leg is raised, and vice versa. The head reaches its highest point when either leg is raised to its highest point, and its lowest when the raised leg returns to the ground. Thus, the formula for the current normalized vertical offset is as follows:
When a reaches 0° or 180° while moving, dy returns to 0, signalling a right or left step respectively. Finally, when the maximum offsets of the bobbing effect are set to Ox m horizontally and Oy m vertically, the offset vector at angle a is:
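Putting the pieces together, the offset vector can be sketched as below. The document's own equations are not reproduced verbatim here: dx = sin(a) follows the text directly, while dy = |sin(a)| is one function consistent with the stated constraints (peaks when either leg is at its apex at 90°/270°, zero at 0°/180°), used as an assumption.

```python
import math

# Head-bobbing offset vector at phase angle `a`, under the assumption
# dy = |sin(a)| (consistent with the constraints stated in the text).

def bob_offset(a_deg, ox=0.02, oy=0.03):
    """Return the (horizontal, vertical) head offset in metres.

    ox, oy: maximum horizontal/vertical offsets; defaults are the
    values used in the experiment.
    """
    a = math.radians(a_deg)
    dx = math.sin(a)        # rightmost at 90° (left leg raised)
    dy = abs(math.sin(a))   # highest when either leg is at its apex
    return (ox * dx, oy * dy)
```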
In the experiment, Ox was set to 0.02 m and Oy to 0.03 m, based on average estimates from the data. When using computed data, the avatar moved toward the next checkpoint at a constant speed of 1.5 m/s, estimated from the average walking speed in the data.
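The constant-speed movement toward checkpoints can be sketched as a per-frame update; `step_toward` is a hypothetical helper, not code from the study.

```python
import math

# Sketch of moving the avatar toward the next checkpoint at the
# constant 1.5 m/s used in the experiment.

SPEED = 1.5  # m/s, estimated from the recorded walking data

def step_toward(pos, target, dt, speed=SPEED):
    """Advance a 2D ground position `pos` toward `target` by at most
    speed * dt metres; returns `target` if the checkpoint is reached."""
    dx = target[0] - pos[0]
    dz = target[1] - pos[1]
    dist = math.hypot(dx, dz)
    if dist <= speed * dt:
        return target                       # checkpoint reached
    s = speed * dt / dist                   # fraction of the remaining path
    return (pos[0] + dx * s, pos[1] + dz * s)
```

Adding `bob_offset` from the model above to this ground position each frame reproduces the computed walking-plus-bobbing trajectory.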
We designed a three-session mixed experiment (on three separate days) comparing the effectiveness of the following three conditions in mitigating cybersickness: a) CONTROL: the unmodified condition without external assistance, b) DFOV: applying a dynamically changing FOV method, and c) HAPTIC: assistance from the external haptic device (PhantomLegs).
A total of 110 ordered checkpoints were placed in a two-scene combined environment, each placed 12 m from its predecessor.
Two measurements were taken during the study: 1) Discomfort score: after every 5 checkpoints, participants were verbally asked, "How sick or uncomfortable do you feel on a scale of 1-10, 1 being how you felt before the experiment and 10 being that you have the urge to quit the experiment immediately?" 2) SSQ: the standard Simulator Sickness Questionnaire, administered both before and after the test.
We calculated the relative SSQ score (post-test SSQ minus pre-test SSQ) to further evaluate how the different feedback conditions affect perceived VR sickness.
The figure above shows the means and standard errors of the relative SSQ scores (RSS) from the 30 sessions on the first day. Brackets indicate a significant difference between two conditions reported by post-hoc tests (HAPTIC vs. CONTROL and HAPTIC vs. DFOV).
No significant difference in average RSS was found between the three conditions across all 90 sessions. This might result from a training-wheel effect, in which the order of the three feedback treatments significantly affected the final results.
Based on the cumulative observations, the HAPTIC stimulation yielded the lowest perceived discomfort of the three conditions.