A Machine Vision Approach for Estimating Motion Discomfort in Simulators and in Self-Driving Vehicles
Motion discomfort is a persistent problem in driving simulators and is expected to be equally problematic in highly automated vehicles. It can compromise data collection and research validity in simulators, and it is likely to discourage people from riding in automated vehicles, undermining their potential safety benefits.
Monitoring motion sickness can help mitigate its negative effects. However, most existing research quantifies motion sickness through physiological measurement, which is impractical, or through subjective reporting, which does not allow online monitoring. Some studies have linked motion sickness to increased yawning, yet no system currently monitors motion sickness in real time.
This project will develop a machine vision algorithm that monitors drivers' facial features and detects signs of motion discomfort in real time. We will use data collected from 36 drivers who participated in an automated driving simulator study designed to induce motion sickness.
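The proposal does not specify the detection method, but one plausible building block, given the reported link between motion sickness and yawning, is a mouth-opening measure computed from facial landmarks. The sketch below is an illustrative assumption, not the project's actual algorithm: it computes a simplified mouth aspect ratio (vertical opening over mouth width) from four hypothetical landmark coordinates, then counts sustained above-threshold runs across video frames, since a yawn persists over many consecutive frames, unlike speech. The threshold and minimum-duration values are placeholders that would need calibration on real data.

```python
import math

def mouth_aspect_ratio(top, bottom, left, right):
    """Vertical mouth opening divided by mouth width.

    Each argument is an (x, y) landmark coordinate, e.g. from a
    face-landmark detector. Larger values mean a wider-open mouth.
    """
    vertical = math.dist(top, bottom)      # lip-to-lip distance
    horizontal = math.dist(left, right)    # mouth-corner distance
    return vertical / horizontal

def count_yawns(mar_series, threshold=0.6, min_frames=15):
    """Count sustained runs where the mouth aspect ratio stays above
    `threshold` for at least `min_frames` consecutive frames.

    Both parameter defaults are illustrative assumptions; at 30 fps,
    15 frames corresponds to a half-second of sustained opening.
    """
    yawns, run = 0, 0
    for mar in mar_series:
        if mar >= threshold:
            run += 1
        else:
            if run >= min_frames:
                yawns += 1
            run = 0
    if run >= min_frames:   # run still open at end of sequence
        yawns += 1
    return yawns
```

In a full pipeline, the per-frame ratio would come from a real landmark detector, and yawn counts (possibly alongside other facial cues) would feed a classifier trained against the simulator study's motion-sickness labels.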