Towards Modelling Human Driving: Testing the Influence of Driving Mode and Distraction Types in a VR Simulator

Jiacheng Liu, Yue Li, Fan Zhang*

*Corresponding author for this work

Research output: Chapter in Book or Report/Conference proceeding › Conference Proceeding › peer-review

Abstract

Various driver monitoring systems have been deployed to understand human driving behaviours in complex scenarios, contributing to the development of automated vehicles that meet technical and legal requirements. However, commercial systems are often expensive, and there is still limited understanding of how driving behaviour, distractions, and scenarios interact to influence decision-making and performance. This study addresses this gap by collecting behavioural and physiological data in different driving tasks and modelling human decision-making. In a between-subjects design, participants were instructed to drive safely or aggressively through three simulated scenes, namely a crossroads, a T-junction, and a roundabout, under five distraction conditions: 1) no distraction, 2) audio-cognitive, 3) audio-action, 4) visual-cognitive, and 5) visual-action. Each participant completed forty-five trials, lasting 30–40 minutes in total. The driving scenes were developed in Unreal Engine 4 using Microsoft AirSim. The experimental setup included a multi-sensor driver monitoring system, a driving simulator with wheel and pedals, and a VIVE Pro 2 VR display, which together collected behavioural (e.g., head movements, steering) and physiological (e.g., heart rate, skin conductance) data. ANOVAs were performed to explore behavioural patterns and physiological responses, examining differences between safe and aggressive driving, across distraction types, and across scenarios. Significant differences across conditions were revealed: throttle, steering, acceleration, vehicle speed, heart rate, and head turning differed significantly between aggressive and safe driving, and distraction conditions had a significant effect on steering and head-turning ranges. Our contributions include setting up a realistic driving simulation environment with affordable solutions and creating a human driving data collection pipeline for modelling driving performance. Future work will focus on improving data acquisition, modelling human decision-making, and integrating these models into the planning and control of automated vehicles to enhance AI transparency and public acceptance of autonomous driving.
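As a concrete illustration of the kind of data collection pipeline described in the abstract, the sketch below logs per-frame vehicle telemetry (speed, throttle, steering, brake) from a running AirSim car simulation via its Python client. The logging rate, output file name, and column layout are illustrative assumptions, not details taken from the paper.

```python
# Minimal telemetry-logging sketch using the Microsoft AirSim Python client.
# Assumes an AirSim car simulation is already running; the 10 Hz rate and
# CSV layout are illustrative choices, not the study's actual format.
import csv
import time

import airsim

client = airsim.CarClient()   # connect to the AirSim car simulation
client.confirmConnection()

with open("drive_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["t", "speed", "throttle", "steering", "brake"])
    t0 = time.time()
    for _ in range(600):                    # ~60 s of logging at ~10 Hz
        state = client.getCarState()        # vehicle kinematics and speed
        controls = client.getCarControls()  # current wheel/pedal inputs
        writer.writerow([
            time.time() - t0,
            state.speed,
            controls.throttle,
            controls.steering,
            controls.brake,
        ])
        time.sleep(0.1)
```

In the study, such simulator telemetry would be synchronised with the head-movement and physiological streams from the multi-sensor monitoring rig before analysis.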
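Similarly, the following is a minimal sketch of the kind of ANOVA reported above, assuming a tidy per-trial table with participant ID, driving mode (safe/aggressive), distraction condition, and a dependent measure such as head-turning range. All column names and the file path are hypothetical placeholders, not the paper's actual data format.

```python
# Hedged sketch: between-subjects comparison of driving mode and a
# repeated-measures ANOVA over distraction conditions. Column names
# ('participant', 'mode', 'distraction', 'head_turn_range') are assumed.
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

df = pd.read_csv("trials.csv")

# Between-subjects effect of driving mode on head-turning range
safe = df.loc[df["mode"] == "safe", "head_turn_range"]
aggressive = df.loc[df["mode"] == "aggressive", "head_turn_range"]
print(stats.f_oneway(safe, aggressive))

# Within-subjects effect of distraction type (repeated-measures ANOVA),
# averaging each participant's trials per distraction condition first
per_cond = (df.groupby(["participant", "distraction"], as_index=False)
              ["head_turn_range"].mean())
print(AnovaRM(per_cond, depvar="head_turn_range",
              subject="participant", within=["distraction"]).fit())
```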
Original language: English
Title of host publication: Proceedings of the Twelfth International Symposium of Chinese CHI
Subtitle of host publication: CHCHI '24
Publisher: Association for Computing Machinery
Pages: 31-46
Number of pages: 16
ISBN (Print): 979-8-4007-1389-7
DOIs
Publication status: Published - 29 Oct 2025
Event: 12th International Symposium of Chinese CHI, Chinese CHI 2024 - SUSTech, Shenzhen, China
Duration: 22 Nov 2024 – 25 Nov 2024
https://chchi.icachi.org/24/

Publication series

Name: CHCHI '24

Conference

Conference: 12th International Symposium of Chinese CHI, Chinese CHI 2024
Country/Territory: China
City: Shenzhen
Period: 22/11/24 – 25/11/24
Internet address: https://chchi.icachi.org/24/
