Visual augmentation of live-streaming images in virtual reality to enhance teleoperation of unmanned ground vehicles

Yiming Luo, Jialin Wang, Yushan Pan, Shan Luo, Pourang Irani, Hai Ning Liang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

First-person view (FPV) technology in virtual reality (VR) can offer in-situ environments in which teleoperators can manipulate unmanned ground vehicles (UGVs). However, both non-expert and expert robot teleoperators still have trouble controlling robots remotely in various situations. For example, obstacles are difficult to avoid when teleoperating UGVs in dim, dangerous, and hard-to-access areas, and unstable lighting can cause teleoperators to feel stressed. To help teleoperators operate UGVs efficiently, we adopted the yellow and black warning lines commonly seen on construction sites in everyday life as a standard design space and customised the Sobel algorithm to develop a VR-mediated teleoperation interface that enhances teleoperators’ performance. Our results show that our approach improves user performance on avoidance tasks involving static and dynamic obstacles and reduces workload demands and simulator sickness. They also demonstrate that other adjustment combinations (e.g., removing the original image from edge-enhanced images and rendering yellow edges over a blue filter) can reduce the effect of high exposure in dark environments on operation accuracy. Our present work can serve as a solid case for using VR to mediate and enhance teleoperation across a wider range of applications.
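To illustrate the kind of processing the abstract describes, the following is a minimal, hypothetical sketch (not the authors’ released code) of Sobel-based edge enhancement with construction-yellow edges, written with Python and OpenCV as an assumed toolchain. The function names, threshold, and blue-filter strength are illustrative assumptions; the second variant approximates the described mode in which the original image is removed and only yellow edges remain on a blue-filtered base.

```python
# Hypothetical sketch of the edge enhancement described in the abstract.
# Assumptions: Python + OpenCV; thresholds and colours are illustrative.
import cv2
import numpy as np

def sobel_edges(frame_bgr, threshold=60):
    """Return a binary edge mask from the Sobel gradient magnitude."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    magnitude = cv2.magnitude(gx, gy)
    magnitude = cv2.normalize(magnitude, None, 0, 255, cv2.NORM_MINMAX)
    return magnitude.astype(np.uint8) > threshold

def yellow_edge_overlay(frame_bgr):
    """Variant 1: original frame with construction-yellow edges drawn on top."""
    out = frame_bgr.copy()
    out[sobel_edges(frame_bgr)] = (0, 255, 255)   # BGR yellow
    return out

def blue_filter_yellow_edges(frame_bgr, blue_strength=0.35):
    """Variant 2: drop the original image; keep yellow edges on a dim blue base
    (intended to reduce the impact of over-exposed regions in dark scenes)."""
    base = np.zeros_like(frame_bgr)
    base[:, :, 0] = int(255 * blue_strength)      # uniform blue background
    base[sobel_edges(frame_bgr)] = (0, 255, 255)
    return base
```

In a VR teleoperation pipeline such a function would be applied per frame of the live UGV camera stream before the image is composited into the headset view; the exact parameters used in the study are reported in the full article.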

Original language: English
Article number: 1230885
Journal: Frontiers in Virtual Reality
Volume: 5
DOIs
Publication status: Published - 2024

Keywords

  • edge enhancement
  • teleoperation
  • unmanned ground vehicles
  • virtual reality
  • vision augmentation
