
Uncertainty-Aware Visual Workload Estimation for Human-Robot Teams

14 pages. Published: July 12, 2024

Abstract

Human-robot teams operate in uncertain environments and need to accomplish a wide range of tasks. A dynamic understanding of the human’s workload can enable fluid interactions between team members. A system that seeks to adapt interactions for a human-robot team needs to quantify the distribution of workload across the different workload components. A workload assessment algorithm capable of estimating the demand placed on the human’s visual resources is required. Further, adaptive systems will benefit from measures of uncertainty, as these measures inform interaction adaptations. Two machine learning methods’ capacity to estimate visual workload for a human-robot team operating in a non-sedentary supervisory environment is analyzed. A key finding is that the uncertainty-aware method outperforms the other approach.

Keyphrases: human-robot interaction, human-robot teams, visual workload, wearable sensors

In: Kenneth Baclawski, Michael Kozak, Kirstie Bellman, Giuseppe D'Aniello, Alicia Ruvinsky and Candida Da Silva Ferreira Barreto (editors). Proceedings of Conference on Cognitive and Computational Aspects of Situation Management 2023, vol 102, pages 126--139

Links:
BibTeX entry
@inproceedings{CogSIMA2023:Uncertainty_Aware_Visual_Workload_Estimation,
  author    = {Joshua Bhagat Smith and Simone Angelo Toribio and Julie Adams},
  title     = {Uncertainty-Aware Visual Workload Estimation for Human-Robot Teams},
  booktitle = {Proceedings of Conference on Cognitive and Computational Aspects of Situation Management 2023},
  editor    = {Kenneth Baclawski and Michael Kozak and Kirstie Bellman and Giuseppe D'Aniello and Alicia Ruvinsky and Candida Da Silva Ferreira Barreto},
  series    = {EPiC Series in Computing},
  volume    = {102},
  pages     = {126--139},
  year      = {2024},
  publisher = {EasyChair},
  bibsource = {EasyChair, https://easychair.org},
  issn      = {2398-7340},
  url       = {https://easychair.org/publications/paper/NDhc},
  doi       = {10.29007/6w5h}}