New tool helps analyze pilot performance and mental workload in augmented reality

A performer wearing an AR headset, illustrating how data from job-training sessions is captured and analyzed. Credit: NYU Tandon School of Engineering

In the high-stakes world of aviation, a pilot’s ability to perform under stress can mean the difference between a safe flight and disaster. Comprehensive and precise training is crucial to equip pilots with the skills needed to handle these challenging situations.

Pilot trainers rely on augmented reality (AR) systems to guide pilots through various scenarios and teach them the appropriate actions. But those systems work best when they are tailored to the mental state of the individual trainee.

Enter HuBar, a novel visual analytics tool designed to summarize and compare task performance sessions in AR—such as AR-guided simulated flights—through the analysis of performer behavior and cognitive workload.

By providing deep insights into pilot behavior and mental states, HuBar enables researchers and trainers to identify patterns, pinpoint areas of difficulty, and optimize AR-assisted training programs for improved learning outcomes and real-world performance.

HuBar was developed by a research team at the NYU Tandon School of Engineering, which will present it at the 2024 IEEE Visualization and Visual Analytics Conference on October 17, 2024.

“While pilot training is one potential use case, HuBar isn’t just for aviation,” explained Claudio Silva, NYU Tandon Institute Professor in the Computer Science and Engineering (CSE) Department, who led the research with collaboration from Northrop Grumman Corporation (NGC). “HuBar visualizes diverse data from AR-assisted tasks, and this comprehensive analysis leads to improved performance and learning outcomes across various complex scenarios.”


“HuBar could help improve training in surgery, military operations and industrial tasks,” added Silva, who is also the co-director of the Visualization and Data Analytics Research Center (VIDA) at NYU.

The team introduced HuBar in a paper posted to the arXiv preprint server that demonstrates its capabilities using aviation as a case study, analyzing data from multiple helicopter co-pilots in an AR flying simulation. The team also produced a video about the system.

Focusing on two pilot subjects, the system revealed striking differences: One subject maintained mostly optimal attention states with few errors, while the other experienced underload states and made frequent mistakes.

HuBar’s detailed analysis, including video footage, showed that the underperforming co-pilot often consulted a manual, indicating less familiarity with the task. Ultimately, HuBar can enable trainers to pinpoint specific areas where co-pilots struggle and understand why, providing insights to improve AR-assisted training programs.
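To make that kind of per-session summary concrete, here is a minimal Python sketch of the bookkeeping involved: the share of time spent in each attention state and the error count per procedure step. All function names, labels, and data below are hypothetical illustrations, not HuBar's actual interface.

```python
from collections import Counter

def summarize_session(workload_labels, error_events):
    """Summarize one AR training session: time share per attention
    state and error count per procedure step (illustrative only)."""
    counts = Counter(workload_labels)          # e.g. "optimal", "underload"
    total = sum(counts.values())
    state_share = {state: n / total for state, n in counts.items()}
    errors_by_step = Counter(step for step, _ in error_events)
    return state_share, errors_by_step

# Hypothetical data mirroring the contrast between the two co-pilots
pilot_a = summarize_session(
    ["optimal"] * 95 + ["underload"] * 5,
    [("checklist", "skipped item")],
)
pilot_b = summarize_session(
    ["optimal"] * 40 + ["underload"] * 60,
    [("checklist", "skipped item"),
     ("approach", "wrong altitude"),
     ("approach", "late flare")],
)
print(pilot_a)
print(pilot_b)
```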


What makes HuBar unique is its ability to analyze non-linear tasks where different step sequences can lead to success, while integrating and visualizing multiple streams of complex data simultaneously.

This includes brain activity (via functional near-infrared spectroscopy, or fNIRS), body movements (via inertial measurement units, or IMUs), gaze tracking, task procedures, errors, and mental workload classifications. HuBar’s comprehensive approach allows for a holistic analysis of performer behavior in AR-assisted tasks, enabling researchers and trainers to identify correlations between cognitive states, physical actions, and task performance across various task completion paths.
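A core difficulty those streams pose is that they arrive at different rates and must be aligned on a shared timeline before any correlation between cognitive state and behavior can be drawn. Below is a minimal sketch of that alignment step, assuming piecewise-constant labels between samples; the stream names and data are invented for illustration and do not reflect HuBar's implementation.

```python
from bisect import bisect_right

def label_at(timestamps, labels, t):
    """Return the most recent label at or before time t (piecewise-constant)."""
    i = bisect_right(timestamps, t) - 1
    return labels[i] if i >= 0 else None

def align_streams(timeline, streams):
    """Resample several timestamped label streams (fNIRS workload,
    gaze target, procedure step, ...) onto one shared timeline."""
    return [
        {name: label_at(ts, labels, t) for name, (ts, labels) in streams.items()}
        for t in timeline
    ]

# Hypothetical streams sampled at different rates
streams = {
    "workload": ([0.0, 4.0, 9.0], ["optimal", "underload", "optimal"]),
    "gaze":     ([0.0, 2.0, 6.0], ["instrument panel", "manual", "horizon"]),
    "step":     ([0.0, 5.0],      ["pre-flight check", "takeoff"]),
}
for row in align_streams([0.0, 3.0, 7.0, 10.0], streams):
    print(row)
```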

HuBar’s interactive visualization system also facilitates comparison across different sessions and performers, making it possible to discern patterns and anomalies in complex, non-sequential procedures that might otherwise go unnoticed in traditional analysis methods.

“We can now see exactly when and why a person might become mentally overloaded or dangerously underloaded during a task,” said Sonia Castelo, a research engineer and Ph.D. student in VIDA and the HuBar paper’s lead author.

“This kind of detailed analysis has never been possible before across such a wide range of applications. It’s like having X-ray vision into a person’s mind and body during a task, delivering information to tailor AR assistance systems to meet the needs of an individual user.”


As AR systems—including headsets like Microsoft HoloLens, Meta Quest and Apple Vision Pro—become more sophisticated and ubiquitous, tools like HuBar will be crucial for understanding how these technologies affect human performance and cognitive load.

“The next generation of AR training systems might adapt in real-time based on a user’s mental state,” said Joao Rulff, a Ph.D. student in VIDA who worked on the project. “HuBar is helping us understand exactly how that could work across diverse applications and complex task structures.”
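As a thought experiment, the adaptive loop Rulff describes could be as simple as a policy that maps a classified mental state to a level of guidance detail. The state labels, levels, and actions below are purely hypothetical and are not part of HuBar.

```python
def adapt_guidance(workload_state, guidance_level):
    """Adjust AR guidance detail from a classified mental state.

    A purely hypothetical policy: add help when the user is
    overloaded, withdraw scaffolding when they are underloaded.
    """
    if workload_state == "overload" and guidance_level < 3:
        return guidance_level + 1   # add step-by-step prompts
    if workload_state == "underload" and guidance_level > 0:
        return guidance_level - 1   # fade assistance to keep engagement
    return guidance_level           # "optimal": leave settings alone

level = 1
for state in ["optimal", "overload", "overload", "underload"]:
    level = adapt_guidance(state, level)
    print(state, "-> guidance level", level)
```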

More information:
Sonia Castelo et al, HuBar: A Visual Analytics Tool to Explore Human Behaviour based on fNIRS in AR guidance systems, arXiv (2024). DOI: 10.48550/arXiv.2407.12260

Journal information:
arXiv


Provided by
NYU Tandon School of Engineering


Citation:
New tool helps analyze pilot performance and mental workload in augmented reality (2024, October 15),
retrieved 16 October 2024
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
