Trust is not the answer: Rethinking human-machine interaction for ethical engineering


In his article, “Trust is Not a Virtue: Why We Should Not Trust Trust,” Matthew L. Bolton, an associate professor of systems and information engineering at the University of Virginia, challenges the emphasis on fostering trust in human-machine interactions. He focuses particularly on systems using AI, machine learning and automation.

The article is published in the journal Ergonomics in Design: The Quarterly of Human Factors Applications.

Bolton argues that while trust is often seen as essential for the adoption of new technologies, it is a problematic and imprecise concept. Trust is difficult to define, highly contextual, and frequently conflated with related concepts such as confidence and perceived risk, making it neither selective nor diagnostic as a measure of human behavior.


Bolton contends that the focus on building trust in technology may actually undermine sound human factors engineering. Instead of pursuing trust as a goal, engineers should focus on objective measures of system reliability, transparency, and usability—elements that directly impact human experience and performance.

Trust, Bolton asserts, is not inherently humanistic and can be manipulated to disenfranchise users, reducing autonomy rather than enhancing it. This manipulation often serves the interests of large organizations seeking to bypass the need for reliability by encouraging blind trust in their technologies.

“There is a contradiction at the heart of trust research,” Bolton says. “We include humans in systems because they bring experience, expertise, instincts, and creativity that improves performance and makes systems resilient… we rely on them to decide when, why, and how to trust a system. If engineers manipulate people into behaving the way they (or others) want, we lose the benefit of having human operators.”


Ultimately, Bolton calls for a shift away from trust-centric research in favor of more concrete and ethical approaches to system design.

He emphasizes that engineers should prioritize developing technologies that empower users with transparent, reliable, and human-centered designs, rather than relying on trust as a justification for adoption. This approach, he argues, would lead to safer, more ethical, and effective human-machine interactions.


More information:
Matthew L. Bolton, Trust is Not a Virtue: Why We Should Not Trust Trust, Ergonomics in Design: The Quarterly of Human Factors Applications (2022). DOI: 10.1177/10648046221130171

Provided by University of Virginia


Citation: Trust is not the answer: Rethinking human-machine interaction for ethical engineering (2024, October 17), retrieved 20 October 2024.

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
