How does a robot perform as a boss at work? Research by Polish scientists, published in Cognition, Technology & Work, suggests that while robots can command obedience, they are less effective at it than humans: obedience towards them is generally lower than towards human authority figures, and work efficiency under a robot's supervision is lower as well.
"For employers and HR departments, this means there is a need to take the psychological aspects of implementing robots in the work environment into account, including how they are perceived as authority figures, trust in them, and potential resistance to following their orders," says Konrad Maj, Ph.D., a psychologist at SWPS University and head of the HumanTech Center for Social and Technological Innovation.
Robot as an authority figure
The development of robotics has led to a situation in which robots increasingly occupy roles associated with authority, e.g. in education, health care or law enforcement. The researchers were intrigued by the extent to which society would accept robots as authority figures. Their study shows that people demonstrate a significant level of obedience towards humanoid robots acting as authority figures, although it is somewhat lower than towards humans (63% vs. 75%).
As the experiment showed, people's motivation may drop when machines supervise their work: in the researchers' studies, participants performed their assigned tasks more slowly and less effectively under the supervision of a robot. "This means that automation does not necessarily increase efficiency if it is not properly planned from a psychological point of view," Maj believes.
Course of the study
The study was carried out in the SWPS University laboratory by scientists from this university: Konrad Maj, Ph.D., Tomasz Grzyb, Ph.D., a professor at SWPS University, Professor Dariusz Doliński and Magda Franjo. Participants were invited to the laboratory and randomly assigned to one of two study groups: with the Pepper robot or with a human acting as an experimenter.
The task was to change the extensions of computer files. If the participant showed signs of reluctance to continue (e.g., a pause in work lasting more than 10 seconds), the robot or the experimenter used verbal encouragement.
The average time to change the extension of one file was 23 seconds under human supervision, but increased to 82 seconds in the groups supervised by a robot. The average number of files changed was 355 under human supervision versus 224 under robot supervision, nearly 37% fewer.
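The reported 37% figure is derived from the two averages given above; a quick sanity check (using only the numbers stated in the article) confirms it:

```python
# Sanity check of the efficiency figures reported in the study.
# The file counts are taken from the article; the percentage drop is derived.
human_files = 355  # average files renamed under human supervision
robot_files = 224  # average files renamed under robot supervision

drop = (human_files - robot_files) / human_files
print(f"Efficiency drop under robot supervision: {drop:.0%}")  # prints "37%"
```

The same check applied to the per-file times (23 s vs. 82 s) shows work slowing down by a factor of roughly 3.6 under robot supervision.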
Human-robot relations
The experiments indicate the complexity of human-robot interactions and the growing role of robots in society. Studies show that anthropomorphic features of robots affect the level of trust and obedience. Robots that are more human-like are perceived as more competent and trustworthy. On the other hand, too much anthropomorphization can cause the uncanny valley effect, which results in lower trust and comfort in the interaction.
Maj points out that there are several explanations for this phenomenon. If a machine has clearly human features but still exhibits various imperfections, this creates a cognitive conflict: people are at a loss as to how to treat it or behave towards it.
One can also speak of a conflict of emotions: fascination and admiration mixed with disappointment and fear. Supporters of the evolutionary explanation, on the other hand, argue that humans are programmed to avoid pathogens and threats, and a robot that imitates a human but does so imperfectly may register as a threat, because it looks like someone sick, disturbed or unbalanced.
At the same time, giving a robot certain human features can facilitate cooperation with the machine; after all, we are used to working with humans. A robot that looks and communicates like a human simply becomes easy for us to use. But there is also a dark side: if people create robots that are very similar to humans, they may stop seeing boundaries. People could start to befriend them, demand rights for them, and perhaps even marry them in the future.
"In the long run, humanoid robots may create a rift between people. There will also be more misunderstandings and aversion, because robots owned at home will be personalized, always available, empathetic in communication, and understanding. People are not so well-matched," Konrad Maj points out.
More information:
Konrad Maj et al, Comparing obedience and efficiency in tedious task performance under human and humanoid robot supervision, Cognition, Technology & Work (2025). DOI: 10.1007/s10111-024-00787-1
Provided by SWPS University
Citation: When a robot becomes the boss: Exploring authority, obedience and relationships with machines (2025, March 14), retrieved 15 March 2025