The robots are coming to customer service, and with humanoids expected to arrive within the next year, their presence could shape the way customers treat human service employees.
In a recent study, a research team led by Professor Taeshik Gong of Hanyang University ERICA in South Korea found that customers who witness someone mistreating a service robot can either normalize bad behavior or respond with empathy. The findings were published in the Journal of Retailing and Consumer Services. Gong told CX Today:
“How customers treat robots may generalize to how they treat human staff. If disrespect toward robots becomes normalized, it risks lowering the threshold for mistreatment of employees.”
“When customers witness another person mistreating a service robot, this behavior can implicitly signal what is acceptable within that environment,” Gong added.
Past studies have explored direct interactions between humans and robots in sectors like hospitality, retail and healthcare. But there had been little research into how customers react when they see someone else mistreating a service robot, which is likely to be a growing concern as robots make their way into technology-enabled service environments.
The researchers identified two distinct types of reactions in customer service contexts.
“Some customers may imitate the incivility, whereas others may experience empathy and behave more considerately,” Gong said. “The balance between these reactions shapes the emotional climate of the service setting.”
The findings suggest that how observers react depends on factors including the robot’s humanlike design and the observer’s moral identity, and that third-party observers can play a key role in shaping norms around the treatment of customer service robots.
“Customer service leaders should recognize that the social atmosphere is co-created by customers themselves. Encouraging respectful norms toward all service agents, including robots, can help preserve a positive environment that supports satisfaction and smooth service encounters,” Gong said.
Designing Customer Service Environments to Support Humans and Robots
For organizations deploying robots alongside staff, Gong’s team sees a practical takeaway: social norms and behavior management need to extend to machines. Leaders should address behavioral expectations during staff training and customer communication, Gong said. In settings where robots increasingly serve customers, managers should design cues that discourage mistreatment.
“Establishing service scripts, signage, or onboarding explanations that reinforce respectful conduct toward all service providers can help protect the well-being of human employees and maintain professional standards in the service environment,” Gong said.
As the study showed that humanlike features in robots can affect how customers behave toward them, robotics developers can use physical cues, including expressive eyes, emotive voices, and gestures, in customer-facing robots.
This not only improves likability, but also encourages customers to treat robots with dignity and reduces the spread of abusive behavior.
“Anthropomorphic features can encourage customers to see robots as more relatable partners in interaction, which can activate empathy and prosocial responses,” Gong said.
But striking a balance will be key. “Designers need to avoid over-humanization that leads to unrealistic expectations or discomfort. A balanced approach highlights warmth and relational cues without implying full human capabilities,” Gong added. “For example, subtle facial expressions, friendly tone, and responsive feedback can foster connection without suggesting that the robot possesses deep emotional understanding.”
The study points toward a growing need for ethical and social frameworks governing how people treat robots in public and workplace settings. Gong suggests that companies can include “prosocial messaging” in customer touchpoints, reminding customers that mistreating robots has social and ethical implications, even if the robots themselves are not sentient.
The researchers also noted that there will need to be policy discussions about whether and how humanoid robots should be protected in public spaces.
“As service robots become more integrated into daily interactions, it is likely that norms and even formal codes of conduct will emerge,” Gong said. “This would parallel how societies developed etiquette around digital communication and online platforms.”
And as robots become part of the everyday service landscape, public attitudes will likely shift, Gong said.
“Clear expectations can help prevent negative behavioral spillovers, sustain positive customer experiences, and support employees working alongside robotic systems.”
As automation becomes more humanlike, the social rules of customer service will need to evolve with it.