Japanese telecommunications giant SoftBank recently announced that it has been developing “emotion-canceling” technology powered by AI that will alter the voices of angry customers to sound calmer during phone calls with customer service representatives. The project, which has been in development for three years, aims to reduce the psychological burden on operators suffering from harassment. SoftBank plans to launch it by March 2026, but the idea is receiving mixed reactions online.
According to a report from the Japanese news site The Asahi Shimbun, SoftBank’s project relies on an AI model to alter the tone and pitch of a customer’s voice in real time during a phone call. SoftBank’s developers, led by employee Toshiyuki Nakatani, trained the system using a dataset of over 10,000 voice samples, which were performed by 10 Japanese actors expressing more than 100 phrases with various emotions, including yelling and accusatory tones.
Voice cloning and synthesis technology has made massive strides in the past three years. We’ve previously covered technology from Microsoft that can clone a voice with a three-second audio sample and audio-processing technology from Adobe that cleans up audio by re-synthesizing a person’s voice, so SoftBank’s technology is well within the realm of plausibility.
By analyzing the voice samples, SoftBank’s AI model has reportedly learned to recognize and modify the vocal characteristics associated with anger and hostility. When a customer speaks to a call center operator, the model processes the incoming audio and adjusts the pitch and inflection of the customer’s voice to make it sound calmer and less threatening.
For example, a high-pitched, resonant voice may be lowered in tone, while a deep male voice may be raised to a higher pitch. The technology reportedly does not alter the content or wording of the customer’s speech, and it retains a slight element of audible anger to ensure that the operator can still gauge the customer’s emotional state. The AI model also monitors the length and content of the conversation, sending a warning message if it determines that the interaction is too long or abusive.
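SoftBank has not published implementation details, but the pitch adjustment described above can be illustrated with a toy example. The sketch below is purely hypothetical (the function names and the naive linear-resampling approach are my own assumptions; a production system would use something like a phase vocoder or neural vocoder so that timing and wording are preserved, as SoftBank's reportedly does). It lowers the pitch of a synthetic 220 Hz tone by four semitones:

```python
import math

def shift_pitch(samples, semitones):
    """Crude pitch shift via linear-interpolation resampling.
    Note: this also stretches duration, which a real-time
    voice changer would avoid with a phase-vocoder approach."""
    factor = 2 ** (semitones / 12)  # frequency ratio per semitone
    n = int(len(samples) / factor)
    out = []
    for i in range(n):
        pos = i * factor
        lo = int(pos)
        hi = min(lo + 1, len(samples) - 1)
        frac = pos - lo
        # Linearly interpolate between neighboring samples
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

def zero_crossing_freq(samples, rate):
    """Estimate fundamental frequency by counting
    positive-going zero crossings."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
    return crossings * rate / len(samples)

rate = 16000  # samples per second
# One second of a 220 Hz sine tone standing in for an angry voice
tone = [math.sin(2 * math.pi * 220 * i / rate) for i in range(rate)]
# Lower the pitch by 4 semitones to sound "calmer"
calmer = shift_pitch(tone, -4)
```

Running `zero_crossing_freq` on the result shows the fundamental dropping from roughly 220 Hz to about 175 Hz (a factor of 2^(-4/12)), while the waveform's content is otherwise untouched, loosely mirroring the article's description of lowering a high-pitched voice without altering the words spoken.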
The tech has been developed through SoftBank’s in-house program called “SoftBank Innoventure” in conjunction with The Institute for AI and Beyond, a joint AI research institute established with The University of Tokyo.
Harassment a persistent problem
According to SoftBank, Japan’s service sector is grappling with the issue of “kasu-hara,” or customer harassment, where workers face aggressive behavior or unreasonable requests from customers. In response, the Japanese government and businesses are reportedly exploring ways to protect employees from the abuse.
The problem isn’t unique to Japan. In a Reddit thread on SoftBank’s AI plans, call center operators from other regions related many stories about the stress of dealing with customer harassment. “I’ve worked in a call center for a long time. People need to realize that screaming at call center agents will get you nowhere,” wrote one person.
A 2021 ProPublica report tells horror stories from call center operators who are trained not to hang up no matter how abusive or emotionally degrading a call gets. The publication quoted Skype customer service contractor Christine Stewart as saying, “One person called me the C-word. I’d call my supervisor. They’d say, ‘Calm them down.’ … They’d always try to push me to stay on the call and calm the customer down myself. I wasn’t getting paid enough to do that. When you have a customer sitting there and saying you’re worthless… you’re supposed to ‘de-escalate.'”
But verbally de-escalating an angry customer is difficult, according to Reddit poster BenCelotil, who wrote, “As someone who has worked in several call centers, let me just point out that there is no way faster to escalate a call than to try and calm the person down. If the angry person on the other end of the call thinks you’re just trying to placate and push them off somewhere else, they’re only getting more pissed.”
Ignoring reality using AI
Harassment of call center workers is a very real problem, but now that AI has been introduced as a possible solution, some people wonder whether it’s a good idea to filter emotional reality on demand through voice synthesis. As some social media commenters note, the technology may be a case of treating the symptom rather than the root cause of the anger.
“This is like the worst possible solution to the problem,” wrote one Redditor in the thread mentioned above. “Reminds me of when all the workers at Apple’s China factory started jumping out of windows due to working conditions, so the ‘solution’ was to put nets around the building.”
SoftBank expects to introduce its emotion-canceling solution within fiscal year 2025, which ends on March 31, 2026. By reducing the psychological burden on call center operators, SoftBank says it hopes to create a safer work environment that enables employees to provide even better services to customers.
Even so, ignoring customer anger could backfire in the long run when the anger is sometimes a legitimate response to poor business practices. As one Redditor wrote, “If you have so many angry customers that it is affecting the mental health of your call center operators, then maybe address the reasons you have so many irate customers instead of just pretending that they’re not angry.”
https://arstechnica.com/?p=2032137