
Introduction

We all know the frustration of not getting our queries dealt with by customer service, or of being kept waiting too long. Some people take it out on the person handling the issue. Imagine being that person.

AI to the Rescue

Now Japanese telecoms giant SoftBank has created an artificial intelligence (AI) filter that masks angry customers’ voices with a softer tone, to ease pressure on staff. In Japan, customer harassment, or kasu-hara, has increasingly become a problem in the workplace alongside power harassment and sexual harassment.

According to a 2024 survey of about 30,000 staff in service and other sectors, conducted by Japan’s biggest union, UA Zensen, 46.8 per cent said they had experienced customer anger or intimidation in the past two years.

A survey found that almost half of call centre staff in Japan have faced some form of customer abuse. Photo: Shutterstock

Incidents included abusive language, repetitive complaints, threats and unreasonable demands for apologies.

How Does It Work?

SoftBank has been working with the University of Tokyo on an AI filter that identifies angry customers’ voices and softens them into a less aggressive tone. The company announced the new technology on April 15.

In the product’s demonstration video, a male customer’s angry voice was transformed into one that a Japanese news anchor described as sounding like “an anime dubbing artist”.

The technology is expected to reduce the toll on customer service staff’s mental health and help them stay in their jobs.
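SoftBank has not published how its filter works, so the sketch below is purely illustrative: the function names, threshold and signal-processing choices are hypothetical. It shows the general two-stage idea described above, in Python: first detect signs of anger in the caller’s audio, then soften the voice before it reaches the agent.

```python
# Hypothetical sketch of an "anger-softening" call filter.
# SoftBank's actual method is not public; thresholds and functions are assumptions.

import numpy as np


def sounds_angry(audio: np.ndarray, rms_threshold: float = 0.3) -> bool:
    """Crude anger proxy: loud, high-energy speech has a high RMS level."""
    rms = np.sqrt(np.mean(audio ** 2))
    return rms > rms_threshold


def soften_voice(audio: np.ndarray) -> np.ndarray:
    """Soften the signal: lower the volume and smooth harsh peaks
    with a simple 5-tap moving-average low-pass filter."""
    quieter = audio * 0.6
    kernel = np.ones(5) / 5
    return np.convolve(quieter, kernel, mode="same")


def filter_call(audio: np.ndarray) -> np.ndarray:
    """Apply the softening stage only when the anger detector fires."""
    return soften_voice(audio) if sounds_angry(audio) else audio


if __name__ == "__main__":
    sr = 16_000
    t = np.linspace(0, 1, sr, endpoint=False)
    shouting = 0.8 * np.sin(2 * np.pi * 220 * t)  # loud synthetic "voice"
    processed = filter_call(shouting)
    print(f"peak before: {np.abs(shouting).max():.2f}, after: {np.abs(processed).max():.2f}")
```

A production system would of course rely on trained speech-emotion and voice-conversion models rather than simple energy thresholds and filtering, but the control flow, detect then transform, is the same.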

Changing Workplace Culture

In Japan, it is traditionally seen as a virtue to be deferential to superiors and customers at work. However, attitudes have gradually shifted in recent years.

In 2022, Japan’s Ministry of Health, Labour and Welfare published a manual urging companies to tackle customer harassment. Some service providers, such as ANA Holdings, the parent of All Nippon Airways, and West Japan Railway, had already unveiled policies on customer harassment.

West Japan Railway told staff they could stop selling products or providing services to customers who verbally or physically abuse them. Lawyers could also become involved to help employees take legal action against customers.

SoftBank is likely to begin using its AI filter in 2025.

The artificial intelligence voice filter has received widespread support. Photo: Shutterstock

Public Reaction

The technology has received widespread support online.

“It is really good to have such technology. However, people should learn to control their temper when talking to customer service staff,” one person said on YouTube.

“It would also be good if AI altered the staff’s voices to sound like an intimidating gangster,” another joked.

A third person said a filter was unnecessary: “The AI should just cut off the call when recognising an angry voice.”
