Startup develops AI technology that makes call centre employees sound white

Startups in Silicon Valley sometimes give the impression that they are trying to employ AI in the most ominous ways possible. In June, we learned about the Google AI that was so convincing it persuaded an engineer it was sentient. This month brought Meta’s racist chatbot and an AI-generated rapper that uses the n-word. Now, the Palo Alto-based startup Sanas has unveiled an AI that aims to make foreign call center staff sound accent-neutral, with the unintended consequence of making them sound white.

As reported by SFGATE, Sanas offers “accent translation” for call center employees, a job that tends to be outsourced to cheaper foreign markets like India and the Philippines. Founded by three Stanford graduates, the startup provides a real-time accent translation service, supposedly to make it easier for call center employees to be understood, and has already received over $30 million in venture capital funding.

“We don’t want to say that accents are a problem because you have one,” Sanas president Marty Sarim told SFGATE. “They’re only a problem because they cause bias and they cause misunderstandings.”

Based on the demo on Sanas’ website, where you can “hear the magic,” it really does work. The software not only removes the accent but replaces the voice with an unsettlingly robotic approximation of a standard American English accent. According to its website, Sanas believes this will allow call center employees to “take back the power of their own voice.”

Sanas’ AI has commonly been compared to the 2018 film Sorry to Bother You, in which the main character, a Black man, adopts a “white voice” in order to garner more sales at his dystopian call center job. While Sanas states that its AI is meant to combat bias, critics assert that “accent translation” is another way to dehumanize an already dehumanizing job.

“On the surface it reflects communication difficulty — people not being able to understand someone else’s speech,” Winifred Poster, a professor of sociology at Washington University in St. Louis, told SFGATE. “But, really, it’s coded for a whole bunch of other issues about how accent triggers racism and ethnocentrism.”

The fact that the Sanas AI doesn’t sound human doesn’t help much either. Kiran Mirchandani of the University of Toronto, whose research focuses on the treatment of Indian call center employees, told SFGATE that people who are already predisposed to dishing out racist abuse to call center workers won’t take kindly to a robotic voice on the phone either.

“Customer racism is likely to increase if workers are further dehumanized when an ‘app’ is placed between worker and customer, especially since there will no doubt be errors made by the app,” she told SFGATE.

Sanas’ president Sarim stressed in his interview with SFGATE that workers will have a choice about whether or not to use the AI’s accent translation. However, those familiar with the exploitation within the foreign call center industry believe that if the tech proves successful, workers won’t have much of a choice.

“There is virtually nothing in the labor process of call centers which involves choice by the workers in terms of technology,” Poster told SFGATE. “Already, workers are subject to deeply invasive surveillance, which makes it almost impossible to have an authentic conversation with people on the other end.”
