Amazon Alexa: thousands of employees listen to user conversations

in #technology · 5 years ago

If you thought you were the only one who knew the topics of your conversations with Alexa, you were wrong. According to a recent Bloomberg report, Amazon employs thousands of workers around the world who listen to excerpts of exchanges between users and the virtual assistant, with the aim of continuously improving the quality of its answers.

The company employs staff "from Boston to Costa Rica, in India as well as in Romania" to listen to voice recordings made by users of Alexa on Echo-family speakers. These workers listen to the clips and transcribe them into text before handing everything to software that analyzes the audio streams and annotations, with the ultimate aim of improving the assistant's abilities.
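In practical terms, each reviewed clip becomes a labeled example that a speech recognition system can learn from. The Python sketch below is only a minimal illustration of that idea: the file names, fields, and manifest format are assumptions made for illustration, not Amazon's actual tooling.

```python
import json
from dataclasses import dataclass, asdict, field

@dataclass
class AnnotatedClip:
    """One reviewed audio excerpt: the recording plus the human transcript and tags."""
    audio_path: str                      # path to the recorded utterance (hypothetical)
    transcript: str                      # text produced by the human reviewer
    labels: list = field(default_factory=list)  # optional annotations, e.g. perceived intent

# Hypothetical examples of the kind of (audio, transcript) pairs reviewers produce.
clips = [
    AnnotatedClip("clips/utt_0001.wav", "play some jazz in the kitchen", ["intent:play_music"]),
    AnnotatedClip("clips/utt_0002.wav", "what's the weather tomorrow", ["intent:get_weather"]),
]

# Write a JSON-lines manifest; supervised speech-recognition training pipelines
# commonly consume labeled data as one record per line in a format like this.
with open("train_manifest.jsonl", "w", encoding="utf-8") as f:
    for clip in clips:
        f.write(json.dumps(asdict(clip)) + "\n")

print(f"Wrote {len(clips)} labeled examples for supervised training.")
```

The point of the sketch is simply that human transcription turns raw audio into supervised training data; the downstream model and training procedure are outside the scope of the report.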

Bloomberg interviewed seven employees who worked on the program, and each of them admitted to having signed a non-disclosure agreement with the American company not to publicly reveal the human process behind the improvement work. Amazon states on its official website that it uses customer requests to train its voice recognition systems and improve its natural language understanding systems.

Until now, however, it had generally been assumed that only computers, and not human beings, were behind the process. Two employees interviewed in Bucharest said they handled about 1,000 audio clips during a typical nine-hour shift, in a job they described as "mostly trivial". Clips were sometimes shared in order to work together on solving a problem, but amusing recordings were occasionally shared purely for entertainment.

In some cases the "reviewers" have come across recordings of what appeared to be crimes in progress, but they are not allowed to get directly involved or intervene. Amazon Alexa launched with the first Echo in 2014, and Google, Apple and others subsequently jumped on the bandwagon of the new market segment with similar technologies. It is not clear, however, whether other companies use the same methods to improve the accuracy of their assistants' responses, or whether human beings are part of their process.

Virtual assistants are still in the early stages of development and struggle, for example, to recognize accents, dialects, and distorted speech. These problems have pushed development teams to devise new methods to improve overall performance, but the new Bloomberg report will probably lead consumer groups to demand more transparency about how this category of software is developed.

Amazon, unsurprisingly, commented on Bloomberg's findings: "We take the security and privacy of our customers' personal information very seriously. We only annotate an extremely small sample of Alexa voice recordings in order to improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand users' requests and ensure the service works well for everyone."

In addition, Amazon stated that it has implemented "very restrictive technical and operational rules" and that it applies "zero tolerance for any abuse of the system".
