Microsoft says Skype audio is now reviewed in ‘secure facilities’ after a worrying report
A former contractor says there was little security to protect recordings of customers’ calls

Microsoft says Skype calls are now transcribed in “secure facilities in a small number of countries,” following a new report in The Guardian about the company’s use of contractors in China to listen to some calls to make sure the company’s transcription software is working properly. The company confirmed to The Verge that China is not currently one of the countries where transcription takes place.

A former contractor who lived in Beijing says he transcribed Skype calls with little cybersecurity protection from potential state interference. The unnamed contractor said he reviewed thousands of audio recordings from Skype and Cortana on his personal laptop from his home in Beijing over a two-year period.

Workers who were part of the review process accessed the recordings via a web app in a Chrome browser over the internet in China. There was little vetting of employees and no security measures in place to protect the audio recordings from state or criminal interference.

The contractor said he heard “all kinds of unusual conversations” while performing the transcription. “It sounds a bit crazy now, after educating myself on computer security, that they gave me the URL, a username and password sent over email.”

A Microsoft spokesperson says that “If there is questionable behavior or possible violation by one of our suppliers, we investigate and take action.” The audio “snippets” that contractors get to review are ten seconds long or shorter, according to the spokesperson, “and no one reviewing these snippets would have access to longer conversations.”

“We’ve always disclosed this to customers and operate to the highest privacy standards set out in laws like Europe’s GDPR,” the spokesperson added.

The existence of the Skype transcription program was first detailed in a report from Motherboard in August. Although Skype’s terms of service indicated at the time that the company analyzed call audio, this was the first report showing how much of the analysis was done by humans. And unlike competitors that publicly declared they would end the practice of having humans transcribe audio from virtual assistants, Microsoft continued the practice, apparently updating its privacy policy to acknowledge it was doing so.

Microsoft says it reviewed its processes and communications with customers over the summer. “As a result, we’ve updated our privacy statement to be even more clear about this work, and since then we’ve significantly enhanced the process including by moving these reviews to secure facilities in a small number of countries,” the company said. “We will continue to take steps to give customers greater transparency and control over how we manage their data.”

Microsoft did not elaborate on what these “steps” entailed.

Microsoft is not the only company to face blowback for how it’s handled audio recordings of customers. The practice of data annotation, where humans help AI learn by interpreting audio and other information, has come under intense scrutiny as people weigh the convenience of on-demand answers from virtual assistants against the discomfort of relinquishing chunks of their private lives, often to people they didn’t know were listening.

An April report highlighted how Amazon used full-time employees and contractors to “listen” to customers’ conversations with Alexa. The report found the company wasn’t clear about how long such recordings were stored, or whether employees or even third parties had accessed, or could access, the information for nefarious purposes. And both Apple and Google reportedly suspended their programs that used humans to review audio recordings from their Siri and Assistant virtual assistants.

Here’s how to prevent audio assistants from retaining audio recordings.
How to ask Google to delete conversations you didn’t want it to hear

Talking with Google Assistant can be a handy way to get answers to questions, run your smart home, or ask for the weather. But every once in a while, the Assistant might reply to a question you didn’t ask, or it might accidentally record a snippet of a conversation you didn’t want it to hear.

Pretty much everything you do while signed into your Google account on a device is stored by Google, so it’s understandable that you might not want Google to have recordings of conversations you’re not explicitly asking for Assistant’s help with. Especially after Amazon, Apple, Google, and Microsoft all got caught earlier this year using contractors to listen to voice recordings captured by their virtual assistants, some of which were captured accidentally and involved sensitive private information.

But there’s a simple command to have Google delete the last thing you said to it: just say “Hey Google, that wasn’t for you,” and Google Assistant will delete your last command, as noted by Droid Life. I tested it, and it worked as expected: I asked Google Assistant what the weather was, and the query showed up on the My Activity page for my account. When I followed up by saying that recording wasn’t for Assistant, my question about the weather disappeared from the page.

According to a Google support page, you can also say the following phrases to delete recordings over certain time periods:

“Hey Google, delete my last conversation.”
“Hey Google, delete today’s activity.”
“Hey Google, delete this week’s activity.”
And if you want to delete individual recordings, delete everything you’ve ever asked Google Assistant, or have Google automatically delete recordings after three or 18 months, you can do so from the Google Assistant Activity dashboard.

Similarly, you can ask Amazon’s Alexa to delete recordings of your voice, though you have to open the settings in the Alexa app and turn on “Enable Deletion by Voice” to do so. Once that’s on, you can say “Alexa, delete what I just said” to delete the last thing you asked it or “Alexa, delete everything I said today” to delete recordings from that day. You can also delete recordings individually, in batches, or set up automatic deletions using the Alexa app or on your Alexa Privacy page.