According to a recent report, Microsoft employees have access to private user conversations conducted via its Bing chatbot, which is built on the same technology as OpenAI's ChatGPT. Users turn to the chatbot for conversation, support, and other features.
The company has acknowledged that some of its employees are able to read users' conversations, raising serious privacy concerns. It said human reviewers monitor what users submit to the chatbot in order to respond to “inappropriate behavior”.
According to the report, a small number of Microsoft employees have been given access to transcripts of conversations with the chatbot, meaning they could potentially read users' private messages. Last week, Microsoft added two notes to its privacy statement to clarify that user interactions with its chatbots are collected and can be reviewed and analyzed by humans.
Microsoft said that Bing data is protected by stripping personal information from it and that only certain employees can access the chats. In a statement, the company said:
“Microsoft is committed to protecting user privacy, and data is protected through agreed industry best practices including pseudonymization, encryption at rest, secured and approved data access management, and data retention procedures. In all cases access to user data is limited to Microsoft employees with a verified business need only, and not with any third parties.”
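Microsoft has not detailed how its pseudonymization works, but the general technique it names is straightforward: direct identifiers in a transcript are replaced with irreversible tokens before a reviewer sees the text. Below is a minimal illustrative sketch in Python, assuming a simple regex-based scrubber and a keyed hash; the patterns, key, and function names are hypothetical and are not Microsoft's implementation.

```python
import hmac
import hashlib
import re

# Hypothetical illustration of pseudonymization: replace direct identifiers
# with irreversible, keyed tokens before a human reviewer sees a transcript.

SECRET_KEY = b"rotate-me-regularly"  # placeholder; a real system would use a managed secret

# Simple patterns for two common identifier types. Real systems use far
# more sophisticated PII detection than a pair of regexes.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def pseudonymize(text: str) -> str:
    """Replace each detected identifier with a stable keyed-hash token."""
    for label, pattern in PATTERNS.items():
        def replace(match: re.Match) -> str:
            # HMAC-SHA256 makes the token irreversible without the key,
            # but stable, so the same identifier always maps to the same token.
            digest = hmac.new(SECRET_KEY, match.group().encode(), hashlib.sha256)
            return f"[{label}:{digest.hexdigest()[:12]}]"
        text = pattern.sub(replace, text)
    return text

if __name__ == "__main__":
    chat_line = "Contact me at jane.doe@example.com or +1 (555) 123-4567."
    print(pseudonymize(chat_line))
    # e.g. "Contact me at [EMAIL:3f1a...] or [PHONE:9c0d...]."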
The sources for this piece include an article in The Telegraph.