Slack trains machine learning models on user messages, files, and other content without explicit permission. The training is enabled by default, meaning your private data is used unless someone intervenes. Worse, you cannot opt out yourself: you have to ask your Slack admin (human resources, IT, etc.) to email the company and request that your organization’s data be excluded from training. Welcome to the dark side of the new golden age of AI data training.
Corey Quinn of DuckBill Group noticed this policy in Slack’s Privacy Policy and shared it on the X platform (via PCMag). “To develop AI/ML models, our systems analyze Customer Data (for example, messages, content, and files) and Other Information (including usage information) sent to Slack, as described in our Privacy Policy and your customer agreement,” the relevant section states.
The opt-out process puts the burden of protecting your data entirely on you. According to the privacy statement, “To opt out, please have your Organization or Workspace Owners or Primary Owner contact our Customer Experience team at feedback@slack.com with the Workspace/Organization URL and subject line ‘Slack Global model opt-out request.’ We will process your request and respond once the deactivation is complete.”
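For an admin responsible for several workspaces, the request email could even be drafted programmatically. A minimal sketch using Python’s standard library — the recipient address and subject line are quoted from Slack’s privacy statement, while the sender, workspace URL, and body wording are purely illustrative:

```python
from email.message import EmailMessage

def draft_opt_out_email(workspace_url: str, sender: str) -> EmailMessage:
    """Draft the opt-out email described in Slack's privacy statement.

    The recipient and subject line come from the policy text; the body
    is illustrative wording, not an official template.
    """
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = "feedback@slack.com"
    msg["Subject"] = "Slack Global model opt-out request"
    msg.set_content(
        "Please opt out the following workspace from global model "
        f"training: {workspace_url}"
    )
    return msg

# Hypothetical workspace URL and sender address:
draft = draft_opt_out_email("https://example.slack.com", "owner@example.com")
print(draft["Subject"])  # → Slack Global model opt-out request
```

Sending the message (e.g. via `smtplib`) is left out; the point is only that the required fields are few and fixed.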
The company responded to Quinn’s post on the platform.
It’s unclear when the Salesforce-owned company added this language to its terms. Saying that customers can opt out is misleading at best, since “customers” here does not mean the employees inside an organization. Individual employees can only hope that whoever manages Slack access at their workplace submits the request on their behalf.
“I’m sorry Slack, you’re doing fucking WHAT with user DMs, messages, files, etc? I’m positive I’m not reading this correctly.”
— Corey Quinn (@QuinnyPig), May 16, 2024
Inconsistencies in Slack’s privacy policies add to the confusion. “When developing AI/ML models or analyzing Customer Data, Slack cannot access underlying content. We have various technical measures in place that prevent this from happening,” one section states. The machine learning training policy, however, appears to contradict that assurance.
Additionally, on the page where Slack markets its premium generative AI tools, it says, “Work worry-free. Your data is your data. We don’t use it to train Slack AI. Everything runs on Slack’s secure infrastructure and meets the same compliance standards as Slack itself.”
Here, however, the company is talking only about its premium generative AI tools, while it trains machine learning models on customer data without explicit permission. As PCMag points out, implying that all of your data is safe from AI training is misleading at best when the company gets to decide which AI models that promise applies to.
Source link: https://www.teknolojioku.com/guncel/slack-mesajlarinizi-ai-modelleri-icin-tariyor-6655ca5de90620b5f1024b48