Privacy Concerns Emerge As Slack Reportedly Uses User Data For AI/ML Without Consent
Slack reportedly uses customer messages, files, and other content, without explicit consent, to develop non-generative artificial intelligence/machine learning (AI/ML) models for features such as emoji and channel recommendations.
On May 17, a user posted on the Hacker News developer community forum, revealing that Slack uses customer data to train its services.
The user mentioned that upon opting out, they were informed that Slack does not use its models “in such a way that they could learn, memorise, or be able to reproduce some part of customer data”.
However, the training process itself could still expose sensitive information to those involved in training the models.
The primary concern for most users on the community forum (and later on X as the news spread) was that they were opted in to data sharing by default.
Furthermore, it is unclear whether opting out excludes information a user previously shared on the platform from the model training data.