LinkedIn users in the U.S. — but not the EU, EEA, or Switzerland, likely due to those regions’ data privacy rules — have a toggle in their settings screen disclosing that LinkedIn scrapes personal data to train “content creation AI models,” and allowing them to opt out. The toggle isn’t new. But, as first reported by 404 Media, LinkedIn initially didn’t refresh its privacy policy to reflect the data use.

The terms of service have since been updated, but such updates ordinarily happen well before a major change like repurposing user data — the idea being that users get a chance to adjust their accounts, or leave the platform, if they object. Not this time, it seems.

To opt out of LinkedIn’s data scraping, head to the “Data Privacy” section of the LinkedIn settings menu on desktop, click “Data for Generative AI improvement,” then toggle off the “Use my data for training content creation AI models” option. You can also attempt to opt out more comprehensively via this form, but LinkedIn notes that any opt-out won’t affect training that’s already taken place.

The nonprofit Open Rights Group (ORG) has called on the Information Commissioner’s Office (ICO), the U.K.’s independent regulator for data protection rights, to investigate LinkedIn and other social networks that train on user data by default.

“LinkedIn is the latest social media company found to be processing our data without asking for consent,” Mariano delli Santi, ORG’s legal and policy officer, said in a statement. “The opt-out model proves once again to be wholly inadequate to protect our rights: the public cannot be expected to monitor and chase every single online company that decides to use our data to train AI. Opt-in consent isn’t only legally mandated, but a common-sense requirement.”

  • circuitfarmer@lemmy.sdf.org · 3 months ago

    This is one effect of a general lack of real consequences for corporations and those that run them.

    The company has already estimated its likely fine for getting caught doing something egregious. The profit from being early to market is significant, and as long as it considerably exceeds the likely fine, they go for it. The expected real earnings are simply the profit minus the fine. It’s made worse by the fact that the fine is so often negligible compared to the profit, since the numbers these companies deal with are so damn big.
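    The incentive math described above can be sketched in a few lines of Python. Every figure here is invented purely for illustration — neither the article nor the comment gives real numbers:

    ```python
    # Hypothetical back-of-envelope: does breaking the rule pay off?
    # All figures below are made up for illustration.
    profit_from_early_market = 500_000_000  # revenue gained by moving first
    likely_fine = 20_000_000                # regulator's typical penalty
    probability_of_fine = 0.5               # chance the violation is punished at all

    # Expected cost of getting caught, discounted by enforcement probability
    expected_cost = likely_fine * probability_of_fine

    # "Expected real earnings are the profit minus the fine"
    expected_real_earnings = profit_from_early_market - expected_cost

    # The cycle continues whenever this stays positive
    worth_breaking_rules = expected_real_earnings > 0
    ```

    With those (hypothetical) numbers, the expected cost is a rounding error next to the profit, which is the commenter’s point: a fine priced that low acts as a cost of doing business, not a deterrent.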

    This is why you won’t see real change until we stop slapping corporations with fines and start slapping executives with jail time. That is literally the only way to break the cycle.