If you’re one of LinkedIn’s nearly 930 million users, your personal data is likely being used for something you didn’t agree to: training AI models. With little public notice, LinkedIn updated its privacy policy to include language allowing it to use your personal data to "develop and train artificial intelligence (AI) models." As The Verge flagged last week, you can opt out of future data usage, but anything already used for training remains in LinkedIn’s systems. Users started noticing the change last Wednesday, but I’m not sure when LinkedIn actually started the practice.
This growing trend is part of a larger conversation about data privacy in tech. As highlighted in last week’s Federal Trade Commission (FTC) report, platforms like YouTube, Meta, and LinkedIn’s parent company, Microsoft, are being widely criticized for failing to prioritize user privacy while aggressively mining data for AI projects.
FTC Chair Lina Khan called the findings “especially troubling,” noting that companies are scooping up massive amounts of personal data, including from children, without offering meaningful control to users.
Data privacy matters
As AI tools become more sophisticated, the amount of data needed to train them will only increase. The concern isn’t just that companies are using your data; it’s that you never consented to this use. Most of us didn’t sign up for LinkedIn knowing that our profiles and activity could one day help build AI models.
When companies bury data privacy controls deep in menus or require multiple steps to opt out, it raises questions about how much control users truly have over their own information.
If data privacy is important to you, now’s the time to review your settings across all platforms—and not just once. As AI technology develops, how platforms use our data could change too.
Here's how to opt out of LinkedIn's AI training
1. Click your profile picture in the top-right corner and select Settings & Privacy.
2. From the left-hand navigation, select Data Privacy.
3. Choose Data for Generative AI Improvement.
4. Toggle the setting to Off.
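If you’d rather script the toggle than click through menus, here’s a minimal Playwright (Python) sketch of the steps above. The settings URL, the on-page labels, and the saved auth.json login session are assumptions inferred from the manual steps, not a documented LinkedIn API, so verify each locator against the live settings page before relying on it (and keep in mind that automated browsing may conflict with LinkedIn’s user agreement).

```python
# Sketch: flip LinkedIn's "Data for Generative AI Improvement" toggle off
# using Playwright. URL and labels below are assumptions, not documented APIs.
from playwright.sync_api import sync_playwright

SETTINGS_URL = "https://www.linkedin.com/mypreferences/d/settings"  # assumed path

with sync_playwright() as p:
    browser = p.chromium.launch(headless=False)
    # Reuse an already-authenticated session saved to auth.json, e.g. created
    # beforehand with: playwright codegen --save-storage=auth.json linkedin.com
    context = browser.new_context(storage_state="auth.json")
    page = context.new_page()

    page.goto(SETTINGS_URL)

    # Navigate to the Data Privacy section, then the AI-training setting.
    # Labels mirror the manual steps and may change without notice.
    page.get_by_text("Data privacy").first.click()
    page.get_by_text("Data for Generative AI Improvement").first.click()

    # Assume the control is exposed as an ARIA switch; turn it off only if on.
    toggle = page.get_by_role("switch").first
    if toggle.get_attribute("aria-checked") == "true":
        toggle.click()

    context.close()
    browser.close()
```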
LinkedIn’s AI isn’t limited to content creation; the company also uses data for personalization and moderation, which means you’ll need to fill out the LinkedIn Data Processing Objection Form if you want to opt out of those machine learning systems too. According to Mashable, LinkedIn’s decision to enable this setting by default, without more visible notice, caught many users off guard.
(If you’re based in the EU, EEA, or Switzerland, LinkedIn isn’t using your data for AI training due to strong regional data privacy laws.)
Biweekly disruptions
Biden administration to host international AI safety meeting after election (US News) The Biden administration is organizing an international AI safety summit in San Francisco this November, bringing together experts from at least nine countries and the European Union. The two-day event will focus on safely developing AI technology and addressing its risks, like AI-generated misinformation or potential misuse by malicious actors. With AI rapidly evolving, this gathering is part of a global effort to create safety standards and guardrails for powerful AI systems.
AI and ancient water systems: Mapping qanats (Science Direct) A fascinating new study is using cutting-edge AI to map ancient water systems known as qanats—underground channels used to transport water in arid regions. Researchers trained a deep learning model on satellite imagery, including data from spy satellites, and the model can identify hidden water systems with impressive accuracy, offering a new way to study and preserve these ancient structures. In this case, AI is helping preserve history while offering insights into sustainable water management and modern-day challenges like drought.
‘Hunger Games’ studio Lionsgate to partner with AI company (LA Times) Lionsgate announced a partnership with AI research company Runway to create a customized AI model for film production. The model will generate cinematic video content, which can be fine-tuned using Runway’s tools. Lionsgate’s filmmakers say they’re excited about AI’s potential to enhance both pre- and post-production processes. The deal is widely seen as a test case for how other AI tools might be used in filmmaking.