Hong Kong Regulators Warn LinkedIn Users About AI Data Training – Here’s How to Opt Out in 2025
- Why Are Hong Kong Regulators Targeting LinkedIn’s AI Practices?
- How Is LinkedIn Using Your Data for AI Training?
- How to Opt Out of LinkedIn’s AI Data Training
- The Bigger Picture: AI’s Insatiable Data Hunger
- FAQs: LinkedIn’s AI Data Training
LinkedIn is now using personal data to train its AI models unless you manually opt out. With LinkedIn sharing this data with Microsoft and OpenAI, regulators are stepping in to ensure user control. Here’s what you need to know—and how to protect your privacy.
Why Are Hong Kong Regulators Targeting LinkedIn’s AI Practices?
Hong Kong’s Office of the Privacy Commissioner for Personal Data (PCPD) has raised alarms over LinkedIn’s updated privacy policy, which automatically enrolls user data—including profiles, posts, and resumes—into its AI training programs. The platform confirmed the change would take effect on November 3, 2025, affecting users in Hong Kong, the EU, the UK, Switzerland, and Canada.
Ada Chung Lai-ling, Hong Kong’s Privacy Commissioner, emphasized that users must review LinkedIn’s revised policy to understand how their data is being used. “This isn’t just about ads—it’s about feeding your professional history into AI systems,” she noted. The PCPD intervened in late 2024 after discovering LinkedIn had enrolled Hong Kong users in AI training by default, a move criticized as overly aggressive.

How Is LinkedIn Using Your Data for AI Training?
LinkedIn’s AI training draws on public profiles, activity feeds, and resumes—but excludes private messages and data from users under 18. The bigger concern? This data isn’t staying in-house. LinkedIn plans to share it with Microsoft (its parent company) and OpenAI, the creator of ChatGPT. Given Microsoft’s heavy investments in OpenAI, this raises questions about where your career details might end up.
Goldman Sachs’ Chief Data Officer, Neema Raphael, recently warned that AI models like ChatGPT are running out of high-quality training data. “Scraping LinkedIn is a desperate move,” one analyst quipped. OpenAI co-founder Ilya Sutskever even admitted last year that data scarcity could stall AI progress—pushing firms toward “agentic AI” that acts autonomously.
How to Opt Out of LinkedIn’s AI Data Training
If you’re uncomfortable with your data training AI, here’s how to opt out:
- Go to Settings & Privacy on LinkedIn.
- Select Data Privacy > Generative AI Data Use.
- Toggle off “Use my data for AI content creation training.”
- Note that opting out stops future use of your data; it does not remove data already used in training.

The PCPD confirmed LinkedIn has since adjusted its settings for Hong Kong users, but vigilance is key. “We’ll keep monitoring compliance,” a spokesperson said.
The Bigger Picture: AI’s Insatiable Data Hunger
LinkedIn’s move reflects a broader trend: social platforms are monetizing user data for AI development. With giants like Google and Meta facing similar scrutiny, regulators worldwide are playing catch-up. Meanwhile, AI’s thirst for data is leading to riskier tactics—like training on sensitive professional histories.
“This isn’t just a privacy issue; it’s a cybersecurity wildcard,” warns a BTCC market analyst. “Agentic AI systems trained on LinkedIn could someday mimic real professionals—for phishing or worse.”
This article does not constitute investment advice.
FAQs: LinkedIn’s AI Data Training
What data does LinkedIn use for AI training?
LinkedIn uses public profiles, posts, resumes, and activity feeds—but not private messages or data from users under 18.
Can Hong Kong users opt out?
Yes. After PCPD pressure, LinkedIn added an opt-out toggle under Settings & Privacy > Data Privacy > Generative AI Data Use.
Does LinkedIn share data with OpenAI?
Yes. LinkedIn’s parent company, Microsoft, is a major OpenAI investor, and LinkedIn has confirmed that training data will be shared with both companies.
Why is AI running out of training data?
High-quality text data (like professional content) is finite. Goldman Sachs predicts a shortage by 2026.