How LinkedIn Leverages Your Data for AI: The Legal Issues You Need to Know

Olivia Rhye
11 Jan 2022
5 min read

Like many of you, I was genuinely surprised to learn that we had all been automatically opted in, allowing LinkedIn to use our data to train its AI models without first obtaining our permission. While it’s no surprise that LinkedIn would want our data for AI training, doing so without asking first is quite brazen. This raises the question: is it legal for companies to do this?

When it comes to the legality of using user data for AI training, the answer varies significantly across jurisdictions such as the United States, Canada, the European Union, and Singapore. Each region has its own data protection laws and regulations that govern how personal data can be used, particularly in the context of artificial intelligence.

United States

  • Legal Framework: In the United States, there is no comprehensive federal data protection law. However, state laws such as the California Consumer Privacy Act (CCPA) regulate how companies can use personal data. The CCPA requires businesses to disclose their data collection practices and allows consumers to opt out of the sale of their personal information.
  • Risk of Lawsuits: Companies like LinkedIn could face lawsuits if they fail to comply with state-specific privacy laws or if users believe their data is being misused without consent.
  • Federal Governing Body: Federal Trade Commission (FTC) - http://www.ftc.gov

Canada

  • Legal Framework: Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) governs the collection, use, and disclosure of personal information in the course of commercial activities. PIPEDA requires organizations to obtain meaningful consent from individuals before collecting or using their personal data.
  • Risk of Lawsuits: If LinkedIn uses member data for AI training without obtaining meaningful consent, it could be at risk of legal action under PIPEDA.
  • Federal Governing Body: Office of the Privacy Commissioner of Canada - http://www.priv.gc.ca

European Union

  • Legal Framework: The General Data Protection Regulation (GDPR) is one of the strictest data protection laws globally. It requires companies to have a lawful basis, such as consent or legitimate interest, for processing personal data; where consent is relied upon, it must be freely given, specific, and informed for each purpose, such as AI training.
  • Risk of Lawsuits: Non-compliance with GDPR can result in fines of up to €20 million or 4% of annual global turnover, whichever is higher, as well as legal challenges. LinkedIn must ensure it adheres to GDPR requirements to avoid potential lawsuits.
  • Governing Body: European Data Protection Board (EDPB) - https://edpb.europa.eu (day-to-day enforcement is handled by national data protection authorities)

Singapore

  • Legal Framework: The Personal Data Protection Act (PDPA) in Singapore regulates the collection, use, and disclosure of personal data. Organizations must obtain consent from individuals before using their data for purposes like AI training.
  • Risk of Lawsuits: Failure to comply with the PDPA can lead to enforcement actions and financial penalties, putting LinkedIn at risk if it uses member data without consent.
  • Governing Body: Personal Data Protection Commission (PDPC) - http://www.pdpc.gov.sg

In today's digital age, workers and HR organizations must be highly sensitive to how personal data is used to train AI systems. These organizations should always obtain explicit permission from individuals before using their data; doing so not only satisfies legal requirements but also builds trust with users. As technology continues to evolve, maintaining transparency and respecting user privacy will be key to fostering positive relationships and ensuring ethical practices in the workplace.
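To make the "explicit permission first" principle concrete, here is a minimal sketch of a consent gate that a data team might place in front of an AI training pipeline. It is purely illustrative and based on assumptions: the MemberRecord type, the ai_training_consent flag, and select_training_records are hypothetical names, not LinkedIn's actual systems or any specific law's mandated mechanism.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class MemberRecord:
    """Hypothetical user record; field names are illustrative, not LinkedIn's schema."""
    member_id: str
    profile_text: str
    ai_training_consent: bool                      # explicit, purpose-specific opt-in (not a default opt-out)
    consent_timestamp: Optional[datetime] = None   # when consent was given, kept for auditability

def select_training_records(records: List[MemberRecord]) -> List[MemberRecord]:
    """Keep only records whose owners have explicitly opted in to AI training."""
    return [r for r in records if r.ai_training_consent and r.consent_timestamp is not None]

if __name__ == "__main__":
    members = [
        MemberRecord("u1", "Data engineer, 5 years in fintech", True, datetime(2024, 3, 1)),
        MemberRecord("u2", "HR manager, Toronto", False),  # never opted in: excluded by default
    ]
    eligible = select_training_records(members)
    print(f"{len(eligible)} of {len(members)} records eligible for AI training")
```

The key design choice in this sketch is that consent is an explicit, purpose-specific opt-in recorded with a timestamp, so a record is excluded by default unless permission can be demonstrated. That default-exclusion posture mirrors the consent expectations under PIPEDA, the GDPR, and the PDPA described above.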