Facebook failed to obtain consent and safeguard personal data: Federal Court of Appeal clarifies PIPEDA compliance


September 19, 2024 | Tory Hibbitt, David Krebs, Manahil Arshad

INTRODUCTION

On September 9, 2024, the Federal Court of Appeal (the “FCA”) issued its decision in Privacy Commissioner of Canada v Facebook Inc., 2024 FCA 140,[i] overturning the Federal Court’s decision[ii] and declaring that Facebook, Inc. (now Meta Platforms Inc.) had violated the Personal Information Protection and Electronic Documents Act (“PIPEDA”) by improperly sharing users’ personal information with third-party applications (“apps”) on its platform.[iii] Specifically, Facebook breached the requirements for obtaining meaningful consent[iv] and did not adequately safeguard user data.[v]

The FCA’s decision serves as a helpful reminder that PIPEDA is designed to strike a balance between the privacy rights of individuals and the legitimate needs of organizations to collect and use personal information. The FCA’s analysis of meaningful consent emphasizes the need for organizations to be specific and transparent regarding the uses and disclosures of personal data, especially in the context of third-party apps and other digital ecosystems. An organization cannot contract out of its statutory duty to safeguard personal information, and the impracticalities of monitoring compliance do not justify limiting the scope of safeguarding obligations – especially when those challenges are created by the organization itself.

BACKGROUND

This case stemmed from the Office of the Privacy Commissioner of Canada’s (“OPC”) investigation into the scraping of Facebook user data by the app “thisisyourdigitallife” (“TYDL”), developed by Dr. Aleksandr Kogan. The scraped data was sold to Cambridge Analytica and used to generate user profiles to facilitate targeted political advertising. The alleged PIPEDA violations occurred from TYDL’s launch in November 2013 until its removal from Facebook in December 2015. During this period, Facebook had three layers of consent policies and practices in place:

  1. Platform-wide policies: When signing up for Facebook, users had to agree to the Terms of Service (4,500 words), which set out users’ rights and responsibilities, including how users could control their information. Facebook’s Data Policy (9,100 words) was incorporated by reference into the Terms of Service, such that users accepting the Terms of Service were deemed to have consented to the Data Policy. The Terms of Service broadly explained how user data could be shared, including with third-party apps, and stated that the agreement between the user and the app would govern how the app uses, stores, or transfers information. The Data Policy broadly described the user information shared with third-party apps – including through the use of such third-party apps by users’ friends.
  2. User controls: Users could adjust their data sharing preferences through permissions, the App Settings page, and the Privacy Settings page (e.g., selecting the default audience for their posts and restricting apps’ access to their information).
  3. Educational resources: Facebook provided resources for users to learn about Facebook’s privacy policies and practices, including explanations of what information is shared when friends use third-party apps and how to control that information.

Facebook required third-party apps to agree to its Platform Policy and Terms of Service (“Platform Policy”), which included specific terms regarding the collection, use, and disclosure of user information. This included the requirement for apps to have a privacy policy and a prohibition against selling or purchasing data obtained from Facebook.

Despite the requirements of the Platform Policy, Facebook did not review or verify third-party compliance with this policy. When TYDL requested expanded access to user data, Facebook identified this as a “red flag” but took no action beyond denying the request.

After identifying that TYDL had breached the Platform Policy, Facebook removed TYDL in 2015 and asked it to delete the data it had obtained. However, Facebook did not notify affected users, nor did it remove Dr. Kogan or Cambridge Analytica from its platform until 2018, after media reports surfaced that they had not deleted the data as requested.

The OPC investigated Facebook[vi] and concluded that it failed to obtain valid and meaningful consent for its disclosures to third-party apps[vii] and failed to safeguard user data.[viii] These conclusions formed the basis of the OPC’s application pursuant to s. 15(a) of PIPEDA.

The Federal Court considered two central issues: 1) whether Facebook failed to obtain meaningful consent from users and their friends when sharing personal information with third-party apps, and 2) whether Facebook failed to adequately safeguard user data. The Federal Court dismissed the OPC’s application, finding that the OPC had not met its burden of proof on either allegation, particularly given the absence of expert and subjective evidence.[ix]

FEDERAL COURT OF APPEAL’S KEY FINDINGS

A unanimous panel of three judges at the FCA partially granted the OPC’s appeal, finding that the Federal Court made two main errors:

  • relying too heavily on the lack of expert and subjective evidence in its analysis; and
  • failing to assess the consent given by friends of users who downloaded third-party apps, separate from the consent of the installing users.

Failure to Obtain Meaningful Consent

The FCA held that Facebook did not obtain meaningful consent for sharing user data, applying the objective “reasonable person” standard under PIPEDA. This standard does not require subjective or expert evidence; instead, it considers whether a reasonable person would understand the nature, purpose, and consequences of the disclosure of their information.

Key findings by the FCA include:

  • Reasonable Efforts vs. Manner of Consent: An organization’s reasonable efforts to seek consent and the validity of the consent actually obtained are distinct requirements, and the former cannot cure deficiencies in the latter. Facebook’s methods fell short because users could not fully understand what they were consenting to; valid consent requires individuals to understand what they are agreeing to.
  • Users’ Friends’ Data: Friends of installing users had no opportunity to review an app’s privacy policy before their data was shared, contrary to clause 4.3.2 of Schedule 1 of PIPEDA. Broad statements in the Data Policy about sharing data with apps used by friends were too vague to ground meaningful consent. Even if consent could somehow be derived from the Data Policy, the data use exceeded what users’ friends could reasonably have contemplated.
  • Users’ Consent: Based on deficiencies in its Terms of Service and Data Policy, Facebook also failed to obtain valid consent from the downloading users. Simply incorporating the Data Policy into the Terms of Service was insufficient under PIPEDA, as the manner and substance of these policies did not facilitate users’ understanding of the nature, purpose, and consequences of the disclosure of their information.
  • Lack of Warnings: Facebook did not adequately warn users that third-party apps could misuse their data or sell it to others, risks against which a reasonable user would expect safeguards.
  • Overall Deficiencies: Facebook’s privacy policies were too long and unclear. Its reliance on default privacy settings that allowed data sharing without active consent violated the principle that consent must be an affirmative choice.

Inadequate Safeguards for User Data

Facebook breached its safeguarding obligations by failing to properly monitor third-party apps or review their privacy policies. Key findings by the Court include:

  • Failure to Act on Red Flags: Facebook treated TYDL’s request for excessive data as a “red flag” yet took no action beyond denying it, breaching its safeguarding obligations.
  • Platform Overload: Although Facebook claimed it would be practically impossible to review the privacy policies of all third-party apps, the Court noted this issue was of Facebook’s own making. Facebook could not avoid its statutory responsibilities by claiming it had too many apps to manage, nor by contracting out of its statutory obligations under s. 6.1 and Principle 3 of PIPEDA.
  • Breach of Safeguarding Duty: Facebook’s lack of oversight of third-party apps’ privacy practices violated its obligation to protect user data under PIPEDA.

KEY TAKEAWAYS

Proactive Steps for PIPEDA Compliance: Organizations should view this decision as a call to:

  • conduct regular privacy audits and update consent processes;
  • simplify privacy policies and make them accessible;
  • implement rigorous third-party oversight mechanisms; and
  • avoid default settings that assume consent.

By focusing on these areas, organizations can not only avoid legal risks but also build trust with their users, ensuring that privacy compliance becomes a competitive advantage. Other noteworthy points for privacy lawyers and private sector organizations include:

Reaffirmation of the Reasonable Person Standard for Consent: “Meaningful” consent is not merely about complying with formal consent mechanisms like terms of service or privacy policies. Consent must be understood from the perspective of a reasonable person, taking into account the specific context in which data is collected and shared. Users must be clearly informed about the nature, purpose, and risks of sharing their information. Organizations should review their privacy policies and consent processes to ensure they are clear, accessible, and easy to understand for the average user, and suitable for the circumstances and sensitivity of the information involved.

Vague or Overly Complex Policies Will Not Suffice: Lengthy, complicated, or obscured privacy policies are inadequate. Consent obtained through complex, broad, or indirect language is unlikely to meet PIPEDA’s consent requirements.

Friends’ Data Requires Explicit Consent: Facebook failed to obtain valid consent from the friends of users whose data was shared through third-party apps. Organizations cannot assume implied consent for sharing secondary data (such as friends’ information). Businesses that facilitate social media interactions or rely on data-sharing ecosystems should re-examine how they handle secondary data and ensure explicit consent mechanisms are in place.
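To make this concrete, the following is a minimal illustrative sketch in TypeScript of a disclosure check that requires explicit, recorded consent from every affected individual rather than inferring it from the installing user. All of the type names, fields, and functions here are hypothetical; neither PIPEDA nor the decision prescribes any particular implementation.

```typescript
// Hypothetical consent ledger: every name here is illustrative, not drawn
// from the decision or any real API.
type ConsentRecord = {
  userId: string;
  grantedTo: string; // the third-party app receiving the data
  purpose: string;   // the specific, disclosed purpose
  grantedAt: Date;
};

class ConsentLedger {
  private records: ConsentRecord[] = [];

  grant(userId: string, grantedTo: string, purpose: string): void {
    this.records.push({ userId, grantedTo, purpose, grantedAt: new Date() });
  }

  hasExplicitConsent(userId: string, app: string, purpose: string): boolean {
    return this.records.some(
      (r) => r.userId === userId && r.grantedTo === app && r.purpose === purpose,
    );
  }
}

// Disclosure is permitted only if EVERY affected individual has explicitly
// consented, not just the user who installed the app.
function canDisclose(
  app: string,
  purpose: string,
  affectedUserIds: string[],
  ledger: ConsentLedger,
): boolean {
  return affectedUserIds.every((id) => ledger.hasExplicitConsent(id, app, purpose));
}
```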

Enhanced Safeguarding Measures Are Essential: Facebook failed to safeguard user data by not monitoring third-party apps. PIPEDA’s safeguarding provisions require proactive oversight of how third parties access and use personal data. Organizations should implement comprehensive measures to audit and monitor supply chain and third-party data practices. Privacy lawyers can guide businesses in setting up these audit and oversight frameworks.
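As a purely illustrative sketch of what proactive third-party oversight could look like in practice, the TypeScript below compares each data-access request against the scopes an app declared at onboarding and escalates anything broader for human review, echoing the “red flag” that went unactioned in this case. The interfaces and the escalate() hook are assumptions for illustration only.

```typescript
// Hypothetical third-party oversight gate; shapes and names are assumptions.
interface ThirdPartyApp {
  id: string;
  declaredScopes: Set<string>;    // data scopes reviewed at onboarding
  privacyPolicyVerifiedAt?: Date; // last time its privacy policy was reviewed
}

function reviewAccessRequest(
  app: ThirdPartyApp,
  requestedScopes: string[],
  escalate: (appId: string, reasons: string[]) => void,
): boolean {
  // An unreviewed third-party privacy policy is itself a compliance gap.
  if (!app.privacyPolicyVerifiedAt) {
    escalate(app.id, ["privacy policy never reviewed"]);
    return false;
  }

  // A request beyond the declared scope is a red flag: deny it AND escalate
  // it for human review, rather than denying it and taking no further action.
  const excess = requestedScopes.filter((s) => !app.declaredScopes.has(s));
  if (excess.length > 0) {
    escalate(app.id, excess.map((s) => `undeclared scope: ${s}`));
    return false;
  }
  return true;
}
```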

Address the Risks of Default Privacy Settings: The FCA’s remarks about Facebook’s default privacy settings reinforce that consent under PIPEDA must be an “active” process, not obtained by default. Organizations should avoid default settings that assume consent for data sharing and instead create opt-in mechanisms that require affirmative user action.
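A final illustrative sketch, again in TypeScript with hypothetical field names: opt-in defaults can be expressed by initializing every sharing preference to false, so that nothing is shared until the user takes an affirmative action.

```typescript
// Hypothetical sharing preferences; the fields are illustrative only.
interface SharingPreferences {
  shareWithThirdPartyApps: boolean;
  shareViaFriendsApps: boolean;
}

// Defaults assume NO consent: nothing is shared until the user opts in.
function defaultPreferences(): SharingPreferences {
  return {
    shareWithThirdPartyApps: false,
    shareViaFriendsApps: false,
  };
}

// Invoked only from an explicit user action (e.g., a settings toggle).
function optIn(
  prefs: SharingPreferences,
  setting: keyof SharingPreferences,
): SharingPreferences {
  const next = { ...prefs };
  next[setting] = true;
  return next;
}
```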

If you have any questions or would like guidance on how this decision impacts your organization, please contact a member of the Miller Thomson LLP Technology, IP and Privacy Group.

[i] 2024 FCA 140.

[ii] Canada (Privacy Commissioner) v Facebook, Inc., 2023 FC 533.

[iii] Personal Information Protection and Electronic Documents Act, SC 2000, c 5.

[iv] Ibid, clause 4.3 of Schedule 1 and s. 6.1.

[v] Ibid, clause 4.7 of Schedule 1.

[vi] PIPEDA Report of Findings #2019-002, April 25, 2019, https://www.priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-businesses/2019/pipeda-2019-002/.

[vii] Personal Information Protection and Electronic Documents Act, SC 2000, c 5, clause 4.3 of Schedule 1.

[viii] Ibid, clause 4.7 of Schedule 1.

[ix] Canada (Privacy Commissioner) v Facebook, Inc., 2023 FC 533 at para 71.
