Continuing to highlight the need for reform, the Office of the Privacy Commissioner of Canada (“OPC”) has initiated a consultation on recommendations it has presented to adapt the federal private sector privacy statute, the Personal Information Protection and Electronic Documents Act (“PIPEDA”), to address the specific challenges posed by artificial intelligence systems and related tools, such as machine learning and big data analytics (“AI”). (See our previous article on this topic.)
Specifically, the OPC is concerned about the deployment of AI to process and analyze large amounts of personal information, typically for predictive decision-making purposes. At issue in particular is the resulting potential for discrimination and unlawful bias, as well as risks to privacy, data security, and Canadians’ trust in the digital economy. The OPC’s recommendations frequently refer to the European Union’s General Data Protection Regulation (“GDPR”), including its AI-targeted requirements related to “automated decision-making,” as an example of a potential path forward for Canada.
The OPC’s consultation is open until March 13, 2020, during which time stakeholders are invited to consider and provide commentary on whether the OPC’s proposals are consistent with responsible development and deployment of AI moving forward. The following are some of the OPC’s key proposals under consultation:
1. Incorporate a definition of AI into the law that would clarify which rules apply to AI only, and which apply to all data processing (including AI)
PIPEDA is currently “technology neutral,” meaning the legislation contains no specific definitions related to technology, processing systems or AI. The OPC recommends creating and inserting a definition of AI to allow for rules and legal requirements that apply specifically to this technology.
2. Recognize an individual’s right to privacy as a fundamental human right
The OPC’s view is that there should be a rights-based foundation within PIPEDA that provides individuals with explicit privacy rights, including the right to object to automated decision-making and the right to explanation and increased transparency related to AI and automated decision-making.
3. Update the consent principles in PIPEDA to allow for socially beneficial data processing activities
The OPC wants to include alternative grounds for data processing in the law, and develop solutions to protect privacy when meaningful consent is not attainable. One of the goals is to establish rules that allow for flexibility in using information that has been rendered non-identifiable while also ensuring there are enhanced measures taken to protect that information from being re-identified.
4. Require organizations to ensure data and algorithmic traceability and to demonstrate accountability when developing and using AI for processing activities
The OPC wants the ability to proactively inspect the data processing practices of organizations, and to incentivize organizations to adopt demonstrable accountability measures. The overarching idea is to allow individuals to know where their data came from and how it was collected, curated, and moved within an organization. The OPC wants the philosophies of data protection and human rights by design to permeate Canadian privacy law.
5. Empower the OPC to issue binding orders and financial penalties to an organization for non-compliance
The OPC wants PIPEDA to provide real consequences and enforcement mechanisms for organizations that are found to be non-compliant with the law. The law should ensure that individuals are protected and have access to quick, effective remedies in the event of a breach of their privacy rights.
If enacted, many of these proposals would have a significant impact on organizations that handle Canadians’ personal information. You can read the OPC’s full proposal on their website.
If you have any questions or recommendations related to the OPC’s proposal or need support in drafting your organization’s consultation submissions in advance of the March 13 deadline, reach out to Miller Thomson’s Privacy and Information Protection Group.