Redrift

Quebec's Law 25 AI Obligations: What Most Programs Miss

Quebec's Law 25 is fully in force. The automated decision and profiling obligations are where most compliance programs still have significant gaps.

Quebec's Law 25 — formally, the Act to modernize legislative provisions as regards the protection of personal information, amending the Act respecting the protection of personal information in the private sector (CQLR, c. P-39.1) — completed its three-year phase-in on September 22, 2024, when data portability rights came into force. The enforcement regime has been fully operational since September 2023. The Commission d'accès à l'information du Québec (CAI) received 444 confidentiality incident reports in 2023–2024 and has actively exercised its new investigative and penalty powers.

Most organizations serving Quebec customers know the headline requirements. They've updated privacy policies, appointed a Privacy Officer, and built processes for confidentiality incident reporting. That's the visible layer, and most programs have addressed it.

The layer most programs haven't addressed is the automated decision and profiling framework. Law 25's obligations here exceed PIPEDA, go further than CCPA/CPRA in several key respects, and apply to a range of products that most product and legal teams don't categorize as AI.

What Law 25 Actually Requires on Automated Decisions

Under the amended Act, two notice obligations matter here. When an enterprise renders a decision based exclusively on automated processing of personal information, it must inform the person concerned no later than the time it informs them of the decision. When it uses technology with functions that allow a person to be identified, located, or profiled, it must first inform the person of the use of that technology and of the means available to activate those functions. For automated decisions, the enterprise must also, on request, tell the person what personal information was used to render the decision, the reasons and the principal factors and parameters that led to it, and their right to have that personal information corrected — and it must give the person an opportunity to submit observations to a member of personnel in a position to review the decision.

The scope of "automated processing" is broader than most clients expect. A credit evaluation tool. A dynamic pricing algorithm. A customer segmentation model. A fraud detection system that restricts account access. A chatbot that routes users to different service tiers. A recommendation engine that determines what a person sees in a product feed. Each of these uses personal information, produces an outcome affecting the person, and can operate through automated logic alone. Where the resulting decision involves no human judgment, that is all Law 25 requires to trigger the obligation.

PIPEDA, Canada's federal private-sector privacy law, has no equivalent provision. CCPA and its CPRA amendments require disclosure of automated decision-making technology, but do not provide a baseline right to human review or to object. Law 25 requires both notice and a human-review channel, and requires them to be surfaced to the person directly. A standalone privacy policy disclosure, without notice at the relevant moment of interaction, does not satisfy the statute.
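For teams inventorying these systems, the statutory disclosure elements can be tracked as a simple record per decision point. A minimal sketch in Python — the class, field names, and gap checks are illustrative assumptions, not statutory terms or CAI-endorsed criteria:

```python
from dataclasses import dataclass


@dataclass
class AutomatedDecisionNotice:
    """Illustrative record of Law 25 disclosure elements for one automated decision."""
    decision_name: str                   # e.g. "credit_limit_adjustment" (hypothetical)
    is_exclusively_automated: bool       # human-in-the-loop decisions fall outside the rule
    personal_info_categories: list       # types of personal information the system uses
    notice_surfaced_in_product: bool     # shown to the person directly, not only in a policy
    review_channel: str                  # how the person submits observations for human review

    def gaps(self) -> list:
        """Return disclosure elements still missing for this decision point."""
        missing = []
        if not self.personal_info_categories:
            missing.append("personal information categories not documented")
        if not self.notice_surfaced_in_product:
            missing.append("notice not surfaced at the point of interaction")
        if not self.review_channel:
            missing.append("no channel for observations / human review")
        return missing
```

Running `gaps()` across every mapped decision point gives legal and product teams a shared punch list rather than a one-time policy update.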

The Privacy Impact Assessment Gate

Section 3.3 of the amended Act requires a Privacy Impact Assessment (PIA) when an enterprise proposes to acquire, develop, or redesign an information system or electronic service delivery system involving personal information. The CAI published its PIA methodology guidance in 2023, setting out the expected structure and content of a compliant assessment.

The obligation is broad. Any new product feature involving personal information. Any system redesign that materially changes data flows. Any AI tool added to a product that handles personal information. If the system is new or materially redesigned and it processes personal information electronically, a PIA is required before deployment — not as a post-launch audit, but as a pre-deployment gate.

This is a meaningful departure from GDPR Article 35, which requires a Data Protection Impact Assessment only for "high risk" processing. Law 25's PIA trigger is not conditioned on a preliminary finding of high risk. The obligation attaches to the newness or redesign of the system itself, with the risk determination made within the PIA process. High-risk processing — including automated decision-making, large-scale processing, profiling, and cross-border transfers — typically calls for a more intensive PIA, and certain projects, such as the creation of a biometric database, additionally require advance disclosure to the CAI. But the PIA itself is not gated on a risk finding.

Operationally, Law 25 requires the PIA to live inside the product development process, not beside it. Organizations building AI features on Quebec-user products who are not running PIAs before deployment are out of compliance, regardless of whether they have a mature GDPR DPIA process. The two frameworks are structurally similar but procedurally distinct — one does not substitute for the other.
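Making the PIA a gate rather than an audit can be as simple as a blocking check in the release process. A minimal sketch, assuming a hypothetical feature-metadata dict maintained by the product team — the keys and messages are illustrative, not a statutory test:

```python
def pia_gate(feature: dict) -> tuple:
    """Illustrative pre-deployment gate: block any new or materially
    redesigned system that processes personal information electronically
    until a PIA is on file. Returns (clear_to_deploy, reason)."""
    needs_pia = (
        feature.get("processes_personal_information", False)
        and feature.get("new_or_redesigned", False)
    )
    if needs_pia and not feature.get("pia_completed", False):
        return (False, "blocked: complete a PIA before deployment")
    return (True, "clear to deploy")
```

Wiring a check like this into CI or a release checklist is what moves the PIA inside the development process instead of beside it.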

Cross-Border Transfers and the Assessment Requirement

When personal information is communicated outside of Quebec — including to other Canadian provinces — Law 25 requires a prior impact assessment of that communication. The assessment must consider, among other factors, the sensitivity of the information, the purposes for which it will be used, the protection measures in place, and the legal regime of the receiving jurisdiction. The enterprise must be satisfied that the information would receive adequate protection, in light of generally recognized principles of personal information protection, and that assessment must be documented before the transfer occurs.

The cross-border rule reaches further than most organizations recognize. Analytics logs sent to a U.S.-based platform. A SaaS application storing Quebec users' data in a cloud region outside Quebec. A third-party customer support tool based in Ireland. Each is a communication of personal information outside Quebec and requires a documented assessment.

Data processing agreements with vendors are necessary but not sufficient. The CAI has been explicit: enterprises must conduct their own assessment of the laws and practices of the receiving jurisdiction, not simply rely on vendor privacy representations, standard contractual clauses, or the existence of a data processing agreement. The obligation is the enterprise's, and the documentation must demonstrate an actual evaluation.

For in-house counsel, this means vendor onboarding reviews need to include a Law 25 cross-border impact assessment as a mandatory step — not just a DPA review. The same applies to existing vendor relationships involving Quebec data that have never been assessed under Law 25's framework.
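A vendor-inventory sweep can surface the existing relationships that have never been assessed. A minimal sketch, assuming hypothetical vendor and assessment records kept by the privacy team — the schema is illustrative, not drawn from any CAI guidance:

```python
def unassessed_transfers(vendors: list, assessments: list) -> list:
    """Illustrative check: vendors receiving Quebec personal information
    outside Quebec that lack a documented cross-border assessment."""
    assessed = {(a["vendor"], a["jurisdiction"]) for a in assessments}
    return sorted(
        v["name"]
        for v in vendors
        if v["receives_personal_information"]
        and v["jurisdiction"] != "QC"
        and (v["name"], v["jurisdiction"]) not in assessed
    )
```

The output is the remediation backlog: every name on the list needs a documented assessment before the data flow continues.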

Enforcement: Penalties and Private Rights of Action

Law 25 created a two-tier penalty structure. Administrative monetary penalties reach up to CAD 10 million or 2 percent of worldwide turnover for the preceding fiscal year, whichever is greater. For serious, intentional, or repeated violations, penal fines reach up to CAD 25 million or 4 percent of worldwide turnover, again whichever is greater. These are explicitly GDPR-scale figures, applied by a provincial regulator with full investigative powers, including document production, inspections, and compelled testimony.

The CAI has been clear about its preferred enforcement path: for serious violations, violations not adequately remedied, intentional conduct, or repeated breaches, the CAI will pursue penal proceedings rather than administrative orders. The administrative AMPs are not the ceiling for noncompliance that produces real harm.

Law 25 also creates a private right of action. An individual can sue for injury resulting from an unlawful infringement of a right conferred by the Act, and where the infringement is intentional or results from a gross fault, the court must award punitive damages of at least CAD 1,000. This civil exposure exists independently of CAI proceedings and is an underweighted risk in most Law 25 compliance assessments. A class of Quebec customers affected by an undisclosed automated decision system is not a theoretical scenario.

What Adequate Compliance Actually Requires

The organizations that have gotten Law 25 compliance right share three practices. They've mapped where automated decision-making or profiling touches Quebec residents' personal information — which almost always requires a cross-functional conversation between legal, product, and data engineering teams, not just a privacy team review. They've built the PIA into the product development lifecycle as a required gate before deployment, not a retrospective exercise. And they've conducted documented cross-border transfer assessments for every vendor relationship that processes Quebec data, with evidence they can produce to the CAI if requested.

For organizations relying on a GDPR compliance program mapped partially to Law 25: that mapping typically catches consent and notice obligations. It typically misses the automated decision transparency requirements, the PIA trigger scope, and the private right of action. Closing those gaps requires a Law 25-specific review, not a gap assessment against GDPR.

The privacy drift problem — where a product evolves faster than its compliance documentation — is acute under Law 25 precisely because the PIA and automated decision obligations attach to product changes, not to annual review cycles. Every new AI feature is a potential compliance event. Treating it as one, before deployment, is what separates programs that are compliant from programs that are a complaint away from finding out they aren't.

