“Safeguarding the personal information of Canadians is essential to maintaining public trust,” Treasury Board President Shafqat Ali said on 2 April 2026, launching the first substantive review of Canada’s federal Privacy Act in 43 years. “Updating the Privacy Act will reinforce privacy protections and help ensure government institutions operate in a transparent and accountable manner in the digital age.”

The Treasury Board of Canada Secretariat has opened a public consultation that runs until 10 July 2026, with a consolidated findings report promised for winter 2026-27. The review covers a federal law that has governed how more than 250 government institutions handle Canadians’ personal information since 1983, and the proposed reforms would write AI transparency, mandatory privacy impact assessments, and explicit automated decision-making obligations directly into statute.

Why a 43-year-old law needed reform

The Canadian Privacy Act came into force in 1983, before the commercial internet, before smartphones, and decades before generative AI. The law applies to more than 250 federal institutions, including departments, agencies and Crown corporations, and is distinct from the Personal Information Protection and Electronic Documents Act (PIPEDA) that governs private-sector privacy. Private companies are not directly in scope, but vendors that sell to federal institutions effectively are.

The gap between 1983 legislative drafting and 2026 operational reality has been measurable for years, and the trend line is sharp. The Office of the Privacy Commissioner of Canada (OPC) received 615 breach reports from federal institutions in 2024-25, up from 561 the previous year. The number of individuals affected more than doubled, from 138,434 to 309,865. Federal institutions deploy AI for benefits decisions, risk scoring, identity verification, and language processing. The current Privacy Act addresses none of this directly. Privacy impact assessments exist as policy under a Treasury Board Secretariat directive, but they are not statutory requirements, and the Privacy Commissioner has limited order-making powers when institutions fall short.

Privacy Commissioner Philippe Dufresne, who has spent his term advocating for reform, framed the underlying principle in his 2024-25 annual report: “At a time when the personal information of Canadians is being collected, used, and shared at an unparalleled pace and volume on a global scale, effective privacy protection requires more than the status quo. Prioritizing privacy as a fundamental right reflects our Canadian values and ambitions and reinforces the freedoms and trust that underpin our democracy.”

Context matters here. The Privacy Act review is happening in the absence of a comprehensive federal AI law in Canada. The Artificial Intelligence and Data Act (AIDA) was bundled into Bill C-27 in 2022 and died on the order paper when Parliament was prorogued in early 2025. There is no immediate replacement on the table. That gap is why public-sector privacy reform is doing so much of the AI governance work: it is one of the few federal levers available, and it carries flow-through expectations for any vendor selling AI into federal institutions.

The ArriveCAN case: why federal contracting is in the frame

The Privacy Act review did not arrive in a vacuum. In March 2026, the OPC released the findings of its investigation into the Canada Border Services Agency’s contracting practices for the ArriveCAN application, a federal app used during the COVID-19 pandemic that became a flashpoint for federal IT contracting and outsourcing oversight. The investigation found no contraventions of the Privacy Act, but it identified significant shortcomings in how the CBSA managed contractors with access to personal information.

Dufresne tied the findings directly to the broader reform agenda: “This investigation highlights the importance of privacy as a core consideration when developing outsourcing contracts. The findings are an opportunity to raise the awareness for all federal institutions about best practices in contracting to ensure strong privacy protections for Canadians.” ArriveCAN is exactly the kind of federal AI and digital deployment scenario the Privacy Act review is designed to address. A federal institution stands up a digital service, contracts with vendors who handle personal information at scale, and discovers after the fact that its existing safeguards were thinner than assumed. The 1983 Act provides the OPC with limited tools to compel improvement before harm occurs. The reform would change that.

What the AI-relevant proposals actually say

The policy paper released by the Treasury Board sets out themes for consultation rather than draft legislation. Several of the proposed approaches have direct implications for AI governance.

Mandatory privacy impact assessments. The current PIA requirement sits in policy under the Treasury Board Directive on Privacy Impact Assessment. The proposal would move it into statute, making PIAs a legal requirement whenever a program uses personal data to make decisions about individuals. For AI systems making or supporting administrative decisions, that means a documented assessment becomes a non-negotiable precondition for deployment.

Explicit transparency obligations for AI and automated decision systems. The reform would create statutory transparency duties when AI or ADS are used in decisions affecting individuals. This goes beyond general privacy notice requirements to address the specific challenge of AI-driven decisions: people need to know when an algorithm is involved, what it does, and how to challenge the outcome.
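To make the three elements of that duty concrete, a disclosure attached to a decision notice might carry fields like the following. This is a hypothetical sketch, not a format proposed in the policy paper; every field name is an assumption for illustration.

```python
# Hypothetical machine-readable transparency disclosure for a single
# decision notice, covering the three elements the proposed duty implies:
# that an algorithm was involved, what it does, and how to challenge it.
disclosure = {
    "automated_system_involved": True,
    "system_role": "decision-support",  # as opposed to "fully-automated"
    "what_it_does": "Scores applications for fraud risk before officer review",
    "how_to_challenge": "Request reconsideration by a human officer",
}
```

Whether a disclosure like this would sit in a privacy notice, a decision letter, or a public register is exactly what the consultation leaves open.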

Privacy as a fundamental right. The reform proposes elevating privacy to recognised fundamental right status within the Act itself. This is more than symbolic. Fundamental rights recognition creates stronger legal protections, heightens judicial scrutiny of privacy limitations, and changes how courts balance privacy against competing interests in AI deployment cases.

Stronger Commissioner powers. The proposal includes giving the Privacy Commissioner and Federal Court the powers needed to enforce compliance with all privacy obligations. Currently, the Commissioner can investigate and recommend, but cannot order compliance measures outside narrow statutory limits. Order-making powers would put real enforcement teeth behind the AI transparency obligations. Dufresne has separately advocated for binding order powers, administrative monetary penalties, and proactive audit authority as part of his seven priority recommendations for federal privacy reform.

Periodic statutory reviews. The proposal includes a requirement for periodic reviews of the Privacy Act, an explicit acknowledgement that privacy law needs to keep pace with technology. For AI governance, it means future regulatory adjustments will not require waiting another four decades.

Why public-sector reform matters for private-sector AI

The Privacy Act applies only to federal institutions, not private companies. PIPEDA governs the private sector and is on a separate (and currently stalled) reform track. So why should Australian or international AI deployers care about a Canadian public-sector consultation?

Vendors selling AI to Canadian federal institutions will inherit the obligations through procurement. If a department must conduct a statutory PIA before deploying an AI system, the vendor must supply the documentation, technical detail, and audit trail to support that assessment. SaaS providers without that material will lose contracts. The same procurement-driven governance pressure has been visible in EU member states implementing the AI Act and in California’s recent procurement-led executive order.

Public-sector privacy norms tend to bleed into private-sector expectations. When Treasury Board sets a standard for PIA scope and AI transparency in government deployments, those expectations migrate to PIPEDA enforcement, provincial privacy regulators, and corporate self-regulation. Organisations that wait for explicit private-sector law before acting will be operating against a moving baseline.

The proposed framework is a useful template even outside Canadian jurisdiction. Mandatory PIAs for AI decisions, statutory ADM transparency, and a fundamental rights framing all align with where Australian law is moving. From 10 December 2026, the Privacy and Other Legislation Amendment Act 2024 will require Australian entities using automated decision-making that affects individuals’ rights or interests to disclose that use in their privacy policies, with civil penalties for serious breaches of up to AUD 50 million. The EU AI Act’s high-risk system requirements take effect in August 2026. The frameworks are not identical, but an organisation that builds AI governance to satisfy all three is doing the same work once.

What to watch in the consultation

The reform is uncontroversial in principle. Forty-three years without a substantive update is hard to defend. The operational questions are where the consultation will be tested, and these are the ones organisations and vendors should track most closely:

Scope of the ADM definition. How broadly is “automated decision system” defined, and does it capture decision-support tools as well as fully automated decisions?

PIA threshold. What triggers a mandatory statutory PIA, and is the test based on data sensitivity, decision impact, or both?

Strength of OPC powers. Will the Commissioner get binding order-making powers, administrative monetary penalties, and proactive audit authority, or will the reform stop short of full enforcement teeth?

Vendor flow-through. How explicitly will the obligations on federal institutions translate into procurement requirements, and what documentation must vendors supply?
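The PIA threshold question can be made concrete with a toy rule. Nothing below comes from the policy paper; it is a sketch of why the "sensitivity, impact, or both" wording matters, because a conjunctive test exempts systems a disjunctive test would capture.

```python
def pia_required_or(high_sensitivity: bool, high_impact: bool) -> bool:
    # Disjunctive test: either high data sensitivity OR high decision
    # impact triggers a mandatory assessment.
    return high_sensitivity or high_impact

def pia_required_and(high_sensitivity: bool, high_impact: bool) -> bool:
    # Conjunctive test: BOTH dimensions must be high before a PIA is owed.
    return high_sensitivity and high_impact

# A system built only on non-sensitive data but with high decision impact
# (say, an eligibility ranking) is captured by one test and not the other:
pia_required_or(False, True)   # assessment required
pia_required_and(False, True)  # falls outside the obligation
```

The same gap runs the other way for sensitive data used in low-stakes tooling, which is why the statutory wording of the trigger is worth a submission.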

These are legitimate questions, not delaying tactics. Organisations and vendors with a stake should consider written submissions before the 10 July deadline.

What organisations should do

If you sell AI to Canadian federal institutions, prepare for procurement-driven compliance. Start documenting your AI systems with the level of detail a PIA would require: data sources, model purpose, decision logic, bias testing, human oversight points. Vendors that can supply this on request will have a procurement advantage. Vendors that cannot will be excluded.
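As one way to operationalise that documentation discipline, a vendor could keep a per-system record covering the fields listed above and flag what is still missing before an assessment is requested. This is a minimal sketch; the class, field names, and gap rules are hypothetical, not drawn from the Treasury Board directive.

```python
from dataclasses import dataclass, field

@dataclass
class AISystemRecord:
    """Hypothetical per-system documentation record covering the
    elements a privacy impact assessment would typically probe."""
    system_name: str
    model_purpose: str                  # what decisions the system makes or supports
    decision_logic: str                 # plain-language account of how outputs are produced
    data_sources: list[str] = field(default_factory=list)
    bias_testing: list[str] = field(default_factory=list)       # tests run and findings
    human_oversight_points: list[str] = field(default_factory=list)
    fully_automated: bool = False       # decision-support vs. fully automated

    def pia_gaps(self) -> list[str]:
        """List documentation fields still empty -- each one is a
        question an assessor will ask and a vendor must answer."""
        gaps = []
        if not self.data_sources:
            gaps.append("data_sources")
        if not self.bias_testing:
            gaps.append("bias_testing")
        if not self.human_oversight_points:
            gaps.append("human_oversight_points")
        return gaps

record = AISystemRecord(
    system_name="benefits-triage",
    model_purpose="Rank applications for manual review priority",
    decision_logic="Risk score; top decile routed to senior officers",
    data_sources=["application form", "prior claims history"],
    human_oversight_points=["officer review before any denial"],
)
record.pia_gaps()  # -> ["bias_testing"]
```

A register of records like this, kept current, is roughly the artefact a statutory PIA process would ask a department to request from its vendor.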

Treat the proposals as a design baseline, not a Canadian-specific issue. If your AI governance programme already supports documented PIAs, ADM transparency and fundamental rights framing, you are well placed for Canada, Australia, the EU, and the next wave of US state legislation. If not, the Canadian proposals are a good map of what “good” looks like for the next 18 months of AI regulation.

The bigger pattern

Canada’s Privacy Act review fits a pattern that has become unmistakable in 2026. Regulators on multiple continents are converging on the same set of AI governance requirements: documented impact assessments, transparency about automated decisions, meaningful human oversight, and enforceable accountability. The specific statutes differ, but the operational expectations are increasingly the same. Organisations that build their AI governance to the most demanding common standard will avoid the cost and complexity of jurisdiction-by-jurisdiction retrofits. Those that wait for each individual law to come into force will keep finding they are six months behind the regulatory curve.

Sources

  • Treasury Board of Canada Secretariat, “Government of Canada launches review of the Privacy Act,” news release, 2 April 2026. canada.ca

  • Treasury Board of Canada Secretariat, “Privacy Act Modernization: Policy approaches,” April 2026. canada.ca

  • Treasury Board of Canada Secretariat, “2026 Review of the Privacy Act: Policy Approaches” detail page. canada.ca

  • Office of the Privacy Commissioner of Canada, 2024-25 Annual Report to Parliament, “Prioritizing privacy in a data-driven world.” priv.gc.ca

  • Office of the Privacy Commissioner of Canada, “Privacy Commissioner of Canada tables in Parliament Special Report on ArriveCAN app investigation,” March 2026. priv.gc.ca

  • IAPP, “What 2026 may bring for Canada’s privacy reform efforts,” February 2026. iapp.org

  • Osler, Hoskin & Harcourt LLP, “Canada’s 2026 privacy priorities: data sovereignty, open banking and AI,” December 2025. osler.com

  • Cantech Letter, “Government of Canada launches review of the Privacy Act,” 2 April 2026. cantechletter.com