On 16 April 2026, the UK Secretary of State for Science, Innovation and Technology signed a statutory instrument that removes the word “optional” from AI governance guidance. The Data Protection Act 2018 (Code of Practice on Artificial Intelligence and Automated Decision-Making) Regulations 2026 (SI 2026/425) require the Information Commissioner to “prepare an appropriate code of practice” on the processing of personal data in relation to developing and using AI and automated decision-making. The instrument was laid before Parliament on 21 April 2026 and comes into force on 12 May 2026.

This is a statutory obligation, not guidance the ICO may choose to produce. The Commissioner must write the code. The code must cover AI and automated decision-making under the UK GDPR and the Data Protection Act 2018. And it must include specific guidance on the processing of children’s personal data.

What SI 2026/425 actually does

The instrument is three regulations long. Its brevity is deceptive.

Regulation 2(1) states that the Commissioner “must prepare an appropriate code of practice giving guidance as to good practice in the processing of personal data under the relevant data protection legislation” in relation to developing and using AI and automated decision-making.

Regulation 2(2) adds that the code “must include guidance as to good practice in the processing of children’s personal data.” This is not discretionary. Any code the ICO produces must address how AI and automated systems handle data belonging to minors.

Regulation 2(3) defines “automated decision-making” by reference to Article 22C(1) of the UK GDPR and section 50C(1) of the Data Protection Act 2018. These are the provisions introduced by the Data Protection and Digital Information Act 2024 (DPDIA) that replaced the old Article 22 framework with a new, expanded regime covering “significant decisions” made by automated means.

Regulation 3 modifies the review panel process: the panel that reviews the draft code is explicitly prohibited from considering or reporting on any aspect relating to national security.

The explanatory note confirms the scope covers the UK GDPR and the Data Protection Act 2018 except Part 4 (intelligence services processing). The code will apply to all sectors, all organisation sizes, and all AI systems that process personal data, from recruitment tools and credit scoring to fraud detection and customer service chatbots.

Why a statutory code is different from ICO guidance

The ICO already publishes guidance on automated decision-making and profiling. That guidance is helpful but non-binding. Organisations can choose to follow it or not. A regulator investigating a complaint can reference it, but it does not carry the weight of a code issued under statutory authority.

A statutory code of practice changes the dynamic. Under section 124B of the Data Protection Act 2018, the ICO’s code must go through a formal preparation process, including consultation, a review panel, and parliamentary procedure. Once issued, the code becomes the reference standard for what “good practice” looks like. The ICO can use it as a benchmark in investigations and enforcement actions. Courts can consider it when assessing whether an organisation’s data processing was lawful and fair.

For compliance teams, the practical effect is that the code will define the standard against which AI governance will be measured. Demonstrating alignment with the code will strengthen an organisation’s position if investigated. Failing to align will require an explanation.

What the DPDIA changed about automated decisions

The Data Protection and Digital Information Act 2024 replaced Article 22 of the UK GDPR with a new framework built around Articles 22A through 22D and equivalent sections in the DPA 2018. The old framework applied only to “solely automated” decisions with “legal or similarly significant” effects and gave individuals a right to opt out. The new framework is broader.

Under Article 22C, the trigger is now “significant decisions” made using automated processing, with “significant” defined more broadly than the old “legal or similarly significant” test. The right is no longer an opt-out; it is a right to be informed that a significant decision has been made, a right to a meaningful explanation, and a right to have a human reconsider the decision.

SI 2026/425 tells the ICO to write guidance on exactly how organisations should implement these obligations. That means the code will need to address what counts as a “significant decision,” what constitutes a “meaningful explanation,” how human oversight should work in practice, and how organisations should document their automated systems to meet these standards.
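To make the shape of those obligations concrete, the rights under the Article 22C framework could be tracked per decision in a record like the following. This is an illustrative sketch, not a statutory form: every class, field, and method name here is hypothetical, drawn from neither the legislation nor any ICO guidance.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class SignificantDecisionRecord:
    """Illustrative record of one significant automated decision
    under the DPDIA's Article 22C framework. All names hypothetical."""
    data_subject_id: str
    decision: str                      # outcome of the automated processing
    made_at: datetime
    subject_informed: bool             # right to be informed of the decision
    explanation: Optional[str] = None  # the "meaningful explanation", if given
    human_review_requested: bool = False
    review_outcome: Optional[str] = None  # result of human reconsideration

    def outstanding_obligations(self) -> list[str]:
        """Flag which of the DPDIA rights have not yet been satisfied."""
        gaps = []
        if not self.subject_informed:
            gaps.append("notify data subject")
        if not self.explanation:
            gaps.append("provide meaningful explanation")
        if self.human_review_requested and self.review_outcome is None:
            gaps.append("complete human reconsideration")
        return gaps
```

Documenting decisions in a structure like this, whatever its exact fields, is the kind of evidence trail the code is likely to ask organisations to maintain.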

What the children’s data requirement means

The mandatory inclusion of children’s data guidance is significant. The UK already has the Age Appropriate Design Code (Children’s Code), which applies to online services likely to be accessed by children. SI 2026/425 extends that focus into AI and automated decision-making specifically.

This means the ICO’s code will need to address AI systems that process children’s data even when those systems are not child-facing services. Recruitment AI that processes data about young job applicants, educational technology that uses automated assessment, healthcare AI that makes decisions about minors, and social media content moderation systems that profile users by age will all be in scope.

For organisations that have not mapped whether their AI systems process children’s data, the code will expose a compliance gap that may go unnoticed until a complaint or an investigation tests it.

What this means for organisations outside the UK

The UK GDPR applies to organisations that process the personal data of UK residents, regardless of where the organisation is headquartered. An Australian company that uses AI to process job applications from UK-based candidates, or a US SaaS provider that deploys automated content moderation for UK users, will be subject to whatever standards the ICO’s code establishes.

The territorial reach mirrors the EU AI Act’s extraterritorial scope. Organisations that already face EU AI Act obligations will need to map the ICO’s code alongside them. The two regimes are not identical. The EU framework is risk-classification-based; the UK framework is data-protection-based, anchored in the UK GDPR rather than a standalone AI regulation. Compliance with one does not guarantee compliance with the other.

What compliance and IT teams should do now

Inventory automated decision-making systems. Identify every system that makes or contributes to decisions about individuals using automated processing. This includes AI recruitment tools, credit scoring models, fraud detection systems, pricing algorithms, content moderation tools, and customer service chatbots. Map which of these make “significant decisions” as defined under the DPDIA’s new Article 22C framework.

Update data protection impact assessments (DPIAs). Existing DPIAs for automated systems may not reflect the DPDIA’s expanded definition of automated decision-making. Review and update them before the code is issued, so the organisation has a documented baseline to measure against.

Identify children’s data processing. Determine whether any AI or automated systems process data belonging to individuals under 18. This includes systems that may process children’s data incidentally, such as recruitment tools that receive applications from 16- and 17-year-old school leavers or health platforms that serve family accounts.

Map the explanation and human review chain. Under the DPDIA framework, individuals affected by significant automated decisions have a right to a meaningful explanation and human reconsideration. Organisations need to know, before the code lands, whether their systems can provide those explanations and whether a human review process exists.

Monitor the ICO’s timeline. SI 2026/425 requires the ICO to prepare the code; it does not set a deadline for completion. The preparation process includes consultation and parliamentary procedure. Organisations should monitor the ICO’s public announcements and consultation schedule to ensure they can respond to the draft code when it is published.
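The inventory and gap-mapping steps above can be sketched as a simple register. This is a minimal illustration under assumed attributes; the system names, fields, and action strings are invented, not taken from SI 2026/425 or any ICO material.

```python
from dataclasses import dataclass

@dataclass
class AISystem:
    """One entry in an AI system inventory. Attribute names illustrative."""
    name: str
    makes_significant_decisions: bool  # Article 22C trigger applies
    processes_childrens_data: bool     # under-18 data, even incidentally
    dpia_current: bool                 # DPIA reflects the DPDIA framework
    human_review_in_place: bool        # reconsideration route exists

def compliance_gaps(systems: list[AISystem]) -> dict[str, list[str]]:
    """Return, per system, the open action items from the checklist above."""
    gaps: dict[str, list[str]] = {}
    for s in systems:
        issues = []
        if s.makes_significant_decisions and not s.dpia_current:
            issues.append("update DPIA for expanded ADM definition")
        if s.makes_significant_decisions and not s.human_review_in_place:
            issues.append("establish human review and explanation route")
        if s.processes_childrens_data:
            issues.append("assess against children's data guidance")
        if issues:
            gaps[s.name] = issues
    return gaps

inventory = [
    AISystem("recruitment-screener", True, True, False, False),
    AISystem("support-chatbot", False, False, True, True),
]
```

Running `compliance_gaps(inventory)` flags the recruitment screener on all three counts and leaves the chatbot clean, which is the point of the exercise: the register makes visible which systems need work before the draft code arrives.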

The deadline behind the deadline

The 12 May 2026 coming-into-force date is the date the ICO’s statutory obligation begins, not the date the code itself will be published. The ICO will need to draft the code, consult on it, submit it to a review panel, and lay it before Parliament. That process could take months.

The risk for organisations is treating the gap between 12 May and the code’s eventual publication as free time. It is not. The underlying obligations in the DPDIA are already in force. The code will clarify how the ICO expects those obligations to be met, but the obligations themselves do not wait for the code. Organisations that are already building compliant AI governance frameworks will be ahead. Organisations that wait for the code before starting will be playing catch-up against a standard they had no input into shaping.

The agentic AI guidance from the CMA and DRCF, covered in a recent SAW article, and the EU AI Act’s shifting timelines both point in the same direction: regulators are building the frameworks. The question is whether organisations will be ready when those frameworks are applied.

Sources

  • UK Government, “The Data Protection Act 2018 (Code of Practice on Artificial Intelligence and Automated Decision-Making) Regulations 2026,” SI 2026/425, made 16 April 2026, laid 21 April 2026, in force 12 May 2026 (full statutory text, Regulations 1-3, explanatory note). legislation.gov.uk
  • GovPing (ChangeFlow), “UK Requires AI Code of Practice Under Data Protection,” 20 April 2026 (summary of SI 2026/425, timeline, scope). changeflow.com
  • LexisNexis, “Information Law weekly highlights, 23 April 2026,” 23 April 2026 (professional services coverage, legal context). lexisnexis.co.uk
  • ICO, “Rights related to automated decision making including profiling,” guidance page, updated 2025 (existing ICO ADM guidance, pre-code baseline). ico.org.uk
  • Osborne Clarke, “Artificial intelligence: UK Regulatory Outlook February 2026,” 8 February 2026 (DPDIA implementation context, regulatory landscape). osborneclarke.com