What is the UK Data (Use and Access) Act 2025? What it means for AI in recruitment

Written by
Kiku
6 min read

The UK's data protection landscape changed significantly in 2026. The Data (Use and Access) Act 2025 (known as the DUAA) received Royal Assent in June 2025 and has been coming into force in phases. For employers and recruiters using AI tools in their hiring processes, the most significant provisions landed on 5 February 2026, with a further deadline on 19 June 2026.

This isn't a complete overhaul of UK data law. The DUAA amends, but does not replace, the UK GDPR and the Data Protection Act 2018. Even so, the practical changes for recruiters are substantial, particularly around automated decision-making, which is where most AI hiring tools sit.

This article explains what the DUAA is, what it changes for AI in recruitment, and the six concrete steps you need to take to stay compliant. If you also operate in the EU, read our guide to the EU AI Act and what it means for frontline hiring. The two regimes are moving in different directions in ways that matter for vendor decisions.

What is the UK Data (Use and Access) Act 2025?

The Data (Use and Access) Act 2025 is a wide-ranging piece of UK legislation that updates the country's data protection framework following Brexit. It builds on the UK GDPR and the Data Protection Act 2018 rather than replacing them.

Before getting into the steps, a few key terms worth keeping in mind:

  • Data (Use and Access) Act (DUAA): The most significant update to UK data law since Brexit. It changes how employers use personal data and automated tools by amending the 2018 framework.
  • UK GDPR: The UK's core data-protection framework. The DUAA adds rules around automated decisions, complaints, and data access requests.
  • Automated Decision-Making (ADM): Decisions made solely by technology without meaningful human involvement. The DUAA's changes to ADM rules are the most significant shift for recruiters using AI screening and assessment tools.
  • Applicant: Under the Act, an applicant is anyone who applies for, or expresses interest in, a role, even if they stop partway through the process. Rights and obligations apply from the point of data collection, not only for completed applications.

Commencement is phased. The key employment-relevant provisions, including ADM changes, came into force on 5 February 2026. The mandatory complaints procedure follows on 19 June 2026.

What changes does the DUAA make to automated decision-making?

Under the previous UK GDPR framework, automated decision-making that produced significant effects on individuals (such as shortlisting or rejecting a candidate based solely on AI scoring) was tightly restricted. You needed to rely on one of a narrow set of legal bases to use it at all.

The DUAA relaxes this. From 5 February 2026, organisations can use automated decision-making across the full range of lawful bases, including legitimate interests. That means employers can now use AI-assisted screening and assessment tools more freely, provided three conditions are met:

  • No special category data is involved
  • A valid lawful basis for the processing is established
  • The required safeguards are implemented

The required safeguards are: informing individuals when a significant automated decision has been made about them; giving them the right to make representations and contest the decision; and enabling them to request human review of the outcome.

Special category data rules remain strict. If your AI tool processes health data, disability information, or other protected category data, the relaxed rules do not apply. You will need explicit consent or a substantial public interest basis, and existing safeguards continue in full.

This is a meaningful divergence from the EU position. The EU AI Act classifies AI tools used in employment decisions as high-risk systems with extensive obligations on both vendors and deployers. The DUAA moves in a different direction, one that is more permissive on use provided transparency and human oversight are in place. For organisations operating across both markets, both frameworks apply. Our EU AI Act vendor evaluation guide covers what to look for on the EU side.

Six steps to DUAA compliance for recruiters

Step 1: Update your privacy notices

Both employee and applicant privacy notices need to be updated to reflect the DUAA's requirements.

For current employees, your notice should clearly explain what personal data is collected and how it is used, whether any AI or automated tools are involved in decisions about them, and their right to challenge automated decisions and request human review.

For job applicants, an updated notice must be provided at the point of data collection, for example when a candidate submits a CV or fills in an application form. It must cover any automated screening, ranking, or assessment tools in use, how applicants can challenge automated decisions, and how long their data will be retained.

Action: Audit your ATS and careers portal to confirm that updated notices appear before candidate data is collected.

Step 2: Map and disclose all AI and automated decision-making tools

The DUAA expands what's permissible, but only if safeguards and transparency are in place. That starts with knowing exactly which automated tools you use and disclosing them.

For applicants, this covers CV screening and candidate ranking tools, online assessments and scoring, and automated rejection or progression systems.

For employees, it covers performance scoring, productivity monitoring, shift scheduling, conduct flagging, and any other tools that influence decisions about them.

The rule is straightforward: individuals must be informed before AI is used in any decision that affects them. Withholding that information, blocking contestation, or removing the option for human review are all direct compliance risks.

Action: Build or update a register of all automated tools used in your recruitment and HR processes, with the lawful basis and safeguards documented for each.
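If your compliance team tracks this register in code or spreadsheets, one lightweight way to structure an entry is sketched below in Python. This is purely illustrative: the DUAA does not prescribe a format or these field names, only that the lawful basis and safeguards are documented for each tool.

```python
from dataclasses import dataclass, field

# Illustrative register entry for one automated decision-making tool.
# Field names are assumptions for this sketch, not terms from the Act.
@dataclass
class ADMToolEntry:
    tool_name: str
    purpose: str                 # e.g. "CV screening", "shift scheduling"
    affects: str                 # "applicants", "employees", or both
    lawful_basis: str            # e.g. "legitimate interests"
    uses_special_category: bool  # relaxed ADM rules do NOT apply if True
    safeguards: list[str] = field(default_factory=list)

    def relaxed_adm_rules_apply(self) -> bool:
        """Checks the three conditions described above: no special
        category data, a documented lawful basis, and all three
        DUAA safeguards (inform, contest, human review) in place."""
        required = {"informed", "contest", "human review"}
        return (
            not self.uses_special_category
            and bool(self.lawful_basis)
            and required <= set(self.safeguards)
        )

entry = ADMToolEntry(
    tool_name="CV ranking engine",
    purpose="CV screening",
    affects="applicants",
    lawful_basis="legitimate interests",
    uses_special_category=False,
    safeguards=["informed", "contest", "human review"],
)
print(entry.relaxed_adm_rules_apply())  # → True
```

A register like this doubles as evidence for your ROPA and DPIA reviews: each entry records the lawful basis and safeguards the Act expects you to be able to demonstrate.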

Step 3: Review your data subject access request process

A Data Subject Access Request (DSAR) is a formal request by an individual to access the personal data your organisation holds about them. Under the DUAA, candidates can request their application data, AI screening results, and any AI-generated notes, rankings, or scores used in the process.

The DUAA introduces a reasonable and proportionate standard for searches. You are not expected to search every system and data source if doing so would be disproportionate. The standard one-month response deadline remains, with extensions of up to two months available for complex requests provided the individual is notified in the first month.
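The one-month-plus-extension arithmetic is easy to get wrong at month ends. Here is a minimal Python sketch, assuming the ICO's calendar-month convention (the deadline falls on the corresponding date in the following month, or the last day of that month if no corresponding date exists); treat it as an illustration, not legal advice.

```python
from datetime import date
import calendar

def add_months(start: date, months: int) -> date:
    """Add calendar months, clamping to the last day of shorter months
    (e.g. 31 January + 1 month -> 28/29 February)."""
    month_index = start.month - 1 + months
    year = start.year + month_index // 12
    month = month_index % 12 + 1
    day = min(start.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def dsar_deadline(received: date, extended: bool = False) -> date:
    # Standard response window is one calendar month; complex requests
    # can be extended by up to two further months, provided the
    # individual is notified within the first month.
    return add_months(received, 3 if extended else 1)

print(dsar_deadline(date(2026, 1, 31)))                # → 2026-02-28
print(dsar_deadline(date(2026, 3, 5), extended=True))  # → 2026-06-05
```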

For large employers or staffing agencies with complex IT environments, this proportionality principle is a practical improvement. The legal and compliance risks of handling DSARs inconsistently are significant. Our guide to legal and compliance considerations for high-volume recruitment covers the wider compliance landscape for employers hiring at scale.

Action: Review your DSAR process against the DUAA's proportionality standard and confirm your team knows how to handle requests involving AI-generated data.

Step 4: Build your data complaints procedure

This is one of the DUAA's most significant new obligations and has its own deadline: 19 June 2026.

A formal, written complaints procedure must be in place and accessible to individuals before they escalate to the Information Commissioner's Office (ICO). It applies to both employees and applicants.

The procedure must cover how to raise a complaint (via online form, email, or post), how it will be investigated and by whom, acknowledgement within 30 days, and how to escalate to the ICO if unresolved.

For applicants, the process must be accessible through job adverts, application portals, the applicant privacy notice, and your careers website.

Action: Coordinate with your Data Protection Officer (or equivalent) to draft and publish your complaints procedure before June 2026. Build the form or contact route into your candidate-facing channels.

Step 5: Update your governance documents

Compliance with the DUAA requires internal records that demonstrate how your organisation manages personal data responsibly. Four documents in particular need attention.

Records of Processing Activities (ROPAs) are required for employers with 250 or more employees, or any employer carrying out high-risk processing regardless of size. If you use AI tools in recruitment, high-risk processing almost certainly applies.

Data Protection Impact Assessments (DPIAs) are required whenever a new process or technology is likely to place a high risk on individuals. Introducing or updating AI screening tools triggers this requirement.

AI-specific impact audits should assess transparency, human oversight mechanisms, and the right to contest decisions. These are the three safeguards the DUAA requires for ADM.

Data retention policies must document how long candidate and employee data is held and ensure those periods are applied consistently.

Action: Review all four documents against the DUAA's requirements and update them before your tools go live or the compliance deadline arrives.

Step 6: Train your recruitment and HR teams

Compliance depends on the people running your process, not just the documents behind it.

HR teams and line managers need to understand how automated decision-making works and what safeguards apply, how to handle DSAR and complaint requests, and how to follow the new data-protection complaints procedure.

Recruitment and talent teams need to know how automated screening and ranking tools work in practice, when and how to provide human review when an applicant requests it, what information appears in privacy notices, and how to document compliance at each stage of the hiring process.

One important distinction: the DUAA's human intervention safeguard requires genuine review, not rubber-stamping. Reviewers need to be trained to exercise independent judgement and be in a position to override automated outputs. A human being nominally present at the end of the process doesn't satisfy the requirement if they're simply approving what the system produced.

Action: Build DUAA training into onboarding for anyone involved in recruitment or HR decisions, and update existing training before the June 2026 deadline.

How does the DUAA compare to the EU AI Act?

The short version: the UK and EU are heading in different directions.

The EU AI Act treats AI tools used in employment decisions as high-risk systems with extensive obligations, including mandatory risk assessments, bias testing, technical documentation, and registration with an EU database. The compliance deadline is 2 August 2026.

The DUAA takes a more permissive approach, relaxing the restrictions on automated decision-making and leaning toward pragmatic, innovation-friendly compliance. It still requires transparency, human oversight, and candidate rights, but it frames these as safeguards for expanded use rather than conditions on restricted use.

For organisations hiring across both markets, the practical implication is that your AI tools need to satisfy both frameworks. The safeguards required under the DUAA (transparency, contestation, and human review) are also required under the EU AI Act. Getting those right serves both regimes. Our EU AI Act explainer and vendor evaluation guide cover what compliance looks like on the EU side in detail.

At Kiku, the principles the DUAA requires are built into how our platform works: transparent AI use, candidate rights to contest and request human review, and genuine human oversight. You can read more about the values we build to in our AI Manifesto.

Frequently asked questions

When did the DUAA come into force for recruiters? The key provisions, including changes to automated decision-making, came into force on 5 February 2026. The mandatory data complaints procedure comes into force on 19 June 2026.

Does the DUAA replace the UK GDPR? No. The DUAA amends the UK GDPR and the Data Protection Act 2018 but does not replace them. Existing obligations remain in place alongside the new requirements.

Can I now use AI to screen candidates automatically without consent? The DUAA makes this more permissible than before, provided no special category data is involved and you can establish a lawful basis. Legitimate interests is now available where it previously wasn't. The required safeguards still apply: transparency, the right to contest, and human intervention on request.

What counts as special category data in a recruitment context? Health conditions, disabilities, racial or ethnic origin, religious beliefs, sexual orientation, and trade union membership are all special category data under UK law. If your screening tools process any of this, the stricter ADM rules apply.

What is a DSAR, and does it cover AI-generated data? A Data Subject Access Request is a formal request to access personal data held about an individual. It covers AI screening results, AI-generated notes, rankings, and scores used in the hiring process.

What are the penalties for non-compliance? The ICO retains enforcement powers including fines of up to £17.5 million or 4% of global annual turnover for serious breaches. The DUAA also aligns PECR fines with those under the UK GDPR.

Is Kiku compliant with the DUAA? Yes. Transparency about AI use, candidate rights to contest and request human review, and meaningful human oversight are built into how Kiku operates. Get in touch to discuss what compliance looks like for your hiring process.

Explore Kiku's AI-driven products

Ready to recruit smarter?

Let us show you how AI can lighten your everyday workload and free up your time.