Automated Decision Making and Data Protection
As consumers and authors of digital content, we are increasingly at the mercy of algorithms, with many of us relying on tools such as ChatGPT for everyday tasks. Are we seeing this century’s ‘industrial revolution’, with machines now carrying out many tasks previously performed by humans? AI and similar technologies can now perform mental tasks that were once the sole preserve of humans. This revolution is progressing at a great pace, and inevitably we are struggling to keep up with its evolution while considering how these technologies can streamline our daily lives and, in particular, our work.
The use of automated decision making (‘ADM’) provides an opportunity to streamline many processes and to assist organisations at all levels to make consistent decisions, which are fair and reasonably considered. The current data protection legal framework, whilst a little clunky and behind the ever-evolving technical landscape, aims to allow for the use of innovative technology, whilst balancing the importance of protecting individual rights and freedoms.
You or your organisation may be engaging with ADM already, or perhaps considering the implementation of processes which may include this. If so, you will need to keep up to date with your data protection obligations to individuals if processing their personal data for the purposes of ADM.
The Position under the UK General Data Protection Regulation (UK GDPR)
Article 22(1) of the UK GDPR provides that:
“the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”.
The scope of ‘legal or similar effects’ is sufficiently broad to catch various circumstances, including financial circumstances, and impact on reputation, opportunities, and choices, to name a few.
A decision will be ‘based solely’ on ADM where there is no human intervention at all in the process of reaching that decision. The Information Commissioner’s Office (ICO), as the data protection regulator, has made clear in its guidance that any human intervention must be ‘meaningful’, and more than a ‘token gesture’. This means that if the ADM you use falls within the definition set out in the UK GDPR (and is not captured by one of the exceptions), that automated processing is prohibited under the current regime.
If the process you are using does include meaningful human intervention, you must still ensure compliance with the UK GDPR more generally, i.e. getting the basics right, for example:
- a lawful basis (or bases) for the processing;
- you are keeping data subjects informed; and
- you are complying with data minimisation and security obligations.
However, fully automated processing may be permitted if the decision falls outside the prohibition. Article 22(2) UK GDPR sets out the exceptions: the prohibition does not apply where the ADM is:
- necessary for entering into, or the performance of, a contract between the data subject and the data controller;
- required or authorised by domestic law; or
- explicitly consented to by the data subject.
Processing under these exceptions is a high-risk activity. If your activity falls within them (or indeed if you are unsure), you should always carry out a Data Protection Impact Assessment (DPIA) and put safeguards in place, providing data subjects with the necessary information about the activity itself and the rights available to them. Helpfully, DPIAs are also a great tool for demonstrating your accountability. It is wise to build a human element into any ADM process, not only to avoid being captured by the prohibition, but as an important safeguard in its own right.
If you are processing special category or particularly sensitive data, the limited exceptions differ and require further safeguarding, which is not considered here.
A DPIA in these circumstances will not only assist in your consideration of the risks of your processing activity but will also set out how to mitigate those risks. Importantly, you must inform data subjects that their data is being used for automated decision making, regardless of whether the data has come to you directly from the data subject or via a third party.
Changes to the Use of Automated Decision Making under the new Data Use and Access Act 2025 (DUAA)
The new Data Use and Access Act 2025 (‘DUAA’) has recently received Royal Assent, and we anticipate it will be fully in force by December 2025. This has prompted the ICO to consider the need to update and amend much of its guidance. As a general point, DPOs should stay alert to ICO updates; although there may be little to do by way of compliance at this stage, organisations should be aware of the opportunity to start doing things differently and embracing innovation.
Broadly, the aim and impact of the DUAA is to promote innovation and make data protection compliance easier for organisations, whilst still protecting the rights of individuals. This means we will see some changes to the ADM landscape, for example:
- A relaxation of the ADM regime, largely removing the prohibition on solely automated decision making without meaningful human intervention. The prohibition will now apply only to ‘significant’ decisions and in relation to special category data.
- The potential to rely on legitimate interests as the lawful basis for processing for the purposes of ADM, so long as appropriate safeguards are applied. Unlike explicit consent, this basis does not depend on the data subject’s agreement and so cannot be withdrawn, although a Legitimate Interests Assessment will still need to be carried out.
Conclusion
In summary, the data landscape is changing as legislation attempts to keep up with digital and AI innovation. Parliament’s intention is not to stifle innovation, but as ever, there is a lag, which is why the ICO is revising all of its guidance. Provided you or your organisation can demonstrate justification for the processes being used, and care has been taken to ensure personal data is handled respectfully and safely, compliance should flow from there. Following the Hansel and Gretel model and leaving a trail of your justifications on the way in will enable you to find your way out should the regulator come knocking. If the ICO, or indeed your industry regulator, seeks to challenge the steps you have taken, a record of your activity and the rationale behind your processing should allow you to demonstrate your efforts to comply.
Talk to us
If you have any questions around data protection, please get in touch with the team at 0345 872 6666 or via our online enquiry form. Alternatively, contact Derek Millard-Smith or Lucy Barrow directly for guidance.
The majority of our work is privately paying and we will typically require a payment on account of our fees before commencing work. We do not do legally aided work.
