Artificial Intelligence (“AI”) and data protection

In the past few years, we have seen an increasing number of organisations developing or using AI solutions. Although the business case for the use of AI is compelling, tensions can arise where its use is at odds with data protection laws.

These tensions between AI and data protection include the following:

  • Transparency – the GDPR requires you to provide individuals with notice setting out how you are using their personal data. Where there is an element of automated decision-making which produces legal effects or otherwise significantly affects an individual (as there often is with AI), the controller is required to provide affected individuals with “meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject”. Given the complexity of AI, and the fact that some types of AI can develop unsupervised, without human intervention, it can be difficult to meet these requirements.
  • Purpose limitation, data minimisation and storage limitation – the GDPR requires that personal data is processed for specific purposes, that no more personal data than is adequate to achieve those purposes is processed, and that personal data is retained only for as long as necessary to achieve those purposes. There is often tension between these principles and AI, since the development of an AI system can result in data being used for unexpected purposes, and often requires vast amounts of data to be fed into the system in order for it to meaningfully detect patterns and trends.

In respect of the transparency issue, the ICO, together with the Alan Turing Institute (the UK’s national institute for data science and artificial intelligence), has developed draft guidance on explaining decisions made with AI. The guidance provides detailed information on the different ways in which businesses can seek to explain the processing they undertake using AI to the individuals concerned, and seeks to address some of the concerns businesses may have in providing such explanations.

In addition to the above, the ICO is also working on finalising its AI auditing framework, which will address the following specific issues:

  • Accountability – which will discuss the measures that an organisation must have in place to be compliant with data protection law.
  • AI-specific risk areas – which will discuss the key risk areas the ICO has identified in relation to the use of AI in the field of data protection.

As the use of AI becomes more widespread, it is hoped that the ICO’s guidance will help businesses better understand and comply with their data protection obligations, while still allowing them to develop AI systems which can benefit organisations and individuals alike.

Fines – more to come …

Due to the timing of data incidents and the related ICO investigations, many monetary penalties in 2019 were issued under the previous legislation, the Data Protection Act 1998, rather than under the GDPR. The maximum financial penalty under the former law is £500,000, and the ICO has shown itself willing to issue the maximum fine: in January 2020 it fined DSG Retail Limited (which trades under the Currys, PC World and Dixons Travel brands) £500,000 after a ‘point of sale’ computer system was compromised by a cyber-attack affecting at least 14 million people. Earlier, in December 2019, the ICO fined a London-based pharmacy £275,000 for failing to ensure the security of special category data. Doorstep Dispensaree Ltd, which supplies medicines to customers and care homes, had left approximately 500,000 documents in unlocked containers at the back of its premises in Edgware.

However, mega fines under the GDPR are beginning to come through. The outcome of the ICO’s statement of intention to fine Marriott International Inc £99,200,396 for a cyber incident affecting approximately 339 million guest records globally is still awaited, as is the outcome of its statement of intention to fine British Airways (BA) £183.39 million for a cyber incident which affected approximately 500,000 BA customers. According to reports, the deadline for both companies to respond to the notices of intention has been extended to 31 March 2020.

We expect to see more eye-watering regulatory action of this kind in 2020.

Meanwhile, an important point of housekeeping: companies should ensure that they register with the ICO and pay their data protection fee (unless exempt), as the ICO has launched a campaign to contact organisations to remind them about payment of the fee. The ICO issued 340 monetary penalty notices for non-payment of the data protection fee between 1 July and 30 September 2019.

Class action compensation claims

The GDPR gives supervisory authorities the power to issue huge administrative fines (and we have seen the ICO demonstrate its intent to levy such fines). It also gives individuals the right to seek compensation from controllers and processors which fail to comply with its provisions. This is set to provide fertile ground for claimants bringing actions in this area, and we expect the number of claims for data protection violations to increase significantly over the course of 2020.

Of particular interest is the rising number of class actions being brought for data protection-related breaches.

Towards the end of 2019, in the case of Lloyd v Google LLC, the Court of Appeal overturned an earlier decision of the High Court, allowing proceedings to be served on Google in the US for its allegedly unlawful use of cookies on iPhone users’ devices during a period running from 2011 to 2012. This secret use of cookies (referred to in the case as the “Safari workaround”) allowed Google to gather and subsequently sell certain user data.

The decision of the Court of Appeal was significant since it allowed the case to be brought on an opt-out basis on behalf of all iPhone users affected by Google’s conduct over the relevant period. The Court of Appeal found this acceptable since all members of the class had the same “interest” (i.e. they had all suffered the same alleged wrong). This could have broad ramifications in the area of data protection, since violations will often affect a large number of individuals rather than being one-off events affecting specific individuals (e.g. where an organisation unlawfully sends marketing communications to its entire mailing list).

Many commentators have therefore suggested that the Court of Appeal’s decision in Lloyd v Google LLC could open the floodgates for class action claims in relation to data protection violations. To a certain extent, this has already materialised, with a number of data protection class actions currently being fought out in the UK courts. Organisations which have suffered security incidents would appear to be at particular risk: Morrisons, Equifax and British Airways are each currently litigating class actions in the aftermath of high-profile data breaches.

While the amounts awarded to individuals may be modest, in the event of a class action involving a large number of claimants, the potential total damages could dwarf the fines that could be imposed by the regulator.

Accountability – sounds good, but what does it actually mean?

The GDPR sets out six principles relating to the processing of personal data. These include ‘lawfulness, fairness and transparency’, ‘purpose limitation’ and ‘data minimisation’. But then the GDPR adds another principle – that the controller “shall be responsible for, and be able to demonstrate compliance with” these six principles. This is referred to as the “accountability” principle. The ICO has said that “Accountability encapsulates everything the GDPR is about”. But what does it actually mean in practice?

Accountability is about putting data protection at the heart of your organisation. It means that you must consider data protection and privacy issues upfront when you are planning any new initiative. It includes things like:

  • implementing data protection policies;
  • recording your processing;
  • taking a data protection by design and by default approach;
  • having written contracts in place with processors;
  • implementing appropriate data security measures;
  • recording and, where necessary, reporting data breaches;
  • appointing a data protection officer;
  • establishing processes for handling data subject rights requests; and
  • carrying out data protection impact assessments where needed.

Towards the end of 2019, the ICO consulted on the idea of developing a toolkit to help organisations comply with their accountability obligations. The objective is to provide down-to-earth, practical guidance on implementing privacy management programmes, based on an understanding of the technical challenges and other barriers involved (such as securing commitment to data protection from top management).

The ICO is planning to conduct a workshop on the toolkit in early February 2020 and expects to pilot the toolkit later in the year. It is hoped that this will help organisations, whose resources are already over-stretched, to achieve a good and practical level of compliance.

Data security – what’s “appropriate”?

The ‘integrity and confidentiality’ principle of the GDPR – also known as the security principle – requires that you have appropriate security measures in place to protect the personal data you hold. In terms of data security, the central obligation under the GDPR is “taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, … [to] implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk”.

The GDPR is not prescriptive as to what this means and there is no “one size fits all” solution – the GDPR takes a risk-based approach. It says that these measures may include pseudonymisation and encryption of personal data, and implementing a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.

Pseudonymisation (for example, replacing names with a number or token) does not take data outside the GDPR – pseudonymised data remains personal data – but it is a good technique for securing data, for example when sharing it with others. On the other hand, the GDPR makes clear that data protection laws do not apply to anonymised information (information which does not relate to an identifiable person). The GDPR does not go into any detail on how to anonymise data, and organisations often refer to personal data as having been ‘anonymised’ when, in fact, this is not the case. This presents a risk that you disregard the terms of the GDPR in the mistaken belief that you are not processing personal data. The ICO issued a code on anonymisation under the old Data Protection Act, and in 2020 we can expect an update to this code.
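
To make the distinction concrete, the sketch below shows one common pseudonymisation technique: replacing names with stable tokens derived from a keyed hash. This is a minimal illustration in Python, not a complete solution; the records, field names and key handling are hypothetical, and the output remains personal data under the GDPR because anyone holding the key (or a lookup table) can re-identify the individuals.

```python
import hmac
import hashlib

# Hypothetical pseudonymisation key: in practice this would be generated
# securely and held separately from the data set it protects.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymise(name: str) -> str:
    """Replace a name with a stable token derived from a keyed hash (HMAC).

    Without the key the token cannot feasibly be reversed, but the data
    remains personal data under the GDPR, because the key holder can
    still link tokens back to individuals.
    """
    return hmac.new(SECRET_KEY, name.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

# Hypothetical records for illustration only.
records = [
    {"name": "Jane Doe", "medication": "amoxicillin"},
    {"name": "John Smith", "medication": "ibuprofen"},
]

# Share the pseudonymised view; keep the key (and any lookup table) separate.
shared = [{"id": pseudonymise(r["name"]), "medication": r["medication"]} for r in records]
print(shared)
```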

Encryption is a key tool for data security. As it is an established, widely deployed technology, failing to encrypt data in transit or at rest risks breaching the security principle and could lead to fines if the data is compromised.

On the other hand, in the event of a data breach where the data had been effectively encrypted, there would be no requirement to notify data subjects of the breach, since the encrypted data would be “unintelligible” and would therefore pose no risk to them.
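
As a minimal sketch of what encryption at rest can look like in practice, the Python example below uses the third-party cryptography package; the library choice and the sample record are our assumptions for illustration, not something mandated by the GDPR or the ICO. It illustrates the point above: without the key, the stored ciphertext is unintelligible.

```python
# A minimal sketch of encrypting personal data at rest, assuming the
# third-party 'cryptography' package is installed (pip install cryptography).
from cryptography.fernet import Fernet

# Generate the key once and store it in a secrets manager or key vault,
# never alongside the encrypted data.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt personal data before writing it to disk.
plaintext = b"Jane Doe, 1 High Street, Edgware"  # hypothetical record
token = fernet.encrypt(plaintext)

# To anyone without the key, 'token' is unintelligible ciphertext;
# with the key, decryption restores the original bytes.
assert fernet.decrypt(token) == plaintext
```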

However, the biggest causes of data breaches are relatively unsophisticated issues, such as data being sent to the wrong recipient and email users falling for a phishing attack. While there are effective technologies that can help prevent these sorts of error, employee awareness and training programmes will go a long way to protect against them, and are an important part of the “accountability” principle (see Accountability – sounds good, but what does it actually mean? above).
