The growing culture of Data Subject Access Requests (DSARs)

The GDPR gives data subjects the right to access the personal data which a controller holds in relation to them. Although this may sound fairly innocuous, dealing with DSARs in practice continues to be a source of much frustration for controllers, particularly in the field of employment where DSARs are often used by disgruntled employees as part of a wider litigation strategy.

Meanwhile, the ICO’s Annual Report 2018-19 (published in July 2019) shows that subject access requests generate by far the most complaints to the regulator (at 38%). We expect that the use of DSARs will continue to be prevalent in 2020. Businesses that do not yet have processes in place for dealing with such requests should develop procedures and protocols to be followed when requests are received. To this end, the ICO published updated draft guidance on the right of access towards the end of 2019. Some key points for controllers to note are as follows:

  • Procedure for submitting requests – there is no particular procedure data subjects must follow when submitting a DSAR. Individuals do not need to label their request as a DSAR for it to be treated as such. Furthermore, individuals can submit DSARs through whatever channel they prefer (including verbally), so it is important that relevant staff are trained to recognise such requests.
  • Receiving DSARs from third parties – it is common for third parties, such as law firms, to submit DSARs on behalf of others. In such circumstances, controllers are entitled to (and should) ask the relevant third party for proof of their authorisation to act on behalf of the data subject. The onus is on the third party to provide this proof, which can take the form of a letter of authorisation or a general power of attorney.
  • Time for responding to DSARs – normally you must comply with a DSAR without undue delay and at the latest within one month of receipt of the request. You can extend the time to respond by a further two months if the request is “complex” or you have received a number of requests from the same individual. Some organisations claim the extra time on the basis that the request is complex because it involves a large volume of information. The ICO guidance indicates that, while this may add to the complexity of a request, a request is not complex solely because the individual has requested a large amount of information.

The ICO guidance provides helpful advice on the timeframe within which controllers are required to respond to DSARs, including the circumstances in which a controller may be able to extend the time for responding to a request on the basis of it being “complex” or where it has received multiple requests from the same individual.

The following are given as examples of factors that may add to the complexity of a request; however, you need to be able to demonstrate why the request is complex in the particular circumstances:

  • Technical difficulties in retrieving the information – for example if data is electronically archived.
  • Applying an exemption that involves large volumes of particularly sensitive information.
  • Any specialist work involved in redacting information or communicating it in an intelligible form.

One key area where the ICO has changed its position concerns circumstances where a controller needs to seek clarification of a DSAR. Whilst previously the ICO took the view that the statutory timeframe for responding to a DSAR would not commence until the controller received a response to any clarifications it had raised, the updated guidance changes this: the timeframe for responding now commences from the date the DSAR is received, irrespective of whether any clarifications are raised by the controller or whether the data subject has replied.
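
Purely by way of illustration, these timing rules can be sketched in a few lines of Python. The helper, the example dates and the use of the third-party python-dateutil library are our own invention, and the sketch glosses over the finer points of calculating calendar-month deadlines:

    from datetime import date
    from dateutil.relativedelta import relativedelta  # third-party: python-dateutil

    def dsar_deadline(received: date, complex_request: bool = False) -> date:
        # The clock runs from the date the DSAR is received, even if
        # clarifications are later raised with the data subject.
        deadline = received + relativedelta(months=1)
        if complex_request:
            # A "complex" request (or multiple requests from the same
            # individual) can extend the deadline by a further two months.
            deadline += relativedelta(months=2)
        return deadline

    # Invented example: a request received on 3 February 2020.
    print(dsar_deadline(date(2020, 2, 3)))                        # 2020-03-03
    print(dsar_deadline(date(2020, 2, 3), complex_request=True))  # 2020-05-03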

  • Being ready for DSARs – the ICO guidance expresses little sympathy for controllers that are unable to process DSARs efficiently, noting that subject access rights have been a feature of the law since the 1980s and that organisations should therefore have systems in place to deal with them. In our experience, many organisations do not currently have such systems, and particular difficulties arise with unstructured data such as emails (the sketch after this list illustrates the kind of search involved). While there is a growing number of third-party solutions which claim to assist, organisations are often forced to expend significant time and expense in dealing with DSARs.
  • Charging for DSARs – the guidance provides further detail as to what is meant by the “administrative” costs which can be charged by controllers where an individual submits excessive or manifestly unfounded DSARs. Printing, photocopying and postage would fall within the meaning of administrative costs; charging for employee time taken to deal with such requests – which can be significant – would not.
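
As a purely illustrative example of the email problem mentioned above, the following Python sketch performs the kind of simple keyword search over a mailbox archive that underlies much DSAR tooling. The file name and identifiers are hypothetical, and a real exercise would also need to handle attachments, multipart bodies, duplicates and redaction:

    import mailbox

    # Hypothetical identifiers for the data subject making the request.
    IDENTIFIERS = ["jane.doe@example.com", "jane doe"]

    def find_candidate_messages(path):
        # Flag messages in an mbox archive that mention any identifier.
        hits = []
        for key, msg in mailbox.mbox(path).items():
            parts = [msg.get("From", ""), msg.get("To", ""), msg.get("Subject", "")]
            payload = msg.get_payload(decode=True)
            if isinstance(payload, bytes):  # multipart bodies are skipped here
                parts.append(payload.decode("utf-8", errors="ignore"))
            text = " ".join(parts).lower()
            if any(identifier in text for identifier in IDENTIFIERS):
                hits.append((key, msg.get("Subject", "(no subject)")))
        return hits

    for key, subject in find_candidate_messages("archive.mbox"):
        print(f"Candidate message {key}: {subject}")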

Artificial Intelligence (“AI”) and data protection

In the past few years, we have seen an increasing number of organisations developing or using AI solutions. Although the business case for the use of AI is compelling, tensions can arise where its use is at odds with data protection laws.

These tensions between AI and data protection include the following:

  • Transparency – the GDPR requires you to provide individuals with notice setting out how you are using their personal data. Where there is an element of automated decision-making which results in legal effects or otherwise has a significant effect on an individual (as there often is with AI), the controller is required to provide affected individuals with “meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject”. Given the complexity of AI, and the fact that some types of AI can develop in an unsupervised environment without human intervention, it can be difficult to meet these requirements (the toy sketch after this list illustrates one way of surfacing this logic).
  • Purpose limitation, data minimisation and storage limitation – the GDPR requires that personal data is processed for specific purposes, that no more personal data than is necessary to achieve those purposes is processed, and that personal data is kept for no longer than is necessary to achieve those purposes. There is often tension between these principles and AI, since the development of an AI system can result in data being used for unexpected purposes, and often requires vast amounts of data to be fed into the system in order for it to meaningfully detect patterns and trends.
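
To make the transparency point more tangible, here is a deliberately toy Python sketch of one way of providing “meaningful information about the logic involved”: a hand-weighted scoring model whose output is accompanied by a ranked list of the factors that drove the decision. The feature names, weights and threshold are all invented, and real AI systems are rarely this simple to explain:

    # Toy, hand-weighted scoring model; feature names and weights are invented.
    WEIGHTS = {
        "years_at_address": 0.4,
        "missed_payments": -1.5,
        "income_to_debt_ratio": 0.9,
    }
    THRESHOLD = 1.0

    def decide(applicant):
        # Return the decision plus a ranked account of the main factors.
        contributions = {name: weight * applicant[name] for name, weight in WEIGHTS.items()}
        score = sum(contributions.values())
        ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
        explanation = [f"{name} ({value:+.2f})" for name, value in ranked]
        return score >= THRESHOLD, explanation

    approved, explanation = decide(
        {"years_at_address": 2, "missed_payments": 1, "income_to_debt_ratio": 1.2}
    )
    print("Approved" if approved else "Declined", "- main factors:", explanation)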

In respect of the transparency issue, the ICO has developed draft guidance on explaining AI together with the Alan Turing Institute (the UK’s national institute for data science and artificial intelligence). The guidance provides detailed information on the different ways in which businesses can explain to the individuals concerned the processing they undertake using AI, and seeks to address some of the concerns businesses may have in providing such explanations.

In addition, the ICO is working on finalising its AI auditing framework, which will address the following specific issues:

  • Accountability – which will discuss the measures that an organisation must have in place to be compliant with data protection law.
  • AI-specific risk areas – which will discuss the key risk areas the ICO has identified in relation to the use of AI in the field of data protection.

As the use of AI becomes more widespread, it is hoped that the guidance issued by the ICO will help businesses better understand and comply with their data protection obligations, whilst still allowing them to develop AI systems which can benefit organisations and individuals alike.

Fines – more to come …

Due to the timing of data incidents and the related ICO investigations, many monetary penalties in 2019 were issued under the previous legislation, the Data Protection Act 1998, rather than under the GDPR. The maximum financial penalty under the former law is £500,000, and the ICO has shown itself willing to impose it: in January 2020 it fined DSG Retail Limited (whose brands include Currys, PC World and Dixons Travel) £500,000 after a ‘point of sale’ computer system was compromised in a cyber-attack affecting at least 14 million people. Earlier, in December 2019, the ICO fined a London-based pharmacy, Doorstep Dispensaree Ltd, £275,000 for failing to ensure the security of special category data. The company, which supplies medicines to customers and care homes, had left approximately 500,000 documents in unlocked containers at the back of its premises in Edgware.

However, mega fines under the GDPR are beginning to come through. The outcome of the ICO’s statement of intention to fine Marriott International Inc £99,200,396 for a cyber incident affecting approximately 339 million guest records globally is still awaited, as is the outcome of its statement of intention to fine British Airways (BA) £183.39 million for a cyber incident which affected approximately 500,000 BA customers. According to reports, the deadline for both companies to reply to the notices of intention has been extended to 31 March 2020.

We expect to see more eye-watering regulatory action of this kind in 2020.

Meanwhile, an important point of housekeeping: companies should ensure that they register with the ICO and pay their data protection fee (unless exempt), as the ICO has launched a campaign to contact organisations and remind them about payment of the fee. The ICO issued 340 monetary penalty notices for non-payment of the data protection fee between 1 July and 30 September 2019.

Class action compensation claims

The GDPR gives supervisory authorities the power to issue huge administrative fines (and we have seen the ICO demonstrate its intent to levy such fines). It also gives individuals the right to seek compensation from controllers and processors which fail to comply with its provisions. This is set to provide fertile ground for claimants bringing actions in this area, and we expect the number of claims for data protection violations to increase significantly over the course of 2020.

Of particular interest is the rising number of class actions being brought for data protection violations.

Towards the end of 2019, in the case of Lloyd v Google LLC, the Court of Appeal overturned an earlier decision of the High Court and allowed proceedings to be served on Google in the US in respect of its allegedly unlawful use of cookies on iPhone users’ devices over a period running from 2011 to 2012. This secret use of cookies (referred to in the case as the “Safari workaround”) allowed Google to gather and subsequently sell certain user data.

The decision of the Court of Appeal was significant since it allowed the case to be brought on an opt-out basis on behalf of all iPhone users affected by Google’s conduct over the relevant period. The Court of Appeal found this acceptable since all members of the class had the same “interests” (i.e. they had all suffered the same alleged wrong). This could have broad ramifications in the area of data protection, since violations often affect a large number of individuals rather than being one-off events affecting specific individuals (e.g. where an organisation unlawfully sends marketing communications to its entire mailing list).

Many commentators have therefore suggested that the decision by the Court of Appeal in Lloyd v Google LLC could result in the floodgates opening for class action claims in relation to data protection violations. To a certain extent, this has already materialised, with a number of data protection class actions currently being fought out in the UK courts. Organisations which have suffered security incidents would appear to be at particular risk, with each of Morrisons, Equifax and British Airways currently litigating class actions in the aftermath of high-profile data breaches.

While the amounts awarded to individuals may be modest, in the event of a class action involving a large number of claimants, the potential total damages could dwarf the fines that could be imposed by the regulator.

Accountability – sounds good, but what does it actually mean?

The GDPR sets out six principles relating to the processing of personal data. These include ‘lawfulness, fairness and transparency’, ‘purpose limitation’ and ‘data minimisation’. But then the GDPR adds another principle – that the controller “shall be responsible for, and be able to demonstrate compliance with” these six principles. This is referred to as the “accountability” principle. The ICO has said that “Accountability encapsulates everything the GDPR is about”. But what does it actually mean in practice?

Accountability is about putting data protection at the heart of your organisation. It means that you must consider data protection and privacy issues upfront when you are planning any new initiative. It includes things like:

  • implementing data protection policies;
  • recording your processing (see the sketch after this list);
  • taking a data protection by design and by default approach;
  • having written contracts in place with processors;
  • implementing appropriate data security measures;
  • recording and, where necessary, reporting data breaches;
  • appointing a data protection officer;
  • establishing processes for handling data subject rights requests; and
  • carrying out data protection impact assessments where needed.
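
To make the record-keeping point concrete, here is a minimal, illustrative Python sketch of the kind of structured record of processing activities (along the lines of Article 30 GDPR) an organisation might maintain. The fields are a simplified subset and the example values are invented:

    from dataclasses import dataclass, field

    @dataclass
    class ProcessingRecord:
        # Simplified subset of the fields contemplated by Article 30(1) GDPR.
        purpose: str
        data_subject_categories: list
        personal_data_categories: list
        recipients: list = field(default_factory=list)
        retention_period: str = "unspecified"
        security_measures: list = field(default_factory=list)

    # Invented example entry.
    payroll = ProcessingRecord(
        purpose="Payroll administration",
        data_subject_categories=["employees"],
        personal_data_categories=["name", "bank details", "salary"],
        recipients=["external payroll provider"],
        retention_period="6 years after employment ends",
        security_measures=["encryption at rest", "role-based access control"],
    )
    print(payroll)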

Towards the end of 2019 the ICO consulted on the idea of developing a toolkit to help organisations comply with their accountability obligations. The objective is to provide down-to-earth, practical guidance on implementing privacy management programmes, based on an understanding of the technical challenges and other barriers involved (such as securing commitment to data protection from top management).

The ICO is planning to conduct a workshop on the toolkit in early February 2020 and expects to pilot it later in the year. It is hoped that this will help organisations, whose resources are already over-stretched, to achieve a good and practical level of compliance.
