Updated Guidance on Data Protection Impact Assessments (DPIAs)

Sian Barr

The Information Commissioner’s Office (ICO) has recently updated its guidance on conducting DPIAs following guidance and recommendations from the European Data Protection Board.

A DPIA is mandatory if you are carrying out processing which is likely to result in a high risk to individuals.  The GDPR requires controllers to go through a DPIA process if they plan to:

  • use systematic and extensive automated processing (i.e. profiling) with legal or other significant effects;
  • process special category or criminal offence data on a large scale; or
  • carry out large scale systematic monitoring of a publicly accessible place.

However, the three examples of high risk processing identified in the GDPR are not exhaustive.  The ICO’s newly updated guidance is helpful in determining whether processing operations that do not fit neatly into one or more of the three categories above nonetheless warrant a DPIA because they are high risk.

The ICO directs those who are assessing whether or not their processing is high risk to consider the guidelines on DPIAs (WP248 rev 01) adopted by the Article 29 Working Party and endorsed by the European Data Protection Board (the “European guidelines”).  The European guidelines contain nine criteria for assessing high risk processing operations, summarised here:

  1. Evaluation or scoring, including profiling and predicting, especially from aspects concerning the data subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements.
  2. Automated-decision making with legal or similar significant effect.
  3. Systematic monitoring: processing used to observe, monitor or control data subjects, including data collected through networks or a systematic monitoring of a publicly accessible area.
  4. Sensitive data or data of a highly personal nature.
  5. Data processed on a large scale.
  6. Matching or combining datasets.
  7. Data concerning vulnerable data subjects e.g. children or employees.
  8. Innovative use or applying new technological or organisational solutions.
  9. When the processing in itself “prevents data subjects from exercising a right or using a service or a contract” e.g. screening or eligibility checks.

If your processing meets two or more of these criteria, the European guidelines state that a DPIA will be required in most cases.  Beware, though: processing meeting only one of the criteria can also be high risk and require a DPIA.  The European guidelines also contain useful examples of how the criteria can be applied effectively.

The ICO guidelines then provide a further list of processing operations in respect of which the ICO requires a DPIA:

  • using innovative technology (in combination with any of the criteria from the European guidelines);
  • using profiling or special category data to decide on access to services;
  • profiling individuals on a large scale;
  • processing biometric data (in combination with any of the criteria from the European guidelines);
  • processing genetic data (in combination with any of the criteria from the European guidelines);
  • matching data or combining datasets from different sources;
  • collecting personal data from a source other than the individual without providing them with a privacy notice (‘invisible processing’);
  • tracking individuals’ location or behaviour;
  • profiling children or targeting marketing or online services at them; or
  • processing data that might endanger the individual’s physical health or safety in the event of a security breach.

The ICO has to a certain degree relaxed its own criteria for determining high risk processing, in that a DPIA is now only mandatory for the use of biometric data, genetic data or innovative technology when combined with one of the criteria from the European guidelines.

Finally, a brief reminder as to why it is important to make the correct decision when it comes to DPIAs: failure to carry out a mandatory DPIA may result in enforcement action, including an administrative fine of up to €10 million, or 2% of global annual turnover if higher.  So, while it can never be wrong to carry out a DPIA, the consequences can be serious if you are required to undertake one but fail to do so.


Sian Barr is a Senior Associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at sbarr@foxwilliams.com

€50m fine for Google in France

Nigel Miller

On 21 January 2019, the French data protection authority, the CNIL, imposed an eye-watering €50m penalty on Google under the GDPR. Side-stepping the €20m maximum, the CNIL issued a turnover-related fine, highlighting that the maximum possible fine under the GDPR is €20m or 4% of global annual turnover, whichever is the greater.

The investigation was initiated as a result of complaints made within days of the GDPR coming into effect in May 2018. The complaints were from the associations None Of Your Business (“NOYB”) and La Quadrature du Net (“LQDN”), mandated by 10,000 people to refer the matter to the CNIL.

The case concerns personalised ads on smartphone devices using the Android operating system with a Google account. The regulator found two types of breaches of the GDPR: first, a lack of transparency, and second, a lack of valid consent regarding the targeted ads.

Transparency

The requirement for transparency goes to the heart of the GDPR and applies to all processing.

The CNIL looked at the information and process a user goes through when setting up the account. It found that the information provided by Google is not easily accessible. Essential information, such as the purposes of the processing, data storage periods and the categories of personal data used for ads personalisation, is disseminated across several documents. In particular, the information is not clear enough for a user to understand that the legal basis of processing for ads personalisation is consent, and not the legitimate interest of Google.

Meanwhile, the processing operations are “massive and intrusive” because of the number of services offered (e.g. Google Search, YouTube, Google Home, Google Maps, Play Store, Google Photos…), and the amount and nature of the data processed and combined.

Lack of a legal basis

The GDPR also requires a legal basis for processing. Consent is one of the possible legal bases, and the GDPR significantly raised the bar for obtaining valid consent.

The CNIL decided that the user’s consent is not validly obtained, for two reasons. First, the consent is not sufficiently informed – a lack of transparency is fatal to obtaining valid consent. Second, the consent collected is neither “specific” nor “unambiguous”. The user gives his or her consent for all processing operations together, whereas consent is “specific” under the GDPR only if it is given distinctly for each purpose, i.e. a separate consent for each separate processing operation.

Google has said it will appeal the decision.

The case highlights the imperative of, as well as the difficulties in, obtaining valid consent, especially in the complex and mystifying world of targeted advertising, where presenting transparent, intelligible information to a user in order to inform consent is challenging.

Nigel Miller is a partner in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at nmiller@foxwilliams.com