Class action compensation claims

The GDPR gives supervisory authorities the power to issue substantial administrative fines (and we have seen the ICO demonstrate its intent to levy such fines). It also gives individuals the right to seek compensation from controllers and processors which fail to comply with its provisions. This is set to provide fertile ground for claimants bringing actions in this area, and we expect the number of claims for data protection violations to increase significantly over the course of 2020.

Of particular interest is the rising number of class actions being brought for data protection related violations.

Towards the end of 2019, in the case of Lloyd v Google LLC, the Court of Appeal overturned an earlier decision of the High Court, allowing proceedings to be served on Google in the US in respect of its allegedly unlawful use of cookies on iPhone users’ devices over a period running from 2011 to 2012. This secret use of cookies (referred to as the “Safari workaround” in the case) allowed Google to gather and subsequently sell certain user data.

The decision of the Court of Appeal was significant since it allowed the case to be brought on behalf of all iPhone users affected by Google’s conduct over the relevant period on an opt-out basis. The Court of Appeal found this acceptable since all members of the class had the same “interests” (i.e. they had all suffered the same alleged wrong). This could have broad ramifications in the area of data protection, since violations will often impact a large number of individuals rather than being one-off events affecting specific individuals (e.g. where an organisation unlawfully sends marketing communications to its entire mailing list).

Many commentators have therefore suggested that the decision by the Court of Appeal in Lloyd v Google LLC could result in the floodgates opening for class action claims in relation to data protection violations. To a certain extent, this has already materialised, with a number of data protection class actions currently being fought out in the UK courts. Organisations which have suffered security incidents would appear to be at particular risk, with each of Morrisons, Equifax and British Airways currently litigating class actions in the aftermath of high-profile data breaches.

While the amounts awarded to individuals may be modest, in the event of a class action involving a large number of claimants, the potential total damages could dwarf the fines that could be imposed by the regulator.


Accountability – sounds good, but what does it actually mean?

The GDPR sets out six principles relating to processing of personal data. These include ‘lawfulness, fairness and transparency’, ‘purpose limitation’ and ‘data minimisation’. But then the GDPR adds another principle – that the controller “shall be responsible for, and be able to demonstrate compliance with” these six principles. This is referred to as the “accountability” principle. The ICO has said that “Accountability encapsulates everything the GDPR is about”. But what does it actually mean in practice?

Accountability is about putting data protection at the heart of your organisation. It means that you must consider data protection and privacy issues upfront when you are planning any new initiative. It includes things like:

  • implementing data protection policies;
  • recording your processing;
  • taking a data protection by design and by default approach;
  • having written contracts in place with processors;
  • implementing appropriate data security measures;
  • recording and, where necessary, reporting data breaches;
  • appointing a data protection officer;
  • establishing processes for handling data subject rights requests; and
  • carrying out data protection impact assessments where needed.

Towards the end of 2019 the ICO consulted on the idea of developing a toolkit to help organisations comply with their accountability obligations. The objective is to provide down-to-earth, practical guidance on implementing privacy management programmes, based on an understanding of technical challenges and other barriers (such as securing commitment to data protection from top management).

The ICO is planning to conduct a workshop on the toolkit in early February 2020. Following this, it expects to pilot the toolkit later in the year. It is hoped that this will help organisations, whose resources are already over-stretched, to achieve a good and practical level of compliance.


The use of location data by mobile apps post-GDPR

This article was first published on Lexis®PSL TMT on 24 September 2018.

From the perspective of a party providing an app via an app store, what regulations govern the use of location data by that mobile app?

The key consideration is data privacy and, therefore, the main regulation to consider is the General Data Protection Regulation (GDPR), which has applied since 25 May 2018. This will apply to the app provider if they process personal data collected from the device.

While there is as yet no specific guidance under the GDPR on the use of location data by apps, in 2011 the Article 29 Data Protection Working Party (now the European Data Protection Board (EDPB)) adopted Opinion 13/2011 on “Geolocation services on smart mobile devices” and, in 2013, Opinion 2/2013 on “Apps on smart devices”. Although these Opinions relate to the Data Protection Directive (95/46/EC), much of their content is still relevant under the GDPR.

In the UK, you should also take into account the Data Protection Act 2018 which supplements the GDPR in certain areas (such as in relation to special categories of personal data and data subject rights) although not specifically in relation to location data.

To what extent / in what circumstances will the Privacy and Electronic Communications Regulations 2003 regulate the use of location data by mobile app providers? What exemptions apply and does PECR 2003 apply to ‘information society services’?

Under regulation 6 of PECR (as amended by the 2011 Regulations), it is unlawful to store information, or gain access to information stored, in the terminal equipment of a subscriber or user unless the subscriber or user (a) is provided with clear and comprehensive information about the purposes of the storage of, or access to, that information; and (b) has given his or her consent. This applies irrespective of whether or not the location data is “personal data”.

Regulation 14 relates specifically to the processing of location data and provides that you can only process location data if you are a public communications provider, a provider of a “value-added service”, or a person acting on the authority of such a provider, and only if: (a) the data is anonymous; or (b) you have the user’s consent to use it for a value-added service, and the processing is necessary for that purpose. This does not apply to data collected independently of the network or service provider such as GPS-based location data or data collected by a local wifi network. However, the use of such data will still need to comply with the GDPR.

To what extent / in what circumstances will the GDPR regulate the use of location data collected from mobile apps by mobile app providers?

The GDPR will apply if the app provider collects the location data from the device and if it can be used to identify a person.

If the data is anonymised such that it cannot be linked to a person, then the GDPR will not apply. However, if the location data is processed with other data relating to a user, the device or the user’s behaviour, or is used in a manner that singles out individuals from others, then it will be “personal data” and fall within the scope of the GDPR even if traditional identifiers such as name and address are not known.

Opinion 13/2011 sets out the regulator’s view that a device is usually intimately linked to a specific individual and that location data will, therefore, be regarded as “personal data”. Indeed, the definition of “personal data” in the GDPR specifically includes location data as one of the elements by reference to which a person can be identified. The Opinion comments that the providers of geolocation-based services gain “an intimate overview of habits and patterns of the owner of such a device and build extensive profiles.”

Furthermore, in certain contexts, location data could be linked to special category personal data (sensitive personal data). For example, location data may reveal visits to hospitals or places of worship or presence at political demonstrations.

How is compliance with such laws commonly addressed by app providers?

To process the data derived from the device or the app, the app provider needs to have a legal basis.

Contract necessity may apply to some uses of the location data. For other uses, depending on the app, it may be problematic to rely on “legitimate interests” as a lawful basis for tracking individuals using location data, for example, to serve location specific ads. Therefore, in many cases the app provider will need to rely on the user’s “consent” for processing location data.

How should app providers respond to recent changes in the law (e.g., the introduction of GDPR) impacting their apps’ use of location data?

Where app providers rely on “consent” as the legal basis, they will need to ensure that this meets the stricter requirements for consent under GDPR. This can be challenging given the constraints of the mobile app environment.

Transparency is essential. The Article 29 Guidelines on transparency WP260 rev.01 indicate that, for apps, the Article 13 privacy information should be made available from the app store before download. Once the app is installed, the privacy information needs to be easily accessible from within the app. The recommendation is that it should never be more than “two taps away” (e.g. by including a “Privacy” option in the app menu). Use of layered notices and contextual real time notifications will be particularly helpful on a mobile device.
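By way of illustration, a minimal SwiftUI sketch of keeping the privacy notice no more than “two taps away” is set out below; the menu structure and the PrivacyNoticeView placeholder are illustrative assumptions rather than anything mandated by the guidance:

```swift
import SwiftUI

// Tap 1: open the app menu. Tap 2: open the privacy notice.
// PrivacyNoticeView is a hypothetical placeholder for the full Article 13
// notice (ideally layered, with headline points first and detail on demand).
struct AppMenuView: View {
    var body: some View {
        NavigationStack {
            List {
                NavigationLink("Settings") { Text("Settings") }
                NavigationLink("Privacy") { PrivacyNoticeView() }
            }
            .navigationTitle("Menu")
        }
    }
}

struct PrivacyNoticeView: View {
    var body: some View {
        ScrollView {
            Text("How we use your data…")
                .padding()
        }
        .navigationTitle("Privacy")
    }
}
```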

The device’s operating system (such as iOS) may require the user’s permission to use the location data, for example via a dialogue box asking if the user agrees to allow the app to access the user’s location, either while using the app or in the background. Clicking on the “allow” button enables location services on the device and may also help signify consent, provided that this has been sufficiently informed and is sufficiently granular.
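As a concrete sketch, the OS-level permission request on iOS looks something like the following, using Apple’s Core Location framework; the class name and the choice of “when in use” access are illustrative assumptions:

```swift
import CoreLocation

// Minimal sketch of requesting "while using the app" location permission.
// Assumes Info.plist contains an NSLocationWhenInUseUsageDescription entry
// explaining the purpose; iOS will not show the dialogue without it.
final class LocationPermissionManager: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    func requestPermission() {
        // Triggers the system dialogue described above. Note that the OS-level
        // "Allow" is not by itself GDPR consent: the request should be preceded
        // by the app's own informed, granular notice.
        manager.requestWhenInUseAuthorization()
    }

    func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
        switch manager.authorizationStatus {
        case .authorizedWhenInUse, .authorizedAlways:
            manager.startUpdatingLocation() // permission granted; begin updates
        default:
            break // declined or undecided; do not collect location data
        }
    }
}
```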

If the app integrates with a third-party provider to enable, for example, location-based advertising, the consent to use location data must be sufficiently explicit to cover data collection for advertising purposes by the third party, and must identify that third party. Data sharing arrangements may also be required between the app provider and the third party.

Where children (in the UK, those under 13) may be involved, the consent must be given or authorised by the holder of parental responsibility over the child.

Following GDPR, app providers should review their data security and retention policies for compliance with the Article 5 principles.

App providers should be mindful of the principles of privacy by design and by default; for example, location services should be switched off by default and their use should be customisable by the user.
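By way of illustration, here is a minimal sketch of the “off by default” approach, using a hypothetical locationTrackingEnabled settings key that only an explicit user action can switch on:

```swift
import Foundation

// Privacy by default: location tracking is off until the user opts in,
// and the user can switch it off again at any time.
enum PrivacySettings {
    private static let key = "locationTrackingEnabled" // hypothetical key

    static func registerDefaults() {
        // Registering false means a fresh install starts with tracking off.
        UserDefaults.standard.register(defaults: [key: false])
    }

    static var locationTrackingEnabled: Bool {
        get { UserDefaults.standard.bool(forKey: key) }
        set { UserDefaults.standard.set(newValue, forKey: key) }
    }
}
```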

Finally, using location data may involve “profiling” within the meaning of Article 4(4), which specifically refers to analysing location data. As such, consideration should be given to whether a data protection impact assessment (DPIA) is required under Article 35 or, if not required, whether one should be undertaken as good practice.

Are there any forthcoming or anticipated changes to the law which may impact on use of location data by mobile app providers?

The ePrivacy Directive on which PECR is based is currently under review to be updated and aligned with GDPR in the form of the ePrivacy Regulation.

This is not yet finalised and its implementation date is not certain, but may be in 2019 or 2020. However, GDPR-grade consent will still be required for the use of location data, subject to certain exceptions, including where it is strictly necessary for providing an information society service specifically requested by the individual. Assuming the ePrivacy Regulation takes effect after Brexit, it remains to be seen if and how it will be implemented in the UK, but implementation can be expected in the interests of securing UK “adequacy” status.

 

Nigel Miller leads Fox Williams’ technology and data protection group. Nigel is a Certified Information Privacy Professional/Europe (CIPP/E).

New ePrivacy Regulation – implications for Ad-tech

Josey Bright

On 10 January 2017, the European Commission published a proposal for a new ePrivacy Regulation (the “ePrivacy Regulation”) to update and replace the current ePrivacy Directive (the “Directive”).

The ePrivacy Regulation, which is part of the Commission’s Digital Single Market Strategy, is designed to align closely with the provisions of the General Data Protection Regulation (GDPR), which was adopted in May 2016. The Commission intends that the ePrivacy Regulation will come into force on the same date as the GDPR, 25 May 2018. However, as it is yet to be finalised and approved, this timetable may be overly ambitious. It is currently reported that the aim is to finalise the ePrivacy Regulation by the end of 2018.

As a Regulation, just like the GDPR, it will be directly applicable in all EU Member States without the need for implementing national laws.

The main aim of the ePrivacy Regulation is to increase privacy protection for users of electronic communications.

The key features of the proposed ePrivacy Regulation are:

  1. Broader scope

The new ePrivacy Regulation will apply to any company processing data in connection with communication services, including all providers of electronic communications services.

This includes “over-the-top” service providers such as text message, email and messaging app providers so services such as WhatsApp, Facebook Messenger and Skype will be within scope of the ePrivacy Regulation.

Like the GDPR, the ePrivacy Regulation will have an extended reach in that non-EU providers providing electronic services to users in the EU will also be within scope of the ePrivacy Regulation.

  2. Content and metadata included

All electronic communications data are covered by the ePrivacy Regulation. However, the ePrivacy Regulation distinguishes between content data (what is actually said in the communication) and metadata (data related to the communication, such as the time, location and duration of a call or website visit). Separate rules apply in respect of each type of data:

  • Content can only be used if the end user has consented to its use for a specified purpose and the processing is necessary for the provision of the service.
  • Metadata can only be used where it is necessary for the quality of the service, such as billing, payments, and detecting and/or stopping fraudulent or abusive use of the service.

In circumstances where all end users have consented to the use of content or metadata for a purpose which cannot be fulfilled if the information is anonymised, the data may be used provided that the service provider has consulted the competent EU Data Protection Authority (in the UK, the Information Commissioner’s Office (ICO)) before the processing is carried out.

The threshold for consent under the ePrivacy Regulation is defined by reference to the GDPR. This means consent must be “freely given, specific, informed and unambiguous”, given by “a statement or by a clear affirmative action”. Like the GDPR, end users must also be given the right to withdraw their consent at any time.

  3. Storage and erasure of data

The ePrivacy Regulation includes provisions requiring service providers to erase or anonymise all content after it is received by the end user.

All metadata must also be erased or anonymised once the permitted purpose has been fulfilled, except where such data is required for billing purposes.

  4. Cookie consent options

Like the Directive, the ePrivacy Regulation also provides that the consent of the end user is required for the use of cookies and similar technology. However, the current proposal is that consent can be built into the browser software set-up so that users can tailor their cookie consent choices at the point of installation, rather than by using cookie banners and pop ups.

In addition, analytics cookies which are not privacy-intrusive will not require consent (e.g. those which carry out web audience measurement, or remember shopping cart details or login information for the same session).

  5. Direct marketing rules

The ePrivacy Regulation distinguishes between business to consumer communications (B2C) and business to business communications (B2B).

As under the Directive, unsolicited commercial communications are not permitted. In B2C marketing, prior consent (opt-in) is required. Consent will not be required when marketing similar products or services to existing customers, but a right to object must be provided.

For B2B marketing, the ePrivacy Regulation allows Member States to determine how the legitimate interests of corporate end users are to be sufficiently protected from unsolicited communications.

  6. Enforcement and higher fines in line with GDPR

The Information Commissioner’s Office (ICO) will be responsible for enforcement of both the ePrivacy Regulation and the GDPR in the UK.

Currently, the ICO can only fine companies up to £500,000 for breaches of PECR (the national legislation which implements the Directive). The ePrivacy Regulation introduces fines in line with the GDPR (i.e. up to EUR 20,000,000 or 4% of total worldwide annual turnover, whichever is higher).

In addition, the ePrivacy Regulation confers on users of electronic communications services a right to seek compensation directly from service providers if they have “suffered material or non-material damage as a result of an infringement”.

Implications

The ePrivacy Regulation is critically important for many ad-tech businesses: the need to obtain specific opt-in consent could be highly problematic for intermediaries who do not have a direct relationship with end users, and soliciting that consent via publishers, while legally possible, may be impracticable.

All this is not helped by the fact that there is uncertainty around the final form of the ePrivacy Regulation; for example, as to whether valid consent can be managed within the browser.

As if compliance with the GDPR did not present enough challenges, the ad-tech industry, as well as individual businesses, needs to move quickly to prepare for these forthcoming changes in ePrivacy.

 

Josey Bright is an associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at jbright@foxwilliams.com

Dynamic IP address can be personal data

Nigel Miller

Whether or not an IP address is “personal data” can be a crucial question because the answer determines whether or not the data is subject to the rigours of the EU Data Protection Directive (in the UK, the Data Protection Act).

An IP address is a number used to identify a device on a network. An IP address can be “dynamic” or “static”. A static IP address remains constant and does not change every time the device connects to the Internet. In contrast, the more usual dynamic IP address changes each time a new connection is made.

It has long been accepted that static IP addresses are personal data because they enable a link to be made with a particular device, which can then be used for profiling. IP addresses enable an individual to be “singled out” (even if that individual’s real-world identity remains unknown).

In its early Opinion 4/2007, the Article 29 Working Party accepted that an IP address, for example one for a computer in an Internet café used by many people, may not identify any particular individual. In other cases, however, the IP address can be associated with a particular user, for example where there is a log of who used the computer at the relevant time. The Working Party therefore concluded that all IP information should be treated as personal data, “to be on the safe side”.

The question of whether a dynamic IP address can be “personal data” was less certain.

Patrick Breyer v Bundesrepublik Deutschland

The Court of Justice of the European Union (CJEU) has now ruled that dynamic IP addresses held by a website operator are personal data where the operator has “the legal means which enable it to identify the data subject with additional data which the internet service provider has about that person”.

While a dynamic IP address alone may not directly identify an individual, when combined with other information a dynamic IP address could be used to identify the individual user.

The question before the Court was whether a dynamic IP address can be personal data if the relevant additional information is in the hands of a third party (an internet service provider).

The case was brought by a politician, Mr Patrick Breyer, against the Federal Republic of Germany, seeking to prevent it from storing, or arranging for third parties to store, his IP address generated when he consulted publicly accessible websites of German Federal institutions. Mr Breyer claimed that IP addresses qualify as personal data under data protection laws, and therefore that consent was needed for processing such data.

If a user of a website reveals his identity on the website, for example by completing a form, then the IP address is certainly personal data because the operator of that website is able to identify the user by linking his name to his computer’s IP address.

However, if the user does not reveal his identity, the IP address alone does not enable the user to be directly identified. The website operator can identify the user only if the information relating to his identity is communicated to them by his ISP.

The court decided that the fact that the additional data necessary to identify the user are held, not by the website operator, but by the user’s ISP does not exclude dynamic IP addresses from being personal data. The question is whether the website operator has a legal means to obtain the additional data from the ISP. In this case it was decided that the Federal Republic of Germany did have a legal means to obtain the necessary additional information from the ISP, and therefore the raw dynamic IP address data should be regarded as personal data. For information to be treated as “personal data”, it is not necessary that all the information enabling the identification of the data subject be in the hands of one person.

Comment

The Court has decided that a dynamic IP address could – but will not always necessarily – constitute personal data. In light of this decision, businesses that have not up to now been treating dynamic IP addresses as personal data need to re-assess that position and may need to alter data compliance practices. This may for example impact businesses engaged in online analytics and targeted advertising.

It may be that the case highlights a possible difference between the UK Data Protection Act and the implementation of the Directive in other EU countries. In the UK, data is personal data if an individual can be identified from those data and from “other information which is in the possession of, or is likely to come into the possession of, the data controller”. Is data “likely” to come into the possession of a data controller where the only way for him to obtain it is to ask for it?

All this will soon become academic as, looking ahead to May 2018, the General Data Protection Regulation (GDPR) specifically includes online identifiers, such as IP addresses, in its definition of “personal data”. It’s not that the position is now beyond doubt, it’s just that the nature of the question is changing …

 

Nigel Miller is a partner in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at nmiller@foxwilliams.com