New ePrivacy Regulation – implications for Ad-tech

Josey Bright

On 10 January this year, the European Commission published a proposal for a new ePrivacy Regulation (the “ePrivacy Regulation”) to update and replace the current ePrivacy Directive (the “Directive”).

The ePrivacy Regulation, which is part of the Commission’s Digital Single Market Strategy, is designed to align closely with the provisions of the General Data Protection Regulation (GDPR), which was adopted in May 2016. The Commission intends that the ePrivacy Regulation will come into force on the same date as the GDPR, 25 May 2018. However, as it is yet to be finalised and approved, this timetable may be overly ambitious. It is currently reported that the aim is to finalise the ePrivacy Regulation by the end of 2018.

As it is a Regulation, just like the GDPR, it will be directly applicable in all EU Member States without the need for implementing national laws.

The main aim of the ePrivacy Regulation is to increase privacy protection for users of electronic communications.

The key features of the proposed ePrivacy Regulation are:

  1. Broader scope

The new ePrivacy Regulation will apply to any company processing data in connection with communication services, including all providers of electronic communications services.

This includes “over-the-top” service providers such as text message, email and messaging app providers so services such as WhatsApp, Facebook Messenger and Skype will be within scope of the ePrivacy Regulation.

Like the GDPR, the ePrivacy Regulation will have an extended reach in that non-EU providers providing electronic services to users in the EU will also be within scope of the ePrivacy Regulation.

  2. Content and metadata included

All electronic communications data are covered by the ePrivacy Regulation. However, the ePrivacy Regulation distinguishes between content data (what is actually said in the communication) and metadata (data related to the communication, such as the time, location and duration of a call or website visit). Separate rules apply in respect of each type of data:

  • Content can only be used if the end user has consented to its use for a specified purpose and the processing is necessary for the provision of the service.
  • Metadata can only be used where it is necessary for the quality of the service, such as billing, payments, and detecting and/or stopping fraudulent or abusive use of the service.

In circumstances where all end users have consented to the use of content or metadata for a purpose which cannot be fulfilled if the information is anonymised, the data may be used provided that the service provider has consulted the competent EU Data Protection Authority (in the UK, the Information Commissioner’s Office (ICO)) before the processing is carried out.

The threshold for consent under the ePrivacy Regulation is defined by reference to the GDPR. This means consent must be “freely given, specific, informed and unambiguous”, given by “a statement or by a clear affirmative action”. As under the GDPR, end users must also be given the right to withdraw their consent at any time.

  3. Storage and erasure of data required

The ePrivacy Regulation includes provisions requiring service providers to erase or anonymise all content after it is received by the end user.

All metadata must also be erased or anonymised once the permitted purpose has been fulfilled, except where such data is required for billing purposes.

  4. Cookie consent options

Like the Directive, the ePrivacy Regulation also provides that the consent of the end user is required for the use of cookies and similar technology. However, the current proposal is that consent can be built into the browser software set-up so that users can tailor their cookie consent choices at the point of installation, rather than by using cookie banners and pop ups.

In addition, analytics cookies which are not privacy-intrusive will not require consent (i.e. those used for web audience measurement, or to remember shopping cart details or login information for the same session).

  5. Direct marketing rules

The ePrivacy Regulation distinguishes between business to consumer communications (B2C) and business to business communications (B2B).

As under the Directive, unsolicited commercial communications are not permitted. In B2C marketing, prior consent (opt-in) is required. Consent will not be required when marketing similar products or services, but a right to object must be provided.

For B2B marketing, the ePrivacy Regulation allows for Member States to determine that the legitimate interests of corporate end users are sufficiently protected from unsolicited communication.

  6. Enforcement and higher fines in line with GDPR

The Information Commissioner’s Office (ICO) will be responsible for enforcement of the ePrivacy Regulation and the GDPR in the UK.

Currently, the ICO can only fine companies up to £500,000 for breaches of the PECR (the national legislation which implements the Directive). The ePrivacy Regulation introduces fines which are in line with the GDPR (i.e. up to €20,000,000 or 4% of total worldwide annual turnover, whichever is higher).
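The “whichever is higher” cap works as a simple maximum of the two limbs. The short sketch below illustrates the arithmetic only; the function name and the turnover figures are hypothetical:

```python
def max_gdpr_fine(worldwide_annual_turnover_eur: float) -> float:
    """Upper limit on fines under the GDPR-aligned regime:
    the greater of EUR 20,000,000 or 4% of total worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

# For a business turning over EUR 100m, the EUR 20m limb is higher:
print(max_gdpr_fine(100_000_000))    # 20000000.0
# For a business turning over EUR 1bn, the 4% limb is higher:
print(max_gdpr_fine(1_000_000_000))  # 40000000.0
```

In practice, of course, the regulator sets the actual fine anywhere up to this ceiling depending on the gravity of the breach.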

In addition, the ePrivacy Regulation confers on users of electronic communications services a right to seek compensation directly from service providers if they have “suffered material or non-material damage as a result of an infringement”.

Implications

The ePrivacy Regulation is critically important for many ad-tech businesses. The need to obtain specific opt-in consent could be highly problematic for intermediaries which do not have a direct relationship with end users, and soliciting that consent via publishers, while legally possible, may be impracticable.

All this is not helped by the fact that there is uncertainty around the final form of the ePrivacy Regulation; for example, as to whether valid consent can be managed within the browser.

As if compliance with the GDPR did not present enough challenges, the ad-tech industry, as well as individual businesses, needs to move quickly to prepare for these forthcoming changes in ePrivacy.

 

Josey Bright is an associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at jbright@foxwilliams.com

Cyber Attacks: Why Cyber Security is more important now than ever

Amanda Leiu

Cyber security continues to be headline-grabbing news, particularly in light of the global “ransomware” cyber attack which recently hit the NHS, Telefónica and FedEx. The ransomware reportedly encrypted data on over 300,000 computers in some 150 countries, with hackers threatening to delete data unless a ransom was paid. This latest attack is reported to be the biggest online extortion scheme ever.

The Information Commissioner’s Office (ICO) issued a statement in response to the latest cyber attack to reiterate that “all organisations are required under the Data Protection Act to keep people’s personal data safe and secure”.

Whilst concerns about cyber-related risks and data security are not new, the issue is becoming ever more pressing for businesses, not least because of the introduction of the General Data Protection Regulation (GDPR) in May 2018.

The cyber threat

The recent global ransomware attack which hit 47 NHS trusts is not an isolated case. The UK government’s 2017 Cyber Security Breaches Survey found that:

  • over two thirds of large firms and SMEs detected a cyber security breach or attack in the last 12 months;
  • in the last year, the average business identified 998 breaches; and
  • for a large firm, the average cost to the business as a result of a breach is £19,600.[1]

These statistics highlight the fact that cyber attacks are a growing area of risk for businesses. Generally, more businesses are migrating into digital form and on globally interconnected technology platforms. As this trend continues, businesses’ exposure to a cyber attack inevitably increases.

The threat is no longer limited to large organisations. Smaller organisations have not historically been the target of cybercrime but this position has changed in recent years. SMEs are now being targeted by cyber criminals and with increasing frequency.

The consequences  

The consequences of a cyber attack can be multiple and far-reaching: disrupted business systems, regulatory fines, compensation claims, reputational damage and loss of consumer trust.

The legal implications in relation to cyber and data security arise primarily from the Data Protection Act 1998 (DPA). The DPA requires organisations to take appropriate technical and organisational security measures to prevent unauthorised or unlawful processing, or accidental loss or destruction of, or damage to, personal data. Under the DPA, the ICO can impose fines of up to £500,000 for breach of this obligation. This is set to dramatically escalate under the GDPR to an upper limit of €20 million or 4% of annual global turnover – whichever is greater.

If appropriate measures have not been taken to keep people’s personal data secure and a cyber security breach occurs, organisations risk leaving themselves open to a fine or other enforcement action. This was the case with TalkTalk, as discussed in our earlier article “The Only Way is Up – Fining Powers on the Increase for Data Protection Breaches” (21 March 2017). The ICO issued more than £1,000,000 in fines last year for breaches of the DPA. Moreover, data subjects may seek compensation from organisations for such breaches.

The challenge of complying with data protection laws is set to increase and become more onerous under the GDPR, which will supersede the DPA and introduces new and extended obligations for organisations.

Businesses will be legally required to report data breaches that pose a risk to individuals to the ICO within 72 hours and, in some cases, to the individuals affected. Data processors will also have direct obligations in relation to data security for the first time. Another key change is around accountability – the GDPR places an onus on companies to demonstrate compliance with the data protection principles and put in place comprehensive governance measures.

Mitigating the risks – what should you be doing?

In light of the risks highlighted, it is more essential than ever that organisations protect themselves (and therefore, by extension their consumers), from increasingly sophisticated cyber attacks.

To minimise the risk of a cyber attack and ensure regulatory compliance with the current DPA and the incoming GDPR, businesses should be looking to take the following steps:

  • generate awareness within your organisation;
  • set up a project team with full board engagement;
  • carry out a data inventory and mapping exercise to understand what data you have, what you use it for, where it is held and what third parties are involved in processing data;
  • carry out a gap analysis to work out what compliance steps are needed;
  • review all relevant policies, procedures and contracts;
  • undertake a data privacy impact assessment, if needed;
  • prioritise and scope out a cyber security incident response plan;
  • implement and rehearse the cyber security incident response plan; and
  • train staff, monitor processes, audit and adjust.

[1]https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/609186/Cyber_Security_Breaches_Survey_2017_main_report_PUBLIC.pdf (pg. 8)

 

Amanda Leiu is a trainee solicitor in the Commerce & Technology team at Fox Williams LLP.

The Only Way Is Up – Fining Powers on the Increase for Data Protection Breaches

Julianna Tolan

Last year saw the Information Commissioner’s Office impose record fines for data protection breaches, totalling £2,155,500.

TalkTalk was on the receiving end of the greatest financial penalty in ICO history for a highly publicised cyber attack that compromised the personal details of more than 150,000 of its customers. The regulator considered these security failings sufficiently grave to issue the telecoms company with a £400,000 fine, close to its maximum fining power of £500,000.

Other recipients of financial penalties from the ICO in 2016 included EE Limited, Hampshire County Council and David Lammy MP. In the latter case, Mr Lammy was accused of instigating 35,629 calls over two days, playing a recorded message that urged people to back his campaign to be named the Labour party candidate for London Mayor. This conduct resulted in a £5,000 fine for nuisance calls.

Of course, the ICO has a host of other enforcement tools at its disposal, such as issuing undertakings, serving enforcement notices and in the most serious cases, commencing a criminal prosecution against individuals or companies who contravene the Data Protection Act.

But for bottom-line conscious business, monetary penalties have historically been an effective means of compelling compliance with good business practice.

That ought to be the case now more than ever, as the EU General Data Protection Regulation (GDPR) comes into force on 25 May 2018 and will radically increase the maximum fines that can be imposed on UK businesses from £500,000 to an upper limit of €20 million or 4% of annual global turnover – whichever is higher.

These unprecedented fining powers mean that for many companies, a serious data protection breach could conceivably result in insolvency or even closure of the business.

Given the profound detriment that data losses have been shown to cause to consumers over the past 12 months, it is perhaps timely that the ICO is finally catching up with other UK regulators. Enforcement authorities in the fields of health and safety, competition and environmental protection have long possessed the power to impose exorbitant fines capable of closing errant businesses down.

With the GDPR on the horizon, businesses should now seize the opportunity to monitor and review their compliance with data protection laws, including the effectiveness of internal policies and procedures. After all, the consequences of failing to do so could be costly.

Julianna Tolan is an Employed Barrister in the Dispute Resolution team at Fox Williams LLP acting for commercial and financial services clients in respect of contentious and non-contentious regulatory issues. Julianna can be contacted at jtolan@foxwilliams.com

Court of Appeal rules on subject access request in favour of data subjects

Laura Monro

Back in November 2015 we reported that the High Court decision in Dawson-Damer v Taylor Wessing brought cautious optimism for data controllers when the judge refused to make an order for compliance with three subject access requests (see https://idatalaw.com/2015/11/24/high-court-decision-brings-cautious-optimism-for-data-controllers/). However, the Court of Appeal has taken a different approach, overturning the High Court decision and ordering compliance by Taylor Wessing, the data controller, with the subject access requests.

In its decision the Court of Appeal focused on the following three key issues:

The extent of the legal professional privilege exception

One of the family members was involved in litigation in the Bahamas with Taylor Wessing’s client which was the Bahamian trustee of the family’s trust fund. Taylor Wessing did not comply with the subject access requests, claiming to be entitled to the exemption for legal professional privilege. The High Court decided that all documents in respect of which the trustee would be entitled to resist disclosure under the ongoing litigation in the Bahamas would be protected by the legal professional privilege exception under English law.

However, the Court of Appeal took a narrower view, finding that the legal professional privilege exception:

  1. applies only to documents which are protected by legal professional privilege under English law, and does not extend to systems of law outside the UK; and
  2. does not extend to documents which are the subject of non-disclosure rules, in this case the applicable rules being the trustee’s right of non-disclosure.

Whether any further search would involve “disproportionate effort”

The Data Protection Act provides that a data controller must supply the data subject with a copy of the information requested under a subject access request unless the supply of such information “is not possible or would involve disproportionate effort”.

Although the High Court concluded that it was not reasonable or proportionate for Taylor Wessing to carry out searches to determine if any particular document was covered by privilege, the Court of Appeal disagreed.

 The Court of Appeal stated that Taylor Wessing must produce evidence to show what it has done to identify the material and to work out a plan of action. It found that further compliance with the subject access requests would not involve disproportionate effort by Taylor Wessing, and that disproportionate effort must involve more than an assertion that it is too difficult to search through voluminous papers.

Whether the judge would have been entitled to refuse to exercise his discretion in favour of the data subjects because their motive was to use the information in legal proceedings against the trustees

The Court of Appeal held that the High Court judge was wrong not to enforce the subject access requests despite the motive of the data subjects.

Neither the Data Protection Act nor the ICO’s subject access code of practice provides that data subjects have to inform the data controller of their reason for making the subject access request, or what they intend to do with the information requested. There is no “no other purpose” rule which would allow a data controller to refuse to respond to a subject access request if the data subject proposes to use the information obtained for a purpose other than verifying or correcting the personal data held about them.

It follows that the intention of the data subject to use the personal data for the purpose of litigation proceedings cannot be used by a data controller to avoid complying with a subject access request.

The decision of the Court of Appeal finds in favour of the data subjects and serves as a warning to data controllers that significant effort may be needed in responding to subject access requests. Data controllers should also bear in mind that following the implementation of the GDPR in May 2018 there will be less time to comply with subject access requests – the GDPR requires that information must be provided without delay and at the latest within one month of receipt rather than the current 40 days. It is prudent for data controllers to be reviewing their policies and procedures now to ensure that they will be able to comply with the GDPR once it comes into force.

Laura Monro is an associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at lmonro@foxwilliams.com

Dynamic IP address can be personal data

Nigel Miller

Whether or not an IP address is “personal data” can be a crucial question because the answer determines whether or not the data is subject to the rigours of the EU Data Protection Directive (in the UK, the Data Protection Act).

An IP address is a number used to identify a device on a network. An IP address can be “dynamic” or “static”. A static IP address remains constant and does not change every time the device connects to the Internet. In contrast, the more usual dynamic IP address changes each time a new connection is made.

It has long been agreed that static IP addresses are personal data because they enable a link to be made with a particular device for profiling. IP addresses enable an individual to be “singled out” (even if that individual’s real-world identity remains unknown).

In its early opinion 4/2007, the Article 29 Working Party accepted that an IP address, for example, for a computer in an Internet café used by many people may not identify any particular individual. In other cases, however, the IP address can be associated with a particular user if for example there is a log of who used the computer at the relevant time. The Working Party therefore concluded that all IP information should be treated as personal data, “to be on the safe side”.

The question of whether a dynamic IP address can be “personal data” was less certain.

Patrick Breyer v Bundesrepublik Deutschland

The Court of Justice of the European Union (CJEU) has now ruled that dynamic IP addresses held by a website operator are personal data where the operator has “the legal means which enable it to identify the data subject with additional data which the internet service provider has about that person”.

While a dynamic IP address alone may not directly identify an individual, when combined with other information a dynamic IP address could be used to identify the individual user.

The question before the Court was whether a dynamic IP address can be personal data if the relevant additional information is in the hands of a third party (an internet service provider).

The case was brought by a politician, Mr Patrick Breyer, against the Federal Republic of Germany, seeking to prevent it from storing, or arranging for third parties to store, his IP address when he consulted publicly accessible websites of German Federal institutions. Mr Breyer claimed that IP addresses qualify as personal data under data protection laws, and that consent was therefore needed to process such data.

If a user of a website reveals his identity on the website, for example by completing a form, then the IP address is certainly personal data because the operator of that website is able to identify the user by linking his name to his computer’s IP address.

However, if the user does not reveal his identity, the IP address alone does not enable the user to be directly identified. The website operator can identify the user only if the information relating to his identity is communicated to them by his ISP.

The court decided that the fact that the additional data necessary to identify the user are held, not by the website operator, but by the user’s ISP does not exclude dynamic IP addresses from being personal data. The question is whether the website operator has a legal way to obtain the additional data from the ISP. In that case it was decided that the Federal Republic of Germany did have a legal means to obtain the necessary additional information from the ISP and therefore the raw dynamic IP address data should be regarded as personal data.  For information to be treated as “personal data”, it is not necessary that all the information enabling the identification of the data subject must be in the hands of one person.
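The Court’s reasoning – that identifiability can turn on data held by a third party – can be sketched in a few lines. This is a purely hypothetical illustration: the log entries, ISP allocation records and subscriber names below are invented, and the IP addresses are from the documentation range.

```python
from datetime import datetime
from typing import Optional

# A website operator's access log: a dynamic IP address alone names no one.
web_log = [
    {"ip": "203.0.113.7", "time": datetime(2016, 10, 19, 10, 15), "page": "/home"},
]

# Records held by a third party (the ISP): which subscriber was allocated
# which dynamic IP address at a given time.
isp_allocations = {
    ("203.0.113.7", datetime(2016, 10, 19, 10, 15)): "subscriber-42",
}

def identify_visitor(entry: dict) -> Optional[str]:
    """If the operator has a legal means to obtain the ISP's allocation
    records, combining the two datasets can single out an individual."""
    return isp_allocations.get((entry["ip"], entry["time"]))

for entry in web_log:
    print(entry["ip"], "->", identify_visitor(entry))  # 203.0.113.7 -> subscriber-42
```

The point of the sketch is that neither dataset identifies anyone on its own; it is the operator’s legal route to the ISP’s records that makes the dynamic IP address “personal data” in its hands.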

Comment

The Court has decided that a dynamic IP address could – but will not always necessarily – constitute personal data. In light of this decision, businesses that have not up to now been treating dynamic IP addresses as personal data need to re-assess that position and may need to alter data compliance practices. This may for example impact businesses engaged in online analytics and targeted advertising.

It may be that the case highlights a possible difference between the UK Data Protection Act and the implementation of the Directive in other EU countries. In the UK, data is personal data if an individual can be identified from those data and from “other information which is in the possession of, or is likely to come into the possession of, the data controller”. Is data “likely” to come into the possession of a data controller where the only way for him to obtain it is to ask for it?

All this will soon become academic as, looking ahead to May 2018, the General Data Protection Regulation (GDPR) specifically includes online identifiers, such as IP addresses, in its definition of “personal data”. It’s not that the position is now beyond doubt, it’s just that the nature of the question is changing …

 

Nigel Miller is a partner in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at nmiller@foxwilliams.com