New ePrivacy Regulation – implications for Ad-tech

Josey Bright

On 10 January this year, the European Commission published a proposal for a new ePrivacy Regulation (the “ePrivacy Regulation”) to update and replace the current ePrivacy Directive (the “Directive”).

The ePrivacy Regulation, which is part of the Commission’s Digital Single Market Strategy, is designed to align closely with the provisions of the General Data Protection Regulation (GDPR), which was adopted in May 2016. The Commission intends the ePrivacy Regulation to come into force on the same date as the GDPR, 25 May 2018. However, as it is yet to be finalised and approved, this timetable may be overly ambitious. It is currently reported that the aim is to finalise the ePrivacy Regulation by the end of 2018.

As it is a Regulation, just like the GDPR, it will be directly applicable in all EU Member States without the need for implementing national laws.

The main aim of the ePrivacy Regulation is to increase privacy protection for users of electronic communications.

The key features of the proposed ePrivacy Regulation are:

  1. Broader scope

The new ePrivacy Regulation will apply to any company processing data in connection with communication services, including all providers of electronic communications services.

This includes “over-the-top” service providers such as text message, email and messaging app providers so services such as WhatsApp, Facebook Messenger and Skype will be within scope of the ePrivacy Regulation.

Like the GDPR, the ePrivacy Regulation will have an extended reach in that non-EU providers providing electronic services to users in the EU will also be within scope of the ePrivacy Regulation.

  2. Content and metadata included

All electronic communications data are covered by the ePrivacy Regulation. However, the ePrivacy Regulation distinguishes between content data (what is actually said in the communication) and metadata (data related to the communication, such as the time, location and duration of a call or website visit). Separate rules apply to each type of data:

  • Content can only be used if the end user has consented to its use for a specified purpose and the processing is necessary for the provision of the service.
  • Metadata can only be used where it is necessary for the quality of the service, such as billing, payments, and detecting and/or stopping fraudulent or abusive use of the service.

In circumstances where all end users have consented to the use of content or metadata for a purpose which cannot be fulfilled if the information is anonymised, the data may be used provided that the service provider has consulted the competent EU Data Protection Authority (in the UK, the Information Commissioner’s Office (ICO)) before the processing is carried out.

The threshold for consent under the ePrivacy Regulation is defined by reference to the GDPR. This means consent must be “freely given, specific, informed and unambiguous” given by “a statement or by a clear affirmative action”. Like the GDPR, end users must also be given the right to withdraw their consent at any time.

  3. Storage and erasure of data required

The ePrivacy Regulation includes provisions requiring service providers to erase or anonymise all content after it is received by the end user.

All metadata must also be erased or anonymised once the permitted purpose has been fulfilled, except where such data is required for billing purposes.

  4. Cookie consent options

Like the Directive, the ePrivacy Regulation also provides that the consent of the end user is required for the use of cookies and similar technology. However, the current proposal is that consent can be built into the browser software set-up so that users can tailor their cookie consent choices at the point of installation, rather than by using cookie banners and pop ups.

In addition, analytics cookies which are not privacy-intrusive will not require consent (e.g. those used for web audience measurement, or to remember shopping cart details or login information for the same session).

  5. Direct marketing rules

The ePrivacy Regulation distinguishes between business-to-consumer (B2C) and business-to-business (B2B) communications.

As under the Directive, unsolicited commercial communications are not permitted. In B2C marketing, prior consent (opt-in) is required. Consent will not be required where similar products or services are being marketed, but a right to object must be provided.

For B2B marketing, the ePrivacy Regulation leaves it to Member States to ensure that the legitimate interests of corporate end users are sufficiently protected from unsolicited communications.

  6. Enforcement and higher fines in line with GDPR

The Information Commissioner’s Office (ICO) will be responsible for enforcement of both the ePrivacy Regulation and the GDPR in the UK.

Currently, the ICO can only fine companies up to £500,000 for breaches of PECR (the national legislation which implements the Directive). The ePrivacy Regulation introduces fines in line with the GDPR (i.e. up to 20,000,000 EUR or 4% of total worldwide annual turnover, whichever is higher).

In addition, the ePrivacy Regulation confers on users of electronic communications services a right to seek compensation directly from service providers if they have “suffered material or non-material damage as a result of an infringement”.

Implications

The ePrivacy Regulation is critically important for many ad-tech businesses. The need to obtain specific opt-in consent could be highly problematic for intermediaries which do not have a direct relationship with end users, and soliciting that consent via publishers, while legally possible, may be impracticable.

All this is not helped by the fact that there is uncertainty around the final form of the ePrivacy Regulation; for example, as to whether valid consent can be managed within the browser.

As if compliance with the GDPR did not present enough challenges, the ad-tech industry, as well as individual businesses, needs to move quickly to prepare for these forthcoming changes in ePrivacy.

 

Josey Bright is an associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at jbright@foxwilliams.com


Dynamic IP address can be personal data

Nigel Miller

Whether or not an IP address is “personal data” can be a crucial question because the answer determines whether or not the data is subject to the rigours of the EU Data Protection Directive (in the UK, the Data Protection Act).

An IP address is a number used to identify a device on a network. An IP address can be “dynamic” or “static”. A static IP address remains constant and does not change every time the device connects to the Internet. In contrast, the more usual dynamic IP address changes each time a new connection is made.
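The point that an IP address is ultimately just a number can be seen with Python’s standard `ipaddress` module (a minimal illustration; the address used is a private-range example, not any real user’s):

```python
import ipaddress

# An IPv4 address is a 32-bit number, conventionally written in
# dotted-decimal notation.
addr = ipaddress.ip_address("192.168.0.1")

print(int(addr))        # the underlying integer: 3232235521
print(addr.is_private)  # True: 192.168.0.0/16 is a private range
```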

It has long been agreed that static IP addresses are personal data because they enable a link to be made with a particular device for profiling. IP addresses enable an individual to be “singled out” (even if that individual’s real-world identity remains unknown).

In its early opinion 4/2007, the Article 29 Working Party accepted that an IP address, for example, for a computer in an Internet café used by many people may not identify any particular individual. In other cases, however, the IP address can be associated with a particular user if for example there is a log of who used the computer at the relevant time. The Working Party therefore concluded that all IP information should be treated as personal data, “to be on the safe side”.

The question of whether a dynamic IP address can be “personal data” was less certain.

Patrick Breyer v Bundesrepublik Deutschland

The Court of Justice of the European Union (CJEU) has now ruled that dynamic IP addresses held by a website operator are personal data where the operator has “the legal means which enable it to identify the data subject with additional data which the internet service provider has about that person”.

While a dynamic IP address alone may not directly identify an individual, when combined with other information a dynamic IP address could be used to identify the individual user.

The question before the Court was whether a dynamic IP address can be personal data if the relevant additional information is in the hands of a third party (an internet service provider).

The case was brought by a politician, Mr Patrick Breyer, against the Federal Republic of Germany, seeking to prevent it from storing, or arranging for third parties to store, his IP address generated when he consulted publicly accessible websites of German Federal institutions. Mr Breyer claimed that IP addresses qualify as personal data under data protection laws, and that consent was therefore needed to process such data.

If a user of a website reveals his identity on the website, for example by completing a form, then the IP address is certainly personal data because the operator of that website is able to identify the user by linking his name to his computer’s IP address.

However, if the user does not reveal his identity, the IP address alone does not enable the user to be directly identified. The website operator can identify the user only if the information relating to his identity is communicated to them by his ISP.

The Court decided that the fact that the additional data necessary to identify the user are held not by the website operator but by the user’s ISP does not exclude dynamic IP addresses from being personal data. The question is whether the website operator has a legal means to obtain the additional data from the ISP. Here, the Court found that the Federal Republic of Germany did have such a legal means, and therefore the raw dynamic IP address data should be regarded as personal data. For information to be treated as “personal data”, it is not necessary that all the information enabling the identification of the data subject be in the hands of one person.
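The Court’s point that identification can depend on datasets held by two different parties can be sketched in a few lines of Python (all addresses, timestamps and subscriber details are invented for illustration):

```python
# Held by the website operator: a log of visits by dynamic IP address.
website_log = [
    {"ip": "203.0.113.7", "time": "2016-10-19T10:15", "page": "/contact"},
]

# Held separately by the ISP: which subscriber used which address, when.
isp_records = {
    ("203.0.113.7", "2016-10-19T10:15"): "subscriber #4711",
}

# Neither dataset alone identifies the visitor; combined, they do.
for entry in website_log:
    subscriber = isp_records.get((entry["ip"], entry["time"]))
    if subscriber is not None:
        print(f"{entry['page']} was visited by {subscriber}")
```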

Comment

The Court has decided that a dynamic IP address could – but will not always necessarily – constitute personal data. In light of this decision, businesses that have not up to now been treating dynamic IP addresses as personal data need to re-assess that position and may need to alter data compliance practices. This may for example impact businesses engaged in online analytics and targeted advertising.

It may be that the case highlights a possible difference between the UK Data Protection Act and the implementation of the Directive in other EU countries. In the UK, data is personal data if an individual can be identified from those data and from “other information which is in the possession of, or is likely to come into the possession of, the data controller”. Is data “likely” to come into the possession of a data controller where the only way for him to obtain it is to ask for it?

All this will soon become academic as, looking ahead to May 2018, the General Data Protection Regulation (GDPR) specifically includes online identifiers, such as IP addresses, in its definition of “personal data”. It’s not that the position is now beyond doubt, it’s just that the nature of the question is changing …

 

Nigel Miller is a partner in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at nmiller@foxwilliams.com

ICO: “Cyber security is not an IT issue, it is a boardroom issue”

Josey Bright

On 5 October 2016, Talk Talk was issued with a £400,000 fine – the highest fine yet from the Information Commissioner’s Office (“ICO”) – for breach of its security obligations under the Data Protection Act 1998 (“DPA”).

Between 15 and 21 October 2015 a hacker took advantage of technical weaknesses in Talk Talk’s systems and succeeded in accessing the personal data of 156,959 customers. In 15,656 cases, the attacker also had access to bank details and sort codes.

The Information Commissioner, Elizabeth Denham, said that the “fine acts as a warning that cyber security is not an IT issue, it is a boardroom issue. Companies must be diligent and vigilant. They must do this because they have a duty under law, but they must also do this because they have a duty to their customers.”

In addition to the fine, the costs resulting from Talk Talk’s data security breach amounted to £60 million.

Data Security Principle under the DPA

The seventh data protection principle in the DPA requires that personal information must be kept secure. It says that: “appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data.”

The DPA is not prescriptive about what measures must be taken and there is no “one size fits all” solution to information security. The security measures that are appropriate for an organisation will depend on its circumstances, and businesses should adopt a risk-based approach to deciding what level of security they need.

Preventative measures – lessons learnt from the ICO’s Talk Talk investigation

The ICO found that the inadequacies in Talk Talk’s security measures were the result of “serious oversight” rather than a deliberate intent to ignore or bypass the provisions of the DPA. The cyber-attack could have been prevented if the company had taken basic technical and security measures. In particular, the ICO identified the following issues:

  • Legacy Pages: the data was part of an underlying customer database that Talk Talk inherited when it acquired Tiscali in 2009. These pages were vulnerable and Talk Talk had failed to identify and remove them or make them secure.
  • Outdated Software: Talk Talk was not aware the database software was outdated. It did not know that the software had a bug or that a remedy for the bug had been publicised in 2012 and was easily available.
  • Defences: The hacker used a common technique called SQL injection to which defences exist. Talk Talk ought to have known that there was a risk to the data from this technique and ought to have implemented sufficient defences.
  • Lack of Monitoring: Talk Talk did not proactively monitor its systems to discover vulnerabilities.
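The standard defence against SQL injection is to use parameterised queries rather than building SQL by string concatenation. A minimal sketch using Python’s built-in sqlite3 module (the table and data are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'a@example.com')")

user_input = "1 OR 1=1"  # a classic injection payload

# Vulnerable: concatenating input lets the payload rewrite the query,
# returning every row in the table.
vulnerable = conn.execute(
    "SELECT email FROM customers WHERE id = " + user_input
).fetchall()

# Safe: a parameterised query treats the input as a single literal value,
# so the payload matches nothing.
safe = conn.execute(
    "SELECT email FROM customers WHERE id = ?", (user_input,)
).fetchall()

print(vulnerable)  # [('a@example.com',)]
print(safe)        # []
```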

The investigation found Talk Talk was unaware of two previous SQL injection attacks, on 17 July 2015 and between 2 and 3 September 2015, and consequently Talk Talk’s contravention of the seventh data protection principle was ongoing until it took remedial action on 21 October 2015.

The ICO considered the breach serious due to the number of data subjects, the nature of personal data and the potential consequences from the breach – the data could be used for fraudulent purposes.

Other notable cyber attacks

The Talk Talk breach is one of several security breaches to have come to light in recent months. The size and scale of these breaches underline the Commissioner’s point that companies urgently need to take stock of their cyber security arrangements.

  • Myspace: In June this year, Myspace discovered 360 million passwords and email addresses had been stolen in a hack that occurred in 2013; these details were discovered listed on the dark web.
  • Yahoo: In August, Yahoo discovered that at least 500 million of its accounts had been hacked in 2014. Yahoo only discovered the 2014 breach because it was investigating reports of a separate breach. The theft is the world’s biggest cyber breach so far. The data stolen included names, email addresses, telephone numbers, dates of birth and encrypted passwords.
  • Tesco Bank: Early this month, Tesco Bank suffered a serious cyber-attack which affected 40,000 customer accounts. Money was stolen from 9,000 current accounts, forcing Tesco Bank to suspend all online transactions. Its security arrangements are currently being investigated by a number of regulatory bodies including the National Crime Agency and the ICO. However, a number of cyber security experts have indicated that its software was vulnerable and was being targeted by cyber criminals for months. Notwithstanding any fines Tesco Bank may be required to pay, it has already spent £2.5 million compensating customers for their losses.

Practical steps for securing data

By being vigilant and proactive, companies ought to be able to prevent significant security breaches and the regulatory fines and compensation payments they incur, not to mention the stigma that such breaches attract.

The following practical steps should be considered to enhance data security:

  • Updates Policy: it is good practice to have an updates policy for software which is used to process personal data and to ensure all software components are included in the policy (e.g. operating systems, applications, libraries and development frameworks);
  • Testing: regularly test and monitor online systems and software for common threats such as SQL injections;
  • Unnecessary Services: completely decommission any service that is not necessary and periodically review remaining services; and
  • Encryption: use encryption schemes to secure the communication of data across the internet.
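On the encryption point, the baseline for data in transit is TLS with certificate verification left switched on. A minimal sketch with Python’s standard ssl module (illustrative only; it constructs a client context and does not connect anywhere):

```python
import ssl

# The default context verifies server certificates and hostnames,
# which is the setting to preserve when transmitting personal data.
context = ssl.create_default_context()

print(context.verify_mode == ssl.CERT_REQUIRED)  # True
print(context.check_hostname)                    # True
```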

Higher fines under the General Data Protection Regulation (“GDPR”)

The maximum fine the ICO is currently able to award under the DPA is £500,000. The new General Data Protection Regulation (GDPR), which will have effect from May 2018, offers the ICO the potential to fine up to 20,000,000 EUR or up to 4% of annual worldwide turnover, whichever is the higher.
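The new ceiling is the higher of the two figures, so for large businesses it is driven by turnover. A short illustration in Python (EUR amounts; the turnover figures are invented):

```python
def max_gdpr_fine(annual_worldwide_turnover_eur):
    """Upper limit on a GDPR fine: the higher of EUR 20m or 4% of turnover."""
    return max(20_000_000, 0.04 * annual_worldwide_turnover_eur)

print(max_gdpr_fine(100_000_000))    # 4% is EUR 4m, so the EUR 20m floor applies
print(max_gdpr_fine(1_000_000_000))  # 4% of EUR 1bn: EUR 40m
```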

That’s 20m reasons for companies to review their data security policies and practices.


Josey Bright is an associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at jbright@foxwilliams.com

Facebook, WhatsApp and mission creep

Emma Roake

German regulators have slapped down WhatsApp’s move to share its users’ data with parent company Facebook, calling it an “infringement of national data protection law”.

Despite Facebook and WhatsApp publicly committing in 2014 (when Facebook bought WhatsApp) that users’ data would not be shared between the two companies, recent changes to WhatsApp’s terms and conditions have reversed this position.  The new terms and conditions state that user data (including the mobile number and device information of the WhatsApp user) will be shared with Facebook, including for targeted advertising purposes.  The terms and conditions automatically opt in users to the data-sharing arrangement.

However, in the last few days of September, the Hamburg data protection commissioner issued an administrative order which:

  • prohibits Facebook from collecting and storing the data of German WhatsApp users; and
  • compels Facebook to destroy any data which has already been collected from German WhatsApp users.

The Hamburg data protection commissioner has said that the WhatsApp user’s consent needs to be obtained to the data-sharing for it to be lawful, and this had not happened.

Facebook is appealing the decision.

The changes to WhatsApp’s terms and conditions have caused widespread controversy since being announced, and have caused concern with data regulators around the world.

The UK’s data protection regulator (the ICO) has announced that it is investigating the data-sharing on behalf of WhatsApp users in the UK. Elizabeth Denham (the new Information Commissioner) commented in an interview on BBC Radio 4 that there was a “lot of anger” amongst the UK’s WhatsApp users. Ms Denham also addressed the WhatsApp/Facebook data-sharing arrangement in her first speech as Information Commissioner on 29 September 2016, commenting that “all of this is about transparency and individual control”.

Transparency and trust were the central themes of Ms Denham’s first speech, where she explained that her fundamental objective as information commissioner was to build a culture of data confidence in the UK.  She noted her concern that an ICO survey from earlier in the year had shown that only 1 out of every 4 adults trust businesses with their personal data.

Ms Denham made clear that the ICO would pick and choose its investigations carefully, making sure that those investigations were relevant to the public.  Unsurprisingly, she said that technology “is already at the forefront” of most of the ICO’s major investigations.  For example, in addition to investigating the change in WhatsApp terms and conditions, the ICO has in the last few weeks asked questions about the major Yahoo data breach.

The ICO has indicated that it will be putting out an update soon on its WhatsApp/Facebook investigation.  It will be interesting to see whether the ICO follows the approach of the German regulators.

Emma Roake is a senior associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at eroake@foxwilliams.com

New Code of Practice on Privacy Policies

Sian Barr

The ICO’s new Code of Practice on Communicating Privacy Information to Individuals goes beyond the form of privacy notice that we are accustomed to seeing when we hand over our personal information. It advocates a blended approach of selecting a number of different techniques to communicate privacy details to individuals when they hand over their personal data.

According to the ICO, the benefits of the blended approach include:

  • greater control for individuals over how their personal data is used;
  • greater choice for individuals over how their personal data is used;
  • the ability to demonstrate that personal data is being used fairly and transparently;
  • preference management tools, which mean you are more likely to get better and more specific information from individuals; and
  • a greater likelihood of being able to demonstrate that informed consent has been provided.

Drafting privacy notices in accordance with the Code

The Code is full of detailed and helpful guidance on preparing privacy notices, including the following:

Have a plan – consider whether your intended uses of the information would be reasonably expected by the individual. If not, your privacy notice should explain the uses in greater detail. Make predictions of likely future uses, especially as part of big data, and include this information in the notice. Put yourself in the shoes of the individual: carry out a privacy impact assessment.

Blended approach – make use of the privacy-enhancing technologies available such as just-in-time solutions, voice or video, privacy dashboards, icons and symbols.

Avoid catch-all privacy notices – instead, have separate notices tailored to groups.

Control – it is good practice to link the notice to a preference management tool such as a privacy dashboard; be clear about which information is required and which is optional.

Adapt to your business model – the privacy notice should cover all platforms through which the individual can access your services.

Consent – consider whether the individual needs to consent to the processing described in the privacy notice and, if so, include a mechanism for giving and obtaining consent at the appropriate time.

Active communication – where appropriate, privacy information should be actively communicated to individuals (as opposed to the individual having to seek it out through, e.g., a web link), for example if the uses are likely to be unexpected, or if information could be combined with other sources to build a more detailed picture of an individual.

Collaborative resource – where several data controllers are involved, the ICO suggests that in addition to individual privacy notices, a collaborative resource which brings together all privacy information could be the way forward.  Such a resource could allow the individual to make and apply privacy preferences across all data controllers.

Encourage individuals to take notice – word privacy notices in an engaging way and embed them into the user journey.

Comment

When dealing with complex transactions or platforms which involve personal data collection, compliance with the principles may require a range of privacy communication techniques to be used.  The key is to employ these techniques with a focus on how they can enhance the user experience, rather than over-complicate it.

What do you think about the proposed new Code? The Code is open for consultation until 24 March 2016.