New ePrivacy Regulation – implications for Ad-tech

Josey Bright

On 10 January this year, the European Commission published a proposal for a new ePrivacy Regulation (the “ePrivacy Regulation”) to update and replace the current ePrivacy Directive (the “Directive”).

The ePrivacy Regulation, which is part of the Commission’s Digital Single Market Strategy, is designed to align closely with the provisions of the General Data Protection Regulation (GDPR), which was adopted in May 2016. The Commission intends that the ePrivacy Regulation will come into force on the same date as the GDPR, 25 May 2018. However, as it is still to be finalised and approved, this timetable may be overly ambitious. It is currently reported that the aim is to finalise the ePrivacy Regulation by the end of 2018.

As a Regulation, just like the GDPR, it will be directly applicable in all EU Member States without the need for implementing national laws.

The main aim of the ePrivacy Regulation is to increase privacy protection for users of electronic communications.

The key features of the proposed ePrivacy Regulation are:

  1. Broader scope

The new ePrivacy Regulation will apply to any company processing data in connection with communication services, including all providers of electronic communications services.

This includes “over-the-top” service providers such as text message, email and messaging app providers so services such as WhatsApp, Facebook Messenger and Skype will be within scope of the ePrivacy Regulation.

Like the GDPR, the ePrivacy Regulation will have an extended reach in that non-EU providers providing electronic services to users in the EU will also be within scope of the ePrivacy Regulation.

  2. Content and metadata included

All electronic communications data are covered by the ePrivacy Regulation. However, the ePrivacy Regulation distinguishes between content data (what is actually said in the communication) and metadata (data related to the communication, such as the time, location and duration of a call or website visit). Separate rules apply to each type of data:

  • Content can only be used if the end user has consented to its use for a specified purpose and the processing is necessary for the provision of the service.
  • Metadata can only be used where it is necessary for the quality of the service, such as billing, payments, and detecting and/or stopping fraudulent or abusive use of the service.

In circumstances where all end users have consented to the use of content or metadata for a purpose which cannot be fulfilled if the information is anonymised, the data may be used provided that the service provider has consulted the competent EU Data Protection Authority (in the UK, the Information Commissioner’s Office (ICO)) before the processing is carried out.

The threshold for consent under the ePrivacy Regulation is defined by reference to the GDPR. This means consent must be “freely given, specific, informed and unambiguous” given by “a statement or by a clear affirmative action”. Like the GDPR, end users must also be given the right to withdraw their consent at any time.

  3. Storage and erasure of data required

The ePrivacy Regulation includes provisions requiring service providers to erase or anonymise all content after it is received by the end user.

All metadata must also be erased or anonymised once the permitted purpose has been fulfilled, except where such data is required for billing purposes.

  4. Cookie consent options

Like the Directive, the ePrivacy Regulation also provides that the consent of the end user is required for the use of cookies and similar technology. However, the current proposal is that consent can be built into the browser software set-up so that users can tailor their cookie consent choices at the point of installation, rather than by using cookie banners and pop ups.

In addition, analytics cookies which are not privacy-intrusive will not require consent (i.e. those used for web audience measurement, or to remember shopping cart details or login information for the same session).

  5. Direct marketing rules

The ePrivacy Regulation distinguishes between business to consumer communications (B2C) and business to business communications (B2B).

As under the Directive, unsolicited commercial communications are not permitted. In B2C marketing, prior consent (opt-in) is required. Consent will not be required when marketing similar products or services to existing customers, but a right to object must be provided.

For B2B marketing, the ePrivacy Regulation allows for Member States to determine that the legitimate interests of corporate end users are sufficiently protected from unsolicited communication.

  6. Enforcement and higher fines in line with GDPR

The Information Commissioner’s Office (ICO) will be responsible for enforcement of the ePrivacy Regulation and the GDPR in the UK.

Currently, the ICO can only fine companies up to £500,000 for breaches of PECR (the national legislation which implements the Directive). The ePrivacy Regulation introduces fines in line with the GDPR (i.e. up to €20,000,000 or 4% of total worldwide annual turnover, whichever is higher).
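The “whichever is higher” fine cap works as a simple two-part calculation; the sketch below is purely illustrative (the function name and example turnover figures are ours, not from the Regulation text):

```python
def max_fine_eur(annual_worldwide_turnover_eur: float) -> float:
    """Upper limit of a fine under the GDPR-aligned regime: the higher of
    EUR 20,000,000 or 4% of total worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * annual_worldwide_turnover_eur)

# For a company with EUR 300m turnover, 4% is EUR 12m, so the EUR 20m floor applies.
print(max_fine_eur(300_000_000))    # 20000000.0
# For a company with EUR 1bn turnover, 4% is EUR 40m, which exceeds the floor.
print(max_fine_eur(1_000_000_000))  # 40000000.0
```

In other words, the 4% figure only bites for businesses whose worldwide turnover exceeds €500 million; below that, the €20 million figure is the cap.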

In addition, the ePrivacy Regulation confers on users of electronic communications services a right to seek compensation directly from service providers if they have “suffered material or non-material damage as a result of an infringement”.

Implications

The ePrivacy Regulation is critically important for many ad-tech businesses: the need to obtain specific opt-in consent could be highly problematic for intermediaries who do not have a direct relationship with end users, and soliciting that consent via publishers, while legally possible, may be impracticable.

All this is not helped by the fact that there is uncertainty around the final form of the ePrivacy Regulation; for example, as to whether valid consent can be managed within the browser.

As if compliance with the GDPR did not present enough challenges, the ad-tech industry, as well as individual businesses, needs to move quickly to prepare for these forthcoming changes in ePrivacy.

 

Josey Bright is an associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at jbright@foxwilliams.com

Data, duties and directors

Jessica Calvert

The ICO blog recently reported that, of the £2.7 million worth of fines issued in relation to nuisance calls since April 2015, only 6 of the 27 fines issued have been paid, leaving a total of £2.26 million in penalties unpaid. The Privacy and Electronic Communications (EC Directive) Regulations 2003 (“Privacy Regulations”) contain powers for the ICO to fine companies which make marketing calls and texts where the recipients have not consented to be contacted.

Recent fines that have been issued include:

  • a £70,000 fine to London based Nouveau Finance Limited, a company that sent 2.2 million spam text messages without consent from the recipients;
  • a £30,000 fine to Assist Law, a will-writing firm in Weston-super-Mare, for making unsolicited marketing calls, over the course of more than a year, to persons registered with the Telephone Preference Service (TPS).

Many of the companies fined have, however, so far avoided paying by filing for insolvency; as the regulator put it, they are “leaving by the back door as the regulator comes through the front door”.

At present the ICO can issue fines of up to £500,000 where there has been a serious contravention. These can be imposed on any legal person (e.g. a business, a charity or an individual), but there is no specific power to fine the directors of the companies responsible. A change to legislation is expected in Spring 2017 which will introduce fines of up to £500,000 for directors of nuisance marketing firms, and hopefully break the cycle whereby the same directors continue to operate under a new company.

The change in law should also be noted by all directors that fall within the remit of the Data Protection Act 1998 (“DPA”), if not the Privacy Regulations, as there is a clear move being made to seek to penalise those accountable for breaches relating to personal data. Points worth noting are:

  • The ICO has the power to fine directors for breaches of the Data Protection Act where the breach can be shown to have occurred with a director’s consent, connivance or neglect;
  • Under the GDPR, fines of up to 4% of annual worldwide turnover, or €20 million, whichever is greater, will be possible;
  • When the GDPR takes effect, data processors as well as data controllers will be caught; and
  • Breach of general director duties to act in good faith, in the best interests of the company, and to exercise reasonable care, skill and diligence could result in an action for damages, termination of a directorship, or disqualification as a director.

Jessica Calvert is an associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at jcalvert@foxwilliams.com

Facebook, WhatsApp and mission creep

Emma Roake

German regulators have slapped down WhatsApp’s move to share its users’ data with parent company Facebook, calling it an “infringement of national data protection law”.

Despite Facebook and WhatsApp publicly committing in 2014 (when Facebook bought WhatsApp) that users’ data would not be shared between the two companies, recent changes to WhatsApp’s terms and conditions have reversed this position.  The new terms and conditions state that user data (including the mobile number and device information of the WhatsApp user) will be shared with Facebook, including for targeted advertising purposes.  The terms and conditions automatically opt in users to the data-sharing arrangement.

However, in the last few days of September, the Hamburg data protection commissioner issued an administrative order which:

  • prohibits Facebook from collecting and storing the data of German WhatsApp users; and
  • compels Facebook to destroy any data which has already been collected from German WhatsApp users.

The Hamburg data protection commissioner has said that the WhatsApp user’s consent needs to be obtained to the data-sharing for it to be lawful, and this had not happened.

Facebook is appealing the decision.

The changes to WhatsApp’s terms and conditions have caused widespread controversy since being announced, and have caused concern with data regulators around the world.

The UK’s data protection regulator (the ICO) has announced that it is investigating the data-sharing on behalf of WhatsApp users in the UK.  Elizabeth Denham (the new information commissioner) commented in an interview with BBC’s Radio 4 that there was a “lot of anger” amongst the UK’s WhatsApp users.  Ms Denham also addressed the WhatsApp / Facebook data-sharing arrangement in her first speech as information commissioner on 29 September 2016, commenting that “all of this is about transparency and individual control”.

Transparency and trust were the central themes of Ms Denham’s first speech, where she explained that her fundamental objective as information commissioner was to build a culture of data confidence in the UK.  She noted her concern that an ICO survey from earlier in the year had shown that only 1 out of every 4 adults trust businesses with their personal data.

Ms Denham made clear that the ICO would pick and choose its investigations carefully, making sure that those investigations were relevant to the public.  Unsurprisingly, she said that technology “is already at the forefront” of most of the ICO’s major investigations.  For example, in addition to investigating the change in WhatsApp terms and conditions, the ICO has in the last few weeks asked questions about the major Yahoo data breach.

The ICO has indicated that it will be putting out an update soon on its WhatsApp/Facebook investigation.  It will be interesting to see whether the ICO follows the approach of the German regulators.

Emma Roake is a senior associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at eroake@foxwilliams.com

Telegraph Media Group fined £30,000 by ICO

Laura Monro

As 2015 draws to a close, the Information Commissioner’s Office has fined the Telegraph Media Group Ltd £30,000 for a serious breach of the UK Privacy and Electronic Communications Regulations (“PECR”). The PECR set out specific rules in respect of electronic communications. In particular, the PECR prevent the sending of unsolicited marketing and advertising by electronic means without the individual’s consent to such marketing and advertising.

On the day of the general election earlier this year, the Telegraph Media Group sent out its daily editorial e-bulletin which included a letter from the editor of the Telegraph newspaper urging its readers to vote Conservative. Whilst subscribers to the Telegraph Media Group had signed up, and hence consented to receiving, the editorial e-bulletin, the ICO found that by promoting a particular election campaign the nature of the e-bulletin had changed from an editorial communication to a ‘marketing communication’.

In order to amount to valid consent to receiving a particular electronic communication under the PECR, consent must be knowingly given, clear, and specific. In the circumstances, the Telegraph Media Group did not have the specific consent of the readers to send such a marketing communication and the communication was sent in breach of the PECR.  The ICO Head of Enforcement considered that the Telegraph had been negligent in sending the letter from the editor as part of the e-bulletin and explained that “people signed up to The Telegraph’s email service so they could catch up on the news or find out about subjects they were interested in. They did not expect to be told who they should be voting for.”

The ICO has the power to impose a monetary penalty on a data controller of up to £500,000 in respect of such a breach. However, the relatively low amount of £30,000 was determined by the fact that only 17 complaints were received, and that the email in question was a late addition to the usual mailing. The ICO acknowledged that there was pressure to distribute it quickly and little time to properly consider whether it should be included in the mailing.

This case serves as a reminder of the scope of the PECR and the enforcement action open to the ICO for those who ignore the rules.

Privacy and mobile apps

As with any other business or project, developers of mobile apps need to comply with the Data Protection Act.

A typical mobile ecosystem contains many different components, including mobile devices themselves, their operating systems, plus apps provided through an app store. In many ways these are simply developments of earlier technologies used on less portable hardware, but the mobile environment has some particular features that make privacy a particular concern.  For example:

  • Mobile devices such as smartphones and tablets are portable, personal, frequently used and commonly always on.
  • A mobile device typically has direct access to many different sensors and data, such as a microphone, camera and GPS receiver, together with the user’s combined data including email, SMS messages and contacts.
  • There are many different app configurations possible, and it is not necessarily obvious how an app deals with personal information behind its user interface.
  • Mobile devices often have small screens, typically with touch-based interfaces. This can make it more challenging for apps to effectively communicate with app users.
  • Consumers’ expectations of convenience can make it undesirable to present a user with a large privacy policy, or a large number of prompts, or both.

A survey of over 1,200 mobile apps by 26 privacy regulators from across the world has shown that a high number of apps are accessing large amounts of personal information without adequately explaining how people’s information is being used.

The key findings of the survey are:

  • 85% of the apps surveyed failed to clearly explain how they were collecting, using and disclosing personal information.
  • More than half (59%) of the apps left users struggling to find basic privacy information.
  • Almost 1 in 3 apps appeared to request an excessive number of permissions to access additional personal information.
  • 43% of the apps failed to tailor privacy communications to the small screen, either by providing information in print that was too small to read, or by hiding the information in lengthy privacy policies that required scrolling or clicking through multiple pages.

The research did find examples of good practice, with some apps providing a basic explanation of how personal information is being used, including links to more detailed information if the individual wants to know more. The regulators were also impressed by the use of just-in-time notifications on certain apps that informed users of the potential collection, or use, of personal data as it was about to happen. These approaches make it easier for people to understand how their information is being used and when.

The Information Commissioner’s Office (ICO) has recently published ‘Privacy in Mobile Apps’ guidance to help app developers in the UK handle people’s information correctly and meet their requirements under the UK Data Protection Act.

As with all aspects of software, privacy is much easier to consider from the outset of a project rather than as an afterthought. This concept is often referred to as ‘privacy by design’.

If the app which you are developing may handle personal data, then you must comply with the Data Protection Act. Personal data is not limited to the usual identifiers such as names and addresses; it can include a unique device identifier such as an IMEI number. Even though this does not name the individual, if it is used to treat individuals differently it will meet the definition of personal data.

Some specific guidance points are as follows:

  • If you are a data controller, you need to register with the ICO.  Failure to do so is a criminal offence.
  • Carry out a privacy impact assessment to identify what personal data should be kept confidential, and a security assessment as to whether the app does in fact ensure confidentiality of the relevant data.
  • If any personal data is to be transferred outside the European Economic Area (EEA), you will have to ensure that legal safeguards are implemented to provide adequate protection for it.
  • You should only collect and process the minimum data necessary for the tasks that you want your app to perform. Collecting data just in case you may need it in future is bad practice, even when the user has consented to provide that information.
  • Additionally, you must not store personal data for longer than is necessary for the task at hand. You should therefore define retention periods for the personal data you will hold.
  • If your app is aimed at children, pay particular attention to what personal data you may be collecting.
  • You should allow your users to permanently delete their personal data and any account they may have set up with you. You should only make an exception if you are legally obliged to keep the data.
  • If you want to collect usage or bug report data, this is possible, but typically must be done either with informed consent from the user; or using anonymised data.
  • Users of your app must be properly informed about what will happen to their personal data if they install and use the app.
  • Privacy information is typically provided via a privacy policy. There is no requirement for this to be in one large document. In fact, in the mobile environment, this approach can be a hindrance. The relevant information can be provided in ways that better suit the small screen and touch-based interface of a typical mobile device.
  • Make relevant privacy information available as soon as practicable. Ideally this would be done before the user downloads the app, and could be done via an app store or via a link to your privacy policy. Where you provide privacy information after an app is downloaded and installed, make sure that this is done before the app processes the relevant personal data.
  • If appropriate, use a ‘layered’ approach where the most important points are summarised, with more detail easily available if the user wants to see it.
  • Give users a granular choice where possible. This allows the user to make meaningful decisions rather than giving the user a single ‘all or nothing’ choice.
  • Allow your users to easily review and change their decisions once the app is installed and in use. Give them a single and obvious place to go to configure the various settings within the app and give them privacy-friendly defaults. It should be as quick to disable a setting as it was to enable it.
  • If your app processes personal data in an unexpected way or is of a more sensitive nature you might need to consider the use of additional ‘just-in-time’ notifications or other alert systems to inform the user what’s happening. For example, if geo-location services are running in the background or you are uploading data to the internet, consider using clear and recognisable icons to indicate that this is occurring and where necessary the option to stop (e.g. to cancel an upload).
  • Take advantage of encrypted connections to ensure security of data in transit, by using SSL / TLS for instance. You should always use encrypted connections for transmitting usernames, passwords and any particularly sensitive information, including device IDs or other unique IDs.
  • You should be particularly careful if your app accesses data from other apps or locations; respect the sensitivity of the data in the context of its original purpose, not solely in the context of your app.
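The point about encrypted connections can be illustrated with Python’s standard library. This is a minimal sketch only: the endpoint URL and payload are hypothetical, and the function names are ours. The key ideas are that the TLS context verifies the server’s certificate chain and hostname (the default behaviour of `ssl.create_default_context`), and that older protocol versions are refused.

```python
import ssl
import urllib.request

def secure_context() -> ssl.SSLContext:
    """TLS context that verifies the server certificate chain and hostname
    (the defaults for create_default_context) and refuses protocol
    versions older than TLS 1.2."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

def post_sensitive(url: str, payload: bytes) -> bytes:
    """Send credentials, device IDs or other sensitive values only over
    an encrypted, authenticated channel."""
    req = urllib.request.Request(url, data=payload, method="POST")
    with urllib.request.urlopen(req, context=secure_context()) as resp:
        return resp.read()

# Usage (hypothetical endpoint):
# post_sensitive("https://api.example.com/v1/login", b"username=alice&password=...")
```

The same principle applies whatever HTTP library or platform API your app uses: never disable certificate verification in production builds, even temporarily during development, as this is a common source of insecure apps.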