The use of location data by mobile apps post-GDPR

This article was first published on Lexis®PSL TMT on 24 September 2018.

From the perspective of a party providing an app via an app store, what regulations govern the use of location data by that mobile app?

The key consideration is data privacy and, therefore, the main regulation to consider is the General Data Protection Regulation (GDPR) which came into force on 25 May 2018. This will apply to the app provider if they carry out processing of personal data on the device.

While there is as yet no specific guidance under the GDPR on the use of location data by apps, in 2011 the Article 29 Data Protection Working Party (now the European Data Protection Board (EDPB)) adopted Opinion 13/2011 on “Geolocation services on smart mobile devices” and, in 2013, Opinion 2/2013 on “Apps on smart devices”. Although these Opinions relate to the Data Protection Directive (95/46/EC), much of their content remains relevant under the GDPR.

In the UK, you should also take into account the Data Protection Act 2018 which supplements the GDPR in certain areas (such as in relation to special categories of personal data and data subject rights) although not specifically in relation to location data.

To what extent / in what circumstances will the Privacy and Electronic Communications Regulations 2003 regulate the use of location data by mobile app providers? What exemptions apply and does PECR 2003 apply to ‘information society services’?

Under regulation 6 of PECR (as amended by the 2011 Regulations), it is unlawful to gain access to information stored in the terminal equipment of a subscriber or user unless the subscriber or user (a) is provided with clear and comprehensive information about the purposes of the access to that information; and (b) has given his or her consent. This applies irrespective of whether or not the location data is “personal data”.

Regulation 14 relates specifically to the processing of location data and provides that you can only process location data if you are a public communications provider, a provider of a “value-added service”, or a person acting on the authority of such a provider, and only if: (a) the data is anonymous; or (b) you have the user’s consent to use it for a value-added service, and the processing is necessary for that purpose. This does not apply to data collected independently of the network or service provider, such as GPS-based location data or data collected by a local Wi-Fi network. However, the use of such data will still need to comply with the GDPR.

To what extent / in what circumstances will the GDPR regulate the use of location data collected from mobile apps by mobile app providers?

The GDPR will apply if the app provider collects the location data from the device and if it can be used to identify a person.

If the data is anonymised such that it cannot be linked to a person, then the GDPR will not apply. However, if the location data is processed with other data relating to a user, the device or the user’s behaviour, or is used in a manner that singles out individuals from others, then it will be “personal data” and fall within the scope of the GDPR even if traditional identifiers such as name and address are not known.
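To illustrate why merely coarsening location data may not amount to anonymisation, here is a minimal Python sketch (the coordinates, the `coarsen` granularity and the traces are illustrative assumptions, not a compliance test): even after reducing precision, each user's most-visited cell can still distinguish them.

```python
from collections import Counter

def coarsen(lat: float, lon: float, places: int = 2) -> tuple:
    """Reduce coordinate precision to roughly 1 km (2 decimal places)."""
    return (round(lat, places), round(lon, places))

def top_cell(trace):
    """Most-visited coarsened cell -- typically a home or workplace."""
    return Counter(coarsen(lat, lon) for lat, lon in trace).most_common(1)[0][0]

# Hypothetical traces: both users visit the same public landmark, but
# their most-visited cells differ, so the coarsened traces can still
# single each user out.
alice = [(51.5155, -0.0922), (51.5154, -0.0921), (51.5076, -0.1276)]
bob = [(51.4545, -0.9781), (51.4544, -0.9780), (51.5076, -0.1276)]

assert top_cell(alice) != top_cell(bob)  # still distinguishable
```

The point of the sketch is that data which no longer contains a name can still "single out" an individual, which is enough to keep it within the GDPR's definition of personal data.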

Opinion 13/2011 sets out the regulator’s view that a device is usually intimately linked to a specific individual and that location data will, therefore, be regarded as “personal data”. Indeed, the definition of “personal data” in the GDPR specifically includes location data as one of the elements by reference to which a person can be identified. The Opinion comments that providers of geolocation-based services gain “an intimate overview of habits and patterns of the owner of such a device and build extensive profiles.”

Furthermore, in certain contexts, location data could be linked to special category personal data (sensitive personal data). For example, location data may reveal visits to hospitals or places of worship or presence at political demonstrations.

How is compliance with such laws commonly addressed by app providers?

To process the data derived from the device or the app, the app provider needs to have a legal basis.

Contract necessity may apply to some uses of the location data. For other uses, depending on the app, it may be problematic to rely on “legitimate interests” as a lawful basis for tracking individuals using location data, for example, to serve location specific ads. Therefore, in many cases the app provider will need to rely on the user’s “consent” for processing location data.

How should app providers respond to recent changes in the law (e.g., the introduction of GDPR) impacting their apps’ use of location data?

Where app providers rely on “consent” as the legal basis, they will need to ensure that this meets the stricter requirements for consent under GDPR. This can be challenging given the constraints of the mobile app environment.

Transparency is essential. The Article 29 Guidelines on transparency WP260 rev.01 indicate that, for apps, the Article 13 privacy information should be made available from the app store before download. Once the app is installed, the privacy information needs to be easily accessible from within the app. The recommendation is that it should never be more than “two taps away” (e.g. by including a “Privacy” option in the app menu). Use of layered notices and contextual real time notifications will be particularly helpful on a mobile device.

The device’s operating system (such as iOS) may require the user’s permission to use the location data, for example via a dialogue box asking if the user agrees to allow the app to access the user’s location, either while using the app or in the background. Clicking the “allow” button enables location services on the device and may also help signify consent, provided that this has been sufficiently informed and is sufficiently granular.

If the app integrates with a third-party provider to enable, for example, location-based advertising the consent to use location data must be sufficiently explicit to include consent to data collection for advertising purposes by the third party, including the identity of the third party. Data sharing arrangements may also be required between the app provider and the third party.

Where children (in the UK, under 13) may be involved, consent must be given or authorised by the holder of parental responsibility over the child.

Following GDPR, app providers should review their data security and retention policies for compliance with the Article 5 principles.

App providers should be mindful of the principles of privacy by design and by default: for example, location services should be switched off by default and their use should be customisable by the user.

Finally, using location data may involve “profiling” within the meaning of Article 4(4) which specifically refers to analysing location data. As such, consideration should be given to whether a data protection impact assessment (DPIA) is required under Article 35 or, if not required, should be undertaken as good practice.

Are there any forthcoming or anticipated changes to the law which may impact on use of location data by mobile app providers?

The ePrivacy Directive on which PECR is based is currently under review to be updated and aligned with GDPR in the form of the ePrivacy Regulation.

This is not yet finalised and its implementation date is not certain, but it may be in 2019 or 2020. However, GDPR-grade consent will still be required for use of location data, subject to certain exceptions including where strictly necessary for providing an information society service specifically requested by the individual. Assuming the ePrivacy Regulation takes effect after Brexit, it remains to be seen whether and how it will be implemented in the UK, but implementation can be expected in the interests of preserving UK “adequacy” status.

 

Nigel Miller leads Fox Williams’ technology and data protection group. Nigel is a Certified Information Privacy Professional/Europe (CIPP/E).


The consent trap

Nigel Miller

Having got past 25 May 2018, the day the GDPR came into effect, the torrent of GDPR emails is beginning to abate.

It would be interesting to analyse how many GDPR emails were sent in the run up to the go live date seeking consent to continue being in contact, as against the percentage of recipients who then responded to opt-in. And how many trumpeted a new privacy policy, as against the percentage of recipients who actually read the new policy. I suspect the percentages in each case will be low! Indeed, many people have expressed satisfaction that, by doing nothing and not confirming consent when requested, they can reduce the flow of unwanted spam into their inbox.

But were all these emails necessary, and in particular, was it actually necessary to seek consent?

In many cases it was not necessary to seek consent to “stay in touch” and continue email marketing.

Under the GDPR, consent is one of the legal bases for processing, but it is not the only one. In most cases, organisations will be able to rely on the “legitimate interests” ground to remain in contact with their contact list. Recital 47 GDPR expressly says that processing of personal data for direct marketing purposes may be regarded as carried out for a legitimate interest. Subject to confirming this in a “legitimate interests assessment”, many businesses can rely on legitimate interests to justify processing the personal data of clients on their mailing lists without needing to re-affirm consent. The GDPR expressly acknowledges that businesses may have a legitimate interest in direct marketing activities, which could include circulating invitations to events or updates on new products and services. This is an appropriate basis for processing where you use data in ways that people would reasonably expect and with minimal privacy impact, especially as a recipient should always be able to opt out of future marketing easily.

While permission-based marketing is certainly to be preferred, unless consent is actually required there is no need to seek specific GDPR-grade consent, which may predictably result in the contact database being decimated by recipient inertia and GDPR fatigue.

That all said, there is a key exception where consent to email marketing may be required. This requirement is not found in the GDPR; instead it is in the Privacy and Electronic Communications Regulations (“PECR”). These have been around since 2003 and are currently being upgraded to GDPR standards by a new ePrivacy Regulation, although this did not make it into law at the same time as the GDPR as originally planned; it is likely to come on stream within the next year or so.

PECR contains supplemental rules on consent for electronic marketing (i.e. marketing by email, phone, SMS or fax). Whilst you may not need consent under the GDPR, you may need consent under PECR.

Different rules apply depending on whether the marketing is sent to an ‘individual’ or a ‘corporate’ subscriber.

Marketing to a corporate email address does not need consent. However, if you are sending unsolicited marketing emails to individual subscribers (a personal email address), then you will need the individual’s consent, unless the so called “soft opt-in” applies (e.g. where the individual is an existing customer).

In summary, assuming you can justify “legitimate interests” for the continued contact, consent is not needed to continue marketing by post, or by email to existing customers or to contacts at corporate email addresses. Consent will only be needed to send direct marketing by email to personal email addresses of individuals who are not customers for similar products and services.
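The email-marketing rules summarised above can be sketched as a simplified decision function (the function name, inputs and binary outcome are illustrative simplifications assumed for the sketch; the actual analysis turns on the facts of each case):

```python
def pecr_consent_required(subscriber: str, soft_opt_in: bool) -> bool:
    """Does an unsolicited marketing email need prior consent under PECR?

    subscriber: "corporate" (business email address) or "individual"
    (personal email address). soft_opt_in: True where the recipient is an
    existing customer being marketed similar products or services.
    """
    if subscriber == "corporate":
        return False           # corporate addresses: no PECR consent needed
    return not soft_opt_in     # individuals: consent unless soft opt-in applies

# Consistent with the summary above:
assert pecr_consent_required("corporate", soft_opt_in=False) is False
assert pecr_consent_required("individual", soft_opt_in=True) is False
assert pecr_consent_required("individual", soft_opt_in=False) is True
```

Note that even where this returns False, a lawful basis under the GDPR (such as legitimate interests) is still needed for the underlying processing.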

Ironically, in an effort to be compliant, the email requesting consent to future marketing may itself be unlawful if consent was not already in place, and the ICO has fined organisations for engaging in this (e.g. Honda and Flybe). So, sending emails seeking consent may be either unnecessary or unlawful.

New ePrivacy Regulation – implications for Ad-tech

Josey Bright

On 10 January this year, the European Commission published a proposal for a new ePrivacy Regulation (the “ePrivacy Regulation”) to update and replace the current ePrivacy Directive (the “Directive”).

The ePrivacy Regulation, which is part of the Commission’s Digital Single Market Strategy, is designed to align closely with the provisions of the General Data Protection Regulation (GDPR), which was adopted in May 2016. The Commission intended that the ePrivacy Regulation would come into force on the same date as the GDPR, 25 May 2018. However, as it is still yet to be finalised and approved, this timetable has proved overly ambitious. It is currently reported that the aim is to finalise the ePrivacy Regulation by the end of 2018.

As it is a Regulation, just like the GDPR, it will be directly applicable in all EU Member States without the need for implementing national laws.

The main aim of the ePrivacy Regulation is to increase privacy protection for users of electronic communications.

The key features of the proposed ePrivacy Regulation are:

  1. Broader scope

The new ePrivacy Regulation will apply to any company processing data in connection with communication services, including all providers of electronic communications services.

This includes “over-the-top” service providers such as text message, email and messaging app providers so services such as WhatsApp, Facebook Messenger and Skype will be within scope of the ePrivacy Regulation.

Like the GDPR, the ePrivacy Regulation will have an extended reach in that non-EU providers providing electronic services to users in the EU will also be within scope of the ePrivacy Regulation.

  2. Content and metadata included

All electronic communications data are covered by the ePrivacy Regulation. However, it distinguishes between content data (what is actually said in the communication) and metadata (data related to the communication, such as the time, location and duration of a call or website visit). Separate rules apply to each type of data:

  • Content can only be used if the end user has consented to its use for a specified purpose and the processing is necessary for the provision of the service.
  • Metadata can only be used where it is necessary for the quality of the service, such as billing, payments, or detecting and/or stopping fraudulent or abusive use of the service.

In circumstances where all end users have consented to the use of content or metadata for a purpose which cannot be fulfilled if the information is anonymised, the data may be used provided that the service provider has consulted the competent EU Data Protection Authority (in the UK, the Information Commissioner’s Office (ICO)) before the processing is carried out.

The threshold for consent under the ePrivacy Regulation is defined by reference to the GDPR. This means consent must be “freely given, specific, informed and unambiguous” given by “a statement or by a clear affirmative action”. Like the GDPR, end users must also be given the right to withdraw their consent at any time.

  3. Storage and erasure of data required

The ePrivacy Regulation includes provisions requiring service providers to erase or anonymise all content after it is received by the end user.

All metadata must also be erased or anonymised once the permitted purpose has been fulfilled, except where such data is required for billing purposes.

  4. Cookie consent options

Like the Directive, the ePrivacy Regulation also provides that the consent of the end user is required for the use of cookies and similar technology. However, the current proposal is that consent can be built into the browser software set-up so that users can tailor their cookie consent choices at the point of installation, rather than by using cookie banners and pop ups.

In addition, analytics cookies which are not privacy-intrusive will not require consent (i.e. those used for web audience measurement, or to remember shopping cart details or login information for the same session).

  5. Direct marketing rules

The ePrivacy Regulation distinguishes between business to consumer communications (B2C) and business to business communications (B2B).

As under the Directive, unsolicited commercial communications are not permitted. In B2C marketing, prior consent (opt-in) is required. Consent will not be required when marketing similar products or services to existing customers, but a right to object must be provided.

For B2B marketing, the ePrivacy Regulation allows for Member States to determine that the legitimate interests of corporate end users are sufficiently protected from unsolicited communication.

  6. Enforcement and higher fines in line with GDPR

The Information Commissioner’s Office (ICO) will be responsible for enforcement of the ePrivacy Regulation and the GDPR in the UK.

Currently, the ICO can only fine companies up to £500,000 for breaches of PECR (the national legislation which implements the Directive). The ePrivacy Regulation introduces fines in line with the GDPR (i.e. up to €20,000,000 or 4% of total worldwide annual turnover, whichever is higher).
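The "whichever is higher" ceiling is a simple calculation, sketched below (the function name is illustrative; any actual fine is set by the regulator within this ceiling):

```python
def fine_cap_eur(worldwide_annual_turnover_eur: float) -> float:
    """Maximum fine: EUR 20m or 4% of total worldwide annual
    turnover, whichever is higher."""
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

assert fine_cap_eur(100_000_000) == 20_000_000     # 4% is EUR 4m, so the EUR 20m floor applies
assert fine_cap_eur(2_000_000_000) == 80_000_000   # 4% of EUR 2bn exceeds EUR 20m
```

In other words, the fixed €20 million figure only bites for businesses with worldwide annual turnover below €500 million; above that, the 4% limb governs.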

In addition, the ePrivacy Regulation confers on users of electronic communications services a right to seek compensation directly from service providers if they have “suffered material or non-material damage as a result of an infringement”.

Implications

The ePrivacy Regulation is critically important for many ad-tech businesses, where the need to obtain specific opt-in consent could be highly problematic for intermediaries who do not have a direct relationship with end users, and where soliciting that consent via publishers, while legally possible, may be impracticable.

All this is not helped by the fact that there is uncertainty around the final form of the ePrivacy Regulation; for example, as to whether valid consent can be managed within the browser.

As if compliance with GDPR did not present enough challenges, the ad-tech industry, as well as individual businesses, need to move quickly to prepare for these forthcoming changes in ePrivacy.

 

Josey Bright is an associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at jbright@foxwilliams.com

Data, duties and directors

Jessica Calvert

The ICO blog recently reported that of the £2.7 million worth of fines issued in relation to nuisance calls since April 2015, only 6 of the 27 fines issued have been paid, leaving a total of £2.26 million penalties unpaid. The Privacy and Electronic Communications (EC Directive) Regulations 2003 (“Privacy Regulations”) contain powers for the ICO to fine companies which make marketing calls and texts, where the recipients have not consented to be contacted.

Recent fines that have been issued include:

  • a £70,000 fine to London based Nouveau Finance Limited, a company that sent 2.2 million spam text messages without consent from the recipients;
  • a £30,000 fine to Assist Law, a will-writing firm in Weston-super-Mare, for making unsolicited marketing calls for over a year to persons registered with the Telephone Preference Service (TPS).

Many of the companies fined have, however, so far avoided paying by filing for insolvency; as the regulator put it, they are “leaving by the back door as the regulator comes through the front door”.

At present the ICO can issue fines of up to £500,000 where there has been a serious contravention. These can be imposed on any legal person (e.g. a business or charity, or an individual), however there is no specific right to fine the directors responsible for such companies. A change to legislation is expected in Spring 2017 which will introduce fines of up to £500,000 for directors of nuisance marketing firms, and hopefully break the cycle whereby the same directors continue to operate under a new company.

The change in law should also be noted by all directors that fall within the remit of the Data Protection Act 1998 (“DPA”), if not the Privacy Regulations, as there is a clear move being made to seek to penalise those accountable for breaches relating to personal data. Points worth noting are:

  • The ICO has the power to fine directors for breaches of the Data Protection Act where the breach can be shown to have occurred with a director’s consent, connivance or neglect;
  • Under the GDPR, fines of up to 4% of annual worldwide turnover, or 20 million euros, whichever is greater, will be possible;
  • When the GDPR is enacted data processors as well as data controllers will also be caught; and
  • Breach of general director duties to act in good faith, in the best interests of the company, and to exercise reasonable care, skill and diligence could result in an action for damages, termination of a directorship, or disqualification as a director.

Jessica Calvert is an associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at jcalvert@foxwilliams.com

Facebook, WhatsApp and mission creep

Emma Roake

German regulators have slapped down WhatsApp’s move to share its users’ data with parent company Facebook, calling it an “infringement of national data protection law”.

Despite Facebook and WhatsApp publicly committing in 2014 (when Facebook bought WhatsApp) that users’ data would not be shared between the two companies, recent changes to WhatsApp’s terms and conditions have reversed this position.  The new terms and conditions state that user data (including the mobile number and device information of the WhatsApp user) will be shared with Facebook, including for targeted advertising purposes.  The terms and conditions automatically opt in users to the data-sharing arrangement.

However, in the last few days of September, the Hamburg data protection commissioner issued an administrative order which:

  • prohibits Facebook from collecting and storing the data of German WhatsApp users; and
  • compels Facebook to destroy any data which has already been collected from German WhatsApp users.

The Hamburg data protection commissioner has said that the WhatsApp user’s consent needs to be obtained to the data-sharing for it to be lawful, and this had not happened.

Facebook is appealing the decision.

The changes to WhatsApp’s terms and conditions have caused widespread controversy since being announced, and have caused concern with data regulators around the world.

The UK’s data protection regulator (the ICO) has announced that it is investigating the data-sharing on behalf of WhatsApp users in the UK.  Elizabeth Denham (the new information commissioner) commented in an interview with BBC’s Radio 4 that there was a “lot of anger” amongst the UK’s WhatsApp users.  Ms Denham also addressed the WhatsApp / Facebook data-sharing arrangement in her first speech as information commissioner on 29 September 2016, commenting that “all of this is about transparency and individual control”.

Transparency and trust were the central themes of Ms Denham’s first speech, where she explained that her fundamental objective as information commissioner was to build a culture of data confidence in the UK.  She noted her concern that an ICO survey from earlier in the year had shown that only 1 out of every 4 adults trust businesses with their personal data.

Ms Denham made clear that the ICO would pick and choose its investigations carefully, making sure that those investigations were relevant to the public.  Unsurprisingly, she said that technology “is already at the forefront” of most of the ICO’s major investigations.  For example, in addition to investigating the change in WhatsApp terms and conditions, the ICO has in the last few weeks asked questions about the major Yahoo data breach.

The ICO has indicated that it will be putting out an update soon on its WhatsApp/Facebook investigation.  It will be interesting to see whether the ICO follows the approach of the German regulators.

Emma Roake is a senior associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at eroake@foxwilliams.com