The use of location data by mobile apps post-GDPR

This article was first published on Lexis®PSL TMT on 24 September 2018.

From the perspective of a party providing an app via an app store, what regulations govern the use of location data by that mobile app?

The key consideration is data privacy and, therefore, the main regulation to consider is the General Data Protection Regulation (GDPR), which came into force on 25 May 2018. This will apply to the app provider if they carry out processing of personal data on the device.

While there is as yet no specific guidance under the GDPR on the use of location data by apps, in 2011 the Article 29 Data Protection Working Party (now the European Data Protection Board (EDPB)) adopted Opinion 13/2011 on “Geolocation services on smart mobile devices” and in 2013 Opinion 2/2013 on “Apps on smart devices”. Although these Opinions relate to the Data Protection Directive (95/46/EC), much of their content is still relevant under the GDPR.

In the UK, you should also take into account the Data Protection Act 2018 which supplements the GDPR in certain areas (such as in relation to special categories of personal data and data subject rights) although not specifically in relation to location data.

To what extent / in what circumstances will the Privacy and Electronic Communications Regulations 2003 regulate the use of location data by mobile app providers? What exemptions apply and does PECR 2003 apply to ‘information society services’?

Under regulation 6 of PECR (as amended by the 2011 Regulations), it is unlawful to gain access to information stored in the terminal equipment of a subscriber or user unless the subscriber or user (a) is provided with clear and comprehensive information about the purposes of the access to that information; and (b) has given his or her consent. This applies irrespective of whether or not the location data is “personal data”.

Regulation 14 relates specifically to the processing of location data and provides that you can only process location data if you are a public communications provider, a provider of a “value-added service”, or a person acting on the authority of such a provider, and only if: (a) the data is anonymous; or (b) you have the user’s consent to use it for a value-added service, and the processing is necessary for that purpose. This does not apply to data collected independently of the network or service provider such as GPS-based location data or data collected by a local wifi network. However, the use of such data will still need to comply with the GDPR.

To what extent / in what circumstances will the GDPR regulate the use of location data collected from mobile apps by mobile app providers?

The GDPR will apply if the app provider collects the location data from the device and if it can be used to identify a person.

If the data is anonymised such that it cannot be linked to a person, then the GDPR will not apply. However, if the location data is processed with other data relating to a user, the device or the user’s behaviour, or is used in a manner to single out individuals from others, then it will be “personal data” and fall within the scope of the GDPR even if traditional identifiers such as name and address are not known.

Opinion 13/2011 sets out the regulator’s view that a device is usually intimately linked to a specific individual and that location data will, therefore, be regarded as “personal data”. Indeed, the definition of “personal data” in the GDPR specifically includes location data as one of the elements by reference to which a person can be identified. The Opinion comments that providers of geolocation-based services gain “an intimate overview of habits and patterns of the owner of such a device and build extensive profiles.”

Furthermore, in certain contexts, location data could be linked to special category personal data (sensitive personal data). For example, location data may reveal visits to hospitals or places of worship or presence at political demonstrations.

How is compliance with such laws commonly addressed by app providers?

To process the data derived from the device or the app, the app provider needs to have a legal basis.

Contract necessity may apply to some uses of the location data. For other uses, depending on the app, it may be problematic to rely on “legitimate interests” as a lawful basis for tracking individuals using location data, for example, to serve location-specific ads. Therefore, in many cases the app provider will need to rely on the user’s “consent” for processing location data.

How should app providers respond to recent changes in the law (e.g., the introduction of GDPR) impacting their apps’ use of location data?

Where app providers rely on “consent” as the legal basis, they will need to ensure that this meets the stricter requirements for consent under GDPR. This can be challenging given the constraints of the mobile app environment.

Transparency is essential. The Article 29 Guidelines on transparency WP260 rev.01 indicate that, for apps, the Article 13 privacy information should be made available from the app store before download. Once the app is installed, the privacy information needs to be easily accessible from within the app. The recommendation is that it should never be more than “two taps away” (e.g. by including a “Privacy” option in the app menu). Use of layered notices and contextual real-time notifications will be particularly helpful on a mobile device.

The device’s operating system (such as iOS) may require the user’s permission to use the location data, for example via a dialogue box asking if the user agrees to allow the app to access the user’s location, either while using the app or in the background. Tapping the “Allow” button enables location services on the device and may also help signify consent, provided that this has been sufficiently informed and is sufficiently granular.
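By way of illustration, the sketch below shows how an app might trigger that system dialogue using Apple’s CoreLocation framework (the API as it stood at the time of writing). The class name is illustrative; the text the user sees in the dialogue is taken from the NSLocationWhenInUseUsageDescription entry in the app’s Info.plist, so keeping that string clear and specific supports the “sufficiently informed” point above.

```swift
import CoreLocation

// A minimal sketch: prompting for "while in use" location permission on iOS.
// The purpose string shown in the system dialogue comes from the
// NSLocationWhenInUseUsageDescription key in Info.plist.
final class LocationPermissionManager: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    func requestPermissionIfNeeded() {
        // Only show the dialogue if the user has not yet made a choice.
        if CLLocationManager.authorizationStatus() == .notDetermined {
            manager.requestWhenInUseAuthorization()
        }
    }

    func locationManager(_ manager: CLLocationManager,
                         didChangeAuthorization status: CLAuthorizationStatus) {
        if status == .authorizedWhenInUse {
            // OS-level permission has been granted; GDPR consent for each
            // specific purpose (e.g. advertising) should still be captured
            // and recorded separately.
            manager.startUpdatingLocation()
        }
    }
}
```

Note that the OS-level permission and GDPR consent are distinct: the former unlocks the sensor, while the latter must be specific to each purpose for which the data will be used.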

If the app integrates with a third-party provider to enable, for example, location-based advertising, the consent to use location data must be sufficiently explicit to include consent to data collection for advertising purposes by the third party, including the identity of the third party. Data sharing arrangements may also be required between the app provider and the third party.
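To make that granularity concrete, the sketch below shows the kind of per-purpose consent record an app provider might keep. The type and field names are hypothetical rather than taken from any particular SDK; the point is that consent is recorded separately for each purpose and each named third party, with timestamps, so that withdrawal can be honoured precisely.

```swift
import Foundation

// Hypothetical consent record: one entry per purpose and per named third
// party, so consent is granular and withdrawal can be honoured precisely.
struct ConsentRecord: Codable {
    let purpose: String       // e.g. "location-based advertising"
    let thirdParty: String?   // the named recipient (e.g. an ad network), if any
    let grantedAt: Date       // when the user gave consent
    var withdrawnAt: Date?    // set when the user withdraws consent

    var isActive: Bool { withdrawnAt == nil }
}

// Example: consent to share location data with a named ad network.
let adConsent = ConsentRecord(purpose: "location-based advertising",
                              thirdParty: "Example Ad Network Ltd",
                              grantedAt: Date(),
                              withdrawnAt: nil)
```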

Where children (in the UK, under 13) may be involved, the consent must be given or authorised by the holder of parental responsibility over the child.

Following GDPR, app providers should review their data security and retention policies for compliance with the Article 5 principles.

App providers should be mindful of the principles of privacy by design and by default, so, for example, location services should, by default, be switched off and their use should be customisable by the user.
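A minimal sketch of what “off by default” might look like in practice follows; the preference key name is illustrative. Because an unset Boolean in UserDefaults reads as false, location features remain disabled unless and until the user actively switches them on.

```swift
import Foundation

// Privacy by default: location features ship switched off and only turn on
// after an explicit user choice. The key name is illustrative.
enum LocationPreference {
    private static let key = "locationFeaturesEnabled"

    static var isEnabled: Bool {
        // UserDefaults returns false for an unset Bool, so "off" is the default.
        get { UserDefaults.standard.bool(forKey: key) }
        set { UserDefaults.standard.set(newValue, forKey: key) }
    }
}
```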

Finally, using location data may involve “profiling” within the meaning of Article 4(4), which specifically refers to the analysis or prediction of a person’s location or movements. As such, consideration should be given to whether a data protection impact assessment (DPIA) is required under Article 35 or, if not required, should be undertaken as good practice.

Are there any forthcoming or anticipated changes to the law which may impact on use of location data by mobile app providers?

The ePrivacy Directive on which PECR is based is currently under review to be updated and aligned with GDPR in the form of the ePrivacy Regulation.

This is not yet finalised and its implementation date is not certain but may be in 2019 or 2020. However, GDPR-grade consent will still be required for the use of location data, subject to certain exceptions, including where strictly necessary for providing an information society service specifically requested by the individual. Assuming the ePrivacy Regulation takes effect after Brexit, it remains to be seen if and how it will be implemented in the UK, but implementation can be expected in the interests of the UK’s “adequacy” status.

 

Nigel Miller leads Fox Williams’ technology and data protection group. Nigel is a Certified Information Privacy Professional/Europe (CIPP/E).


The consent trap

Nigel Miller

Having got past 25 May 2018, the day the GDPR came into effect, the torrent of GDPR emails is beginning to abate.

It would be interesting to analyse how many GDPR emails were sent in the run up to the go live date seeking consent to continue being in contact, as against the percentage of recipients who then responded to opt-in. And how many trumpeted a new privacy policy, as against the percentage of recipients who actually read the new policy. I suspect the percentages in each case will be low! Indeed, many people have expressed satisfaction that, by doing nothing and not confirming consent when requested, they can reduce the flow of unwanted spam into their inbox.

But were all these emails necessary, and in particular, was it actually necessary to seek consent?

In many cases it was not necessary to seek consent to “stay in touch” and continue email marketing.

Under GDPR, consent is one of the legal bases for processing, but it is not the only one. In most cases, organisations will be able to rely on the “legitimate interests” ground to remain in contact with their contact list. Recital 47 GDPR expressly says that processing of personal data for direct marketing purposes may be regarded as carried out for a legitimate interest. Subject to confirming this in a “legitimate interests assessment”, many businesses can rely on the concept of ‘legitimate interest’ to justify processing client personal data on their mailing lists without the need to re-affirm consent. GDPR expressly acknowledges that businesses may have a legitimate interest in direct marketing activities, which could include circulating invitations to events and updates on new products and services. This is an appropriate basis for data processing where you use data in ways that people would reasonably expect and with minimal privacy impact, especially as a recipient should always be able to opt out of future marketing easily.

While permission-based marketing is certainly to be preferred, unless consent is actually required there is no need to seek specific GDPR-grade consent, which may predictably result in the contact database being decimated by recipient inertia and GDPR fatigue.

That all said, there is a key exception where consent to email marketing may be required.  This requirement is not to be found in the GDPR; instead it is in the Privacy and Electronic Communications Regulations (“PECR”). These have been around since 2003 and are currently being upgraded to GDPR level with a new ePrivacy Regulation, although this did not make it into law at the same time as GDPR as was the plan; it is likely to come on stream within the next year or so.

PECR contains supplemental rules on consent for electronic marketing (i.e. marketing by email, phone, SMS or fax). Whilst you may not need consent under the GDPR, you may need consent under PECR.

Different rules apply depending on whether the marketing is sent to an ‘individual’ or a ‘corporate’ subscriber.

Marketing to a corporate email address does not need consent. However, if you are sending unsolicited marketing emails to individual subscribers (a personal email address), then you will need the individual’s consent, unless the so-called “soft opt-in” applies (e.g. where the individual is an existing customer).

In summary, assuming you can justify “legitimate interests” for the continued contact, consent is not needed to continue marketing by post, or by email to existing customers or to contacts at corporate email addresses. Consent will only be needed to send direct marketing by email to personal email addresses of individuals who are not customers for similar products and services.

Ironically, in an effort to be compliant, the email requesting consent to future marketing may itself be unlawful if consent was not already in place, and the ICO has fined organisations for engaging in this (e.g. Honda and Flybe). So, sending emails seeking consent may be either unnecessary or unlawful.

New ePrivacy Regulation – implications for Ad-tech

Josey Bright

On 10 January this year, the European Commission published a proposal for a new ePrivacy Regulation (the “ePrivacy Regulation”) to update and replace the current ePrivacy Directive (the “Directive”).

The ePrivacy Regulation, which is part of the Commission’s Digital Single Market Strategy, is designed to align closely with the provisions of the General Data Protection Regulation (GDPR), which was adopted in May 2016. The Commission intends that the ePrivacy Regulation will come into force on the same date as the GDPR, 25 May 2018. However, as it is yet to be finalised and approved, this timetable may be overly ambitious. It is currently reported that the aim is to finalise the ePrivacy Regulation by the end of 2018.

As it is a Regulation, just like the GDPR it will be directly applicable in all EU Member States without the need for implementing national laws.

The main aim of the ePrivacy Regulation is to increase privacy protection for users of electronic communications.

The key features of the proposed ePrivacy Regulation are:

  1. Broader scope

The new ePrivacy Regulation will apply to any company processing data in connection with communication services, including all providers of electronic communications services.

This includes “over-the-top” service providers such as text message, email and messaging app providers, so services such as WhatsApp, Facebook Messenger and Skype will be within the scope of the ePrivacy Regulation.

Like the GDPR, the ePrivacy Regulation will have an extended reach in that non-EU providers providing electronic services to users in the EU will also be within scope of the ePrivacy Regulation.

  2. Content and metadata included

All electronic communications data are covered by the ePrivacy Regulation. However, the ePrivacy Regulation distinguishes between content data (what is actually said in the communication) and metadata (data related to the communication, such as the time, location and duration of a call or website visit). Separate rules apply in respect of each type of data:

  • Content can only be used if the end user has consented to its use for a specified purpose and the processing is necessary for the provision of the service.
  • Metadata can only be used where it is necessary for the quality of the service, such as billing, payments, and detecting and/or stopping fraudulent or abusive use of the service.

In circumstances where all end users have consented to the use of content or metadata for a purpose which cannot be fulfilled if the information is anonymised, the data may be used provided that the service provider has consulted the competent EU Data Protection Authority (in the UK, the Information Commissioner’s Office (ICO)) before the processing is carried out.

The threshold for consent under the ePrivacy Regulation is defined by reference to the GDPR. This means consent must be “freely given, specific, informed and unambiguous”, given by “a statement or by a clear affirmative action”. As under the GDPR, end users must also be given the right to withdraw their consent at any time.

  3. Storage and erasure of data required

The ePrivacy Regulation includes provisions requiring service providers to erase or anonymise all content after it is received by the end user.

All metadata must also be erased or anonymised once the permitted purpose has been fulfilled, except where such data is required for billing purposes.

  4. Cookie consent options

Like the Directive, the ePrivacy Regulation also provides that the consent of the end user is required for the use of cookies and similar technology. However, the current proposal is that consent can be built into the browser software set-up so that users can tailor their cookie consent choices at the point of installation, rather than by using cookie banners and pop ups.

In addition, analytics cookies which are not privacy-intrusive will not require consent (e.g. those which carry out web audience measurement, or remember shopping cart details or login information for the same session).

  5. Direct marketing rules

The ePrivacy Regulation distinguishes between business to consumer communications (B2C) and business to business communications (B2B).

As under the Directive, unsolicited commercial communications are not permitted. For B2C marketing, prior consent (opt-in) is required. Consent will not be required when marketing similar products or services to existing customers, but a right to object must be provided.

For B2B marketing, the ePrivacy Regulation allows for Member States to determine that the legitimate interests of corporate end users are sufficiently protected from unsolicited communication.

  6. Enforcement and higher fines in line with GDPR

The Information Commissioner’s Office (ICO) will be responsible for enforcement of the ePrivacy Regulation and the GDPR in the UK.

Currently, the ICO can only fine companies up to £500,000 for breaches of PECR (the national legislation which implements the Directive). The ePrivacy Regulation introduces fines in line with the GDPR (i.e. up to €20,000,000 or 4% of total worldwide annual turnover, whichever is higher).

In addition, the ePrivacy Regulation confers on users of electronic communications services a right to seek compensation directly from service providers if they have “suffered material or non-material damage as a result of an infringement”.

Implications

The ePrivacy Regulation is critically important for many ad-tech businesses, where the need to get specific opt-in consent could be highly problematic for intermediaries who do not have a direct relationship with the end users, and where soliciting that consent via publishers, while legally possible, may be impracticable.

All this is not helped by the fact that there is uncertainty around the final form of the ePrivacy Regulation; for example, as to whether valid consent can be managed within the browser.

As if compliance with GDPR did not present enough challenges, the ad-tech industry, as well as individual businesses, need to move quickly to prepare for these forthcoming changes in ePrivacy.

 

Josey Bright is an associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at jbright@foxwilliams.com

Cyber Attacks: Why Cyber Security is more important now than ever

Amanda Leiu

Cyber security continues to be headline-grabbing news, particularly in light of the global “ransomware” cyber attack which recently hit the NHS, Telefónica and FedEx. The ransomware reportedly encrypted data on over 300,000 computers in some 150 countries, with hackers threatening to delete data unless a ransom was paid. This latest attack is reported to be the biggest online extortion scheme ever.

The Information Commissioner’s Office (ICO) issued a statement in response to the latest cyber attack to reiterate that “all organisations are required under the Data Protection Act to keep people’s personal data safe and secure”.

Whilst concerns about cyber-related risks and data security are not new, the issue is becoming ever more pressing for businesses, not least because of the introduction of the General Data Protection Regulation (GDPR) in May 2018.

The cyber threat

The recent global ransomware attack which hit 47 NHS trusts is not an isolated case. The UK government’s 2017 Cyber Security Breaches Survey found that:

  • over two thirds of large firms and SMEs detected a cyber security breach or attack in the last 12 months;
  • in the last year, the average business identified 998 breaches; and
  • for a large firm, the average cost to the business as a result of a breach is £19,600.[1]

These statistics highlight the fact that cyber attacks are a growing area of risk for businesses. More and more business is being conducted in digital form and on globally interconnected technology platforms. As this trend continues, businesses’ exposure to a cyber attack inevitably increases.

The threat is no longer limited to large organisations. Smaller organisations have not historically been the target of cybercrime, but this position has changed in recent years. SMEs are now being targeted by cyber criminals with increasing frequency.

The consequences  

The consequences of a cyber attack can be multiple and far-reaching: disrupted business systems, regulatory fines, compensation claims, reputational damage and loss of consumer trust.

The legal implications in relation to cyber and data security arise primarily from the Data Protection Act 1998 (DPA). The DPA requires organisations to take appropriate technical and organisational security measures to prevent unauthorised or unlawful processing or accidental loss of or destruction or damage to personal data. Under the DPA, the ICO can impose fines of up to £500,000 for breach of this obligation. This is set to escalate dramatically under the GDPR to an upper limit of €20 million or 4% of annual global turnover – whichever is greater.

If appropriate measures have not been taken to keep people’s personal data secure and a cyber security breach occurs, organisations risk leaving themselves open to a fine or other enforcement action. This was the case with TalkTalk, as discussed in our earlier article “The Only Way is Up – Fining Powers on the Increase for Data Protection Breaches” (21 March 2017). The ICO issued more than £1,000,000 in fines last year for breaches of the DPA. Moreover, data subjects may seek compensation from organisations for such breaches.

The challenge of compliance with data protection laws is set to increase and become more onerous under the GDPR. The GDPR will supersede the DPA and introduces new and extended obligations for organisations.

Businesses will be legally required to report data breaches that pose a risk to individuals to the ICO within 72 hours and, in some cases, to the individuals affected. Data processors will also have direct obligations in relation to data security for the first time. Another key change is around accountability – the GDPR creates an onus on companies to demonstrate compliance with the data protection principles and to put in place comprehensive governance measures.

Mitigating the risks – what should you be doing?

In light of the risks highlighted, it is more essential than ever that organisations protect themselves (and, by extension, their consumers) from increasingly sophisticated cyber attacks.

To minimise the risk of a cyber attack and ensure regulatory compliance with the current DPA and the incoming GDPR, businesses should be looking to take the following steps:

  • generate awareness within your organisation;
  • set up a project team with full board engagement;
  • carry out a data inventory and mapping exercise to understand what data you have, what you use it for, where it is held and what third parties are involved in processing data;
  • carry out a gap analysis to work out what compliance steps are needed;
  • review all relevant policies, procedures and contracts;
  • undertake a data privacy impact assessment, if needed;
  • prioritise and scope out a cyber security incident response plan;
  • implement and rehearse the cyber security incident response plan; and
  • train staff, monitor processes, audit and adjust.

[1]https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/609186/Cyber_Security_Breaches_Survey_2017_main_report_PUBLIC.pdf (pg. 8)

 

Amanda Leiu is a trainee solicitor in the Commerce & Technology team at Fox Williams LLP.

Tricky issues with use of employee data

Helen Farr

Employers cannot manage the employment relationship without using their employees’ data. Data is used by employers on a daily basis for a variety of tasks, ranging from monitoring sickness absence and administering benefits to paying salaries through payroll.

To process this data lawfully, most employers rely on provisions in the employment contract authorising them to do so. However, employers need to be aware that simply including a provision in a contract may not be enough if the employer is using a specific class of data: sensitive personal data.

Sensitive personal data includes data about an employee’s health, sexuality, diversity and political beliefs. To use this data lawfully, employers need the employee’s express consent.

Problems can arise for employers in a number of situations where they need to use sensitive personal data.

A common problem area is when a referral is made to a company’s occupational health team for an opinion and prognosis on an employee’s health problems. There are two main components to occupational health records: transferable information and the confidential clinical record. Transferable information is information that is generally accessible by the employer, the employee and enforcing bodies like the HSE – it includes information about accidents at work, monitoring data and exposure to hazards. The confidential clinical record is specific to the employee and his or her health during employment. This is sensitive personal data.

When the referral is made to Occupational Health it must be made with the employee’s consent. However, relying on consent may not be enough to protect the employer from a claim.

Employers must ensure that when they make a request for a medical report from Occupational Health, the request is focused and limited to the purposes for which consent is obtained.

They also need to make sure that any medical information provided to Occupational Health is focused. It is common practice for HR practitioners making the referral to send all sickness records they have about the employee. But what if the employee has suffered various health problems over the years, including conditions that the employee would not necessarily want his or her line manager or the wider business to know about? If the Occupational Health report refers to these historical conditions there could be claims by the disgruntled employee.

The consent that has been obtained is unlikely to be enough to protect the employer from a claim. Potential claims include a breach of the employee’s right to privacy and breach of the Data Protection Act. The issue could also lead to claims of discrimination. Therefore, employers should not complacently rely on the consent received when requesting a report but must properly consider the particular purposes for which the report is needed.

Our experience is that most businesses do not send a copy of the Occupational Health referral to the employee. Best practice must be to do so. This will avoid any potential problem when the employee reads a report containing extensive historical medical information; it also makes it difficult for the employee to claim that he or she did not agree to it being referred to.

Another potential problem area is the use of sensitive personal data about an employee’s sexual orientation. Many large employers have ‘relationships at work’ policies obliging their employees to disclose information about romantic relationships with work colleagues. Of course, this policy applies to same-sex relationships.

Again, the problem employers often fail to consider is how that information is used. The business justification for disclosure of a relationship with a work colleague is to enable the employer to ensure that the parties to the relationship do not either benefit or suffer because of it. Sometimes employers post information about the existence of a relationship with a colleague on their intranet.

What the policy authors overlook is that this is information about sexuality, so the employer needs express consent to process it. Therefore, posting such information on the company’s intranet without the employee’s express consent will be a clear breach of the Data Protection Act. There may also be claims for discrimination if the employee suffers less favourable treatment following publication of the information.

Employers therefore need to take care when relying on policies that allow them to use data. If the data concerned is sensitive personal data reliance on the policy is not enough to protect them from claims.

 

Helen Farr is a Partner in the HR Law team at Fox Williams LLP and can be contacted at HFarr@foxwilliams.com.