Addressing privacy concerns with the NHSX App

The contact tracing App being developed by NHSX is being promoted as a key tool to enable the lockdown to be eased, by automating the process of identifying people who have recently been in close proximity to someone with symptoms of Covid-19.

The success of the App depends to a large extent on a significant proportion of the population downloading and using it. While the App has some utility if only 20% of the population download it, contact tracing will only be effective if a significant percentage (estimated to be around 60%) of the population participates.

Whether or not people will take up the App is, in turn, critically dependent on the level of trust people have that the system will operate as advertised, and on whether and how legitimate concerns about the privacy and security of the data are addressed.

The way it works

The App uses low-power Bluetooth on smartphones to communicate with other nearby devices that also have the App installed. The App tracks the estimated distance between devices and the duration of each contact, and devices in contact with one another exchange randomised numbers. This proximity log is then stored on the device.

If a user subsequently develops symptoms of the virus, they can update their status on the App. The proximity log is then uploaded to the central system, which works out which other devices need to be alerted to the fact that they have been in proximity with someone who now has symptoms, so that the users of those devices can self-isolate.
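To make this flow more concrete, the sketch below shows what a device-side proximity log entry and the central matching step might look like. It is a minimal illustration only: the field names, data structure and thresholds are assumptions, not the actual NHSX design, which has not been published in this detail.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class ProximityEvent:
    """One entry in the proximity log held on the user's own device."""
    other_device_token: str      # randomised number received over Bluetooth
    started_at: datetime         # when the contact began
    duration_minutes: float      # how long the two devices stayed in range
    estimated_distance_m: float  # rough distance inferred from signal strength


def tokens_to_alert(uploaded_log: list[ProximityEvent],
                    min_minutes: float = 15.0,
                    max_distance_m: float = 2.0) -> set[str]:
    """Central-system step: pick out the randomised tokens whose owners should
    be notified. The thresholds here are placeholders; the real risk model has
    not been made public."""
    return {
        event.other_device_token
        for event in uploaded_log
        if event.duration_minutes >= min_minutes
        and event.estimated_distance_m <= max_distance_m
    }
```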

Privacy concerns

Any government-sponsored technology that can track and trace the population instinctively raises privacy concerns.

First, although the data is anonymised and does not contain any personal identifiers, it will track everyone a user comes into contact with. Data concerning one’s daily personal interactions and the people one associates with can be highly sensitive, and is not something one would wish to share with the state.

Then there is “feature creep”. While the technology is being introduced with the best intentions and in the interests of public health, once it has been widely implemented and as time goes on there will be a temptation to “enhance” it and use it for broader purposes. For example, if the App starts to record specific location data (and not only proximity data), this will be a serious privacy concern, as location data can itself reveal highly sensitive personal data (e.g. meetings at other people’s homes, or attendance at political events, health clinics or places of worship). There may also be a temptation to share the data with other government departments or the police for other purposes, such as detecting crime, or for tax or immigration purposes.

Also, let’s face it, the government and the NHS do not have a great track record on data security – so how secure will the data collected by the App be? There must be a risk that it could be hacked by criminals or by state-sponsored hackers.

The fact that NHSX has – in contrast with many other governments (such as Ireland, Germany and Switzerland) and unlike the Google / Apple initiative – apparently opted to implement a centralised system, where data is held by the government rather than only locally on the device, heightens these concerns.

Application of Data Protection laws

Data protection laws apply to “personal data” relating to an identified or identifiable person. The App is used on a no-names basis, with the user being given a random rotating ID. The specific device ID is not used, although the make and model of the device is captured. The GDPR specifically refers to an “online identifier” as being personal data. However, while pseudonymised data is regulated as personal data, truly anonymised data is not.
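The distinction matters in practice because a “random rotating ID” is typically pseudonymous rather than anonymous: the identifiers mean nothing on their own, but whoever holds the lookup table can still link them back to a particular installation. A minimal sketch, using invented names throughout:

```python
import secrets

# Held centrally (or wherever registrations are managed): token -> installation
registration_lookup: dict[str, str] = {}


def new_rotating_token(installation_id: str) -> str:
    """Issue a fresh random identifier; the mapping back is retained."""
    token = secrets.token_hex(16)
    registration_lookup[token] = installation_id
    return token


token = new_rotating_token("installation-0001")
# The token reveals nothing by itself, but the holder of the lookup table can
# re-identify it, which is why pseudonymised data remains personal data.
assert registration_lookup[token] == "installation-0001"
```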

Although the precise way the App works is yet to be finalised and published, we must assume that the use of the App for track and trace will involve personal data and as such will be regulated by the GDPR as it will be possible to identify and distinguish some individuals (or devices) from others and to apply different treatment accordingly. Data protection laws do not stand in the way of such technologies, but such technologies must be built and implemented in compliance with data protection laws.

How to address the privacy concerns

While most people will, in the present circumstances, accept some degree of compromise on their privacy in the interests of their own, and the nation’s, health, this has to be proportionate, with the App being as minimally privacy-invasive as possible. To ensure widespread adoption of the App, it will be essential that privacy concerns are comprehensively addressed. There are a number of steps that must be taken.

Centralised v localised

First, NHSX should reconsider the centralised data approach and consider switching to a localised data solution. As the ICO commented, a purely localised system without a centralised dataset must inherently be more secure. It would also have the benefit of achieving greater interoperability with localised solutions being implemented by other countries; in particular, it is important to have interoperability on the island of Ireland.

NHSX counters this, however, by saying that there are public health benefits in having central access to the data for analytics and research, so as to learn more about the virus. A centralised approach may also help limit malicious self-reporting (which could be done to try to force someone into self-isolation).

While a centralised system can be made to work, much greater efforts on data security will be required if public confidence is to be won over. There is a trade-off between functionality and public confidence: the more you try to get of the one, the less you get of the other. And public confidence is critical for widespread adoption, and ultimately for the success, of the App.
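The difference between the two models comes down to where the matching happens. The schematic sketch below contrasts them; both functions are simplifications under assumed data shapes, not a published specification of either protocol.

```python
def centralised_match(uploaded_logs: dict[str, list[str]],
                      reporting_user: str) -> set[str]:
    """Centralised model: every reporting user's contact log is uploaded, so a
    central dataset exists and the server decides who to notify (and can also
    be analysed for research)."""
    return set(uploaded_logs[reporting_user])


def decentralised_match(local_log: list[str],
                        published_infected_tokens: set[str]) -> list[str]:
    """Decentralised model: the server only publishes the tokens of users who
    report symptoms; each phone checks its own log locally and no central
    contact dataset is ever built."""
    return [token for token in local_log if token in published_infected_tokens]
```

The trade-off described above appears directly in the first function: the central dataset enables analytics, but it is also the dataset that has to be secured and trusted.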

There have been reports in the past few days of NHSX investigating the feasibility of transitioning the App to Apple and Google’s technology, and this could indicate a change of heart and a shift towards a localised data approach.

Transparency

Second, transparency. Provision of transparent information regarding how a person’s data is to be used is a central requirement under the GDPR. This requires that information be provided in a concise, transparent, intelligible and easily accessible form, using clear and plain language.

Given that the App is to be used by the general population, the privacy notice will need to be carefully and skilfully drafted so that it is accessible to all, whether young, old, or with reading difficulties. It is not yet known what the age requirement for the App will be, but particular care will be needed for information addressed to children.

We also need to know who will be the “controller” of this data and with whom it may be shared and for what purpose. Will the controller be the NHS, or will it be the Government?

Risk assessment

Transparency will also be well served by making public the NHSX Data Protection Impact Assessment. Under the GDPR, a DPIA – a form of risk assessment – is required whenever a new technology is used that is likely to result in a high risk to the rights and freedoms of individuals. The GDPR says a DPIA is specifically required where the technology involves a systematic and extensive evaluation of personal aspects relating to individuals, based on automated processing, on which decisions are made that significantly affect the person; where there is processing on a large scale of special categories of data, such as health data; or where there is systematic monitoring of a publicly accessible area on a large scale. Arguably, the App ticks all of these boxes, and the DPIA will be a critical document.

The DPIA must contain a systematic description of the processing operations and the purposes for which the data will be used, an assessment of the necessity and proportionality of the processing in relation to these purposes, an assessment of the risks to the rights and freedoms of individuals and the measures to be taken to address these risks, including safeguards and security measures to ensure the security of the data.

NHSX must share this DPIA as soon as possible with the ICO (as contemplated by Art 36 GDPR) for consultation. While not a legal requirement, it should also be made public for wider consultation. Unless the government so requires, the DPIA does not need to be approved by the ICO as such; however, NHSX should consider and implement as appropriate any advice and recommendations that the ICO, as the independent privacy watchdog, may put forward.

Finally, the working of the App should be open to audit and review by independent experts, not as a one-off, but on an ongoing basis.

The lawful basis and consent

Under data protection laws, processing of personal data is only lawful if there is a “lawful basis” for the processing. The GDPR sets out six possibilities; the main options for the App will be user “consent” or “performance of a task in the public interest”. Health data additionally requires a separate condition to be met, which could be satisfied by “explicit consent” or by processing for public health reasons.

It is not yet known which of these lawful bases will be applied. While the App is entirely voluntary to use, it may be that consent is not the best option as it can be difficult to establish that a valid consent has been obtained. However, consent may be required under the GDPR on the basis that the App involves “automated decision making”.

As the App accesses data on the device, it could be that consent is required under the Privacy and Electronic Communications Regulations (PECR). If consent were required under PECR, then it would also be necessary to use consent as the lawful basis under the GDPR. Consent will not be required under PECR if the exemption applies, i.e. where the access to the data is “strictly necessary for the provision of” the service requested by the user. If, however, the App is to access any data that is not “strictly necessary”, then consent would be required by law.

While the App may or may not rely on “consent” as the lawful basis, it is important for public trust that its use is truly voluntary. People are free to download it, and delete it, as they wish. They are free to choose whether or not to update their health status. And – if warned that they have been in proximity with an infected person – they are free to choose whether or not to self-isolate.

Data minimisation

One of the central principles of the GDPR is ‘data minimisation’ – the data collected must be limited to what is necessary in relation to the purposes for which it is collected. It is therefore essential to identify and articulate the purpose, and then test whether each item of data being collected is necessary for that purpose.

For example, the App requires proximity data, but it does not require location data. If there is the potential with a centralised system to add additional data elements, such as location data, then that could breach this central principle of the GDPR.

It has been suggested that users of the App will not need to enter their name or other identifiers, but will be required to enter the first half of their postcode. This alone will not ordinarily be sufficient to identify a person, but may serve a purpose in enabling NHSX to spot clusters of infection.
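A short illustration of why the first half of a postcode (the “outward code”) can be enough for cluster-spotting without identifying anyone; the values below are invented.

```python
from collections import Counter

# Hypothetical symptom reports where users entered only the outward code
reported_outward_codes = ["EC4M", "EC4M", "M1", "EC4M", "LS1"]

# Counting at this coarse level reveals area-level clusters but no individuals
clusters = Counter(reported_outward_codes)
print(clusters)  # Counter({'EC4M': 3, 'M1': 1, 'LS1': 1})
```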

Purpose limitation

Under the GDPR, data can only be collected for specified, explicit and legitimate purposes and must not be further processed in a manner that is incompatible with those purposes. The GDPR allows further processing for scientific research or statistical purposes in addition to the initial purposes. This is an important legal constraint on feature creep, but is it enough to give people confidence that their data will not be used for other purposes?

Storage limitation

A further principle is that data must not be kept for longer than is necessary for the purposes for which the personal data are processed. A key issue is what happens to all the data once the Covid-19 crisis has subsided and it is no longer necessary to track and trace. The data should then be securely destroyed or completely anonymised, but what guarantee is there that this will happen? The data retention period must be set out in the privacy notice to be issued with the App. This will need to reflect this principle, and we will have to have confidence that NHSX will honour it.
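In practical terms, storage limitation is a purge rule applied to the stored data. A minimal sketch, with an assumed 28-day retention period (the real figure, if any, has not been announced):

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=28)   # assumed retention period, purely illustrative


def purge_expired(proximity_log, now=None):
    """Keep only entries still within the retention period; older entries are
    dropped (or could instead be fully anonymised before any research use).
    Each entry is assumed to be a dict with a 'recorded_at' datetime."""
    now = now or datetime.utcnow()
    return [entry for entry in proximity_log
            if now - entry["recorded_at"] <= RETENTION]
```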

Data security

It is a fundamental requirement of data protection that appropriate technical and organisational measures are taken to ensure a level of data security appropriate to the risks. This will require implementation of state-of-the-art encryption of the data at rest and in transit. Following the GDPR principle of data protection “by design and by default”, data security and compliance with the other principles must be designed into the way the App is built and used.
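As a generic illustration of encryption at rest (not a description of the App’s actual design), the widely used cryptography package’s Fernet recipe provides authenticated symmetric encryption; encryption in transit would be handled separately, typically by TLS.

```python
import json

from cryptography.fernet import Fernet  # third-party: pip install cryptography

key = Fernet.generate_key()        # in practice, held in a secure key store
cipher = Fernet(key)

proximity_log = [{"token": "3f9ac2", "duration_minutes": 20}]  # dummy data
ciphertext = cipher.encrypt(json.dumps(proximity_log).encode("utf-8"))

# Only a holder of the key can recover the log
restored = json.loads(cipher.decrypt(ciphertext).decode("utf-8"))
assert restored == proximity_log
```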

While data security is never 100% guaranteed, the public will need to be satisfied through the provision of transparent information that rigorous safeguards are in place.

Do we need a specific NHSX App watchdog?

While the ICO is the regulator for compliance with data protection laws, we also have separate watchdogs for specific areas, for example biometrics and communications monitoring. Given the speed at which the App needs to be rolled out if it is to be effective, and given that the ICO is well established and respected as the regulator for data matters under the GDPR and the Data Protection Act 2018, with powers to audit, investigate complaints and issue substantial fines, the ICO is the appropriate regulator and an additional regulatory regime should not be needed.

Is specific legislation needed?

Some have suggested that specific regulation is needed to enshrine some necessary safeguards in law. Again, given timing imperatives, and given the flexible and well-developed structure we already have with the GDPR and the Data Protection Act 2018, this may be a “nice to have” but should not be necessary.

Thoughts for employers

Clearly, contact tracing could be highly beneficial to employers, since it could reduce the need to carry out manual contact tracing in the event an employee falls ill with coronavirus. So, can an employer make downloading the App compulsory?

The answer will depend to some extent on the lawful basis relied on for the processing of personal data through the App. If the lawful basis is “consent”, then compelling employees to download and use the App will invalidate any apparent consent, since it will not have been freely given. If the lawful basis is “public interest”, then employers will need to decide whether to compel, or alternatively strongly recommend, their employees to download and use the App. If they seek to compel, and an employee refuses, it is hard to see that the employee could fairly be subjected to any detriment, other than as required for health and safety.

We all have a strong interest in the App being rolled out, gaining maximum levels of public adoption and making a valuable contribution to fighting the virus. For this it will be necessary for the public to have a high level of trust in the App and its privacy safeguards. Good data protection will be an essential ingredient to achieving this trust.

Nigel Miller is a partner in Fox Williams LLP and leads the Data Protection and Privacy team. He is a Certified Information Privacy Professional (CIPP/E).

Ben Nolan is an associate in the Data Protection and Privacy team at Fox Williams LLP.

Happy Data Privacy Day! And what’s coming up in 2020?

Since 2006, 28 January has marked the anniversary of the first international law in the field of data protection – who knew?

A lot has happened since then. Data protection and privacy is now a rapidly expanding area of law of ever-increasing importance. As we head towards the second anniversary since the GDPR came into force, we review current developments and look ahead at what to expect in 2020.

Our special Data Privacy Day newsletter covers the following topics:

Accountability – sounds good, but what does it actually mean?
International transfers and Brexit
What’s cooking with cookies?
Whatever happened to the ePrivacy Regulation?
The growing culture of Data Subject Access Requests (DSARs)
Adtech – under regulator scrutiny
Artificial Intelligence (“AI”) and data protection
Data security – what’s appropriate?
Fines – more to come …
Class action compensation claims

Meanwhile, please make a diary note of our annual Data Protection Update seminar, which will be held on 14 May 2020.

Please do contact us if you have any questions or if our data protection team can assist you in any way.


Whatever happened to the ePrivacy Regulation?

The ePrivacy Regulation is due to replace the current ePrivacy Directive, which is the European law behind the Privacy and Electronic Communications Regulations (PECR). These are the rules which govern the use of cookies and similar tracking technologies, as well as digital marketing. The new Regulation is intended to bring the ePrivacy Directive into alignment with the GDPR and to introduce changes to the rules governing electronic marketing.

Originally intended to coincide with the GDPR, the introduction of the ePrivacy Regulation has been highly contentious and has met with considerable delay. Towards the end of 2019, the latest draft was rejected by the Council of the European Union, leading to further delays in its adoption.

The ePrivacy Regulation promised a simpler set of rules on cookies. It would remove the need for cookie banners and notices and allow browser settings to provide a way for users to indicate whether they accept or refuse cookies and other identifiers. It would also clarify that consent is not needed for cookies that are not privacy-intrusive and that improve the internet experience (e.g. remembering shopping cart history), or for analytics cookies used by a website to count visitors.

The new rules would also ban cookie walls (where a website requires users to accept cookies as a condition of being able to access the website’s content).

The proposal will also continue the ban on unsolicited electronic communications by emails, SMS and automated calling machines. However, it is not yet known if this will extend to B2B communications, or simply apply to B2C marketing as at present.

The draft Regulation also introduces more stringent penalties for non-compliance, and brings the sanctions regime and remedies available broadly into line with the GDPR.

It is uncertain what the final form of the Regulation will be. However, given the latest delay, Brexit has now intervened, and so the Regulation will not be directly applicable in the UK. Despite that, it is likely that the UK will adopt the new rules as and when they are introduced. While the UK may be able to make its own decision on this following Brexit, if the UK does not implement the new Regulation, that may stand in the way of the adequacy decision the UK needs in order to allow the free flow of data to and from the EEA. Also, the proposed extra-territorial scope of the new Regulation (like that of the GDPR) means that it will remain directly applicable to UK businesses targeting the EEA. Who said that after Brexit the UK will take back control of its laws?!

Meanwhile, the ICO has also published a draft direct marketing code of practice for consultation. The consultation closes on 4 March 2020 and the ICO expects to finalise it in 2020. The ICO plans to produce additional practical tools such as checklists to go alongside the code.

Some key points include:

  • The two lawful bases most likely to be applicable to direct marketing are consent and legitimate interests. However, where PECR applies and requires consent, then in practice consent should also be your lawful basis under the GDPR.
  • It is important to keep personal data accurate and up to date. It should not be kept for longer than is necessary. It is harder to rely on consent as a genuine indication of wishes as time passes.
  • If you are considering buying or renting direct marketing lists, you must ensure you have completed appropriate due diligence.
  • Profiling and enrichment activities must be done in a way that is fair, lawful and transparent.
  • If you are using new technologies for marketing and online advertising, it is highly likely that you will be required to conduct a data protection impact assessment (DPIA).
  • If someone objects, you must stop processing their data for direct marketing purposes. You should add their details to your suppression list so that you can screen any new marketing lists against it (a minimal example of this screening step is sketched below).
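A minimal sketch of that screening step, assuming normalised email addresses are the matching key:

```python
# People who have objected to direct marketing (normalised email addresses)
suppression_list = {"jane@example.com", "sam@example.com"}


def screen(new_marketing_list):
    """Drop anyone on the suppression list before any marketing is sent."""
    return [addr for addr in new_marketing_list
            if addr.strip().lower() not in suppression_list]


print(screen(["Jane@Example.com", "li@example.org"]))  # ['li@example.org']
```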

Once the draft ePrivacy Regulation is finalised and the UK’s position on Brexit is clear, the ICO has indicated that it will update the direct marketing code to take account of the ePrivacy Regulation.


What’s cooking with cookies?

Cookies have become a hot topic for the ICO, which has received many complaints about websites’ (often unlawful) use of cookies. This theme looks set to continue into 2020.

This is particularly the case since a huge number of organisations, including some of the largest businesses in the UK, have still not updated their practices to ensure they comply with the rules. This is despite the fact that the ICO published clear guidance concerning the requirements for the lawful use of cookies in summer 2019.

It is likely that the ICO will start taking enforcement action against organisations which do not follow the rules, and this could lead to fines. As such, businesses which are not yet compliant should take steps to ensure compliance now.

At a high level, the following are the main rules when using cookies on websites:

  1. User consent must be obtained (except in relation to “strictly necessary cookies”)

The ICO confirmed that the standard of consent for using cookies is the same high standard as under the GDPR, even for cookies which do not involve the processing of personal data. This means that implied or inferred consent can no longer be relied on for cookies. Consent requires a clear affirmative act; pre-ticked boxes or inactivity do not constitute consent.

Websites which set non-essential cookies without specifically obtaining users’ consent when they access the site (for example, sites which merely state that continued use entails consent) are, therefore, not compliant. This also means that all non-essential cookies should be switched off by default, and that such cookies should only be served on the user if and when the user consents.

“Strictly necessary cookies”, which do not require consent, are those which are essential to provide a user with the service they have requested or to comply with applicable law. Analytics cookies and advertising cookies do not fall within this exemption.
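By way of illustration only, the sketch below (using Flask, with invented cookie names) shows the pattern the guidance requires: strictly necessary cookies may be set regardless, while non-essential cookies stay off by default and are only set once an affirmative opt-in has been recorded.

```python
from flask import Flask, make_response, request  # third-party: pip install flask

app = Flask(__name__)


@app.route("/")
def index():
    resp = make_response("<p>Site content</p>")
    # Strictly necessary (e.g. session) cookies do not need consent
    resp.set_cookie("session_id", "abc123", httponly=True)
    # Non-essential cookies are only served once the user has actively opted in;
    # here the opt-in is assumed to be recorded in a first-party consent cookie
    if request.cookies.get("analytics_consent") == "granted":
        resp.set_cookie("analytics_id", "xyz789")
    return resp
```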

  2. Provide clear and transparent information to users concerning the cookies you use

The ICO Guidance emphasises the need to provide users with transparent information about cookies. The information must be in accordance with the higher standards of transparency as required by the GDPR; it must be presented in a “concise, transparent, intelligible and easily accessible form, using clear and plain language”.

In relation to cookies, this means that online retailers need to review and update their cookies policies to ensure that these are drafted in a sufficiently clear and easily accessible manner for a normal user to be able to understand how the different types of cookies are being used on the website. Failure to provide clear information will breach the transparency requirement, and will also undermine any “consent” if the consent cannot be said to be sufficiently informed.

Highlighting the importance of transparency and consent, in January 2019, the French data protection regulator imposed a fine of €50 million on Google for lack of transparency, inadequate information and lack of valid consent regarding ads personalization on mobile devices. For more information on this, see further https://idatalaw.com/2019/01/25/e50m-fine-for-google-in-france/


Artificial Intelligence (“AI”) and data protection

In the past few years, we have seen an increasing number of organisations developing or using AI solutions. Although the business case for the use of AI is compelling, tensions can arise where its use is at odds with data protection laws.

These tensions between AI and data protection include the following:

  • Transparency – the GDPR requires you to provide individuals with notice setting out how you are using their personal data. Where there is an element of automated decision-making which results in legal effects or otherwise has a significant effect on an individual (as there often is with AI), the controller is required to provide affected individuals with “meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject”. Given the complexities with AI and the fact that some types of AI can develop in an unsupervised environment, without human intervention, it can sometimes be difficult to meet these requirements.
  • Purpose limitation, data minimisation and storage limitation – the GDPR requires that processing of personal data is carried out for specific purposes, no more personal data than is adequate to achieve those purposes is processed and that personal data is only processed for as long as necessary to achieve those purposes. There is often tension between these principles and AI, since the development of an AI system can often result in data being used for unexpected purposes, and often requires vast amounts of data to be inputted into the system in order for it to meaningfully detect patterns and trends.

In respect of the transparency issue, the ICO has developed draft guidance, together with the Alan Turing Institute (the UK’s national institute for data science and artificial intelligence), on explaining decisions made with AI. The guidance provides detailed information on the different ways in which businesses can seek to explain the processing they undertake using AI to the individuals concerned, and seeks to address some of the concerns businesses may have in providing such explanations.

In addition to the above, the ICO is also working on finalising its AI auditing framework which will address the following specific issues:

  • Accountability – which will discuss the measures that an organisation must have in place to be compliant with data protection law.
  • AI-specific risk areas – which will discuss the key risk areas the ICO has identified in relation to the use of AI in the field of data protection.

As the use of AI becomes more widespread, it is hoped that the guidance issued by the ICO will help businesses better understand and comply with their data protection obligations, whilst still allowing them to develop AI systems which benefit organisations and individuals alike as our knowledge in this area continues to grow.
