European Court issues major blow to transfer of personal data between EU and US

The European Court has today given its judgment which will come as a major blow to many businesses both in Europe and the US (particularly tech companies) which rely upon the Privacy Shield to transfer personal data to the US.

The judgment is concerned with the transfer of personal data by Facebook Ireland to its parent company in the US. Earlier this year we commented on the pre-judgment opinion of the Advocate General (“AG”) (here) which focused on the Controller to Processor Standard Contractual Clauses (“C2P SCCs”) and the fact that the AG had opined that the validity of these clauses should be upheld.

Whilst the European Court has now confirmed the validity of the C2P SCCs, it has unexpectedly found the EU-US Privacy Shield to be invalid.

Take home points

  1. Businesses which are currently relying on the Privacy Shield to transfer personal data to the US will need to rapidly review their data transfer practices and put in place alternative measures to allow for the data to continue to be transferred lawfully. The most suitable mechanism will most likely be for the organisation transferring the data to enter into standard contractual clauses (SCCs) with the US recipient. As an alternative, some businesses may now regard transfers to the US to be too complicated and look at options to retain the data within the EEA.
  2. Businesses which fail to put in place alternative measures will be exposed to claims for damages from data subjects and to fines from data protection regulators such as the Information Commissioner’s Office.

European Court Decision

In finding the Privacy Shield to be invalid, the European Court took the view that:

  • the requirements of US national security, public interest and law enforcement were put before the fundamental rights of data subjects whose personal data are transferred under the framework;
  • US law provides its public authorities with far reaching surveillance powers which go beyond what is “strictly necessary” (including in respect of non-US individuals) and do not afford individuals with adequate rights to challenge the relevant authorities before the courts;
  • the Ombudsman mechanism provided for under the Privacy Shield, which is designed to provide data subjects whose data are transferred under the framework with a right of recourse, does not guarantee data subjects the same protections that they would be afforded under EU law (for example, the Ombudsman does not have the power to make decisions which are binding on the US intelligence services).

As such, the European Court decided that the Privacy Shield does not offer an adequate level of protection for data subjects whose personal data are transferred pursuant to it. This is the second time a scheme for EU-US data transfers has been struck down: the Safe Harbor framework was invalidated in 2015.

Ben Nolan (solicitor, qualified in Scotland) 

Six data protection steps for return to the workplace

As lockdown restrictions start to ease and businesses begin to reopen, the ICO has set out the key steps organisations need to consider around the use of personal information.

The six key data protection steps are:

Only collect and use what’s necessary

This reflects the data protection principle of “purpose limitation”.

To help you decide if collecting and using people’s health data is necessary to keep your staff safe, you should ask yourself a few questions:

  • How will collecting extra personal information help keep your workplace safe?
  • Do you really need the information?
  • Will the test you’re considering actually help you provide a safe environment?
  • Could you achieve the same result without collecting personal information?

If you can show that your approach is reasonable, fair and proportionate to the circumstances, then it is unlikely to raise data protection concerns.

Keep it to a minimum

This reflects the data protection principle of “data minimisation”.

When collecting personal information, including people’s Covid-19 symptoms or any related test results, organisations should collect only the information needed to implement their measures appropriately and effectively.

Don’t collect personal data that you don’t need. In some cases, information need only be held for a short period, and there is no need to create a permanent record.

Be clear, open and honest with staff about their data

This reflects the data protection principle of “transparency”; people have a right to know how their information will be handled.

Some people may be affected by the measures you intend to implement. For example, staff may not be able to work. You must be mindful of this, and make sure you tell people how and why you wish to use their personal information, including what the implications for them will be. You should also let employees know who you will share their information with and for how long you intend to keep it. You can do this through a clear, accessible privacy notice.

Treat people fairly

This reflects the data protection principle of “fairness”.

If you’re making decisions about your staff based on the health information you collect, you must make sure your approach is fair. Think carefully about any detriment they might suffer as a result of your policy, and make sure your approach doesn’t cause any kind of discrimination.

Keep people’s information secure

This reflects the data protection principles of “integrity and confidentiality” and “storage limitation”.

Any personal data you hold must be kept securely and only held for as long as is necessary.

Staff must be able to exercise their information rights

As with any data collection, organisations must inform individuals about their rights in relation to their personal data, such as the right of access or rectification. Staff must have the option to exercise those rights if they wish to do so, and to discuss any concerns they may have with organisations.

Legal basis for processing:

As well as following these principles, if you decide to implement symptom checking or testing, you must identify a lawful basis for using the information you collect.

We would suggest that employers avoid reliance on “consent” as the legal basis: employee consent is unlikely to be valid for data protection purposes, since employees do not have a free and genuine choice. The most appropriate legal basis, therefore, will be that the collection of health data is in the “legitimate interests” of the employer, such interests not being overridden by the interests of the employees.

In addition, as health data is one of the “special categories” of personal data, an additional lawful basis is required. Again, we would suggest that employers avoid reliance on “explicit consent”, and instead rely on processing being necessary to comply with the employer’s health and safety at work obligations.

Finally, if you are processing health data on a “large scale”, you will also need to conduct a “data protection impact assessment” (DPIA). The GDPR does not define what constitutes large scale; in essence, this will be determined mainly by the number of employees involved. A small business is unlikely to be processing employee data on a large scale, but even if you are not strictly required to carry out a DPIA, it is good practice to do so.

Our recommendations:

  • Provide a Covid-19 specific privacy notice to your employees, as a supplement to your general staff privacy notice.
  • Supplement your data retention policy to set out when personal information collected must be reviewed, deleted or anonymised.
  • If you are collecting employee health data, or checking and testing, document your legitimate interests assessment (LIA). This does not have to be in any particular form but should address the three tests: the purpose test (identify the legitimate interest); the necessity test (consider if the processing is necessary); and the balancing test (consider the individual’s interests).
  • Consider how the information will be stored to ensure it is kept secure, and who will have access to the information.
  • Do you have an internal data subject access request policy? If not, it’s a good time to introduce one to ensure DSARs are handled effectively.
  • If you are processing health data on a large scale, or just to comply with good practice, prepare a data protection impact assessment. This can be done as part of your wider return to work risk assessment.
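The LIA recommended above need not take any particular form. As a sketch only (the field names and wording are illustrative assumptions, not an ICO-prescribed template), the three tests could be captured in a simple record:

```python
from dataclasses import dataclass


@dataclass
class LegitimateInterestsAssessment:
    """Illustrative record for documenting an LIA; there is no prescribed form."""
    purpose: str    # purpose test: identify the legitimate interest
    necessity: str  # necessity test: is the processing necessary for that purpose?
    balancing: str  # balancing test: are the individual's interests overridden?


lia = LegitimateInterestsAssessment(
    purpose="Maintain a safe workplace during Covid-19",
    necessity="Symptom checks are needed before staff return on site",
    balancing="Minimal data, held briefly; staff interests not overridden",
)
```

Keeping the assessment in a structured form like this makes it easier to review and evidence to the regulator if asked.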

 

Contact us

If you have any questions about these issues in relation to your own organisation, please contact a member of the team or speak with your usual Fox Williams contact.

Addressing privacy concerns with NHSX App

The contact tracing App being developed by NHSX is being promoted as a key tool which will enable the lockdown to be eased by automating the process of identifying people who have been in recent close proximity with someone with symptoms of Covid-19.

The success of the App depends to a large extent on a significant proportion of the population downloading and using it. While the App has some utility if only 20% of the population download it, contact tracing will only be effective if a significant percentage (estimated to be around 60%) of the population participates.

Whether people will take up the App is, in turn, critically dependent on the level of trust people have that the system will operate as advertised, and on whether and how legitimate concerns as to the privacy and security of the data are addressed.

The way it works

The App uses low-power Bluetooth on smartphones to communicate with other nearby devices that also have the App installed. The App tracks the estimated distance between devices and the duration of each contact. Devices that come into contact exchange randomised identifiers, and each device stores the resulting proximity log locally.

If the user later develops symptoms of the virus, they can update their status on the App. The proximity log is then uploaded to the central system, which works out which other devices need to be alerted to the fact that they have been in proximity with someone who now has symptoms, so that the users of those devices can self-isolate.
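The flow described above can be sketched in a few lines of code. This is a rough illustration only: the precise NHSX protocol has not been published, and every class and method name here is an assumption made for the purposes of the sketch.

```python
import secrets
from dataclasses import dataclass, field


@dataclass
class Device:
    """A phone running the App (illustrative only; not NHSX's actual design)."""
    proximity_log: list = field(default_factory=list)

    def __post_init__(self):
        # Randomised identifier broadcast over Bluetooth; in practice such
        # identifiers rotate periodically to limit tracking.
        self.token = secrets.token_hex(8)

    def record_contact(self, other_token: str, distance_m: float, duration_min: int):
        # The App logs an estimate of how close the devices were, and for how long.
        self.proximity_log.append((other_token, distance_m, duration_min))


class CentralSystem:
    """Centralised matching service (the approach NHSX reportedly chose):
    only the server can map tokens back to devices."""

    def __init__(self):
        self.registry = {}  # token -> device

    def register(self, device: Device):
        self.registry[device.token] = device

    def report_symptoms(self, reporter: Device) -> list:
        # The symptomatic user uploads their proximity log; the server works
        # out which devices to alert so those users can self-isolate.
        return [self.registry[token]
                for token, _dist, _dur in reporter.proximity_log
                if token in self.registry]
```

The key design point, picked up below, is that the matching happens on the server: in a decentralised design the registry would not exist, and matching would happen on each device instead.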

Privacy concerns

Any government sponsored technology that can track and trace the population instinctively raises privacy concerns.

First, although the data is anonymised and does not contain any personal identifiers, it will track everyone a user comes into contact with. Data concerning one’s daily personal interactions and the people one associates with can be highly sensitive and not something one would wish to share with the state.

Then there is “feature creep”. While the technology is being introduced with the best intentions and in the interests of public health, once it has been widely implemented there will be a temptation over time to “enhance” it and use it for broader purposes. For example, if the App starts to record specific location data (and not only proximity data), that would be a serious privacy concern, as location data can itself reveal highly sensitive personal data (meetings at other people’s homes, attendance at political events, visits to health clinics or places of worship, and so on). There may also be a temptation to share the data with other government departments or the police for other purposes, such as detecting crime, or for tax or immigration purposes.

Also, let’s face it, the government and NHS do not have a great track record on data security – so how secure will the data collected by the App be? There must be a risk that it could be hacked by criminals or a rogue state-sponsored actor.

The fact that NHSX has – in contrast with many other governments (such as Ireland, Germany and Switzerland) and unlike the Google / Apple initiative – apparently opted to implement a centralised system, where data is held by the government rather than only locally on the device, heightens these concerns.

Application of Data Protection laws

Data protection laws apply to “personal data” relating to an identified or identifiable person. The App is used on a no-names basis, with the user being given a random rotating ID. The specific device ID is not used, although the make and model of the device is captured. The GDPR specifically refers to an “online identifier” as being personal data. However, while pseudonymised data is regulated as personal data, truly anonymised data is not.

Although the precise way the App works is yet to be finalised and published, we must assume that the use of the App for track and trace will involve personal data and as such will be regulated by the GDPR as it will be possible to identify and distinguish some individuals (or devices) from others and to apply different treatment accordingly. Data protection laws do not stand in the way of such technologies, but such technologies must be built and implemented in compliance with data protection laws.
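The pseudonymised/anonymised distinction can be made concrete with a minimal sketch of a rotating identifier (an illustration under assumed details, not the App’s published design): outside observers cannot link IDs across rotation periods, but whoever holds the underlying key can re-link them all, which is precisely why such data remains “personal”.

```python
import hashlib
import hmac
import secrets


def rotating_id(device_secret: bytes, epoch: int) -> str:
    # Derive a pseudonymous ID that changes each epoch (e.g. each day).
    # Without device_secret the IDs look unrelated; with it, they can all
    # be regenerated and re-linked to the same device -- so the data is
    # pseudonymised, not anonymised, and stays within the GDPR.
    return hmac.new(device_secret, str(epoch).encode(), hashlib.sha256).hexdigest()[:16]


secret = secrets.token_bytes(32)
monday, tuesday = rotating_id(secret, 0), rotating_id(secret, 1)
assert monday != tuesday                 # unlinkable without the key
assert rotating_id(secret, 0) == monday  # re-linkable by the key holder
```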

How to address the privacy concerns

While most people will in the present circumstances accept some compromise on their privacy in the interests of their own, and the nation’s, health, this has to be proportionate, with the App being as minimally privacy-invasive as possible. To ensure widespread adoption of the App, it will be essential to address privacy concerns comprehensively. There are a number of steps that must be taken.

Centralised v localised

First, NHSX should reconsider the centralised data approach and consider switching to a localised data solution. As the ICO commented, a purely localised system without a centralised dataset must inherently be more secure. It would also have the benefit of achieving greater interoperability with localised solutions being implemented by other countries; in particular, it is important to have interoperability on the island of Ireland.

NHSX counter this, however, by saying that there are public health benefits in their having access to the big data for analytics and research so as to learn more about the virus. It may also help limit malicious self-reporting (which could be done to try to put someone into self-isolation).

While a centralised system can be made to work, much greater efforts in terms of data security will be required if public confidence is to be won. There is a trade-off between functionality and public confidence: the more you get of one, the less you get of the other. And public confidence is critical for widespread adoption, and ultimately for the success, of the App.

There have been reports in the past few days of NHSX investigating the feasibility of transitioning the App to Apple and Google’s technology, and this could indicate a change of heart and a shift towards a localised data approach.

Transparency

Second, transparency. Provision of transparent information regarding how a person’s data is to be used is a central requirement under the GDPR. This requires that information be provided in a concise, transparent, intelligible and easily accessible form, using clear and plain language.

Given that the App is to be used by the general population, the privacy notice will need to be carefully and skilfully drafted so that it is accessible to all, whether young, old, or with reading difficulties. It is as yet unknown what the age requirement for the App will be, but particular care will be needed for information addressed to children.

We also need to know who will be the “controller” of this data and with whom it may be shared and for what purpose. Will the controller be the NHS, or will it be the Government?

Risk assessment

Transparency will also be well served by making public the NHSX Data Protection Impact Assessment. Under the GDPR, a DPIA – a form of risk assessment – is required whenever a new technology is used that is likely to result in a high risk to the rights and freedoms of individuals. The GDPR says a DPIA is specifically required where the technology involves a systematic and extensive evaluation of personal aspects relating to individuals which is based on automated processing, and on which decisions are based that significantly affect the person; or where there is large-scale processing of special categories of data such as health data; or where there is systematic monitoring of a publicly accessible area on a large scale. Arguably, the App ticks all of these boxes, and the DPIA will be a critical document.

The DPIA must contain a systematic description of the processing operations and the purposes for which the data will be used, an assessment of the necessity and proportionality of the processing in relation to these purposes, an assessment of the risks to the rights and freedoms of individuals and the measures to be taken to address these risks, including safeguards and security measures to ensure the security of the data.

NHSX must share this DPIA as soon as possible with the ICO (as contemplated by Art 36 GDPR) for consultation. While not a legal requirement, it should also be made public for wider consultation. Unless the government so requires, the DPIA does not need to be approved by the ICO as such; however, NHSX should consider and implement as appropriate any advice and recommendations that the ICO, as the independent privacy watchdog, may put forward.

Finally, the working of the App should be open to audit and review by independent experts, not as a one-off, but on an ongoing basis.

The lawful basis and consent

Under data protection laws, processing of personal data is only lawful if there is a “lawful basis” for the processing. The GDPR sets out six possibilities; the main options for the App will be user “consent” or “performance of a task in the public interest”. Health data requires an additional lawful basis which could be satisfied by “explicit consent” or for public health reasons.

It is not yet known which of these lawful bases will be applied. While the App is entirely voluntary to use, it may be that consent is not the best option as it can be difficult to establish that a valid consent has been obtained. However, consent may be required under the GDPR on the basis that the App involves “automated decision making”.

As the App accesses data on the device, it could be that consent is required under the Privacy and Electronic Communications Regulations (PECR). If consent were required under PECR, then it would also be necessary to use consent as the lawful basis under the GDPR. Consent will not be required under PECR if the exemption applies where access to the data is “strictly necessary for the provision of” the service requested by the user. If, however, the App is to access any data that is not “strictly necessary”, then consent would be required by law.

While the App may or may not rely on “consent” as the lawful basis, it is important for public trust that its use is truly voluntary. A person is free to download it, and delete it, as they wish. They are free to choose whether to update their health status or not. And – if warned that they have been in proximity with an infected person – they are free to self-isolate or not as they choose.

Data minimisation

One of the central principles of the GDPR is ‘data minimisation’: the data collected must be limited to what is necessary in relation to the purposes for which it is collected. It is therefore essential to identify and articulate the purpose, and then test whether the data being collected is necessary for it.

For example, the App requires proximity data, but it does not require location data. If there is the potential with a centralised system to add additional data elements, such as location data, then that could breach this central principle of the GDPR.

It has been suggested that users of the App will not need to add their name or other identifiers, but will be required to enter the first half of their post code. This alone will not ordinarily be sufficient to identify a person, but may serve a purpose in enabling NHSX to spot clusters of infection.

Purpose limitation

Under GDPR data can only be collected for specified, explicit and legitimate purposes and must not be further processed in a manner that is incompatible with those purposes. The GDPR allows for further processing for scientific research or statistical purposes in addition to the initial purposes.  This is an important legal constraint on feature creep, but is it enough to give people confidence that their data will not be used for other purposes?

Storage limitation

A further principle is that data must not be kept for longer than is necessary for the purposes for which the personal data are processed. A key issue is what happens to all the data once the Covid-19 crisis has subsided and it is no longer necessary to track and trace. The data should then be securely destroyed or completely anonymised, but what guarantee is there that this will happen? The data retention period must be set out in the privacy notice issued with the App; this will need to reflect the principle, and we will have to have confidence that NHSX will honour it.

Data security

It is a fundamental requirement of data protection that appropriate technical and organisational measures are taken to ensure a level of data security appropriate to the risks. This will require implementation of state-of-the-art encryption of the data at rest and in transit. Following the GDPR principle of data protection “by design and by default”, data security and compliance with the other principles must be designed into the way the App is built and used.

While data security is never 100% guaranteed, the public will need to be satisfied through the provision of transparent information that rigorous safeguards are in place.

Do we need a specific NHSX App watchdog?

While the ICO is the regulator for compliance with data protection laws, we do have separate watchdogs for specific areas, for example biometrics and communications monitoring. However, given the speed at which the App needs to be rolled out if it is to be effective, and given that the ICO is well established and respected as the regulator for data matters under the GDPR and the Data Protection Act 2018, with powers to audit, investigate complaints and issue substantial fines, the ICO is the appropriate regulator and an additional regulatory regime should not be needed.

Is specific legislation needed?

Some have suggested that specific regulation is needed to enshrine some necessary safeguards in law. Again, given timing imperatives, and given the flexible and well-developed structure we already have with the GDPR and the Data Protection Act 2018, this may be a “nice to have” but should not be necessary.

Thoughts for employers

Clearly, contact tracing could be highly beneficial to employers, since it could reduce the need to carry out manual contact tracing in the event an employee falls ill with coronavirus. So, can an employer make downloading the App compulsory?

The answer will depend to some extent on the lawful basis that is relied on for the processing of personal data through the App. If the lawful basis is “consent”, then compelling employees to download and use the App will invalidate any apparent consent since it will not have been freely given. If the lawful basis is “public interest”, then employers will need to decide if they should seek to compel, or alternatively strongly recommend, their employees to download and use the App. If they seek to compel, and an employee refuses, it is hard to see that the employee can with fairness be subjected to any detriment other than as required for health and safety.

We all have a strong interest in the App being rolled out, gaining maximum levels of public adoption and making a valuable contribution to fighting the virus. For this it will be necessary for the public to have a high level of trust in the App and its privacy safeguards. Good data protection will be an essential ingredient to achieving this trust.

Nigel Miller is a partner in Fox Williams LLP and leads the Data Protection and Privacy team. He is a Certified Information Privacy Professional (CIPP/E).

Ben Nolan is an associate in the Data Protection and Privacy team at Fox Williams LLP.

Supreme Court absolves Morrisons of liability for rogue employee data breach

In a landmark judgment, important from both a data protection and employment law standpoint, the Supreme Court has held that vicarious liability cannot be imposed on Morrisons in a case which concerned the unlawful publication of Morrisons’ employee personal data online by a rogue employee.

Facts

The case involved a class of 9,263 Morrisons employees or ex-employees whose personal data had been unlawfully made available online back in 2013. The information (which included name, address, gender, date of birth, phone numbers, national insurance number, bank sorting code, bank account number and salary) was published by a rogue employee, Mr Andrew Skelton, as an act of vengeance against Morrisons arising from disciplinary action taken against him earlier that year. Whilst Mr Skelton was entitled to access the data as part of his role, he was only permitted to share it with the company’s auditors.

The claims brought against Morrisons were made under the Data Protection Act 1998 (DPA), under common law for misuse of private information and breach of confidence, and also on the basis that Morrisons were vicariously liable for the acts of Mr Skelton. Damages were sought for the distress, anxiety, upset and damage which had been suffered by the data subjects concerned.

The court noted that Morrisons had also spent more than £2.26m in dealing with the immediate aftermath of the disclosure. A significant element of that sum was spent on identity protection measures for its employees. Meanwhile, Skelton, the employee, was convicted of a number of criminal offences and sentenced to eight years’ imprisonment.

High Court and Court of Appeal decisions

In 2017, the High Court found in favour of the claimants, ruling (among other matters) that Morrisons could be held vicariously liable for the acts of Mr Skelton since he had been provided access to the relevant data in the course of his duties as an employee and his publication of the data was “a seamless and continuous sequence of events” relating to his duties. Furthermore, it was held that there was nothing which would prevent vicarious liability from applying under the DPA. Morrisons appealed to the Court of Appeal but were unsuccessful, and so further appealed to the Supreme Court, which heard the case at the end of last year.

Supreme Court ruling

The Supreme Court’s decision covered the following key issues.

  1. Could Morrisons be vicariously liable for Mr Skelton’s conduct?

The court found that the decision of the High Court and Court of Appeal relating to vicarious liability had focused too heavily on the judgment of Lord Toulson in an earlier Supreme Court decision (Mohamud [2016]) (coincidentally also involving Morrisons) in which a customer at a petrol station had been assaulted by an employee of the petrol station. Much had been made by the judges in the lower courts of Lord Toulson’s comments in that case that the decision of the employee had been connected to his employment and that his motives for assaulting the customer were “irrelevant”.

However, the Supreme Court found that Lord Toulson’s comments in the Mohamud judgment had been taken out of context and should not be construed as introducing new principles to the concept of vicarious liability. It ruled that the “close connection” test remained the appropriate test for determining whether vicarious liability could be imposed on an employer. Pursuant to the close connection test:

“…the wrongful conduct [of the employee] must be so closely connected with acts the employee was authorised to do that, for the purposes of the liability of the employer to third parties, it may fairly and properly be regarded as done by the employee while acting in the ordinary course of his employment.”

In the present case, the Supreme Court found that the “close connection” test was not met (despite there being a close temporal and causal link between Mr Skelton’s role and his publication of the data on the internet) for the following key reasons:

  • The disclosure of the data on the internet did not form part of Mr Skelton’s functions or field of activities – he was not authorised to disclose the relevant data to anyone other than KPMG, the company’s auditors.
  • The motives of Mr Skelton in disclosing the data were important – the fact that he did so for personal reasons was “highly material”. Indeed, the reasons Mr Skelton had decided to publish the data was to cause harm to Morrisons due to his personal vendetta against the company.
  2. Does the DPA exclude vicarious liability for statutory torts committed by an employee who is acting as a data controller under the DPA?

Although not strictly necessary given the court’s finding that Morrisons could not be held vicariously liable based on the facts of the case, the court did give its views on the above question which are important from a data protection perspective.

It had been agreed by all parties that both Morrisons and Mr Skelton were independent controllers in relation to the data which was published online. In light of this, Morrisons had argued that it could not be held vicariously liable for the acts of Mr Skelton under the DPA since it had complied with its obligations as a controller under the DPA and Mr Skelton was acting as a separate controller when disclosing the data. Morrisons argued that the DPA did not allow for vicarious liability to be imposed on them for Mr Skelton’s actions as a controller.

However, the Supreme Court rejected this position, stating that since the DPA does not indicate (whether expressly or impliedly) whether the principle of vicarious liability applies to breaches of its obligations, an employer can be found vicariously liable for breaches which are committed by an employee who is acting as a data controller in the course of his or her employment.

Comment

The decision will be welcomed by business since it shows that employers will not generally be held liable for the acts of rogue employees acting outside their “field of activities”. However, it is important to bear in mind that the decision came down to the specific facts of the case. It is entirely possible that there could be cases where unauthorised disclosure of personal data by an employee results in an employer being held vicariously liable; an example could be an employee negligently leaving sensitive documents on a train on the way to a business meeting, or causing a data breach by failing to follow the company’s data security policies. As ever, implementing appropriate data security measures and policies and reinforcing the need for employees to follow such policies can help to reduce these risks.

The case is also the first to come before the Supreme Court involving a class action brought by data subjects for a violation of data protection rules. Notwithstanding the decision in favour of Morrisons, we expect class actions in relation to data breaches to become increasingly common.

Finally, although the case was brought under the (old) Data Protection Act, the position would not be any different under the GDPR and the new DPA.

 

Ben Nolan (solicitor, qualified in Scotland) and Nigel Miller (partner)

Data Protection and COVID-19 – Regulator Guidance

The ICO has published in a blog post some helpful guidance on data protection compliance and COVID-19. This also draws on a statement issued by the European Data Protection Board (EDPB).

Broadly, data protection rules (such as the GDPR) do not hinder measures taken in the fight against the pandemic. The EDPB says that it is in the interest of humanity to curb the spread of diseases and to use modern techniques in the fight against scourges affecting great parts of the world. Nevertheless, the EDPB underlines that, even in these exceptional times, controllers and processors must ensure the protection of the personal data of data subjects.

The ICO recognises the unprecedented challenges we are all facing during the pandemic, and that organisations might need to share information quickly or adapt the way they work. The ICO confirms that data protection will not stop them doing that; it is about being proportionate, and not going beyond what people might reasonably expect.

Core principles

Core data protection principles need to be followed even for emergency data uses. This includes the following:

  • Only personal data that are necessary to attain the objectives pursued should be processed, and only for specified and explicit purposes.
  • Data subjects should receive transparent information on the processing activities that are being carried out and their main features, including the retention period for collected data and the purposes of the processing. The information provided should be easily accessible and provided in clear and plain language.
  • It is important to adopt adequate security measures and confidentiality policies ensuring that personal data are not disclosed to unauthorised parties.
  • Measures implemented to manage the current emergency and the underlying decision-making process should be appropriately documented.

Delays in compliance

ICO guidance:  The ICO offers assurance to organisations concerned about complying with GDPR requirements. It understands that resources, whether financial or people, might be diverted away from usual compliance work, and indicates that it won't penalise organisations that it knows need to prioritise other areas or adapt their usual approach during this extraordinary period.

While the ICO can’t extend statutory timescales, they will tell people that they may experience understandable delays when making information rights requests during the pandemic.

Comment:  This offers some comfort, for example, to businesses that are currently grappling with a lack of resources or of access to documents when responding to data subject access requests (DSARs), which must be answered within one month or, in complex cases, up to three months. A key factor will be to keep the data subject up to date with progress on the response.

Homeworking

ICO guidance:  Data protection is not a barrier to increased and different types of homeworking. During the pandemic, staff may work from home more frequently than usual and may use their own devices or communications equipment. Data protection law doesn't prevent that, but you'll need to consider the same kinds of security measures for homeworking that you'd use in normal circumstances.

Comment:  Employers should carry out a risk assessment of the data protection implications of employees working from home on a larger scale than usual. This could include a review of the following:

  • ensuring staff have been given training and guidance and regular reminders about their obligations to safeguard personal data, including not saving sensitive data to unsecured devices or cloud storage;
  • as there is an uptick in cybercrime and email scams seeking to profit from the crisis, warning staff about emails that may look as if they are from official sources but contain malicious software, as well as phishing emails impersonating people within the organisation;
  • requiring the use of complex passwords and the need to change them often;
  • taking care when using wifi, avoiding public wifi and using known secure wifi where possible.

Can you tell staff that a colleague may have contracted COVID-19?

ICO Guidance: Yes. You should keep staff informed about cases in your organisation. Remember, you probably don’t need to name individuals and you shouldn’t provide more information than necessary. You have an obligation to ensure the health and safety of your employees, as well as a duty of care. Data protection doesn’t prevent you doing this.

The EDPB adds that in cases where it is necessary to reveal the name of the employee(s) who contracted the virus (e.g. in a preventive context), the concerned employees should be informed in advance and their dignity and integrity protected.

Comment: Even though such information relates to a person's health, which is classified as special category (or sensitive) personal data, an employer is entitled to process and disclose this information where necessary to comply with employment law, which includes ensuring the health, safety and welfare of its employees. Again, this only extends to what is necessary and proportionate for this purpose.

Can you collect health data in relation to COVID-19 about employees or from visitors?

ICO Guidance:  You have an obligation to protect your employees’ health, but that doesn’t necessarily mean you need to gather lots of information about them.

It’s reasonable to ask people to tell you if they have visited a particular country, or are experiencing COVID-19 symptoms.

You could ask visitors to consider government advice before they decide to come. And you could advise staff to call 111 if they are experiencing symptoms or have visited particular countries. This approach should help you to minimise the information you need to collect.

If that’s not enough and you still need to collect specific health data, don’t collect more than you need and ensure that any information collected is treated with the appropriate safeguards.

Comment: While this guidance was issued only in the past few days, it may rapidly become out of date as Government / NHS guidance on COVID-19 changes.

 

Nigel Miller is a partner in the commerce & technology team at City law firm Fox Williams LLP and is a Certified Information Privacy Professional (CIPP/E). Nigel can be contacted at nmiller@foxwilliams.com