The Information Commissioner’s Office (ICO) recently fined British Airways (BA), Marriott International (Marriott) and Ticketmaster £20 million, £18.4 million and £1.25 million respectively for failures to keep their customers’ personal data secure. These companies suffered separate data breaches in 2018 which resulted in large numbers of their customers having their personal data, including credit card details, compromised.
Whilst all these fines are significant (a record fine in the case of BA), what is interesting is the huge change of approach by the ICO, which had originally issued notices of intention (“NOIs”) to fine BA an incredible £183.4 million and Marriott £99.2 million back in July 2019. The NOI for Ticketmaster proposed a fine of £1.5 million.
Clearly, something has changed. But what is it?
Why were the fines reduced by so much?
The most significant reason for the reduction in the level of the fines issued against the companies appears to be the ICO’s use of a fresh methodology to calculate them.
For the BA and Marriott NOIs, the ICO had relied on a methodology set out in an unpublished, internal document. This provided that turnover should be the key consideration for the ICO when setting fines under the GDPR. However, BA argued that reliance upon this was unlawful and, ultimately, the ICO decided to depart from this methodology entirely when calculating the fines issued against BA and Marriott. It did not use this methodology for Ticketmaster, and hence there was only a small reduction from £1.5 million to £1.25 million.
Instead, the ICO calculated the fines in line with its Regulatory Action Policy (“RAP”). The RAP sets out a five step process that the ICO must follow when issuing fines. Steps 1 to 4 deal with factors which add to the level of the fine (including, amongst other matters, whether the infringing party obtained any financial gain from their actions and the severity of the infringement). Taking into account these factors alone, the ICO deemed that BA’s breach of the GDPR would warrant a fine of £30 million, Marriott’s a fine of £28 million and Ticketmaster’s a fine of £1.5 million.
However, step 5 of the process requires the ICO to take into account any mitigating factors (a list of which is set out in the RAP), which should result in the fine being reduced.
A number of overlapping mitigating factors were considered to be present in the case of both the BA and Marriott breaches. These mitigating factors included:
both companies implemented immediate measures to minimise and mitigate the effects of the attacks;
both companies cooperated fully with the ICO as part of its investigations into the incidents;
the broad press coverage relating to the cyber-attacks likely raised awareness with other companies as to the risks involved with cyber-attacks; and
both companies suffered significant reputational loss as a result of the cyber-attacks.
Taking into account all mitigating circumstances, the ICO determined that each company should have their fine reduced by 20% (representing a £6 million reduction in the case of BA and a £5.6 million reduction in the case of Marriott).
Finally, the ICO took account of the impact of Covid-19 on the companies. In the case of both BA and Marriott, this resulted in the fine being reduced by a sum of £4 million. In the case of Ticketmaster this was £250,000.
This is a relatively small amount considering how hard these companies have been hit by the pandemic and suggests that companies should not expect too much leniency for infringements during this time.
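The arithmetic behind the final figures can be reconstructed from the enforcement notices. The sketch below is our own simplification (the RAP steps involve qualitative judgement, not a formula), but it reproduces the stated numbers:

```python
def final_fine(step_1_to_4_base, mitigation_discount, covid_reduction):
    """Apply the step-5 mitigation discount, then the Covid-19 reduction.
    All figures are in GBP millions; rounded to avoid floating-point noise."""
    return round(step_1_to_4_base * (1 - mitigation_discount) - covid_reduction, 2)

ba = final_fine(30.0, 0.20, 4.0)           # 30 - 6 - 4
marriott = final_fine(28.0, 0.20, 4.0)     # 28 - 5.6 - 4
ticketmaster = final_fine(1.5, 0.0, 0.25)  # Covid-19 reduction only

print(ba, marriott, ticketmaster)  # 20.0 18.4 1.25
```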
Other key take-aways
In addition to the above, a number of other conclusions can be drawn from the enforcement notices. We have set out a summary of these below:
Importance of security frameworks – the ICO found that the companies should have had in place various security measures (such as multifactor authentication and encryption) which would have either prevented the cyber-security incidents from occurring or at least mitigated their effects. In reaching these conclusions, the ICO referred to guidance from various IT security institutes and bodies, including the National Cyber Security Centre, OWASP and NIST. As a result, it appears that all companies should have regard to well-known security frameworks when assessing and implementing their security protocols.
Intent not required for heavy sanctions – both BA and Marriott argued that it was unfair for them to be heavily sanctioned for the cyber-security incidents given that they themselves were victims of the cyber-attacks and not the perpetrators. However, the ICO found that, given their size and sophistication, the companies were negligent in failing to implement proper security measures and therefore the breaches fell within the bracket of the most severe type of infringement under the ICO’s RAP. This is in line with the wording in Art. 83 GDPR which allows supervisory authorities to take into account the “negligent character of the infringement” when issuing fines.
Act fast and cooperate in the event of a breach – both BA and Marriott had their fines significantly reduced in part due to their speedy action to mitigate the effects of the breach and their cooperation with the ICO. In contrast, Ticketmaster’s slowness to respond was treated as an aggravating factor. It is clear that cooperating with the ICO in the event of a breach will be received positively.
Compliance with principles is essential – the companies were all found by the ICO to have violated the principle of integrity and confidentiality under Art. 5(1)(f), as well as the security obligations set out under Art. 32 GDPR. Violation of the GDPR’s principles attracts the highest levels of fines and therefore compliance with these should be considered a priority for all organisations caught by the GDPR.
The latest Ticketmaster fine highlights that the ICO has honed its regulatory enforcement approach, and we are unlikely to see massive reductions in fines as in the cases of BA and Marriott. It also sets a marker for the future: we are more likely to see fines in the single and tens of millions rather than in the hundreds of millions.
If you have any questions about these issues in relation to your own organisation, please contact a member of the team or speak with your usual Fox Williams contact.
The contact tracing App being developed by NHSX is being promoted as a key tool which will enable the lockdown to be eased by automating the process of identifying people who have been in recent close proximity with someone with symptoms of Covid-19.
The success of the App is dependent to a large extent on a significant proportion of the population downloading and using it. While the App has some utility if only 20% of the population download it, contact tracing will only be effective if a significant percentage (estimated to be around 60%) of the population participate.
Whether or not people will take up the App is, in turn, critically dependent on the level of trust people have that the system will operate as advertised, and on whether and how legitimate concerns as to the privacy and security of the data will be addressed.
The way it works
The App uses low-power Bluetooth on smartphones to communicate with other devices in near proximity that also have the App installed. The App tracks the estimated distance between devices and the duration of each contact. Devices that come into contact with one another exchange randomised numbers. This proximity log is then stored on the device.
If, soon after, a user develops symptoms of the virus, the user can then update their status on the App. The proximity log will then be uploaded to the central system that will work out the specific other devices that need to be alerted to the fact that they have been in proximity with someone who now has symptoms, so that the users of the other devices can then self-isolate.
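The exchange described above can be sketched in code. This is an illustrative model only — the class, method and field names are our own assumptions, not the published NHSX design:

```python
import secrets
import time

class ContactTracingDevice:
    """Illustrative model of the proximity-logging exchange (not the NHSX design)."""

    def __init__(self):
        # Random rotating identifier broadcast over Bluetooth;
        # no personal identifiers are attached.
        self.broadcast_id = secrets.token_hex(16)
        self.proximity_log = []  # stored locally on the device

    def record_contact(self, other_id, estimated_distance_m, duration_s):
        """Log an encounter with another device's broadcast identifier."""
        self.proximity_log.append({
            "contact_id": other_id,
            "distance_m": estimated_distance_m,
            "duration_s": duration_s,
            "timestamp": time.time(),
        })

    def report_symptoms(self):
        """On self-reporting symptoms, the log is uploaded to the central
        system, which works out which devices to alert (centralised model)."""
        return list(self.proximity_log)

# Two devices come within Bluetooth range and exchange identifiers.
alice, bob = ContactTracingDevice(), ContactTracingDevice()
alice.record_contact(bob.broadcast_id, estimated_distance_m=1.5, duration_s=600)
bob.record_contact(alice.broadcast_id, estimated_distance_m=1.5, duration_s=600)

# Alice develops symptoms: her uploaded log identifies Bob's device for alerting.
uploaded = alice.report_symptoms()
```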
Any government sponsored technology that can track and trace the population instinctively raises privacy concerns.
First, although the data is anonymised and does not contain any personal identifiers, it will track everyone a user comes into contact with. Data concerning one’s daily personal interactions and the people one associates with can be highly sensitive and not something one would wish to share with the state.
Then there is “feature creep”. While the technology is being introduced with the best intentions and in the interests of public health, once it has been widely implemented and as time goes on there will be a temptation to “enhance” it and use it for broader purposes. For example, if the App starts to record specific location data (and not only proximity data), this will be a serious privacy concern as location data can itself reveal highly sensitive personal data (e.g. meetings at other people’s homes, attendance at particular (e.g. political) events, health clinics or places of worship etc). There may be a temptation to share the data with other government departments or the police for other purposes, such as detecting crime, or for tax or immigration purposes.
Also, let’s face it, the government and the NHS do not have a great track record in respect of data security – so how secure will the data collected by the App be? There must be a risk that it could be hacked by criminals or a rogue state-sponsored actor.
The fact that NHSX has – in contrast with many other governments (such as Ireland, Germany and Switzerland) and unlike the Google / Apple initiative – apparently opted to implement a centralised system, where data is held by the government rather than only locally on the device, heightens these concerns.
Application of Data Protection laws
Data protection laws apply to “personal data” relating to an identified or identifiable person. The App is used on a no-names basis, with the user being given a random, rotating ID. The specific device ID is not used, although the make and model of the device are captured. The GDPR specifically refers to an “online identifier” as being personal data. However, while pseudonymised data is regulated as personal data, truly anonymised data is not.
Although the precise way the App works is yet to be finalised and published, we must assume that the use of the App for track and trace will involve personal data and as such will be regulated by the GDPR as it will be possible to identify and distinguish some individuals (or devices) from others and to apply different treatment accordingly. Data protection laws do not stand in the way of such technologies, but such technologies must be built and implemented in compliance with data protection laws.
How to address the privacy concerns
While most people will in the present circumstances accept some degree of compromise on their privacy in the interests of their, and the nation’s, health, this has to be proportionate, with the App being as minimally privacy-invasive as possible. To ensure widespread adoption of the App, it will be essential to ensure that privacy concerns are comprehensively addressed. There are a number of steps that must be taken.
Centralised v localised
First, NHSX should reconsider the centralised data approach and consider switching to a localised data solution. As the ICO commented, a purely localised system without a centralised dataset must inherently be more secure. It would also have the benefit of achieving greater interoperability with localised solutions being implemented by other countries; in particular, it is important to have interoperability on the island of Ireland.
NHSX counters this, however, by saying that there are public health benefits in having access to the aggregated data for analytics and research, so as to learn more about the virus. A centralised system may also help limit malicious self-reporting (which could be done to try to force someone into self-isolation).
While a centralised system can be made to work, it is the case that much greater efforts in terms of data security will be required if public confidence is to be won over. There is a trade-off between functionality and public confidence; the more you try to get of the one, the less you get of the other. And public confidence is critical for widespread adoption, and ultimately for success, of the App.
There have been reports in the past few days of NHSX investigating the feasibility of transitioning the App to Apple and Google’s technology, and this could indicate a change of heart and a shift towards a localised data approach.
Second, transparency. Provision of transparent information regarding how a person’s data is to be used is a central requirement under the GDPR. This requires that information be provided in a concise, transparent, intelligible and easily accessible form, using clear and plain language.
Given that the App is to be used by the general population, the privacy notice will need to be carefully and skilfully drafted so that it is accessible to all, whether young, old, or with reading difficulties. It is as yet unknown what the age requirement for the App will be, but particular care will be needed for information addressed to children.
We also need to know who will be the “controller” of this data and with whom it may be shared and for what purpose. Will the controller be the NHS, or will it be the Government?
Transparency will also be well served by making public the NHSX Data Protection Impact Assessment. Under the GDPR, a DPIA – a form of risk assessment – is required whenever using a new technology that is likely to result in a high risk to the rights and freedoms of individuals. The GDPR says a DPIA is specifically required where the technology involves a systematic and extensive evaluation of personal aspects relating to individuals which is based on automated processing, and on which decisions are based that significantly affect the person; or where there is processing on a large scale of special categories of data such as health data; or where there is systematic monitoring of a publicly accessible area on a large scale. Arguably, the App ticks all of these boxes, and the DPIA will be a critical document.
The DPIA must contain a systematic description of the processing operations and the purposes for which the data will be used, an assessment of the necessity and proportionality of the processing in relation to these purposes, an assessment of the risks to the rights and freedoms of individuals and the measures to be taken to address these risks, including safeguards and security measures to ensure the security of the data.
NHSX must share this DPIA as soon as possible with the ICO (as contemplated by Art 36 GDPR) for consultation. While not a legal requirement, it should also be made public for wider consultation. Unless the government so requires, the DPIA does not need to be approved by the ICO as such; however, NHSX should consider and implement as appropriate any advice and recommendations that the ICO, as the independent privacy watchdog, may put forward.
Finally, the working of the App should be open to audit and review by independent experts, not as a one-off, but on an ongoing basis.
The lawful basis and consent
Under data protection laws, processing of personal data is only lawful if there is a “lawful basis” for the processing. The GDPR sets out six possibilities; the main options for the App will be user “consent” or “performance of a task in the public interest”. Health data requires an additional lawful basis which could be satisfied by “explicit consent” or for public health reasons.
It is not yet known which of these lawful bases will be applied. While the App is entirely voluntary to use, it may be that consent is not the best option as it can be difficult to establish that a valid consent has been obtained. However, consent may be required under the GDPR on the basis that the App involves “automated decision making”.
As the App accesses data on the device, it could be that consent is required under the Privacy and Electronic Communications Regulations (PECR). If consent were required under PECR, then it would also be necessary to use consent as the lawful basis under the GDPR. Consent is not required under PECR where the exemption applies, i.e. where access to the data is “strictly necessary for the provision of” the service requested by the user. If, however, the App is to access any data that is not “strictly necessary”, then consent would be required by law.
While the App may or may not rely on “consent” as the lawful basis, it is important for public trust that its use is truly voluntary. A person is free to download it, and delete it, as they wish. They are free to choose whether to update their health status or not. And – if warned that they have been in proximity with an infected person – they are free to self-isolate or not as they choose.
One of the central principles of the GDPR is ‘data minimisation’ – the data collected must be limited to what is necessary in relation to the purposes for which it is collected. It is essential, therefore, to identify and articulate the purpose, and then test whether the data being collected is necessary for that purpose.
For example, the App requires proximity data, but it does not require location data. If there is the potential with a centralised system to add additional data elements, such as location data, then that could breach this central principle of the GDPR.
It has been suggested that users of the App will not need to add their name or other identifiers, but will be required to enter the first half of their post code. This alone will not ordinarily be sufficient to identify a person, but may serve a purpose in enabling NHSX to spot clusters of infection.
Under GDPR data can only be collected for specified, explicit and legitimate purposes and must not be further processed in a manner that is incompatible with those purposes. The GDPR allows for further processing for scientific research or statistical purposes in addition to the initial purposes. This is an important legal constraint on feature creep, but is it enough to give people confidence that their data will not be used for other purposes?
A further principle is that data must not be kept for longer than is necessary for the purposes for which the personal data are processed. A key issue is what happens to all the data after the Covid-19 crisis has subsided and it will no longer be necessary to track and trace. The data should then be securely destroyed or completely anonymised, but what guarantee is there that this will happen? The data retention period in relation to the data must be set out in the privacy notice to be issued with the App. This will need to reflect this principle and we have to have confidence that NHSX will honour it.
It is a fundamental requirement of data protection that appropriate technical and organisational measures are taken to ensure a level of data security appropriate to the risks. This will require implementation of state-of-the-art encryption of the data at rest and in transit. Following the GDPR principle of data protection “by design and by default”, data security and compliance with the other principles must be designed into the way the App is built and used.
While data security is never 100% guaranteed, the public will need to be satisfied through the provision of transparent information that rigorous safeguards are in place.
Do we need a specific NHSX App watchdog?
While the ICO is the regulator for compliance with data protection laws, we do have separate watchdogs for specific areas, for example biometrics and communications monitoring. However, given the speed at which the App needs to be rolled out if it is to be effective, and given that the ICO is well established and respected as the regulator for data matters under the GDPR and the Data Protection Act 2018, with powers to audit, investigate complaints and issue substantial fines, the ICO is the appropriate regulator and an additional regulatory regime should not be needed.
Is specific legislation needed?
Some have suggested that specific regulation is needed to enshrine some necessary safeguards in law. Again, given timing imperatives, and given the flexible and well developed structure we already have with the GDPR and the Data Protection Act 2018, this may be a “nice to have” but should not be necessary.
Thoughts for employers
Clearly, contact tracing could be highly beneficial to employers, since it could reduce the need to carry out manual contact tracing in the event an employee falls ill with coronavirus. So, can an employer make downloading the App compulsory?
The answer will depend to some extent on the lawful basis that is relied on for the processing of personal data through the App. If the lawful basis is “consent”, then compelling employees to download and use the App will invalidate any apparent consent since it will not have been freely given. If the lawful basis is “public interest”, then employers will need to decide if they should seek to compel, or alternatively strongly recommend, their employees to download and use the App. If they seek to compel, and an employee refuses, it is hard to see that the employee can with fairness be subjected to any detriment other than as required for health and safety.
We all have a strong interest in the App being rolled out, gaining maximum levels of public adoption and making a valuable contribution to fighting the virus. For this it will be necessary for the public to have a high level of trust in the App and its privacy safeguards. Good data protection will be an essential ingredient to achieving this trust.
Nigel Miller is a partner in Fox Williams LLP and leads the Data Protection and Privacy team. He is a Certified Information Privacy Professional (CIPP/E).
Ben Nolan is an associate in the Data Protection and Privacy team at Fox Williams LLP.
Since 2006, 28 January has marked the anniversary of the first international law in the field of data protection – who knew?
A lot has happened since then. Data protection and privacy is now a rapidly expanding area of law of ever-increasing importance. As we head towards the second anniversary since the GDPR came into force, we review current developments and look ahead at what to expect in 2020.
Our special Data Privacy Day newsletter covers the following topics:
Originally intended to coincide with the GDPR, the introduction of the ePrivacy Regulation has been highly contentious and has met with considerable delay. Towards the end of 2019, the latest draft was rejected by the Council of the European Union, leading to further delays in its adoption.
Among other changes, the new rules would ban cookie walls (where a website requires users to accept cookies as a condition of being able to access the website’s content).
The proposal will also continue the ban on unsolicited electronic communications by emails, SMS and automated calling machines. However, it is not yet known if this will extend to B2B communications, or simply apply to B2C marketing as at present.
The draft Regulation also introduces more stringent penalties for non-compliance, bringing the sanctions regime and available remedies broadly into line with the GDPR.
It is uncertain what the final form of the Regulation will be. However, given the latest delay, Brexit has now intervened and so the Regulation will not be directly applicable in the UK. Despite that, it is likely that the UK will adopt the new rules as and when introduced. While the UK may be able to make its own decision on this following Brexit, if the UK does not implement the new Regulation that may stand in the way of the adequacy decision the UK needs in order to allow the free flow of data to and from the EEA. Also, the proposed extra-territorial scope of the new Regulation (like the GDPR) means that it will remain directly applicable to UK businesses targeting the EEA. Who said that after Brexit the UK will take back control of its laws?!
Meanwhile, the ICO has also published a draft direct marketing code of practice for consultation. The consultation closes on 4 March 2020 and the ICO expects to finalise it in 2020. The ICO plans to produce additional practical tools such as checklists to go alongside the code.
Some key points include:
The two lawful bases most likely to be applicable to direct marketing are consent and legitimate interests. However, where PECR applies and requires consent, then in practice consent should also be your lawful basis under the GDPR.
It is important to keep personal data accurate and up to date. It should not be kept for longer than is necessary. It is harder to rely on consent as a genuine indication of wishes as time passes.
If you are considering buying or renting direct marketing lists, you must ensure you have completed appropriate due diligence.
Profiling and enrichment activities must be done in a way that is fair, lawful and transparent.
If you are using new technologies for marketing and online advertising, it is highly likely that you will be required to conduct a data protection impact assessment (DPIA).
If someone objects you must stop processing for direct marketing purposes. You should add their details to your suppression list so that you can screen any new marketing lists against it.
Once the draft ePrivacy Regulation is finalised and the UK’s position on Brexit is clear, the ICO has indicated that it will update the direct marketing code to take account of the ePrivacy Regulation.
It is likely that the ICO will start taking enforcement action against organisations which do not follow the rules, and this could lead to fines. As such, businesses which are not yet compliant should take steps to ensure compliance now.
At a high level, the following are the main rules when using cookies on websites:
User consent must be obtained (except in relation to “strictly necessary cookies”)
The ICO confirmed that the standard of consent for using cookies is the same high standard as under the GDPR, even for cookies which do not involve the processing of personal data. This means that implied or inferred consent can no longer be relied on for cookies. For consent, a clear affirmative act is needed; pre-ticked boxes or inactivity does not constitute consent.
Websites which use non-essential cookies without specifically requiring users to consent to these when accessing a site (e.g. by specifying that continued use entails consent) are, therefore, not compliant. This also means that all non-essential cookies should be switched off by default. It also means that such cookies should only be served on the user if and when the user consents.
“Strictly necessary cookies”, which do not require consent, are those which are essential to provide a user with the service they have requested or to comply with applicable law. Analytics cookies and advertising cookies do not fall within this exemption.
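The consent rules above reduce to a simple gating decision before any cookie is served. The sketch below is illustrative only — the category names are our own, not the ICO’s taxonomy:

```python
def may_set_cookie(category, consents):
    """A cookie may be served only if it is strictly necessary, or the user
    has given a clear affirmative consent for its category. A missing entry
    means no consent: nothing is on by default, and inactivity never counts."""
    if category == "strictly_necessary":
        return True  # exempt: essential to provide the service the user requested
    return consents.get(category, False) is True

# Strictly necessary cookies need no consent; everything else is off by default.
assert may_set_cookie("strictly_necessary", {}) is True
assert may_set_cookie("analytics", {}) is False                      # no pre-ticked boxes
assert may_set_cookie("advertising", {"advertising": True}) is True  # affirmative opt-in
```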
Provide clear and transparent information to users concerning the cookies you use
The ICO Guidance emphasises the need to provide users with transparent information about cookies. The information must be in accordance with the higher standards of transparency as required by the GDPR; it must be presented in a “concise, transparent, intelligible and easily accessible form, using clear and plain language”.
In relation to cookies, this means that online retailers need to review and update their cookies policies to ensure that these are drafted in a sufficiently clear and easily accessible manner for a normal user to be able to understand how the different types of cookies are being used on the website. Failure to provide clear information will breach the transparency requirement, and will also undermine any “consent” if the consent cannot be said to be sufficiently informed.
Highlighting the importance of transparency and consent, in January 2019, the French data protection regulator imposed a fine of €50 million on Google for lack of transparency, inadequate information and lack of valid consent regarding ads personalization on mobile devices. For more information on this, see further https://idatalaw.com/2019/01/25/e50m-fine-for-google-in-france/