Do you consent to cookies? The latest data protection reforms in the UK

Kolvin Stone (partner)
Vlad Arutyunyan

The government has announced significant proposed reforms to data privacy laws in the form of a Data Reform Bill, which was introduced into Parliament on 18 July 2022.

The Bill, part of the UK’s National Data Strategy, aims to improve on the UK’s current data protection standards whilst minimising the administrative burden on UK businesses.

We look at key aspects of the Bill, which originated from a government consultation, the response to which came out earlier this year.

Cookies and calls

Part of the Bill focuses on reducing ‘consent fatigue’.

Websites will use an ‘opt-out’ rather than ‘opt-in’ model for cookie consent, and the onus will be on users to alter their own browser settings to protect their data. This means accepting cookies each time you enter a new site may be a thing of the past!

There will also be greater financial penalties for nuisance calls, texts, and certain data breaches where no consent has been given for such marketing. For example, fines will now be brought in line with current UK GDPR levels: up to the higher of 4% of the company’s global turnover or £17.5 million.

Updating the ICO

The Bill aims to modernise the Information Commissioner’s Office (ICO) including extending its legal remit, clarifying its framework for decision-making, and building out its leadership to enhance its reputation internationally.

The proposed board of the ICO will be entirely independent and consist of a chair, chief executive, and other board members. The Bill also proposes greater accountability of the ICO to the public and the government. The ICO will also be expected to consider in future decision making:

  1. economic growth and innovation
  2. competition
  3. collaboration with other regulators and relevant bodies

In addition, the ICO will be expected to set up expert panels in relevant areas when developing statutory guidance.

“Personal data”

The Bill seeks to narrow the definition of “personal data” to include only situations where:

  • the individual is identifiable by the controller or processor by reasonable means at the time of the processing or
  • the controller or processor ought to know that another person will likely obtain the information as a result of the processing and the individual will likely be identifiable by that person by reasonable means at the time of the processing.

Fewer requirements

The Bill also proposes removing the requirement:

  • for mandatory consultation with the ICO (where a company has identified a high-risk data processing activity), making such consultation voluntary
  • to appoint a Data Protection Officer, instead placing data privacy responsibilities on a senior member of the company
  • to perform Data Protection Impact Assessments and
  • to retain records of any processing activities.

Automated decision making

The Bill removes previous restrictions on automated decision making. It proposes to allow solely automated decision making in relation to significant decisions where appropriate safeguards are in place, including the right to human intervention. There is not yet clarity as to what would constitute a “significant” decision in this context.

Data transfers

Whilst data privacy laws will need to remain at the standard imposed by the EU GDPR to facilitate effective data transfer between the UK and EU, the Bill also seeks to strengthen data transfers with trading partners outside the EU. The Bill puts forward an autonomous UK international transfer regime in lieu of the current EU-aligned regime.

The UK has highlighted priority jurisdictions where adequacy decisions will be pursued first, including the US, Australia and Singapore. On 5 July 2022, the UK announced that it had reached a data agreement with the Republic of Korea which it hopes will create a new age of digital trade between the two nations.

Supporting scientific research

The proposed reform aims to encourage home-grown scientific innovation by offering further clarity as to how data can be used for research purposes.

The Bill removes some of the box-ticking required before scientists can collect data, eliminating the need to specify the ultimate purpose of any research in granular detail before it can begin.

The Bill also proposes clarifying the standard to which data should be anonymised, so that it is relevant to each situation, and the extent to which any data can be reused for further research.

The future

There is a substantial risk that the Bill will jeopardise the UK’s adequacy decision with the EU, which facilitates the free flow of data between the UK and EU. For instance, the Law Society has aired reservations that the approach is too focussed on business and innovation, potentially to the detriment of individual rights and protections.

The data rights body Open Rights Group has commented that the Bill’s restriction of data subjects’ rights is “substantially incompatible” with the EU GDPR.

As a result, we expect ongoing discourse between the EU and UK to resolve these issues.

Disruption in AdTech: where are we and what next?

Kolvin Stone (partner)
Ben Nolan (associate)

The AdTech industry is facing the biggest overhaul since its inception, which will inevitably have an impact on the wider web ecosystem, as so many services and so much content are funded via advertising revenue.

AdTech is currently heavily premised on the concept of delivering personalised ads to users. This is achieved through the use of technologies such as cookies and mobile advertising identifiers.

The impact of the GDPR and similarly inspired regulations, the tightening grip of regulators and, in some ways even more significantly, the recent action by two of the industry’s biggest players, Apple and Google, have left the industry in a state of flux.

We discuss recent developments below and look at what’s next for more privacy-friendly AdTech.

New regulations and regulatory action

Following GDPR, new privacy laws are being developed in jurisdictions across the globe and many of these specifically regulate online advertising. Notably, in the US, California has introduced the CCPA and CPRA, and similar privacy laws are expected in various other US states in the near future. Further changes to the ePrivacy landscape are also coming to the EU soon.

In the UK, regulatory action is on the cards, with the ICO currently investigating the AdTech industry. It is expected that industry participants will need to make significant changes to their practices following the conclusion of the ICO’s investigation and expected enforcement action.

Apple’s new operating system

In April, Apple rolled out a new operating system, iOS 14.5, which prevents mobile applications from using IDFAs (unique advertising IDs attributed to iPhones) and other device identifiers to track users’ app and internet browsing activities for marketing purposes, unless the user has provided consent to such tracking.

This change affects iPhone users worldwide, and early statistics suggest a large proportion of users are taking advantage of the option to opt out of being tracked.

Google Chrome and the removal of the third-party cookie

At the browser level, Google has announced that it will block all third-party cookies in early 2022 (all other major browser providers have already phased out these cookies).

Third-party cookies have traditionally been relied on to track users’ internet browsing activities across websites to build up a profile of each user. This information is then shared within the AdTech ecosystem so that businesses can deliver targeted ads to users.
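
By way of illustration only, the sketch below (TypeScript, run under Node) shows the basic mechanism: a hypothetical tracker domain embedded on many publisher sites sets a long-lived identifier with the SameSite=None attribute, which is what allows it to be sent on cross-site requests. The domain, port and logging are our own assumptions, not any particular vendor’s implementation.

```typescript
// Minimal sketch of a hypothetical third-party tracking endpoint.
import { createServer } from "node:http";
import { randomUUID } from "node:crypto";

const server = createServer((req, res) => {
  // Reuse the visitor's existing ID if the browser sent one back;
  // otherwise mint a new one on first sight.
  const existing = /uid=([^;]+)/.exec(req.headers.cookie ?? "")?.[1];
  const uid = existing ?? randomUUID();

  // "SameSite=None; Secure" lets the cookie travel on cross-site
  // requests; this is what makes it a third-party cookie.
  res.setHeader("Set-Cookie", `uid=${uid}; SameSite=None; Secure; Max-Age=31536000`);

  // The Referer header reveals which publisher page embedded the tracker,
  // so each request adds one page visit to the profile held against uid.
  console.log(`profile ${uid} visited ${req.headers.referer ?? "unknown"}`);
  res.end();
});

server.listen(8080);
```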

However, there are almost insurmountable challenges to using third-party cookies lawfully for tracking and advertising, given the difficulty of meeting the high standards of transparency and consent required by privacy regulations like the GDPR.

This is the context in which Google has decided to phase out third-party cookies.

What next for AdTech?

Although it is too soon to say for sure what these changes will mean for companies in the AdTech space, we have set out some likely consequences below:

  • Cookie-less advertising – businesses are developing advertising strategies that do not rely on cookies. For example, Google has begun trialling its proposed alternative, “Federated Learning of Cohorts”, where ads are delivered to categories of users (rather than specific individuals).
  • First-party data advertising – based on information collected directly from the user or via interactions with your own site or app.
  • Resurgence of contextual advertising? – this type of advertising, which fell out of favour following the rise of behavioural advertising, displays ads to users relating to the content of the page being viewed, rather than being targeted at specific users.
  • Incentives to share data? – it is possible that some businesses may offer incentives to customers who agree to their data being used for advertising purposes.

What does this mean?

If your business model is based on ad revenue, you need to review whether your ad partners are using third-party cookies. There will likely be legal risk in continuing to use third-party cookies. In addition, now is the time to consider moving to more privacy-friendly AdTech models.


If you have any questions about these issues in relation to your own organisation, please contact a member of the team or speak with your usual Fox Williams contact.

Cookies and the new ePrivacy Regulation

Nigel Miller (partner)

Why is it important?

While many people may not care too much about cookies, there are a number of reasons why they are important for website owners.

First, you cannot drop a cookie without prior consent. As a result of the changes already brought in by the GDPR in May 2018, it is no longer possible to rely on implied consent for cookies (for example, deemed consent by continuing to browse the website), as the standard for consent under the GDPR is much higher and requires a specific opt-in.

Second, the issue of cookies is high on the regulator’s (the ICO’s) agenda. While many of us suffer from “cookie notice fatigue”, and just click through to get rid of the annoying banners, there has been an increasing number of complaints about cookies to the ICO: nearly 2,000 in the past year.

Third, the ICO is also currently investigating the Adtech sector which is largely driven by cookies. While many cookies are innocuous, others are highly privacy invasive and are involved in systematic monitoring and tracking browsing across devices, device fingerprinting and online behavioural advertising. The intrusive nature of the technology makes this a priority area for the regulators. In response to this, the hugely complex adtech industry will likely be required to adapt and provide much higher levels of transparency.

Fourth, because of the GDPR-level fines. There is nothing like the eye-watering fines that can be issued under the GDPR, and which have notably been issued in relation to cookies by the French regulator CNIL (€100m for Google, €35m for Amazon), to get this issue high up the corporate agenda.

And finally, the law is developing with a new ePrivacy regulation on the horizon, which we look at below.

What is the current law?

The current law is based on the EU ePrivacy Directive of 2002. In the UK, this was implemented by the Privacy and Electronic Communications Regulations, fondly known as “PECR”.

Actually, the law does not refer to “cookies” as such; the regulation is technology-neutral and covers a range of cookie-like technologies. The key point is that PECR covers any technology that can “access” or “store” data on the user’s device – this includes smartphones, smart TVs and other devices. It can also include technologies like tracking pixel gifs, often used to track whether marketing emails have been opened, which can provide valuable analytics.

The key requirement under PECR is that, where you deploy a cookie, you must:

  • provide the user with clear and comprehensive information about the purposes of the cookie; and
  • get the consent of the user.

There are a couple of exceptions to this, the most important one being that you do not need consent for cookies that are “strictly necessary” for the service requested by the user.

So, cookies that are helpful or convenient but not essential, or that are only essential for your own purposes, as opposed to the user’s, will still require consent.

For example, cookies used to authenticate a user, to remember items in a shopping cart, or to remember language or other user preferences are regarded as “strictly necessary”, but cookies for analytics purposes, and advertising cookies are non-essential and need consent.

Even where consent is not required, users must still be informed of the use of cookies by means of a cookie banner and policy.

PECR v GDPR

An important thing to bear in mind is that consent for cookies is needed whether or not the cookie data involves any “personal data”. If it does involve personal data, such as a device ID, username or browsing details, then it will be subject to the GDPR as well as PECR.

Under the GDPR, you need a legal basis for processing personal data. Typically, for marketing, this could be either consent or legitimate interests. However, where cookies are deployed and processing of personal data is involved, then PECR trumps the GDPR. This means that, if consent is required under PECR, then consent is also the appropriate legal basis for processing personal data under the GDPR.

There is some debate about this in the adtech sector where it is argued that, while consent is needed for the cookie, “legitimate interests” could be used as the legal basis for any subsequent processing of the data. The regulator does not agree with this, but the actual legal position is not settled.

So, what do we need to do?

The first thing to do would be to carry out a cookie audit to make sure you know exactly what cookies are in use, and the purpose and duration of each (a sketch of a simple audit register follows the list below). In this audit:

  • Identify any of the cookies that are “strictly necessary”, and so don’t need consent.
  • Identify any third-party cookies – in the case of third-party cookies, such as Google Analytics or affiliate networks, while it is the third party that requires the consent as it is their cookie, in practice the third party requires that the site owner gets the consent on its behalf.
  • Review the consent mechanism you have on the site to make sure it is compliant – everyone seems to do this differently, and some ways are more compliant than others.
  • Review / update your cookie policy – to make sure that it meets the transparency requirement and, importantly, that it is consistent with the cookies actually in use. There is no one-size-fits-all for this as the policy needs to be specific to the cookies you have implemented and the purposes of those cookies.
  • Finally, you may need to carry out a data protection impact assessment under the GDPR – if the cookies involve personal data and are used for profiling for marketing or other purposes, then you may need to carry out a DPIA. Even if this is not strictly required, it can be good practice to do so to ensure that any risks are identified and any appropriate measures implemented to mitigate those risks.
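
As mentioned above, the audit findings can be captured in a simple register. Below is a minimal sketch in TypeScript; the field names and example entries are illustrative assumptions, not a prescribed format.

```typescript
// A minimal sketch of a cookie audit register; fields and example
// entries are illustrative, not a prescribed format.
type CookieCategory = "strictly-necessary" | "analytics" | "advertising" | "preferences";

interface CookieRecord {
  name: string;
  setBy: "first-party" | "third-party";
  provider: string;        // who actually drops the cookie
  purpose: string;
  duration: string;        // e.g. "session" or "2 years"
  category: CookieCategory;
  consentRequired: boolean;
}

const auditRegister: CookieRecord[] = [
  { name: "session_id", setBy: "first-party", provider: "our site",
    purpose: "authenticate logged-in users", duration: "session",
    category: "strictly-necessary", consentRequired: false },
  { name: "_ga", setBy: "third-party", provider: "Google Analytics",
    purpose: "measure site usage", duration: "2 years",
    category: "analytics", consentRequired: true },
];

// Everything that is not strictly necessary must sit behind the consent gate.
const needsConsent = auditRegister.filter(c => c.consentRequired);
```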

How to get consent?

The consent required under PECR follows the GDPR standard, meaning it must be freely given, specific, informed, and an unambiguous indication of the end user’s wishes through a clear affirmative action. There are a few key points to bear in mind:

  • As above, there is no need to get consent for “strictly necessary” cookies, and therefore no need for a consent tick box (pre-ticked or otherwise) for these cookies.
  • Where consent is needed, do not use pre-ticked boxes; this would not be a valid consent, as consent has to be signified by a positive step such as ticking the box.
  • This is important – do not set cookies before you get the opt-in; you may need to do some technical work on the site to make sure that this is the case (see the sketch after this list).
  • Provide clear and comprehensive information. This is because, if the information is not clear and comprehensive then, as well as breaching the transparency requirement, it will undermine the consent as it will not be a “fully informed” consent.
  • Do not bundle multiple consents into one; ideally, there would be granular consents for each cookie, or at least for each category.
  • There should also be an “Accept All” and a “Reject All” button.
  • Provide an option for users to revisit consents that they have given.
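
As promised above, here is a minimal sketch (in TypeScript, with illustrative category names) of gating non-essential cookies behind a granular opt-in. The key design point is that nothing non-essential is written unless the relevant category appears in a consent record that only a positive user action can create.

```typescript
// Minimal sketch of gating non-essential cookies behind an explicit,
// granular opt-in; the category names are illustrative assumptions.
type ConsentCategory = "analytics" | "advertising" | "preferences";

function hasConsent(category: ConsentCategory): boolean {
  // Consent choices are themselves stored in a strictly necessary cookie,
  // defaulting to "no" for every category: no pre-ticked boxes.
  const stored = /consent=([^;]*)/.exec(document.cookie)?.[1] ?? "";
  return stored.split(",").includes(category);
}

function recordConsent(categories: ConsentCategory[]): void {
  // A positive step (clicking "Accept All" or ticking categories) is the
  // only way entries get written here; "Reject All" writes an empty list.
  document.cookie = `consent=${categories.join(",")}; Max-Age=15552000; Path=/`;
}

function setAnalyticsCookie(): void {
  if (!hasConsent("analytics")) return; // nothing is dropped before opt-in
  document.cookie = "site_analytics=enabled; Max-Age=31536000; Path=/";
}
```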

The new ePrivacy Regulation

A new ePrivacy Regulation has been on the horizon since the GDPR came into force but has been batted back and forth in Europe since 2017 without agreement being reached. However, the Council of the EU finally agreed its negotiating text in February 2021, and the proposal now moves to negotiations with the European Parliament.

The objective of the ePrivacy Regulation is to update the ePrivacy Directive – which is nearly 20 years old – and to bring it into line with GDPR.  It aligns with the substantial fines possible under the GDPR, whereas at the moment fines under PECR are limited to £0.5m. The ePrivacy Regulation also allows for individuals to bring claims which could involve class action claims.

Also, like the GDPR, the regulation provides for extraterritorial application, so it will apply to businesses outside the EU insofar as it relates to end users in the EU. However, unlike the GDPR, it does not require that EU users are specifically targeted — the extraterritorial application is triggered as soon as users in the EU are implicated regardless of whether there was an intention to direct activities at the EU market.

So far as the cookie requirement is concerned:

  • There is still a need for affirmative consent, except in a number of circumstances which are a little broader than at present and will include cookies for the purposes of audience measurement (e.g., web analytics) and IT security.
  • The regulation also allows for consent to be given by selecting technical settings in the browser, for example by maintaining a whitelist of sites for which the user consents to cookies being dropped. But browsers will need to develop to facilitate this.
  • Also, users who have given consent must be reminded every 12 months of their right to withdraw consent.

Once the ePrivacy Regulation is finalised there will be a two year transition period before it comes into force.

As regards the UK, following Brexit, the ePrivacy Regulation will not automatically extend to the UK, but the UK may amend PECR to align it to the ePrivacy Regulation, especially in so far as the Regulation is more business-friendly and provides additional exceptions to the cookie rule. Also, because of the extraterritorial application of the Regulation, it will effectively apply to all UK businesses as regards end users in the EU.

If you have any questions about these issues in relation to your own organisation, please contact a member of the team or speak with your usual Fox Williams contact.

Addressing privacy concerns with NHSX App

The contact tracing App being developed by NHSX is being promoted as a key tool which will enable the lockdown to be eased by automating the process of identifying people who have been in recent close proximity with someone with symptoms of Covid-19.

The success of the App is dependent to a large extent on a significant proportion of the population downloading and using it. While the App has some utility if only 20% of the population download it, contact tracing will only be effective if a significant percentage (estimated to be around 60%) of the population participates.

Whether people will take up the App is, in turn, critically dependent on the level of trust people have that the system will operate as advertised, and on whether and how legitimate concerns as to the privacy and security of the data are addressed.

The way it works

The App uses low-power Bluetooth on smartphone devices to communicate with other devices in near proximity that also have the App installed. The App tracks the estimated distance between devices and the duration of each contact. Devices that come into contact issue each other randomised numbers. This proximity log is then stored on the device.

If, soon after, a user develops symptoms of the virus, the user can update their status on the App. The proximity log is then uploaded to the central system, which works out which other devices need to be alerted to the fact that they have been in proximity with someone who now has symptoms, so that the users of those devices can self-isolate.
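
By way of illustration only, the data flow described above might look something like the following sketch (in TypeScript; the field names, the distance estimation and the upload endpoint are our assumptions, not the published NHSX design).

```typescript
// Illustrative sketch of the proximity log described above; field names,
// distance estimation and endpoint are assumptions, not the NHSX design.
interface ProximityEvent {
  observedId: string;          // randomised ID broadcast by the other device
  estimatedDistanceM: number;  // derived from Bluetooth signal strength
  durationS: number;
  timestamp: number;
}

const proximityLog: ProximityEvent[] = []; // held locally on the device

function estimateDistanceFromRssi(rssi: number): number {
  // Crude free-space path-loss approximation; real apps calibrate per device.
  return Math.pow(10, (-40 - rssi) / 20);
}

function recordContact(observedId: string, rssi: number, durationS: number): void {
  proximityLog.push({
    observedId,
    estimatedDistanceM: estimateDistanceFromRssi(rssi),
    durationS,
    timestamp: Date.now(),
  });
}

// Only when the user self-reports symptoms does the log leave the device,
// letting the central system work out which other devices to alert.
async function reportSymptoms(): Promise<void> {
  await fetch("https://example.invalid/upload", { // hypothetical endpoint
    method: "POST",
    body: JSON.stringify(proximityLog),
  });
}
```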

Privacy concerns

Any government sponsored technology that can track and trace the population instinctively raises privacy concerns.

First, although the data is anonymised and does not contain any personal identifiers, the App will track everyone a user comes into contact with. Data concerning one’s daily personal interactions and the people one associates with can be highly sensitive and not something one would wish to share with the state.

Then there is “feature creep”. While the technology is being introduced with the best intentions and in the interests of public health, once it has been widely implemented, and as time goes on, there will be a temptation to “enhance” it and use it for broader purposes. For example, if the App starts to record specific location data (and not only proximity data), this will be a serious privacy concern as location data can itself reveal highly sensitive personal data (e.g. meetings at other people’s homes, attendance at particular (e.g. political) events, health clinics or places of worship etc). There may be a temptation to share the data with other government departments or the police for other purposes, such as detecting crime, or for tax or immigration purposes.

Also, let’s face it, the government and NHS do not have a great track record in respect of data security – so how secure will the data collected by the App be? There must be a risk that it could be hacked by criminals or a rogue state-sponsored hacker.

The fact that NHSX has – in contrast with many other governments (such as Ireland, Germany and Switzerland) and unlike the Google / Apple initiative – apparently opted to implement a centralised system, where data is held by the government rather than only locally on the device, heightens these concerns.

Application of Data Protection laws

Data protection laws apply to “personal data” relating to an identified or identifiable person. In the case of the App, it is used on a no-names basis, with the user being given a random rotating ID. The specific device ID is not used, although the make and model of the device are captured. The GDPR specifically refers to an “online identifier” as being personal data. However, while pseudonymised data is regulated as personal data, truly anonymised data is not.

Although the precise way the App works is yet to be finalised and published, we must assume that the use of the App for track and trace will involve personal data and as such will be regulated by the GDPR as it will be possible to identify and distinguish some individuals (or devices) from others and to apply different treatment accordingly. Data protection laws do not stand in the way of such technologies, but such technologies must be built and implemented in compliance with data protection laws.

How to address the privacy concerns

While most people will in the present circumstances accept some degree of compromise on their privacy in the interests of their own, and the nation’s, health, this has to be proportionate, with the App being as minimally privacy-invasive as possible. To ensure widespread adoption of the App, it will be essential to ensure that privacy concerns are comprehensively addressed. There are a number of steps that must be taken.

Centralised v localised

First, NHSX should reconsider the centralised data approach and consider switching to a localised data solution. As the ICO commented, a purely localised system without a centralised dataset must inherently be more secure. It would also have the benefit of achieving greater interoperability with localised solutions being implemented by other countries; in particular, it is important to have interoperability on the island of Ireland.

NHSX counters this, however, by saying that there are public health benefits in having access to the data at scale for analytics and research, so as to learn more about the virus. A centralised system may also help limit malicious self-reporting (which could be done to try to put someone into self-isolation).

While a centralised system can be made to work, much greater efforts in terms of data security will be required if public confidence is to be won over. There is a trade-off between functionality and public confidence; the more you try to get of the one, the less you get of the other. And public confidence is critical for widespread adoption, and ultimately for the success, of the App.

There have been reports in the past few days of NHSX investigating the feasibility of transitioning the App to Apple and Google’s technology, and this could indicate a change of heart and a shift towards a localised data approach.

Transparency

Second, transparency. Provision of transparent information regarding how a person’s data is to be used is a central requirement under the GDPR. This requires that information be provided in a concise, transparent, intelligible and easily accessible form, using clear and plain language.

Given that the App is to be used by the general population, the privacy notice will need to be carefully and skilfully drafted so that it is accessible to all, whether young, old or with reading difficulties. It is not yet known what the age requirement will be for the App, but particular care will be needed for information addressed to children.

We also need to know who will be the “controller” of this data and with whom it may be shared and for what purpose. Will the controller be the NHS, or will it be the Government?

Risk assessment

Transparency will also be well served by making public the NHSX Data Protection Impact Assessment. Under the GDPR, a DPIA – a form of risk assessment – is required whenever a new technology is used that is likely to result in a high risk to the rights and freedoms of individuals. The GDPR says a DPIA is specifically required where the technology involves a systematic and extensive evaluation of personal aspects relating to individuals which is based on automated processing and on which decisions are based that significantly affect the person; where there is large-scale processing of special categories of data such as health data; or where there is systematic monitoring of a publicly accessible area on a large scale. Arguably, the App ticks all of these boxes, and the DPIA will be a critical document.

The DPIA must contain a systematic description of the processing operations and the purposes for which the data will be used, an assessment of the necessity and proportionality of the processing in relation to these purposes, an assessment of the risks to the rights and freedoms of individuals and the measures to be taken to address these risks, including safeguards and security measures to ensure the security of the data.

NHSX must share this DPIA as soon as possible with the ICO (as contemplated by Art 36 GDPR) for consultation. While not a legal requirement, it should also be made public for wider consultation. Unless the government so requires, the DPIA does not need to be approved by the ICO as such; however, NHSX should consider and implement as appropriate any advice and recommendations that the ICO, as the independent privacy watchdog, may put forward.

Finally, the working of the App should be open to audit and review by independent experts, not as a one-off, but on an ongoing basis.

The lawful basis and consent

Under data protection laws, processing of personal data is only lawful if there is a “lawful basis” for the processing. The GDPR sets out six possibilities; the main options for the App will be user “consent” or “performance of a task in the public interest”. Health data requires an additional lawful basis which could be satisfied by “explicit consent” or for public health reasons.

It is not yet known which of these lawful bases will be applied. While the App is entirely voluntary to use, it may be that consent is not the best option as it can be difficult to establish that a valid consent has been obtained. However, consent may be required under the GDPR on the basis that the App involves “automated decision making”.

As the App accesses data on the device, it may be that consent is required under the Privacy and Electronic Communications Regulations (PECR). If consent were required under PECR, then it would also be necessary to use consent as the lawful basis under the GDPR. Consent will not be required under PECR if the exemption applies, namely where the access to the data is “strictly necessary for the provision of” the service requested by the user. If, however, the App is to access any data that is not “strictly necessary”, then consent would be required by law.

While the App may or may not rely on “consent” as the lawful basis, it is important for public trust that its use is truly voluntary. A person is free to download it, and delete it, as they wish. They are free to choose whether to update their health status or not. And – if warned that they have been in proximity with an infected person – they are free to self-isolate or not as they choose.

Data minimisation

One of the central principles of the GDPR is ‘data minimisation’ – that data being collected must be limited to what is necessary in relation to the purposes for which they are collected. It is essential for this, therefore, to identify and articulate the purpose and then test whether the data being collected is necessary for this.

For example, the App requires proximity data, but it does not require location data. If there is the potential with a centralised system to add additional data elements, such as location data, then that could breach this central principle of the GDPR.

It has been suggested that users of the App will not need to add their name or other identifiers but will be required to enter the first half of their postcode. This alone will not ordinarily be sufficient to identify a person, but it may serve a purpose in enabling NHSX to spot clusters of infection.
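
For illustration, the “first half” of a postcode is the outward code, the part before the space, which on its own typically identifies a postal district covering thousands of addresses rather than an individual. A trivial sketch:

```typescript
// The outward code is the part of a UK postcode before the space; on its
// own it identifies a postal district, not an individual address.
// Assumes the postcode is entered with its separating space.
function outwardCode(postcode: string): string {
  return postcode.trim().toUpperCase().split(/\s+/)[0];
}

outwardCode("SW1A 1AA"); // "SW1A": enough to spot geographic clusters
```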

Purpose limitation

Under the GDPR, data can only be collected for specified, explicit and legitimate purposes and must not be further processed in a manner that is incompatible with those purposes. The GDPR allows for further processing for scientific research or statistical purposes in addition to the initial purposes. This is an important legal constraint on feature creep, but is it enough to give people confidence that their data will not be used for other purposes?

Storage limitation

A further principle is that data must not be kept for longer than is necessary for the purposes for which the personal data are processed. A key issue is what happens to all the data after the Covid-19 crisis has subsided and it is no longer necessary to track and trace. The data should then be securely destroyed or completely anonymised, but what guarantee is there that this will happen? The data retention period must be set out in the privacy notice to be issued with the App. This will need to reflect this principle, and we have to have confidence that NHSX will honour it.

Data security

It is a fundamental requirement of data protection that appropriate technical and organisational measures are taken to ensure a level of data security appropriate to the risks. This will require implementation of state-of-the-art encryption of the data at rest and in transit. Following the GDPR principle of data protection “by design and by default”, data security and compliance with the other principles must be designed into the way the App is built and used.

While data security is never 100% guaranteed, the public will need to be satisfied through the provision of transparent information that rigorous safeguards are in place.

Do we need a specific NHSX App watchdog?

While the ICO is the regulator for compliance with data protection laws, there are separate watchdogs for specific areas, for example biometrics and communications monitoring. However, given the speed at which the App needs to be rolled out if it is to be effective, and given that the ICO is well established and respected as the regulator for data matters under the GDPR and the Data Protection Act 2018, with powers to audit, investigate complaints and issue substantial fines, the ICO is the appropriate regulator and an additional regulatory regime should not be needed.

Is specific legislation needed?

Some have suggested that specific regulation is needed to enshrine some necessary safeguards in law. Again, given the timing imperatives, and given the flexible and well-developed structure we already have with the GDPR and the Data Protection Act 2018, this may be a “nice to have” but should not be necessary.

Thoughts for employers

Clearly, contact tracing could be highly beneficial to employers, since it could reduce the need to carry out manual contact tracing in the event an employee falls ill with coronavirus. So, can an employer make downloading the App compulsory?

The answer will depend to some extent on the lawful basis that is relied on for the processing of personal data through the App. If the lawful basis is “consent”, then compelling employees to download and use the App will invalidate any apparent consent since it will not have been freely given. If the lawful basis is “public interest”, then employers will need to decide if they should seek to compel, or alternatively strongly recommend, their employees to download and use the App. If they seek to compel, and an employee refuses, it is hard to see that the employee can with fairness be subjected to any detriment other than as required for health and safety.

We all have a strong interest in the App being rolled out, gaining maximum levels of public adoption and making a valuable contribution to fighting the virus. For this it will be necessary for the public to have a high level of trust in the App and its privacy safeguards. Good data protection will be an essential ingredient to achieving this trust.

Nigel Miller is a partner in Fox Williams LLP and leads the Data Protection and Privacy team. He is a Certified Information Privacy Professional (CIPP/E).

Ben Nolan is an associate in the Data Protection and Privacy team at Fox Williams LLP.

Happy Data Privacy Day! And what’s coming up in 2020?

Since 2006, 28 January has marked the anniversary of the first international law in the field of data protection – who knew?

A lot has happened since then. Data protection and privacy is now a rapidly expanding area of law of ever-increasing importance. As we head towards the second anniversary since the GDPR came into force, we review current developments and look ahead at what to expect in 2020.

Our special Data Privacy Day newsletter covers the following topics:

Accountability – sounds good, but what does it actually mean?
International transfers and Brexit
What’s cooking with cookies?
Whatever happened to the ePrivacy Regulation?
The growing culture of Data Subject Access Requests (DSARs)
Adtech – under regulator scrutiny
Artificial Intelligence (“AI”) and data protection
Data security – what’s appropriate?
Fines – more to come …
Class action compensation claims

Meanwhile, please make a diary note of our annual Data Protection Update seminar, which will be held on 14 May 2020.

Please do contact us if you have any questions or if our data protection team can assist you in any way.
