Facebook, WhatsApp and mission creep

Emma Roake

German regulators have slapped down WhatsApp’s move to share its users’ data with parent company Facebook, calling it an “infringement of national data protection law”.

Despite Facebook and WhatsApp publicly committing in 2014 (when Facebook bought WhatsApp) that users’ data would not be shared between the two companies, recent changes to WhatsApp’s terms and conditions have reversed this position.  The new terms and conditions state that user data (including the mobile number and device information of the WhatsApp user) will be shared with Facebook, including for targeted advertising purposes.  The terms and conditions automatically opt users in to the data-sharing arrangement.

However, in the last few days of September, the Hamburg data protection commissioner issued an administrative order which:

  • prohibits Facebook from collecting and storing the data of German WhatsApp users; and
  • compels Facebook to destroy any data which has already been collected from German WhatsApp users.

The Hamburg data protection commissioner said that WhatsApp users’ consent to the data-sharing must be obtained for it to be lawful, and that this had not happened.

Facebook is appealing the decision.

The changes to WhatsApp’s terms and conditions have caused widespread controversy since being announced, and have raised concern among data regulators around the world.

The UK’s data protection regulator (the ICO) has announced that it is investigating the data-sharing on behalf of WhatsApp users in the UK.  Elizabeth Denham (the new information commissioner) commented in an interview on BBC Radio 4 that there was a “lot of anger” amongst the UK’s WhatsApp users.  Ms Denham also addressed the WhatsApp / Facebook data-sharing arrangement in her first speech as information commissioner on 29 September 2016, commenting that “all of this is about transparency and individual control”.

Transparency and trust were the central themes of Ms Denham’s first speech, in which she explained that her fundamental objective as information commissioner was to build a culture of data confidence in the UK.  She noted her concern that an ICO survey from earlier in the year had shown that only one in four adults trusts businesses with their personal data.

Ms Denham made clear that the ICO would pick and choose its investigations carefully, making sure that those investigations were relevant to the public.  Unsurprisingly, she said that technology “is already at the forefront” of most of the ICO’s major investigations.  For example, in addition to investigating the change in WhatsApp terms and conditions, the ICO has in the last few weeks asked questions about the major Yahoo data breach.

The ICO has indicated that it will be putting out an update soon on its WhatsApp/Facebook investigation.  It will be interesting to see whether the ICO follows the approach of the German regulators.

Emma Roake is a senior associate in the commerce & technology team at City law firm Fox Williams LLP and can be contacted at eroake@foxwilliams.com


Protecting the quantified self: data protection issues related to wearable tech

Emma Roake

The market for and consumer awareness of wearable tech has rocketed over the last few years, and is predicted by some analysts to be worth $25 billion by 2019.  From fitness bands for wrists and the first generation of smartwatches and smart eyewear, we will soon be able to purchase smart clothes with sensors to monitor fitness and athletic performance.  And with the technology developing at a dizzying pace, ingestibles and embeddables are just over the horizon, taking the form of digital pills, and chips to be inserted into muscles or under the skin.

Each new generation of wearable tech aims to be more sophisticated and less obtrusive than the last.  The less obtrusive it becomes, however, the greater the risk of it becoming more intrusive, as the wearer (and potentially third parties who come into close proximity with the wearer) is at risk of having personal data used in ways which may not have been anticipated.

The data protection concerns inherent in wearable tech have been exercising regulators for some time.  Part of the problem is that the current legislation in the UK – the Data Protection Act 1998 – was drafted in a time when smart technology was in its very early development phase.  Despite this, regulators have emphasised that all stakeholders involved in the production and operation of wearable tech must comply with data protection laws.

Wearable tech companies will be “data controllers” for the purposes of the data protection legislation if their device collects “personal data” from users, and if (as is likely) the wearable tech company determines the purposes for which and the manner in which such data is to be used.

“Personal data” is any data which relates to a living individual who can be identified from that data alone, or from that data when it is combined with other information which is in the possession of the data controller. A common assumption is that personal data is limited to someone’s name, photograph, email address and mobile number, but in fact the definition goes much wider.  Data such as an IMEI number of a smartwatch can be personal data, if it is used to differentiate an individual from others.

There are various requirements with which data controllers have to comply under data protection legislation, including the following:

  1. The processing of the data must be fair and lawful. As part of this, the company will need to tell the user what data it is collecting and what the data will be used for.  Given that some wearable tech devices collect different sorts of data using different sensors, it is crucial that the user is aware of all the data being collected by all enabled sensors.
  2. The consent of the user to the processing of their personal data will almost always be needed for the processing to be fair. Consent must be freely given, specific and informed. In relation to sensitive personal data (such as data relating to an individual’s health) the requirements for consent are more stringent. Data controllers collecting data relating to an individual’s health (which will be a large proportion of the wearable tech industry) will need to ensure that their users give “explicit” consent before such data is collected.  Opt-in consent is required in these circumstances, not opt-out consent.
  3. The data must be protected by appropriate technical and organisational measures against unauthorised or unlawful use, and against accidental loss, destruction or damage. Given the extent of the personal data collected by many wearables, the sensitivity of that data and the rise of hacking, data security must be a top priority for wearable tech companies.
  4. Personal data must not be transferred to a country outside the EEA unless that country ensures an adequate level of protection of personal data. US-based wearable tech companies selling into the EU should bear in mind that the US is not considered by the European Commission to adequately protect personal data, and that it is no longer possible to rely on the now-invalidated Safe Harbor framework. An alternative mechanism should be put in place to ensure that transfers outside the EEA are lawful.