
CNIL Recommends Against US Service Providers for Processing Health Data – What Does That Mean?

Written by Dev | Nov 26, 2020

Organizations processing the personal data of EU citizens, and data from France in particular, took note of recent recommendations made by France’s Data Protection Authority (CNIL) following the Schrems II ruling this fall. Service providers that are owned by, or operate in, third countries are the focus of the recommendations, which call for stronger technical, contractual and organizational measures to protect the EU data they process.

At the beginning of October, the CNIL submitted recommendations to the French Administrative Supreme Court (Conseil d’État) calling for suspension of the Microsoft Azure-hosted “Health Data Hub” due to concerns over the company’s susceptibility to US surveillance laws. The CNIL also recommended that French services handling health data avoid the use of American cloud hosting companies. These recommendations followed the Schrems II ruling, which struck down the EU-US Privacy Shield agreement over privacy concerns related to American surveillance laws. The CNIL indicated that even when data is processed within the EU, American companies may be subject to FISA Section 702 and other surveillance laws that grant US authorities the right to access the data.

On October 13, the Conseil d’État issued a summary judgment opposing the CNIL’s recommendation to suspend the Health Data Hub. Although the summary judge acknowledged a risk of intelligence services gaining access to the data, the risk was not deemed serious enough to warrant suspending the Hub, particularly during the COVID-19 pandemic. The Conseil d’État instead recommended that additional safeguards be put in place, including strengthening the legal agreement with Microsoft to increase data protections.

Increasing Data Protection

As the CNIL’s recommendations demonstrate, data protections are tightening in the wake of the Schrems II decision, both for data leaving the EU and now for data processed within the EU by companies based overseas. However, some organizations in France, and in Europe more widely, may not be able to avoid using external service providers, or may require continued data transfers to jurisdictions whose data protections have not been deemed adequate by EU authorities.

European authorities are now working to fill some of the gaps created by the Schrems II decision and to shore up protections for the personal data of EU citizens. Many countries within the EU, as well as the UK and the US, have issued guidance in response to Schrems II. And on November 10th, the European Data Protection Board (EDPB) released recommendations on measures to supplement existing transfer tools. These recommendations are aimed at helping data exporters assess data protections in third countries and implement supplementary measures when required. The EDPB lays out six steps for exporters to follow (a simple way of tracking them is sketched below):

  1. Map all transfers of data,
  2. Verify the tool upon which each transfer relies,
  3. Assess whether any laws or practices in the third country could compromise the safeguards afforded by those tools,
  4. Adopt supplementary data protection measures if the assessment has deemed that necessary,
  5. Document the steps taken and get approval from supervisory authorities when needed, and
  6. Re-evaluate the protections periodically to ensure they remain adequate.

In terms of supplementary measures, the EDPB suggests three categories: technical, contractual and organizational. It provides examples of appropriate safeguards in each category and indicates that, if no appropriate supplementary measure can be found, data transfers should be stopped.
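As a purely illustrative aside, an exporter might track this exercise in a simple structured register, one record per transfer. The Python sketch below uses hypothetical field names and values; it is not a legal assessment tool, only an example of how the six steps could map onto a record that is kept up to date.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TransferRecord:
    """One entry in a hypothetical transfer-mapping register (step 1)."""
    data_categories: List[str]            # what personal data is involved
    importer: str                         # the external service provider
    destination_country: str
    transfer_tool: str                    # step 2: e.g. "SCCs" or "adequacy decision"
    third_country_risks: List[str] = field(default_factory=list)      # step 3
    supplementary_measures: List[str] = field(default_factory=list)   # step 4
    approved_by_authority: bool = False   # step 5, where approval is needed
    next_review_date: str = ""            # step 6: periodic re-evaluation

# Hypothetical example entry; all names and values are illustrative only.
transfer = TransferRecord(
    data_categories=["pseudonymised health data"],
    importer="Example Cloud Provider",
    destination_country="US",
    transfer_tool="SCCs",
    third_country_risks=["FISA Section 702 access requests"],
    supplementary_measures=["encryption with EU-held keys", "data synthesis"],
    approved_by_authority=False,
    next_review_date="2021-05-01",
)
print(transfer)
```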

In terms of technical measures, encryption is named as one tool that may be leveraged, alongside other privacy enhancing technologies (PETs) such as pseudonymisation.
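To illustrate what pseudonymisation can look like in practice, here is a minimal sketch that replaces a direct identifier with a keyed hash (HMAC). The dataset, column name and key handling are hypothetical; and, as discussed further below, the output remains personal data under the GDPR, since re-identification is possible for whoever holds the key or the original data.

```python
import hmac
import hashlib

# Hypothetical secret key, held by the EU data exporter and never
# shared with the external service provider.
PSEUDONYMISATION_KEY = b"example-secret-key-held-in-the-eu"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed-hash pseudonym.

    The same input always maps to the same pseudonym, so records can still
    be linked for analysis, but the original identifier cannot be recovered
    without the key.
    """
    digest = hmac.new(PSEUDONYMISATION_KEY, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()

# Example: pseudonymise a hypothetical patient identifier before the
# record leaves the controlled environment.
record = {"patient_id": "FR-1234567", "diagnosis_code": "E11.9"}
record["patient_id"] = pseudonymise(record["patient_id"])
print(record)
```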

On the heels of the EDPB recommendations, the European Commission published a draft of new Standard Contractual Clauses (SCCs) on November 12th for the transfer of personal data to third countries. These draft clauses are very much in line with the EDPB recommendations, and some of the recommendations translate directly into contractual obligations in the draft SCCs. For example, there are proposed clauses that would obligate organizations in third countries to document access requests received from public authorities, to notify the data exporter and data subjects of such requests where possible, and, where there are grounds, to challenge such data access orders. The technical measures referenced in the EDPB recommendations are also referred to in the SCCs.


Data Synthesis as a Technical Measure

As indicated by the EDPB recommendations, PETs can play a key role in mitigating the risks associated with the use of external service providers and/or data transfers by acting as additional data protection measures.

Data synthesis in particular may be well suited as an additional measure: the process of creating and evaluating a synthetic dataset falls within the scope of EU data protection law, but the resulting synthetic dataset, when properly created and verified, should not be considered personal data under the law. This goes a step beyond pseudonymisation in terms of data protection, as pseudonymous data remains personal data under EU regulations and therefore requires additional protections. Synthetic data, on the other hand, replicates the patterns and characteristics of real data but does not maintain a one-to-one mapping between synthetic records and real individuals. As a result, such data can be shared more broadly and used for analysis and research in other jurisdictions.
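As a toy illustration of that distinction (not a representation of how a production synthesis engine works), the sketch below fits simple per-column models to a hypothetical dataset and samples new records from them, so no synthetic row is a copy of, or maps back to, a specific real individual. The column names and data are invented; real synthesis also preserves relationships between variables and includes formal privacy and utility evaluation.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Hypothetical original health dataset; in the scenario described here,
# this never leaves the EU.
original = pd.DataFrame({
    "age": rng.integers(18, 90, size=1000),
    "systolic_bp": np.round(rng.normal(125, 15, size=1000)),
    "diabetic": rng.choice(["yes", "no"], size=1000, p=[0.1, 0.9]),
})

def synthesize(df: pd.DataFrame, n_rows: int) -> pd.DataFrame:
    """Sample new records from simple per-column models of the data.

    This toy version only preserves each column's marginal distribution;
    production synthesizers also capture relationships between variables.
    """
    columns = {}
    for col in df.columns:
        if df[col].dtype.kind in "if":   # numeric: fit a normal distribution
            columns[col] = rng.normal(df[col].mean(), df[col].std(), n_rows)
        else:                            # categorical: sample from observed frequencies
            freqs = df[col].value_counts(normalize=True)
            columns[col] = rng.choice(freqs.index.to_numpy(), size=n_rows, p=freqs.to_numpy())
    return pd.DataFrame(columns)

# Only this synthetic output would be shared with an external provider.
synthetic = synthesize(original, n_rows=1000)
print(synthetic.head())
```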

If synthetic versions of health datasets are generated within the EU, or another “adequate” jurisdiction, before being shared with an external service provider or transferred to another jurisdiction, the risks highlighted by the Schrems II case and the CNIL’s recommendations can be mitigated. In that case, external service providers would only have access to non-personal synthetic data, so any access gained by authorities in a third country would no longer pose as significant a privacy risk. Meanwhile, external entities could continue to work with high-quality, realistic (synthetic) data from which they may gain insights comparable to those obtained from the original data, including through the development and deployment of AI and machine learning models.

Data synthesis may also be helpful in cases where access by non-EU service providers to personal data, or the transfer of personal data to other jurisdictions, is deemed necessary. Here, data synthesis could serve as an additional safeguard, on top of the required SCCs, for any data processing and retention that does not require personal data. As a result, an external entity’s access to personal data could be minimized to only what is necessary to fulfill its contractual obligations.

For a more in-depth discussion of how synthetic data may help organizations respond to Schrems II, see this October 28th blog piece on the subject. We are also happy to answer any questions you might have on this topic. Visit our contact page to get in touch, or email info@replica-analytics.com.