On August 1, 2022, the Court of Justice of the European Union (CJEU) issued an opinion in a Lithuanian data protection case that may signal an expanded interpretation of the definition of sensitive personal data under the EU’s General Data Protection Regulation (GDPR). Specifically, the CJEU found that data indirectly disclosing sexual orientation constitutes sensitive personal data.
At issue was a Lithuanian law that requires the Chief Official Ethics Commission of Lithuania to publish information about the private interests of public officials in an effort to combat corruption. In the facts underlying the case, a Lithuanian official objected to the Chief Official Ethics Commission’s online publication of his private interest information, which included his spouse’s name. The CJEU concluded that the publication of such information was prohibited by the GDPR because it was “liable to disclose indirectly the sexual orientation of a natural person,” a special category of personal data whose processing is generally prohibited under GDPR Article 9 (processing of special categories of personal data) unless certain additional conditions are satisfied, such as the data subject’s explicit consent or a showing that processing is necessary for reasons of substantial public interest.
The court explained that the relevant prohibition on processing under Article 9(1) prohibits processing of personal data “revealing” various characteristics of a natural person, including the person’s sex life or sexual orientation, and that the terms “special categories of personal data” and “sensitive data” are interpreted broadly elsewhere in the GDPR. The CJEU also relied on a broad interpretation of “data concerning health” from Article 4(15) and guidance from Recital 35 to conclude that context must be taken into account with regard to special categories of personal data. Therefore, the CJEU concluded, “those provisions cannot be interpreted as meaning that the processing of personal data that are liable indirectly to reveal sensitive information concerning a natural person is excluded from the [protections of Article 9].”
The CJEU decision is noteworthy because it binds EU Member State courts on an issue that has thus far divided data protection authorities in Europe. Both the Norwegian and Spanish data protection authorities recently addressed the issue of inferred special categories of personal data in the context of sexual orientation when they considered cases involving Grindr, a social networking app for the LGBTQ community. The Norwegian authority determined that an individual’s use of Grindr creates an inference “that they belong to a sexual minority,” and it levied a $6.5M fine against Grindr for disclosing “user data to third parties” for behavioral advertising purposes without obtaining meaningful consent. The Spanish authority, by contrast, declined to infer individuals’ sexual orientation based on their use of the Grindr platform.
Accordingly, the CJEU’s decision has potentially significant implications for companies’ data collection and processing practices. Data that include spouse or partner names, and thus might “reveal” information about a person’s sex life or sexual orientation, can be found in an extensive array of data sets. Many of these, such as customer relationship and non-profit donor databases, student and employee emergency contact lists, and employee diversity information, would not otherwise constitute sensitive data under the GDPR. Other data that might “reveal” information about a person’s sex life or sexual orientation are also likely maintained by dating apps, social networks, and advertising networks.
Importantly, the CJEU’s logic appears to extend to other special categories of personal data, suggesting that any data that “reveal” information about racial or ethnic origins, political opinions, religious or philosophical beliefs, trade union membership, or health would also themselves constitute special categories of data subject to the GDPR’s strict processing limitations. It is difficult to identify a limiting principle to the scope of the court’s ruling—if a name can be used to make an inference of gender and from there an inference of sexual orientation, would the court also conclude that a name or photo can also be used to make inferences about, for example, racial or ethnic origins, or religion? The fact that such inferences may be mistaken does not seem to have troubled the court. Further assessment from Member State courts and the CJEU may clarify the scope of this opinion.
Finally, the CJEU’s opinion may also have implications for processing of sensitive data in the United States. The GDPR definition of “special categories of data” was largely repurposed for the definitions of “sensitive personal information” and “sensitive data” found in U.S. data protection statutes. California, for example, defines “sensitive personal information” as personal information that “reveal[s]” various data about an individual, including, among other things, racial or ethnic origin, religious or philosophical beliefs, union membership, contents of consumer communications, and genetic data. Similarly, Colorado defines “sensitive data” as personal data “revealing” racial or ethnic origin, religious beliefs, a mental or physical health condition or diagnosis, sex life or sexual orientation, or citizenship or citizenship status. Data protection statutes in Connecticut, Utah, and Virginia have definitions of “sensitive data” that are nearly identical to Colorado’s.
Although no U.S. state has yet affirmatively decided that information that indirectly allows a business to infer sensitive traits itself constitutes “sensitive personal information,” California may be headed in that direction. The California Attorney General has already concluded that inferences “are themselves ‘personal information,’” explaining that inferences can reveal any number of characteristics about a consumer, from likes and dislikes to relationship/family status to “sensitive personal attributes such as age, gender, race, ethnicity, sexual orientation, political views, and personality traits.” And in its recent enforcement action against Sephora over sales of personal information via third-party cookies, the California AG cited concerns about data revealed indirectly from information collected via cookies: “Sephora’s website allows visitors to browse and purchase products such as prenatal and menopause support vitamins—data points which can be used by third-party companies to infer conclusions about women’s health conditions, like pregnancy.” For now, information that indirectly permits a sensitive inference to be drawn is not sensitive data in the U.S., but that may change with further regulatory action and interpretation at the state level.