Our latest briefing explores the recent FTC commercial surveillance and data security forum (including discussion on widespread use of AI and algorithms in advertising), California’s inquiry into potentially discriminatory health care algorithms, and the recent California Department of Insurance workshop that could shape future rulemaking regarding the industry’s use of artificial intelligence, machine learning and algorithms.
As of March 2022, there are more than 18,000 different cryptocurrencies in circulation, with a total market capitalization of $2 trillion. In this episode of the Faegre Drinker on Law and Technology Podcast, host Jason G. Weiss chats with Faegre Drinker partner Jeff Blumberg and associate Jane Blaney to revisit the world of crypto and not only discuss the foundations of cryptocurrencies, but also explore the world of blockchain and non-fungible tokens, or NFTs.
The conversation tackles a number of questions, including:
- What is blockchain and what makes it different from traditional recordkeeping processes?
- What exactly is cryptocurrency? What is important to understand about it?
- What exactly is an NFT, and how does it work, in easy-to-understand terms?
- What uses does blockchain have, both for cryptocurrencies and beyond?
- Are cryptocurrencies considered “legal tender”? Can people use them like normal money?
- How can and will NFTs be used in other ways in our society as these concepts grow?
On August 24, 2022, California Attorney General Rob Bonta announced a settlement with Sephora for violations of the California Consumer Privacy Act (CCPA). The action places online consumer tracking, analytics and advertising squarely in the regulatory crosshairs. “Sephora, like many online retailers, installs third-party companies’ tracking software on its website and in its app so that these third parties can monitor consumers as they shop,” the AG alleged, “. . . [and] when a company like Sephora utilizes third-party tracking technology without alerting consumers and giving them the opportunity to control their data, they deprive consumers of the ability to limit the proliferation of their data on the web.”
The National Institute of Standards and Technology (NIST) has released the second draft of its Artificial Intelligence (AI) Risk Management Framework (RMF) for comment. Comments are due by September 29, 2022.
NIST, part of the U.S. Department of Commerce, helps individuals and businesses of all sizes better understand, manage and reduce their respective “risk footprint.” Although the NIST AI RMF is a voluntary framework, it has the potential to influence legislation. NIST frameworks have previously served as the basis for state and federal regulations, such as the 2017 New York State Department of Financial Services Cybersecurity Regulation (23 NYCRR 500).
The AI RMF was designed and is intended for voluntary use to address potential risks in “the design, development, use and evaluation of AI products, services and systems.” NIST envisions the AI RMF as a “living document” that will be updated regularly as technology and approaches to AI reliability evolve and change over time.
On August 1, 2022, the Court of Justice of the European Union (CJEU) issued an opinion in a Lithuanian data protection case that may signal an expanded interpretation of the definition of sensitive personal data under the EU’s General Data Protection Regulation (GDPR). Specifically, the CJEU found that data indirectly disclosing sexual orientation constitutes sensitive personal data.
At issue was a Lithuanian law that requires the Chief Official Ethics Commission of Lithuania to publish information about the private interests of public officials in an effort to combat corruption. In the facts underlying the case, a Lithuanian official objected to the Chief Official Ethics Commission’s online publication of his private interest information, which included his spouse’s name. The CJEU concluded that the publication of such information was prohibited by the GDPR because it was “liable to disclose indirectly the sexual orientation of a natural person” — a special category of personal data whose processing is generally prohibited under GDPR Article 9 (processing of special categories of personal data) unless certain additional conditions are satisfied, such as the data subject’s explicit consent or the processing being necessary for reasons of substantial public interest.
On July 29, 2022, the New York Department of Financial Services (NYDFS) published the pre-proposed second amendment to its Cybersecurity Regulations, 23 NYCRR 500 (Part 500), which, if adopted, would likely require numerous policy and operational changes. NYDFS sought comments on the pre-proposal through August 18, 2022. Although this amendment has been long anticipated, the next step will be for NYDFS to formally publish the second amendment.
Effective in 2017, Part 500 was a first-of-its-kind state regulation that created mandatory cybersecurity and risk management requirements for “covered entities.” Part 500 defines covered entities as persons operating under, or required to operate under, a license, registration, charter, certificate, permit, accreditation or similar authorization under the Banking Law, the Insurance Law or the Financial Services Law.