UK and US Announce Partnership on Science of AI Safety

On 1 April 2024, the UK and US signed a memorandum of understanding on the science of AI safety. This partnership is the first of its kind and will see the two countries work together to assess risks and develop safety tests for the most advanced AI models.

Following their announcement of cooperation at the AI Safety Summit at Bletchley Park last November, the UK and US have formally agreed to align their scientific approaches to AI safety testing, with plans to perform at least one joint testing exercise on a publicly accessible model. The partnership takes effect immediately and commits the two countries to tackling the safety risks posed by next-generation AI models. The agreement will facilitate collaboration between the UK AI Safety Institute (formed last November) and the US AI Safety Institute (still in its initial stages), including the sharing of vital information and research on the capabilities and risks of AI systems and the exchange of expertise through researcher secondments between the institutes.

NIST Releases Cybersecurity Framework 2.0

On February 26, 2024, the National Institute of Standards and Technology (NIST) released the NIST Cybersecurity Framework 2.0 (CSF 2.0). CSF 2.0 represents the first major update to the Cybersecurity Framework, which was first released in February 2014. CSF 2.0 provides an increased focus on entities’ governance functions and broadens the CSF’s scope. For companies subject to state and federal standards demanding “reasonable security,” CSF 2.0 is particularly important because it could very well become the de facto standard of care under various cybersecurity and data privacy laws.

Focus on Governance

CSF 2.0 builds on the five high-level functions from CSF 1.0 (Identify, Protect, Detect, Respond, and Recover) by introducing a new core function—Govern. This function focuses on ensuring that an organization’s cybersecurity risk management strategy, expectations, and policies are established, communicated, and monitored. In particular, this new core function emphasizes that an organization’s cybersecurity framework must be (i) based on the organization’s individual circumstances, goals, and risk appetite; (ii) well established and communicated within the organization to ensure compliance and continuity; and (iii) continually reviewed and improved.

UK Supreme Court Rules That AI Cannot Be an ‘Inventor’ Under UK Patent Law

In Thaler v Comptroller-General of Patents, Designs and Trade Marks [2023] UKSC 49, the UK Supreme Court ruled that AI cannot be an ‘inventor’ for the purposes of UK patent law. The ruling concludes a series of appeals from Dr Stephen Thaler and his collaborators, who argued that an AI system called ‘DABUS’ should be named as the inventor of two inventions it generated autonomously, relating to food and beverage packaging and light beacons. This was part of a series of test cases, which have had limited success globally, seeking to establish that AI systems can make inventions and that the owners of such systems can apply for and secure the grant of patents for those inventions. The judgment noted that the broader questions of whether an invention generated autonomously by AI ought to be patentable, or whether the meaning of the term ‘inventor’ should be expanded to include machines powered by AI, were matters of policy that would need to be addressed by legislation.

The UK Supreme Court made three main findings.

  1. DABUS is not an ‘inventor’ under the Patents Act 1977 (“Patents Act”). An ‘inventor’ within the meaning of the Patents Act must be a natural person (a human being). Since DABUS is a machine, not a natural person, it cannot be an ‘inventor.’ It was not Dr Thaler’s case that he was the inventor and had simply used DABUS as a highly sophisticated tool. Had Dr Thaler made that case and named himself as the inventor, the Court noted that its decision might have been different, but it was not the Court’s place to determine that question.
  2. Dr Thaler was not entitled to apply for and obtain a patent simply by virtue of his ownership of DABUS. Dr Thaler sought to rely on the doctrine of accession, whereby the owner of existing property also owns new property generated by that existing property (in the same way that a farmer owns both the cow and the calf). The Court held that this doctrine applies only to tangible property, not to intangible inventions, so title to an invention cannot pass as a matter of law from the machine that generated it to the owner of that machine. The argument also assumes that DABUS itself can be an inventor within the meaning of the Patents Act, which, as the Court had already established, it cannot.
  3. By failing to satisfy the requirements of the Patents Act, the two patent applications must be taken to have been withdrawn. Because Dr Thaler had failed to name an inventor and had failed to state a valid right to apply for and obtain the patents, the UK Intellectual Property Office had been correct to find that the two applications would be taken to be withdrawn at the expiry of the 16-month period prescribed by UK patent law for this purpose.

Commentary

Dr Thaler’s UK patent applications were part of a project involving parallel applications to patent offices around the world. The UK Supreme Court’s ruling is unsurprising and follows similar decisions in the United States and Europe.

The ruling raises significant issues for the AI industry, but it is important to focus on what it confirms: that inventors must be natural persons for the purposes of UK patent law. The judgment does not affect the patentability of AI-generated inventions, as it does not necessarily preclude a person from securing a patent, provided that a human being is named as the inventor.
