Guidance

Guidelines for examining patent applications relating to artificial intelligence (AI)

Updated 7 May 2024

The guidelines are temporarily suspended pending consideration of the judgment of the Court of Appeal in Comptroller-General of Patents v Emotional Perception AI Ltd [2024] EWCA Civ 825.

1. This document provides guidelines for examining patent applications relating to artificial intelligence (AI) inventions at the Intellectual Property Office (IPO).

2. The relevant law is the Patents Act 1977 and the Patents Rules 2007. The act and rules are interpreted by following judgments of the UK courts. Judgments concerning the Patents Act 1977 are binding on the IPO. The act implements provisions of the European Patent Convention (EPC). Although decisions made by the European Patent Office (EPO) Boards of Appeal concerning the EPC are not binding on the IPO, they are persuasive.

3. The guidelines explain how the act's exclusions from patentability apply to AI inventions. They also look briefly at the requirement for sufficiency of disclosure of AI inventions.

4. The guidelines include an accompanying set of scenarios. Each scenario briefly describes an AI invention and includes a non-binding assessment of the invention in relation to the exclusions from patentability.

5. The guidelines and the accompanying scenarios should be read as a supplement to the Manual of Patent Practice (MoPP).

6. Any comments or questions arising from these guidelines should be addressed to:

Ben Buchanan
Intellectual Property Office
Concept House
Cardiff Road
Newport
South Wales
NP10 8QQ

Nigel Hanley
Intellectual Property Office
Concept House
Cardiff Road
Newport
South Wales
NP10 8QQ

Summary

7. These guidelines are summarised as follows.

  • in the UK, patents are available for AI inventions in all fields of technology

  • AI inventions are computer-implemented inventions. They rely on mathematical methods and computer programs in some way. UK patent law excludes from patent protection inventions relating solely to a mathematical method or a program for a computer. However, the exclusions are applied as a matter of “substance not form” by considering the task or process an AI invention performs

  • when the task or process performed by an AI invention makes a technical contribution to the known art, the invention is not excluded and is patent-eligible

  • an AI invention makes a technical contribution if:

    - it embodies or performs a technical process which exists outside a computer, or

    - it contributes to solving a technical problem lying outside a computer or lying within the computer itself, or

    - it is a new way of operating a computer in a technical sense

  • an AI invention is not excluded if it is claimed in hardware-only form. Hardware-only form means the claimed invention does not rely on a computer program or a programmable device in any way

  • an AI invention is only excluded from patent protection if it does not make a technical contribution. An AI invention does not make a technical contribution if:

    - it relates solely to items listed as being excluded (for example a business method) and there is no more to it, or

    - it relates solely to processing or manipulating information or data and there is no more to it, or

    - it is just a better or well-written program for a conventional computer and there is no more to it

  • however, following the judgment in Emotional Perception, an invention involving an artificial neural network (ANN) does not engage the computer program exclusion. The judgment in Emotional Perception is being appealed to the court of appeal. It is binding on the IPO until such time as the court determines otherwise

  • the conditions set out above apply to inventions that may be categorised as “applied AI” or “core AI”. They also apply to the training of AI inventions

  • patent protection is available for training data sets when they are used as part of an invention making a technical contribution. However, claims to data sets characterised solely by their information content are excluded as the presentation of information as such

  • the sufficiency of disclosure of an AI invention or a data set is assessed in the same way as any other invention. The relevant legal principles for assessing sufficiency are set out in Eli Lilly v Human Genome Sciences

What is an AI invention?

8. There is no single agreed-upon definition of artificial intelligence. The UK government has defined AI as:

technologies with the ability to perform tasks that would otherwise require human intelligence, such as visual perception, speech recognition, and language translation.

9. Although there is no accepted definition of AI, the features and functions of AI inventions can be summarised simply as follows.

[Figure: Data in → Models and algorithms running on a computer → Data out]

10. AI inventions are computer-implemented inventions. They are based on computational models and algorithms such as neural networks, genetic algorithms, machine learning algorithms, and other similar approaches. These models and algorithms are essentially mathematical in nature. They operate upon input data and provide various forms of output data.
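
For illustration only, this data flow can be sketched in a few lines of Python. The model here is an arbitrary one-layer network with placeholder weights, not any particular AI invention.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))  # placeholder model parameters (one linear layer)
b = np.zeros(3)

def model(x: np.ndarray) -> np.ndarray:
    """Data in -> mathematical model running on a computer -> data out."""
    return np.tanh(x @ W + b)

data_in = rng.normal(size=(2, 4))   # two input records, four features each
data_out = model(data_in)           # two output vectors, three values each
print(data_out.shape)               # (2, 3)
```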

11. As computer-implemented inventions, AI inventions are carried out, at least in part, by a computer program or a programmable device such as a conventional computer. A computer program may comprise a sequence of executable instructions for a conventional computer.

12. Artificial intelligence inventions find application across all fields of technology. Patent protection is sought for many aspects of AI inventions. For the purposes of these guidelines, AI inventions are categorised as either “applied AI” or “core AI”.

Applied AI

13. An “applied AI” invention applies AI techniques to a field other than the field of AI. An applied AI invention may:

  • perform specific processes or solve specific problems lying outside the computer on which it is implemented, or

  • make the computer work better generally, for example by performing processes or solving problems concerned with the internal workings of the computer

14. An applied AI invention may be implemented in an application program for execution on a computing system. Alternatively, it may be provided as part of the internal workings of a computing system.

Core AI

15. In contrast to an “applied AI” invention, a “core AI” invention defines an advance in the field of AI itself. A core AI invention involves, for example, an improved AI model, algorithm, or mathematical method. The IPO treats a computer-implemented core AI invention as an application program that carries out the claimed tasks of the core AI.

Other aspects of AI

16. Regardless of whether they might be categorised as applied AI or core AI, certain types of AI invention require training. For example, some AI models are trained using specific training data sets.

17. AI inventions may also be implemented in a hardware-only form, such as in special-purpose electronic circuitry.

The law

Conditions for the grant of a valid patent

18. Section 1(1) of the act sets out four conditions that an invention must satisfy for the grant of a valid patent.

A patent may be granted only for an invention in respect of which the following conditions are satisfied, that is to say-

(a) the invention is new;

(b) it involves an inventive step;

(c) it is capable of industrial application;

(d) the grant of a patent for it is not excluded by subsections (2) and (3) or section 4A below; and references in this Act to a patentable invention shall be construed accordingly.

19. These four conditions apply to all inventions in all fields of technology. A patent may only be granted if an invention is new, has an inventive step, is industrially applicable, and is not excluded from patent protection. Patents are available for AI inventions provided they satisfy these four conditions. These guidelines are primarily concerned with the requirement of section 1(1)(d) that an AI invention must not relate to so-called excluded subject matter.

Excluded subject matter

20. Section 1(2) of the act implements Article 52 of the European Patent Convention (EPC). Section 1(2) declares that certain things are not inventions for the purposes of the act.

It is hereby declared that the following (among other things) are not inventions for the purposes of this Act, that is to say, anything which consists of-

(a) a discovery, scientific theory or mathematical method;

(b) a literary, dramatic, musical or artistic work or any other aesthetic creation whatsoever;

(c) a scheme, rule or method for performing a mental act, playing a game or doing business, or a program for a computer;

(d) the presentation of information;

but the foregoing provision shall prevent anything from being treated as an invention for the purposes of this Act only to the extent that a patent or application for a patent relates to that thing as such.

21. AI inventions typically rely on mathematical models and algorithms for their implementation. This means the “mathematical method” and “program for a computer” exclusions must be considered carefully in the examination of AI inventions. The words “as such” appearing at the end of section 1(2) qualify the extent of its exclusions. For example, it is only a program for a computer “as such” that is excluded from patent protection. A computer-implemented invention is not excluded if it relates to something more than a program for a computer “as such”.

Technical contribution and the Aerotel approach

22. According to binding precedent of the UK courts, a computer-implemented invention avoids exclusion under section 1(2) if it makes a “technical contribution” to the state of the art. However, a contribution that consists purely of excluded subject matter or that is not technical in nature does not count as a technical contribution.

23. Whether an invention makes a relevant technical contribution is decided by following the approach approved by the UK Court of Appeal in Aerotel/Macrossan. The approach has four steps.

(1) properly construe the claim;

(2) identify the actual contribution;

(3) ask whether it falls solely within the excluded subject matter;

(4) check whether the actual or alleged contribution is actually technical in nature.

24. A full explanation of the four steps of the Aerotel approach is found in the Manual of Patent Practice. In Aerotel, the court explained how to identify the contribution for the purposes of steps (2), (3) and (4):

[43] The second step – identify the contribution - is said to be more problematical. How do you assess the contribution? Mr Birss submits the test is workable – it is an exercise in judgment probably involving the problem said to be solved, how the invention works, what its advantages are. What has the inventor really added to human knowledge perhaps best sums up the exercise. The formulation involves looking at substance not form – which is surely what the legislator intended.

[44] Mr Birss added the words “or alleged contribution” in his formulation of the second step. That will do at the application stage – where the Office must generally perforce accept what the inventor says is his contribution. It cannot actually be conclusive, however. If an inventor claims a computer when programmed with his new program, it will not assist him if he alleges wrongly that he has invented the computer itself, even if he specifies all the detailed elements of a computer in his claim. In the end the test must be what contribution has actually been made, not what the inventor says he has made.

25. Importantly, it is the claim as a whole that must be considered when assessing the contribution that the invention has made (see Lantana, paragraph 64).

When does a computer-implemented invention make a technical contribution?

26. The answer to this critical question has been considered in many judgments of the court of appeal, including Merrill Lynch, Gale’s Application, Aerotel/Macrossan, Symbian, HTC v Apple and Lantana.

27. In Symbian, the court of appeal said this question is inherently difficult. It concluded that there is no precise test for determining when a computer-implemented invention makes a technical contribution. However, paragraphs 45 to 49 of HTC v Apple provide a helpful starting point for deciding whether a computer-implemented invention makes a technical contribution.

[45] How then is it to be determined whether an invention has made a technical contribution to the art? A number of points emerge from the decision in Symbian and the earlier authorities to which it refers. First, it is not possible to define a clear rule to determine whether or not a program is excluded, and each case must be determined on its own facts bearing in mind the guidance given by the Court of Appeal in Merrill Lynch and Gale and by the Boards of Appeal in Case T 0208/84 Vicom Systems Inc [1987] 2 EPOR 74, [1987] OJ EPO 14, Case T 06/83 IBM Corporation/Data processing network [1990] OJ EPO 5, [1990] EPOR 91 and Case T 115/85 IBM Corporation/Computer-related invention [1990] EPOR 107.

[46] Second, the fact that improvements are made to the software programmed into the computer rather than hardware forming part of the computer does not make a difference. As I have said, the analysis must be carried out as a matter of substance not form.

[47] Third, the exclusions operate cumulatively. So, for example, the invention in Gale related to a new way of calculating a square root of a number with the aid of a computer and Mr Gale sought to claim it as a ROM in which his program was stored. This was not permissible. The incorporation of the program in a ROM did not alter its nature: it was still a computer program (excluded matter) incorporating a mathematical method (also excluded matter). So also the invention in Macrossan related to a way of making company formation documents and Mr Macrossan sought to claim it as a method using a data processing system. This was not permissible either: it was a computer program (excluded matter) for carrying out a method for doing business (also excluded matter).

[48] Fourth, it follows that it is helpful to ask: what does the invention contribute to the art as a matter of practical reality over and above the fact that it relates to a program for a computer? If the only contribution lies in excluded matter then it is not patentable.

[49] Fifth, and conversely, it is also helpful to consider whether the invention may be regarded as solving a problem which is essentially technical, and that is so whether that problem lies inside or outside the computer. An invention which solves a technical problem within the computer will have a relevant technical effect in that it will make the computer, as a computer, an improved device, for example by increasing its speed. An invention which solves a technical problem outside the computer will also have a relevant technical effect, for example by controlling an improved technical process. In either case it will not be excluded by Art 52 as relating to a computer program as such.

28. In AT&T/CVON the high court set out five so-called “signposts” to a relevant technical effect. These signposts were refined by the court of appeal in HTC v Apple, where it expressed the fourth signpost less restrictively. The five signposts are:

i) whether the claimed technical effect has a technical effect on a process which is carried on outside the computer;

ii) whether the claimed technical effect operates at the level of the architecture of the computer; that is to say whether the effect is produced irrespective of the data being processed or the applications being run;

iii) whether the claimed technical effect results in the computer being made to operate in a new way;

iv) whether the program makes the computer a better computer in the sense of running more efficiently and effectively as a computer;

v) whether the perceived problem is overcome by the claimed invention as opposed to merely being circumvented.

29. In HTC v Apple, the court explained that the signposts are useful in answering steps (3) and (4) of the Aerotel approach, although they will not necessarily be determinative in every case. The court emphasised that the signposts should not be treated as prescriptive conditions. Conversely, an invention is not automatically patentable if only one of the signposts is found to exist.

30. The principle of “substance not form” means all categories of claim should be considered in the same way when applying steps (2), (3), and (4) of Aerotel. For example, claims to a computer program, a programmed computer, and a computer-implemented method are each assessed by considering the task or process they perform. The requirement to consider the claimed task or process of a computer-implemented invention in this way was explained by the high court in Halliburton:

[32] Thus when confronted by an invention which is implemented in computer software, the mere fact that it works that way does not normally answer the question of patentability. The question is decided by considering what task it is that the program (or the programmed computer) actually performs. A computer programmed to perform a task which makes a contribution to the art which is technical in nature, is a patentable invention and may be claimed as such. Indeed (see Astron Clinica [2008] RPC 14) in those circumstances the patentee is perfectly entitled to claim the computer program itself.

[33] If the task the system performs itself falls within the excluded matter and there is no more to it, then the invention is not patentable (see Symbian paragraph 53 above). Clear examples are from the cases involving computers programmed to operate a method of doing business, such as a securities trading system or a method of setting up a company (Merrill Lynch and Macrossan). Inventions of that kind are held not to be patentable, but it is important to see why. They are more than just a computer program as such. For example, they self-evidently perform a task which has real world consequences. As Fox LJ said in Merrill Lynch (p569 at line 27), a data processing system operating to produce a novel technical result would normally be patentable. However, that is not the end of the analysis. He continued: “however it cannot be patentable if the result itself is a prohibited item” (i.e. a method of doing business). When the result or task is itself a prohibited item, the application fails.

[34] The reasoning in Merrill Lynch means that the computer implemented invention claimed there would not have been excluded from patentability if it were not for the combined effect of two exclusions in s1(2) - computer programs and (in that case) business methods. The cases in which patents have been refused almost always involve the interplay between at least two exclusions …

[35] The business method cases can be tricky to analyse by just asking whether the invention has a technical effect or makes a technical contribution. The reason is that computers are self-evidently technical in nature. Thus, when a business method is implemented on a computer, the patentee has a rich vein of arguments to deploy in seeking to contend that his invention gives rise to a technical effect or makes a technical contribution. For example, the computer is said to be a faster, more efficient computerized book keeper than before and surely, says the patentee, that is a technical effect or technical advance. And so it is, in a way, but the law has resolutely sought to hold the line at excluding such things from patents. That means that some apparently technical effects do not always count. So a computer programmed to be a better computer is patentable (Symbian) but as Fox LJ pointed out in relation to the business method exclusion in Merrill Lynch, the fact that the method of doing business may be an improvement on previous methods is immaterial because the business method exclusion is generic.

[36] The Aerotel approach is a useful way of cutting through the cases like Merrill Lynch, Macrossan and Gale in which more than one exclusion is engaged. Take a patent claim consisting of a claim to a computer programmed to perform a business method. What has the inventor contributed? If the answer is a computer program and method of doing business and there is nothing more present, then the contribution falls solely within the excluded subject matter. It can be seen not to be patentable at step 3, before one gets bogged down in the argument about whether a book keeping system running more efficiently on a computer is a technical effect. Following Aerotel the question has answered itself.

[37] The “better computer” cases - of which Symbian is paradigm example - have always been tricky however one approaches this area. The task the program is performing is defined in such a way that everything is going on inside the computer. The task being carried out does not represent something specific and external to the computer and so in a sense there is nothing else going on than the running of a computer program. But when the program solves a technical problem relating to the running of computers generally, one can see that there is scope for a patent. Making computers work better is not excluded by s1(2).

[38] What if the task performed by the program represents something specific and external to the computer and does not fall within one of the excluded areas? Although it is clear that that is not the end of the enquiry, in my judgment that circumstance is likely to indicate that the invention is patentable. Put in other language, when the task carried out by the computer program is not itself something within the excluded categories then it is likely that the technical contribution has been revealed and thus the invention is patentable. I emphasise the word “likely” rather than “necessarily” because there are no doubt cases in which the task carried out is not within the excluded areas but nevertheless there is no technical contribution at all.

[39] So in Merrill Lynch and Macrossan the computer programs were unpatentable because the task the program performed was a business method. In Gale the program was unpatentable because the task it performed was a mathematical method (albeit the reasoning was the other way round, starting from the mathematical method rather than the computer program aspect).

31. Accordingly, IPO practice is to examine whether an AI invention makes a contribution that is technical in nature by considering the task or process it performs.

Artificial neural networks

32. The patentability of artificial neural networks (ANNs) was considered by the high court in Emotional Perception AI Ltd v Comptroller-General of Patents. The invention in Emotional Perception provided music track recommendations to a user by passing the tracks through a trained ANN. This allowed the invention to make suggestions of similar music in terms of human perception and emotion, irrespective of the genre of the music.

33. To achieve this, the ANN of the invention was trained in a special way. It involved training the ANN by considering both natural language descriptions of music files and extracted physical properties of the music files. The natural language descriptions, for example the metadata relating to the music files, conveyed how they might be perceived by a human. The physical properties of the music files related to properties such as the tone, timbre, speed, and loudness of the music they contained.

34. Once the ANN was trained, the extracted properties of music tracks were passed through it. This produced outputs which were compared to a database from which recommendations of semantically similar tracks were produced. The recommendations were provided to an end user by sending a file and message.
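
For illustration, the recommendation step described in paragraphs 33 and 34 might be sketched as follows. This is a hypothetical sketch, not the claimed system: `trained_ann` is a placeholder for the trained network, and the database of embeddings is invented.

```python
import numpy as np

def trained_ann(properties: np.ndarray) -> np.ndarray:
    # Placeholder for the trained network that maps physical properties
    # (tone, timbre, speed, loudness, ...) into the semantic embedding space.
    return properties / np.linalg.norm(properties)

def recommend(track_properties: np.ndarray,
              database: dict[str, np.ndarray], n: int = 3) -> list[str]:
    """Embed a track and return the n most similar tracks in the database."""
    query = trained_ann(track_properties)
    scores = {name: float(query @ emb) for name, emb in database.items()}
    return sorted(scores, key=scores.get, reverse=True)[:n]
```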

35. The judgment noted that an ANN may be implemented, for example, as a physical box with electronics in it (see paragraph 14). Alternatively, an ANN may be implemented using a piece of software which enables a conventional computer to emulate the hardware ANN (see paragraph 18). The judgment considered both the hardware and software implementations and concluded that while both involve a computer, neither involves a computer program. Accordingly, the court held that as a matter of construction the claimed invention was not a computer program at all (see paragraph 61). The computer program exclusion was not invoked by the claimed invention.

36. The court held it was, therefore, unnecessary to consider whether the claimed invention involved a technical contribution. Nonetheless, the court went on to consider if the claimed invention would involve a technical contribution. The court accepted an argument that moving data outside the computer system, in the form of a file that is transferred, provided an external technical effect. When coupled with the purpose and method of selecting the file’s contents, this was a technical effect meeting signpost (i) (see paragraph 76). The court further held that a trained ANN is capable of being an external technical effect which prevents the computer program exclusion from applying (see paragraph 78).

37. Following the Emotional Perception judgment, patent examiners should not object to inventions involving ANNs under the “program for a computer” exclusion. To qualify as an invention involving an ANN, the invention must either claim an ANN itself or include claim limitations to training or using an ANN. Otherwise, the Emotional Perception judgment will not apply.

38. Examiners should consider whether other exclusions of section 1(2) apply to inventions involving ANNs. For example, devoid of any application, an ANN is an abstract mathematical model, so the mathematical method exclusion may apply in appropriate cases. Further, following Merrill Lynch, a computer-implemented invention (such as an ANN) may be rejected as a method of doing business as such. If an invention involving an ANN performs nothing more than a method of doing business, then it is excluded under the business method exclusion.

39. If a claimed invention is not directed to an ANN, its training, or its use, then the computer program exclusion must be considered. For example, if a claim merely refers to machine learning or training a model, then it engages the computer program exclusion. The allowability of such claims should be determined by asking whether the invention makes a relevant technical contribution.

Applied AI: performing processes or solving problems lying outside the computer

40. In paragraph 38 of Halliburton the high court observed that:

… if the task performed by the program represents something specific and external to the computer and does not fall within one of the excluded areas … that circumstance is likely to indicate that the invention is patentable.

41. In this context a computer-implemented invention makes a contribution that is technical in nature if it:

  • carries out or controls a technical process which exists outside the computer, or

  • contributes to the solution of a technical problem lying outside the computer

42. If an AI invention satisfies either of these conditions, then it likely reveals a technical contribution and is not excluded under section 1(2). AI inventions meeting either (or both) of these conditions likely produce the sort of technical effects indicated by signposts (i) and (v).

Carrying out or performing a technical process outside the computer

43. The paradigm computer-implemented invention embodying a technical process outside a computer is found in the decision of the EPO Board of Appeal in Vicom. The invention in Vicom concerned a method of image convolution which involved repeated selection and application of convolution filters using a mathematical error-minimisation technique. The convolution filters had a smaller dimension than those used by a known method. When the inventive image convolution was run on a conventional computer, it required far fewer calculations to give approximately the same result as the known method, so it produced image convolutions at an increased processing speed.
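
The kind of efficiency gain at issue can be illustrated (not reproduced) with simple box filters: applying a small 3×3 kernel twice gives exactly the same result as applying the equivalent 5×5 kernel once, while needing fewer multiplications per output value (2 × 9 = 18 rather than 25). The filters below are invented averaging kernels, not those of the Vicom patent.

```python
import numpy as np
from scipy.signal import convolve2d

image = np.random.default_rng(1).random((64, 64))

small = np.full((3, 3), 1 / 9)       # small kernel, applied twice
large = convolve2d(small, small)     # equivalent 5x5 kernel, applied once

twice_small = convolve2d(convolve2d(image, small, mode="valid"),
                         small, mode="valid")
once_large = convolve2d(image, large, mode="valid")

print(np.allclose(twice_small, once_large))  # True: same result, fewer multiplies
```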

44. As the court explained in AT&T/CVON, the board held that the inventive algorithm in Vicom was more than a mathematical method “as such” because:

… the mathematical method which underlay the invention was being used in a technical process which was carried out on a physical entity by technical means.

45. And it was more than a program for a computer “as such” because:

… what was claimed was not the computer program at all, but the process of manipulating the images. That process was a technical process and hence made a technical contribution.

46. The computer-implemented invention in Vicom made a technical contribution because it performed a specific technical process (image processing) lying outside the computer itself. The essential reasoning of Vicom is reflected in signpost (i).

47. Many applied AI inventions relate to image processing in some way and may make a technical contribution. For example, BL O/296/21 (Imagination Technologies) involved a combination of a general-purpose computer and a deep neural network (DNN) accelerator. The combination processed image data for computer vision problems more efficiently. The hearing officer found that this was a relevant technical effect and that a technical problem was solved. Signposts (i) and (v) pointed to patentability.

48. In BL O/0007/23 (Teledyne FLIR Commercial Systems) the invention included a synthetic image generator for generating a set of synthetic images. The synthetic images were used, along with a subset of images retained from a previous training data set, to generate an updated training data set. The updated data set was used for retraining a neural network to improve its performance in identifying and classifying objects present in images. The hearing officer found the claimed invention performed a specific task external to the computer, namely image object detection. The hearing officer found the contribution was made to the technical field of image recognition and that signposts (i) and (v) both pointed to allowability.
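
A hedged sketch of the data-set update described above might look as follows. All names are illustrative placeholders rather than the claimed system.

```python
import random

def updated_training_set(previous_set: list, synthetic_images: list,
                         retain_fraction: float = 0.5) -> list:
    """Combine newly generated synthetic images with a retained subset of the
    previous training set, ready for retraining the neural network."""
    retained = random.sample(previous_set,
                             int(len(previous_set) * retain_fraction))
    return retained + synthetic_images
```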

Solving a technical problem lying outside the computer

49. In Halliburton the high court considered the patentability of computer simulations by applying the modern-day Aerotel approach. The invention in Halliburton was concerned with improving the design of drill bits for drilling oil wells to increase their drilling efficiency and operational life. The claimed invention included iteratively modelling multiple designs of a drill bit (including its cutting elements) using finite element methods. The drilling of earth formations was simulated using the multiple designs to determine an optimised design for the drill bit.
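
The iterative design loop described above can be sketched as follows. In the actual invention the simulation step was a finite element analysis of drilling; here `simulate` is a stand-in cost function over invented design parameters.

```python
def simulate(design: dict) -> float:
    # Stand-in for finite element simulation of drilling with this design;
    # returns a predicted cost (for example wear), where lower is better.
    return (design["cutter_count"] - 37) ** 2 + 0.1 * design["blade_angle"]

def optimise_drill_bit(candidate_designs: list[dict]) -> dict:
    # Iteratively model each candidate design, simulate drilling, keep the best.
    return min(candidate_designs, key=simulate)

candidates = [{"cutter_count": n, "blade_angle": a}
              for n in range(30, 45) for a in (10, 15, 20)]
best_design = optimise_drill_bit(candidates)
```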

50. The court held that the invention in Halliburton was more than a computer program “as such” because it was:

… a method of designing a drill bit. Such methods are not excluded from patentability by Art 52/s1(2) and the contribution does not fall solely within the excluded territory.

51. And it was more than a mathematical method “as such” because:

the data on which the mathematics is performed has been specified in the claim in such a way as to represent something concrete (a drill bit design etc.).

52. Finally, the court confirmed the contribution was technical in nature because:

Designing drill bits is obviously a highly technical process, capable of being applied industrially … The detailed problems to be solved with wear and ability to cut rock and so on are technical problems with technical solutions. Accordingly finding a better way of designing drill bits in general is a technical problem. This invention is a better way of carrying that out. Moreover the detailed way in which this method works – the use of finite element analysis – is also highly technical.

53. Hence, the invention in Halliburton made a technical contribution. As well as being a technical process, it solved a technical problem (with drilling efficiency and operational lifespan) lying outside the computer. The reasoning in Halliburton indicates the presence of signposts (i) and (v).

54. Vicom and Halliburton are similar in that the data processed by the computer represented a physical entity external to the computer. In Vicom the data represented an image and in Halliburton it represented a drill bit. However, it is important to note that a computer-implemented invention may make a technical contribution even if the data it processes does not represent a physical entity.

55. For example, Protecting Kids the World Over (PKTWO) concerned an alert system for monitoring electronic communications data (for example email) for particular words and phrases. The system worked to ensure that users (for example children) were not exposed to inappropriate content or language. The high court held that the speed and reliability of the algorithm used for sampling and analysing expressions found in the electronic communications were improved. This in turn improved the speed and reliability of an alarm notification for alerting a user (for example a parent) to inappropriate communication.
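
The kind of monitoring at issue in PKTWO can be sketched simply. The scoring below is a trivial placeholder for the patented sampling and analysis algorithm; the watch list and threshold are invented.

```python
WATCH_LIST = {"meet me": 3, "secret": 2, "don't tell": 3}  # invented examples

def score(message: str) -> int:
    """Score a communication against the watch list."""
    text = message.lower()
    return sum(weight for phrase, weight in WATCH_LIST.items() if phrase in text)

def monitor(message: str, alert, threshold: int = 3) -> None:
    """Raise an alarm notification when the score reaches the threshold."""
    if score(message) >= threshold:
        alert(f"Inappropriate communication detected: {message!r}")

monitor("this is our secret, don't tell anyone", alert=print)
```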

56. The court held that, when viewing the claim in PKTWO as a whole, providing the alarm in this way was a relevant technical process. Thus, the court held:

The effect here, viewed as a whole, is an improved monitoring of the content of electronic communications. The monitoring is said to be technically superior to that produced by the prior art. That seems to me to have the necessary characteristics of a technical contribution outside the computer itself.

57. Accordingly, the court held that the contribution made by the invention in PKTWO

… does not reside wholly within the computer program as such exclusion. I think that conclusion is in accordance with the AT&T signposts. In particular I would say that the invention solves a technical problem lying outside the computer, namely how to improve upon the inappropriate communication alarm provided by the prior art.

58. Thus, PKTWO solved a technical problem lying outside the computer, indicating the presence of signposts (i) and (v).

59. As noted in paragraph 47, BL O/296/21 (Imagination Technologies) is also an example of applied AI being used to solve a technical problem outside the computer. The combination of a computer and a DNN accelerator provided the technical effect of processing image data for computer vision problems more efficiently. The hearing officer found that signposts (i) and (v) pointed to patentability.

Processes or problems lying outside a computer that are not technical in nature

60. The analysis presented above does not mean that every process (or solution of a problem) lying outside a computer reveals a technical contribution. In paragraph 33 of Halliburton, quoted above, the court observed that:

“If the task the system performs itself falls within the excluded matter and there is no more to it, then the invention is not patentable …  Clear examples are from the cases involving computers programmed to operate a method of doing business, such as a securities trading system or a method of setting up a company (Merrill Lynch and Macrossan) … When the result or task is itself a prohibited item, the application fails.”

61. Thus, when the contribution made by a computer-implemented invention consists purely of excluded matter, and there is no more to it, then it does not count as a technical contribution. In such cases the invention will be excluded under section 1(2).

62. A clear example is found in Merrill Lynch which related to a data processing system for making a trading market in securities. The court acknowledged that the invention had the real-world effect of producing an improved trading system. Although this was arguably an effect outside the computer, it did not count as a technical contribution. This was because the contribution consisted of nothing more than a method of doing business as such.

63. Similarly, in Macrossan, the invention involved producing the documents for use in forming a corporate entity such as a company. The court held that the task of the invention was “for the very business itself, the business of advising upon and creating an appropriate company”. This task consisted solely of a method of doing business “as such”.

64. Several examples of applied AI inventions have been refused for performing no more than a method of doing business as such. BL O/0517/23 (Nielsen Consumer) related to a machine learning model whose outputs adjusted a marketing strategy. In BL O/0315/23 (WNS Global Services (UK) Ltd) a classifier was used to output a “social equity index” measuring the social reputation of brands. BL O/0284/23 (Kinaxis) involved selecting machine learning models for forecasting sales of products. BL O/811/22 (Halliburton Energy Services Inc.) used AI for planning oilfield exploration.

65. A further example is BL O/0566/23 (Tomas Gorny) which used an unsupervised deep learning model to assign tickets to agents in a customer service centre. In BL O/0988/23 (Conquest Planning Inc.) an AI module was used to generate and rank modified financial plans for achieving certain financial goals. In BL O/774/22 (Flatiron Health Inc.) a machine learning model processed unstructured medical information and produced outputs indicating the suitability of an individual for participation in a medical trial. The invention was refused, among other things, as a method of doing business.

Processes that are not outside the computer and are not technical

66. Certain tasks or processes performed by a computer-implemented invention are not regarded as being outside a computer or technical in nature. Any effect which is no more than one produced by merely running a program (such as the mere manipulation of data) does not count as a technical contribution. Such effects fall solely within the program exclusion.

67. For example, in Autonomy one aspect of the claimed invention involved automatic text analysis. The text in an active window of a computer was analysed automatically and a list of links related to that content was generated. This aspect was held to be a paradigm example of what is caught by the exclusion of a program for a computer “as such”.

[40] In my judgment … automatic text analysis, comparison and results generation is a paradigm example of a case in which the contribution falls squarely within excluded matter, i.e. a program for a computer. The claimed contribution, so far as the first element is involved does not exist independently of whether it is implemented by a computer. On the contrary, it depends on a computer processing or displaying information in an active window, and on a search program to analyse it and to compare and generate results … The only effect produced by the invention is an effect caused merely by the running of the program, which consists of the manipulation of data. It is in short a claim to a better search program.

68. The invention in Autonomy (automatic text analysis) would not produce a technical effect external to a computer in accordance with signpost (i).

69. Examples of applied AI inventions involving nothing more than the analysis or generation of text have been refused under the computer program exclusion. In BL O/1047/22 (Hunter IP) a trained machine learning model was used to determine sentence style type. In BL O/829/22 (Google LLC) a trained machine learning model suggested to a user how they might reply to a received communication.

Conclusions

70. An AI invention is likely to reveal a technical contribution if:

  • it embodies or performs a technical process which exists outside a computer, or
  • it contributes to the solution of a technical problem lying outside the computer

71. When either of these criteria applies to an AI invention it indicates signposts (i) and/or (v) are relevant and the invention is not excluded. Several examples of patentable AI inventions meeting these criteria are found in the scenarios. See for example:

  • scenario 1 (ANPR system for recognising a vehicle registration number)
  • scenario 2 (monitoring a gas supply system for faults)
  • scenario 3 (analysing and classifying movement from motion sensor data)
  • scenario 4 (detecting cavitation in a pumping system)
  • scenario 5 (controlling a fuel injector in a combustion engine)
  • scenario 6 (measuring percentage of blood leaving a heart)

72. An AI invention is unlikely to reveal a technical contribution if the task or process it performs relates:

  • to items listed as being excluded under section 1(2) (for example a business method) and there is no more to it, or
  • to processing information or data and there is no more to it

73. The scenarios illustrate several examples of AI inventions that are not patentable for these reasons. See for example:

  • scenario 7 (automated financial instrument trading)
  • scenario 8 (analysing patient health records)
  • scenario 9 (identifying junk e-mail using a trained AI classifier)

Applied AI: making computers work better

74. Paragraph 37 of Halliburton explains that making computers work better is not excluded by section 1(2).

The “better computer” cases - of which Symbian is paradigm example - have always been tricky however one approaches this area. The task the program is performing is defined in such a way that everything is going on inside the computer. The task being carried out does not represent something specific and external to the computer and so in a sense there is nothing else going on than the running of a computer program. But when the program solves a technical problem relating to the running of computers generally, one can see that there is scope for a patent. Making computers work better is not excluded by s1(2).

75. In this context, decided case law shows a computer-implemented invention makes a technical contribution if:

  • it solves a technical problem lying within a computer, or

  • it defines a new way of operating a computer in a technical sense

76. If an AI invention satisfies either of these conditions, then it likely reveals a technical contribution and is not excluded matter. Any invention meeting either (or both) of these conditions will likely produce the sort of technical effects indicated by one or more of signposts (ii), (iii), (iv), and (v).

Solving technical problems lying within the computer

77. The paradigm computer-implemented invention solving a technical problem lying within a computer is Symbian. It concerned the programming of a dynamic link library (DLL) for storing functions common to the applications running on the computer’s operating system. In certain circumstances, adding functions to the DLL meant that existing applications using the DLL were unable to link to the added functions correctly, causing a malfunction. The inventive program had an “extension part” for the DLL which ensured that any application could select and link correctly to the desired functions in the DLL.
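
The linking problem Symbian addressed can be pictured with a toy model. Applications that link to a DLL by ordinal (position in its export table) break if newly added functions shift the ordinals; keeping the original table fixed and appending new functions to an “extension part” preserves the old links. This sketch illustrates the idea only and is not the actual operating system mechanism.

```python
dll_exports = ["open", "read", "close"]   # original ordinals 0, 1, 2 stay fixed
dll_extension = []                        # new functions are appended here

def add_function(name: str) -> None:
    dll_extension.append(name)            # never disturbs existing ordinals

def link(ordinal: int) -> str:
    return (dll_exports + dll_extension)[ordinal]

add_function("seek")
assert link(1) == "read"   # existing applications still link correctly
assert link(3) == "seek"   # new applications can reach the extension part
```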

78. The court held the invention was a

program which makes the computer operate on other programs faster than prior art operating systems enabled it to do by virtue of the claimed features.

79. This solved a technical problem lying within the computer:

because it has the knock-on effect of the computer working better as a matter of practical reality.

80. Thus, Symbian indicates the presence of signposts (iv) and (v). Signpost (ii) may also be present since the effect was achieved irrespective of the nature of the data being processed and the applications being run.

81. Symbian also approved two IBM decisions of the board of appeal, T 0006/83 and T 0115/85. These decisions are two further examples of programs solving technical problems concerned with the internal workings of a computer itself. Both decisions provided technical effects in a computer system which operated irrespective of the nature of the data being processed or the applications being run. Their essential reasoning is reflected in signpost (ii) (see paragraphs 21 to 31 of AT&T).

82. In marked contrast to Symbian and the two IBM decisions, Gale’s Application is the paradigm computer program that does not solve a technical problem with the internal workings of a computer. The invention was a new way of calculating square roots which was sympathetic to the operation of the underlying architecture of the computer.

83. For example, prior art methods of calculating square roots were said to rely on binary division operations. The problem with these prior methods was that, conventionally, division operations were not directly provided within binary systems. Conventionally, they had to be implemented by combining various binary “add”, “subtract”, “test” and “shift” functions. In Gale, the inventive method of calculating square roots eschewed division operations, relying instead on multiplication operations. The multiplication operations were inherently easier and faster to implement using the registers of a conventional computer, for example by using binary “shift” operations.
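
For illustration, a square root can be computed without any division at all, using only shift, add, subtract, and compare operations, all of which map directly onto the registers of a conventional CPU. The digit-by-digit routine below is not Mr Gale's actual algorithm, only an example of the general idea.

```python
def isqrt_shift_add(n: int) -> int:
    """Integer square root using only shift, add, subtract and compare."""
    bit = 1
    while (bit << 2) <= n:      # find the highest power of four not exceeding n
        bit <<= 2
    result = 0
    while bit:
        if n >= result + bit:
            n -= result + bit
            result = (result >> 1) + bit
        else:
            result >>= 1
        bit >>= 2
    return result

assert isqrt_shift_add(152399025) == 12345   # 12345 * 12345 = 152399025
```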

84. Yet the court of appeal concluded the invention in Gale was no more than a program for a computer as such. This was because its instructions did “not solve a ‘technical’ problem lying within the computer”. The court held that the instructions did no more than:

prescribe for the cpu in a conventional computer a different set of calculations from those normally prescribed when the user wants a square root.

85. The court also held that the instructions did not:

define a new way of operating the computer in a technical sense.

86. So, the court decided that the effect produced in Gale did not count as a technical contribution. It was no more than an effect produced by merely running a new program. In other words, the effect of the invention in Gale extended no further than it being just a better or well-written program. This is an effect which is excluded as a program for a computer as such under section 1(2).

87. Viewed through the modern-day signposts, the invention of Gale would not indicate any effect that is “technical” in the sense of signposts (ii), (iii), (iv), or (v).

A new way of operating a computer in a technical sense

88. As we have seen, Gale’s Application was refused (in part) because its program for calculating square roots did “not define a new way of operating the computer in a technical sense”. This principle was distilled into the wording of signpost (iii) which asks:

whether the claimed technical effect results in the computer being made to operate in a new way.

89. Thus, signpost (iii) is indicated by a computer-implemented invention when it is a new way of operating a computer in a relevant technical sense.

90. However, it does not follow that a new program running on a computer is a new way of operating a computer in a technical sense. The new program must produce some sort of technical effect which is not caught by the program exclusion. As explained in Aerotel, a “technical effect which is no more than the running of the program is not a relevant effect”.

91. The new program for calculating square roots in Gale did not operate the computer in any relevant technical sense that avoided the program exclusion. Gale did not operate a computer in a new way beyond just being a better or well-written program for a conventional computer. Gale is a negative example of signpost (iii).

92. The wording of signpost (iii) is agnostic as to whether the invention works at the application level. This contrasts with the wording of signpost (ii), for example. Signpost (ii) requires “a technical effect produced irrespective of the data being processed or the applications being run”. This suggests it is unlikely an application program would meet signpost (ii). However, the wording of signpost (iii) is not restricted in this way, suggesting that an application program may produce a technical effect meeting signpost (iii).

When does a computer-implemented invention define a new way of operating a computer in a technical sense?

93. Unfortunately, there are few (if any) explicit examples of the positive application of signpost (iii) in decided case law. This means there is uncertainty about the effect a computer-implemented invention (or an AI invention) must make to be a positive example of signpost (iii). Paragraph 42 of Gemstar suggests it is not enough to make a computer “work differently in the sense of processing data in a different way”. Instead, a suitable technical effect involves making a computer “work better, faster or differently in that sort of performance sense”.

94. Signpost (iii) may apply when a functional unit of a computer (for example a bus, GPU, or accelerator) has new programming. The unit must be programmed to operate in a new way that is technical in nature. The functional unit must produce a technical effect going beyond the mere execution of its new program. It must be better, faster, or work differently in a performance sense, not merely process data differently. When this is true, the newly programmed functional unit has the knock-on effect of making the computer operate in a new way in a technical sense.

95. Alternatively, the inclusion of a newly programmed functional unit may create a new arrangement of hardware within the computer. For example, the judgment in Lenovo observed that the facts of Aerotel could be an example of signpost (iii). In Aerotel, a new “special exchange” was implemented by a new application program running on a conventional hardware unit of a wider computer system. Incorporating the new “special exchange” into the otherwise conventional computer system was held to create “a new physical combination of hardware” which was not excluded.

96. The decision in BL O/066/06 (ARM Limited) might be another example of signpost (iii). The speed and accuracy of a compiler was improved by using performance data from a non-invasive trace unit. The trace unit monitored the execution of a compiled program and used performance data to control the workings of the compiler. The hearing officer found the contribution over the known art was technical and could not be regarded as merely a computer program as such. This outcome may be contrasted with BL O/173/08 (Intel Corporation) in which a vectorising (parallelising) compiler only made a contribution to improving a program.

Conclusions

97. An AI invention is likely to reveal a technical contribution if:

  • it solves a technical problem lying within a computer, or
  • it is a new way of operating a computer in a technical sense

98. When either of these criteria applies to an AI invention it indicates one or more of signposts (ii), (iii), (iv) or (v) are relevant and the invention is not excluded.

99. The scenarios illustrate examples of AI inventions which solve technical problems lying within a computer system in accordance with one or more of signposts (ii), (iv) or (v). See for example:

  • scenario 10 (cache management using a neural network)
  • scenario 11 (continuous user authentication)
  • scenario 12 (virtual keyboard with predictive text entry)

100. The scenarios also illustrate examples of AI inventions causing new operation of a computer in a relevant technical sense, according to signpost (iii). See for example:

  • scenario 16 (processing neural network on a heterogeneous computing platform)
  • scenario 17 (special purpose processing unit for machine learning computations)
  • scenario 18 (a multiprocessor topology adapted for machine learning)

Core AI

101. Core AI inventions should be examined in the same way as any other computer-implemented invention by considering each case on its own merits. The guidelines for assessing applied AI inventions that are set out above apply equally to core AI inventions. If a core AI invention reveals a relevant technical contribution, then it will not be excluded under section 1(2).

102. However, in contrast to applied AI inventions, the advance a core AI invention makes is necessarily limited to the field of AI itself. The advance made will lie in the models, algorithms, or mathematical methods constituting the core AI invention. Core AI inventions are not concerned with the real-world application of those models and algorithms to technical problems external to, or lying within, a computer. This means it is unlikely that signposts (i), (ii), and (iv) will point to allowability for core AI inventions.

103. Nonetheless, a core AI invention may still meet signposts (iii) or (v). However, as discussed above, there is some uncertainty about when a computer-implemented invention (or an AI invention) might meet signpost (iii). The facts of Gale and Aerotel indicate that application programs (such as core AI programs) may produce technical effects in accordance with signposts (iii) and (v).

104. For example, a core AI invention may make a technical contribution if:

  • it causes a functional unit of a computer (for example a GPU or accelerator) to work in a new way in a technical sense, or

  • it creates a new physical combination of hardware within the computer

105. If either of these criteria applies, then the core AI must still produce a technical effect falling outside the excluded subject matter of section 1(2). For example, the effect must be something more than an effect arising from a better algorithm or well-written program running on a conventional computer. The core AI must bring about a change to the technical operation of a computer, not just a change to an underlying algorithm or mathematical method.

106. An example of a core AI invention being excluded by the high court is found in Reaux-Savonte v Comptroller General of Patents. The invention in Reaux-Savonte was an “AI genome”. This was understood to be a hierarchical or a modular arrangement of computer code facilitating its evolution over time. The inventive computer code could modify, adapt, change, and improve over time in the same way that biological code evolves.

107. The court upheld the hearing officer’s finding that the invention was a program for a computer “as such” and was not technical in nature. Amongst other things, the court upheld the hearing officer’s findings that signposts (iii) and (v) were not shown. For example, the hearing officer found that even if the applicant’s program was new, it did not meet signpost (iii):

… a computer system operating on new code does not imply that the system works in any way differently to how it would with the old code. I have been unable to find anything in the application that suggests that a computer system is being made to operate in a new way.

108. In BL O/390/22 (IBM Corp.) the invention was aimed at improving active learning of entity resolution rules to scale better over large data sets. Entity resolution relates to finding records in a data set that refer to the same entity across different data sources. It may be used for deduplication in a single database or for matching entities of different databases. The claimed invention included conventional hardware elements (distributed memory and disk cache hierarchy) to perform entity resolution more efficiently. However, the operation of these hardware elements was unchanged, so a computer did not operate in a new way and signpost (iii) did not assist.

109. BL O/1045/22 (Institute for Information Industry) related to a federated machine learning system. A host device updated a model based on training results received over multiple training rounds from many client devices. The training speeds of the client devices were different, so the host device updated the model using a threshold value. The value indicated a difference in the number of training rounds between client devices. When the difference was below the threshold, the host updated its model without waiting for results from the client having a lowest training round.

110. In addition, the time taken to receive a training result from a client was considered. If it was greater than a pre-set time value, then the host would no longer receive training results from that client. The hearing officer found these features were policy-based decisions about when to update the host model. They were not concerned with a technical process and did not solve a technical problem. The invention was found to be excluded as a program for a computer as such.
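
The host-side logic described in paragraphs 109 and 110 might be sketched as follows; all names and values are placeholders, not the claimed system. The sketch makes the hearing officer's point visible: the decisions are scheduling policy rather than a technical process.

```python
def should_update(client_rounds: dict[str, int], round_gap_threshold: int) -> bool:
    """Update the host model when the spread in client training rounds is small."""
    gap = max(client_rounds.values()) - min(client_rounds.values())
    return gap < round_gap_threshold

def active_clients(response_times: dict[str, float], time_limit: float) -> set[str]:
    """Stop accepting training results from clients slower than the pre-set limit."""
    return {c for c, t in response_times.items() if t <= time_limit}

rounds = {"client_a": 12, "client_b": 10, "client_c": 11}
print(should_update(rounds, round_gap_threshold=3))   # True: gap of 2 < 3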

Core AI scenarios

111. The scenarios illustrate examples of core AI inventions that might be seen to operate a computer in a new way in the technical sense of signpost (iii). See for example:

  • scenario 16 (processing neural network on a heterogeneous computing platform)
  • scenario 17 (special purpose processing unit for machine learning computations)
  • scenario 18 (a multiprocessor topology adapted for machine learning)

112. The scenarios also illustrate core AI inventions involving artificial neural networks. Following the judgment in Emotional Perception, the computer program exclusion does not apply to them, so they are not excluded. See for example:

  • scenario 13 (optimising a neural network)
  • scenario 14 (avoiding unnecessary processing using a neural network)
  • scenario 15 (active training of a neural network)

Training AIs and data sets

113. An AI invention may rely on a model that requires training with a set of training data before it can be used for its intended application or purpose. This process of training may be referred to as machine learning. Training methods and machine learning methods may also be categorised as either applied AI or core AI inventions. It follows that they should be examined in the same way as applied AI or core AI, using the guidelines above. Inventions involving the training of AI models are not excluded if they reveal a relevant technical contribution to the known art.

114. A useful analogy for thinking about inventions involving the training of AIs is that of calibration. Certain technical devices or functions may require calibration before they can be used accurately for their intended technical purpose. Examples are devices having a sensor, such as a thermometer or a touch-sensitive display. These sorts of devices require calibration to provide accurate estimates of physical parameters (for example, to measure temperature or to detect touches on a touch-sensitive display). This is true whether the devices are implemented in hardware or software (or some combination of both).
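
To make the analogy concrete, a minimal sketch of fitting a linear calibration for a temperature sensor is given below. The parameters are estimated from reference measurements in much the same way that a model's parameters are fitted from training data. The readings and the linear model are invented assumptions.

```python
# Illustrative sketch only: fitting a linear correction for a temperature
# sensor from reference measurements, in the same spirit as fitting a
# model's parameters from training data. All readings are invented.

def fit_linear_calibration(raw, reference):
    """Least-squares fit of reference = scale * raw + offset."""
    n = len(raw)
    mean_x = sum(raw) / n
    mean_y = sum(reference) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(raw, reference))
    var = sum((x - mean_x) ** 2 for x in raw)
    scale = cov / var
    offset = mean_y - scale * mean_x
    return scale, offset

raw_readings = [10.2, 20.9, 31.5]   # uncalibrated sensor output
true_temps = [10.0, 20.0, 30.0]     # reference thermometer (deg C)

scale, offset = fit_linear_calibration(raw_readings, true_temps)
print([round(scale * r + offset, 2) for r in raw_readings])
```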

115. Under the Aerotel approach, a computer-implemented method of calibration that makes a technical contribution to the art would be a patentable invention. By analogy, it follows that methods of training AI models or machine learning for achieving a specific technical purpose may also make a technical contribution. Each case should be examined on its own merits.

116. In practice, there are many ways in which methods of training AIs and machine learning may be claimed. For example, an optional training step or method may be claimed in a dependent claim of a patentable AI invention. An independent claim directed to training an AI may share an inventive concept with another independent claim for a patentable use of the trained AI. Alternatively, a method of training an AI model may make a technical contribution in its own right and may be claimed as such.

Data sets

117. Methods of training AI models and machine learning rely on training data, often referred to as a “data set”. There are several ways in which patent protection for the constituent features of data sets might be considered.

118. Firstly, the use of a data set may be explicitly (or implicitly) claimed as a constituent feature of a training method. If the training method makes a technical contribution, then the data set will be protected by being part of the patentable method. Secondly, a patentable innovation may lie in a method of generating or improving a data set (a minimal illustrative sketch of such a method follows paragraph 119). If the method makes a technical contribution, then it is patent-eligible.

119. Thirdly, the constituent features of a data set itself may be claimed directly, as a data set characterised by its content. This may involve claiming the information the data represents, its organisation, or its delivery (for example, on paper or in some computer-readable form). However, it is unlikely that a claim to a data set itself can be shown to meet all four requirements of section 1(1) for a patentable invention. For example, claims to data sets characterised solely by their information content are likely to be excluded as the presentation of information as such.
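
As a purely illustrative sketch of the second way described above (a method of generating or improving a data set), the example below expands a small labelled data set with noisy copies of each trace. The traces, labels, and noise model are invented assumptions and are not drawn from any cited case.

```python
# Illustrative sketch only: a method that generates an expanded data set
# by adding noisy copies of labelled sensor traces. The traces, labels,
# and noise model are invented assumptions, not taken from any cited case.

import random

def augment(traces, copies=2, noise=0.05, seed=0):
    """Expand a labelled data set with jittered copies of each trace."""
    rng = random.Random(seed)
    augmented = list(traces)
    for signal, label in traces:
        for _ in range(copies):
            noisy = [s + rng.gauss(0.0, noise) for s in signal]
            augmented.append((noisy, label))
    return augmented

data_set = [([0.1, 0.4, 0.9], "cavitation"), ([0.0, 0.1, 0.1], "normal")]
print(len(augment(data_set)))  # 2 originals + 4 noisy copies = 6
```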

120. The scope of the presentation of information exclusion was considered by the high court in Gemstar v Virgin. The court noted that “if the presentation of information has some technical features over and above the information and its delivery, then it might be patentable”. The court distinguished “the content or its mere delivery”, which is excluded, from “that material plus some additional technical aspect of its delivery”, which may be patentable. The court concluded, “So what achieves patentability is some real world technical achievement outside the information itself.”

121. The conclusions in Gemstar are consistent with the subsequent judgment of the high court in Garmin v Philips. In Garmin, the court held that “the key point is to ensure that the claimed feature is not in substance a claim to information content.”

122. Thus, if a claim to a data set is characterised only by its content and a conventional means or process for presenting it (for example, an electronic file), then it is likely excluded as the presentation of information. It is unlikely that claiming a data set in this way provides any real-world technical achievement outside the information itself.

123. The practice of the IPO is that any claim to a data set will be treated on its own merits. However, unless there is a persuasive reason that the claimed data set makes, as a matter of substance, a real-world technical achievement outside the information it holds, then such a claim is likely to be excluded under section 1(2)(d) as the presentation of information as such.

Scenarios involving the training of AI inventions

124. The scenarios illustrate examples of methods of training AI models that reveal a technical contribution. See for example:

  • scenario 4 (detecting cavitation in a pumping system)
  • scenario 6 (measuring percentage of blood leaving a heart)
  • scenario 11 (continuous user authentication)
  • scenario 18 (a multiprocessor topology adapted for machine learning)

125. An example of a method of training a neural network that is patentable following the Emotional Perception judgment is:

  • scenario 15 (active training of a neural network)

Hardware-only implementations of AI inventions

126. An AI invention may be claimed in hardware-only form which does not rely on a program or a programmable device in any way. For example, a claim may be directed to new functional or constructional features of dedicated or special-purpose electronic circuitry. If an AI invention is claimed in this way, then it is likely the program for a computer exclusion of section 1(2) is not engaged. Moreover, the mathematical method exclusion is usually avoided because a circuit implementing a mathematical method is more than a mathematical method as such and is technical.

127. However, it is emphasised that merely drafting a claim to cover both software and hardware implementations does not necessarily avoid the exclusions set out in section 1(2). It is a long-standing principle of patent law that all embodiments embraced by a claim must be patentable, otherwise the claim is bad (see MoPP 1.04). If a claim covers a software implementation, then it should be assessed by asking whether it makes a technical contribution.

128. However, in Emotional Perception, the court held that an artificial neural network (ANN), whether implemented in hardware or software, is not a computer program.

129. Examples of hardware-only implementations of mathematical methods being allowable are found in BL O/420/21 (Imagination Technologies Limited). The hearing officer interpreted the phrases “fixed function circuitry” and “dedicated hardware” appearing in the claims in the context of the relevant descriptions and drawings. The hearing officer construed these terms as meaning an arrangement of gates, transistors, and registers that achieves a specific function. Hence, the hearing officer found the claimed inventions protected (new) specific pieces of hardware which were not programmed or programmable in any way. The hearing officer concluded the claimed inventions were technical in nature and could not be a program for a computer or a mathematical method as such.

130. The hearing officer also distinguished the claimed inventions from the situation in Gale’s Application. In Gale, computer-program instructions for execution by a processor of a conventional computer were stored in hardware form (on a ROM). These instructions were held, as a matter of substance, to be caught by the program exclusion.

Conclusions

131. An AI invention likely reveals a technical contribution if claimed in a hardware-only form which does not involve a computer program or a programmable device.

Sufficiency

132. Section 14(3) of the Patents Act 1977 requires that:

The specification of an application shall disclose the invention in a manner which is clear enough and complete enough for the invention to be performed by a person skilled in the art.

133. The relevant principles to be applied when determining whether a patent application satisfies this section are summarised in Eli Lilly v Human Genome Sciences:

The specification must disclose the invention clearly and completely enough for it to be performed by a person skilled in the art. The key elements of this requirement which bear on the present case are these:

(i) the first step is to identify the invention and that is to be done by reading and construing the claims;

(ii) in the case of a product claim that means making or otherwise obtaining the product;

(iii) in the case of a process claim, it means working the process;

(iv) sufficiency of the disclosure must be assessed on the basis of the specification as a whole including the description and the claims;

(v) the disclosure is aimed at the skilled person who may use his common general knowledge to supplement the information contained in the specification;

(vi) the specification must be sufficient to allow the invention to be performed over the whole scope of the claim;

(vii) the specification must be sufficient to allow the invention to be so performed without undue burden.

134. These are the relevant principles to be applied when considering the sufficiency of disclosure of AI inventions. Whether an AI invention meets these disclosure requirements is decided by considering each case on its own merits.

135. For example, in BL O/0007/23 (Teledyne FLIR Commercial Systems) the hearing officer considered sufficiency and concluded:

I am in no doubt that performing such a technique in a practical embodiment of the invention would clearly be within the realms of what the relevant skilled person could achieve without further instruction.

136. Likewise, the extent to which a training data set should itself be disclosed is a matter to be decided by considering each case on its own merits. For example, an EPO board of appeal considered whether an invention relying on a data set was sufficiently disclosed in T 0161/18 (Äquivalenter Aortendruck/ARC SEIBERSDORF). The decision both reflects, and is consistent with, the principles set out in Eli Lilly v Human Genome Sciences. It highlights the importance of sufficiently disclosing how an AI invention relies upon its training data set. A patent application should teach those details in a manner that enables the invention to be worked across its scope without undue burden.

137. For example, in paragraph 2.2 of T 0161/18 the board stated that:

the application does not disclose which input data are suitable for training the artificial neural network according to the invention, or at least one dataset suitable for solving the present technical problem. The training of the artificial neural network can therefore not be reworked by the person skilled in the art and the person skilled in the art therefore cannot carry out the invention.