One question that needs to be answered is: can an AI’s creation be protected by copyright and patents?
This issue is multifaceted: legal experts are trying to apply existing law to fast-evolving circumstances, and it does not always fit. National legal systems also differ, so technology companies need to take a global perspective to understand the full implications.
For example, French copyright law was largely created to protect individual authors above all, and is even reluctant to recognise ownership of copyright by legal persons (as opposed to individuals); the standard for a work to attract copyright protection is that it bears the imprint of the author’s personality. On that basis, authors must be individuals, and an AI cannot hold copyright.
The outcome is the same in the UK: the Copyright, Designs and Patents Act 1988 provides that, in the case of a literary, dramatic, musical or artistic work which is computer-generated, “the author shall be taken to be the person by whom the arrangements necessary for the creation of the work are undertaken”.
Another way to protect intellectual property is through patents, and here the law is even clearer. Patent law requires inventors to be individuals who contributed to the conception of an invention or its reduction to practice.
For example, if an AI created an entirely new semiconductor chip, it could not be protected by a patent unless a human intervened in the creative process, such as the person who programmed the AI. In practice, it is now common for the owner, founder or head of the R&D department of the company that owns the AI to be named as the inventor to deal with this eventuality.
So again, under current laws, an AI cannot own a patent.
IP rights
There is also the question of whether an AI can infringe existing IP rights.
The days when every AI development needed human intervention to ‘coach’ the machine through its processes are gone. AI systems can now modify their own code, and there is a risk that the resulting code infringes someone else’s rights.
This raises the question of who is liable for that infringement. At present, the owner, developer, programmer or manufacturer of the AI is likely to be held ultimately responsible for its actions.
Promisingly, some legal areas relating to AI, including liability, are gaining significant attention from European lawmakers. Last February, MEPs asked the European Commission to propose rules on AI to establish liability for accidents. A group of experts subsequently submitted an open letter against the proposal earlier this year. There have also been various debates on whether to grant AI legal personhood.
If robots were granted full legal personhood in future, the situation would be quite different. If an AI with personhood status independently created a work of art that infringed another’s rights, the usual infringement rules would apply. Suing a robot may sound like far-fetched science fiction, but legal personality and the ability to own assets go hand in hand, so it would be theoretically possible once legal personality is granted.
However, an additional legal change would be needed to open up this possibility. IP laws as they stand do not recognise an AI’s right to invent a new piece of technology that can be patented, or to create a work of art that can be copyrighted. The law still requires human intervention before creation is deemed to have taken place. This matters because, so far, IP legislation is moving at a much slower pace than the law on liability.
‘Human’ intervention
These issues will become more prevalent as technology moves from Soft AI (non-sentient artificial intelligence focused on one narrow task) to Hard AI (artificial general intelligence with consciousness, sentience and mind).
We may reach a point where AI is as smart as a human and requests the same rights as people, as dramatised in the novels of the late Isaac Asimov.
In future, there might be a fourth law added to Asimov’s Laws of Robotics: ‘Robots have legal personality and are responsible for their actions’. But for now, we have some way to go. Where existing law cannot be applied to new situations, new laws need to be created. Identifying a legal status for AI and harmonising that status around the world would provide an answer to this exciting challenge.