theForum

New offences to be created


https://forum.unlock.org.uk/Topic35201.aspx

By punter99 - 3 Feb 25 11:16 AM

Although they haven't yet been written, it sounds as if new laws are on the way to criminalise possession. This time they are targeting the AI tools used to create images, which will be difficult because those tools are widely available and are used to create other things as well.

When it comes to deepfakes, the law changes originally included soliciting deepfake images, but now the scope seems to be expanding to include the technology itself. But as with all these things, it still takes some user input to create something illegal. Photoshop, for example, could turn an innocent photo of a child into an illegal image, so will possession of Photoshop become a crime?

Presumably the prosecution would need to show that the AI could only produce illegal images and nothing else. Otherwise, all sorts of law-abiding users could potentially be arrested. Alternatively, they could just focus on the demand side, and that's where an offence of soliciting images makes sense, because unlike possession, it connects the viewer of the image to the person creating it.
By AB2014 - 6 Feb 25 11:46 AM

punter99 - 6 Feb 25 10:52 AM
ED - 3 Feb 25 2:32 PM
The only way such a law could be introduced and policed in any meaningful way would be to make the actual illegal act the entering of the prompt into the AI program. Anything else will be unworkable.

For example: It is an offence, contrary to S1 of the so and so Act 2025, to "enter a prompt message into an AI image/video generation tool with the intention of causing said tool to produce a sexually explicit image/pseudo-image of a child".


Any attempt by Government to introduce any other kind of law targeting this will wholly fail. Mark these words.

That might be difficult, unless the AI keeps a record of all the prompts that have been entered. It's also the case that the software is unpredictable, because the output doesn't necessarily resemble the prompt. The obvious solution is to look at the images that were created, because it seems unlikely that anyone arrested would have the tools to create illegal images on their device but not have any images. But then we already have laws for images...

We do have laws around images, and they are all based on strict liability. If you have the images, it's on you. If you requested something legal but the AI gave you something illegal, I suspect it would still be treated as strict liability, even if the accused could show that what they requested was legal.