One of the two classic excuses to hijack control of our devices and our computing: an attack on libre software (they don’t care about porn). Next, they’ll be banning more math, encryption, all over again.
Good luck fighting that. Pandora’s box has been opened. The cat is out of the bag. The horse has left the barn. Got it, Congress?
As long as AI engines are available and training material is easily accessible, an AI for just about any purpose can be trained.
I don’t think anyone believed this would render AI incapable of making nonconsensual porn. The point is to make creating it illegal.
It’s not working for piracy. And this would seem even harder to fight than piracy, since it can be done on a single computer, all by yourself, without connecting to any other machine or distributing any files.
Again, the idea isn’t to render it physically impossible to occur or to stop it from ever happening. There’s probably not any law that fits either of those criteria. The point is to make it a crime. It shouldn’t be legal to make realistic nonconsensual porn of another person. If it’s discovered that you’ve done that, there should be legal consequences.
And yet we still put stop signs where no police cars patrol. Because people will avoid bad behavior for three main reasons:
- They know it’s wrong
- They know it’s illegal
- They know they might get caught
Laws help with #1 and #2. If you’re telling me the only one that factors into your moral calculations is #3, then you might wanna check to see if your soul is still under warranty.
So does this mean there will be no holodecks like Quark’s in the future?
Barclay will be very disappointed.
They hate it for different reasons, though, and the devil is in the details.