You said you can't see a legitimate use, but clearly there are legitimate uses (the "no legitimate use" argument is also used to justify bad drug policy, for example, so we should be skeptical of it). As to whether we should allow it, I don't see how we have a choice. The models are already out there. Even if they weren't, it gets cheaper every year to train new ones, and eventually today's training supercomputers will be tomorrow's commodity hardware. The whole idea of AI "fingerprinting" is backwards anyway: you don't fingerprint content to prove it is inauthentic, you sign it to prove it is authentic.
> The models are already out there. Even if they weren't, it becomes cheaper every year to train new ones,
Yes, let's just give up as bad actors undermine society, scam everyone, and generally profit from us.
> You sign that it is authentic.
Signing denotes ownership: a signed message lets you prove where it came from. A service should own the shit it generates.
Which is the point: if I cannot reliably tell what is generated, how is a normal person supposed to? Providing a mechanism for normal people to verify provenance is a reasonable ask.
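For concreteness, here is a minimal sketch of what sign-to-prove-authenticity could look like, using Ed25519 from Python's `cryptography` library. The service name and message are invented for illustration, and a real deployment would also need trustworthy key distribution (e.g. public keys published by the service) on top of this:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The service holds a private key; anyone with the matching
# public key can verify what the service produced.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Hypothetical generated output the service wants to take ownership of.
message = b"This text was generated by ExampleAI on 2024-01-01."
signature = private_key.sign(message)

# A normal person (or their client software) checks provenance:
try:
    public_key.verify(signature, message)
    print("Signature valid: this output came from the keyholder.")
except InvalidSignature:
    print("Signature invalid: provenance cannot be confirmed.")
```

Note that verification needs only the service's public key, so anyone can check provenance, while only the service can produce valid signatures. That is the asymmetry fingerprinting lacks: a detector guesses at inauthenticity, while a signature proves authenticity.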