Which, in the case of digital replicas that can impersonate real people, may be worth considering. Not blanket legislation as proposed here, but something that signals the downstream risks to developers and discourages undesired uses.
Unless they released a model named "Tom Cruise-inator 3000," I don't see any way to legislate that intent that would give developers any assurance that a misused model couldn't land them in significant legal peril. So anything in this ballpark has a huge chilling effect, in my view. I think it's far too early in the AI game to even be putting pen to paper on new laws (the first AI bubble hasn't even popped, after all), but I understand that view is not universal.
Then only foreign developers will be able to work with these kinds of technologies... the tools will still be built; they'll just be built by people outside the jurisdiction.