"and can't be easily extracted" is doing a lot of work there. People are very good at reverse-engineering. There would soon be a black market for 'clean' private keys that could be used to sign any video you want.
There would also be a requirement for all playback software to actually verify the signatures against the right public keys, and for all the parties involved in the process to be acting in good faith. Not only would you have a black market for individuals selling 'clean' keys, but you'd likely have nation states with an interest pressuring local manufacturers to give them backdoors.
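To make the attack surface concrete, here's a minimal sketch of the kind of signing scheme being discussed, using Python's `cryptography` package and an Ed25519 key pair purely for illustration (no real camera firmware works exactly like this):

```python
# Hypothetical device-signing sketch: the camera holds a private key and signs
# the captured bytes; a player or court verifies against the public key.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# In the proposed scheme this key would live in tamper-resistant hardware;
# here it is just generated in memory for the example.
device_key = Ed25519PrivateKey.generate()
public_key = device_key.public_key()

video_bytes = b"...captured footage..."     # placeholder for the real stream
signature = device_key.sign(video_bytes)    # "this came off this sensor"

# Verification step that every player/verifier would need to perform honestly.
try:
    public_key.verify(signature, video_bytes)
    print("signature checks out")
except InvalidSignature:
    print("tampered or unsigned")
```

The point of the thread is visible right in the sketch: anyone who extracts `device_key` (or is handed a backdoored one) can sign arbitrary bytes, and the verification step has no way to tell the difference.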
We'd probably already see a lot of that with SSL if it weren't so unimportant from a political perspective[1]... but if the thing being secured could directly boost or damage some prominent politician, the level of pressure would be on a whole different scale.
1. And we might still have that corruption of SSL when it comes to targeted phishing attacks.
There's also always the "analog loophole". Display the AI-generated video on a sufficiently high-resolution, wide-gamut display, record it with whatever device has convenient specs, then do some light post-processing to fix moiré/color/geometry. This would likely be detectable, but it could shift the burden of (dis-)proof to the defendant, who might not have the money for the expert witnesses required to properly argue the technical merits of their case.
More likely, the signing would have to use compression-resistant steganography (a watermark embedded in the pixels themselves); otherwise it's pretty easy to just remux or re-encode the video and strip the metadata.
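For a sense of how little effort "strip the metadata" takes: a plain remux already drops container metadata without touching a single frame. A quick sketch, assuming ffmpeg is installed and "signed.mp4" is a hypothetical file whose signature lives in its metadata:

```python
# Remux with metadata removed: identical audio/video bytes, no container tags.
import subprocess

subprocess.run(
    [
        "ffmpeg", "-y", "-i", "signed.mp4",
        "-map_metadata", "-1",   # drop all container/stream metadata
        "-c", "copy",            # copy the streams without re-encoding
        "stripped.mp4",
    ],
    check=True,
)
```

A full re-encode goes further and changes every byte of the streams, which is why only a watermark that survives recompression has any chance of persisting.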