If the reality of things, the simple truth, is able to "influence" Americans, does it really matter who brought that truth up?
Would you prefer Americans to be ignorant about certain topics, or to be informed even if that comes at the cost of reduced approval of the government?
What if, and hear me out, China didn't limit its propaganda to the truth?