Indeed, facts are part of the moral discussion in the ways you outlined. My objection was that merely listing some facts/opinions about what AI can do right now is not enough for that discussion.
I wanted to make this point here explicitly because lately I've seen this complete erasure of the moral dimension from AI and tech, and to me that's a very scary development.
Get with the program, dude. Where we're going, we don't need morals.
> because lately I've seen this complete erasure of the moral dimension from AI and tech, and to me that's a very scary development.
But isn't that exactly how the "is-ought problem" manifests? If morals are "oughts", then oughts are goal-dependent, i.e. they depend on personally defined goals. To you it's scary; to others it's the way it should be.