What a monumentally stupid idea it would be to put sufficiently advanced, intelligent autonomous machines in charge of things while ignoring any such concerns. But alas, humanity never seems to learn without paying the price first.
Morality is a human concern? Lol, it will become a non-human concern pretty quickly once humans no longer hold a monopoly on violence.