>Thinking and consciousness don’t by themselves imply emotion and sentience...
Sure, but all the examples of conscious and/or thinking beings that we know of have, at the very least, the capacity to suffer. If one is disposed to take these claims of consciousness and thinking seriously, then it follows that AI research should, at minimum, be more closely regulated until further evidence can be discovered one way or the other. Because the price of being wrong is very, very high.
Probably because those examples arose in an environment full of harm, the Earth, and thus had an incentive to evolve the capacity to suffer. There is no analogous case for AI today, and constructing a Pascal's wager to justify such risk minimization is not credible given what we know about these systems.
Emotions and suffering are "just" necessary feedback for the system to evaluate its internal and external situation. It's similar to how modern machines have sensors. But nobody would say a PC is suffering and enslaved just because the CPU is too hot or the storage is full.
It's probably the sentience part that makes such feedback harmful to a mind.