Asking it to do something isn't exactly complicated. At the very least, it's way easier than actually coding, so why would you expect people to struggle with writing prompts? There's no skill required in using LLMs; that's kinda the point.
The point is that people who reject them on moral grounds won't be using them, irrespective of whether they are easy to use.