I think it's a bit like the Dunning-Kruger effect. You need to know what you're even asking for and how to ask for it. And you need to know how to evaluate whether you've actually got it.
This reminds me so strongly of the Pakleds from Star Trek TNG. They knew they wanted to be strong and fast, but the best they could do was say, "make us strong." They had no way to tell that their AI (sorry, Geordi) was giving them something that looked strong but simply wasn't.
Oh wow, this is a great reference/image/metaphor for "software engineers" who misuse these tools: "the great Pakledification" of software.