Is the solution to sycophancy just a very clever prompt that forces logical reasoning? Do we want our LLMs to be scientifically accurate and truthful, or creative and exploratory? Fuzzy systems like LLMs will always carry these kinds of tradeoffs, so there should be a better UI with accessible "traits" (devil's advocate, therapist, expert doctor, finance advisor) that one can invoke.
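
One way the "invokable traits" idea could look in practice is a minimal sketch like the following, where each trait is just a named system-prompt fragment composed onto a base persona. The trait names and prompt text here are illustrative assumptions, not any real product's API:

```python
# Hypothetical sketch: "traits" as selectable system-prompt fragments.
# All trait names and wording below are assumptions for illustration.

TRAITS = {
    "devils_advocate": "Challenge the user's claims; surface counterarguments and weak points.",
    "therapist": "Listen empathetically; reflect feelings before offering suggestions.",
    "expert_doctor": "Answer with clinical precision and state your uncertainty explicitly.",
    "finance_advisor": "Be conservative; flag risks and avoid definitive investment advice.",
}

def build_system_prompt(*traits: str, base: str = "You are a helpful assistant.") -> str:
    """Compose a system prompt from the base persona plus any invoked traits."""
    unknown = [t for t in traits if t not in TRAITS]
    if unknown:
        raise KeyError(f"unknown trait(s): {unknown}")
    return "\n".join([base, *(TRAITS[t] for t in traits)])

# A UI could expose these as toggles; the composed prompt is what the model sees.
print(build_system_prompt("devils_advocate", "finance_advisor"))
```

The point of the sketch is that the tradeoff (truthful vs. exploratory) becomes an explicit user choice rather than something baked invisibly into one default persona.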