There is no breakthrough required; it's trivial. It's just that by training a model to do that, you'll degrade it along several other dimensions.
Asking a question like this only highlights the questioner's complete lack of understanding of LLMs rather than an LLM's inability to do something.