So that's the key difference. A lot of people train these Markov models with the expectation that they're going to be able to use the generated output in isolation.
The problem with that is either your n-gram order is too low, in which case the output can't maintain any kind of cohesion, or your n-gram order is too high, in which case it's basically just spitting your existing corpus back out verbatim.
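To make that tradeoff concrete, here's a minimal sketch of an n-gram Markov text generator (not the actual program described here, just an illustration). It maps each n-word state to the words that followed it in the corpus and then random-walks over those states; with a small corpus, `n=1` wanders incoherently while `n=3` has almost no branching and mostly replays the source text.

```python
import random
from collections import defaultdict

def build_chain(words, n):
    """Map each n-gram in the corpus to the list of words that follow it."""
    chain = defaultdict(list)
    for i in range(len(words) - n):
        chain[tuple(words[i:i + n])].append(words[i + n])
    return chain

def generate(words, n, length=20, seed=0):
    """Random walk over the chain, starting from the corpus opening."""
    rng = random.Random(seed)
    chain = build_chain(words, n)
    state = tuple(words[:n])
    out = list(state)
    for _ in range(length - n):
        followers = chain.get(state)
        if not followers:  # dead end: this state only appears at the corpus end
            break
        out.append(rng.choice(followers))
        state = tuple(out[-n:])
    return " ".join(out)

# Tiny toy corpus just to show the effect of the order n.
corpus = ("the giant baby slept in the well and the giant void "
          "hummed in the dark well below the baby").split()

print(generate(corpus, n=1))  # low order: locally plausible, little cohesion
print(generate(corpus, n=3))  # high order: every state is unique, so it
                              # deterministically replays the corpus
```

With real prose the effect is the same, just at a larger scale: the interesting recombinations happen only in the narrow band where states repeat often enough to branch but not so often that the walk loses the thread.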
For me, I was more interested in something that could combine two or three highly disparate concepts from my previous works into a single output sentence, which I would then ideate upon.
I hadn't opened the program in a long time, so I spun it up and generated a few outputs:
A giant baby is navel corked which if removed causes a vacuum.
I'm not sure which original pieces of text that particular sentence was drawn from, but it gets me thinking about a kind of strange void Harkonnen, with heart plugs that lead to weird negatively pressurized areas. That's the idea behind the dream well.