I would very much like to leave the "free state of Florida". All the benefits of living here that I grew up knowing by heart no longer exist, and the state government's only concern seems to be punishing people for wrongthink. It isn't cheap to lease or buy property anywhere near a metro area, the coral reefs and sponge beds have mostly disappeared, the beautiful wildlife in our state parks has been curtailed by constant wildfires, and the schools have atrophied into a shell of their former selves. What's the point of living here anymore, or of raising my kids here?
I hear Austin, TX is the bee's knees
definitely avoid CA
Were the schools ever good? FL, and the entire US South for that matter, have a long track record of poor K-12 performance.