Completely removing the U.S. Government from the health care market (Medicare, Medicaid, Obamacare, uncompensated care, etc.) would be a great start.
Could you explain how this would help? I'm struggling to understand where you're coming from here, besides perhaps a reflexive libertarian reaction to government.
Why not go to the other extreme, as most other developed countries have done (and which have lower per-capita healthcare costs)?
Do you know of any countries that have no government involvement in healthcare and still have good health outcomes?