We Have a Right to Health Care
I agree that we all have a right to "health care." Unfortunately, I disagree with the commonly accepted meaning of that term. My definition of health care is very straightforward. Quite simply, it means: Take care of your health!
That's the opposite of living off a nutritionally dead and toxin-rich diet of processed foods, GMOs, high fructose corn syrup, MSG, fluoride, aspartame, etc. It's the opposite of refusing to exercise, abusing the hell out of your body your whole life, and then demanding other people pay to repair the damage when everything starts to break.
In short, government-mandated "health care" has nothing to do with health care. Rather, it's about dumping more and more money into the politically connected, multi-trillion-dollar "health-repair" industry. As such, the CAUSES of escalating disease in this nation, which are mostly consumption-based and preventable, will continue to be ignored.
Stated another way: If the government actually cared about health care, it would focus on health care. It would stop endorsing, subsidizing, and protecting those who are poisoning us. It would stop demonizing, attacking, and attempting to shut down those who challenge the current health-destruction, health-repair model.
Do we have a right to health care? Absolutely. Do we have a right to completely neglect the physical needs of our body? Sure. Do we have the right to make others pay for the damage that our choices cause? No way.