I don't quite understand the whole deal. Why are there both public health care and private health insurance? Americans pay taxes that fund health care, so what does that tax money cover? And what is the purpose of health insurance if there is already health care? I'm confused.
--------------------
Health care and insurance are two different things. The insurance industry exists to make money; paying for health care is simply how it does so. Look at the facts. The US spends more on health care per person than any other country in the world, yet people born in Western Europe live longer on average, and infant mortality is higher in the US than in Western Europe. Should Americans be proud of an insurance system that costs more yet gives babies a worse chance of survival than they would have abroad?