Tuesday, May 3, 2011

Why do Americans think it's the Government's responsibility to give them Health Insurance?

Isn't it our responsibility to buy our own insurance? It has been for years, so what is different now?
--------------------
Only the Dems promise to give every American health care. The rest of us were raised to believe that we must work for what we get. And all those millions out there who are "able" but not willing to work, who want to live on welfare and food stamps, are furnished better insurance and medical care than those who do work for it. And before you come unglued at that, I am not talking about those who legitimately need it. If the premiums go up, we just learn to tighten our belts and deal with it. We don't go crying to the government to take care of us. We weren't raised that way! I am sure a lot of you will have plenty to say about this. So rave on, cat sh**, someone will come along eventually and cover you up!
Source
