Monday, December 13, 2010

When did it become an employer's duty to provide you with health insurance?

What gives Obama the right to mandate that employers are responsible for providing health insurance to their workers? What's next, will housing and cars be mandated by Washington too?
--------------------
It dates back to WW2, when the government mandated wage-price controls to keep civilian spending in check. Eager to hire workers away from military production but barred from raising wages, employers had to offer other incentives, and providing health insurance became a way around the wage-price controls. The same era also saw rationing coupons for meat, gasoline, rubber (tires), metals, paper, and more. Since access to medical care has effectively been rationed for many years now, and we have grown used to it, perhaps looking at another approach would be more productive.
Source
