Sunday, January 23, 2011

Will employers be required to PAY for employees' health insurance, or just be required to provide access to it?

Lots of companies take part or all of the cost of health insurance out of their employees' checks. Will this practice stop because of Obama's health care plans, forcing employers to pay 100% of health insurance costs? Won't that cost the US jobs?
--------------------
It is unclear, as they keep revising the bill. If it does, you can bet it will be paid for in jobs. As a small businessman, I'd have to lay off half of my 6 employees or close up shop.
