but car insurance is mandatory. Isn't that ironic? Americans must have car insurance but not health insurance.
--------------------
Interesting. If you don't have car insurance, you can't drive. If you don't have health insurance, you can be ruined. If you believe it is the responsibility of the state to look after the welfare of its citizens (what else is the purpose of the state, anyway?), then failing to provide health care for all citizens is a failure of the state.
Source