WASHINGTON, D.C. – Health insurance is the most important benefit an employer can offer workers and their families, according to a new survey on how American workers view employer-sponsored health coverage. Workers report that they overwhelmingly prefer to receive health insurance directly from an employer rather than through other means. The poll found that 96% of Americans believe it is important for a job to offer health insurance. Additionally, by nearly a two-to-one margin, respondents said they would not accept a job that does not offer health insurance.