Does the U.S. Really Have a Free Market for Health Insurance?
Conventional wisdom holds that the U.S. has a free market for health insurance, while Europe relies on state-run, socialist health care systems. The U.S. "free market" for health insurance, however, is in fact strictly regulated. States exert significant authority over both the benefits plans can offer and the premiums they may charge. Consider the following evidence…