Somehow we've developed a culture in America where everyone thinks that it's taboo to pay directly for certain things. Healthcare in particular has this strange fog of mystery surrounding it, which causes people to completely disassociate the service from the price.
I have a friend who was complaining about how it had been four years since she last went to her dentist. I asked her why, and she said that she didn't have insurance. It was as if she were resigned to the fact that there's no way to see a dentist unless you have insurance to cover the visit.
I pointed out that dental visits aren't generally covered by health insurance, and dental insurance itself is generally pretty worthless. Most Americans pay for dental visits with their own hard-earned money.
Her response: "Oh..."
This isn't a problem of one person not knowing something that should be common knowledge. We as a nation have this bizarre idea that we can't pay for things ourselves. It doesn't matter how much money we have; we can't go to the doctor unless insurance covers it.
Part of this confusion is caused by how secretive doctors are with prices. If you could go online and see the price of an annual physical, you'd view it just like anything else you buy. However, because no one really knows how much things cost, many people don't believe they have the ability to pay on their own.
We've got plenty of posts coming about the problems this causes, but for now, let's all remember that your doctor's office is just like any other business: they provide you with something, and they get paid. There isn't any black magic going on here.