Healthcare seems to be a buzzword this year. Everyone is talking about it, and it is no wonder why. With the start of the Affordable Care Act, Americans now have more access to healthcare than ever before. So why is it that healthcare seems to be in even more turmoil now than it was before? With insurance rates rising and people going to the doctor less, it makes you wonder where the country went wrong.
Healthcare is becoming more expensive and more exclusive. With a shortage of doctors in many fields and treatments priced beyond what most patients can afford, people are choosing to forgo the doctor more often than not. When patients stop seeing their doctors, the doctors earn less and tend to raise prices to shore up their bottom line. This creates a vicious cycle of overpriced services that fewer and fewer people can pay for. And don't even get me started on pharmaceuticals: patients are paying hundreds of dollars for medications that cost pennies to make. Healthcare has become less and less about patient care and more about profits.
Healthcare was a problem before the Affordable Care Act, and if something isn't done soon, it is going to continue to be a problem. Until basic healthcare needs are met, nothing will improve. And without a serious reduction in costs, many patients will choose to go without insurance or skip the doctor altogether, which isn't healthy for anyone.