In previous blogs we talked about the main milestones of the US healthcare system. We highlighted the main events that led to the birth of health insurance around 1929, and we showed how, in the first half of the 20th century, professional pressure from physician lobbyists pushed toward convincing the government to take over health insurance. Ultimately, their efforts paid off around the 1960s with the enactment of the Medicare and Medicaid laws. The US government, proud of its great achievement in solving the cost problem of healthcare for a large sector of the American people, tried to extend coverage to more Americans but failed, owing to immaturity in managerial roles among physician groups. At the same time, the government was blind to the fact that the aging population was growing, and that the technological developments and innovations racing through the second half of the 20th century were placing a great burden on the budget; the cost of healthcare was beginning another trip up the scale.
Was that all that the US healthcare system should worry about?
Absolutely not, for after World War II another problem was on the rise. There was a trend toward a shortage of physicians around that time, one that government officials responded to by increasing the capacity of medical schools and teaching hospitals in the United States. That was good news for local physicians and foreign medical students alike. It is well known that medical degrees in some countries of the world are not equivalent to US medical education and training; that is why a foreign physician who wants to come and work in the US must pass additional examinations, the USMLE (United States Medical Licensing Examination), as required by the ECFMG (Educational Commission for Foreign Medical Graduates). After passing those examinations, a foreign medical graduate can join a US residency program and start practicing medicine in the US.
With the progress of technology and medical research around the 1970s, there was a parallel advance in medical knowledge and training. Being a specialist in your field would add a lot to your value and income, and would let you benefit the most from the expanding science of innovation. Nevertheless, things didn't go as they should have.
We have said before that one of the most striking features of the US healthcare system is that the gap in coverage between urban and rural areas is so vast that it resembles the gap between developed and developing countries. When you are a specialist in any medical field, you will naturally want to stay close to the doctors who send you patients, and you will certainly want to affiliate yourself with a respected hospital or medical center. All of that has led to an oversupply of physicians and specialists in urban areas, and a shortage of them in more rural areas.
What did the US healthcare system do about that?
It had two solutions that it thought would solve the problem but actually made it worse. First, medical schools increased their training opportunities in an attempt to outpace the need for new specialists, so that fresh medical graduates could shift to other specialties or choose to become primary care physicians providing basic healthcare for people all over the country. That policy didn't produce the desired outcome: many physicians still chose to excel in specialties for which the need was not high, and elected to work in cities and well-populated areas of the US.
Second, hospitals were not keen to cut back their intern and residency programs, simply because of the extra funds they received from programs like Medicare for including teaching and training programs in their curricula. The net result was an oversupply of specialists in cities and a shortage of medical coverage in rural areas. In 1989, despite major increases in the physician supply, rural areas in the United States had fewer than 100 physicians per 100,000 persons, compared with up to six times that many in many cities.
It was thought that the managed care organizations that started in the 1990s would set a new trend of emphasizing primary care as an important option in the new standards of US healthcare, by incorporating such concepts into models like case managers and collaborative teamwork for patients with chronic illnesses. Yet despite good steps along that path, the problem was never fully corrected.
What were the average healthcare costs for the elderly across the US in 2012?
1- Nursing homes, which can provide extensive, skilled medical care, are the most expensive facilities. The median US rate for a private room is 222 USD per day. The highest costs are in Alaska, where the median daily rate for long-term care is 783 USD. The lowest median rate is in Missouri, at 137 USD.
2- Assisted living facilities, which often offer a homelike atmosphere, are the second most expensive option. The national median monthly rate for a one bedroom private residence is 3,300 USD. The highest median rate is in Alaska, at 6,813 USD, while the lowest is in Georgia, at 1,500 USD.
3- Home health aides allow elderly patients to stay in their homes while receiving daily care. The national median hourly rate for a non-Medicare certified aide is 19 USD. Minnesota has the highest median hourly rate, at 28 USD; Texas has the lowest, at 13 USD.
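To make the three options above directly comparable, the quoted 2012 median rates can be annualized. This is only a rough sketch: it assumes year-round nursing-home occupancy, and the 8 hours of aide care per day is an illustrative assumption, not a figure from the survey.

```python
# Annualize the 2012 median elder-care rates quoted above.
DAYS_PER_YEAR = 365
MONTHS_PER_YEAR = 12
AIDE_HOURS_PER_DAY = 8  # assumed for illustration; actual hours vary widely

nursing_home_annual = 222 * DAYS_PER_YEAR                    # 222 USD/day
assisted_living_annual = 3_300 * MONTHS_PER_YEAR             # 3,300 USD/month
home_aide_annual = 19 * AIDE_HOURS_PER_DAY * DAYS_PER_YEAR   # 19 USD/hour

print(f"Nursing home (private room): ${nursing_home_annual:,}")    # $81,030
print(f"Assisted living (1 bedroom): ${assisted_living_annual:,}")  # $39,600
print(f"Home aide (8 h/day):         ${home_aide_annual:,}")        # $55,480
```

Even under these simple assumptions, a private nursing-home room costs roughly twice as much per year as assisted living, with daily in-home aide care falling in between.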