History Proves Many Doctors’ Recommendations are Disasters


Posted by: Dr. Mercola
September 23, 2008

A pill for this … an operation for that. There is no end to the ways that modern medicine can make you bigger, better, stronger, sexier, healthier … right?

After all, a prescription drug is the panacea for just about anything that makes your body less than perfect, isn’t it?

And you can always count on your doctor to give you accurate, reliable information … wouldn’t you agree?

A closer look might suggest otherwise.

Most of you know that the U.S. spends far more on health care in actual dollars and as a percentage of GDP than does any other country. Much of it is due to the constant research and development of new drugs, the deep pockets of big pharmaceutical companies, the cost of malpractice insurance, managed care, and expensive government programs such as Medicare and Medicaid.

Yet, the U.S. ranks embarrassingly low among industrialized countries in both life expectancy and infant mortality. The U.S. now ranks LAST out of 19 countries for unnecessary deaths – deaths that could have been avoided through timely and effective medical care. The reasons for this are complex and often debated; however, one contributing factor often NOT considered is the risk of medical treatment itself.

How Dangerous Is Conventional Medicine?

One-third of adults with health problems reported mistakes in their care in 2007, and rates of visits to physicians or emergency departments for adverse drug effects increased by one-third between 2001 and 2004.

Additionally, the Centers for Disease Control and Prevention (CDC) reported that drug overdoses killed 33,000 people in 2005 — second only to car accidents in the category of accidental deaths — up from 20,000 in 1999 and 10,000 in 1990. Contrary to popular belief, this major increase in drug overdoses is not due to a heroin or crack epidemic. These deaths are largely due to prescription drugs.

In addition to accidental prescription drug overdoses, an estimated 106,000 hospitalized patients die each year from drugs that, by medical standards, are properly prescribed and properly administered!

But wait … isn’t your doctor supposed to be helping you by having your best interests at heart? Isn’t he or she obligated to practice under the Hippocratic Oath, sworn to “do no harm”?

It can be assumed that the majority of doctors throughout history have treated their patients with the intention of promoting health.

Certainly there have been the occasional evildoers, such as the ruthless Nazi concentration camp doctor Josef Mengele,1 who escorted countless children to their deaths, but such blatantly sinister types are fortunately few.

Why Do You Place Such Blind Trust in Your Doctors?

A Gallup Poll from December 2006 compared how people rated the honesty and ethical standards of various professions. Doctors were rated “very high” for honesty and ethics by 69 percent of respondents, placing them fourth from the top of the list, exceeded only by veterinarians, pharmacists, and nurses (the highest rated). In other words, 69 percent of people polled believed their doctors were honest and ethical.2

The Harris Poll from July 2006 compared how likely people are to trust someone based on his or her profession. Doctors topped this list: 85 percent of the people polled believed they could trust their doctors, followed by 83 percent for teachers and 77 percent for scientists.

Do you trust your doctor because he or she has earned it?

Or do you trust your doctor because you have been brainwashed by 50 billion dollars a year of food and drug industry advertising into fearing that you will suffer some awful fate if you don’t opt for the latest pill, surgical intervention, or “necessary” procedure?

Throughout history, many well-meaning physicians have made enormous medical blunders. A few of the most blatant of these will be explored here.

The next time you are handed a prescription, you might want to consider some of medicine’s less celebrated history.

Medicine’s Blunders, Bungles, Mishaps and Foozles

History is full of misjudgments and bad science, often with fatal results.

Physicians and scientists often did the best they could with what they knew in their efforts to try to save their patients. However, many procedures had no benefits at all, yet doctors rigidly adhered to the same dogma in their vain hopes of finding a cure.

How has medicine’s reputation survived for more than 2,000 years? Why has it been so hard to see when medicine was not working?

  1. Doctors believed they were doing good, so patients believed in them too. This belief is itself a powerful influence, sometimes referred to as the placebo effect, and it can frequently trigger healing responses in patients.
  2. Many diseases were self-limiting, and the health improvements that doctors took credit for actually resulted from the body’s natural healing processes.
  3. For centuries, a disease was viewed merely as a disorderly condition of a particular patient, not a typical condition of many patients. It wasn’t until disorders began to be categorized that standard treatments could be compared.
  4. Doctors feared that they would be seen as not doing their best for a particular patient if they prescribed a different remedy (or none at all), perpetuating bogus or ineffective treatments.

The history of the relationship between microorganisms and disease is one glaring example of this battle between good and bad knowledge (and of the lag between the discovery of a medical treatment and its application), and it is one of the historical scenarios that will be explored in this video series.

History is full of good scientific knowledge being shoved under the rug by ego-driven physicians and rigid institutions that resisted change in an effort to maintain the status quo, a resistance that continues to this day.

Some of medicine’s most glaring historical (sometimes hysterical) misadventures include:

  • Bloodletting as cure-all
  • Evolution of germ theory
  • Dr. Semmelweis and his hand-washing recommendation
  • Promotion of cocaine, heroin, and other narcotics
  • Promotion of cigarettes
  • Blindness in babies
  • The shoe-fitting fluoroscope
  • Lobotomy
  • Thalidomide
  • DES
  • HRT: The menopause “cure”
  • H. pylori, the true cause of ulcers
  • The Human Genome Project
  • The Vioxx disaster that killed 60,000

The Basics of Bloodletting

Perhaps the longest running tradition in medicine is bloodletting, originating in the ancient civilizations of Egypt and Greece, flourishing in Indian (Ayurvedic) medicine, and persisting through the Industrial Revolution. It was based on an ancient system of medicine in which blood and other body fluids were considered to be “humors” whose proper balance maintained health.4

Bloodletting was done by opening a vein in the arm, leg or neck with a small, fine knife called a lancet. The area was tied off with a tourniquet and, holding the lancet delicately between thumb and forefinger, the physician made a diagonal or lengthwise cut into the vein. The blood was then collected in measuring bowls of exquisitely fine Venetian glass.

Doctors bled patients for just about every conceivable condition from pneumonia and fevers, to back pain, headaches, and depression. Bloodletting was the “aspirin” of the day.

Yet, there was never any evidence that it did any good, and many times the patients died!

Of course, it was always assumed it was the disease that killed them, rather than the treatment. There is even evidence that over-zealous bloodletting might have caused George Washington’s death in 1799.

Nevertheless, bloodletting persisted for 2,500 years until scientists in the 19th century finally began to question its value. Medical statisticians began tracking case histories and found that bloodletting was not helping much of anything.

At the same time, scientists such as Louis Pasteur, Joseph Lister, and Robert Koch showed that germs, not humors, were responsible for disease.

Eventually the practice died, although it continued in some parts of rural America into the 1920s. Therapeutic phlebotomy, the modern equivalent of bloodletting, is rarely used today.

However, there are conditions involving elevated iron levels for which it can be useful. A surprisingly large number of people have elevated iron and are unaware of it, in part because so many foods are fortified with inorganic iron. Elevated iron levels can increase oxidative stress in your body and contribute to heart disease and cancer, so this is one scenario where bloodletting may be genuinely helpful.

The Beginnings of Germ Theory: Antoni van Leeuwenhoek

Germ theory took many years to take hold. The idea that creatures too small to see could cause disease was hard for people to swallow. Even Fracastoro, who recognized that bubonic plague was contagious, was inexcusably ignored.

Germ theory was actually born in the 1670s, when the Dutch researcher Antoni van Leeuwenhoek built a simple microscope and discovered a whole world of microorganisms, in addition to red blood cells and spermatozoa. He observed germs from canal water, from ginger, from the dirt between his toes — from almost everywhere he looked.

Yet, his discovery was disregarded until the 1830s, when other scientists finally began to accept that the microscope was a legitimate tool.

The traditional view is that those early microscopes were simply not sophisticated enough to be reliable, but recent research by David Wootton3 shows that Leeuwenhoek’s microscopes were of a very high standard, just rather awkward to use.

Wootton attributes the lack of acceptance of germ theory to professional opposition, not the shortcomings of the microscope, and quotes the doctor and philosopher John Locke.

Locke professed that God had adapted our senses to our needs, implying that if our enemies were really so small, we would have been given the power to see them unaided. Wootton goes on to say that doctors saw no need to reform their medical paradigm because they believed their traditional way of doing things was quite adequate.

Germ Theory Progresses With Joseph Lister

All the way back in 1707, a French scientist demonstrated that microorganisms did not develop in a water/manure mixture that had been boiled and then sealed.

Versions of this experiment were repeated for over 150 years, and by the 1830s the principles of antiseptic surgery should have been in place.

According to Wootton, it would have been reasonable to expect that antiseptic surgery could have begun more than a century earlier, based on the evidence available at the time!

In 1865, a Glasgow surgeon named Joseph Lister used an antiseptic (carbolic acid) in surgery for the first time. The results were dramatic: his post-amputation death rate fell from 45 percent to 15 percent.

For the first time, there were surgical procedures from which patients had a moderate chance of recovery.

However, Lister had to battle more than just microbes. He said he was applying the principles of the “germ theory of putrefaction”, which held that invisible living organisms caused decay and that those organisms were just about everywhere.

For centuries, though, a rival theory held that decay was a chemical process and that microbes were merely a byproduct of it. To prevent infection, one had to have some idea of where germs came from; if they simply appeared out of nowhere, there was little anyone could do.

Germ theory took a few more steps forward in 1872, when a growth of Penicillium glaucum killed off bacteria in one of Lister’s liquid cultures. In 1884, he treated a patient with Penicillium and cured him of an infected wound. However, as Wootton explains, Lister “lacked the energy or the resources to promote penicillium, partly because he was still struggling to win acceptance of antiseptic surgery”.

It is difficult to imagine the number of people who would have benefited from antibiotics had they been introduced fifty or sixty years earlier than 1941. Just think about all those infected wounds in World War I.

Semmelweis

Resistance to germ theory marched on. In 1846, a young Hungarian doctor named Ignaz Semmelweis, working in Vienna, investigated a notorious maternity ward in which an alarming number of patients contracted fatal “childbed fever”. What he noticed was that women who gave birth before arriving at the ward were not likely to become ill.

When a professor who cut his finger in the middle of an autopsy in that same hospital died of symptoms identical to those of the unfortunate mothers, Semmelweis reasoned that the students doing the autopsies were somehow transferring the fever to the women in the maternity ward.

Semmelweis began making his students disinfect their hands before delivering babies, and the number of childbed fever cases dropped. Of course, no good deed goes unpunished.

Semmelweis was labeled “insane” by his colleagues for having the audacity to suggest that they should wash their hands between deliveries, and they fired him. He tried to continue his research but was ostracized by the medical community. His own mental health eventually deteriorated, leading to his death in an insane asylum.

One final event leading to the acceptance of germ theory occurred in 1860. A famed doctor speaking at a conference set out to thoroughly denounce Semmelweis’s ideas, but he was interrupted by a man who told the audience that he had discovered the bacterium responsible for childbed fever. That man was Louis Pasteur, and the rest is history.

Lack of proper hand washing continues to be the primary reason why MRSA and other superbugs are spread in hospitals today.

References:

1 Lifton, R. J., 2000. The Nazi Doctors: Medical Killing and the Psychology of Genocide. Basic Books.

2 PollingReport.com, “Values,” http://www.pollingreport.com/values.htm (Accessed May 26, 2008)

3 Wootton, D., 2007. Bad Medicine: Doctors Doing Harm Since Hippocrates. Oxford University Press.

4 Starr, D., “Blood Basics: Early Practices: Bloodletting,” PBS, http://www.pbs.org/wnet/redgold/basics/bloodletting.html (Accessed May 26, 2008)
