“First, do no harm” is one of the main principles guiding health care professionals. Although these words are not part of the Hippocratic Oath, they form the basis of what physicians are generally taught: while treating a disease is important, the consequences of those treatments should be carefully weighed against their benefits.
Over the course of history, many medical practices have come and gone for various reasons, including changes in medical theory, evidence that the risks of a treatment outweigh the potential benefits, or simply the discovery of new or better techniques.
In general, the practices that go out of favor stay out of favor. But occasionally, something old that has been debunked is found to have significant use in another application. Lobotomies, once thought to be a cure for schizophrenia, have been repurposed to treat seizures; arsenic, once used in tonics and largely unsubstantiated herbal remedies, is now a life-saving treatment for blood cancer.
Practices such as these were stopped because they were not sufficiently studied before being implemented and, with use, were found to do more harm than good. Yet once adequate research was conducted, some of these older techniques were found to have value in other applications.
Below is a list of seven medical treatments or practices common in the past that, after years of no longer being practiced, have become important in modern medical care.
1. Lobotomy
The lobotomy, also known as leucotomy, was a procedure developed in the 1930s in an attempt to treat severe psychiatric illness. In the first procedures, small holes were drilled into the skull and ethanol was injected into part of the brain to functionally sever the frontal lobes (which largely control executive functions such as personality, motivation, and attention) from the rest of the brain.
Even without sufficient medical trials, the mechanics of the procedure continued to evolve, and in the mid-1940s lobotomies became routine office procedures in which doctors inserted an icepick into the brain through the top of the eye socket. Roughly 40,000 Americans were “treated” with this procedure, with significant associated mortality and morbidity.
Given the rather gruesome nature of the procedure and the clear lack of evidence for its success, many countries eventually banned the practice altogether.
With improvements in localization provided by high-resolution imaging (e.g., MRI scans) and brain wave studies, teams of neurosurgeons and neurologists are now able to target and remove small regions of the brain, significantly improving seizure control while causing minimal secondary damage. Promising results have led to a rise in this lobotomy-like procedure, the lobectomy. According to the Nationwide Inpatient Sample, roughly 6,500 lobectomies and partial lobectomies were performed in the United States between 1990 and 2008. In one subset of epilepsy patients, seizure control was achieved in up to 75% of patients treated with surgery and medications, compared to 0% of patients treated with medications alone.
2. Electroshock Therapy
Now more commonly referred to as electroconvulsive therapy (ECT), electroshock therapy was developed as a treatment for psychiatric conditions in the 1930s. It was shown to be very effective in treating mental illness, especially severe depression. At its height in the 1940s and 1950s, approximately one-third of patients hospitalized with affective disorders (e.g., bipolar disorder, depression, anxiety) were estimated to have undergone the treatment.
Unfortunately, it was also associated with significant memory disturbances and confusion, and, prior to the use of anesthetics, was known to cause bodily harm, including bone fractures and dislocations. Improvements in technique and the use of anesthetics to prevent the contorting movements of the seizure helped mitigate these problems.
Despite these improvements, however, public perception of ECT was exceedingly negative. In the popular novel and movie One Flew Over the Cuckoo’s Nest, Big Nurse (Nurse Ratched) used it as a tool to terrorize and control the patients in her ward; in Sylvia Plath’s The Bell Jar, the protagonist’s first psychiatrist punishes her with ECT, although she is later treated successfully with ECT in a more controlled environment. This negative public perception, combined with the rise of antidepressant use from the 1950s onward, led to a significant decrease in the use of ECT.
Over the past 20 or so years, however, ECT has had a resurgence due to its effectiveness in patients with severe depression who do not respond to medication alone. An estimated 100,000 Americans receive ECT each year, and it is considered the gold standard treatment for severe depression.
3. Leeches and Bloodletting
For thousands of years, bloodletting was a practice of ancient medicine meant to balance the humours. It is thought to have been among the most common medical practices from antiquity through the late 1800s. The first documented uses were in ancient Egypt around 1000 BCE. In Hippocrates’ Greece, when dietary changes, exercise, sweating, and vomiting proved unhelpful, the body was frequently “re-balanced” through bleeding.
Various methods of bloodletting were used over time, usually involving small knives to score the skin or nick a vein. Leeches were first documented for this use in 800 BCE and were exceedingly popular in the early 1800s. In 1830s France, 35 million leeches were used per year.
Bloodletting was debunked as a cure-all when rigorous study showed that it had little positive impact on most diseases. Today, bloodletting remains in use only for diseases in which the body produces too many red blood cells (polycythemia vera) or in which iron metabolism malfunctions (iron being the main component of hemoglobin, which allows red blood cells to transport oxygen).
The use of leeches, too, has increased in the past couple of decades after it was discovered that leech saliva contains several medicinal enzymes. In microsurgery and tissue reimplantation, leeches can help relieve blood congestion and allow for proper circulation. In addition, the saliva’s anti-inflammatory and numbing agents are useful in the treatment of arthritis. As early as 1817 and as recently as 2014, numerous attempts have been made to replace leeches with blood-sucking mechanical devices. None appears to be as effective as leeches, however, which continue to be used to this day.
4. Thalidomide
Developed in Germany in the 1950s, thalidomide was considered a cure-all. It was marketed as a treatment for respiratory infections, insomnia, cough, colds, headaches, nausea, and, most significantly, morning sickness. The company that developed the drug tried to get it approved in America, but the application was denied by the U.S. Food and Drug Administration. In 1957, thalidomide became an over-the-counter medicine in Germany. It was also licensed and sold in the United Kingdom, Canada, and Australia, and it became one of the most successful prescription drugs in the history of medicine.
Doctors prescribed thalidomide to thousands of pregnant women for morning sickness until they realized children were being born with serious birth defects. An estimated 10,000 to 20,000 children were born with missing or severely malformed limbs, and roughly 50% of them died. The drug was withdrawn from the market in Germany and eventually pulled from the rest of the countries in which it was licensed.
In 1991, researchers found that the drug had a significant impact on the regulation of the immune system. Seven years later, the FDA approved the drug for the treatment of leprosy, and since then it and multiple derivatives have been used effectively against blood cancers and other immune system disorders.
5. Maggots
Maggots have been used for thousands of years to treat wounds. Archaeological findings, historical reports, and ancient writings show their use in Mayan culture, among Australian Aboriginal tribes, during the reign of the Roman Empire, and during the Renaissance. Maggots were especially useful during wartime. Military doctors and battlefield medics noticed soldiers whose wounds were colonized by maggots were less likely to die from their injuries. During the Napoleonic Wars, while campaigning through the Middle East, French surgeons observed that the larvae of certain fly species consumed only dead tissue, speeding up the healing process.
In the 1930s, studies found that maggots have antimicrobial properties. Despite the growing body of evidence, however, the discovery of penicillin and other antibiotics led to a steep decline in the use of maggots for infected wounds.
With widespread antibiotic use, some bacteria have developed into superbugs resistant to most, if not all, antibiotics. At the same time, the rise in diabetes has led to more wounds with poor blood supply and significantly impaired healing. In a 1989 study, researchers found maggot therapy to be a suitable alternative to modern wound care techniques. And in 2004, the FDA approved the prescription of maggots for the treatment of non-healing wounds, including diabetic ulcers, venous stasis ulcers, and other non-healing traumatic or surgical wounds.
6. Fecal Transplant
Taking stool from one healthy person and implanting it in the digestive system of an unhealthy person certainly sounds like one of the least pleasant treatments. Despite the “ick” factor, recent studies have shown fecal transplant to be exceedingly successful in controlling some bacterial infections in the gut that do not respond to available antibiotics. Modern delivery techniques have also made it a much less unpleasant experience.
A sample is collected from the donor, who is pre-screened for a wide array of bacterial and parasitic infections. The sample is diluted and then applied to the patient’s digestive tract, either through a colonoscopy or through a tube placed through the nose into the digestive tract. The treatment has been most successful in patients with C. difficile infection, a common gut infection that usually arises after prolonged use of strong antibiotics. In fact, it has been shown to be even more effective than the current standard therapy, the antibiotic vancomycin.
Fecal transplants were used to treat other ailments as early as the fourth century. Chinese medical reports from the period discuss their application in the treatment of food poisoning and severe diarrhea. At that time, and for at least the next 1,200 years, a solution of stool and water was simply drunk by the patient.
7. Arsenic
As is the case with many modern chemotherapies, arsenic is well known both for its ability to fight disease and for its capacity to cause serious side effects and death. The element was used for thousands of years to treat infections of the skin, although it is perhaps best known as the “Poison of Kings” due to its long and storied use in assassinations. Arsenic has been used in several medicinal tonics as well, most famously Fowler’s solution, which was used to treat high blood pressure, gastric ulcers, asthma, eczema, tuberculosis, and both skin and breast cancers. While these tonics were somewhat helpful, they had significant side effects and fell out of use in the mid-1900s.
In the early 1900s, the physician Paul Ehrlich found that atoxyl, a derivative of arsenic, was highly effective in the treatment of trypanosomiasis, a common and often fatal chronic infection of the time. Based on this, he searched for a similar arsenic derivative to treat syphilis, and in doing so created the concept (and the name) of chemotherapy. He created Salvarsan, the first effective treatment for syphilis; it quickly became the most prescribed drug in the world.
Unfortunately, these medications still had significant side effects: atoxyl led to blindness in many of those treated, and Salvarsan caused rashes, liver damage, and even death. Both were replaced by slightly safer arsenic compounds, which remained the basis of treatment for these diseases for many years. In the 1940s, with the discovery of penicillin, the arsenic derivatives went largely out of favor in Western medicine.
However, research on their potential uses continued in China, and in the 1970s arsenic trioxide was shown to be effective in the treatment of acute promyelocytic leukemia, a blood cancer. Over the next 30 years, it became the preferred second-line treatment for this deadly leukemia.