
Off-Label Medicines in Fighting COVID-19 & Through the Years

  • Overview

    In the absence of FDA-approved therapies and vaccines to battle the COVID-19 pandemic, healthcare professionals on the frontlines are turning to antimalarial medicines like chloroquine and hydroxychloroquine to treat their patients—despite the fact that there’s no evidence to support their use. But this isn’t the first time medicines have been used for purposes other than those they were intended for, as Dr. John Russell explains.

    Published April 21, 2020

  • Read the Article

    Off-Label Medicines in Fighting COVID-19 & Through the Years

    Dr. Russell:
    Coming to you from the ReachMD studios, this is COVID-19: On the Frontlines. I’m your host, Dr. John Russell.

    Today’s episode will be “COVID-19: Off-Label Medicines in Fighting COVID and Through the Years.” 

    In hospitals across the country, doctors are using old antimalarial medicines to help battle COVID-19.  This is discussed on news programs, at press conferences, and on talk shows – but how did it even dawn on anyone to try these drugs against this novel coronavirus? 

    Well, chloroquine and hydroxychloroquine are both FDA-approved antimalarial drugs that have been in use for many years.  Chloroquine was originally developed in 1934 by the pharmaceutical company Bayer and was used in World War II to prevent malaria.  After the initial outbreak of MERS, Middle East Respiratory Syndrome, in 2012, scientists screened thousands of approved drugs to identify any that might block MERS infection.  Several drugs, including chloroquine, showed the ability to block coronaviruses from infecting cells in vitro, but these drugs were not extensively pursued because, ultimately, they did not show enough activity to be considered further.  When the new coronavirus appeared, many drugs that had shown some initial promise against the related coronaviruses, MERS and SARS, were at the top of the list of possible treatments worth further evaluation.  It’s still unclear how the chloroquines, or any antimalarial drug, would work against COVID-19. 

    Malaria is caused by Plasmodium parasites – COVID-19 is caused by a virus.  Malaria is spread by mosquitoes, whereas COVID-19 is, at this point, transmitted from human to human.  Viral infections and parasitic infections are very different, so scientists wouldn’t expect that what works for one would work for the other.  It has been suggested that the chloroquines can change the acidity at the surface of the cell, thereby preventing the virus from infecting it.  It’s also possible the chloroquines help activate the immune response. 

    One study that was just published tested hydroxychloroquine in combination with an antibacterial drug, azithromycin, which appeared to work better at stopping the spread of the infection than hydroxychloroquine alone.  However, it’s only one preliminary study that was done on a limited test group.  Other studies are ongoing, but we need to take this with a huge grain of salt. 

    There is no recommendation that any ambulatory patient should get hydroxychloroquine-azithromycin.  Remember, both of these drugs can prolong the QTc, and especially if you’re in your office treating people over the phone, you’ll have no idea what their baseline QTc is, and you won’t know whether they’re on other medicines that can prolong their QTc.  Also remember, these meds are in short supply and are being trialed in ICUs across the country.  Sadly, there are stories of people buying up thousands of doses of these medicines early in the epidemic, like toilet paper.  There have also been reports of a husband and wife who got very, very sick after starting themselves on chloroquine.  This has led to shortages for patients who take these medicines every day for rheumatic conditions like lupus, so I think we need to take it with a grain of salt. 

    Studies are ongoing, and we’re going to learn more about this.  There is a belief that the cytokine storm in some patients might be treatable with the immunosuppressant interleukin-6 receptor antagonist class.  Early research suggested COVID-19 triggers a cytokine storm that stimulates high levels of interleukin-6 and granulocyte-macrophage colony-stimulating factor.  One example, tocilizumab, is a potent humanized interleukin-6 receptor blocker.  There have been anecdotal case reports from China, and this existing medicine and some others like it are being studied and repurposed in critically ill patients who have COVID-19. 

    For years, doctors have used FDA-approved medications for other indications.  Oftentimes, this leads to later approval of these medications for the new indications.  Sometimes the discovery of unusual side effects will lead to a whole new life for a medication, and sometimes, when a new crisis comes, scientists will try a wide variety of medicines already in use to see if they can be useful, as we are seeing with some of these older antimalarial drugs. 

    So, let’s look through history at some of these drugs.  How do you cure one drug epidemic?  Create a new drug – and that’s what happened in the late 1890s, when heroin, also known as diamorphine, was introduced as a safe, non-addictive substitute for morphine.  It was created by an English chemist named C. R. Alder Wright in the 1870s, but it wasn’t until chemists working for the Bayer pharmaceutical company discovered Wright’s paper in 1895 that the drug came to market.  Finding it to be five times more effective and supposedly less addictive than morphine, Bayer began advertising a heroin-laced aspirin in 1898, which they marketed towards children suffering from sore throats, coughs, and colds.  Some bottles depicted children eagerly reaching for the medicine, with moms giving their sick kids heroin on a spoon.  Doctors started having an inkling that heroin might not be as non-addictive as it seemed when patients began coming back for bottle after bottle.  Despite the pushback from physicians and the negative stories about heroin’s side effects piling up, Bayer continued to market and produce the product until 1913.  Eleven years later, the United States banned heroin altogether. 

    Around the mid-1800s, scientists were able to isolate the active ingredient of the coca leaf, Erythroxylum coca – a compound later known as cocaine.  Pharmaceutical companies loved this new fast-acting and relatively inexpensive stimulant.  In 1884, an Austrian ophthalmologist, Karl Koller, discovered that a few drops of cocaine solution put on a patient’s cornea acted as a topical anesthetic.  It made the eye immobile, desensitized it to pain, and caused less bleeding at the site of incision, making eye surgery much less risky.  News of this discovery spread, and soon cocaine was being used in both eye and sinus surgeries and marketed as a treatment for toothaches, depression, sinusitis, lethargy, alcoholism, and impotence.  Cocaine was soon being sold as a tonic, a lozenge, and a powder, and it was even used in cigarettes.  It even appeared in the Sears Roebuck catalogs.  Popular home remedies, such as Allen’s Cocaine Tablets, could be purchased for just 50 cents a box and offered relief from everything from hay fever and throat troubles to nervousness, headaches, and sleeplessness.  In reality, the side effects of cocaine caused many of the ailments it claimed to cure, including sleeplessness, eating problems, depression, and even hallucinations.  You didn’t even need a doctor’s prescription to purchase it. 

    Some states sold cocaine at bars, and it was famously one of the key ingredients in the soon-to-be-ubiquitous Coca-Cola soft drink.  By 1902, there were an estimated 200,000 cocaine addicts in the US alone.  In 1914, the Harrison Narcotics Act outlawed the production, importation, and distribution of cocaine. 

    Well, what about more recent times?  Amantadine was first used in 1966.  It was approved by the US Food and Drug Administration in October 1968 as a prophylactic agent against the Asian influenza epidemic.  In one study, volunteer college students were exposed to a viral challenge, and the group that received amantadine had fewer Asian influenza infections than the placebo group.  Amantadine received approval for the treatment of influenza infections in adults in 1976, but an incidental finding in 1969 prompted investigations into amantadine’s effectiveness for treating symptoms of Parkinson’s disease.  A woman with Parkinson’s disease was prescribed amantadine to treat her influenza infection and reported that her cogwheel rigidity and tremors improved.  She also reported that her symptoms worsened after she finished the course of amantadine.  This published case report was not initially corroborated by any other instances in the medical literature.  A team of researchers then gave amantadine to a group of ten patients with Parkinson’s disease.  Seven of the ten patients showed improvement, which was convincing evidence of the need for a clinical trial.  The 1969 trial, led by Robert Schwab, included 163 patients with Parkinson’s disease.  About two-thirds of them experienced subjective or objective reduction of symptoms. 

    Additional studies by Schwab’s team followed patients for longer periods and with different combinations of neurologic drugs.  The FDA approved amantadine for the treatment of Parkinson’s disease.  Interestingly enough, in 2017, the FDA approved an extended-release formulation of amantadine from Adamas Pharmaceuticals for the treatment of dyskinesia, an adverse effect of levodopa that people with Parkinson’s experience. 

    Also interestingly, amantadine has not been effective against influenza for the last ten years. 

    The first weapon against HIV wasn’t a new compound scientists had to develop from scratch.  It was one that was already on the shelf, although it had been abandoned.  AZT, or azidothymidine, was originally developed in the 1960s by a US researcher as a way to thwart cancer.  The compound was supposed to insert itself into the DNA of a cancer cell and interfere with its ability to replicate and produce more tumor cells, but it didn’t work well when tested in mice and was put aside.  Two decades later, after AIDS emerged as a new infectious disease, the pharmaceutical company Burroughs Wellcome, already known for its antiviral drugs, began a massive test of potential anti-HIV agents, hoping to find anything that might work against this new viral foe.  Among the things tested was something called Compound S, a remade version of the original AZT.  When it was thrown into a dish with animal cells infected with HIV, it seemed to block the virus’s activity.  The company sent samples to the FDA and the National Cancer Institute, where Dr. Samuel Broder, who headed the agency, realized the significance of this discovery.  But simply having a compound that would work against HIV wasn’t enough.  In order to make it available to the estimated millions who were infected, researchers had to be sure it was safe and that it would indeed stop HIV in some way, even if it didn’t cure people of their infection.  At the time, such tests, overseen by the FDA, took eight to ten years.  Patients couldn’t wait that long, and under enormous public pressure, the FDA review of AZT was fast-tracked – some say at the expense of patients. 

    Minoxidil was developed in the 1950s by The Upjohn Company, later part of Pfizer, to treat peptic ulcers. 

    In trials using dogs, the compound did not cure ulcers but proved to be a powerful vasodilator.  Upjohn synthesized over 200 variations of the compound, including the one it developed in 1963 and named minoxidil.  These studies resulted in the FDA approving minoxidil in the form of oral tablets to treat hypertension in 1979.  When Upjohn received permission from the FDA to test the new drug as a medicine for hypertension, they approached Charles A. Chidsey, Associate Professor of Medicine at the University of Colorado School of Medicine.  He conducted two studies, and the second study showed unexpected hair growth.  Puzzled by this side effect, Chidsey consulted Dr. Kahn, who, while a dermatology resident at the University of Miami, had been the first to observe and report hair growth in patients using a minoxidil patch, and discussed the possibility of using minoxidil for treating hair loss.  Kahn, along with his colleague Paul Grant, had obtained a quantity of the drug and conducted their own research.  Since they were the first to make this observation, neither Upjohn nor Chidsey was aware of the hair-growth side effect at the time.  The two doctors had been experimenting with a 1% solution of minoxidil mixed with several alcohol-based liquids.  Both parties filed patents to use the drug for hair loss prevention, which resulted in a decade-long dispute between Kahn and Upjohn. 

    Meanwhile, the effect of minoxidil on hair loss prevention was so clear that by the 1980s physicians were prescribing minoxidil off-label for balding patients.  In August 1988, the FDA finally approved the drug for treating baldness under the trade name Rogaine.  The agency concluded that although the product would not work for everyone, 39 percent of the men studied had moderate to dense hair growth on the crown of the head. 

    Gabapentin was developed by Parke-Davis.  It was first described in 1975 and was later sold under the brand name Neurontin.  It was first approved in May 1993 for the treatment of epilepsy in the United Kingdom and was marketed in the United States in 1994. 

    Subsequently, gabapentin was approved in the United States for the treatment of postherpetic neuralgia in May 2002, and a generic version first became available in the United States in 2004.  Newer formulations approved beginning in January 2011 added indications for restless legs syndrome and, again, postherpetic neuralgia.  So, even though it was never really a first-line medicine for epilepsy, it developed all these new indications.  In 2016, 64 million prescriptions for gabapentin were dispensed, up from 39 million in 2012, as reported in the New England Journal of Medicine. 

    Thalidomide was developed by German scientists as a drug with many potential uses, such as calming anxiety, promoting sleep, and alleviating nausea.  In the late 1950s and early 1960s, an estimated 10,000 to 20,000 babies with short limbs and other severe, often fatal birth defects were born in the United States, Europe, and Australia to women who had taken thalidomide as a morning sickness remedy in the first trimester of pregnancy.  Thalidomide was never approved in the US by the FDA and was banned worldwide in 1962. 

    A lot of the drug-approval requirements we have now – including the requirements for testing related to use in pregnancy – date back to the problems with thalidomide.  But research interest in the drug continued in the 1960s, and thalidomide was found to be useful in treating complications of leprosy.  The drug’s anticancer activity was discovered in research beginning in the 1990s, and in 2006, the combination of thalidomide and dexamethasone was approved for treating multiple myeloma.  Research demonstrated that a powerful derivative of thalidomide killed multiple myeloma cells by disabling overactive switches, called transcription factors, that drive the cells’ excessive growth.  Transcription factors are proteins that bind genes and increase their activity, and cancers are often driven by overactivity of these molecular switches. 

    How the drug worked remained a mystery, but in 2010, scientists in Japan reported that thalidomide binds to and inactivates a protein called cereblon, which is important in the normal development of limbs.  Cereblon is a component of a protein complex that tags proteins that need to be destroyed for the cell’s health, and it is key to embryologic development.  The researchers found that thalidomide’s inactivation of cereblon disrupted limb development, explaining the drug’s tendency to cause birth defects but not its ability to treat cancer. 

    Sildenafil, the active ingredient in Viagra, was originally developed to treat cardiovascular problems.  It was meant to dilate the heart’s blood vessels by blocking a particular enzyme called PDE5.  In animal tests, it seemed to work moderately well: researchers could find evidence that it was impeding PDE5, and the animals weren’t having any obvious negative side effects, so it was brought into a phase 1 clinical trial in the early 1990s to test whether humans could tolerate this new compound.  All seemed to be going well except for one weird thing – when nurses went to check on the men enrolled in the study, they found a lot of them lying on their stomachs.  The person who headed research and development at Pfizer while this research was ongoing said in 2016, “A very observant nurse reported that, saying that men were embarrassed because they were getting erections.”  It appeared that the blood vessels being dilated were not in the heart, but rather in the penis.  The sildenafil was working, just in the wrong part of the body, and with that, the so-called potency pill was born. 

    Viagra was approved by the FDA for use as an erectile dysfunction drug in 1998.  About a decade later, researchers began running new clinical trials to see if it could double as a heart drug, as originally intended.  Sure enough, in 2005, the FDA approved the same compound for pulmonary hypertension, a condition that restricts blood flow through the lungs and affects both men and women. 

    Botox begins with Clostridium botulinum, which was first discovered by Belgian scientists following a botulism outbreak in Belgium.  By the 1920s, scientists at the University of California-San Francisco had first tried to isolate the botulinum toxin, although it took another 20 years before the toxin was finally isolated in crystalline form by Dr. Schantz.  In the 1970s, scientists started using botulinum toxin to treat strabismus, or crossed eyes.  While testing this treatment on monkeys, researchers noticed that the botulinum toxin reduced wrinkles in the glabellar area of the forehead – the glabella being the area of skin between the eyebrows and above the nose.  After botulinum toxin proved successful in the treatment of strabismus, the company Allergan licensed the treatment and branded it Botox. 

    Subsequently, botulinum toxin, or Botox, received FDA approval for a variety of medical and cosmetic needs: glabellar frown lines in 2002; excessive sweating in 2004; chronic migraines and upper limb spasticity in 2010; urinary incontinence in 2011; and crow’s feet in 2013.  In the annals of medicine, botulinum toxin is probably most notable because it was the first microbial injection used to treat disease.  The injection of bacterial products into the human body represented a new invention, and with each passing year, researchers have developed more formulations for this versatile agent. 

    Bimatoprost, also known as Latisse or Lumigan, belongs to a group of drugs called prostamides, which are synthetic structural analogues of prostaglandins.  Bimatoprost, marketed by Allergan, is administered in both ophthalmic solution and implant form and is used to treat glaucoma.  One of the side effects found in treating patients with glaucoma was eyelash growth, and in 2008, the drug began to be marketed as Latisse for treating eyelash hypotrichosis, or sparse eyelash growth. 

    For ReachMD, this is COVID-19: On the Frontlines.  I’m your host, Dr. John Russell.

    For continued access to this and other episodes, and to add your perspectives towards the fight against this global pandemic, please visit us at ReachMD.com, and become part of the knowledge. 

    Thanks for listening.

     