“The desire to take medicine is perhaps the greatest feature which distinguishes man from animals.”
That was the opinion of Sir William Osler, graduate of McGill University, professor of medicine at McGill, one of the founders of the Johns Hopkins University School of Medicine, and a man often called the Father of Modern Medicine.
Osler introduced the concept of clinical clerkship, insisting that third- and fourth-year medical students be exposed to hands-on experience with patients. He also pioneered the idea of a residency program for medical graduates, to further hone their skills. He reportedly asked for no other epitaph than that he taught medical students in the wards, something he considered his most useful and important work. By putting less emphasis on lectures and books, and more on practical skills, he fundamentally changed the way medicine was taught.
“Listen to your patient,” he told his students. “He’s telling you the diagnosis.”
Osler’s statement about man’s desire to take medicine, made in a public lecture, is widely quoted, but his follow-up sentence isn’t:
“Why this appetite should have developed, how it could have grown to its present dimensions, what it will ultimately reach, are interesting problems in psychology.”
He was expressing his concern that while physicians “have gradually emancipated ourselves from a routine administration of nauseous mixtures on every possible occasion, and when we are able to say that a little more exercise, a little less food, and a little less tobacco and alcohol, may possibly meet the indications of a case, you, the people should wander off after all manner of idols, and delight more and more in patent medicines and be more than ever at the hands of the advertising quacks.”
It isn’t surprising that Osler was critical of the use of medicines at the time, because most of them did not do much good.
“One of the first duties of the physician is to educate the masses not to take medicine,” he maintained, believing in the self-limiting nature of disease.
Osler himself prescribed relatively few drugs, his basic armamentarium consisting of quinine for malaria, digitalis for heart failure, opiates for pain and coughs, and iron and arsenic for anemia. Instead of drugs, he recommended bleeding the patient for a variety of conditions including pneumonia, stroke and mumps, and he suggested acupuncture for sciatica and neuralgia. He also thought that for nosebleeds there was no harm in trying the insertion of a cobweb into the nostril.
That, of course, sounds pretty curious to us, but throughout history people have resorted to every imaginable remedy for their ailments, from a toothache poultice made of “mashed mouse” to a whiff of flatulence stored in a jar to ward off the Black Plague.
Most of the treatments failed, but eventually, through trial and error, at the expense of much misery, some effective drugs did emerge. As early as 70 AD, Dioscorides described the use of the seeds of autumn crocus to treat gout. The active ingredient, colchicine, was not extracted and identified until 1820. Some drugs, penicillin being a classic example, were discovered by accident; others, such as Taxol, for cancer, by a meticulous search for physiologically active compounds found in nature.
Today, drug research focuses on molecular structure and known mechanisms of action. Gleevec (imatinib), a drug that dramatically increases the survival rate in chronic myelocytic leukemia (CML), was developed based on the finding that CML patients produce an abnormal version of the enzyme tyrosine kinase, which in turn leads to an overproduction of white blood cells. Knowing the molecular structure of the enzyme, researchers were then able to design and synthesize a compound that would inhibit its action.
In some cases, the effectiveness of a drug for an ailment was discovered when it was being used for a different condition.
Antidepressants known as monoamine oxidase inhibitors (MAOIs) are a classic example. In 1951, isoniazid was introduced as a treatment for tuberculosis with great success. But concerns about bacterial resistance soon arose, and when that happens, chemists tinker with a drug’s molecular structure to develop derivatives that can help stave off resistance. Within a year, iproniazid was ready for testing in tuberculosis hospitals. While it turned out not to be effective against TB, the drug had a remarkable side effect. Doctors and nurses noted a significant improvement in patients’ mood, with some even taking to dancing in the hallways. Not a common sight in any hospital.
As it turned out, iproniazid inhibited the action of monoamine oxidase, an enzyme that normally degrades norepinephrine and serotonin, two compounds that control mood. Inhibition of the enzyme raises levels of both and leads to feelings of happiness. After favourable results were obtained in testing on depressed patients, iproniazid hit the marketplace as Marsilid, only to be withdrawn in 1961 because of liver toxicity. But iproniazid had demonstrated the principle of antidepressant action, and opened the way to the introduction of other monoamine oxidase inhibitors, which are still in use, although they have mostly given way to the newer selective serotonin reuptake inhibitors (SSRIs).
Other drug actions that have been discovered in this fashion include sildenafil (Viagra), first introduced as an anti-anginal medication. However, it was its ability to elevate more than just mood that brought it fame and fortune.
Amphetamine was introduced as a treatment for congestion and asthma, but it was its stimulant side effect that led to its use by both the Allies and the Germans during the Second World War as a performance-enhancing substance. A further surprise came when amphetamine, despite being a stimulant, proved to be effective in the treatment of attention deficit hyperactivity disorder (ADHD).
So then, is it really the desire to take medicine that distinguishes us from animals, as Osler opined?
I would suggest it is the making of medicines rather than taking them that sets us apart.
After all, chimps have been known to seek out certain plants when they feel ill, but only humans have the ability to isolate, identify, and perhaps improve upon the active ingredient. Who knows, maybe our next drug will come from some sort of monkey business.