Taking the Medicine: A Short History of Medicine’s Beautiful Idea, and our Difficulty Swallowing It


by Druin Burch

This is all very fine, but it won’t do – Anatomy – Botany – Nonsense! Sir, I know an old woman in Covent Garden who understands botany better, and as for anatomy, my butcher can dissect a joint full as well; no, young man, all that is stuff; you must go to the bedside, it is there alone you can learn disease.

  His frank views went along with enough personal warmth to make him attractive to many leading thinkers. Boyle was a close friend, as was the philosopher Locke. Yet for all Sydenham’s advances in epistemology and observation, for all that he prompted doctors to pay more attention to the natural history of diseases, recording their signs and symptoms, progressions and outcomes, the benefit for patients was almost nil. In the end, Sydenham’s greatest therapeutic tool was his willingness to withhold medicines. ‘The arrival of a good clown exercises a more beneficial influence upon the health of a town’, he wrote, ‘than of twenty asses laden with drugs.’ Finding a patient brought to a state of physical and emotional collapse – not by disease, but by the drugs that others had given to bring on vomiting and diarrhoea – Sydenham ‘therefore ordered him a roast chicken and a pint of canary’.

  Sydenham’s therapeutic nihilism, a disbelief in the purported value of medicines, was profound. ‘I confidently affirm that the greater part of those who are supposed to have died of gout’, Sydenham declared, ‘have died of the medicine rather than the disease.’ As a gout sufferer himself, he worked his way through the available treatments, concluding that they were more toxic than therapeutic. He was not the first to decide that masterly inactivity was frequently the best option, but he was unusually open in his views. ‘It is a great mistake to suppose that Nature always stands in need of the assistance of Art,’ he argued, referring to the art of a doctor. ‘I have consulted the safety of my patient and my own reputation effectually by doing nothing at all.’

  Sydenham’s approach to bleeding verged on the revolutionary – rather than calling for a leech or a lancet at every opportunity, he called for them with relative moderation. He recognised the benefits of laudanum, although he was not able to distinguish between those of the drug itself and those it brought about by helping a patient to escape from more poisonous ‘cures’. He was suspicious of the extra ingredients added by Paracelsus, and simplified the recipe. In a medical world that prized the complexity of drugs – the more stuff, and the more exotic, the better – this was an accurate and fairly original intuition. Woodlice, human skulls, supposed unicorn horn, pearls, snakes and the contents of animal guts were routinely added to preparations. It was called polypharmacy, indicating the number of constituents, and it carried on until nineteenth-century chemists became confident that what was important were the properties of particular active ingredients – an insight that developed into the idea of molecular receptors, cellular locks that responded only to keys of a specific microscopic structure.

  For laudanum, Sydenham recommended two parts of opium to one of saffron, along with some cinnamon and some cloves, all mixed in with sweet wine. Cloves possess some mild local anaesthetic properties, but like the other spices and the choice of a popular (and relatively expensive) drink, their main purpose was far more practical. They tasted good. That helped the medicine go down. ‘The act is all,’ said Goethe in Faust, ‘the reputation nothing.’ Doctors knew better.

  The fact that Sydenham dissolved opium in wine – canary wine, similar to the Madeira we have today – has an aspect to it that is easy to overlook. Wine and poppies went together. They provided ease and forgetfulness, but also alertness and a sharpening of the senses. When Samuel Taylor Coleridge, in 1817, wanted a word to describe what laudanum did, he coined a new one: intensify.

  Part of the reason for our modern horror of opiates comes as a side effect of our war on drugs. The penalty for transporting coca leaves is the same as for cocaine; for highly concentrated morphine, the same as for the unprocessed latex of Papaver somniferum. Potency therefore carries a premium. If you are going to risk yourself in producing and moving illegal drugs, it is in your interests to shift them in as concentrated a form as possible. So the legal dangers of drug dealers become the physiological ones of their customers. Heroin now finds its way into every city, while the milder alternatives that were so common throughout human history – poppy tea, or home-made laudanum – have gone. All that has remained is a love of the poppy for the sake of its appearance.

  How was it that doctors persisted in prescribing remedies that helped to kill their patients, yet the profession of medicine continued? How did doctors maintain a reputation for being helpful while causing harm?

  The nineteenth-century Boston physician Oliver Wendell Holmes thought he had the answer. What people desired most was something to believe in, and they were willing to pay heavily for it:

  There is nothing people will not do, there is nothing they have not done, to recover their health and save their lives. They have submitted to be half drowned in water, half cooked with gases, to be buried up to their chins in the earth, to be seared with hot irons like slaves, to be crimped with knives like codfish, to have needles thrust into their flesh, and bonfires kindled on their skin, to swallow all sorts of abominations, and to pay for all of this, as if to be singed and scalded were a costly privilege, as if blisters were a blessing and leeches were a luxury.

  To this day, little in medicine is so difficult as doing nothing at all. Medicine is founded on the desire of patients to be helped and of doctors to help. These desires outweigh sense. The difficulty of doing nothing, or of admitting that there is nothing to be done, is overwhelming. Like politicians who need to be seen to be doing something – anything – about problems that are actually beyond their control, doctors are pushed into playing a part. The danger, with both doctors and politicians, comes when they start to believe in their own illusory importance. People want doctors who are confident, certain, able to offer treatment. The confidence that makes people trust doctors has a way of working its way into the doctor’s character. Persuading people to trust your judgement is essential – if people are going to feel cared for, if they are going to feel safe enough to follow your advice, or at least to be soothed by it – and it is always easiest to persuade people if you have convinced yourself first of all.

  ‘As for a radical cure, one altogether perfect,’ wrote Sydenham, ‘and one whereby a patient might be freed from even the disposition to the disease – this lies, like Truth, at the bottom of a well; and so deep is it in the innermost recesses of Nature, that I know not when or by whom it will be brought forward into the light of day.’ This was honest, but not the sort of honesty a frightened patient wants to hear. The patient wants someone offering confidence and hope, not the audacity of doubt.

  Francis Bacon, and the others after him who developed what we now call scientific method, were not developing ways of using pipettes, or rules about wearing white coats and working in laboratories. The mental tools they developed had nothing to do with particular pieces of laboratory equipment. Journalists talk of ‘scientists’ as though they are a different species from the rest of us, rather than being any person trying to make his or her beliefs more accurate by testing them. A child skimming rocks on the surface of a pond is engaged in a sort of science, gradually experimenting with the shapes of the stones and the angle of their throw to get as many bounces as possible. A man with a beard and a PhD working with sophisticated machines is, if he is not testing his theories, engaged in something that is not science at all.

  Struggling to understand the best ways for learning about the world, Francis Bacon tried to work out why it was that we so often got it wrong, taking mental routes that led us away from reality. At the end of the sixteenth century and the start of the seventeenth, in small uncertain steps, he was distinguishing fertile thought from a different kind of mental activity, one that produced illusions that seemed real and mistakes that felt right. Bacon’s love for truth was real to him, as was his fear of those things in his mind that distracted him from recognising it:

  The idols and false notions which are now in possession of the human understanding, and have taken deep root therein, not only so beset men’s minds that truth can hardly find entrance, but even after entrance is obtained, they will again . . . meet and trouble us, unless men being forewarned of the danger fortify themselves as far as may be against their assaults.

  Our minds, said Bacon, have a habit of seeing order where none exists, of making connections because they appeal to us rather than because there is evidence for them. We have individual prejudices, clouding our minds and pushing us away from the truth purely because some conclusions taste better to us than others. Words matter also, and we get some things wrong simply because we are muddled over expressing them, allowing vagueness or confusion to spill into our thoughts from the phrases we house them in. Then there are the errors brought on by success, by teachings and arguments whose popularity and appeal are beyond their actual value. Arguments are not necessarily won by those in possession of the truth, but by those who are the most persuasive.

  These mental errors, Bacon said,

  have their foundation in human nature itself, and in the tribe or race of men. For it is a false assertion that the sense of man is the measure of things. On the contrary, all perceptions, as well of the sense as of the mind, are according to the measure of the individual and not according to the measure of the universe. And the human understanding is like a false mirror, which, receiving rays irregularly, distorts and discolours the nature of things by mingling its own nature with it.

  Science provided a system whereby people could rescue themselves from muddles, and guard against mistakes. There was no perfect way of avoiding mental errors, any more than there was of avoiding disease. The best that could be hoped for was to remain aware of how inevitably mistakes arose, and to use tests and trials to continually weed them out.

  3 Self-confidence and Quinine

  ‘I die by the help of too many physicians’ was supposed to have been the final sentiment on the lips of Alexander the Great in 323 BC. Four hundred years later, Pliny suggested a new epitaph was becoming common, echoing Alexander. ‘It was the crowd of physicians that killed me’ was an easy enough sentiment for someone to declare upon his death, a harder one to accept in the days leading up to it. Blaming doctors for their failures did not stop people flocking to them hoping for success.

  The knowledge that medicines were toxic was widespread, but this augmented rather than undermined the impression that doctors commanded therapeutic power. If medicines were dangerous, that meant they were powerful – and even if the power to harm was what was seen, the power to help was imagined to accompany it. It was difficult, in affliction, to resist the comforting idea of medical help. The greatest of doctors achieved their stature partly by unflinching self-belief. Galen’s confidence, for example, was part of his appeal. It was confidence robust enough to survive any collision with reality. He said of one potion:

  All who drink of this treatment recover in a short time, except those whom it does not help, who all die. It is obvious, therefore, that it fails only in incurable cases.

  Fevers, Galen believed, arose from an excess of blood. The treatment was therefore clear. (Galen’s faith in bleeding was extreme. He even recommended it as a cure for blood loss.) For a fever, a patient should be bled twice a day, the second time to the point where they fainted. Galen based his beliefs about the human body on complex theories of internal humours and their differing effects. He was contemptuous of those healers whose lack of theoretical beliefs left them reliant on experimenting.

  Belief in bleeding persisted with phenomenal longevity in the minds of medics. Here is Sir William Osler, a founding professor at Johns Hopkins University and later the Regius Professor of Medicine at Oxford, on pneumonia:

  To bleed at the very onset in robust, healthy individuals in whom the disease sets in with great intensity and high fever is good practice.

  He was writing in 1920, almost two millennia after Galen. The theories underlying the treatment had changed. Osler knew more about the human body and the microbes that cause pneumonia than Galen had. What was unaltered was the effect of bleeding on a patient suffering from this infection of the lungs. It remained bad. The theories changed, the harm remained the same. Galen’s poisoned gift to his profession, even more than his belief in bleeding or his list of 473 different drugs, was his complacency. He wrote:

  I have done as much for medicine as Trajan did for the Roman Empire when he built bridges and roads through Italy. It is I, and I alone, who have revealed the true path of medicine. It must be admitted that Hippocrates already staked out this path . . . he prepared the way, but I have made it passable.

  One of the key infections that Galen dealt with was malaria. The characteristic cycle of chills and fevers makes descriptions of the disease by early doctors recognisable, even when they understood little of what they were recording. Malaria is caused by a protozoan parasite, a single-celled organism bounded by a flexible membrane rather than a rigid cell wall, and possessing a means of propulsion – features that make it a little more like an animal than a plant. Plasmodium, the protozoan genus concerned, has infected humans long enough for many of us to have evolved genetic defences to it. Most likely it has existed as long as our species, since related protozoans affect chimps and other primates. It seems to have originated, like us, in Africa. When people migrated, they took it with them. The plasmodium spends only part of its life in humans, taking up residence the rest of the time in mosquitoes. The bite of a mosquito is the way in which the disease spreads from person to person – or, to put it from the insect’s point of view, malaria goes from one mosquito to another when they eat from the same walking table.

  For the First World, malaria today is a holiday problem. Elsewhere, two thirds of a billion people fall sick with it each year and several million, mostly African children, die. There are no vaccines. Drugs, however, can successfully protect against and treat the disease. The oldest of these is the bark of a South American tree, cinchona, containing a compound called quinine, which is poisonous to the protozoan.

  In England the disease used to be called the ague, from a word for fever. It came upon people for no obvious reason, although many linked it to swamps and to foul air. Not until Horace Walpole fled Rome in the summer of 1740, keen to escape the disease, did the English begin to adopt their modern term. There is, said Walpole, writing home to a friend, ‘a horrid thing called the mal’aria, that comes to Rome every summer and kills one, and I did not care for being killed so far from Christian burial’.

  The swamps and marshes around Rome, as Walpole observed, were known to give rise to the disease. Religion, at least as judged by one’s rank in the Church, was no defence: popes and cardinals lived in fear of malaria, and died as easily as their humbler brethren.

  Around 150 years before Walpole’s letter, at the start of the seventeenth century, the Spanish had begun bringing back to Europe the bark of a particular South American tree. Jesuit priests in Peru found that the natives used it, chiefly for treating wounds. The quina-quina tree gave out a balsam, and as well as being useful for wounds it seemed to work for fevers. It had no particular value for malaria, but its use caught on all the same. This ‘Peruvian balsam bark’, however, was expensive. To supply the demand, merchants began sending home an alternative bark instead. At first it was used haphazardly and without great interest. The prevailing atmosphere was hostile to innovation: in 1624 Pope Urban VIII issued a papal bull excommunicating all smokers of the newly introduced tobacco; in 1633 he demanded that Galileo recant his ideas about the universe.

  Others were more open to new ideas. In 1643 a Belgian doctor referred to the substitute bark – which came to be called árbol de calenturas or fever tree – being used in Europe to treat malarial fevers. It also took the name quinine, from the original quina-quina tree that it had been introduced to replace. Interest in Rome was driven by Juan de Lugo, a Spanish cardinal, who kept a supply of the bark, selling it at great price to the rich and giving it away freely to the poor. This bark, ground into powder, was the first European medicine that actually cured the patients who took it. Opiates dulled pain, but did not increase survival. For the first time in history, here was a drug that did something better. It was miraculous, and yet faith in existing nostrums was so profound that most people failed to notice.1

  When Urban VIII died in 1644, fear of the Roman malaria meant many cardinals refused to cross the fever-ridden flatlands surrounding the city for the conclave picking his successor. In the year that Innocent X was elected to his office, Cardinal de Lugo asked the new Pope’s doctor what he thought of the powder’s power. The verdict was glowing, although the papal physician saw nothing out of the ordinary about the powder’s excellence. In the next few years Juan de Lugo’s reputation and influence rose. He began distributing the bark more widely, both from his own house and from the Collegio Romano, the supreme seminary of the Jesuits. It was taken up more enthusiastically for being patronised by this powerful base. Major Jesuit congregations were held in Rome in the years 1646, 1649 and 1650. Desire for the pulvis cardinalis or pulvis Jesuiticus – the cardinal’s powder, the Jesuits’ powder – grew. De Lugo, supported by Innocent X, preached to the Jesuits of its utility. The impressed brethren returned to their corners of the Catholic empire with enthusiasm on their lips.

  By 1651 the powder had found its way onto an official formulary, a list of permissible and recognised drugs. The Schedula Romana, amongst the hundreds of useless and harmful medications, now included one that healed. The next year, 1652, Archduke Leopold of Austria was struck down with a malarial fever. Treated with the new bark, as recommended in the Schedula Romana, Leopold rapidly recovered. A month later, however, he became feverish again. Rather than opting for another dose of the excellent powder, Leopold ‘was so incensed that . . . he ordered his physician to write a book attacking the remedy and warning against its dangers’. Other doctors joined in, their own prejudices trumping their ability to perceive the drug’s life-saving effects. In 1655, bubonic plague hit Rome, and when the feverish victims of this quite different disease were treated with Jesuits’ powder they got no better. The bark fell out of favour.

 
