“In the beginning” – three evocative words which preface many a tale. “How did we get here” are five which, in despair or confusion, can sometimes spark such narratives today. A single spark can ignite a forest fire, if ground conditions are tinder-dry and the winds of time blow maliciously. A butterfly in an Amazon rainforest, sucking nectar from a beautiful flower and flapping its wings, can reputedly have a similar impact, albeit it is harder to demonstrate the connection between such actions and their distant repercussions.
One idea can spread globally and be felt in all four corners of the planet for generations afterwards. Money, say – there’s an idea which spread, stuck and impacts on all our daily lives, yet that’s all it is: an idea, an institutionalised mode of interaction. In times gone by, there was no financial system, and our ancestors survived wholly successfully without it.
Equally, once, there were no gods, and religious interpretations of behaviour were not made. Lives passed without sin as people simply acted to survive, existing in balance with their environments and, who knows, maybe enjoying themselves in their Garden of Eden. We cannot judge them, nor should we seek to. Clearly they lived full and successful lives; else we would not be here today.
But life got complicated. Humankind evolved, moved from their origins and radiated out, over time, to those four corners of our so curvaceous planet. We developed money, power, control and religions as integral to the structures of our societies. Some moved faster than others; some societies thus became more elaborate, more quickly. More sophisticated, it was felt, more “developed”, more profitable, more civilised and more modern.
Society is now an integrated meshwork of specialisms, in order to function efficiently. Quid pro quo, mutual back-scratching and shared responsibilities allowed a far wider range of skills to be developed than any one person could ever accomplish. In fact, this process became a self-fulfilling prophecy, and tasks were ever more sub-divided as so much of an individual’s life was delegated to others to perform.
In the establishment of this system, the cross-generational transfer of the required knowledge, plus the internal assessment and cross-referencing of the expertise, led to apprenticing, education and, for many of the disciplines, the emergence of guilds. Druids, knights, masons, cooks, alchemists and scientists – these sorts of areas. And, of course, the Keepers of Good Health, the medical fraternity.
This essay is about Medicine and how the craft was compromised by a single idea, originating over two hundred years ago, which today rules stronger than ever and yet has neither any internal logic for its use nor any direct evidence of its ever having been of any benefit whatsoever.
On the contrary, evidence exists which strongly condemns the procedure as directly causing innumerable deaths and lifelong disabilities. Despite this, the industry supplying it is both immensely profitable and rapidly growing, annually making multiple billions of whichever currency you care to name. Opposition is simply ignored, or overruled and squashed.
It all started, history tells us, about two hundred and twenty years ago on a small, bucolic farmstead deep in rural England, when a country medic chatted with a farmer and they agreed to perform an experiment. Edward Jenner, an established forty-seven-year-old doctor practising around Gloucester, had listened to tales of good health abounding among dairy girls, who boasted that they developed a strength which allowed them to survive when others in their community succumbed to infection, and often death, from a scourge of the era: the dreaded “smallpox”.
There is no reason to believe that humans had ever been free of disease. However, since larger populations had begun to live cheek by jowl, in structured groups, with limited and sporadic diets and without access to provisions we today regard as essential, life was difficult for many and frequently also quite short. It seems reasonable to assume, though, that any fatalism would have been overruled by a strong sense of individual worth – the will to live.
Herbs and herbalism, healers and witchdoctors, druids and other sages all have ancient pedigrees – to be able to assist the ailing to return to full health must early on have been valued by every community. There are sketchy texts going back five thousand years – yogasutras, for example – and every reason to believe such knowledge was assembled over a far longer period, way back to the initial emergence of human understanding, much further back into evolutionary history.
Surely, though, their skills were overwhelmed when epidemics struck. Communities could be laid low, often decimated, by the cruel interventions of plagues. Faith in your healer was clearly an important ingredient in arriving at a positive outcome, but, when everyone fell ill, this was interpreted as being due to betrayal of faith by the whole population – not a faith in the healer but in a supernatural guardian. Their behaviour had fallen short and so this God responded by unleashing punishment.
In ancient Greece, Hippocrates was trained as a health practitioner by his father and grandfather. In practice, however, he refined his approach, removing the need both for divine intervention and, indeed, for much in the way of medicines. Instead, he believed the body knew, innately, how to heal itself and, given the clean, secure conditions this needed, could do the work best by itself.
Although Hippocrates is regarded as the father of medicine, perhaps for his analytic approach to his work, always noting his observations, this break with both faith and “lotions and potions” is sadly no longer maintained, even though the “Hippocratic Oath” is sworn by medical students at the culmination of their studies. Deep in that oath is the proclamation to “first do no harm”, intrinsic to Hippocrates’ teaching.
It is dangerous and yet so tempting to look back through history, note what records were made as time progressed and then reassess them in the context of today’s understandings and prevailing beliefs. A form of “confirmation bias” can so easily ensue, and selective sampling be enacted. Such hindsight is comfortable but also suffocating.
I read histories of infectious diseases and human reactions to them, but see these prefaced and imbued with today’s attitudes, assumed to be the only objective stance to take. Whereas Hippocrates assessed illness by the symptoms demonstrated, coupled with the time of year and the environmental conditions prevailing, illness today is described as being directly caused by a single microscopic organism. Different illness, different organism.
Hippocrates wrote a lot about “epidemics” but, although these cases were concurrent, the descriptions he attached to them were so diverse that modern analyses conclude simply that, although a great number of Athenians were ill, they were actually suffering a range of different illnesses. Back then, though, Hippocrates’ method was to aid the body to correct itself from its malaise which had arisen as an individual’s reaction to a cocktail of environmental circumstances.
Skip forward eighteen hundred years to 1348 and the medieval Dark Ages: epidemics and illness had not gone away. The word was plague and the outcomes gruesome. Infected, an individual had a high chance of a hasty death – three days was common – and a ghastly appearance, with the notorious black spots breaking out as gangrenous conditions developed. Otherwise, symptoms were largely as for more modern ailments like flu: sickness, diarrhoea, fever and headaches. In some forms, glands under the armpits would swell, hence the “buboes” of “bubonic” plague. Other forms of plague involved the lungs – “pneumonic” plague – and the blood – “septicaemic” plague.
Skip to today and we are told that, although it is now very rare and few succumb, the ailment is still known, and all forms are caused by infection with one bacterium, Yersinia pestis. The illness generally arises after being bitten by a flea which has itself first bitten a rat; the rat then acts as a sump for the tiny bacterium, which circulates in its bloodstream. Symptoms of plague in rats are not described, but word is that, again, it is not a pretty sight. Yes, rats can and do suffer and die from this illness. Whether fleas develop plague is not recorded…
Hippocrates would, however, have sought to extend his parameters of vision. How was the climate, what was the food like, could people reasonably restore their internal environments to an optimal state? Perhaps not! It is indeed well known that the plague was preceded by a major famine, from 1315 to 1322, brought about by extraordinarily wet weather. However, deteriorating living circumstances had been felt even earlier, from the mid-thirteenth century. A major volcanic event at Samalas in Indonesia, which exploded sometime between May and October 1257, had effects felt all over Europe. The English chronicler Matthew Paris wrote that during this year:
“the north wind blew without intermission, a continued frost prevailed, accompanied by snow and such unendurable cold, that it bound up the face of the earth, sorely afflicted the poor, suspended all cultivation, and killed the young of the cattle to such an extent that it seemed as if a general plague was raging amongst the sheep and lambs.”
And things got worse! So:
“Recent comparative studies of people’s lifespans in the early and late pre-plague periods have shown that it deteriorated significantly in the latter time, when mean survival was 31.6 years as opposed to 38.2 years. It has been speculated that this is a sign of a less robust health … which led to more severe consequences when the plague finally hit Europe in 1348.”
Origins of ideas are difficult to track, and most folk are perhaps too preoccupied with the minutiae of daily survival even to stop and think whence certain understandings came. The above, somewhat long-winded contextualisation has left an ambitious country doctor standing, ready to chat with a working farmer and to suggest to him a rather dangerous but, in our Doctor Jenner’s eyes, potentially beneficial experiment. The question “Beneficial to whom?” is well worth asking.
In Africa, for centuries, a practice had evolved whereby illness was shared between families within communities. We often talk of “witch doctors” keeping their tribesfolk well through spells, incantations and weird concoctions derived from the jungles – protectors of their people who were cussed, distant and dictatorial. That’s as may be, but, by all accounts, the process we discuss here was run by the mothers amongst their own children. The practice was: if one child developed sickness leading to spots “heading” and filled with pus, this substance would be collected and spread onto the arms of the other kids, possibly after gently scratching the skin surface first.
This could lead to some of the other children also becoming ill, but tribal logic had it that this “took the children through the illness in a controlled manner” and also got the process dealt with for all the kids together. Generally, the originally ill child recovered, and sometimes those “anointed” could become quite ill. Maybe some died, but this is not recorded.
In the 1960s and 70s, and even today in the developed world, some parents held “measles parties”, where one of the guests was ill with this spotty condition and, it was hoped, all the other youngsters present would catch it too – to “get it gone through and to develop a lifelong resistance to the condition”. Absolutely none ever died as a result of this custom.
In China, whether informed by the practice pursued in Africa or not, a similar habit prevailed. In chronicles dating back as far as AD 1000, an analogous technique is described. Here, though, the extracted pus was dried and then sniffed, or blown ceremonially into the nose. Many people, generally children, were so dosed and, again, it was felt, given resilience to fight off the illness later. How success was gauged is not described.
Would Hippocrates have utilised such an approach? Possibly, but it would not have impacted on his provisions for the patients in his care. Further, he would have been deeply worried about contravention of his tacit assumption that the individual holding responsibility for assisting patients in their return to good health should “First do no harm”.
Although the practice was established in popular use in Africa and the East, there was no early adoption in Europe. Rumour, emanating from Turkey, became experiment and then usage only in the eighteenth century. Word also travelled to North America, although there it often met vigorous objection. The idea was adopted by the nascent trade of medical practitioners and gained the title “variolation” in the early 1700s.
Variolation was closer to the African method: smallpox pus was introduced into a cut in the arm of the recipient, which was then bandaged up, and the recipient sent off, hopefully to suffer only a mild case of the pox, followed by a robust, self-generated barrier to any future bout of the illness.
This is really where the story starts to become murky and the reporting of outcomes diverges. People had for a very long time been succumbing to illness, both in single events and in epidemics. Surely there prevailed a dread of such an early and unpleasant demise, and any and all means to resist this eventuality would have been welcomed. Here was business.
In 1777, George Washington, commander-in-chief of the settlers’ Continental Army and, twelve years later, the first President of the United States, ordered mandatory variolation for troops who had not survived a smallpox infection earlier in life – possibly in reaction to the failure of General Benedict Arnold’s troops to capture Quebec from Britain the year before, when more than half of the colonial troops had developed smallpox. The illness was far more frightening than the military opponents, it was said.
There are no reports of the success or otherwise of Washington’s move. Suffice to say that much hostility to the procedure had already arisen. In Boston, there had been loud protest, for example against Cotton Mather, who had learned of the African experience from slaves recently brought to America and had started to encourage its local adoption. This was not a cure for smallpox, he realised: three or four percent of those “variolated” would die soon after. Yet smallpox, when it hit in epidemics, could have quite high fatalities – “fourteen to eighteen percent of those infected naturally would die”. A typical confusion of statistics, so common in this sphere to this very day. The missing figure is how many of those variolated would die when an epidemic came!
So, now, to Jenner’s experiment. Acting on the clue that the dairymaids’ exposure to cowpox enabled them to resist smallpox, he took some cowpox pus from spots on a recently ill cow’s skin and introduced it into a cut in the farmer’s son’s arm, which he then bandaged up. A couple of weeks later he repeated the operation, this time using pus taken from the spots of a human smallpox sufferer.
The boy survived these travails, and Jenner thus introduced “vaccination” to the vocabulary, from vacca, Latin for cow.
Now, let us consider – what is pus? A modern encyclopaedia offers this for the source of Jenner’s prophylactic:
“Pustule, a small circumscribed elevation of the skin that is filled with pus, a fluid mixture containing necrotic (decomposing) inflammatory cells. Pustules are often infected and have a reddened, inflamed base.”
Thus our early pharmacist collected, in non-aseptic conditions, a viscous, yellowish liquid containing necrotic cells and all manner of skin-surface bacteria, fungal spores and other matter. He then took these samples to his rooms, where they were dried and a powder prepared. This highly heterogeneous mixture of rotted human tissue, plus all manner of colonising micro-organisms, became Jenner’s elixir of health – “Have this rubbed into a gash in your arm and you’ll never develop the Pox”.
About ten years later the idea was accepted by the medical establishment, and the government awarded him some £30,000, equivalent to some two and a half million pounds today.
Word spread into Europe, where all Western countries soon moved to adopt the innovation. Word spread, too, to the United States, where a certain Dr Waterhouse quickly obtained a lucrative monopoly on supply of the materials, which he imported from London. Ten years later, in 1813, the burgeoning republic set up a National Vaccine Agency, possibly because of incidents such as that of 1802, when “sixty eight people died after material [from a full smallpox pustule] was used to prepare a batch of vaccines in Massachusetts”. This type of incident was not uncommon in either the New or the Old World.
Britain moved more slowly to introduce regulations. In 1840 vaccination was deemed the only way to offer smallpox prevention and variolation was banned. At the same time free doses of dried cowpox pus were offered to all infants. The medical journal The Lancet said this was too little, too late, as five children a day were dying of smallpox in London alone. The Lancet made no mention of how many of these youngsters had been either variolated or vaccinated.
Then, in 1853, Britain “made smallpox vaccination mandatory in the first three months of an infant’s life”. Parents who did not comply faced a fine or imprisonment. A further act, in 1871, re-emphasised this. However, major epidemics of smallpox still spread through the country, despite very widespread vaccination. The invention of the hypodermic syringe in 1860, replacing the slit-in-the-arm technique, did not stop death rates from the illness remaining very high.
In Leicester, where there was already strong uneasiness about vaccination, three hundred and fourteen people died of smallpox in 1871–72 and mass protest began. Many parents were gaoled and fined for refusing to participate in what they now regarded as a deadly falsehood. Despite an initial harsh reaction by the authorities, a change of heart came, and a far more Hippocratic method was adopted, whereby fresh air, cleanliness and isolation of patients became the illness-management regime and compulsory vaccination was dropped.
During the next smallpox epidemic, five years later, only six people died in Leicester. Elsewhere in the country there were the usual high death tolls, despite the continued use of vaccination. Or should that read “because of the continued use of vaccination”?
The idea was now established, however, and its influence grew ever stronger over the years that followed, despite much evidence, such as that from Leicester, that the process was clearly dangerous. Demonstrably, the approach of actively managing illness, while improving housing, diet, water supply and nutrition for all citizens, could save far more lives and at the same time not kill people through their reactions to the clearly toxic pus potions.
Far from remaining a folk tradition, whereby possibly infectious material was either breathed in through the nose or rubbed onto the skin surface, typically by earth-mother types, we now had a new international and seemingly lucrative business being established. Health preservation was becoming a commodity. Direct dispensation of the service was by the medical practitioners, now aided and abetted by scientists searching for other illnesses to treat in the same manner and by operators working to turn these discoveries into the materials for the physicians to use.
In this progression, the scientists certainly had to provide methods and materials for “vaccination” against other ailments – for the word acquired generic status, losing its reference to cattle and cowpox. In addition, though, these same scientists had to provide a rationale and a backstory to explain why they were doing what they were doing. It does not do to say “Er, just because it works”, especially when there is no good reason to believe either that it should or that it does.
Louis Pasteur, working in Paris, is generally regarded as the father of bacteriology, the science of micro-organisms. He invented the process we know as pasteurisation, to kill the bacterial colonies which would otherwise sour milk. Six thousand years earlier, brewers of barley beer had first boiled their water to sterilise it before adding a quantity of grain to quickly ferment an ale. Context, complexity, confusion. History.
Yes, Pasteur provided a start on the oh-so-needed backstory, whilst also famously producing an anti-rabies vaccine. Interestingly, when his papers were finally released from his self-imposed purdah, one hundred years after his death, his cooking of his results was exposed. This practice is also still very common in this field.
A chemist by training, he had moved into the biological sciences when consulted on fermentation processes by a grower. Around this time tragedy struck, as his eldest daughter died of typhoid fever; in the next six years two further children also succumbed. It seems Pasteur understandably saw fighting such illness as a crucial challenge, and I feel this may have affected his methodologies and judgement.
Illness was changing in the late nineteenth century and into the twentieth. Plague had long since been discounted as a threat in the developed world, and smallpox was now greatly diminished, as living standards had hugely improved. However, in 1894 the United States had its FIRST polio epidemic. Ten years later it suffered its LAST epidemic of yellow fever, despite no vaccine ever being used.
The backstory grew to provide a description of the physiological process whereby recognition of alien, chiefly protein, structures – three-dimensional structures – was shown to be achieved by specialist blood cells we all carry: B-cells, T-cells and so many more. The sciences of immunology and vaccinology were born.
And all to legitimise the process whereby, we are told, these systems need to be artificially stimulated in order to be properly equipped, at some time in the future, to defend us successfully against micro-organismal takeover, illness and possible death. How had we managed without this facility for four billion years, one wonders?
Instead of pus, which had been quietly dropped from the recipe, there were now more purified extracts, or “attenuated” – crippled – strains of the potentially infective organism, plus additional assorted chemicals, generically entitled “adjuvants”, aiming to harness and co-opt our inbuilt biological systems and also to preserve the extracts from other bacterial contamination.
Frequently now, up to six potential illnesses can be “fought off with one jab”. UK health minister Edwina Currie gave a broad smile as she announced the MMR – “One jab gives you a lifetime’s protection against Measles, Mumps and Rubella” – only very soon to be forced to retract the statement. The leading present-day American driver of the vaccination industry, as well as kingpin of its regulation, multi-millionaire Dr Paul Offit, forcefully states that a child could be “given ten thousand vaccine ‘antigens’ simultaneously with no harm”.
Where are we now? What is the nature of the idea implanted so deeply and exactly how did it originate?
Now, bacteria and other micro-organisms are our constant companions; they colonise every square inch of our external surfaces, including large parts of our guts and our bronchial systems. Biologically, we have evolved with them since life developed on this planet – four billion years, give or take a moment, of collective development. During that time, as species have evolved and diverged, from single cells to Wellingtonia and Brontosaurus, trial and error have ironed out inconsistencies. Each of us is a fantastically complicated ecosystem and part of a greater one. Our physiologies have evolved to support this, and we look to improve our defences as the products, not the fabricators, of this process.
Jenner’s medicalisation of passive folk habits was the single idea which stole health from the individual, commoditised it and broke humanity’s connection to the noble Hippocrates. Now vaccination has become the foundation of the healthcare system – it provides customers for life!
[Entry to a writing competition run by The Spectator this spring. Sadly, I don’t think I’ve won, as I’ve not heard back from them yet!]