This article is part of a longitudinal series on the humanities.
Those who cannot remember the past are condemned to repeat it.
– George Santayana, 1905
Current standards of medical practice in the developed world are high, supported by scientific discoveries that provide a clearer understanding of how to prevent, investigate and treat a wide range of health problems. In particular, advances in immunology and genetics might allow management strategies to be tailored to individual circumstances – ‘personalised’ medicine. Communications technology provides more health information on demand, although not all is accurate or helpful. Artificial intelligence (AI) has the potential to improve information management, particularly if ‘live’ sources can be searched and critical appraisal algorithms improved. Meanwhile, a well-trained medical workforce guides patients through a ‘maze’ of clinical pathways that reflect the complexity of managing multiple comorbidities in an ageing society. Although equity of access remains variable and good outcomes cannot be guaranteed for all, both quality of life and life expectancy are increasing. The future is likely to see a continuation, perhaps an acceleration, of these trends.
However, these advances have not occurred in a vacuum. Without detracting from their value, reflecting on how medical practice has evolved provides interesting insights into how the past has informed current practice. Medical practice has existed in some form for a very long time. Documented archaeological findings were the primary source of older information until handwritten manuscripts were stored securely, usually in monasteries and then university libraries. Wider dissemination followed the invention of printing presses early in the Renaissance period, and much of this information is now accessible online. Seeking this information is at the core of history, defined by the Cambridge Dictionary as ‘the study of (or a record of) past events considered together, especially events of a particular period, country, or subject’.1
Caution is required in exploring history because it is contestable. Another aphorism of uncertain origin, ‘borrowed’ by many, is: ‘History is written by the victor’. There is inherent potential for bias, depending on who ‘holds the pen’. Disagreement is usually less about ‘what’ and ‘when’ and more about ‘why’ and ‘so what’. ‘Hindsight’ might be informed by the availability and accessibility of information and influenced by personal, societal and cultural factors. It is easy to overlook developments recorded in other languages and from other cultures. Sometimes, interpretation is viewed through either ‘rose-coloured’ or very dark spectacles. At the extremes are attempts to ‘rewrite history’ and claims of discovery that do not acknowledge past advances.
If medical history is so interesting and important, should it be a formal part of the primary medical curriculum? The case for inclusion is that history contributes to the cultural grounding of modern medicine and to professional identity formation;2,3 assists clinical reasoning by informing understanding of what has failed and why;4 and is so closely tied to social history that it improves understanding of complex issues.3,5 However, balancing competing curriculum demands amidst expanding knowledge leaves little room for ‘soft’ content. Australian data are hard to find, but anecdotally, medical history has not been a significant part of medical teaching for several decades. North American surveys of medical curricula during the twentieth and early twenty-first centuries showed continuing declines in the explicit inclusion of medical history.6,7 Even then, references to history were disorganised and scattered.
This paper presents a selection of eight topics from medical history that provide lessons for current healthcare. There are, of course, many more than eight topics and readers are encouraged to read widely and develop their own list.
Observation (carefully documented) drives ‘organised curiosity’ and can result in significant advances
Nothing has such power to broaden the mind as the ability to investigate systematically and truly all that comes under thy observation in life.
– Marcus Aurelius 121–180 CE
In an era in which randomised controlled trials are the gold standard for the evidence that drives practice change and better outcomes, it is easy to overlook the many innovations that began with observation in laboratories and clinical practice. Information from ‘primary sources’ (eg first-hand letters, field notes by original participants) is likely to be more accurate, and ideally both facts and interpretation are similar across different sources. This supports credibility, transferability, dependability and confirmability,8 the qualitative research equivalents of validity and reliability. Throughout history, scientists and clinicians have advanced knowledge and practice through careful attention to what they do. Hippocrates carefully described presentations and likely diagnoses in a list of about 400 aphorisms, many of which remain correct.9 Hippocrates also developed the miasma theory of disease,10 based on the observation that infectious diseases were more common on low, wet and relatively airless ground, amidst ‘bad air’ or ‘malaria’, a term later attached to the mosquito-borne parasitic infection. In response, hospitals have since usually been built on higher land with good ventilation. The first successful vaccine was based on the simple observation that milkmaids who had had cowpox, a similar but much milder disease, did not get smallpox. Puzzled by this, Edward Jenner theorised that cowpox infection somehow protected against smallpox. In 1796, he inoculated a boy with cowpox virus from a recently infected milkmaid and the boy did not get smallpox.11 Louis Pasteur took this further by showing that inoculating chickens with attenuated chicken cholera bacteria (1879) and farm animals with attenuated anthrax bacteria (1881) prevented infections.12 A more recent example is the intensive care unit (ICU) doctor who noticed that Irukandji syndrome (a tropical jellyfish envenomation) had similarities with eclampsia. He tried a magnesium infusion and the patient improved.13
Successful public health disease control measures are not new
Study the past if you would define the future.
– Confucius 551–479 BCE
Our understanding of pandemics and their management remains much the same as what was learned during previous plagues and pandemics. Throughout documented history, the origins of pandemics were initially obscure and blamed on somewhere else. The diseases spread through transport and trade by travellers. Successful management strategies included: early detection and isolation of patients and close contacts; restriction of movement of the population; closure of national borders; and restrictions at ports, with quarantine (literally, 40 days of isolation) of people arriving from elsewhere.14 Does this sound familiar? Although better known in European history over several centuries, bubonic plague outbreaks occurred intermittently between 1900 and 1925 along the eastern seaboard of Australia, in the major ports of Sydney, Brisbane and Townsville – the reason that quarantine stations were built there. The ‘Spanish flu’ probably did not originate in Spain, and its international spread was hastened by the movement of soldiers during and after World War 1. This pandemic lasted much longer than many think – until the early to mid-1930s in some parts of the world – as sufficient herd immunity developed.15 In the recent COVID-19 pandemic, the same public health measures were relied on while awaiting sufficient global herd immunity from vaccination, a point that, in late 2023, we have not yet reached. Combining vaccines with rapid international travel might result in a shorter pandemic period, but new variants also develop and spread rapidly, so this pandemic is not yet over. Managing the next pandemic will rely initially on the same public health measures.
Understanding basic principles of preventing sepsis might be more effective than complex interventions
Hygiene is two-thirds of health.
– Old Lebanese proverb
The generally good health that we enjoy now might have more to do with innovations in engineering than in medicine. The distribution of clean drinking water and the separate removal of sewage by the Romans contributed much more than medical advances to preventing disease.16 Although it now seems obvious that basic hygiene measures are crucial to the quality of healthcare and patient outcomes, several pioneers made significant contributions while swimming against the tide of official policy. Florence Nightingale developed modern nursing during the Crimean War through promoting cleaning of wounds, better food and clean, more spacious hospital wards,17 reducing mortality to around 2%. Semmelweis showed that doctors carried germs between patients, often spreading fatal infections that could be prevented by hand washing between patients.18 Mortality fell from 18% to 1%, but his medical peers rejected his theory and had him committed to a mental institution where, ironically, he died of a wound infection. Pasteur developed his germ theory in 1855 after noticing that the spoiling of milk, beer and wine by fermentation required living organisms that could be removed by bubbling oxygen through the liquids (for anaerobes) or heating them to between 50 and 60 degrees Celsius (pasteurisation).19 Joseph Lister brought these and other contributions together in 1867 to develop an antiseptic approach to surgery.20
Many of the advances in medical care are the result of warfare
War is an evil genius.
– William Jennings Bryan 1860–1925
This rather controversial statement is perhaps best explained in the College of Surgeons exhibition in Edinburgh, which is well worth a visit. The management of trauma has arguably benefited most, as for centuries the mortality rate of wounded soldiers was very high. A slow, painful death often awaited those not killed quickly. Military doctors had to adapt and act quickly to save lives. Ligatures to reduce blood loss were introduced in the sixteenth century CE in Italy.21 Retrieval of the wounded by horse carriage to safer medical facilities began in the early nineteenth century – the first ‘swoop and scoop’ strategy.22 Anaesthesia (available from the mid-nineteenth century) significantly reduced deaths from amputations and allowed surgery to develop. World War 1 saw the widespread use of blood transfusions and more rapid retrieval of the wounded by ambulance volunteers. World War 2 saw widespread use of antibiotics and metal plates for internal fixation of fractures. Wars since then have improved the management of burns, both initially and in reconstructive surgery, increased the speed of retrieval (helicopters) and pioneered robot-assisted surgery from remote centres.23 More recently, wars have resulted in an improved understanding of ‘war neurosis’ – post-traumatic stress disorder.24 It has become clear that the best way to reduce battle injuries is to reduce close human contact, resulting in increased use of protective equipment, remotely controlled drones and the development of robotic soldiers. Future wars might involve non-human combatants run by humans sitting at AI-guided consoles. Ideally, wars should be avoided, as their main contribution to morbidity and mortality is the impact on non-combatants.
Many important discoveries reflect ‘fortunate’ findings that were not the focus of the research or clinical practice activity being conducted
There’ll always be serendipity involved in discovery.
– Jeff Bezos (1964–)
Interesting, relevant and sometimes major findings come from research into something different, although often related. Alexander Fleming left out a petri dish in 1928 while on holiday. On his return, a mould had inhibited bacterial growth; he identified it as Penicillium and named its antibacterial product penicillin.25 Thus began the era of antibiotics, which has revolutionised the management of trauma, joint replacements and bacterial infections. In 1895, while studying the physics of cathode ray tubes, Wilhelm Roentgen found that a mysterious ray (‘X’ for unknown) cast an image of bones.26 Bones could be observed directly and certain pathologies diagnosed. Thus began the development of medical imaging and the exploration of the potential use of other rays and magnetic fields.
Many remedies might be more dangerous than the diseases they aim to cure
First, do no harm.
– Hippocrates 460–375 BCE
Just like contemporary general practitioners, early doctors mostly dealt with symptoms and poorly differentiated problems for which precise causation was unclear and specific treatments were unavailable.27 Heavy reliance was placed on bedside manner and symptomatic treatment. Then, as now, patients felt comforted by doctors who were caring and usually accepted their advice, even when outcomes were poor. This might be because they almost always did something, based on experience and the societal views of the time. However, not all prescribed treatments were beneficial. Fever was recognised broadly as infection, and for centuries piling on blankets to increase body heat was thought to ‘sweat out’ the cause. Herbal remedies were common; some contained effective drugs, but others combined ineffective drugs with potent side effects. Many of the latter were later recognised as poisons (eg arsenic) or as effective for other purposes (eg digitalis, atropine). Purging was used to clear ‘poisons’ from the body. Pain has been treated with opium products for 4000–5000 years,27 but unless causes were treated (pus drained or limbs amputated), outcomes were poor. Bloodletting was common for vaguer presentations. In Hippocratic times, treatments often relied on offerings to the gods, and religion has often played a significant role in healthcare. Science has made the art of medicine safer.
A degree of scepticism is healthy
Do not consider it proof just because it is written in books, for a liar who will deceive with his tongue will not hesitate to do the same with his pen.
– Maimonides 1138–1204
Just because something is in writing (journals, books or websites) does not mean it is correct.28 ‘Quackery’ is as old as medicine and comes in many forms. The most challenging source of troublesome information now might be widely accessible self-published material (websites, in-house journals and books) that conveys a false sense of authority. Relying on ‘pre-digested’ evidence in guidelines and updates might be risky; much depends on how this was done.29 Patients are now targeted directly with false but plausible explanations and treatments that do little good or even cause harm. Social media might have some strengths, but disseminating reliable information is not one of them. Even published research can be misleading, as unreliable research evidence is ubiquitous. Sometimes this is accidental, involving inappropriate methods, analysis and interpretation. ‘Negative’ findings are published infrequently. A strong personal conviction can drive attempts to prove it correct, as in the famous ‘MMR vaccine causes autism’ controversy of 1998 that continues to influence some parents.30 Falsification of results still happens, even in large research laboratories (‘I know this works; I just need more data for statistical significance, so I will inflate the numbers’). These practices can distract thinking and divert resources from more important issues, even though the motivation might not be malicious.
Many successful management strategies have little supporting evidence
Absence of evidence is not evidence of absence.
– William Wright, 1888
This appears to conflict with the last point, but simply expands on the challenges of judging new information. Scepticism should be accompanied by a readiness to accept new information that appears plausible, has potential theoretical support or has worked in similar situations. Sometimes case reports and experiences reflect truth, although there might be contextual differences that challenge validity (demographics, climate, topography or specific details). Some successful strategies without an evidence base – up to 80% of what we do – are continuations of traditional practices that ‘just work’.31 Critical appraisal sits at the overlap of the science and art of medicine, supporting the importance of clinical reasoning in medical education.
Summary
Current medical practice combines what history tells us is appropriate with what scientific explanation can confirm. Everyday practice requires both sources of guidance. Medical history remains relevant by informing us of what might or might not work amidst complexity, as well as by explaining social history, promoting understanding of disease processes and improving explanations to patients. This might be important for professional identity formation and the provision of more holistic care, particularly when evidence-based guidance is absent. Curriculum time is hotly contested, and medical history is often squeezed out, but there is a case for maintaining some core curriculum exposure to raise awareness of key contributions. Medical history also makes a great hobby, particularly for those who like to travel.12,32