Celebrating 75 years of Anaesthesia: our past, present and future | Association of Anaesthetists

Celebrating 75 years of Anaesthesia: our past, present and future



In 1946 the Association launched our new quarterly publication, Anaesthesia, with a foreword from the then President of the Royal College of Surgeons of England. The accompanying editorial from Henry Featherstone recounted the many challenges of the time and set out why the inauguration of our own specialty journal was necessary. Its publication came 100 years after the first use of ether in the UK in 1846, and now, 75 years later, we celebrate our own anniversary.

‘Contemporary Classics’ is a limited monthly series of articles which we have commissioned to celebrate our 75th anniversary. Each instalment will feature an important paper from one decade of Anaesthesia, as selected by a member of the current editorial board. Our hope is to celebrate our history, but at the same time use our past to look forward to our future. We have selected a team of guest authors, some with an interest in the chosen paper or topic, and some with their own attachment to the history of the journal.

We will also feature monthly blog posts from the Heritage Centre, on topics related to these journal articles. The Heritage Centre will also run Heritage Lates, which will look back at some of the key developments within anaesthesia throughout the decades, since the journal's inception. You can find out more by following the Heritage Centre on Twitter.

The 2010s

Eight years and already a classic: marking the rise of ultrasound-guided fascial plane blocks for chest wall surgery

For this instalment in the ‘Contemporary Classics’ series, we celebrate the work of modern pioneers in regional anaesthesia. For regional anaesthesia enthusiasts, the 2010s represented a decade of rapid expansion and utilisation of ultrasound-guided techniques. One clinically important area of expansion was the discovery of fascial planes as novel regional anaesthesia targets. These newer fascial plane blocks represented the first true alternative to neuraxial blockade in the context of multimodal peri-operative pain management. For this month’s Contemporary Classic, we have chosen a descriptive study by Blanco et al. that narrates the process of discovering, defining and validating a new regional anaesthesia technique: the serratus plane block.

Read the full article in the journal

Blood and Blocks: Nerve Blocks in Anaesthesia

In August 1810, novelist and diarist Frances (Fanny) Burney feels a small, annoying pain in her right breast. She dismisses it as nothing to worry about but her friends and husband are concerned. Only after some badgering by her nearest and dearest does she agree to see a doctor, whose treatment regimen proves to be ineffective.


Frances Burney Credit: National Portrait Gallery

The pain gets worse and her husband, former military officer Alexandre D’Arblay, now insists on her seeing “the most celebrated surgeon of France,” Baron Antoine Dubois. After he examines Fanny, he and Alexandre talk for a long time in the next room, making Fanny feel quite anxious. When her husband finally returns, “his looks were shocking... his whole face display[ing] the bitterest woe,” Fanny writes later.


Fanny was diagnosed with breast cancer and “formally condemned to an operation.” The first general anaesthesia for surgery was given in 1846, thirty-six years after Fanny’s diagnosis. She therefore faced an operation during which she would be more or less fully conscious, with no pain relief whatsoever available. Unsurprisingly, for this she felt “no courage,” and she only consented to an operation after the pain in her breast became very severe and all other treatments, of which she tried many, had proved unsuccessful.

The operation took place on 30 September 1811 in the family’s salon, performed by seven intimidating men dressed in black. A year after the procedure, Fanny found the strength to write an extraordinary letter to her sister and friends at home, describing her ordeal in great detail:

“When the dreadful steel was plunged into the breast – cutting through veins – arteries – flesh – nerves – I needed no injunctions not to restrain my cries. I began a scream that lasted unintermittingly during the whole time of the incision – & I almost marvel that it rings not in my Ears still! So excruciating was the agony.”


Illustration by Claude Bernard (1813–1878), French physiologist. Credit: Wellcome Collection

Such a diagnosis would be devastating in any era, and it is difficult to imagine anyone going through physical torment of this kind. Breast cancer remains the most common cancer in women, and is still the most common cancer in the UK overall. Thousands of people, not exclusively breast cancer patients, undergo surgery in the region of the breast and armpit every year. In many cases these procedures cause significant acute pain, and sometimes even chronic pain states. These patients, as well as those undergoing any procedure involving the chest wall, benefit from thoracic nerve blockade to reduce postoperative pain.


Breast operation and instruments Credit: Wellcome Collection

In the last decade’s most cited article in Anaesthesia, R. Blanco, T. Parras, J. G. McDonnell and A. Prats-Galino outlined a new technique they named the serratus plane block. They described it as a “safe and easily performed regional anaesthetic block” which provides complete pain relief to the lateral (side) part of the thorax (chest) and potentially causes fewer side effects. To achieve this, the researchers used a portable ultrasound device to identify the precise position of the nerves they wanted to target.

Ultrasound was introduced into regional anaesthesia fairly recently, although ‘ultrasonics’ had been used in medicine since the early 1940s. The introduction of Doppler ultrasound in 1978 helped anaesthetists locate the relevant anatomy and guide needle placement, and from 1989 the spread of local anaesthetic could be visualised. This increased many anaesthetists’ confidence in performing regional blocks; previously, they had either had to possess incredibly detailed anatomical knowledge or rely on colleagues who specialised in regional anaesthesia. Now, with the technique’s advantages recognised and ultrasound machines readily available, regional anaesthesia has become a core skill for every anaesthetist.

Local and regional anaesthesia have come a long way from their earliest methods, which included freezing and nerve compression; reports of the numbing effect of both go back to antiquity. Fun fact: one of the doctors operating on Fanny Burney (and her favourite), Baron Dominique Jean Larrey, used the freezing method for amputations on the battlefields of the Napoleonic wars.

The advent of anaesthesia meant that enduring immense physical pain was perhaps one less thing to worry about for patients facing an operation, also allowing surgeons more time for their handiwork, on which they were now able to focus properly, without distracting screams.

The first effective local anaesthetic was Benjamin Ward Richardson’s ether spray, which, however, was of limited use. Only the (re)discovery of cocaine’s anaesthetic properties in 1884 made more complex procedures viable. Thanks also to the development of hypodermic syringes and needles in the 1850s, local anaesthetic could now be injected directly around a nerve. After much self-experimentation, the first spinal blocks were performed around the turn of the century.


Illustration of tourniquet


Richardson's ether spray Credit: Wellcome Collection


Richardson's ether spray equipment Credit: Wellcome Collection

Despite fluctuating in popularity over the decades, regional anaesthetics are used for a wide range of procedures today, including mastectomies. As Blanco et al. have shown in their research, patients who undergo surgery using blocks such as the serratus plane block experience less postoperative pain than those under general anaesthesia alone. Generally, patients will have a quicker recovery too, which is the hope of every practitioner.

Doctors have striven to make procedures both more effective and safer for those whose lives can be dramatically improved or even saved through an operation. Anaesthesia has hugely contributed to safer, more successful operations, and anaesthetists have continued to develop equipment, drugs and techniques to make procedures, as well as recovery, more comfortable for patients.


Sonography guided femoral block

Fanny Burney’s operation lasted only twenty minutes, but to Fanny, suffering unimaginable pain, it must have felt like a lifetime. Yet “all ended happily” for her; not only did she survive the procedure, she lived on for another twenty-nine years until she died in 1840, at the ripe age of eighty-seven.

If you would like to know more about the history of local and regional anaesthesia, view the Heritage Centre’s past exhibition Comfortably Numb on this topic. 

The 2000s

The 2004 Difficult Airway Society guidelines for the management of difficult tracheal intubation: revolutionary and enduring

It could not be timelier to reflect on the Difficult Airway Society (DAS) 2004 difficult airway guidelines – or, as they were known back then, simply ‘The DAS Guidelines’. If there is one thing that has defined ‘what anaesthetists do’, it is managing the airway safely. The number and breadth of specialties and non-medical staff managing the airway has undoubtedly increased in the subsequent 17 years, but safe airway management in a crisis has gained renewed prominence in the last year with the emergence of the COVID-19 pandemic and the focus on tracheal intubation of the critically ill in a manner that is safe both for the patient and for staff.

Read the full article in the journal

Controlling the Aspera Arteria: A Short History of Airway Management - a blog post from the Heritage Centre

On average, we take around 20,000 breaths every single day. Breathing in through our nose or mouth, air flows through our throat and windpipe (trachea) into our lungs. It’s something most of us are lucky enough to do without thinking about it. Yet, when breathing does become really difficult or even impossible, the pathway through which air travels into the lungs – the airway – needs managing.


Our early ancestors were just as aware of this as we are today, as accounts from the Bronze Age evidence. Tracheotomies were performed in emergencies in ancient India, Egypt and Greece alike, suggesting that by 1000 BC they had become a routine procedure. During the Middle Ages, however, the potentially lifesaving method appears to have gone out of fashion. There are only a few mentions of the operation, one terming it a “scandal of surgery.” Interest in tracheotomies only started to pick up again during the Renaissance, when countless experiments on animals were carried out.


Tracheotomy from Casserius, De Vocis Auditusque Organis Credit: Wellcome Collection

In 1546, Antonio Brassavola went a step further and performed the first successful (documented!) tracheotomy in a patient with a laryngeal abscess, who made a full recovery. The medical community quickly rediscovered appreciation for the procedure; anatomist and surgeon Girolamo Fabrici d’Acquapendente was convinced “the opening of the aspera arteria,” the artery of air, was one, if not the, most important operation of all, as “man is recalled from a quick death to a sudden repossession of life,” able to “resume an existence which had been all but annihilated.”

Some seventy years later, in 1620, French surgeon Nicholas Habicot published a treatise describing cases of successful tracheotomies and their dramatic back stories, such as that of a fourteen-year-old boy who had swallowed a bag of gold coins to keep it safe from thieves, or that of a convicted criminal sentenced to be hanged, who hired a surgeon to perform a secret tracheotomy before his appointment with the executioner. They somehow concealed the tube that had been inserted, and he might have got away with it too, were it not for the unfortunate fact that the criminal’s neck broke.


Nicolas Habicot, Question chirurgicale, 1620

Tracheotomies had made a comeback, then. But what about endotracheal intubation and endotracheal anaesthesia? How were airways managed during anaesthesia? Endotracheal intubation was first used to resuscitate drowning victims in the 18th century. That, and rectal intubation, but that’s a whole different story... In the early days of anaesthesia in the mid-19th century, anaesthetics were given via a mask, and since patients’ cough and swallowing reflexes remained intact, little attention was paid to airway management.

After the ever-curious anaesthesia pioneer John Snow had managed to anaesthetise a rabbit by performing a tracheotomy on it, German surgeon Friedrich Trendelenburg administered the first endotracheal anaesthetic in a human using the same method in 1871. Only seven years later, Scottish surgeon William Macewen is believed to have performed the first orotracheal (through the mouth) intubation for chloroform anaesthesia – finally!


German surgeon Friedrich Trendelenburg


Trendelenburg cone



A physician is only as good as their equipment, right? It helps, at least. German surgeon Franz Kuhn was rather visionary at the turn of the twentieth century. Not only did he design metal tubes that could be inserted through the mouth even ‘blindly’ (without direct view of the larynx), he also described the use of a curved tube introducer, as well as being the first to publish a paper on nasal intubation, the tube for which, he argued, “lies better” and leaves the mouth free for any surgical procedures. Moreover, Kuhn appreciated the risk of spasm of the larynx and believed that ‘cocainization’ of it was helpful when intubating.


Paper by German surgeon Franz Kuhn


Kuhn endotracheal tube, 1901


Kuhn endotracheal tube


Endotracheal nasal tube set of 8

Enter Ivan Whiteside Magill. He was the (initially reluctant) anaesthetist at Queen’s Hospital in Sidcup, where facial injuries incurred in the First World War were treated, and where masks and traditional breathing tubes were simply impractical. Developing equipment and techniques for endotracheal intubation, including tubes, laryngoscopes and forceps, he and his colleague Stanley Rowbotham conquered the airway problem.


Ivan Magill (far right) and house officers


Ivan Magill


Tongue forceps modified by Magill

Indirect laryngoscopy was first described in 1855 by a singing teacher in London, who used sunlight and two mirrors to see his own vocal cords move. Direct laryngoscopy came along in 1895. Although fibreoptic intubation has become more common in recent decades, the curved Macintosh laryngoscope blade introduced in 1943 remains a much-beloved piece of equipment today.


Today, managing the airway and ensuring a patient can breathe safely is an anaesthetist’s bread and butter. With the knowledge and equipment available to them, as well as the rigorous training they go through, this has become much ‘easier’ over the decades and centuries. However, managing an airway is not always straightforward, for reasons such as an injury to or swelling of the airway, difficulty opening the patient’s mouth, or unexpected complications. In these very rare circumstances, anaesthetists can turn to the guidelines produced by the Difficult Airway Society (DAS). DAS, formed in 1995, is the UK’s largest anaesthetic subspecialty society and is committed to improving the standard of airway management as well as advancing public understanding of it.

The history of airway management is long and fascinating. Building on ancient knowledge, tracheotomy and tracheal intubation procedures have been refined in the last few centuries, resulting in the prevention of countless deaths. There is no doubt that further advances in techniques and equipment will be made to render airway management easier, ensuring that even if we need a little help with any of those 20,000 breaths we take every day, we’re as safe as we can be.

The 1990s

Manual in-line stabilisation during tracheal intubation: effective protection or harmful dogma?

This month’s ‘Contemporary Classic’ focuses on a landmark paper from the 1990s. The study by Nolan and Wilson examined tracheal intubation in patients with a suspected cervical spine injury, a topic which remains a clinical concern for anaesthetists today. Cervical spine injury occurs in 3–4% of trauma patients, with around 25% of these patients having a cervical spinal cord injury. Cervical cord injury is more common in patients who have a reduced Glasgow Coma Scale (GCS) score, and around 25% will have injuries to other body systems. As a result, these patients are likely to require airway intervention, either to protect and secure their airway or to facilitate treatment or investigation of their injuries. Clinicians are often worried that tracheal intubation may worsen an existing neurological deficit or cause a new spinal cord injury, despite there being little published evidence of an association between tracheal intubation and either primary or secondary cervical cord damage.

Read the full article in the journal

A bloodless field for surgery: Induced hypotension - a blog post from the Heritage Centre

Induced or controlled hypotension (low blood pressure), which was introduced into anaesthetic practice in the late 1940s, required a new way of thinking for the anaesthetist. Before then, an anaesthetist was trained to maintain blood pressure at the ‘normal’ level. The idea of reducing blood pressure to a point where it was not recordable in the usual way with a blood pressure cuff seemed nonsensical. Add to that the fact that this technique was being used on patients with high blood pressure, and therefore at higher risk of coronary heart disease and stroke, and the practice seemed reckless.

However, induced hypotension extended the range of surgeries and was utilised in major operations where there was significant blood loss, or where there was a risk of rupturing a blood vessel (e.g. aorta operations). It also provided a greater understanding of body systems, which led to better ways of controlling the body to improve patient safety.

Blood pressure monitoring was one of the earliest forms of monitoring used by anaesthetists; it was first used during anaesthesia in 1901 by American surgeon Harvey Cushing, although Joseph Clover had been the first exponent of pulse measurement in the late nineteenth century.


Clover with his chloroform apparatus, 1862

Dr Harvey Cushing, portrait by Edmund Tarbell, 1908. Credit: Wikimedia Commons

Blood pressure monitoring techniques can be traced back to the eighteenth century: the first quantitative measurement of blood pressure was made in 1733, and the first machine for measuring and recording it came in 1860. Advances on these early techniques and equipment have been made ever since.

The first person to quantitatively measure blood pressure was Stephen Hales. He was a physiologist, clergyman and one of the most famous British scientists of his day. His investigations at Corpus Christi College, Cambridge into arterial systems in animals were published in Statical Essays: Containing Haemastaticks in 1733. In these investigations he inserted a manometer (a glass tube filled with mercury) into the artery of a horse and measured the height the blood rose; thus the first measurements of blood and pulse pressure were made.


Hales' Horse. Image credit: Wellcome Collection

The next advancements were made by French physician and physiologist Etienne-Jules Marey whose 1860 ‘Marey Sphygmograph’ was the first instrument of its kind to be widely used in clinical medicine. It produced a paper record of blood pressure. The sphygmograph was held on the patient's forearm with a cloth band and an ivory plate was positioned on the patient's radial artery. The patient’s pulse created a motion which was transmitted through a stylus dipped in ink and onto a piece of paper, thereby creating a line which represented the motion.


Illustration of Marey Sphygmograph. Credit: Wellcome Collection

The first sphygmomanometer was invented in 1881 by Austrian physician Karl Samuel Ritter von Basch. The manometer included a small balloon and a dial. The balloon was placed over the artery in the wrist and compressed until the pulse disappeared. The pressure needed to do this was then read on the dial.


Basch's sphygmomanometer. Credit: Wellcome Collection

In 1896 Italian paediatrician Scipione Riva-Rocci (1863–1937) introduced his mercury sphygmomanometer, a piece of equipment that would look familiar to people today. The sphygmomanometer used an inflatable rubber cuff around the circumference of the patient’s arm to obstruct the blood flow of the brachial artery; the pressure at which flow was obstructed is known as the systolic pressure, the force at which the heart pumps blood around the body. This technique compressed the brachial artery equally from all sides, an improvement on von Basch’s technique, which took the measurement from one place.


Riva-Rocci's mercury sphygmomanometer. Credit: Wood Library-Museum of Anesthesiology

Riva-Rocci’s method however did not measure the diastolic pressure (the resistance to the blood flow in the blood vessels). In 1897, the Hill and Barnard sphygmomanometer, named after its designers Dr Leonard Hill, a British physiologist, and Harold Barnard, a British surgeon, included a needle pressure gauge which was sensitive enough to record the diastolic pressure.


The Hill and Barnard sphygmomanometer. Credit: Science Museum Group

In 1901, Heinrich von Recklinghausen increased the width of the cuff used with Riva-Rocci’s equipment from 5 cm to 12 cm. This improved the accuracy of readings by reducing localised areas of high-pressure build-up.

In 1905, Russian surgeon Nikolai Korotkoff described the sounds heard through a stethoscope placed over the brachial artery during slow deflation of a Riva-Rocci cuff. This became the most widely used method of blood pressure measurement.

Despite these early developments, blood pressure was not measured during anaesthesia until 1901, and measuring blood pressure as a health check only became common practice from the 1920s.

After the introduction of cyclopropane in 1934 and of curare for anaesthesia in 1942, which made deep general anaesthesia unnecessary, excessive bleeding became increasingly problematic in surgery. Significant blood loss was a great risk to the patient and could impede a surgeon’s work. This is where planned hypotension provided a solution. The technique was seen as a means of controlling bleeding where it might otherwise be severe and difficult to control[1], or where even a small amount might adversely affect precision of exposure and the ultimate result, for example in thoracolumbar sympathectomy and certain plastic procedures.

In 1948 Harold Griffiths and John Gillies trialled a technique whereby they deliberately reduced a patient’s blood pressure to a point where it was unrecordable by the normal means of a cuff. The aim of reducing the patient’s blood pressure to a level significantly below normal was to provide a near-bloodless field of operation, which it did successfully.

Through induced hypotension, it was found that a patient could tolerate an arterial pressure of around 60 mmHg for several hours without damage to the brain or other organs, whereas a similar period of hypotension due to blood loss would often prove fatal. Considering also that hypertension was the second-most common factor associated with postoperative morbidity in the 1950s[2], the benefits to the patient are apparent.

For the surgeon, the advantages of hypotension are clear; their work is made easier and can be done more quickly. These and other benefits, particularly the ability to maintain a constant blood volume, also benefit the patient as long as the hypotensive technique is competently controlled.

After the experiment, John Gillies noted that the technique of induced hypotension represented ‘physiological trespass’, as the normal safety margins were eroded. For example, decreased blood pressure will cause a decrease in blood flow if an artery is blocked by disease, meaning patients with arterial disease who were subject to this technique might suffer a stroke or coronary thrombosis.

By 1953, it had become clear that the technique of induced hypotension had resulted in some deaths, and it came to be used more cautiously. It was reserved for operations likely to involve significant blood loss (e.g. brain aneurysms), or where a large amount of tissue had to be removed (e.g. in cancer surgery).

Controlled hypotension not only minimised blood loss for major surgeries, it provided a greater understanding of circulation. Through observing the effects of dilating the blood vessels by various means, such as changing the venous return to the heart by posture, increased lung pressure, or by modifying blood volume, new knowledge of the behaviour of the circulation was gained by anaesthetists which improved anaesthetic practice.

This greater understanding of the body’s systems led to further improvements in the way the body could be controlled to improve patient safety. For example, in the early 1950s, induced hypothermia (decreasing the patient’s body temperature from 37°C to 30°C) was found to halve the brain’s oxygen consumption, doubling the amount of time the body could be without blood flow and improving cardiac and neurological surgery.

Drugs which control hypertension were introduced during the 1940s, and pharmacology still plays a central role in managing a patient’s blood pressure. Vasodilating drugs are used when a patient develops significant hypertension during surgery or after cardiac operations. Peri-operatively, patients also benefit from antihypertensive drugs to reduce the risk of heart attack.


  1. ‘Physiological Trespass in Anaesthesia’, Proceedings of the Royal Society of Medicine, Vol. 45, 7 September 1951
  2. Measurement of Adult Blood Pressure

The 1980s

Making the grade: has Cormack and Lehane grading stood the test of time?

Few anaesthetists will not have heard of Cormack-Lehane grading of laryngeal view and use it or its modifications in clinical practice. Ronnie Cormack and John Lehane were anaesthetists at Northwick Park Hospital, Harrow, UK. In 1984, they published a landmark paper describing airway management in obstetric practice but with general application. They described four grades of laryngeal exposure during direct laryngoscopy, which have subsequently formed the basis of a classification of difficult tracheal intubation. This timeless paper offers more than just a grading system. At its heart, it was an attempt to improve obstetric outcomes by suggesting manoeuvres based on the grade of laryngeal exposure. Furthermore, the authors highlighted the challenges of a grade-3 view, advocated simulating a grade-3 view when it is not naturally present for the purposes of training and signposted specific difficult tracheal intubation drills. Many of these concepts were not only novel but remain ongoing practices or debates today.

Read the full article in the journal

Drills and algorithms: Managing difficult tracheal intubation in obstetrics - a blog post from the Heritage Centre

On 19 January 1847, Sir James Young Simpson administered ether to relieve pain in labour and recorded the first use of anaesthesia for childbirth. This was just three months after William Morton’s recorded use of ether for surgery.

Born in Bathgate, West Lothian, Simpson entered Edinburgh University aged 14 to study classics, changing to medicine in 1827. He qualified at 18 and studied obstetrics, gaining the MD in 1832. He progressed rapidly in midwifery, was elected President of the Royal Medical Society in 1835 and became a recognised authority on diseases of the placenta. In December 1839, he was elected by one vote to the Edinburgh chair of midwifery and hundreds attended his lectures.

Simpson practised hypnotism but introduced ether inhalation to relieve labour pains in 1847. Dissatisfied with ether, which had a pungent smell and was flammable, Simpson began looking for an alternative that was more pleasant to inhale and provided a quicker induction.

The dinner party where the effects of chloroform were discovered by Simpson and his assistants, 4 November 1847.


Self-experimentation with chloroform, and observations of others using it recreationally, helped Simpson recognise its advantages for pain relief in labour and for surgery. On 5 November 1847, just one day after the infamous dinner party, Simpson used chloroform during labour. Chloroform became the agent of choice for obstetrics until the 1870s in England, and the early 1900s in Scotland. This was despite complications with its use: it was easy to administer an overdose, and it could produce irregular heartbeats and even heart attacks.

The rest of the nineteenth century saw attempts to alleviate pain in labour by means other than inhalation anaesthetics; these included morphine and morphine-like drugs.

From the 1930s, methods of analgesia which could be administered by midwives when a doctor was not present were developed; in the UK by this time, midwives delivered around 50% of normal births. Liverpool anaesthetist R. J. Minnitt developed the ‘Minnitt Machine’ in 1936, which provided a set mixture of nitrous oxide and air for labour pain relief. The mother could self-administer the mixture under the supervision of the midwife.

The machine was a step forward in providing labour pain relief, but malfunctions with the apparatus meant it was largely abandoned by the 1950s.


In the 1950s, trichloroethylene (Trilene) was administered via specifically designed vaporizers, and was approved for use by midwives in 1954.

The ‘Cyprane’ Trilene Vaporizer, c.1954.


Epidural anaesthesia has become common in childbirth since the 1960s.

Whilst these techniques, equipment and agents provided relief from labour pain, they also presented some problems for mother, baby or both. General anaesthesia for women in labour, however, presented more problems still, with increased difficulty of tracheal intubation as a result of the anatomical, physiological and hormonal changes to a woman’s body during pregnancy.

The paper ‘Difficult Tracheal Intubation in Obstetrics’, published in Anaesthesia in November 1984, was that decade's most cited article. It reviewed and discussed recent analysis which showed that in obstetrics the main difficulty during laryngoscopy was the inability to see the vocal cords. This was because, as the laryngoscope blade was lowered, the epiglottis descended and hid the cords, which meant intubation had to be done blind using the ‘Macintosh method’, named after Dr Robert Macintosh.

The other common difficulty cited was that beginners did not put the patient's head in the Magill ‘sniffing’ position. This recommended positioning of the head and neck before intubation was first published by Dr Ivan Magill in 1936. He described it as the position a person would take ‘when sniffing the morning air’.

The sniffing position

Passing an endotracheal tube before the introduction of curare into anaesthesia in the 1940s was very difficult. Curare was used as a muscle relaxant; it suppressed laryngeal reflexes and aided intubation. Dr Macintosh stated that, “The ability to pass an endotracheal tube under direct vision was the hallmark of the successful anaesthetist. Magill was outstanding in this respect.”

In 1943, Macintosh introduced the Macintosh Laryngoscope, which had a curved blade. Prior to this, blades were straight and were used to lift the epiglottis directly.

Laryngoscope developed by Dr Ivan Magill.

Much of the credit given to Macintosh’s laryngoscope was for the development of the curved blade. However, as he explained, the important aspect was that the tip did not pass the epiglottis and that the laryngoscope was properly used. In his own words, “The important point being that the tip finishes up proximal to the epiglottis. The curve, although convenient when intubating with naturally curved tubes, is not of primary importance.”

The flange design feature, which ran along the left lower edge of Macintosh’s blade, improved the ease of intubation. It moved the tongue to the side, which improved the view of the larynx and made more room for an endotracheal tube.

Macintosh Laryngoscope c.1943

Whilst improvements in equipment aided intubation, the Difficult Tracheal Intubation in Obstetrics article shows that difficulties were still faced. The article cited the use of drills as an integral part of an anaesthetist's training to reduce maternal mortality. The three main drills concerned difficult intubation, failed intubation and the correct use of cricoid pressure, all of which, the authors noted, “…need as much attention as the cardiac arrest drill, but whether they always get it seems doubtful at present.”

The failed intubation drill quoted was devised by Dr Michael Tunstall, who suggested it for caesarean section in 1976. The drill was the first of its kind and a significant step in the role anaesthesia played in reducing maternal mortality.

Listen to the intubation drill

Tunstall's drill became part of an anaesthetist’s training, with every final Fellow of the Faculty of Anaesthetists of the Royal College of Surgeons (FFARCS) examination candidate expected to repeat the drill word for word.

Tunstall’s first contribution to obstetric anaesthesia actually came whilst he was a trainee, with the development of Entonox. Nitrous oxide and oxygen were stored as liquid and gas respectively, and came in four cylinders (two plus two spares). In the late 1950s, Tunstall developed the idea of pre-mixing the oxygen and nitrous oxide in one cylinder, which would also deliver a constant mixture of the agents. So it was that Entonox was developed and introduced for labour analgesia by the British Oxygen Company (BOC). Dr Tunstall first used it clinically at the Royal Hospital, Portsmouth, in 1961.

Listen to the Entonox cylinders

Another benefit of the Entonox cylinders was that they could be easily transported, and small ones were produced for home deliveries. The apparatus comprised demand valves, a pressure-reducing valve, a cylinder contents gauge, and a non-interchangeable pin-index system to prevent the administration of the wrong gas.

The contents could separate if left standing for a long time, a process accelerated by low temperatures; gentle rocking of the cylinder in a warm room could solve the problem.

Compact and portable apparatus with Entonox, (premixed 50% nitrous oxide and 50% oxygen) cylinder. The mask is applied tightly to the face. A pressure gauge indicates the quantity of gas mixture in the cylinder.

Concerned about the high level of patient awareness with recall (AWR) associated with general anaesthesia for caesarean section, Tunstall developed the ‘isolated forearm’ technique in 1977. The technique consisted of inflating a padded cuff/tourniquet around the right upper arm before giving any neuromuscular blocking drug; the cuff was deflated shortly after delivery, when anaesthesia was deepened. Tunstall also researched methoxyflurane as an inhalational analgesic, and helped colleagues develop one of the first neonatal intensive care units. A truly eminent obstetric anaesthetist, he was President of the Obstetric Anaesthetists’ Association (OAA) (1987-1990) and received their Gold Medal in 1990.

Moving forward several decades, in 2012 the OAA and Difficult Airway Society (DAS) committees established a working group to develop UK national guidelines on the management of the difficult airway in obstetrics. This came at a time when both the number of obstetric general anaesthetics and anaesthetists’ experience with them were declining.

Following extensive work, the Guidelines for the management of difficult and failed tracheal intubation in obstetrics were published in Anaesthesia in 2015. The guidelines contain four algorithms and two information tables, and were produced to improve consistency of clinical practice, reduce adverse events and provide a structure for teaching and training on failed tracheal intubation in obstetrics.

The 1970s

Assessing pain: how and why?

Michael Rosen's 1976 paper on the reliability of the linear analogue for evaluating pain – now more commonly called a visual analogue scale (VAS) – starts with the words: “pain is difficult to measure”. This is as true now as it was 45 years ago and is likely to remain true for some years to come. The title might be taken to imply that the study on which the paper is based assessed the validity of the VAS as a method for objectively assessing pain, but this is far from its purpose. Rather, the publication of this paper showed the VAS to be a reliable and repeatable assessment tool of a patient's self‐report of their pain and, in doing so, gave academics a way of measuring and comparing the effects of analgesic regimens on acute pain and clinicians the ability to assess and track the efficacy of pain management.

Read the full article in the journal

Fatal Attraction: A Brief History of Morphine - a blog post from the Heritage Centre

Few of us could imagine living in a world without easy access to instant pain relief, whether to soothe a headache or ease the pain of a sore toe. We expect to feel no pain during an operation, and afterwards we are supplied with pain medication as a matter of course. Today, safe medication can help us manage acute as well as chronic pain and thus lead a life that is not, temporarily or constantly, disturbed by the experience of pain.

The desire to live free of pain is universal among all living creatures, and humans have tried to find ways to alleviate pain since the dawn of man. Although many plants have been used for their analgesic properties in the course of history, sleep-inducing poppies (Papaver somniferum) were domesticated and cultivated as early as 5000 BCE, likely making opium the first drug humans discovered. The ‘plant of joy’, as the opium poppy was known in Sumerian, had already been used medically for millennia before the alkaloid morphine was isolated from raw opium in 1804. There is plenty of evidence that physicians in ancient Egypt and Rome regularly used it to relieve pain as well as calm the nerves. Emperor and philosopher Marcus Aurelius himself relied on an opium-based electuary compounded with honey to sleep most nights. (Perhaps that was the reason for his stoicism in the first place?)

By the time apothecary’s assistant Friedrich Wilhelm Sertürner set out to isolate opium’s sleep-inducing factor in 1804, opium had seemingly been tried and tested, although habitual ‘opium eaters’ had become more common. Sertürner accomplished his goal working in his spare time, and eventually named the substance ‘morphium’ after the Greek god of dreams and sleep, Morpheus. Morphine, hailed as ten times more potent than opium, was first produced commercially in 1821, in a parlour behind a retail pharmacist’s shop in Farringdon Street. Fifteen years later morphine officially entered the London Pharmacopoeia.

Although morphine was quickly accepted, it was the development of the hypodermic syringe that made it one of the most popular drugs of the nineteenth century. Invented simultaneously but independently by physician Alexander Wood in Britain and surgeon Charles Pravaz in France in 1853, the hypodermic syringe allowed morphine to be administered by injection rather than orally. Wood reasoned that if the drug was injected rather than eaten, it would not create an appetite like other food and drink – a behaviour he had already observed in patients treated with opiates.

Opium poppies, Wellcome Collection

Physicians everywhere rejoiced. From the 1860s onwards, morphine was given to anyone for anything; it appeared that the ailment that could not be remedied by hypodermic morphine did not exist. It reliably relieved the pain of those suffering from temporary or chronic illnesses, and calmed those in need of sleep. In 1868, Francis Anstie, founding editor of the periodical The Practitioner, confidently declared that “of danger there is absolutely none.”

However, critical voices emerged even in the initial wave of enthusiasm. Already in the 1860s, Felix von Niemeyer, though conceding that “the introduction of hypodermic injections was a great event, and...an immense advance in treatment [for neuralgia],” warned that he clearly observed it creating addiction in his patients, who began “to feel an absolute need of the injections.” The following decade saw more and more practitioners come forward with their own observations. Physician Clifford Allbutt, who had effusively and publicly praised hypodermic morphine only a few months before, now reported its injurious effects on his patients: “They seem as far from cure as they ever were, they all find relief in the incessant use of the syringe, and they all declare that without the syringe life would be insupportable.”

It became evident that hypodermic morphine was not the magical panacea the medical world had hoped and believed it to be. Yes, it did bring peace and comfort to patients, but at a cost. The ‘disease’ of morphine addiction that was threatening society was spreading. For the first time, the question of culpability arose. Who was to blame?

In his important book Morbid Craving for Morphia (Die Morphiumsucht) of 1877, German physician Eduard Levinstein firmly stated that “the originators and propagators of this disease” were the medical men who freely prescribed hypodermic morphine for any old ailment, especially if they supplied patients with morphine and syringes, and left them to their own devices. However, doctors “must not be blamed for acting as they did,” Levinstein continued, “as it was done in the hope of affording relief to their patients, none of them thinking of the attendant danger.” This danger was only now being understood. It could, so he argued, happen to anyone – and it did; every echelon of society was affected. By 1888, the British Medical Journal (BMJ) even claimed that “the abuse of morphine has in many cases replaced the abuse of alcohol, especially in refined society.”

By this time too, morphine addiction and the figure of the morphine addict were featuring prominently in literature as well as visual culture. Indeed, there was an outburst of morphine-themed art, particularly in Paris, where artists were fascinated with portraying the female morphinée. These society women were said to routinely slip away during a theatre performance to indulge in their secret pastime using bejewelled syringes and morphine bottles. Women did not become addicted to morphine more easily than men; yet female addicts were often seen as particularly immoral, devoid of self-control, lying and even vicious, indulging in the pleasures morphine afforded. When Sir Lauder Brunton, however, recounted the case of a Member of Parliament who injected himself with 24 to 32 grains of morphine daily – often secretly when in session – he stressed that the MP only started this habit because his daughter was seriously ill, a reason which could be considered much more noble than seeking pleasure.

What could be done to combat this ‘morally dangerous disease’? For those already battling with addiction, Levinstein’s advice was to quit suddenly, as he found that his patients had generally overcome the withdrawal symptoms in two or three days. Writing a decade later, Oscar Jennings recommended the opposite approach of gradually reducing the dose, eventually replacing morphine with other medicines such as sparteine and trinitrine. Both claimed success in rehabilitating patients. For a brief time it was also believed that cocaine could be used to wean people off morphine.

But how to prevent addiction occurring in the first place? Anstie still believed morphine to be “one of the most valuable inventions of the country,” of the same significance as gaslight and the railway. As such it was too valuable to renounce completely; he simply recommended keeping the doses small. The BMJ emphasised that morphine should only be used by professional hands and advised keeping a record of the doses given. Moreover, morphine should only be given for fourteen days.

Liquid morphine, Science Museum

Nonetheless, morphine addiction remained prevalent. Even in the first decades of the 20th century, there were shops offering ‘buy one get one half-price’ injections in Germany. In Britain, too, real change only came with the Dangerous Drugs and Poisons (Amendment) Act of 1923. A year later, a Committee on Morphine and Heroin Addiction was set up. It concluded that, though morphine addiction was still more common than heroin addiction, both had become rare in Britain, which was attributed to the difficulty of obtaining drugs without medical prescriptions.

Morphine was both a blessing and a curse. It undeniably alleviated the physical and mental pain of many, but possibly caused as much pain and heartache in the process. Pain management has come a long way as medical professionals have striven to develop effective as well as safe pain medication.

The 1960s

Epidurals in the UK: practice and complications over 80 years

Epidurals are standard practice nowadays, but have not always been so; all innovations require enthusiasts and pioneers. It is always interesting to learn about their experiences and to observe how things have changed over time. In this month’s Contemporary Classics article, we examine ‘An analysis of the complications of extradural and caudal block’ by C.J. Massey Dawkins, published in Anaesthesia in October 1969.

Dawkins (1905-75) was a British anaesthetist who reportedly performed the first epidural in the UK in 1942 and went on to become a pioneer of this technique. A graduate of Cambridge University in 1929, he was a consultant anaesthetist at University College Hospital in London from 1941 to 1970, also working as a consultant anaesthetist at Hampstead General Hospital and Paddington Green Children’s Hospital (incorporated into the Royal Free Hospital and St Mary’s Hospital, respectively). Dawkins published several papers on epidural block, his 1969 review of the technique’s complications being the most comprehensive.

Read the full article in the journal

“I decided to perform some investigations on my own body”: Cocaine and Self-experimentation - a blog post from the Heritage Centre

On 15 September 1884, Dr Joseph Brettauer made his way to the podium at the German Ophthalmological Society meeting in Heidelberg. He delivered the paper to a stunned audience, who quickly realised the importance of what they were witnessing, the sense of excitement and opportunity palpable in the auditorium. Its author, Dr Carl Koller, Brettauer’s colleague, was a young doctor who could not afford the journey from Vienna to Heidelberg to present his findings himself. Despite his absence from the congress, Koller’s name would soon be known internationally as news of his discovery spread with incredible speed: cocaine could produce effective local anaesthesia.

Coca leaves

The quest for a reliable local anaesthetic agent had begun four decades before Koller published his findings, soon after ether and chloroform had been introduced into clinical practice. The numbing effect of chewing coca leaves had been widely reported, and after Albert Niemann had successfully isolated cocaine in 1860, its anaesthetic effects were even recorded by Friedrich Wöhler, whom Niemann assisted: “It tastes bitter and produces a peculiar effect on the nerves of the tongue, inasmuch as the point of contact becomes deadened and very nearly insensitive.” Twenty years later, Dr Basil von Anrep experimented with cocaine and explicitly recommended “that cocaine be tested as a local anaesthetic and in melancholia.”

Four years after von Anrep’s comment, aspiring ophthalmologist Koller found himself working in a research laboratory, searching for an alternative agent to ether, which was unsuitable for eye surgery. At this time, cocaine was used to treat a myriad of diseases and afflictions, including morphine addiction. Only after he had experimented with various other agents without success did Koller turn to cocaine.

Once satisfied with the outcomes of the experiments involving frogs and dogs, he and colleagues were happy to put the agent to test on themselves. A head of a pin was brought to the cornea, the conjunctiva of the bulb grasped with a toothed forceps, and the cornea pitted by pressure – all in the name of science!

Cocaine bottle, Science Museum

Self-experimentation had always been rife in medicine, and the field of anaesthesia was no exception; more than one anaesthetic agent was ‘discovered’ at a dinner party. Nitrous oxide, ether, and chloroform were all used recreationally to some extent before their anaesthetic properties were fully appreciated. James Young Simpson is said to have hit his knee on a dinner table after one such chloroform party, noticing the absence of pain. Shortly after, he started experimenting with the agent, which was eventually introduced into clinical practice in 1847. In more recent history, forty years ago Archie Brain tested his prototype of the laryngeal mask on himself – indeed, it became a unique party trick – before it got anywhere near patients.

However, as news of the efficacy of cocaine as a local anaesthetic spread, doctors appeared to be particularly adventurous. Many were inspired to self-experiment with the agent and described their experiences before the year was out. In November 1884, the American doctors Richard Hall and William Halsted reported that they had performed the first nerve block. Regularly posing as test subjects for each other, they managed to block almost every peripheral somatic nerve before the end of 1885. On one occasion, while exploring the effect of cocaine on dental nerves, Halsted thrust a pin through Hall’s lips and hit his teeth with the back of a knife, allegedly without causing any pain.

About a decade later, Dr August Bier, the German surgeon who performed the first spinal anaesthesia in 1898, had already used cocaine on six patients of varying ages with varying outcomes. Most of them had experienced headaches after their operations, some vomiting. “To reach a well-informed opinion,” Bier declared, “I decided to perform some investigations on my own body.”

Thus, one August evening, Bier ordered his assistant Dr Hildebrand to inject him with half a syringe of a 1% solution of cocaine. Poor Hildebrand botched the procedure and Bier retained full sensation in his legs. Hildebrand promptly volunteered to be the subject instead. What followed was a series of experiments that would probably not get past an ethics committee today.

Starting fairly innocently with needle pricks in his thigh, Hildebrand had a long needle pushed down to his femur just three minutes later. Immediately after that, Bier applied a burning cigar to Hildebrand’s leg. Their experiments ranged from the tickling of the sole of the foot to a strong blow to the shin with an iron hammer, with some avulsion of pubic hairs and traction to the testicles in between, all in the space of only forty minutes. After the anaesthetic had worn off, they proceeded to eat dinner, drink wine, and smoke cigars.

The discovery of a local anaesthetic constituted a milestone in the development of anaesthesia. Yet, as important as these experiments were, they often came at a price; both Hall and Halsted, for instance, became addicted to cocaine as a consequence of their self-experimentation and suffered from this addiction for the rest of their lives. We owe a lot to these pioneers, whose courage and desire for knowledge have improved our understanding of anaesthesia and ultimately made operations safer.

The 1950s

Deaths associated with anaesthesia – 65 years on

In 1949, the Council of the Association of Anaesthetists announced the launch of an “investigation of deaths associated with administration of anaesthetic”. Two articles describing specific complications associated with death were published during the investigation, but the complete report, 'Deaths associated with anaesthesia’ by Edwards et al., was published in Anaesthesia in 1956 and is this month’s ‘Contemporary Classics’ publication.

The conclusion states: "The way in which the data have been collected precludes the formulation of inferences with respect to the relative safety of the different agents and techniques, and the relative frequency of the various forms of fatality in the country as a whole. But much information of clinical interest has been produced. From this it appears that some of the cases are inexplicable. It is to be hoped that light may be thrown on such by more frequent and accurate reporting, and by ad‐hoc investigations. In the great majority of the reports, however, there were departures from accepted practice. This fact, and its implications, should receive the attention of those responsible for the teaching of anaesthesia."

Read the full article in the journal

Safe in Sleep - a blog post from the Heritage Centre

The most cited Anaesthesia journal article of the 1950s was Deaths Associated with Anaesthesia: A Report on 1,000 Cases, published in 1956 and authored by George Edwards, H.J.V. Morton, E.A. Pask and W.D. Wylie.

In 1949 the Council of the Association of Anaesthetists encouraged the voluntary reporting of deaths associated with anaesthesia. Questionnaires were distributed to all hospital groups, and the project was publicised in Anaesthesia. The authors named above were appointed as a committee to review the returned questionnaires, and their article is an analysis of the causes of the reported deaths, together with the authors’ observations.

Given the motto of the Association, in somno securitas (‘Safe in Sleep’), it is hardly surprising that the Association commissioned this study. The study developed into the Confidential Enquiry into Perioperative Deaths (CEPOD) in the 1980s; this became the National Confidential Enquiry into Perioperative Deaths (NCEPOD), which fulfils a similar role today.

Anaesthetic death inquiries began during the 1840s in the earliest days of anaesthesia. John Snow and James Young Simpson investigated reported deaths from anaesthesia, hoping to improve patient safety. From March 1847, coroners held inquests into every “anaesthetic death.” It was most likely that these deaths were caused by incorrect dosages, rather than the anaesthetic agent itself. The administration of anaesthetic by doctors who weren’t anaesthetic specialists was also a factor.

A continued high number of deaths attributed to chloroform led the Royal Medical and Chirurgical Society (now the Royal Society of Medicine) to appoint a committee in 1863 to investigate why this was and how it could be avoided in the future. Joseph Clover led the enquiry, which recommended that the pulse be monitored throughout an anaesthetic.

Clover with his chloroform apparatus 1862

The Royal Humane Society was an early advocate of ‘training’ to reduce the number of unnecessary deaths. Concerned by the number of people mistakenly thought to be dead (and sometimes buried alive!), doctors William Hawes (1736-1808) and Thomas Cogan (1736-1818) founded the Society in 1774. They wanted to promote the new technique of resuscitation, and even offered money to anyone who used it to try to save someone.

The first meeting of the Society was held with fifteen of their friends on 18 April 1774 at the Chapter Coffee House, St Paul’s Churchyard. Their hope was to restore, ‘a father to the fatherless, a husband to the widow and a living child to the bosom of its mournful parents’.

The Royal Humane Society (then called the Society for the Recovery of Persons Apparently Drowned) set five aims: 

  • to publish information on how to save people from drowning
  • to pay two guineas to anyone attempting a rescue in the Westminster area of London
  • to pay four guineas to anyone successfully bringing someone back to life 
  • to pay one guinea to anyone – often a pub-owner – allowing a body to be treated in his house 
  • to provide volunteer medical assistants with some basic life-saving equipment

Reports reveal that mouth-to-mouth resuscitation was practised as early as the fifteenth century, while Paracelsus was the first to experiment with bellows a century later. The bellows method was preferred by the Royal Humane Society. Its resuscitation sets contained equipment that could inject fresh air or stimulants such as tobacco into the lungs, stomach or rectum, and were placed by the River Thames in London for anyone to access. The Society has kindly lent two of their kits to the Anaesthesia Museum, and one is currently on display.

The effect of putting pressure on the chest to restart the heart was known in the nineteenth century, but although German surgeon Dr Friedrich Maass successfully used external compressions to restart the hearts of two patients in 1891, his method did not catch on at the time. Only when researchers rediscovered external compressions in 1933 did the technique become widely used.

Tobacco smoke enema

The first defibrillator was described in 1774 by Charles Kite. In that year, a 3-year-old child, Catherine Greenhill, fell from an upstairs window onto flagstones and, despite being declared dead, was successfully treated with electricity by an apothecary, Dr Squires. Again, it wasn’t until the 1930s that Dr W. Kouwenhoven developed internal and external defibrillation.

During the late 1950s dramatic changes took place in attitudes towards and technology for resuscitation. The new technique was to directly introduce air under pressure to ventilate the lungs. An early method was ‘mouth-to-mouth’, reintroduced by Dr Peter Safar, in which the resuscitator blew directly into the patient’s lungs, or via a tube or airway such as the Safar or the Brook. Until the 1960s, resuscitation was generally performed in hospitals. When Dr Bullough resuscitated Sylvia Berwick at the scene of a roadside accident in 1963, it was headline news.

The invention of the Ambu bag in 1956 proved to be a milestone in resuscitation equipment. Following a shortage of oxygen in Danish hospitals, Dr Henning Ruben developed the first artificial manual breathing unit, a self-inflating resuscitator and ventilation valve. It allowed the manual ventilation of the patient with room air or oxygen. Automatic gas-powered resuscitators followed, such as the Stephenson and the Pneu Pac. These work from a cylinder of pressurised oxygen with an adjustable proportion of air. If connected to an endotracheal tube they can ventilate automatically, leaving the resuscitator’s hands free. They are often carried by the emergency services.

The 1940s

The science of neuromuscular blockade, 75 years on

This month, we focus on the 1940s, and we have chosen the first original article published in the journal which looked at curarisation as compared with other methods of securing muscle relaxation in anaesthesia. Of course, we no longer refer to such agents as ‘muscle relaxants’, due to our detailed understanding of their mechanism of action at the neuromuscular junction, which is different from muscle relaxants used in musculoskeletal disorders that act on the spinal cord or brain. This point becomes important when thinking about some of the language used throughout the paper.

The author, Frank Barnett Mallinson (23/07/1905–14/06/1965), was born in Malta and was a full-time anaesthetist at Woolwich Memorial Hospital from 1938. Previously he worked as a general practitioner (as did many anaesthetists of his time), but his real interest was in patient safety, and he was also an early advocate for road safety. At the time, the balance between achieving optimal neuromuscular blockade to facilitate surgery and the need to avoid excessive depth of anaesthesia was a cause of much concern, and the use of Intocostrin (curare) was postulated as a new and exciting solution. His paper came 4 years after Harold Griffith and Enid Johnson reported the first case series of 25 patients who received Intocostrin in clinical practice, and at a time when the field of anaesthesia was undergoing enormous change at the end of the Second World War. There are several themes from his paper that are arguably as applicable to clinical practice now as they were then.

Read the full article in the journal

Explorer and botanist: The story of Richard C. Gill and curare - a blog post from the Heritage Centre

Richard C. Gill could be described as an adventurer, certainly a researcher, and an amateur ethnographer and botanist. Gill was born in America in 1901. Following in the family tradition, he studied medicine at university, but only for two years before deciding it wasn’t for him. He went on to read English at Cornell University and subsequently worked as a teacher and as a ranger in Yellowstone National Park. In 1928 he moved to South America to work for a rubber company, but left when the business was hit by the Great Depression in 1929. Gill and his wife Ruth loved the country so much that they decided to stay and establish a ranch to grow coffee and plants, and from which Richard could study the culture of the local peoples and explore the jungle. After eight months of searching, they bought land in Ecuador and built the Rio Negro ranch.

Whilst living on the ranch, Gill made many expeditions into the jungle, where he studied the people and built relationships with the local tribes and their leaders. His studies included their social practices, music, and use of botanicals for medicine. He became fascinated by the herbal remedy preparations of the shamans and watched them prepare arrow poison (curare). The tribes used curare-coated darts or arrows in blow pipes and bows to shoot and kill or stun animals for food and clothing. When injected into the bloodstream, curare acted as a muscle relaxant, paralysing and asphyxiating the prey. Mixing curare poison and creating weapons is a highly skilled process: different strengths of poison are needed depending on the size of the prey, and the potency of a mixture could only be judged by taste, as curare is not toxic through ingestion alone. Learning about this poison directly from those who prepared and used it was therefore very important.

Yahua blowgun, Amazon, Iquitos, Peru. Creative Commons Attribution, photographer JialiangGao

However, arrow poison was not a new discovery: it had been known to Europeans since Sir Walter Raleigh’s expeditions to Guyana in 1595. It was first brought back to England in the 1760s by Edward Bancroft (1741-1821), who had encountered the poison during his time in Guyana and wrote about it in An Essay on the Natural History of Guiana in South America.

In the nineteenth century, the naturalist Charles Waterton (1782-1865) brought curare samples, or ‘wourali’ as he called it, back to England and conducted experiments on animals. In 1814 at the Royal Veterinary College in London, along with Benjamin Collins Brodie, Waterton administered wourali to a donkey whilst ventilating it with bellows until the poison wore off.

After Waterton’s experiments, more scientific work was conducted by physicians of the nineteenth century. It was Claude Bernard’s (1813-78) experiments on frogs in 1844 which showed conclusively that curare was acting as a muscle relaxant. He noted that, “it is an anaesthetic agent only in appearance. The animal feels, but cannot show it”.

However, it wasn’t until Gill’s discoveries in the 1930s that the scientific research of curare really got underway.

In 1932, Gill fell from a horse and suffered neurological symptoms. These were assumed to be a result of the fall, but he was later diagnosed with multiple sclerosis. A severe attack in 1934 left Gill with near total paralysis and severe muscle spasms. This led him and his neurologist Walter Freeman to explore the idea of using curare to alleviate the spasms. Gill’s determination is shown by the physiotherapy care plan he devised for himself, which in two years saw him able to drive, and by 1938, he was able to walk with the aid of a walking stick. Throughout his rehabilitation he taught himself botany, spoke with pharmacists, botanists and doctors, and read relevant texts, all while making plans for an expedition to collect curare and plants which might relieve his symptoms.

In May 1938, funded by Sayre Merrill, Gill set off into the jungle in search of curare and other plant specimens with a crew of around 95 people, 36 mules, 12 canoes and two tons of equipment. The trip to the jungle base camp took three weeks. The camp was fully equipped with thatch-roofed buildings housing living and sleeping quarters, a kitchen, an office and a laboratory. Here Gill hoped, “to see curare being made, learning all I could about its making, and bringing back that knowledge for those who needed it.” After five months he had done just that. Gill and his crew returned with around 75 plants and 12 kg of curare.

Gill sent his plant specimens, including the curare plants he had collected, dried and pressed, to the botanist B.A. Krukoff at the New York Botanical Garden to be identified. They became part of the Steere Herbarium’s collection, where they remain today.

Gill manufacturing curare, c.1938. Arthur Guedel Anaesthesia Collection, University of California San Francisco

On returning to America, Gill supplied some of the curare to A.E. Bennett, a psychiatrist from Nebraska, and A.R. McIntyre, Chairman of the Department of Pharmacology at the University of Nebraska College of Medicine. McIntyre agreed to standardise it, and Bennett used it in early trials with patients suffering complications from shock therapy treatment. In 1939, E.R. Squibb and Sons bought Gill’s supply of curare and began researching its properties. Two of Squibb’s scientists reported that they had been able to extract an alkaloid similar to d-tubocurarine from the bark of Chondrodendron tomentosum and had developed a new method for preparing curare; the alkaloid was also found in the bark of Strychnos toxifera. This standardised curare extract was marketed by Squibb in 1939 as Intocostrin. It was first used to control spasms during electroconvulsive therapy (ECT), then in the treatment of tetanus, and finally, in 1942, in anaesthesia.

Curare entered anaesthetic practice when Harold Griffith and Enid Johnson used the standardised preparation, Intocostrin, successfully in 25 patients who were lightly anaesthetised with cyclopropane. In Britain, Cecil Gray found Intocostrin unreliable and instead popularised the use of d-tubocurarine chloride, which was more consistent in its potency. D-tubocurarine would become the muscle relaxant of choice until curare-like synthetic agents replaced natural curare from the 1980s onwards.

Before the advent of curare in the 1940s, anaesthetists had to administer very deep ether or cyclopropane anaesthesia to achieve muscle relaxation, which could cause a number of heart, liver or kidney complications. Additionally, because curare totally paralyses the patient’s diaphragm, its use in surgery was only possible thanks to the development of tracheal intubation and mechanical ventilation of the lungs.

Now, with muscle relaxants and mechanical ventilation, life-saving heart, brain and thoracic surgery can be performed.

You may also be interested in: