Drop Dead Gorgeous: 19th Century Beauty Tips for the Aspiring Consumptive

Picture the ideal nineteenth century English beauty: pale, almost translucent skin, rosy cheeks, crimson lips, white teeth, and sparkling eyes. She’s wasp-waisted and thin, with elegant collarbones. Perhaps she’s prone to fainting.

It shouldn’t be difficult to imagine; numerous depictions survive to this day, and the image is still held up as the gold standard for Caucasian women. At this point, it’s so embedded in the Western psyche as the standard of beauty that it doesn’t occur to us to question it. Of course that’s beautiful. Why wouldn’t it be?

By the nineteenth century, beauty standards in Britain had come a long way from the plucked hairlines of the late Middle Ages and the heavy ceruse of the Stuart period. Fashionable women wanted slimmer figures because physical fragility had become associated with intelligence and refinement. Flushed cheeks, bright eyes, and red lips had always been popular, particularly among sex workers (they suggested arousal), and women had been using cosmetics like belladonna, carmine, and Spanish leather for years to produce those effects when they didn’t occur organically.

Bright eyes, flushed cheeks, and red lips were also signs of tuberculosis.

Tuberculosis—known at the time as consumption, phthisis, hectic fever, and graveyard cough—was an epidemic that affected all classes and genders without prejudice. Today, an estimated 1.9 billion people are infected with it, and it causes about two million deaths each year. At the time, it was mainly associated with respectable women (although there are no few depictions of sex workers dying of it*) and thought to be triggered by mental exertion or too much dancing.** Attractive women were viewed as more susceptible to it because tuberculosis enhanced their best features. It was noted to cause pale skin, silky hair, weight loss, and a feverish tinge to the face (along with less desirable symptoms including weakness, coughing up blood, GI upset, and organ failure), and it was treated with little to no effect with bleeding, diet, red wine, and opium.

Although having an active (rather than latent) case of consumption was all but a death sentence, it didn’t inspire the revulsion of other less attractive diseases until the end of the 19th century when its causes were better understood.

In 1833, The London Medical and Surgical Journal described it in almost affectionate terms: “Consumption, neither effacing the lines of personal beauty, nor damaging the intellectual functions, tends to exalt the moral habits, and develop the amiable qualities of the patient.”


John Keats. Joseph Severn, 1819.

Of course it didn’t only affect women. The notion that it was caused by mental exertion—along with the high number of artists and intellectuals who lost their lives to it—also led to its association with poets. John Keats died of it at 25. His friend Percy Shelley—also infected—wrote tributes to Keats that attempted to explain consumption not as a disease, but as death by passion. Bizarrely, consumption has a symptom all its own: spes phthisica, a euphoric state that can produce intense bursts of creativity.*** Keats’ prolific final year of life has been attributed to his consumption, and spes phthisica was viewed by some as necessary for artistic genius.

As Alexandre Dumas (fils) wrote in 1852: “It was the fashion to suffer from the lungs; everybody was consumptive, poets especially; it was good form to spit blood after any emotion that was at all sensational, and to die before reaching the age of thirty.”

Because of its association with young women and poets, the disease itself came to represent beauty, romantic passion, and hypersexuality. As far as illnesses went, it was considered to be rather glamorous, and in a culture half in love with death, it inspired its fair share of tributes. There are numerous romantic depictions of young women wasting away on their deathbeds at the height of their beauty. Women with consumption were regularly praised for the ethereal loveliness that came from being exceptionally thin and nearly transparent.

Picture that ideal nineteenth century beauty again: that complexion is almost a pallor, and you can see her veins through it. Those lips, eyes, and cheeks are all indicative of a constant low-grade fever. Her teeth are so white they’re almost as translucent as her skin. And her figure? She’s emaciated due to the illness and the chronic diarrhea that comes with it. If she faints, it’s more to do with the lack of oxygen in her blood than the tension of her corset. The sicker she gets, the more beautiful she becomes, until she’s gone; the beauty is all the more poignant because of its impermanence. This beauty can’t last, and it’s as deadly as it is contagious.

Only a fool would wish for it, so what’s a healthy girl to do?

If you didn’t have consumption but wanted the look, there were two things you could do: wait (at its peak between 1780 and 1850, consumption is estimated to have caused a quarter of all deaths in Europe, so statistically you stood a fair chance of catching it), or fake it. Corsets could be made to narrow the waist and encourage a stooped posture, and necklines were designed to show off prominent collarbones. As for the rest, people could try:

Arsenic Complexion Wafers

Although arsenic was known to be toxic, it was used throughout the nineteenth century in everything from dye to medication. Eating small amounts of arsenic regularly was said to produce a clear, ghostly pale complexion. Lola Montez reported that some women in Bohemia frequently drank the water from arsenic springs to whiten their skin.

Stop Eating

In The Ugly-Girl Papers, S.D. Powers offers her own advice for achieving consumptive skin: “The fairest skins belong to people in the earliest stages of consumption, or those of a scrofulous nature. This miraculous clearness and brilliance is due to the constant purgation which wastes the consumptive, or to the issue which relieves the system of impurities by one outlet. We must secure purity of the blood by less exhaustive methods. The diet should be regulated according to the habit of the person. If stout, she should eat as little as will satisfy her appetite.”

How little? Writing in the third person, she uses herself as an example: “Breakfast was usually a small saucer of strawberries and one Graham cracker, and was not infrequently dispensed with altogether. Lunch was half an orange—for the burden of eating the other half was not to be thought of; and at six o’clock a handful of cherries formed a plentiful dinner. Once a week she did crave something like beef-steak or soup, and took it.”

Olive-Tar

For “fair and innocent” skin that mimics the effects of consumption, The Ugly-Girl Papers offers the following recipe: “Mix one spoonful of the best tar in a pint of pure olive oil or almond oil, by heating the two together in a tin cup set in boiling water. Stir till completely mixed and smooth, putting in more oil if the compound is too thick to run easily. Rub this on the face when going to bed, and lay patches of soft old cloth on the cheeks and forehead to keep the tar from rubbing off. The bed linen must be protected by old sheets folded and thrown over the pillows. The odor, when mixed with oil, is not strong enough to be unpleasant—some people fancy its suggestion of aromatic pine breath—and the black, unpleasant mask washes off easily with warm water and soap. The skin comes out, after several applications, soft, moist, and tinted like a baby’s. The French have long used turpentine to efface the marks of age, but olive-tar is pleasanter.”

White Lead


Madame X. John Singer Sargent, 1883-4

Lead had been used as the primary ingredient for ceruse and other forms of foundation and powder for centuries. It was known to cause skin problems over time (and, you know, lead poisoning). In the nineteenth century, it was still used for the same purpose and appeared in paints and skin enamels in Europe and the United States.

Lavender Powder

If the pallor of consumption didn’t occur naturally or with the aid of arsenic, it could be imitated with lavender-colored powder. Usually applied over ceruse or another foundation made from white lead, it gave the skin a bluish, porcelain shade. Perhaps the best-known example of this is John Singer Sargent’s Madame X. The model, Virginie Gautreau, was known to use lavender powder to create her dramatically pale complexion. She was said to be a master of drawing on fake veins with indigo, and she painted her ears with rouge to add to the illusion of translucence.

Rouge

Commonly sold and sometimes made at home, rouge was everywhere. Made from toxic bismuth or vermilion, or carmine from cochineal beetles, it was applied to cheeks, lips, ears, and sometimes even nostrils to make them appear transparent. It came in liquid, cream, and powder forms, and Napoleon’s Empress Josephine is said to have spent a fortune on it. The Ugly-Girl Papers offers this recipe for Milk of Roses, which sounds rather nice:

“(Mix) four ounces of oil of almonds, forty drops of oil of tartar, and half a pint of rose-water with carmine to the proper shade. This is very soothing to the skin. Different tinges may be given to the rouge by adding a few flakes of indigo for the deep black-rose crimson, or mixing a little pale yellow with less carmine for the soft Greuze tints.”

Ammonia

The Ugly-Girl Papers recommends ammonia for use as both a hair rinse and, worryingly, a depilatory. For healthy hair, Powers recommends scrubbing it nightly with a brush in a basin of water with three tablespoons of ammonia added. Hair should then be combed and left to air dry without a night cap.

Lemon Juice and Eyeliner

To achieve the ideal feverish “sparkling eyes,” some women still used belladonna (which could cause blindness) while others resorted to putting lemon juice or other irritants in their eyes to make them water. Eyes, eyelashes, and eyebrows could also be defined. Powers advises: “All preparations for darkening the eyebrows, eyelashes, etc., must be put on with a small hair-pencil. The “dirty-finger” effect is not good. A fine line of black round the rim of the eyelid, when properly done, should not be detected, and its effect in softening and enlarging the eyes is well known by all amateur players.”

Jessica Cale

 

 

Sources

Day, Carolyn. Consumptive Chic: A History of Beauty, Fashion, and Disease. (2017)

Dumas, Alexandre (fils). La Dame Aux Camélias. (1852)

Klebs, Arnold. Tuberculosis: A Treatise by American Authors on its Etiology, Pathology, Frequency, Semeiology, Diagnosis, Prognosis, Prevention, and Treatment. (1909)

Meier, Allison. How Tuberculosis Symptoms Became Ideals of Beauty in the 19th Century. Hyperallergic. January 2nd, 2018.

Montez, Lola. The Arts of Beauty: or Secrets of a Lady’s Toilet. (1858)

Morens, David M. At the Deathbed of Consumptive Art. Emerging Infectious Diseases, Volume 8, Number 11. November 2002.

Mullin, Emily. How Tuberculosis Shaped Victorian Fashion. Smithsonian.com, May 10th, 2016.

Pointer, Sally. The Artifice of Beauty: A History and Practical Guide to Perfumes and Cosmetics. (2005)

Powers, S. D. The Ugly-Girl Papers, or Hints for the Toilet. (1874)

Zarrelli, Natalie. The Poisonous Beauty Advice Columns of Victorian England. Atlas Obscura, December 17th, 2015.

Notes

*Depictions of sex workers dying of tuberculosis: La Traviata, Les Misérables, La Bohème, and now Moulin Rouge, etc. In the 19th century, consumption was portrayed as a kind of romantic redemption for sex workers through the physical sacrifice of the body.

**Although dancing itself wouldn’t have done it, the disease was so contagious that it could be contracted anywhere people would be at close quarters—dancing at balls with multiple partners could have reasonably been high-risk behavior.

***You know what else does that? Tertiary syphilis. How do you know which one you have? If you’re coughing blood, it’s consumption. If your skin is falling off, it’s syphilis. Either way, you’re going to want to call a doctor.


Pervitin, The People’s Drug: How Methamphetamine Fueled the Third Reich


Meth didn’t come out of nowhere. Like cocaine, heroin, and morphine, it has its origins in 19th century Germany. When Romanian chemist Lazăr Edeleanu first synthesized amphetamine in 1887, he couldn’t have known that his creation would evolve into a substance that would one day help to fuel a world war. Nagai Nagayoshi took it a step closer when he synthesized methamphetamine in 1893. It was transformed into the crystalline form we know today by Japanese pharmacologist Akira Ogata in 1919, at which point it found its way back to Germany, where the conditions were just right for another pharmacological breakthrough.

Drugs and the Weimar Republic

Drugs were not unknown to Berlin. Okay, that’s an understatement. Weimar Berlin was soaked in them. Not only were drugs like morphine, heroin, and cocaine legal, but they could be purchased on every street corner and were all but issued to those attending the legendary nightclubs, where any kink or perversion up to and including BDSM, public orgies, and voyeurism happened on the regular.(1)


Anita Berber by F. W. Koebner

Dancer Anita Berber, the It Girl of Weimar Berlin, was known to go about her business wearing nothing but a sable coat and an antique brooch stuffed with cocaine (pictured). She was such an exhibitionist, the local sex workers complained that they couldn’t keep up with the amount of skin she was showing. Of all the idiosyncratic breakfasts of history, Berber’s still stands out: she was said to start every day with a bowl of ether and chloroform she would stir with the petals of a white rose before sucking them dry.

She wasn’t the only one. Having lost its access to natural stimulants like tea and coffee along with its overseas colonies in the Treaty of Versailles, Germany was in need of synthetic assistance. Norman Ohler explains:

“The war had inflicted deep wounds and caused the nation both physical and psychic pain. In the 1920s drugs became more and more important for the despondent population between the Baltic Sea and the Alps. The desire for sedation led to self-education and there soon emerged no shortage of know-how for the production of a remedy.”

Poster for an anti-drug film, 1927

Produce they did. Eighty percent of the global cocaine market was controlled by German pharmaceutical companies, and Merck’s was said to be the best in the world. Hamburg was the largest marketplace in Europe for cocaine with thousands of pounds of it passing through its port legally every year. The country of Peru sold its entire annual yield of raw cocaine to German companies. Heroin, opium, and morphine were also produced in staggering quantities, with ninety-eight percent of German heroin being exported to markets abroad.

How were drugs able to flourish to such an extent? For one thing, they were legal. Many veterans of the First World War were habitually prescribed morphine by doctors who were addicted to it themselves. It wasn’t viewed as a harmful drug but as a necessary medical treatment for chronic pain and shell shock. Further, the line between drug use and addiction was uncertain. In spite of countless people regularly indulging in everything from cocaine to heroin for medical as well as recreational purposes, few were considered to be addicts. Drug use was not a crime, and addiction was seen as a curable disease to be tolerated.

As historian Jonathan Lewy explains:

“Addicts stemmed from a higher class in society. Physicians were the most susceptible professional group to drug addiction. Instead of antagonizing this group, the regime tried to include physicians and pharmacists in their program to control drugs. In addition, German authorities agreed that the war produced addiction; in other words, the prized veterans of the First World War were susceptible, and none of the political parties in the Weimar Republic, least of all the National Socialist Party, wished to antagonize this group of men.”

Pervitin, The Miracle Pill

On Halloween 1937, Pervitin was patented by Temmler, a pharmaceutical company based in Berlin. When it hit the market in 1938, Temmler sent three-milligram samples to every doctor in the city. Many doctors got hooked on it and, convinced of its efficacy, prescribed it as a study aid, an appetite suppressant, and a treatment for depression.

Temmler based its ad campaign on Coca-Cola’s, and the drug quickly became popular across the board. Students used it to help them study, and it was sold to housewives in chocolate with the claim that it would help them finish their chores faster, with the added benefit that it would make them lose weight (it did). By 1939, Pervitin was used to treat menopause, depression, seasickness, pains related to childbirth, vertigo, hay fever, schizophrenia, anxiety, and “disturbances of the brain.”

Army physiologist Otto Ranke immediately saw its potential. Testing it on university students in 1939, he found that the drug enabled them to be remarkably focused and productive on very little sleep. Pervitin increased performance and endurance. It dulled pain and produced feelings of euphoria, but unlike morphine and heroin, it kept the user awake. Ranke himself became addicted to it after discovering that the drug allowed him to work up to fifty hours straight without feeling tired.

Despite its popularity, Pervitin became prescription only in 1939, and was further regulated in 1941 under the Reich Opium Law. That didn’t slow down consumption, though. Even after the regulation came in, production increased by an additional 1.5 million pills per year. Prescriptions were easy to come by, and Pervitin became the accepted Volksdroge (People’s Drug) of Nazi Germany, as common as acetaminophen is today.

Although the side effects were serious and concerning, doctors continued to readily prescribe it. Doctors themselves were among the most serious drug abusers in the country at this time. An estimated forty percent of the doctors in Berlin were known to be addicted to morphine.

As medical officer Franz Wertheim wrote in 1940:

“To help pass the time, we doctors experimented on ourselves. We would begin the day by drinking a water glass of cognac and taking two injections of morphine. We found cocaine to be useful at midday, and in the evening we would occasionally take Hyoskin (an alkaloid derived from nightshade) … As a result, we were not always fully in command of our senses.”

Its main user base, however, was the army. In addition to the benefits shown during the test on the university students, Ranke found that Pervitin increased alertness, confidence, concentration, and willingness to take risks, while it dulled awareness of pain, hunger, thirst, and exhaustion. It was the perfect drug for an army that needed to appear superhuman. An estimated one hundred million pills were consumed by the military in the pre-war period alone. Appropriately enough, one of the Nazis’ slogans was, “Germany, awake!”

Germany was awake, alright.

Military Use

After its first major test during the invasion of Poland, Pervitin was distributed to the army in shocking quantities. More than thirty-five million tablets of Pervitin and Isophan(2) were issued to the Wehrmacht and Luftwaffe between April and July of 1940 alone. The military was aware that Pervitin was powerful and advised sparing use for stress and “to maintain sleeplessness” as needed, but as tolerance increased among the troops, more and more was needed to produce the same effects.


Pervitin was a key ingredient in the success of the Blitzkrieg (lightning war). In these short bursts of intense violence, speed was everything. In an interview with The Guardian, Ohler summarized:

“The invasion of France was made possible by the drugs. No drugs, no invasion. When Hitler heard about the plan to invade through Ardennes, he loved it. But the high command said: it’s not possible, at night we have to rest, and they [the allies] will retreat and we will be stuck in the mountains. But then the stimulant decree was released, and that enabled them to stay awake for three days and three nights. Rommel and all those tank commanders were high, and without the tanks, they certainly wouldn’t have won.”

Bomber pilots reported using Pervitin to stay alert throughout the Battle of Britain. Launches were often late at night, so German pilots would not make it to London until after midnight. As one bomber pilot wrote:

“You were over London or some other English city at about one or two in the morning, and of course then you’re tired. So you took one or two Pervitin tablets, and then you were all right again … The commander always has to have his wits about him, so I took Pervitin as a precautionary measure. One wouldn’t abstain from Pervitin because of a little health scare. Who cares when you’re doomed to come down at any moment anyway?”

Pervitin was issued to pilots to combat fatigue, and some of its nicknames—“pilot salt,” “Stuka pills,” “Göring pills”—hinted at its use. One commodore fighting in the Mediterranean described the feeling of using it while flying:

“The engine is running cleanly and calmly. I’m wide awake, my heartbeat thunders in my ears. Why is the sky suddenly so bright, my eyes hurt in the harsh light. I can hardly bear the brilliance; if I shield my eyes with my free hand it’s better. Now the engine is humming evenly and without vibration—far away, very far away. It’s almost like silence up here. Everything becomes immaterial and abstract. Remote, as if I were flying above my plane.”

As powerful as Pervitin was, it wasn’t enough. Still, whatever they needed was given to them. By 1944, Vice-Admiral Hellmuth Heye requested something stronger that would enable troops to fight even longer while boosting their self-esteem. Not long after, Kiel pharmacologist Gerhard Orzechowski answered with a newer, stronger pill called D-IX, the active ingredients of which were three milligrams of Pervitin, five milligrams of cocaine, and five milligrams of Eukodal, a painkiller derived from morphine.

Initially tested on prisoners at the Sachsenhausen concentration camp (the victims were forced to walk until they dropped, regardless of how long it took), D-IX was given to the marines piloting one-man U-boats designed to attack the Thames estuary. It was issued as a kind of chewing gum meant to keep the marines awake and piloting the boats for days at a time before they finally attacked the British. It did not have the intended effect, however. Trapped underwater for days at a time, the marines suffered psychotic episodes and often got lost.

The Hangover

No “miracle pill” is perfect, and anything that can keep people awake for days is going to have side effects. Long-term use of Pervitin could result in addiction, hallucinations, dizziness, psychotic phases, suicide, and heart failure. Many soldiers died of cardiac arrest. Recognizing the risks, the Third Reich’s top health official, Leonardo Conti, attempted to limit his forces’ use of the drug but was ultimately unsuccessful.

Temmler Werke continued supplying Pervitin to the armies of both East and West Germany until the 1960s. West Germany’s army, the Bundeswehr, discontinued its use in the 1970s, but East Germany’s National People’s Army used it until 1988. Pervitin was eventually banned in Germany altogether, but methamphetamine was just getting started.

Jessica Cale

Sources

Cooke, Rachel. High Hitler: How Nazi Drug Abuse Steered the Course of History. The Guardian, September 25th, 2016.

Hurst, Fabienne. The German Granddaddy of Crystal Meth. Translated by Ella Ornstein. Spiegel Online, May 30th, 2013.

Lewy, Jonathan. The Drug Policy of the Third Reich. Social History of Alcohol and Drugs, Volume 22, No 2, 2008

Ohler, Norman. Blitzed: Drugs in the Third Reich. Houghton Mifflin Harcourt. New York, 2015.

Ulrich, Andreas. The Nazi Death Machine: Hitler’s Drugged Soldiers. Translated by Christopher Sultan. Spiegel Online, May 6th, 2005.

(1) Don’t worry. We’re definitely going to cover that.

(2) Isophan: a drug very similar to Pervitin produced by the Knoll pharmaceutical company

Resurrection, Corpse Art, and How the Father of Modern Surgery Stole the Irish Giant


Resurrectionists by Phiz (H.K. Browne). Chronicles of Crime, 1841.

By the middle of the eighteenth century, medical science in Britain was rapidly evolving. Surgeons had split from the Worshipful Company of Barbers as a professional guild in 1745, forming the Company of Surgeons. This was the forerunner of the Royal College of Surgeons, which was created by Royal Charter in 1800, and the reason we don’t call surgical consultants ‘Doctor’ – barber-surgeons held no medical degree.

As the profession established itself as an Enlightenment science based upon empirical research and experiment, two figures came to dominate its development: the Scottish anatomist and physician William Hunter (1718 – 1783), and his younger brother John (1728 – 1793), who went on to be known as the ‘Father of Modern Surgery.’

William had studied medicine at Edinburgh University. He moved to London in 1741, studying anatomy at St George’s Hospital. He quickly established himself as an able physician, also running a private anatomy school in London offering supplementary tuition to hospital school students. He taught the ‘Parisian’ method, whereby each trainee surgeon could work on an individual corpse rather than the more usual practice of watching an instructor dissect or lecture using models in a great Theatrum Anatomica.

William was joined there by his brother John, who acted as his assistant, which in practice almost certainly meant illegally procuring bodies for dissection, before becoming a teacher himself. John went on to study at Chelsea Hospital and St. Bartholomew’s, and was commissioned as an army surgeon in 1760, further refining his skills during the Seven Years’ War.


Smugglerius. Sketched by William Linnell, 1840.

William, meanwhile, became physician to Queen Charlotte in 1764, and by the end of the decade he was a Fellow of the Royal Society and a Professor of Anatomy to the Royal Academy, where he once posed the body of an executed smuggler so that it would stiffen into the attitude of the Roman statue the ‘Dying Gaul,’ before flaying it and having the Italian artist Agostino Carlini cast it in plaster. (The ‘Smugglerius’ can still be seen at the Royal Academy.) (See above.)

On returning to England on half-pay, John became a surgeon at St George’s Hospital in 1768, after a brief stint as a dentist during which he experimentally transplanted human teeth. In 1776, he was appointed surgeon to George III, rising to the position of Surgeon General in 1790.

In addition to the importance of their collective research, which remains relevant to this day, these distinguished brothers were innovative teachers. The Hunters taught some of the most influential blades of the next generation, such as Sir Astley Paston Cooper (1768 – 1841), whose passion for anatomical study was such that he once dissected an elephant obtained from the Royal Menagerie in his front garden, the carcass being too big to get inside. They stressed the importance of hands-on pathological and physiological knowledge, which could only be gained through the regular dissection of animals and human beings, and which enhanced diagnostic accuracy and refined surgical technique. Books and lectures, they believed, were not enough, and no published medical ‘fact’ should be accepted without rigorous empirical testing.

Despite such advances, Georgian surgery was not pretty. There was no real understanding of infection, and no anaesthetic. Operating tables were made of wood, an ideal surface for bacteria to flourish, with a channel for blood to run off into sawdust-filled buckets. John Hunter called his patients ‘victims,’ and they were tied down and held as necessary, conscious and screaming throughout the procedure, which was often conducted in front of a large class of medical students. The mortality rate was high, but your chances of survival were greatly enhanced if your surgeon was knowledgeable, precise, and above all quick with the blade and the suture. That said, many of the patients who were strong enough to survive the operation subsequently died from infection.

For surgeons, the only way was to learn by doing. The problem was that there simply weren’t enough human corpses legally available to anatomists. Bodies for dissection were supplied under the provision of the 1752 Murder Act, as an additional deterrent to what politicians believed was a troubling rise in capital crime.

Even the Bloody Code could not meet the ever-growing demand for specimens in the burgeoning and lucrative world of the private anatomy schools. The surgeon Robert Knox, for example, who was supplied by Burke and Hare, had 400 students under him at the height of his success, while his school was only one of half a dozen in Edinburgh at the time. Thus, as is well-known, came the resurrection men, organised criminal gangs who exhumed bodies from graveyards which they sold to the surgeons, who were well aware of where their ‘subjects’ were coming from. This wasn’t a new trade, but by the end of the century it was becoming a ghoulish epidemic. As James Blake Bailey wrote in The Diary of a Resurrectionist (1896), ‘The complaint as to the scarcity of bodies for dissection is as old as the history of anatomy itself.’

Even though, as surgeons were quick to argue, the general population could only benefit from advances in surgical knowledge and well-trained doctors, the thought of body-snatching was appalling to ordinary folk. Dissection carried the stigma of criminal punishment, while in a god-fearing culture, people believed that if their mortal remains were defiled, they would not rise on the Day of Judgement. To medical men, however, all this was a necessary evil, in which the good far outweighed the bad. Surgeons viewed themselves as scientists; human corpses were no different to any other dead animal, merely specimens to study and, indeed, collect.

John Hunter was a case in point. Hunter was an active learner, who eschewed what he saw as outdated and inadequate academic study in favour of dissecting hundreds, if not thousands, of bodies provided by a network of resurrection men. He was particularly interested in abnormal specimens, and his professionally detached, unemotional eye saw no harm in his obsessive pursuit of the mortal remains of the ‘Modern Colossus’ or ‘Irish Giant,’ Charles Byrne (1761 – 1783), despite the public protestations to the contrary by the fatally ill young man.

The Surprising Irish Giant by Thomas Rowlandson 1782

The Surprising Irish Giant. Thomas Rowlandson, 1782.

The 7’ 7” Byrne was a popular celebrity in England, conquering London in 1782, but his great height was a symptom of the then unknown and unnamed disorder Acromegaly, and by the age of twenty-two his health was failing rapidly, and Hunter wanted him. Terrified, the boy from County Tyrone gave an undertaker his life savings and arranged with friends that his body be constantly watched until it was sealed in a lead coffin and buried at sea.

Hunter’s fears were more practical in nature. Concerned that another surgeon might beat him to it, Hunter paid a man to watch Byrne’s lodgings for news of his demise, while the men charged with protecting the body were paid £500 to look the other way. Byrne died in June 1783, but when his huge coffin set sail from Margate and was duly committed to the deep, he was not in it, having been conveyed to Hunter’s house in Earl’s Court and boiled down to his bones.


John Hunter by John Jackson (after Sir Joshua Reynolds), 1813.

Four years later, after public interest in the Irish Giant had died down, Byrne’s articulated skeleton went on display at the Hunterian Museum at the Royal College of Surgeons. It stands there to this day, despite calls from the British Medical Journal in 2011 and the Mayor of Derry in 2015 to end its unethical display and bury it in accordance with Byrne’s final wishes. The curious brown discolouration of the skeleton is the result of Hunter’s indecent haste during the rendering process, which locked fat into the bones.

To Hunter, who died of a heart attack ten years after Byrne, this was all done in the interests of science, and his reputation suffered no damage as a result. He’s still getting away with it. His marble bust (one of many public memorials) is mounted proudly above the glass case in which Byrne forever stands, the centrepiece of the museum. In his portrait by Sir Joshua Reynolds, exhibited at the Royal Academy in 1786, the Irish Giant’s skeletal feet are clearly visible in the background (see above right).

Dr. Stephen Carver teaches creative writing at The Unthank School of Writing. His latest book, The 19th Century Underworld, will be published by Pen & Sword next year. You can find more of his writing here

Sources

Bailey, James Blake. (1896). The Diary of a Resurrectionist 1811-1812, to which are added an account of the resurrection men in London & a short history of the passing of the anatomy act. London: S. Sonnenschien & co.

Cooper, Bransby Blake. (1843). The Life of Sir Astley Cooper. 2 vols. London: John W. Parker.

Cubbage, Eric. (2011). ‘The Tragic Story of Charles Byrne “The Irish Giant”.’ Available at: http://www.thetallestman.com/ (Accessed November 26, 2017).

Garrison, Fielding H. (1914). An Introduction to The History of Medicine. Philadelphia: Saunders.

Low, Donald A. (1999). The Regency Underworld. London: Sutton.

Muinzer, Thomas (2013). ‘A Grave Situation: An Examination of the Legal Issues raised by the Life and Death of Charles Byrne, the “Irish Giant”.’ International Journal of Cultural Property. 20 (1), February.

Moore, Wendy. (2005).  The Knife Man: The Extraordinary Life and Times of John Hunter, Father of Modern Surgery. New York: Broadway.

Richardson, Ruth. (1988). Death, Dissection and the Destitute. London: Penguin.

Fanny Burney and Her Mastectomy


Fanny Burney

In 1811, before anesthesia was invented, Frances Burney d’Arblay had a mastectomy aided by nothing more than a wine cordial. She wrote such a gripping narrative about her illness and operation afterwards that readers today still find it riveting and informative.

Fanny came from a large family and was the third child of six. From an early age, she began composing letters and stories, and she became a phenomenal diarist, novelist, and playwright in adulthood. Certainly, her skillful writing was a primary reason her mastectomy narrative had such appeal.

In her narrative, Fanny provides “psychological and anatomical consequences of cancer … [and] while its wealth of detail makes it a significant document in the history of surgical techniques, its intimate confessions and elaborately fictive staging, persona-building, and framing make it likewise a powerful and courageous work of literature in which the imagination confronts and translates the body.” Prior to her surgery, she had written similar works about “physical and mental pain to satirize the cruelty of social behavioral strictures, especially for women.”


Dr. Samuel Johnson

Fanny grew up in England and had been embraced by the best of London society. She had served in George III and Queen Charlotte’s court as Second Keeper of the Royal Robes. Moreover, she was admired by such literary figures as Hester Thrale, David Garrick, and Edmund Burke. Fanny also befriended Dr. Samuel Johnson, the English writer who made significant contributions to English literature as a poet, essayist, moralist, literary critic, biographer, editor and lexicographer. In fact, some of Fanny’s best revelations are about Johnson, how he teased her, and the fondness that he held for her.

In 1793, Fanny married Alexandre-Jean-Baptiste Piochard d’Arblay and became Madame d’Arblay. D’Arblay was an artillery officer who served as adjutant-general to the famous hero of the American Revolution, Gilbert du Motier, Marquis de Lafayette. D’Arblay had fled France for England during the Revolution, as many other Frenchmen had. However, in 1801, d’Arblay was offered a position in Napoleon Bonaparte’s government. He and Fanny relocated to France in 1802 and moved to Passy (the same spot where Benjamin Franklin and the princesse de Lamballe had lived), and they remained in France for about ten years.


Baron Dominique-Jean Larrey (left) and Antoine Dubois (right)

While living in France, Fanny suffered inflammation in her right breast in 1804 and 1806. She initially dismissed the problem, but then in 1811 the pain became severe enough that it affected her ability to use her right arm. Her husband became concerned and arranged for her to visit Baron Dominique-Jean Larrey, First Surgeon to the Imperial Guard, as well as the leading French obstetrician, surgeon, and anatomist, Antoine Dubois.

The French doctors treated Fanny palliatively, but as there was no response to the treatment, it was determined surgery was necessary. Fanny’s surgery occurred on 11 September 1811. At the time, surgery was still in its infancy and anesthesia unavailable. Cocaine, the first effective local anesthetic, would not even be isolated until the late 1850s, and Karl Koller did not demonstrate its anesthetic use until 1884. So, it must have been horrific for Fanny to experience the pain of a mastectomy with nothing more than a wine cordial that may have contained some laudanum. Fanny was traumatized by the surgery, and it took months before she wrote about its details to her sister Esther, exclaiming:

“I knew not, positively, then, the immediate danger, but every thing convinced me danger was hovering about me, & that this experiment could alone save from its jaws. I mounted, therefore, unbidden, the Bed stead – & M. Dubois placed upon the Mattress, & spread a cambric handkerchief upon my face. It was transparent, however, & I saw through it, that the Bed stead was instantly surrounded by the 7 men & my nurse. I refused to be held; but when, Bright through the cambric, I saw the glitter of polished Steel – I closed my Eyes. I would not trust to convulsive fear the sight of the terrible incision. A silence the most profound ensued, which lasted for some minutes, during which, I imagine, they took their orders by signs, & made their examination – Oh what a horrible suspension! … The pause, at length, was broken by Dr. Larry [sic], who in a voice of solemn melancholy, said ‘Qui me tiendra ce sein?”

Fanny went on to describe “torturing pain” and her inability to restrain her cries as the doctors cut “through veins – arteries – flesh – nerves.” Moreover, she noted:

“I began a scream that lasted unintermittingly during the whole time of the incision – & I almost marvel that it rings not in my Ears still! so excruciating was the agony. When the wound was made, & the instrument was withdrawn, the pain seemed undiminished, for the air that suddenly rushed into those delicate parts felt like a mass of minute but sharp & forked poniards, that were tearing the edges of the wound. … I attempted no more to open my Eyes, – they felt as if hermetically shut, and so firmly closed, that the Eyelids seemed indented into my Cheeks. The instrument this second time withdrawn, I concluded the operation over – Oh no! presently the terrible cutting was renewed – and worse than ever … I then felt the Knife rackling against the breast bone – scraping it! – This performed, while I yet remained in utterly speechless torture. “

Despite the excruciating pain, Fanny lived through the operation, and her surgery was deemed a success. Larrey produced a medical report about his brave patient stating that he removed her right breast at 3:45pm and that Fanny showed “un Grand courage.” Courageous as she was, there was no way for doctors to determine if Fanny’s tumor was malignant or if she suffered from mastopathy.


Fanny’s Commemorative Plaque. Courtesy of Bath-heritage.co.uk

Fanny’s healing took a long time, and while still recuperating, she and her husband returned to England in 1812. Six years later, in 1818, her husband died from cancer, and she died twenty-two years later, at the age of eighty-seven, on 6 January 1840 in Lower Grosvenor-street in London. As Fanny had requested, a private funeral was held in Bath, England, and attended by a few relatives and some close friends. She was laid to rest in Walcot Cemetery, next to her beloved husband and her only son Alexander, who had died three years earlier. Their bodies were later moved during redevelopment of the Walcot Cemetery to the Haycombe Cemetery in Bath and are buried beneath the Rockery Garden.

References

DeMaria, Jr., Robert, British Literature 1640-1789, 2016
“Died,” in Northampton Mercury, 18 January 1840
Epstein, Julia L., “Writing the Unspeakable: Fanny Burney’s Mastectomy and the Fictive Body,” in Representations, No. 16 (Autumn, 1986), pp. 131-166
Madame D’Arblay, in Evening Mail, 20 January 1840
Madame D’Arblay’s Diary, in Evening Mail, 18 May 1842
“The Journals and Letters of Fanny Burney (Madame D’Arblay), Volume VI, France 1803-1812,” in Cambridge Journals 

Geri Walton has long been interested in history and fascinated by the stories of people from the 1700s and 1800s. This led her to get a degree in History and resulted in her website, geriwalton.com, which offers unique history stories from the 1700s and 1800s. Her first book, Marie Antoinette’s Confidante: The Rise and Fall of the Princesse de Lamballe, discusses the French Revolution and looks at the relationship between Marie Antoinette and the Princesse de Lamballe.
Facebook | Twitter | Google+ | Instagram | Pinterest

A Field Guide to Historical Poisons

[From the archives]

The Long Way Home takes place in the court of Louis XIV during the Affair of the Poisons. During this period, many people from all walks of life were employing poison to dispatch rivals and even family members to improve their fortunes or standing at court. As you can imagine, poison plays a large part in the plot of The Long Way Home. Here are three that are featured in the book along with symptoms so you’ll be first to know if your enemies have dosed your wine.

You know, just in case.

Arsenic (also known as Inheritance Powder)

Arsenic was the most commonly used poison at this time, and was used alone or to add extra toxicity to other lethal concoctions. It was the primary ingredient in Inheritance Powder, so called because of the frequency with which it was used against relatives and spouses for the sake of inheritance.

As tasteless as it was potent, arsenic usually went undetected in wine or food, although it was also added to soap and even sprinkled into flowers. It could easily kill quickly, but it was more commonly administered over a long period of time to make it appear that the victim was suffering from a lingering illness. The symptoms begin with headaches, drowsiness, and gastrointestinal problems, and as the poisoning progresses, they worsen into convulsions, muscle cramps, hair loss, organ failure, coma, and death.

Unusually for a poison (lead aside), arsenic has had many other common uses throughout history. It was used as a cosmetic as early as the Elizabethan period. Combined with vinegar and white chalk, it was applied to whiten the complexion, a precursor to the lead-based ceruse popular in later centuries.

Ad for Arsenic Wafers, 1896. Arsenic was a common complexion treatment until the early 20th century.

By the Victorian period, arsenic was taken as a supplement to correct the complexion from within, resulting in blueish, translucent skin. Victorian and Edwardian doctors prescribed it for asthma, typhus, malaria, period pain, syphilis, neuralgia, and as a nonspecific pick-me-up. It was also used in pigments such as Paris Green, Scheele’s Green, and London Purple, all of them extremely toxic when ingested or inhaled. A distinctive yellow-green, Scheele’s Green was a popular dye in the nineteenth century for furnishings, candles, fabric, and even children’s toys, but it gave off a toxic gas. It may have even played a part in Napoleon’s death. While it took nearly a century to discover the dangers of the pigment, it was later put to use as an insecticide.

A Glass of Wine With Caesar Borgia. John Collier, 1893. From left to right: Cesare, Lucrezia, their father, Pope Alexander VI, and a young man with an empty glass. The implication is that the man doesn’t know if it will be poisoned.

Cantharides (also known as Cantarella or Spanish Fly)

Cantarella was a poison rumored to have been used by the Borgias (among others). Although it appeared in literature as something that could mimic death, cantarella was probably made from arsenic, like most of the common poisons of the era, or from cantharidin powder made from blister beetles, and was highly toxic. Cantharides are now more commonly known as Spanish Fly.

Although it was only rumored to have been used by the Borgias, it was definitely associated with the Medicis. Aqua Toffana, or Aquetta di Napoli, was a potent mixture of both arsenic and cantharides allegedly created by an Italian countess, Giulia Tofana (d. 1659). Colorless and odorless, it was undetectable even in water, and as little as four drops could cause death within a few hours. It could also be mixed with lead or belladonna for a little extra f*** you.

In case you’re wondering how one would catch enough blister beetles to do away with one’s enemies, cantharides were surprisingly easy to come across. They were also used as an aphrodisiac. In small quantities, they engorge the genitals, so it must have seemed like a good idea at the time. In larger quantities, however, they raise blisters, cause inflammation, nervous agitation, burning of the mouth, dysphagia, nausea, hematemesis, hematuria, and dysuria.

Oh, and death.

The powder was brownish in color and smelled bad, but mostly went unnoticed with food or wine. More than one character in The Long Way Home has come in contact with it, and it even plays a part in the story.

Ad for Pennyroyal Pills, 1905.

Pennyroyal

Pennyroyal was not often used to intentionally poison anyone, but I’m including it in this guide because of its toxic effects. Usually drunk as a tea, it was used as a digestive aid and to cause miscarriage. It was also used in baths to kill fleas or to treat venomous bites.

Although this is the least toxic of the bunch, its side effects are still worrying. Taken in any quantity, it may not only cause contraction of the uterus, but also serious damage to the liver, kidneys, and nervous system. It’s a neurotoxin that can cause auditory and visual hallucinations, delirium, unconsciousness, hearing problems, brain damage, and death.

Along with Inheritance Powder and Cantarella, Pennyroyal also appears in The Long Way Home and causes some interesting complications for a few of our characters.

*

All of these poisons were common and easily obtainable in much of Europe during the time this book takes place, and, as you can see, they continued to be commonly used for a variety of purposes until very recently. The use of Inheritance Powder in particular is very well-documented, and it played a huge part in the Affair of the Poisons as well as commanding a central position in The Long Way Home.

Don’t say I didn’t warn you.

Bones, Blood, Barbers, and Butchers: Surgeons in the 18th Century

In the eighteenth century, the record for the fastest amputation at the thigh was nine seconds, start to finish, including sawing through the bone. Are you impressed yet? Even the average, thirty seconds, was pretty damned fast.

And speed was of the essence. Let’s face it. If you needed surgery in the eighteenth century or the first half of the nineteenth, you’d better be strong and brave, because it wasn’t a doddle. Not for the surgeon, and not at all for the patient.

Patients faced three major killers

Surgeons had solved one of the major issues that killed people who needed surgery, reinventing ligatures to tie off blood vessels so the patient didn’t bleed out on the table. Before the sixteenth century, they’d used cautery—burning—to seal any gushers, vastly adding to the pain. And, of course, closing up the wound as fast as possible helped.

And pain was the second issue. No effective anesthetics. Not until the mid-nineteenth century. The patient was awake for the entire operation, which was the main reason why speed (and some strong helpers to hold the patient down) mattered.

The biggest killer was factor number three. Germs.

Not that they knew that, of course. The prevailing opinion was that wound infections were caused by air, though how, nobody quite knew. They had no way of knowing that the surgeon’s hands and clothes, the bed sheets, the surgical instruments, the dressings, and a myriad of other surfaces that would come into contact with the patient were covered with organisms too tiny to see, but that would infect the wound. Most people sickened. More than half died.

Keep out the air to keep out the contagion

Some hospitals did pretty well. Their theory was that the infective element was carried in noxious fumes; that is, if it smelled like bad air, it would be bad for their patient. Alexander Monro (Primus and Secundus), a father and son team who headed the Royal Infirmary in Edinburgh, must have run a clean operation. They managed to get the death rate for amputations down to eight percent. Given that other hospitals of the time managed rates of 45 to 65 percent, that’s truly impressive.

Most surgeons relied on speed to limit the amount of time the wound was exposed to the air, thus—they hoped—cutting down on the damage the air did to the tissues.

More butchery than medicine

So a fast surgeon was far more likely to be a successful surgeon for three reasons: less blood flow, a shorter time of acute agony, and (they thought) less contagion. No wonder that, to the rest of the human race, surgery seemed more a matter of butchery than medicine.

Naturally (or so it seemed at the time), physicians did not perform surgery. Physicians had, since medieval times, been university trained. They were gentlemen’s sons with a medical doctorate, highly educated and knowledgeable about the humours of the body and the appropriate ways to balance them. In theory, their superior knowledge made them the only proper people to practice medicine and oversee surgery. They did not involve themselves in physical labour, but expected rather to command those who distilled the medicines they prescribed (apothecaries) or who carried out the operations they deemed necessary.

Surgeons, barber surgeons and apothecary surgeons

Specialist surgeons learned their craft on the job, working as a surgeon’s mate in the navy or the army, or as the apprentice to a barber surgeon or an apothecary surgeon.

Barbers were good men with a blade, so an obvious choice for removing some part that shouldn’t be there or performing a beneficial bloodletting. The familiar red and white barber’s pole dates from the time of the barber surgeon, representing the rod the patient held tightly during the operation and the bloodied and clean bandages used. When washed and hung to dry, they would twist together in the wind, forming the spiral we see today.

Apothecary surgeons had won a landmark case in the first decade of the eighteenth century, when an apothecary was taken to court by the College of Physicians for compounding and administering medicines without the benefit of a physician’s advice. The Physicians won, but the Society of Apothecaries appealed to the House of Lords, who were unimpressed with the argument that allowing apothecaries to care for the sick would:

“Deprive the gentry of one of the professions by which their younger sons might honourably subsist and be a great detriment to the Universities.”

The Lords reversed the judgement.

The rise of a profession

By the eighteenth century, surgeons were giving physicians a run for their money, some attending university as well as learning their craft by apprenticeship. However, they seldom had any formal qualifications before the Royal College of Surgeons was founded in London in 1800. They were ‘Mister’ compared to the physician’s more prestigious ‘Doctor’, though the brilliant work of a plethora of eighteenth century surgeons raised their status and the work of medical teaching hospitals such as the Royal Infirmary mentioned above raised their knowledge.

By the time Victoria ascended the throne, the confidence of surgeons, and the income they could command, had risen to the point that they cheekily made the formerly insulting honorific into a badge of honour. In the UK, Eire, and New Zealand to this day, surgeons are called ‘Mister’ rather than ‘Doctor’.

Jude Knight’s writing goal is to transport readers to another time, another place, where they can enjoy adventure and romance, thrill to trials and challenges, uncover secrets and solve mysteries, delight in a happy ending, and return from their virtual holiday refreshed and ready for anything.

She writes historical novels, novellas, and short stories, mostly set in the early 19th Century. She writes strong determined heroines, heroes who can appreciate a clever capable woman, villains you’ll love to loathe, and all with a leavening of humour.

A Raging Madness is out May 9th. Stop by our sister blog today to see surgery in action in a new excerpt and enter two giveaways!

On The Famous Voyage: Finding London’s Lost River


The Fleet River. Samuel Scott, 1750.

London’s major river is, of course, the Thames but, as the capital’s antiquarians will tell you, there are more than a dozen ancient tributaries hidden beneath the surface of the modern metropolis. The largest of these smaller rivers is the River Fleet, which flows from the largest stretch of common green in London, at Hampstead Heath, to Blackfriars Bridge, where it enters the Thames. This is a journey, not just from North London to the River, but also through the history of the City from Ancient to Modern times, marking some colourful characters and encompassing some bewildering changes along the way.

Cities are typically built along rivers to provide drinking water, transport, defense, and sewage removal. The Fleet has served all of these functions over London’s long history. As place-names along its banks (Bridewell, Clerkenwell) suggest, many wells were built along the Fleet in Roman and Saxon times, although, as we shall see, the purity of its waters was not set to be a defining feature as London grew.

The Fleet (‘tidal inlet’ in Anglo-Saxon) initially provided a waterway which served London from the North and, in a later incarnation as the New Canal, was part of the network which brought coal from the North of England to fuel the rapidly industrializing London of the seventeenth and eighteenth centuries. Even after the canals were superseded by road and rail and the river was entirely covered over in the later eighteenth and early nineteenth centuries, the valley carved by the Fleet continued to form the basis for some of London’s modern arteries, such as Farringdon Road and the Metropolitan Railway line (although it resisted having an underground railway line, what would become the Jubilee Line, laid beneath it by repeatedly flooding the tunnels).

Defensively, the Fleet has a rather inglorious history. It is unclear how the Fleet was utilized by the Romans and it seems rarely to have been called upon subsequently. A second century boat carrying ragstone (possibly intended for building the city wall) was discovered in 1962, sunk at the mouth of the river.

Much later, the Fleet’s banks were built up into earthworks during the Civil War, when London was very much a Parliamentarian (‘Roundhead’) stronghold. The Royalist armies, however, never threatened the capital, and Charles II’s return to the City was by invitation rather than by conquest. During one of the great crises of the restored king’s reign in 1666, desperate Londoners hoped that the Fleet would provide an effective firebreak as the Great Fire reached its third day. Here the Fleet proved as ineffective as the civic defenses, and the Fire jumped the Fleet ditch, ultimately allowing it to claim St Paul’s Cathedral.

Of course, the most serious modern military threat to London came from the air in the form of the Luftwaffe. The old river beneath Fleet Street could offer no protection when Serjeant’s Inn, one of the oldest legal precincts in England, was destroyed during the Blitz.

It is with the removal of sewage and other waste, or at least its failure to do so effectively, that the Fleet is most famously associated. As London grew, the Fleet increasingly became a repository for whatever the city’s inhabitants wanted to get rid of. The medieval meat markets which grew up to feed the expanding population soon became problematic, and in 1290 the Carmelite monks complained that the offal deposited in the river by butchers at a nearby market (the delightfully-named Shambles, at Newgate) was constantly blocking what was, at this point, a stream.


The southern end of the Fleet, 1550s.

Although all manner of industries poured waste into the Fleet, it was the offal and dead animals in various forms which seemed to catch the imagination of early modern satirists of the capital. Ben Jonson’s mock-epic poem (c. 1612), which lends its title to this article, was a litany of classical references intertwined with toilet humour and social satire, and it described the diverse pollutants of the river with considerable gusto:

Your Fleet Lane Furies; and hot cooks do dwell,
That, with still-scalding steams, make the place hell.
The sinks ran grease, and hair of measled hogs,
The heads, houghs, entrails, and the hides of dogs:
For, to say truth, what scullion is so nasty,
To put the skins, and offal in a pasty?
Cats there lay divers had been flayed and roasted,
And, after mouldy grown, again were toasted,
Then, selling not, a dish was ta’en to mince them,
But still, it seemed, the rankness did convince them.
For, here they were thrown in with the melted pewter,
Yet drowned they not. They had five lives in future.

Jonson’s influence and the continued assault of the Fleet upon the senses continued into the eighteenth century: Jonathan Swift’s “Drown’d Puppies” and “Dead Cats” of 1710’s A Description of a City Shower, floating amongst the offal and turnip-tops, were echoed by Alexander Pope’s “large tribute of dead dogs to the Thames” in 1728’s Dunciad.

The enthusiasm of these men for describing the sewage, of which the Fleet’s waters seemed largely comprised, was hardly less. Jonson’s ‘voyage’ was taken down a river where “Arses were heard to croak, instead of frogs”. His Fleet contained the contents of every ‘night-tub’ from an overcrowded metropolis, where “each privy’s seat/ Is filled with buttock” and the very “walls do sweat Urine”. This state of affairs is compounded by the diet of a city where “every clerk eats artichokes, and peason, Laxative lettuce, and such windy meat”. In 1700, Thomas Brown has his narrator, an ‘Indian’ revealing the strange “Manners, Customs, and Religions” practiced by the various “Nations” of London to his readers, shove an impudent rag-seller into the kennel [1] in the centre of the street with the words:

Tho’ I want nothing out of your Shops, methinks you all want good Manners and Civility, that are ready to tear a New Sute (suit) from my Back, under pretence of selling me an Olde one; Avant Vermin, your Cloaths smell as rankly of Newgate and Tyburn, as the bedding to be sold at the Ditch-side near Fleet-Bridge, smells of Bawdy-House and Brandy.

Brown’s tone is lighthearted and playful, but some of the associations he makes are telling. The visceral nature of these accounts certainly reflected a literal reality but they also had a metaphorical dimension in which it was the excesses and vices of London itself which were clogging up its abused waterways. The writers were playing, not just on the Fleet’s role in waste disposal, but also on the reputation of those who occupied its banks. In Jonathan Swift’s A Description of a City Shower, in particular, a storm washing through London links the different areas and strata of the city together through its flow.

The Fleet flowed past Bridewell and the Fleet prisons and through areas such as Clerkenwell, notorious for sheltering heretics, thieves, and prostitutes from the arms of the law. Here the bodies floating downstream alongside the unfortunate cats and dogs might be human. The industries around the river were messy and disease was known to cling to its slums. The Dunciad plays on the Fleet’s use as an open sewer by having the hack-writers, who are one of the principal subjects of Pope’s ire, swim in it. The implication was as clear as Pope’s Fleet was ‘muddy’. Much later, Charles Dickens’ child-warping pick-pocket, Fagin, would have his den alongside the Fleet.

From the early attempts by the Carmelites to keep the river unblocked to the late seventeenth and early eighteenth century attempt to make it serve as a canal, the smell and the constant need for dredging could not be overcome. So impossible was it to contain the flood of effluent that, even after the river was paved over during the later part of the eighteenth and early part of the nineteenth centuries, the build-up of trapped gas exploded near Blackfriars in 1846, taking out three posthouses and a steamboat in the process. It must have seemed as though the truth would not be hidden beneath the streets. Eventually, however, the Great Stink of 1858 preceded a concerted effort to enclose the city’s sewers and a London more familiar to us today emerged.

Dr. J.V.P. Jenkins is a historian and freelance editor from London. He earned his BA, Master’s, and Doctorate at Swansea University. He is the new co-editor of Dirty, Sexy History and sometimes tweets @JVPolsomJenkins.

Sources

Brown, Thomas. Amusements serious and comical, calculated for the meridian of London (1700)
Dickens, Charles. Oliver Twist (1839)
Jonson, Ben. On The Famous Voyage (c.1612)
Pope, Alexander. Dunciad (1728)
Swift, Jonathan. A Description of a City Shower (1710)
Ackroyd, Peter. London: The Biography (Anchor; New York, 2003)
Brown, Laura. Fables of Modernity: Literature and Culture in the English Eighteenth Century (Cornell U.P., 2003)
Gray, Robert. A History of London (Taplinger; New York, 1979)

[1] An open gutter, running down the middle of the street. The 1671 Sewage and Paving Act had prescribed moving the kennel from the centre of the street to an open side drain set off by a raised pavement. The main thoroughfares were also to be cambered (built up in the middle for drainage and paved), but these measures were not instantly applied to all streets.