We tend to ask a lot of questions – and demand a lot of answers – about what we put in our bodies. Is it organic? Is it all-natural? Is it genetically modified? Is it processed? Where did it come from? Was it traded fairly? What happened to it between harvest and hearth? How salty is it? Does it contain MSG? Does it contain animals? Does it contain animal products? Has it ever touched anything that touched an animal product? Has it ever come within thirty feet of a tree nut? Will it send me into anaphylaxis?
We are accustomed to finding all this information readily available when we purchase food, but this degree of openness about food and its make-up is fairly new. In the mid-eighteenth century, it was common practice for bakers to whiten their flour artificially using additives like chalk and bone powder, allowing them to sell their bread at a higher price than they could if it were darker. Though such practices continued despite a parliamentary ban, nineteenth-century British politicians seem to have been more concerned with getting enough bang for their buck than with what exactly went into all that bang: the Bread Act of 1822, which specified acceptable weights for commercial breads, remained in effect until World War Two.
However, the nineteenth century did witness significant advances in food regulation, even if Oscar Wilde’s dinner might not have met today’s standards of food purity. Parliament passed acts tightening the restrictions on the contents of foods, drinks, and drugs in 1860, 1875, 1887, and 1899. Still, the main watchdogs of food and drug purity in the Victorian era were the guilds, which set their own standards of food composition, working to keep ash out of bread, acorns and grass out of coffee, and gravel out of peppercorns.
As a pretty obsessive label-reader, I shudder at the idea of even needing to think about these things, but adulterated food was a very real, concrete (maybe literally – ugh) part of daily life for Victorians. Above and beyond the horrors of food contamination, the issues with medicine were a whole other can of worms. The risk of contamination still applied, but there was even less regulation, since apothecaries could put anything they wanted into their patent medicines without revealing their formulas – trade secrets – to the public. Plus, until 1868, anybody could obtain any drug without a prescription from an actual physician. It was also common to self-medicate at home, as cookbooks of the era often contained a section with recipes for tonics and medicines.
Even if you knew exactly what went into your medicine, though, you still weren’t necessarily safe. Many substances considered therapeutic in the nineteenth century are considered poisonous today; opium, mercury, and arsenic were all commonly used in chemists’ concoctions. In a world where laudanum – a blend of alcohol and opium – was given to babies to help them sleep, medicine often did more harm than good.
It’s easy to mock our ancestors for being so oblivious to the true nature of the substances they consumed. But maybe we should bite our tongues. I do believe we are unequivocally healthier today than we were a century and a half ago, and I certainly believe in the application of the scientific method in the form of double-blind FDA studies of drug safety and efficacy. On the other hand, we, too, medicate ourselves with poisons, and we eat plenty of substances that probably can’t legally be called “food.” Lithium, the preferred treatment for bipolar disorder, used to be an ingredient in a popular salt substitute until it was found to cause horrendous neurological damage; it does seem to legitimately ameliorate bipolar symptoms, but we have yet to discover the biochemical mechanism of its effect. And some of our most popular foods are so bizarre that scientists are still studying them.
That said, I’d still rather eat my generic-brand Weetabix than dig into a bowl of Elijah’s Manna.
[Originally posted 7/4/10]