The word “science” dates back to around 1300 C.E. The word “scientist,” on the other hand, didn’t exist until 1833. Until then, the men (and, rarely, women) who tinkered with chemicals and stared at the stars were said to be practicing a branch of philosophy concerned with the workings of the natural world. These “natural philosophers” (or “naturalists,” who focused on biology) were not usually university professors or professional researchers. Oxford and Cambridge were not yet the research institutions we know today; they were more like finishing schools for wealthy, young, strictly Anglican gentlemen. The kind of cutting-edge research we associate with universities today was conducted in completely different places in late Romantic and early Victorian England. Independently wealthy gentlemen with time to spare sometimes pursued scientific research in their homes for their own entertainment. If you didn’t have an inheritance to live on, you could still spend your time on research if you could attach yourself to a wealthy patron who would cover your living expenses and finance your experiments. Many officials of the Anglican Church studied the workings of God’s world between sermons. Among those obliged to work for a living, the advent of patent law encouraged competitive innovation in major industries like mining and construction, opening the door to advances in fields like thermodynamics and chemistry.
During the eighteenth and nineteenth centuries, though, these natural philosophers, inventors, and bricoleurs developed into “scientists” as we know them today, as research became a profession in itself, and universities – often led by new secular institutions like University College, which opened as an alternative to the religiously exclusive Oxbridge colleges – gradually came to focus more on rigorous academia. The birth of the “scientist” and the shifting of science’s place in society seem to have been accompanied by changing ideas about what exactly constitutes “science.” Formerly, physicians were men of science because they had memorized Galen in the original Greek; later, the real men of science were surgeons like John Hunter, who decided to learn anatomy by studying real human bodies instead of antique books, and physicists like Michael Faraday, who conducted methodical experiments to understand the principles of electromagnetism. This was a fairly radical transition, and it drastically altered the relative status of various professions. Today, surgeons are pretty much the most respected doctors in the world; two hundred years ago, physicians diagnosed patients based on millennia-old criteria, and barber-surgeons did the base manual labor of treatments like blood-letting. The difference in their social positions was extremely visible: physicians were permitted to carry gold-tipped canes, while barber-surgeons had to be content with silver.
The growing emphasis on empiricism (as opposed to the received wisdom of the ancients) is demonstrated by the methods by which new technologies found their way into everyday life in the late eighteenth and early nineteenth centuries. For example, in the 1810s, Humphry Davy and George Stephenson each invented, nearly simultaneously, a special type of mining lamp designed not to explode in high-methane environments. A vicious debate ensued over who would get the credit (and the profit). According to Professor Frank James of the Royal Institution, a turning point in the controversy came when Davy’s friends wrote letters (to newspapers, mining companies, etc.) arguing that Davy had invented his lamp “with science” – which, I assume, was supposed to mean that he had reasoned his way to his design from scientific principles, as opposed to making semi-random adjustments through trial and error. Apparently the process of inventing something with science was distinctly different from just stumbling on it through unsystematic tinkering. It’s far from clear that Davy’s inventive process was actually any different from Stephenson’s, but it didn’t matter: the Davy lamp won out – due in part, perhaps, to its deliberate association with Science (with a capital S).
Twenty years earlier, Edward Jenner had begun the experiments that led to his recognition as the inventor of the smallpox vaccine. Jenner came from Gloucestershire, where a minor illness called cowpox was fairly common among people who milked cows regularly. An old wives’ tale of the region claimed that people who had caught cowpox (a brief and mild disease) never got smallpox (a deadly epidemic that ravaged Europe for centuries). Instead of dismissing it as nonsense, Jenner decided to test this idea empirically. (Jenner trained under the aforementioned John Hunter, who always told his students, “Don’t think – try the experiment.”) Using his gardener’s eight-year-old son as a guinea pig (which would undoubtedly have thrilled his institutional review board if he were conducting this experiment today), he introduced pus from a milkmaid’s cowpox sores into fresh scratches in the boy’s arm. James Phipps, the boy in question, acquired and recovered from a mild case of cowpox. Weeks later, in order to determine whether this had resulted in smallpox immunity, Jenner inserted pus taken from a smallpox patient into the boy’s arm (which would be the icing on the institutional review board’s cake). Sure enough, James had no reaction to the smallpox exposure, indicating that he had indeed acquired smallpox immunity from his cowpox infection. (The myriad ethical issues here are a topic for a future discussion.)
This was great, but it turns out that Jenner wasn’t the first to have this idea. Another Gloucestershire doctor (whom Jenner probably knew) had already been talking for years about taking the cowpox connection seriously, and may even have given Jenner the idea. And other forms of inoculation against smallpox had been practiced for centuries – most involved deliberately giving someone a mild case of smallpox (there were several strains of varying severity) at a convenient time in an effort to avoid catching a fatal case later on. In China, for example, people sought smallpox immunity by having the powdered scabs of smallpox patients blown up their noses. (Appealing, I know.) The Turkish technique of introducing smallpox pus into the skin was even brought to England by Lady Mary Wortley Montagu in the early eighteenth century, and variolation (deliberate infection with a mild strain of smallpox in order to confer immunity) was already practiced fairly commonly in England when Jenner was doing his research. In fact, Jenner himself had been variolated against smallpox as a child. So why does Jenner get all the credit? What did he do to deserve it?
You could dismiss him as a plagiarist, but I think that’d be unfair. Sure, he didn’t invent the smallpox vaccine from scratch. However, he was the one who made a difference in public health by bringing this technology to the masses. He deserves credit for popularizing the vaccine, which saved tens of thousands of lives. How did he do it? What set his work apart from all that came before it was the application of empiricism. He tested his theories rigorously and published his results. He developed the vaccine with science. That was what elevated the life-saving vaccine from folk remedy to cutting-edge medicine. The contributions of others, like the country doctors and Lady Montagu, should certainly not be ignored; however, the importance of the distinction between their work and Jenner’s – between myth and medicine, bricolage and science – should not be underestimated.
[Originally posted 7/9/10]