I have a heavy summer cold this hundred-degree morning in the grimy cradle of American democracy, but at least I don't have what the guy next to me has. The poor chap—a Victorian fetus floating in a jar of formaldehyde—has managed to land himself a nasty dose of exencephaly, or skull-lessness, meaning he has to wear the mass of his brain hanging down between his shoulder blades like the pendulous knit hat of a management-level Rastafarian. I make no claims to being a fashion guru—well, I do, but I'm trying to make them less often. Nevertheless, I'm going out on a limb to say it's not a good look, the brain-down-the-back. As the maniacally detached Stacy London delights in explaining to her victims on the Learning Channel's What Not to Wear: If you're going to hang on to a style in hopes it'll come back into fashion, do quickly check to make sure that it ever was in fashion to begin with.
Sorry, I'm being arch; not quite sure how else to react, to be honest. The Mütter Museum of medical anomalies at the venerable College of Physicians of Philadelphia is well supplied with helpful staff and airy colonnades, but what it could really use is a little stack of printed leaflets explaining to the modern visitor how he or she is supposed to feel about all this, or at least what to make of it: the uprooted genitalia and beach-ball tumors, the skeleton of the man whose muscle has turned to bone, the woman so fat that after death her body transformed itself into soap, the embryos in jars whose peeling labels break the sad but unsurprising news that not having a skull, or a brain, or a stomach, or any skin, is a state of affairs "incompatible with life."
Nor is it clear, even allowing for the freewheeling chaos of medical science at the time of the museum's 1858 founding, how any of the exhibits would really help anyone be a better doctor. "Congratulations on your pregnancy, Mrs. Thompson. Fingers crossed it has a skull!" "Ah, yes, Mr. Jenkins, it appears you have your organs on the outside of your body; if you'll give me a minute, I believe somewhere I have a cream." More likely, the young doctors of 1858, like everyone else drawing breath in the English-speaking world at that time, simply liked to unwind after a hard day by gawping at the deformed.
Which they did, famously, for a reason: They were worried.
The Victorians' fascination with Human Monstrosities got off to a rollicking start with the 1818 publication of Mary Shelley's Frankenstein. (The fact that Queen Victoria wouldn't even be born for another year, or take the throne for another 19, only goes to show how desperate they were to get going, I reckon.) The horrifying tale of a brilliant doctor who builds a murderous and repulsive monster by mistake, the book wore its moral on its sleeve, or at least in its original subtitle: The Modern Prometheus.
These were the early days of the Industrial Revolution. One literally couldn't open the newspaper without discovering that someone had just invented the steam train, struck a match, or taken the first picture. After eons of glacial incrementalism, Progress had suddenly taken off and was dragging humanity toward terrifying points uncertain like an awful wild stallion. Frankenstein gave voice to the ubiquitous public concern that the human species, for its arrogant and incautious tinkering with the laws of Nature, might be due some sort of cosmic comeuppance, like Prometheus, who legendarily stole fire from the gods only to find himself strapped to a rock while a bird plucked out his liver—which would then grow back so it could happen all over again the next day, if you can believe it.
Frankenstein became a runaway best seller, and because no one could yet figure out how to make a movie of it, enterprising moguls settled on the next best thing, if not in fact a slightly better thing: the freak show.
Here, in tents at traveling carnivals and on stationary boardwalks, were real monsters—giants, women with beards, skull-less embryos bobbing in jars—that were confrontable in carefully stage-managed environments. The rise of the zoo was an expression of the same phenomenon, as A. N. Wilson observes in his essential The Victorians. At a time when "disturbing thoughts were beginning to dawn in the public mind about the nature of humanity in the scheme of things," as he says, wild animals and mutants gave Victorians something to stare at in safety while asking themselves the big and suddenly urgent question: What, if anything, makes me different from that? In other words, what does it mean to be human?
Which is pretty much where we find ourselves again—hence my visit to the Mütter on this unbreathably hot morning. Thanks to the birth and rapid maturation of genetic engineering, artificial intelligence, and sophisticated pineapple-based facial scrubs that can give even a 50-something mother of 12 a sheen of nubility, at least in a darkened bar, humanity once again finds itself trying to solidify its own definitional boundaries. What are we? we're wondering again. How much could we change and still be ourselves? Where do we stop and the monsters begin? And once again, like our angst-ridden Victorian forebears, we're using our eyes and our stomachs to do our thinking for us, to judge humanness.
Take the debate over embryonic stem cell research. The issue of whether frozen embryos who are doomed to destruction anyway are fair game for scientific experimentation has roiled the nation for the best part of a decade, yet has rarely progressed beyond a battle of competing visual aids. On the progressive, pro-research side of the argument—with which I happen to agree—a favorite tactic of those weighing in is to observe with postmodern panache that the frozen embryos in question are no bigger than the period at the end of this very sentence that I'm typing right now—as if anyone had argued that these embryos should be treasured for their sacred hugeness. President Bush, unconvinced, signed this year's veto of expanded stem cell funding in a room filled with adorable "snowflake babies," all of whom had at one stage been a frozen embryo in a lab and were now—indisputably—fully fledged booger-flicking American young people. As if anyone had argued that frozen embryos couldn't be grown into people given the right care and feeding and a few hundred grand's worth of special treatment by the very people whose funding he was now freezing.
Which is not to imply that anything particularly sinister or Orwellian, or even new, has happened to our powers of rational discourse. The blurring of ethics and aesthetics is a simple fact of human nature and one that's generally served us well. The near-universal opposition to torture, for instance (President Bush, once again, is unconvinced), arises not from the population having carefully waded through volumes of Hobbes and Foucault and arrived at a heavily footnoted opinion but from the fact that the sight of it, and the thought of the sight of it, simply makes us shudder. Because what if that were us? Over time we've managed to compile an impressive canon of moral principles by literally following our guts: that which makes us nauseated, or want to reel away in gibbering horror, must have something ethically wrong with it, and vice versa.
The trouble is—and this is new—that the next generation of freaks and monsters whose humanity we'll be charged with evaluating is likely to be rather easy on the eye. Positively gorgeous in some cases. As science continues to smooth out the rough spots and increase the functionality of the human blueprint, we may find ourselves not repulsed but seduced. It isn't rotten fruit and flaming torches we'll be hurling at the Elephant Men of the future but phone numbers and gift certificates. So smitten will we be with what our eyes are telling us, we'll be tempted to overlook the fact that Karen 3000 is one of a thousand monozygotic sisters or that an entire women's prison in Sweden had to die to make her piercing blue eyes.
And indeed it's with a certain wariness that I totter out of the Mütter Museum onto what Bruce Springsteen so aptly termed the streets of Philadelphia. The Victorian Age may have been a tough time to live through for those with a weak stomach or those, like me, with a finely tuned aesthetic apparatus, but at least the monsters of that era had the courtesy to declare themselves as such. As we enter an era in which wrongness and ugliness have dissolved their ancient alliance, I fear it won't be too long before—at the risk of sounding dramatic—they walk among us.