Guns, Germs, and Steel by Jared Diamond

A simple glance around the modern world makes it clear that some continents, some peoples, have seen greater success, at least insofar as success is measured in terms of material wealth and territorial conquest. Europeans, and their descendants, have by and large achieved the highest levels of financial, technological, and political "progress," and have successfully supplanted native populations on several continents (North and South America, Australia). In the past week, I have already reviewed two books which in some part reflect the recent aftermath of the centuries of European ascent (Robert Fisk's The Great War for Civilisation, reviewed here, and Alistair Horne's A Savage War of Peace, reviewed here).

Most people probably take this reality for granted, without wondering much why history took that particular course. Others who have considered the question have relied upon facile attributions to supposed cultural or racial advantages for Europeans vis-a-vis the rest of humanity. In his controversial, Pulitzer Prize-winning 1997 book, Guns, Germs, and Steel, UCLA professor Jared Diamond aimed to answer this immense question, offering his own provocative thesis:

We all know that history has proceeded very differently for peoples from different parts of the globe. In the 13,000 years since the end of the last Ice Age, some parts of the world developed literate industrial societies with metal tools, other parts developed only nonliterate farming societies, and still others retained societies of hunter gatherers with stone tools. Those historical inequalities have cast long shadows on the modern world, because the literate societies with metal tools have conquered or exterminated the other societies. While those differences constitute the most basic fact of world history, the reasons for them remain uncertain and controversial...

Authors are regularly asked by journalists to summarize a long book in one sentence. For this book, here is such a sentence: "History followed different courses for different peoples because of differences among peoples' environments, not because of biological differences among peoples themselves."

Diamond explains that the desire to investigate this phenomenon arose during his field research in New Guinea, the large island north of Australia that remains home to an incredibly diverse number of tribal and linguistic groups (accounting for more than 1,000 of the world's ~6,000 surviving languages). While there, he was asked by one of his New Guinean friends why it was Europeans who had come to his land, bringing all sorts of advanced tools and products, and not the other way around. Diamond was intuitively skeptical of any explanation based on innate intellectual differences, based in part on the lack of any robust studies demonstrating such differences, and in part on his own observations of the intelligence of New Guinea's native population.

Recognizing immediately that a broad cross-disciplinary approach would be necessary to tackle this question, Diamond found himself well-situated. The child of a physician and a linguist, Diamond studied physiology and biophysics at Harvard and Cambridge, pursued an interest in the ornithology of New Guinea, and developed an expertise in environmental history. His Wikipedia biography claims fluency in twelve languages, and prior to Guns, Germs, and Steel, he had published works in the fields of ecology, ornithology, human evolution, and human sexuality. Throughout the book, Diamond uses a variety of well-documented historical examples to define, test, and then illustrate his thesis, from New Guinea to the Americas. In his effort to redefine human history as a science, he draws from the fields of archaeology, linguistics, botany, zoology, sociology, geology, chemistry, biology, and more. As stated in his thesis, he believes environmental factors to be the prime mover in the broad course of human history, and he identifies four in particular:

The first set consists of the continental differences in the wild plant and animal species available as starting materials for domestication. That's because food production was critical for the accumulation of food surpluses that could feed non-food-producing specialists, and for the buildup of large populations enjoying a military advantage through mere numbers even before they had developed any technological or political advantage. For both of those reasons, all developments of economically complex, socially stratified, politically centralized societies beyond the level of small nascent chiefdoms were based on food production.

But most wild animal and plant species have proved unsuitable for domestication: food production has been based on relatively few species of livestock and crops. It turns out that the number of wild candidate species for domestication varied greatly among the continents... As a result, Africa ended up biologically somewhat less endowed than the much larger Eurasia, the Americas still less so, and Australia even less so...

The early chapters devoted to food production are amongst the most interesting in the book, which might not seem intuitively obvious. I myself was a bit skeptical as to how much attention I could pay to the domestication of wheat and so on. But Diamond's exploration of the junction between random mutation, natural selection, and human intervention through selective breeding is surprisingly compelling. Even more so is his discussion of the world's wildlife, and the factors which make some large mammals (e.g. cattle, sheep) more susceptible to domestication than others (e.g. lions, rhinos). That the distribution of domestication-prone animals so greatly favored Eurasia is one of the most striking revelations in Diamond's book.

[A] second set of factors consists of those affecting rates of diffusion and migration, which differed greatly among continents. They were most rapid in Eurasia, because of its east-west major axis and its relatively modest ecological and geographical barriers. The reasoning is straightforward for movements of crops and livestock, which depend strongly on climate and hence on latitude. But similar reasoning also applies to the diffusion of technological innovations, insofar as they are best suited without modification to specific environments. Diffusion was slower in Africa and especially in the Americas, because of those continents' north-south major axes and geographic and ecological barriers.

The best examples Diamond provides of this phenomenon lie in the contrast between Eurasia and the Americas. Consider the tremendous contacts made between the civilizations of the Fertile Crescent, Europe, and China. And these contacts were not all one-way. Though evidence exists for the earliest food production arising in the Fertile Crescent, successive millennia would see innovations headed both east and west. In the Americas, however, even the great civilizations of Peru and Mesoamerica, the Incas and Aztecs, failed to engage in any analogous cultural or technological exchange. As Diamond laments, the Native Americans were never able to link up their one large domestic animal, the llama of the Andes, with that vital innovation, the wheel.

Related to these factors affecting diffusion within continents is a third set of factors influencing diffusion between continents, which may also help build up a local pool of domesticates and technology. Ease of intercontinental diffusion has varied, because some continents are more isolated than others. Within the last 6,000 years it has been easiest from Eurasia to sub-Saharan Africa, supplying most of Africa's species of livestock. But interhemispheric diffusion made no contribution to Native America's complex societies, isolated from Eurasia at low latitudes by broad oceans, and at high latitudes by geography and by a climate suitable just for hunting-gathering. To Aboriginal Australia, isolated from Eurasia by the water barriers of the Indonesian Archipelago, Eurasia's sole proven contribution was the dingo.

The chapters charting the course of intercontinental diffusion were some of the most difficult for me to work through, whether focused on the Austronesian movement through Southeast Asia or the Bantu expansion through sub-Saharan Africa. Much of the evidence for these progressions is found either in archaeological analysis of pottery or linguistic scrutiny of common words. Comprehensive? Yes. Convincing? Certainly. But this is the only place where the narrative really drags. One exception, owing solely to the bizarre nature of the case, is the migration of Austronesian peoples from their likely origins in Indonesia all the way across the Indian Ocean to the African island of Madagascar, eventually resulting in a remarkably complex demography.

The fourth and last set of factors consists of continental differences in area or total population size. A larger area or population means more potential inventors, more competing societies, more innovations available to adopt--and more pressure to adopt and retain innovations, because societies failing to do so will tend to be eliminated by competing societies. That fate befell African pygmies and many other hunter-gatherer populations displaced by farmers. Conversely, it also befell the stubborn, conservative Greenland Norse farmers, replaced by Eskimo hunter-gatherers whose subsistence methods and technology were far superior to those of the Norse under Greenland conditions. Among the world's landmasses, area and the number of competing societies were largest for Eurasia, much smaller for Australia and New Guinea and especially for Tasmania. The Americas, despite their large aggregate area, were fragmented by geography and ecology and functioned effectively as several poorly connected smaller continents.

In addition to the geographic realities described above, Diamond also places heavy emphasis on various positive-feedback loops. Food production and population size are the most notable of these. Though Diamond is unable to ascertain definitively which is the chicken and which the egg, it is clear that the surpluses of sustenance created by food production will support a larger population than hunting and gathering alone. Not only can this larger population then produce more food, it can spare manpower for other uses, such as professional warfare, politics, and science, which will expand the community's power and its capacity for further innovation. And so on.

Guns, Germs, and Steel was widely read and quite controversial upon publication, and it has remained so in the years since. A quick glance at the book's Wikipedia page gives a decent summary of the various lines of criticism that have been leveled in Professor Diamond's direction, some directed at the substance of his thesis, some focused on particular gaps or weaknesses in his arguments. Some are attributable to the nature of the book, which consists of a mere 400-odd pages of non-technical prose; this ensured a wide audience, but Diamond himself admits the difficulty of purporting to examine 13,000 years of global human history in so few pages.

That said, what Diamond accomplishes in his 400-odd pages is rather impressive. He takes his reader on a rewarding survey of the chronological and geographic scope of human civilization, with fascinating insights gained from fields as diverse as agriculture and linguistics and examples from every inhabited continent. Diamond explicitly intended the book to be provocative, and in his final chapter he advocates for a more scientific approach to the field of human history. At the very least, Guns, Germs, and Steel forcefully demonstrates how vital an appreciation of ecology, biology, and the other sciences is for understanding, if not justifying, the course of our civilization.

The Weather Makers by Tim Flannery

Efforts to understand the climate change debate are often sidetracked by an inability to grasp the scientific principles at play and by partisan substitution of ideology for evidence. Al Gore did tremendous work in An Inconvenient Truth, using visual imagery to establish for the masses the basic idea that our world is changing, that change is occurring with unnatural speed, and that much of this change can be tied to human causes.

But Al Gore is, for some, a divisive figure. A substantial portion of the population thinks he won the presidential election in 2000. An even larger portion probably wishes he had. And for all his erudition, he is not a scientist. So with due credit to his efforts, there is still room for others to play a pivotal role in educating us about climate change and what we can do about it.

Tim Flannery stepped into that role with his 2005 book, The Weather Makers. Flannery, an Australian scientist and environmental activist, had previously published books on the ecological history of Australia and of the United States. In The Weather Makers, he turns his focus to climate change and, in numerous short chapters, endeavors to tackle everything from the basics of climatology and the dangerous warning signs of past decades to the methods of prediction and what those models predict, the recent history of climate politics, and potential solutions to the crisis:

One thing that I hear again and again as I discuss climate change with friends, family, and colleagues is that it is something that may affect humanity in decades to come but is no immediate threat to us. I'm far from certain that that is true, and I'm not sure it is even relevant. If serious change or the effects of serious change are decades away, that is just a long tomorrow. Whenever my family gathers for a special event, the true scale of climate change is never far from my mind. My mother, who was born during the Great Depression--when motor vehicles and electric light were still novelties--positively glows in the company of her grandchildren, some of whom are not yet ten. To see them together is to see a chain of the deepest love that spans 150 years, for those grandchildren will not reach my mother's present age until late this century. To me, to her, and to their parents, their welfare is every bit as important as our own. On a broader scale, 70 percent of all people alive today will still be alive in 2050, so climate change affects almost every family on this planet.

Flannery treats every aspect of his sobering text with an even hand. He does not vilify those whose scientific or political opinions clash with his own. He notes areas of scientific disagreement and gives space to the proposals made by those who deny or diminish the dangers of climate change. He acknowledges the possible need for nuclear power and suggests that the continued use of fossil fuel for airline travel is not only necessary, but maybe even beneficial (due to possible cooling effects from the contrails made by airplane exhaust). Throughout the text, Flannery does not shy away from the shocking, but he never descends into sensationalism or spite.

One unique aspect of the book is that Flannery devotes at least as much attention to the policy failures in his native land as to those in the United States. This is a perspective lacking in the U.S. debate, which, per the recently departed Bush administration's general outlook on the world, tended to devolve into an "us vs. them" mindset. If the U.S. refusal to ratify the Kyoto Protocol is rightly infamous, it is possibly trumped by the Australians' behavior: after bullying the small Pacific island countries most threatened by climate change, and wresting concessions allowing it to expand its own CO2 production, Australia still refused to sign the treaty. It is an indication that Americans are not alone in our dangerous backwardness, though as leaders in innovation and initiative we should still be ashamed not to be at the tip of the spear.

Unfortunately the several years since Flannery published The Weather Makers have failed to yield much visible progress in the war on climate change. Though the presidential campaign last year involved a great deal of renewable energy rhetoric, the legislative process is a different game entirely. Just last week, eight Democratic senators signed a letter stating their opposition to using the budget process to sidestep anticipated Republican filibusters on climate change legislation. At the same time in Copenhagen, the Intergovernmental Panel on Climate Change (IPCC) was releasing a disturbing report:

The world is facing an increasing risk of "irreversible" climate shifts because worst-case scenarios warned of two years ago are being realized, an international panel of scientists has warned.

Temperatures, sea levels, acid levels in oceans and ice sheets were already moving "beyond the patterns of natural variability within which our society and economy have developed and thrived," scientists said in a report released Thursday.

This is particularly upsetting in light of the rather conservative nature of the IPCC. As Flannery describes it, because the panel operates by consensus and includes members from the petrostates and heel-draggers like the U.S., China, and Australia, IPCC reports are "lowest-common-denominator science." But that also means that "If the IPCC says something, you had better believe it--and then allow for the likelihood that things are far worse than it says they are." It is hard to imagine how that could be.

Einstein by Walter Isaacson

There are some individuals from history whose legacy looms so large that it has become detached from its underlying basis. Martin Luther King, Jr. is a symbol of civil rights even for those who can't link him to the Montgomery Bus Boycott or the Southern Christian Leadership Conference. Mother Teresa is a symbol of charity even for those who can't identify the city or country in which she ministered (Calcutta, India). Albert Einstein is a symbol of scientific genius even for those who don't know his Nobel Prize was awarded for "discovery of the law of the photoelectric effect," not for his theories of special and general relativity. I don't say this to be judgmental; I'm not in any way immune to this effect. That's why I so highly value a good biography of these larger-than-life figures, and that's why I was so excited to see that Walter Isaacson had published Einstein.

Isaacson has made a little niche for himself telling the life stories of diverse individuals whose achievements have been obscured by their symbolism, publishing Kissinger in 1992 and Benjamin Franklin in 2003; I found the latter superior to the efforts by Edmund Morgan and Gordon Wood (no mean feat).

The science-related books I most enjoy are those that succeed in taking on the challenge of presenting complex science to a popular audience. And it does not get more complex than modern physics. That's part of why Richard Rhodes' The Making of the Atomic Bomb is such a masterpiece; he turned a story fundamentally about nuclear physics and technological innovation into a gripping human narrative. Isaacson has similar success with Einstein; he does not shy away from extended discussions of the state of pre-Einstein physics, the triumphs Einstein achieved (especially those in his Annus Mirabilis), or the early work in quantum mechanics that would rile Einstein until the end of his life ("God does not play dice").

Isaacson does not gloss over the less flattering aspects of Einstein's life, particularly in his roles as a husband and father. He essentially abandons his first wife and their children and engages in numerous adulterous affairs before marrying again. Still, this is a life to be wondered at. Unable to obtain any sort of academic job after graduating from college, he is a clerk in the Swiss patent office when he makes his major breakthroughs in 1905, working mostly at night. He worked primarily from intellectual principles, favoring thought experiments above all else:

Some scientific theories depend primarily on induction: analyzing a lot of experimental findings and then finding theories that explain the empirical patterns. Others depend more on deduction: starting with elegant principles and postulates that are embraced as holy and then deducing the consequences from them. All scientists blend both approaches to differing degrees. Einstein had a good feel for experimental findings, and he used this knowledge to find certain fixed points upon which he could construct a theory. But his emphasis was primarily on the deductive approach.

To explain why Einstein was essentially able to simply think his way toward a revolution in physics, Isaacson emphasizes his knack for "questioning conventional wisdom, challenging authority, and marveling at mysteries that struck others as mundane." Einstein also had a professed love for simplicity, believing to the end that it was possible to create a single theory that would resolve the tensions inherent in modern physics:

While others continued to develop quantum mechanics, undaunted by the uncertainties at its core, Einstein persevered in his lonelier quest for a more complete explanation of the universe--a unified field theory that would tie together electricity and magnetism and gravity and quantum mechanics.

Indeed, he would pursue this until his dying day, without success. In so doing, the erstwhile revolutionary would be cast as a stubborn conservative by the younger generation that used his innovations as a launching pad into the new field of quantum mechanics. Einstein himself recognized the irony: "To punish me for my contempt of authority, Fate has made me an authority myself."

Beyond science and his personal life, Isaacson also explores the breadth of Einstein's personality: his evolving religious beliefs (essentially deistic, and strongly critical of atheism), his geographic flight from Germany (and back, and away again), his pacifism and the modifications made in the shadow of Hitler, his growing commitment to Zionism (he was offered, and declined, the presidency of Israel after Chaim Weizmann's death), his tangential involvement with the development of the nuclear bomb, and his support for a supranational government to safeguard (impose?) world peace. A thorough, and thoroughly enjoyable book about one of modernity's genuine heroes.

The Demon Under the Microscope by Thomas Hager

In my review of Molly Crosby's disappointing The American Plague, I wrote that a good medical history weaves together scientific discovery, social history, and biography. Crosby's book fell flat because she focused too heavily on the social history at the expense of a decent exploration of the science.

In The Demon Under the Microscope, Thomas Hager does not make that mistake. His book explores not one particular illness, but the search for a drug that might treat the wide range of bacterial diseases that were taking hundreds of thousands of lives each year. This search, which would lead to the development of the world's first antibiotics, sulfa drugs, was spurred by the frustration of World War I doctors who saw thousands of soldiers die of wound infection:

[E]ven the most heroic and seemingly successful surgeries could go completely wrong a few days later. A soldier could wake one morning to find his carefully closed incisions, which had been fine the day before, now swollen, red, and painful. The edges, perhaps, had started to split open. Sometimes a foul-smelling, dark liquid oozed out. The skin around the wound began to take on a "curious half-jellified, half-mummified look," as one physician described it. These were cases of what military physicians feared most in their postoperative cases: Gasbrand, the Germans called it. Gas gangrene. The doctors knew what caused Gasbrand--an infection by bacteria--and they knew how it progressed... There was nothing much that could be done... Once gas gangrene was under way, the bacteria almost always won. Some patients fought it, railing and ranting for a day or two. Then they usually gave up, went silent and pale, temperature dropping, lips bluish. A day or two later, they quietly died of "green-black gangrene," one historian wrote, "which emptied surgical wards into the graveyard."

One German doctor in particular, Gerhard Domagk, led the search, and he is the main (but not only) protagonist of the story. Domagk's medical studies were interrupted by World War I, during which he would be wounded and serve as a medic on the Eastern Front, experiencing first-hand the destructive power of bacterial infection described above. After the war, he finished his medical degree, and after several stints in academic research positions, went to work for Bayer as the head of their new chemical drug research program. Hager gives a brief business history of the German chemical industry (including Bayer), which rose on the production of dyes; it was the medicinal use of these dyes, pioneered by Paul Ehrlich, that Domagk was exploring at Bayer.

Indeed, the prominent role that Germany plays in the story leads to a variety of subplots. Crosby's book on yellow fever emphasized the backwards nature of medical education in 19th-century America, particularly as compared to that in Europe. This was part of a larger systemic difference in scientific academia, and the Germans were the innovators:

Until World War I broke the German monopoly on chemistry, no matter where you were in the world, you could not consider yourself a chemist (or much of a physicist, for that matter) until you first spent time in Germany studying with a master. Scientists from around the world flocked to Germany and came home to remake their own colleges. Johns Hopkins, founded in 1876, was the first German-model school in the United States, the first "research university." Hopkins introduced many German-style innovations into American education: undergraduate "majors" instead of a generic liberal arts degree; small seminars with their give-and-take with a professor in addition to lectures; an emphasis on original faculty research, especially in the sciences; "doctoral" degrees awarded to students once they had shown their own ability for independent and innovative inquiry. Soon virtually every major university in the United States was doing what Hopkins did, instituting policies that had been in place in Germany for a generation.

Not only was there a difference in the academic structure, there was also an interesting contrast regarding the perceived legitimacy of industrial scientific research vis-a-vis academia:

Doing science for a corporation was disdained by most academic scientists, who believed that only in a university setting or perhaps a government laboratory could a scientist follow the trail of pure knowledge, unsullied by commercial concerns. In Germany, however, the situation was different. German science had become the best in the world because German schools were among the best in the world, and German schools tended to have productive relationships with German industry.

If it sounds like there was a particularly close relationship between academia, industry, and the state, remember this was Imperial and then Nazi Germany. There are, no surprise, multiple Nazi connections to the story. Hager examines the folding of Bayer into the infamous German chemical conglomerate I.G. Farben, which would make substantial use of slave labor at Auschwitz during World War II, and produced the poison gas used by the Nazis to massacre Jews. He digs into the assassination of Reinhard Heydrich, who died from bacterial infection a week after a grenade explosion sent shrapnel deep into his body. According to Hager, allegations of insufficient use of sulfa as a treatment by Karl Gebhardt, Heinrich Himmler's personal physician, spurred medical experimentation on female prisoners at Ravensbruck. Gebhardt was executed after standing trial at Nuremberg. The Nazi regime even set its sights on Domagk, briefly imprisoning him for the audacity of replying politely to notification that he'd been awarded the 1939 Nobel Prize in Medicine.

While this exploration of the political and social history of the era is fascinating, Hager's real success lies in the medical aspects of the story. Though multiple bacterial diseases are mentioned, it is those caused by streptococcus that are most prominently featured:

Strep was every doctor's nightmare. The organisms could be found everywhere, in dirt and dust, in the human nose, on the skin, and in the throat. Most strains of strep were harmless. But a few were deadly, and when they got into the wrong place--beneath the skin, through a wound, into the blood--they could cause at least fifteen different human diseases, each so different from each other that in the 1920s researchers had still not untangled them. The worst strains of strep could secrete three poisons, wipe out red blood cells, raise fevers, eat through tissue, fight their way through the body's natural defenses, and create a bewildering variety of different diseases as they went. A strep-infected scratch could lead to the burning rash of erysipelas, the old St. Anthony's Fire; a bit deeper it became cellulitis, a potentially fatal infection of the subcutaneous tissue; if it got into the bloodstream, it caused septicemia, a blood infection; in the spinal fluid, meningitis.

Not to mention it was responsible for scarlet fever, some forms of pneumonia, and one of the most potent forms of septicemia, childbed fever:

In the 1920s the paradigm for obstetrics--a field that primarily male physicians had finally taken over, during the previous three centuries, from primarily female midwives--was that of illness. "Pregnancy is a disease of nine months' duration," one physician had quipped; another advised, "It is best to consider every labor case as a severe operation." Their remarks underscored the pessimism of caregivers who lost many new mothers after childbirth. The process of birth included a natural wound, deep in the mother's body, where the placenta detached from the uterus... New mothers--especially those in maternity wards--risked a disease called childbed fever, endemic in many hospitals, that killed tens of thousands of women every year... [S]tudies showed that childbed fever was caused by the same strains of Streptococcus that had been found in soldiers... the primary cause of wound infections.

After thus reviewing the wide variety of diseases that prompted the search for some way to fight back, Hager returns to Domagk's laboratory at Bayer, where repeated manipulations of dyes led to the almost accidental addition of sulfur to the mix, with great results: the world's first antibiotic drug, Prontosil. Though it would take many months, and the intervention of French scientists seeking their own version of the drug, it was eventually discovered that the healing agent was not the dye but the sulfur, a cheap and abundant resource. This led to an explosion in sulfur-based drugs, and the American experience with these sulfur drugs revealed quite a bit about the state of pharmaceuticals at that time:

Almost any drug, as long as it was not a narcotic, could be sold without a prescription. There was no requirement that labels list all ingredients, proper dosages, or side effects... Patent medicines in the early part of the twentieth century were as firmly established a part of American culture as jazz or baseball. Americans were accustomed to medicating themselves, deciding on their own treatments, and buying their own drugs. It went against the grain to have some doctor or federal agency telling Americans how to cure themselves.

Manufacturers in this field jealously guarded secret recipes and sold their products directly to the public through massive advertising campaigns. They were masters of ballyhoo, filling every newspaper and papering every town with claims for the most amazing cures attributed to concoctions often brewed from the most worthless ingredients.

Efforts had been made early in Franklin Roosevelt's presidency to update the old law, but they had failed in the face of lobbyists, manufacturers, and advertisers. One concoction would change everything and lead to a total transformation in American food and drug law, setting the model for the world. This drug, called Elixir Sulfanilamide, was not wholly worthless. It did, after all, contain sulfanilamide. Unfortunately, sulfanilamide was difficult to dissolve in water, so the chemist who created the elixir decided to mix it with diethylene glycol, which is just as poisonous as it sounds. More than 100 deaths later, Congress passed the 1938 Food, Drug and Cosmetic Act, which laid the foundation for food and drug regulation as we know it.

Hager has succeeded admirably in crafting a history that plumbs the scientific aspects of illness and medicine, ties these to the political, military, and social history of the early twentieth century, and does justice to those who suffered, those who slaved, and those who succeeded in advancing the art and science of healing.

The American Plague by Molly Crosby

Accounts of mankind's endless effort to combat disease are fascinating to me. A good medical history weaves together the scientific discovery of the illness, the social history of its effect on humanity, and the biography of the men and women who devote their lives to fighting it. Several months ago I read David Oshinsky's Polio, which traces the race to a vaccine between Jonas Salk and Albert Sabin, and enjoyed it greatly.

In The American Plague, Molly Crosby traces the history of another viral disease, yellow fever. Though little known by most Americans today, this is a disease with symptoms horrific enough to match anything else nature has thrown at us:

It hit suddenly in the form of a piercing headache and painful sensitivity to light, like looking into a white sun. At that point, the patient could still hope that it was not yellow fever, maybe just a headache from the heat. But the pain worsened, crippling movement and burning the skin. The fever rose to 104, maybe 105 degrees, and bones felt as though they had been cracked. The kidneys stopped functioning, poisoning the body. Abdominal cramps began in the final days of illness as the patient vomited black blood brought on by internal hemorrhaging. The victim became a palette of hideous color: Red blood ran from the gums, eyes and nose. The tongue swelled, turning purple. Black vomit roiled. And the skin grew a deep gold, the whites of the eyes turning brilliant yellow.

It doesn't get much nastier than that. Crosby devotes the early chapters of the book to one of the largest modern outbreaks of yellow fever, the 1878 epidemic in Memphis, Tennessee:

The city collapsed, hemorrhaging its population, its income, its viability. Trains pulled away, leaving people weeping beside the tracks, their last chance at escape gone as the final train cars rolled to a start. A morbid calmness fell over Memphis, so still and quiet as to be serene if one didn't know it was simply the pallor of death. In July of that year, the city boasted a population of 47,000. By September, 19,000 remained and 17,000 of them had yellow fever.

Crosby then shifts her focus to Cuba, twenty years later, and the bulk of the remaining pages are devoted to the efforts led by Army doctor Walter Reed to isolate the causes of yellow fever. The disease had hit the American military hard during the Spanish-American War, and Reed was dispatched to Cuba to lead a team investigating it.

As Walter Reed's group began to narrow their focus to the mosquito as a likely vector for the disease, they needed experimental data to support this hypothesis. It had been twenty years since Dr. Carlos Finlay was widely mocked for his mosquito theory, and in the interim a bacterial theory of yellow fever had gained support. Crosby devotes a short chapter to vivisection, human experimentation, and the antivivisectionist movement, and later provides some context about the young men, mostly soldiers, who answered the Yellow Fever Commission's call for volunteers:

In modern times, it's hard to understand the mentality that would lead a soldier into knowingly risking his life for the purpose of medicine. Soldiers are trained to fight and defend; if any illness befalls them, it's considered a cruel and unjust turn of events. But prior to World War II and the introduction of penicillin, soldiers lost their lives to disease far more often than to bullets. From the time of the American Revolution through World War I, a soldier knew his odds of dying from dysentery, cholera, typhoid, smallpox, influenza, or yellow fever were greater than those on the battlefield, so volunteering for human experiments might not seem as much of a psychological departure as it would today. After all, a soldier's duty is to defense, and many men felt that the greatest threat to the American people lay not in enemy warships or troops, but in disease.

In the Commission's work, there were two parallel experiments. The first was to prove that mosquitoes were the carriers of yellow fever; the second was to prove that simple, unsanitary filth could not spread the disease. The latter would end once and for all the bacterial theory. The first experiment involved, obviously enough, having infected mosquitoes bite the volunteers. The circumstances under which the second experiment was conducted, though posing no risk of a yellow fever infection to the volunteers (or so Reed correctly believed), were immensely unpleasant in their own right:

A single stove stood in the one-room house, and it kept the temperature inside somewhere between 90 and 100 degrees at all times. Impenetrable to light or air, the small room felt like a furnace. The three men began breaking open the crates and boxes left in the center of the room. As they opened the first trunk, the odor was so pungent that the men ran outdoors, hands over their mouths, to keep from retching. After a few minutes, the three men returned and finished unpacking boxes full of soiled sheets, covered in vomit, sweat and feces from the yellow fever ward. They dressed in the filthy clothing that had been worn by dying patients, they covered their cots in sheets stained with black vomit, and then they spent the next twenty nights the same way.

This horrific experience was endured by multiple trios, but not a single one ever developed yellow fever, providing "irrefutable proof that yellow fever could not be transmitted by 'germs,' infected clothing or air." When the mosquito trials succeeded in infecting numerous volunteers (and killing Jesse Lazear, one of Reed's three colleagues on the Commission), the riddle was solved.

Crosby devotes the last section of the book to the century since Walter Reed's efforts. The good news includes the development of a vaccine. The bad news is that the vaccine is no longer included in many vaccine schedules, and the World Health Organization estimates that 30,000 people per year still die from the disease in Africa and South America, and the rest of the world "must still be considered at risk for yellow fever epidemics."

The problem with The American Plague is that it is not the book Crosby really wanted to write. Crosby is a resident of Memphis, and it is apparent that the local history explored in the first chunk of the book is her true passion. She goes into great detail about city life, and about the doctors, nurses, and citizens who lived and died during the 1878 epidemic. Even the verbose subtitle of the book highlights "the epidemic that shaped our history."

As a result, the subsequent shift to Cuba is jarring. Her take on the Yellow Fever Commission's operation is really just a series of short biographies strung into a narrative. As in the Memphis section, Crosby spends far more time on the people than the disease, and there is no significant effort to explore the scientific and medical underpinnings of the Commission's efforts. The final section, which purports to take the story to the present day, is a mere 25 pages. Barely five pages are devoted to the development of the vaccine, a few references are made to the continuing presence of yellow fever in Africa and South America, and then, you guessed it, an epilogue returns to Memphis for a discussion of the long-term effects on that city.

Crosby has written an excellent long article on the 1878 yellow fever epidemic in Memphis, with a shallow hundred-page detour to Cuba tacked on to pad the page length. The narrative is breezy, and the details about Memphis are legitimately interesting, but those seeking a serious scientific history of yellow fever will want to look elsewhere.

Vista Reboot

Having heard nothing positive from anyone who has actually used Windows Vista, I think it is far past time for Microsoft to attempt to re-introduce the product. I was so wary of it that we purposely "downgraded" to Windows XP on the new Thinkpad my wife bought me last year (an option discontinued by Microsoft on June 30). And considering John Cole's initial frustrations yesterday, I'm glad I did. XP works for me. But at least Microsoft is acknowledging the problems:

"We know a few of you were disappointed by your early encounter," the company says on the site. "Printers didn't work. Games felt sluggish. You told us -- loudly at times -- that the latest Windows wasn't always living up to your high expectations for a Microsoft product. Well, we've been taking notes and addressing issues."

That does not mean I am going to run out and get it. In fact, I doubt I will upgrade the operating system at all, so my first experience with Vista will come with my next computer. Considering my last Thinkpad was going strong after five years, Microsoft may well be onto its next product before I come aboard.

Science vs. Scientism

John Silber's recent article in The New Criterion is a fascinating take on the current evolution/intelligent design debate, and what it says to us about how we view humanity and its place in the world:

With regard to the literalists and the reductionists, I would say, a plague on both houses.

The literalists have no standing in universities. But what standing, we must ask, have the reductionists who claim the authority of science in areas of inquiry beyond scientific evidence or proof? I do not question their right to develop their ideas and their research as they deem best. The freedom of inquiry should not be challenged. But neither should any scientist or researcher claim an immunity from criticism. The right to err is fundamental for, as Goethe remarked, “Man must err so long as he strives.” We have, moreover, the assurance of the Council of Trent that all our institutions, including the university, retain their validity despite the failures and mistakes of our members.

We have, therefore, every right to demand of the reductionists: What is the relevance of your pronouncements that trivialize or outright deny the full range of human potentiality in the face of the demonstrable wonders of mankind? Do your claims account for or diminish the beauty of the Parthenon, the music of Bach or Mozart, the frescoes and sculptures of Michelangelo, the plays of Shakespeare, or the genius of Lincoln’s prose?

I'm as yet insufficiently well-read on the evolution debate to comment on Silber's allocation of fault, but I am struck by his attempt to place these questions in a much larger context, and I think his essay is worth reading.

Baby Panda / Evolutionary Biology

It sounds like the baby panda at the National Zoo is still doing well, a most welcome bit of good news for a zoo that has had its share of problems in recent years:

The National Zoo's giant panda cub has doubled in length since his first examination two months ago and could be crawling around within two weeks, the animal park's chief veterinarian said yesterday.

During his seventh medical examination yesterday morning, the cub measured 24.7 inches long, compared with 12 inches during his first exam Aug. 2. He weighed 11.1 pounds, compared with 1.82 pounds at his first checkup.

"He's the incredible expanding panda," said chief veterinarian Suzan Murray.

I have been glued to the story of the baby panda, both because pandas are one of my favorite animals and because I am fascinated by the phenomenon of altricial young, where newborns are so helpless as to require long-term care by a parent. Humans (and pandas and elephants) are interesting examples of altricial creatures, since we are also K-selected. K-selected animals tend to have infrequent breeding, long gestation and maturation periods, and precocial young (born with skills, sight, hair, etc.).

It does seem a bit strange, after all, that human babies gestate for so long, and yet are born still utterly defenseless, and remain that way for years. Cats gestate for nine weeks, have litters averaging two to five, and a newborn kitten can safely leave its mother after 12 weeks, reaching sexual maturity in six months.

Of course, elephants gestate for 22 months for a single calf and the calf nurses for up to 2 years (at 3 gallons of milk per day!). Like human children, elephant calves learn primarily through observation of adults, not from natural instinct. Elephant calves do, however, stand within an hour after birth and can follow a herd within a few days.

A lot of fascinating stuff out there. The only two science classes I took in college were intro to astronomy, which was less fun than anticipated, and Science B-29, "Human Behavioral Biology", which went (and presumably still goes) by the euphemism "Sex" at Harvard. The latter was one of the more fascinating classes I took, and I have retained a novice interest in the subject.

My New Favorite Pastime

Well, I may have doomed myself to a permanent state of unproductiveness. I have a new favorite link. But I warn you that if you follow this link, or add it to your bookmarks, or make it your homepage.... let's just say it is a bottomless pit from which you may not emerge. Don't blame me.

Random Wikipedia Article

Prepare yourself for knowledge. Bring a flashlight. And some provisions.

States To Push Internet Sales Tax

Are the days of tax-free toothpaste from Amazon coming to an end? Tax-free books? Tax-free computer equipment from NewEgg? It is almost too terrible to imagine, but it may be in the works:

Come this fall, 13 states will start encouraging — though not demanding — that online businesses collect sales taxes just as Main Street stores are required to do, and more states are considering joining the effort.

Right now, buyers are expected to pay sales taxes on Internet purchases themselves directly to the state when they pay their income taxes. But it's not widely enforced, and states say it costs them upwards of $15 billion a year in lost revenues, collectively.

"Taxes that it was difficult to collect before will now be collected. And consumers will pay that," said David Quam at the National Governors Association, helping lead the five-year effort that brought together state revenue officials, legislators and business leaders.

A 1992 U.S. Supreme Court ruling forbids states from forcing a business to collect their sales taxes unless the company has a physical presence in the affected state. The court noted the dizzying array of tax jurisdictions and widely varying definitions of taxable goods, such as fast food versus groceries.

Well I don't think "encouraging" is going to have much effect. Would you pay your income tax if you were encouraged but not required to do so? Unlikely.

I am sympathetic to the underlying problem of cross-border taxation. Having lived in Massachusetts and watched people haul kitchen appliances across the border from New Hampshire, I know this is not a new problem. The scale, however, is so much bigger. And I cannot think of a good fundamental economic or moral argument for why purchasers ought to be able to avoid the sales tax of both their own state and the seller's state.

There are plenty of good practical economic and administrative arguments, however, for why this may be a doomed project without broader changes in the way we tax. Whether it be a shift to a federal sales tax, a VAT system, or some other scenario my ignorance of tax law is overlooking, I don't know. But I don't think this is going to cut it:

To be accepted as part of the project, a state must change its tax laws to match up with the others. So far, 13 states have come far enough to be part of the project. Five more are approved to join within the next few years, and others have made partial steps.

The process wasn't easy. Among the issues to be answered: If candy is taxed but food isn't, what is candy? And what is food? Is a Twix cookie bar candy or food?

The solution: anything with flour is food, not candy.

I can't wait to see Amazon re-defining its categories in response to this. Books, DVDs, Music, Food with Flour, Candy without Flour, etc. Perhaps they will come up with a "search by ingredient" along the lines of their "search this book" program.
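
Purely for fun, here is a minimal Python sketch of that flour rule as a retailer might have to encode it; the catalog entries and ingredient lists below are invented for illustration, not taken from any actual tax code or Amazon category.

def classify(ingredients):
    # The states' rule as described above: anything with flour is food, not candy.
    return "food" if "flour" in ingredients else "candy"

# Hypothetical catalog entries, for illustration only.
catalog = {
    "Twix cookie bar": ["chocolate", "caramel", "flour"],
    "gummy bears": ["sugar", "gelatin"],
}

for name, ingredients in catalog.items():
    print(name, "->", classify(ingredients))  # Twix -> food; gummy bears -> candy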

The First Photo From My Nikon

scout1.jpg

One of the reasons I picked this camera (other than the great sale) was its macro abilities. This shot was taken just a few minutes after I took the camera out of the box. You can even see the Nikon instruction guide in the background.

My New Camera

The New Virus

See if you can spot the unintentional (I think) self-parody in the latest email from my school's IT department, regarding the new version of the MyDoom virus that is floating around:

To protect yourself from infection from this virus, do not click on emailed hyperlinks unless you are certain of their integrity and are sure that they are to be trusted.

For additional information, please see this link:

http://money.cnn.com/2004/11/09/technology/mydoom/index.htm?cnn=yes

It's like they're testing us.

And Then There Was Light

The first serious camera I owned was the Canon EOS A2-E (aka EOS 5), which I took with me to Europe during the summer after I graduated from high school, and for a couple years into college. Then I sold that outfit and decided to go all manual, investing in a pretty expansive manual focus system (though for the life of me I can't remember the manufacturer... Minolta I think). And then my last year of college, I sold that and bought the Olympus E-10 as my first foray into digital photography. Unfortunately, I needed money to finance my move to Charlottesville for law school, and had to sell it. For the first two years of law school, I made do with a little Canon S200 Digital Elph.

Right before this summer started, I decided to finally choose once and for all between Canon and Nikon systems. At some point in life you have to make the big choices, and it was time. I bought both the Nikon D-70 and the Canon Digital Rebel. No question, based on the camera body alone, I would have gone with the Nikon. I think it is a superior camera, and priced accordingly. But there was one major, major problem: it back-focused.

So I could either return it for a new one, hoping that I wouldn't get a second lemon. Or I could go with the Rebel. The Rebel had two main advantages: 1) I could buy it locally, from a pro shop that would be helpful and attentive; 2) I much prefer the Canon professional camera bodies and lenses to their Nikon counterparts. So what I decided to do was this: step-by-step, build myself a great Canon system.

For this summer, I've just toyed around with the kit lens and the 50mm f/1.8 that everyone raves about as being the best bargain in town (and it is). But now that I'm headed back to school, I thought it was time to add a substantial flash element. And here she is:

420ex.jpg

No more red-eye! No more unsightly shadows! No more evil focus assist flash strobe (by far the most heinous part of the Digital Rebel, as anyone who has been the subject of the onboard flash can tell you)! Now I will start salivating over the Canon 70-200mm f/4.0L lens (also much raved about). Yum.

UPDATE: Wow, I have to put in a plug for B&H Photo & Video (and UPS). I've bought a lot of camera equipment from them for many years, and always been pleased, but this tops it all. I ordered the flash at 11:30am yesterday morning and designated 2nd Day Air. It's 9:58am and the box was just put in my hand. That's under 23 hours, door to door. Outstanding.

Internet Chat Archives

A very entertaining and distracting collection of the Top 100 Quotes collected in chat rooms, constantly updated by live voting. #1 is genius:

<Zybl0re> get up
<Zybl0re> get on up
<Zybl0re> get up
<Zybl0re> get on up
<phxl|paper> and DANCE
* nmp3bot dances :D-<
* nmp3bot dances :D|-<
* nmp3bot dances :D/-<
<[SA]HatfulOfHollow> i'm going to become rich and famous after i invent a device that allows you to stab people in the face over the internet

That immediately makes me think of Kevin Drum's comment section. Huh. And for neverending excitement, use the random quote option. Too, too funny. Though I suppose being an internet and computer geek helps.

UPDATE: Mr. Poon likes it too.

Entrepreneurial Nutjob

Here I am, peacefully reading a very beautiful tribute to Johnny Cash in this month's Atlantic Monthly, when my eye is drawn to the edge of the page where a tiny ad has this simple message:

World events due...........
Visit now, "thirdtestament.org"
by Thomas.

Well I've got my computer open anyway, so I go ahead and visit. I could use the answers to all my questions, or at least a good laugh. Unfortunately, I got neither. Just an amateurish Lost in Space looking website with absolutely zero content. Just a little paypal button that lets me contribute $10 for God Knows What. If I had an extra $10, morbid curiosity might see me pressing the little button. Anyone want to chip in a buck? We can split the goods ten ways!

UPDATE: I can only guess that this is distinct from the Third Testament offered here, which somewhat suspiciously offers free download of the "Third Testament" only as an .exe file. I'll pass.

You Know You Want To

That's right, it has arrived. Mozilla Firefox. Download it. Use it. Love it.

Mars

Just to add my two cents to a well-debated issue, I think the proposed mission to Mars (with a Moon base to boot) is pure stupidity. I mean, it is a flat out unreconstructed moronic idea. As my colleagues all over the spectrum (Baude, Bernstein, Butler, Drezner, Singer, Yglesias) have demonstrated, this is a bad idea for lots and lots of reasons. I won't harp on what a hack political move this probably is, since I'm unconvinced other politicians are doing any better. As a pleasant surprise, few have stepped forward defending this indefensible idea. When they have, it hasn't been pretty.

Anthony Rickey starts his term as a Crescat guest-blogger off with a dud post arguing that "$500 billion is a cheap price to pay for putting the romanticism and nobility back into our ideas of government." First off, no way does putting a man on Mars do that if we're also neck-deep in deficits and wondering what other domestic or foreign achievements we could have accomplished with the money. Second off, where's the limit? As Will Baude has pointed out, there doesn't seem to be a natural ceiling for such delusions of grandeur. Third, as Peter Northrup says (in admittedly intemperate terms), it is not the job of the United States government to take lots of our money away from us just to show us that it is capable of making the biggest fireworks show on (or near) Earth.

Anyhow, I felt a little sick today thinking of what a phenomenal waste of money and human resources this will be and I wanted to share it.

iTunes fun and Apple Temptations

One of the things I did to procrastinate from my mountains of schoolwork was to add album artwork to all of my MP3s in iTunes. That program is truly a marvel of ease and convenience (particularly since I have an iPod and had to suffer through using Musicmatch Jukebox to add songs to it), and has me wondering what other gems the Apple world has to offer. I've never before been tempted to cross over into that world, but now that I've gotten used to having two computers (laptop for schoolwork and desktop for gaming, etc.), it seems like I could pretty easily make the transition to having an Apple laptop, since I wouldn't have to worry about games and other programs not working (since I'd be using the desktop PC for those anyway).

Those Powerbooks sure are nice.

Circuit City

Interesting computer shopping experience today. I'm upgrading my desktop computer (added a new motherboard and processor) and decided to buy a larger hard drive. I went online to look at prices at my favorite online source, and compared those prices to the ones listed at Circuit City, which is close to my home. Circuit City was having a sale on hard drives, so the prices were almost even. I happily jumped in the car and rode over to Circuit City, only to find the very same hard drive priced $30 higher than I'd seen online. At first I thought I must be crazy, but it then dawned on me that Circuit City might very well have targeted pricing schemes set up. On their webpage, they know they are competing for online customers against places like newegg.com. In the store, particularly the one in Charlottesville, they have no such competition.

So what did I do? I walked over to the silly Sprint Broadband demo computer, logged onto circuitcity.com, ordered the hard drive and designated express pickup as my delivery option, walked 10 feet to the merchandise pickup window, and received my new hard drive for $30 less than the price in the store. That felt pretty good.

New Scanner

I've never had much use for a scanner, but many of my research sources are library books and I have a feeling many will soon be recalled. I also obviously can not write in them (nor would I want to). So I did a little web surfing, and saw that the Canon CanoScan LiDE 20 is on sale for $39.99 at Circuit City. I reserved mine online, drove over to Circuit City, showed them the online confirmation, and walked out with a new scanner for $41. It weighs 3 lbs. and is 1.4 inches high, and does exactly what I need it to do. I inserted the USB plug (which provides both data and power, so no AC plug is necessary) and within 5 minutes I was creating PDF copies of the chapters I need. Combined with my laser printer, it can also serve as a copier in a pinch, with printing beginning the instant the scan is complete. I'm a happy camper. Considering I also just got a 52x cd-burner (an upgrade from my 16x) for $43, this has been a very good computer peripheral week for me.

Hockey in Hell

Now I'm getting spam advertising spam services:

EXPLODE your Profits
24/7 Dedicated Bulk Email Servers!

Do you sell a product or service?
Are you able to fulfill this product or service to a national or international market base?
Do you have the motivation and follow through to attain your goals?
If you answered yes to all three of the above questions,
Bulk Email can EXPLODE your Profits!

Well sign me up.

Googlism

Via Eric Muller I found this funny little program called Googlism, which returns some phrases concerning whatever name you enter. My results:

unlearned hand is thy name
unlearned hand is looking for guest bloggers -- particularly blog readers who don't currently blog and might like to test the
unlearned hand is feeling pessimistic
unlearned hand is looking for guest bloggers who'd like to try a stint posting on his blog
unlearned hand is dead on in his judgment of o'conner's grutter opinion
unlearned hand is basically right about the
unlearned hand is adding personnel
unlearned hand is a 1l at uva law school
unlearned hand is looking for guest bloggers and eventually is looking to become a
unlearned hand is looking for
unlearned hand is collecting bloggers' opinions for a series called "where do we go from here?" mr
unlearned hand is going to be my speechwriter
unlearned hand is right

Yes indeed.

The Digital World

For a long time I was reluctant to purchase a digital camera. I had grown up using film and had spent enough time developing my own negatives and printing my own enlargements that I felt an attachment to that world. In addition, I became proficient with higher end SLR equipment, whose digital counterparts I still can't afford (though the prices are coming down dramatically). What swayed me was the realization that I ended up with rather few pictures of my college experience, because it was just too much effort to carry around a big SLR and then have the negatives developed and prints made. What I really needed was a camera that could go with me anywhere, and would be as hassle-free as pretty much everything else in my uber-digitized world. I shopped around for digital cameras, went to Circuit City to see how they all felt in my rather large hands, and became very fond of Canon's Digital ELPH series. If I was seeking to print the pictures, I would have gone for the 3 or 4 megapixel versions. As I really only wanted a camera for email/internet purposes, the 2 megapixel S200 fit the bill, and the order went through to Amazon.

Now that it's been through its first vacation, I can say I am quite satisfied. The images aren't razor sharp at their original 1600x1200 resolution, but resizing them to 800x600 leaves a very clear image at an ideal size for email. I really enjoyed the standard benefits of a digital camera, namely that we immediately knew whether the photo came out well (and could delete and retake the shot if not) and were able to take several hundred pictures to come out with the 50 or so that really illustrate our experience (and already have them, faster and cheaper than one-hour prints). I will say, however, that I think the niftiest feature of the camera is its panorama mode. This allows you to take a series of photographs (up to 360 degrees), each time showing in the LCD the edge of the previous photo (so that you know where to line up the next shot). When you get home, the included software helps you to "stitch" the individual photos together. The results are pretty impressive, I think, and certainly far beyond anything I could do manually.

Satellite Imagery

Via Calpundit, check out Microsoft's TerraServer for satellite imagery of the entire U.S. (mostly from 1994 and 1997). Just enter an address and relive old times.