Firsts have always been important in exploration. This seems rather straightforward, even tautological, to say since being first is woven into the definition of exploration. After all, traveling to unknown places is doing something that hasn’t been done before (or at least hasn’t been reported before). And this is how the history of exploration often appears to us in textbooks and timelines: as lists of expeditionary firsts from Erik the Red to Neil Armstrong.
In truth, though, firsts are fuzzy.
Some fuzziness comes from ignorance, our inability to compensate for the incompleteness of the historical record. This is a perennial problem in history in general and history of exploration in particular. (I call it a problem but it’s actually what makes me happy and keeps me employed).
Was Christopher Columbus the first European to reach America in 1492? Probably not, since evidence suggests that Norse colonies existed in North America five hundred years before he arrived. Was Robert Peary the first to reach the North Pole in 1909? It’s hard to say, since Frederick Cook claimed to be first in 1908 and it’s possible that neither man made it.
Some fuzziness comes from the different meanings we give to “discovery.” The South American leader Simon Bolivar called Alexander von Humboldt “the true discoverer of America.” Bolivar did not mean this literally since Humboldt traveled through South America in 1800, 17 years after Bolivar himself was born there, 300 years after Columbus first arrived in the Bahamas, and about 16,000 years after Paleo-Indians arrived in America, approved of what they saw, and decided to stay.
But for Bolivar, Humboldt was the first person to see South America holistically: as a complex set of species, ecosystems, and human societies, held together by faltering colonial empires. Being first in exploration, Bolivar realized, meant more than planting a flag in the ground.
At first glance, we seem to have banished fuzziness from modern exploration. For example, there is little doubt that Neil Armstrong was the first human being to set foot on the moon, since the event was captured on film and audio recordings, transmitted by telemetry, and confirmed by material artifacts such as moon rocks. (Moon hoax believers, I’m sorry. I know this offends.) Were the Russians suddenly interested in challenging Armstrong’s claim to being first, they would have a tough time proving it, since Armstrong could give the day and year of his arrival on the moon (20 July 1969) and even the exact hour, minute, and second when his lunar module touched down (20:17:40 Coordinated Universal Time).
But this growing precision of firsts has generated its own ambiguities. We have become more diligent about recording firsts precisely because geographical milestones have become more difficult to achieve. As a result, there has been a shift from firsts of place to firsts of method. As the forlorn, never-visited regions of the globe diminish in number, firsts are increasingly measured by the manner of reaching perilous places rather than by the places themselves.
For example, Tenzing Norgay and Edmund Hillary were the first to ascend Mt. Everest in 1953, but Reinhold Messner and Peter Habeler were the first to climb the mountain without oxygen in 1978. In 1980, Messner achieved another first, by ascending Everest without oxygen or support.
Now as “firsts of difficulty” fall, they are being replaced by “firsts of identity.” James Whittaker was the first American to summit Everest in 1963. Junko Tabei was the first woman (1975). Since then, Everest has spawned a growing brood of “identity first” summits including nationality (Brazil, Turkey, Mexico, Pakistan), disability (one-armed, blind, double-amputee) and novelty (snowboarding, married ascent, longest stay on summit).
It would be easy to dismiss this quest for firsts as a shallow one, a vainglorious way to achieve posterity through splitting hairs rather than new achievements. But I don’t think this is entirely fair. While climbing Everest or kayaking the Northwest Passage may have little in common with geographical firsts in exploration 200 years ago, this is not to say that identity firsts are meaningless acts. They may not contribute to an understanding of the globe, but they have become benchmarks of personal accomplishment, physical achievements — much like running a marathon — that have personal and symbolic value.
Still, I am disturbed by the rising number of “youngest” firsts. Temba Tsheri was 15 when he summited Everest on 22 May 2001. Jessica Watson was 16 last year when she left Sydney Harbor to attempt a 230-day solo circumnavigation of the globe. (She is currently 60 miles off Cape Horn.) Whatever risks follow adventurers who seek to be the oldest, fastest, or sickest to accomplish X, they are, at least, adults making decisions.
But children are different. We try to restrict activities that have a high risk of injury for minors. In the U.S. for example, it is common to delay teaching kids how to throw a curve ball in baseball until they are 14 for fear of injuring ligaments in the arm. Similar concerns extend to American football and other contact sports.
So why do we continue to celebrate and popularize the pursuit of dangerous firsts by minors? What is beneficial in seeing whether 16-year-olds can endure the hypoxia of Everest or the isolation of 230 days at sea? Temba Tsheri, the current holder of the youngest-climber record on Everest, lost five fingers to frostbite.
We must remember that in praising “the youngest” within this new culture of firsts, we only set the bar higher (or younger, as it were) for the record to be broken again. In California, Jordan Romero is already training for his ascent of Everest in hopes of breaking Tsheri’s age record. He is thirteen.
By Michael Robinson and Dan Lester
NASA has always stood at the fulcrum of the past and future. It is the inheritor of America’s expeditionary legacy, and it is the leading architect of its expeditionary path forward. Yet the agency has found it hard to keep its balance at this fulcrum. Too often, it has linked future projects to a simplistic notion of past events. It has reveled in, rather than learned from, earlier expeditionary milestones. As NASA considers its future without the Constellation program, it is time to reassess the lessons it has drawn from history.
For example, when U.S. President George W. Bush unveiled the Vision for Space Exploration (VSE) in 2004, the administration and NASA were quick to link it to the 200th anniversary of the Lewis and Clark expedition, stating in the vision: “Just as Meriwether Lewis and William Clark could not have predicted the settlement of the American West within a hundred years of the start of their famous 19th century expedition, the total benefits of a single exploratory undertaking or discovery cannot be predicted in advance.” In Lewis and Clark, NASA saw a precedent for the Vision for Space Exploration: a bold mission that would offer incalculable benefits to the nation.
Yet this was a misreading of the expedition. The Lewis and Clark expedition did not leave a lasting imprint on Western exploration. The expedition succeeded in its goals, to be sure, but it failed to communicate its work to the nation. The explorers’ botanical collections were destroyed en route to the East Coast, their journals remained long unpublished, and the expedition was ignored by the press and public for almost a century. In 1809, 200 years ago last October, a despondent Lewis took his own life. NASA might do well to reflect on this somber anniversary in addition to the more positive one used to announce the Vision for Space Exploration in 2004. Doing exploration, Lewis reminds us, often proves easier than communicating its value or realizing its riches.
NASA should also remember the anniversary of Robert Peary’s expedition to reach the North Pole, completed a century ago last September. Peary’s expedition, like the ones envisioned by the Vision for Space Exploration, was a vast and complicated enterprise involving cutting-edge technology (the reinforced steamer Roosevelt) and hundreds of personnel. Peary saw it as “the cap & climax of three hundred years of effort, loss of life, and expenditure of millions, by some of the best men of the civilized nations of the world; & it has been accomplished with a clean cut dash and spirit . . . characteristically American.”
Yet Peary’s race to the polar axis had little to offer besides “dash and spirit.” Focused on the attainment of the North Pole, his expedition spent little time on science. When the American Geographical Society published its definitive work on polar research in 1928, Peary’s work received only the briefest mention. Indeed, the Augustine committee’s statement that human exploration “should begin with a choice about its goals – rather than a choice of possible destinations” applies as well to the race to the North Pole as it does to recent plans to race to the Moon.
But the most important anniversary for NASA to consider is the recent 400th anniversary of Galileo’s publication of “Sidereus Nuncius” (“Starry Messenger”), the treatise reporting the telescopic observations that would help make the case for a Sun-centered solar system. Was Galileo an explorer in the traditional sense? Hardly. He based his findings upon observations rather than expeditions, specifically his study of the Moon, the stars, and the moons of Jupiter. Yet his telescopic work was a form of exploration, one that contributed more to geographical discovery than Henry Hudson’s ill-fated voyage to find the Northwest Passage, made during the same year. Galileo did not plant any flags in the soil of unknown lands, but he did something more important: he helped topple Aristotle’s Earth-centered model of the universe.
As NASA lays the Constellation program to rest, the distinction between “expedition” and “exploration” remains relevant. While new plans for human space flight will lead to any number of expeditions, it doesn’t follow that these will constitute the most promising forms of exploration. Given our technological expertise for virtual presence – an expertise that is advancing rapidly – exploration no longer needs to be the prime justification for human space flight.
The Augustine committee showed the courage to challenge the traditional view of astronauts as explorers in its “Flexible Path” proposal, a plan that would initially send humans into deep space, perhaps supervising work above deep gravity wells while rovers conduct work on the ground. Critics have derisively called it the “Look But Don’t Touch” option, but it is one that would extend scientific exploration even if it includes no “Neil Armstrong moments.”
Yet perhaps 2010 is the year when we challenge the meaning of “exploration.” For too long, NASA has been cavalier about this word. Agency budget documents and strategic plans continue to use it indiscriminately as a catch-all term for any project that involves human space flight. Yet this was not always the case. The National Aeronautics and Space Act of 1958, the formal constitution of the agency, doesn’t mention the word in any of the eight objectives that define NASA’s policy and purpose. Rather, NASA’s first directive is “the expansion of human knowledge of the Earth and of phenomena in the atmosphere and space.”
Perhaps the best way forward, then, starts with a more careful look back. The world has changed since Lewis and Clark, with technology that would have stunned the young explorers. In the year of “Avatar,” we need to think differently about the teams who direct rovers across the Martian landscape, pilot spacecraft past the geysers of Enceladus, and slew telescopes across the sky. These technologies are not static in their capabilities, nor are the humans who control them. Their capabilities advance dramatically every year, and the public increasingly accepts them as extensions of our intellect, reach, and power. As Robert Peary’s quest for the North Pole illustrates, toes in the dirt (or in his case, ice) don’t necessarily yield new discoveries.
Of course robots and telescopes can’t do everything. A decision that representatives of the human species must, for reasons of species survival, leave this Earth and move to other places would make an irrefutable case for human space flight. But that need has never been an established mandate. It isn’t part of our national space policy. As NASA begins its sixth decade, do we have the courage to look beyond our simplistic notions of exploration’s past to find lasting value in the voyages of the future?
Michael Robinson is an assistant history professor at the University of Hartford’s Hillyer College in Connecticut. Dan Lester is an astronomer at the University of Texas, Austin.
This essay appears here courtesy of Space News where it was published on 8 February 2010.
Obama’s 2010 budget proposal is a radical document. Not because it runs the biggest federal deficit in American history ($1.53 trillion). Posting record deficits has become commonplace since Reagan started doing it in the 1980s. No, it is radical because it tries something new: killing off a multi-billion dollar NASA program that has strong support in Congress.
Constellation grew out of President George W. Bush’s Vision for Space Exploration, which he announced shortly after the Columbia Shuttle disaster of 2003. Bush’s plan was visionary: design and build boosters and spacecraft capable of returning astronauts to the Moon and, ultimately, Mars.
But visionary does not equal smart. The Constellation Program failed because it fell into the same trap that Apollo did in the 1970s: it was a massively expensive public program that, while symbolically impressive, lacked practical, real-world benefits that could match its $97 billion price tag (GAO-estimated cost through 2020).
Indeed, the Constellation Program was so colossal that it stood poised to suck the life out of every other NASA initiative, particularly space science projects that did not require humans, crew modules, or moon buggies to conduct research.
The technology of the Constellation Program may have been new but the arguments were old, a list of reasons for pursuing human space flight that have been used to justify missions for the past forty years:
1. Human space flight is an extension of humanity’s quest to explore and therefore cannot, and should not, be stopped. To do so would be to blunt human curiosity and deny human nature. In truth, exploration has been pursued for many reasons, of which curiosity has usually ranked low on the list. Even if we accepted, for the sake of argument, that an exploration impulse is part of human nature, it still does not mean that we should obey this impulse. This is the classic “naturalistic fallacy,” which holds that something is good because it is natural. Social Darwinists used this line of reasoning to justify poor treatment of workers and colonial subjects on the idea that survival of the fittest was natural and therefore should be allowed to run its course.
2. Human space flight will offer unforeseen benefits to science and technology. This may be true. Or maybe not. It’s hard to say really because proponents admit that any benefits are unforeseen. Still it seems an odd toss-of-the-dice way to spend public money. Would we trust a general who defended his plan of attack on the unforeseen possibilities of victory? Would shareholders trust a company selling products with unforeseen potentials of profit?
3. If we abandon human space flight, we will soon be outpaced by China, Russia, India, [insert developing industrial nation] in the space race. The United States did gain prestige from landing astronauts on the moon in 1969, showing up our Cold War rival, the Soviet Union. But how much did that prestige, or “soft power,” actually benefit the United States? Prestige did not stop the Vietnam War, or the Arab Oil Embargo, or the onset of stagflation. How much, then, is this type of prestige worth in the post-Cold War age, a time when the United States is, arguably, supposed to reap the benefits of belonging to a multilateral world? What does the United States gain by winning the space race against China if it is losing the economic race to China back on Earth?
4. Human space flight is the first step in the human settlement of space, a process vital to the continuation of the species. The idea that astronauts are really 21st-century pioneers is a romantic one, but unrealistic. Going to the moon (or Mars) is a lot easier than settling there. Perhaps the real question here is why proponents of space settlement are so willing to give up on planet Earth. Global warming? Nuclear war? Overpopulation? This raises the question: if we cannot take care of a 197-million-square-mile habitat that’s free, self-regulating, and self-sustaining, what makes us think that we’re going to do any better in multi-billion-dollar artificial habitats on other planets?
It’s time for NASA to think differently about space exploration. The Obama budget requests $18 billion for the agency over the next five years, an increase from the current budget. Now NASA has the time and the money to think about new ways of moving forward. Bravo to the Obama Administration for forcing the issue.
I first took the Myers Briggs Type Indicator (MBTI) when I was a junior in high school. This was a good time to take it since, like most sixteen-year-olds, I was self-absorbed enough to think I should spend more time trying to figure myself out.
The test labeled me as an ENTP: Extroversion, Intuition, Thinking, Perceiving.
I was a clear extrovert, someone whom Jung describes as gaining energy from the world around them, or in my case, trying to set fire to the world around them. Introverts like Jung find energy through reflection. Thinking first, acting second. An interesting idea.
I also tested strongly intuitive, or as the MBTI would observe, I gathered information as concepts and abstract patterns rather than as concrete, immediate facts available to the senses.
One axiom of the MBTI is that personality types are stable, more or less. As this idea goes, the psyche sets up basic patterns of gathering, interpreting, and acting on information quite early, by age three or four.
Yet critics of the MBTI such as Paul Matthews point out that people who retake the test often get different results. My testing history confirms this as well. In high school, the MBTI tagged me as a thinker rather than a feeler, someone who decides issues on logical and consistent premises.
When I took the test again last week, I had swung over to the feeling side of the spectrum, making decisions based upon personal association or empathy more than general principles. That a person’s psychological type seems squishy, mutable over time, is one of many criticisms leveled at the MBTI, one that challenges its claim to measure meaningful psychological differences.
Still, my MBTI evaluation has been stable other than that, particularly in the final category of perceiving/judging which evaluates how people process information. Strongly judging individuals tend to like settling matters and, as a result, gather information in order to make decisions and tie up loose ends.
Those with strong perceiving tendencies (of which I am one) gather information like rodents in November, amassing it without end. Perceivers are the hoarders of ideas, stowing and revising them even though it keeps things unsettled. They have messy desks.
Driving on Rt 6 in Wellfleet last week, my wife Michele (an INFJ) wondered how her students might type characters in her lit courses. Ahab would have to be an INTJ. The Great Gatsby? ESFP I think.
The conversation brought me back to a post I wrote last year about The Explorer Type. At the time, I was thinking about how certain explorers, such as Roy Chapman Andrews and Louis Leakey, took on similar cultural personae: popular outsiders who contributed to, but were not a part of, the academic establishment.
Was there something deeper here? A psychological type that lay behind the public persona? The ENTP personality type (Extrovert, Intuitive, Thinking, Perceiving) is often labeled “The Inventor-Explorer.” Other analyses of Myers-Briggs tag INTP (Introvert, Intuitive, Thinking, Perceiving) as the rightful home of this type. Yet what spotty data exists on this subject shows that real explorers, such as Chuck Yeager and Alan Shepard, test as ISTPs (Introvert, Sensing, Thinking, Perceiving).
Then again, test pilots and astronauts offer a narrow field of explorers. Reading Goethe and ranging over the mountains of South America, would Alexander von Humboldt have been an ISTP? Never. An ENTP if ever there was one.
Nor should the military discipline and technical demands of modern spaceflight necessarily point to controlled, process-oriented types such as ISTPs. The world’s most famous astronaut is a confirmed ENFP.
In these dog days of August, NASA is feeling the heat. The Review of U.S. Human Space Flight Plans Committee (aka Augustine Committee) is now spell-checking its final report for the Obama Administration about the direction of U.S. space policy. With talk of exploration in the air, Dan Lester (astronomy research scientist, University of Texas-Austin) and I thought it was a good time to take a closer look at the different meanings of exploration and their use by policy makers. The full article on the subject, “Visions of Exploration” is now out in the journal Space Policy and available here.
Here’s an excerpt:
The historical record offers a rich set of examples of what we call exploration: Christopher Columbus sailing to the New World, Roald Amundsen driving his dogs towards the South Pole, and Neil Armstrong stepping into the soft dust of the moon. Yet these examples illustrate the difficulty in pinning down exploration as an activity.
If we define exploration as “travel through an unfamiliar area in order to learn about it” we exclude Columbus, whose discovery was serendipitous rather than purposeful. We would also have to exclude Amundsen and Armstrong, and indeed many of the pantheon of explorers, who tended to dash across new terrain rather than investigate it systematically.
Even more expansive terms such as “discovery” sometimes offer a poor fit for the object of modern expeditions: did Robert Peary discover the North Pole in 1909, an axis point that Greek astronomers knew about 2500 years ago? Not in any meaningful sense of the word. Students of exploration, then, must make peace with this uncomfortable fact: “exploration” is a multivalent term, one which has been (and undoubtedly will continue to be) used in different ways by different people. Geographical discovery, scientific investigation, resource extraction, and high-risk travel are activities tucked inside this definitional basket.
Because of exploration’s multiple historical meanings, policy makers and administrators have often used this history selectively and out of context. Specifically, policy statements cite the history of exploration in order to make two points: first, that humans are compelled to explore, that curiosity about the world is an innate attribute of our species; and second, that this compulsion has expressed itself most fully in the United States, where exploration has moved beyond matters of trade and settlement to become a part of national identity, a symbol of American idealism, enterprise, and self-sufficiency.
Let’s take these ideas in order, starting with the human impulse to explore. We cannot deny that the history of our species is a history of motion. We are all the children of travelers: of long migrations out of Africa, oceanic crossings and continental traverses. Archaeological evidence suggests that humans spent most of their prehistory, from 120,000 BCE to 10,000 BCE, on the move.
We bear the marks of these migrations: in the foods we eat, the languages we speak, and the places we live. Indeed, we carry traces of our itinerant past inside of us: in our dietary preferences for foods salty and sweet, our peculiar anatomy and physiology, and our unique mitochondrial DNA, which, read carefully, offers us a road map of our ancestors’ paleolithic travels.
Yet these facts, so well established, tell us little about motives. Human curiosity has a long and storied history. Aristotle begins his Metaphysics by stating “All men possess by nature a craving for knowledge”, an observation borne out in the earliest works of human literature.
Yet there is little evidence to suggest that humans traveled primarily, or even incidentally, because of curiosity. During the long millennia of our prehistory, the most obvious reason for travel was survival, following seasonal animal migrations, escaping harsh weather, avoiding predators and, perhaps, other humans.
Evidence points to exploration – in all of its incarnations of meaning – as a cultural or political activity rather than a manifestation of instinct. History’s most celebrated voyagers — Pytheas, Zheng He, and Columbus — sailed from nations with imperial ambitions. As Stephen Pyne points out in his survey of the ages of exploration, “There is nothing predestined about geographic discovery, any more than there is about a Renaissance, a tradition of Gothic cathedrals, or the invention of the electric light bulb.” (Pyne, “Seeking Newer Worlds,” in Critical Issues in the History of Space Flight, 2006)
The notion that exploration expressed deeper impulses, such as wanderlust or curiosity, came much later, during the Enlightenment, when voyages took up the systematic practice of science: gathering specimens and ethnographic data, observing celestial events, and testing geographical hypotheses. These expeditions expressed a genuine curiosity about the globe, yet they elicited state sponsorship only because rulers saw political value in discovery expeditions, a form of “soft power” statecraft that could enhance national prestige rather than add to colonies or imperial coffers.
If eighteenth-century audiences came to accept the lofty trait of curiosity as a driving force behind voyages of discovery, nineteenth-century audiences found deeper impulses behind humanity’s urge to explore. In particular, the Romantic Movement gave rise to ideas central to the ethos of modern exploration: first, that discovery is a process that includes, but is not contained by, practical pursuits. While geographical discovery, science, and resource extraction all have their parts to play, exploration has an intangible, ineffable quality that cannot simply be reduced to logical goals. The second idea (which follows closely from the first) is that the value of exploration is tied to the subjective experience of the explorer, a symbol of the nation at home.
Why do people climb 8000-meter mountains? Free-solo the Eiger? BASE jump the Eiffel Tower? Motives are tricky things.
My work on Arctic explorers gave me a way to think about it.
Nineteenth-century explorers had their own answers to the “why” question. In the 1850s, when U.S. exploration of the Arctic began, explorers defended their missions by describing all of the commercial benefits that would accrue from their expeditions: new routes to Asia, new whale fisheries, new technological innovations in ship design. (Interestingly, NASA features a similar-sounding set of commercial benefits when it justifies its current plan to return humans to the Moon and Mars).
Then, in the 1880s, explorers changed course, justifying their exploits by anti-commercial motives: we explore because it is impractical. We explore to escape the strictures of the civilized world. We explore for the sake of exploring. Or, in George Mallory’s translation for mountain climbing, “because it’s there.”
In the language of the day, the explorer had succumbed to “Arctic fever,” a term used over and over again in the last decades of the nineteenth century to describe the seemingly irrational behavior of explorers who put themselves at risk:
“The northern bacilli were in my system, the arctic fever in my veins, never to be eradicated.” Robert Peary, 1898
“The polar virus was in [my husband's] blood and would not let him rest.” Emma DeLong, 1884
Explorers are “infected with the same spirit.” Frederick Cook, undated
“Arctic enthusiasm is an intermittent fever, returning in almost epidemic form after intervals of normal indifference.” McClure’s Magazine, 1893
As I tried to make sense of “Arctic Fever” for my book Coldest Crucible, I concluded that all of this talk of fevers was just another means to show purity of motive:
The disease may seem to be nothing but a playful literary metaphor, but it had serious functions. Arctic fever located the urge to explore in the human passions. It was a condition that afflicted the heart against the better judgment of the mind, operating beyond conscious control. Why should anyone attempt to reach the North Pole when it served no useful or scientific function? Because, explorers claimed, they felt irrationally compelled. In this way, Arctic fever masked rational motives for voyaging north, namely, the promise of celebrity and financial reward.
While explorers spoke about their irresistible compulsions, they were simultaneously working out huge publishing contracts, product endorsements, and lecture fees. At the time I wrote my book, it seemed to me that all of this talk of instinct, true spirit, experience of the sublime, etc. was just a matter of bait-and-switch: finding motives that would impress paying audiences and would hide the true, mercenary motives behind them.
I haven’t abandoned this line of thinking entirely, but after reading the first chapter in Maria Coffey’s book, Where the Mountain Casts Its Shadow, I think I need to revise it.
Coffey’s book is about the effects of extreme adventure on the people left behind: spouses, parents, and children who have to come to terms with the loss of loved ones. She starts her book with interviews of adventurers who talk about their motives in putting themselves at such risk.
“Endurance, fear, suffering cold, and the state between survival and death are such strong experiences that we want them again and again. We become addicted. Strangely, we strive to come back safely; and being back, we seek to return, once more to danger.” Reinhold Messner
“I was totally possessed. The experience was like some inner explosion. I knew it would somehow mark the rest of my life.” Wanda Rutkiewicz
Coffey’s list of climbers who speak about this compulsion is impressive. It extends beyond the elite, celebrity climbers such as Messner and Rutkiewicz to include those who do not have agents, publishing contracts, or product endorsements.
I am realizing that it’s not enough to label this exploration “fever” as merely a savvy form of marketing. It is clearly a psychological manifestation too, one that Coffey links to the impact of extreme risk on biological factors such as adrenaline and dopamine.
Coffey also describes the way that such extreme experience can have, ironically, a quieting effect on adventurists, making them feel less moody, more even-keeled, more able to focus on the present moment. Indeed, more than one climber described climbing as an escape from distraction, a way to concentrate on the task at hand, to live in the moment, to experience things more fully.
At times, it made me wonder if there is a common psychological profile for elite climbers. The frequency with which climbers referred to attention and distraction sounded very similar to interviews conducted by Dr. Edward Hallowell in his book, Driven to Distraction, a book about attention deficit disorder (ADD).
The point here is not to throw out one label in order to replace it with another. But Coffey’s book is making me realize that my work on the history of exploration should not only play out at the level of nations, empires, commerce, and popular culture. I need to make room for the individual, a tangled world of emotion, experience, and behavior.
I know that many of you are thinking “No duh! This is standard stuff for climbing books.” True enough: Will power, spirit, fear, endurance, ecstasy: the meat and potatoes of adventure literature. But cultural historians are trained to think of personal motives as ultimately unknowable, a black box that should not be opened. To psychoanalyze the historical subject is like touching the third rail in the subway. Dangerous terrain.
It almost goes without saying that exploration is dangerous work. Vasco da Gama left for India with 180 men. He returned to Portugal with 60. John Franklin’s 1845 expedition to discover the Northwest Passage resulted in an impressive 100% fatality rate. Even expeditions to places well-mapped and long-traveled carry risk. Planning an ascent of K2 in the Karakoram Range? Chances are better than 1 in 4 that you will die in the attempt.
Where does danger lurk? One immediately thinks of physical and biological hazards, of gale-force winds, hull-crushing pack ice, capricious avalanches, & malarial fevers.
But these forces are only efficient causes, the sharp edge of the reaper’s scythe.
When 37 Americans died in two Arctic expeditions between 1879 and 1884, it was clear to everybody that the men died from starvation and exposure (well, mostly: one man drowned and another was shot for stealing food). But most Americans looked beyond these causes to contributing factors, such as poor ship design and faulty relief efforts.
Yet if we look more closely, we see that, more often than not, the expedition party itself is largely to blame for its own failures. Reading the historical record, it becomes clear that one of the most difficult tasks of expeditionary life was not weathering the elements but enduring one’s peers. The 1871 Polaris Expedition to the North Pole fell apart when its pious and imperious commander, Charles Hall, suffered convulsions (and ultimately died) after drinking arsenic-laced coffee (probably prepared by his disgruntled science officer).
For most of the nineteenth century, Elisha Kane was America’s celebrity explorer, a man revered for his eloquence, cultivation, and high-mindedness. Most of Kane’s men, however, thought he was an insufferable prig. Indeed, more than half of his crew turned against Kane, attempting to escape the Arctic without his approval. Almost all of this was hidden from public view, expunged from the narratives of the expedition written by Kane and his men.
Still one gets subtle glimpses, even from Kane’s own work. The image above was published in Kane’s best-selling narrative of his expedition, Arctic Explorations. In the scene, Kane sits in the center, surrounded by his officers. Kane looks weary and somewhat annoyed, staring down to the right. On the right, two of his officers stare forward towards the viewer, looking at different points. On the left, two other officers are engrossed in conversation, one with a shotgun slung over his shoulder. Whispering about plans perhaps? Indeed, the only one in the scene looking admiringly at Kane is his dog. Or perhaps he’s just hungry.
These might seem like sepia-colored anecdotes from long ago. But expeditions continue to live or die on the ability of their members to get along, to communicate well, and to improvise effectively when things go wrong. Such is one of the findings of Michael Kodas who wrote about last year’s debacle on K2.
With this in mind, I wonder how much “unit cohesion” is on the minds of NASA’s administrators as it plans its mission to Mars. Two years is a long time to spend in a capsule with one’s mates, even with DVDs.