The highest and coldest regions of the earth might seem an unpromising choice of location to find studies of balance and moderation; but throughout the twentieth century physiologists and other biomedical scientists used extreme environments as forms of ‘natural laboratory’ to study not only the limits of human performance and survival, but also the ways in which normality and balance were maintained, and altered, in the face of extreme external pressure – both physical and mental. Indeed, some of the earliest historians of the concept of homeostasis were themselves physiologists who worked on human and animal adaptation. 1
This chapter investigates notions of balance in the ‘natural laboratories’ of extreme physiology – specifically the high Arctic, Antarctica and high altitude in South America and the Himalaya. Physiologists and other biomedical scientists celebrated these sites as spaces in which many varieties of imbalance could be studied. Here I will concentrate on three different kinds of balance: moderation, physiological homeostasis and psychological stress responses. Through these case studies extreme environments emerge as sites where, firstly, notions of balance could be debated and reconstituted, and secondly where the white adult male's body became established as the norm for such research. This unquestioned centralisation of a very specific kind of body as a standard measure in balance research – particularly as it was a body not indigenous to extreme environments – had consequences for the practices of both science and exploration.
That the choice of norms and benchmarks in balance research can have deep socio-political consequences is well established through the other chapters in this volume. In the case of extreme physiology the focus on sea-level-born white scientists not only obscured the fact that there were multiple ways of adapting to altitude, but also led to a backlash by South American researchers who began to refigure the ‘Andean man’ as the baseline normal for studies. 2 More broadly, extreme physiology's centralisation of the white European body and its problems was one pillar that maintained ideas of white supremacy at a moment when theories about the origins and evolution of the human race were shifting. 3 It also acted as a self-reinforcing tautology that impeded women's access to extreme spaces and prevented them benefiting from the lessons of the physiological work done there. 4
Clearly, then, however remote or exotic the sites of extreme physiology appear, they were part of broader research networks. As an example, the first physiological expedition to Antarctica (INPHEXAN, discussed in more detail below) explicitly set out to study what the physiologists broadly called ‘stress’ – although caused by the external pressures of an extreme environment and isolated conditions, this was an ‘internal physiological or psychological [process] generated by environmental pressure’. 5 The American researchers involved concentrated on hormonal and biochemical responses to cold, isolation and physical labour, reflecting, and feeding into, the pervasive endocrinal focus of mid-century stress research. Meanwhile, the British researchers studied metabolism, fat deposition, nutrition, fatigue and desensitisation to cold – all markers of earlier twentieth-century stress research, but also themes that would become absorbed into later studies of stress which figured it as part of a complex of lifestyle disorders that included obesity and heart disease. 6 Such work contributed to the development of psychomedical standards for astronaut recruitment (finding balanced personalities), to the design of improved (nutritionally balanced) rations for military forces, but also to everyday lives as part of studies that established the ‘ideal’ (read: balanced and normal) temperature for office buildings and factories.
Of course extreme physiology fieldwork also had characteristics that distinguished it from research in laboratories or more temperate locations, most notably the fact that it was interested in two kinds of adaptation – not just the immediate failure of adaptive bodily or behavioural systems which made up Hans Selye's famous formulation of ‘stress’ in the mid-1930s, but also much longer-term processes, happening over lifetimes and generations, which would gradually evolve human bodies adapted to their environmental conditions. 7 Would the conditions of high altitude, extreme heat or cold, create new ‘normal’ bodies? Such questions, when posed in a context of Western scientific work, often contrasted the skills and abilities of indigenous and non-indigenous peoples: was the superior climbing ability of the Sherpa peoples evidence of physiological acclimatisation – which Western climbers could simulate in their own bodies – or was it evidence of long-term hereditary change – which Western climbers were denied (but could perhaps replicate with drugs or other assistive technologies)? 8
Extreme physiological research therefore exposes the connections between balance in micro and macro worlds – from the minute and rapid biochemical changes in individual haemoglobin molecules, to million-year histories of the balance between human bodies and changing environmental pressures. Focusing predominantly on fieldwork emphasises these connections. The physiologists and biomedical researchers supporting expeditions worked to ‘rebalance’ the explorer's body, using behavioural changes and technological interventions. In so doing they explicitly recognised the mismatch between reductive studies of isolated bodily systems and the clearly holistic reality of the homeostatic/balance systems in the human body. 9 They consistently argued for the value of whole-body, field-site studies as the only way to consider the multi-factorial issues of stress, fatigue and imbalance; indeed, as I have outlined elsewhere, 10 these researchers created complex spaces for knowledge production, blurring the boundaries other scholars have described between laboratory and fieldwork. 11 This process included turning sites of sport, as well as of exploration, into ‘natural laboratories’, particularly for (more) extreme sports such as the marathon (discussed below), or major international events held at altitude or in non-temperate countries. 12 Further, the researchers discussed in this chapter dealt with the difficulty of balancing different working practices and political aims (for example, civilian and military researchers routinely worked alongside one another), while these collages of laboratory, clinical and field researches were created to test and disrupt homeostatic mechanisms. This chapter will start with the laboratory understanding of human balance, and spread out, via blood, breath and psychological stresses, to consider the field study of balance at the extremes of human survival.
Finding the balance: early work on extremity and homeostasis
Despite the later focus on the field, the history of extreme physiology tends to build from laboratory studies in the middle of the nineteenth century; the apparent conflict between findings ‘in the field’ and the laboratory – part of ongoing debates about whether artificial models were good scientific representations of ‘the real world’ – led to a focus on expeditionary fieldwork as a form of ‘reality testing’ for laboratory concepts. 13 As a consequence it was the male body that became not only normalised, but also effectively universalised as the only body about which we had either observational or experimental knowledge when it came to balance and imbalance in extreme conditions.
French physiologist Claude Bernard (1813–78) coined the term milieu intérieur in the 1870s to describe the complex, self-regulating system of the animal body; he also promoted a reductive, experimental approach to studying this system, whereby artificially induced disruption (e.g. placing an animal in a barometric chamber, removing an organ or severing a nerve) sought to isolate individual parts and understand their role in the living, holistic whole. 14 It was in this context that French physiologist Paul Bert (1833–86) created a simplified laboratory model of mountain sickness, in deliberate emulation of Bernard's laboratory-prioritising experimental ideology. 15 Thus the lived phenomena of fatigue, headaches, disorientation and nausea experienced by climbers, explorers and soldiers at altitude were specifically defined as an imbalance in the milieu intérieur – eventually identified as a problem with the regulation of respiration and oxygen levels (a mechanism which alongside ‘water … temperature and chemical reserves’ had formed the touchstones of Bernard's research). 16 Based on extensive barometric studies in the 1870s, Bert created a reductive, single-cause explanation for mountain sickness, turning it into ‘altitude sickness’. Put simply, the reduced oxygen partial pressure at high altitude caused a deficit in inhaled oxygen, an inadequacy for which the homeostatic responses of the body attempted to compensate. While this was usually successful at medium altitudes, the extremity of Everest and other high-altitude sites pushed beyond the human body's ability to adapt. The extreme ‘milieu extérieur’ pushed the milieu intérieur to the point of collapse, but this could be fixed with a simple rebalance – the addition of supplementary oxygen.
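Bert's single-cause explanation can be made concrete with a little arithmetic: because the oxygen fraction of air is essentially constant, the pressure of inspired oxygen falls in step with barometric pressure. A minimal sketch (the exponential pressure model, the ~8,400 m scale height and the water-vapour correction are standard textbook approximations, not Bert's own figures):

```python
import math

def barometric_pressure(altitude_m, p0_kpa=101.325):
    """Approximate barometric pressure (kPa) from the isothermal
    barometric formula, using a ~8,400 m scale height."""
    return p0_kpa * math.exp(-altitude_m / 8400.0)

def inspired_po2(altitude_m, o2_fraction=0.2095, ph2o_kpa=6.27):
    """Partial pressure of inspired oxygen (kPa), after the inhaled
    air has been saturated with water vapour in the airways."""
    return o2_fraction * (barometric_pressure(altitude_m) - ph2o_kpa)

# Sea level, roughly the altitude of the Margherita hut, and Everest:
for alt in (0, 4500, 8848):
    print(f"{alt:>5} m: inspired PO2 ~ {inspired_po2(alt):.1f} kPa")
```

By this estimate the inspired oxygen pressure at around 4,500 m is already little more than half its sea-level value; it is this deficit that supplementary oxygen was meant to restore.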
What was easy in the laboratory was a 100-year challenge on the mountainside. Away from the simplified conditions of the barometric chamber, altitude sickness was a more unpredictable beast – climbers might experience bouts of it on one expedition, but not another, and it might manifest at a variety of altitudes. The first compressed oxygen cylinders were produced (for medical purposes) in 1868, and by the late nineteenth century self-contained breathing systems (designed for diving and mining as well as respiratory therapy) were sufficiently light and robust to take on an Alpine climbing trip. 17 But oxygen was used with very mixed success as a form of emergency medicine in these circumstances; altitude sickness was treated as an acute-onset disease for which the usual treatment was a retreat down the mountain. When oxygen was given it was prescribed as if it were a medication, on the onset of symptoms, and for only as long as the symptoms lasted. Its most famous failure was the death of Dr Etienne Jacottet on Mont Blanc in 1891, and ongoing scepticism about the usefulness of oxygen to climbers meant that Bert's hypothesis about the altitude-oxygen link was not established as fact until the beginning of the twentieth century. 18
One of the staunchest critics of the altitude sickness-oxygen link was Italian physiologist Angelo Mosso (1846–1910). In other respects one of the leading scientists working on fatigue, extreme physiology and mountaineering around the turn of the century, Mosso specifically used Jacottet's death as a case study to support his theory that it was a deficiency of carbon dioxide, rather than oxygen, that caused mountain sickness. 19 Having studied under organic physicist Carl Ludwig (1816–95), Mosso was influenced by reductive laboratory methodologies but, unlike Bert, used a mixed system of research in his studies, combining the barometric chamber with field study; indeed he was in part responsible for the construction of the world's first mid-altitude laboratory, the Capanna Regina Margherita, built first as a shelter on Punta Gnifetti (Monte Rosa) c.4,559 m above sea level. 20 The Margherita hut was opened in late 1893; in 1894 proposals were made to add further rooms for scientific work, and the building was gradually expanded and developed into a multi-room research facility. It remained in almost continuous use until the 1930s, but after the Second World War only a few trips were made there, until the facility was entirely rebuilt by the Italian Alpine Club in 1980. The Margherita hut is significant to this story because research into altitude physiology became a robustly field-based specialism in the twentieth century: although barometric chambers and (especially around the two world wars) aeroplanes were used as alternatives, the seminal studies on high altitude, blood and respiration were predominantly those that involved mountain expeditions. So while contemporary altitude physiologists look to Bert – and his theory about oxygen balance – as the ‘father’ of their professional field, it is Mosso who best models the actual research practices of physiologists interested in human limits and homeostasis. 21
Scientists repeatedly referred to the sides of mountains, and to the Arctic and Antarctic regions, as their ‘natural laboratories’; while more attention has been paid to the astronomical and physical sciences in these spaces, recent scholarship has shown that these were important places for biological research too. 22 While ecologists treated these natural laboratories as spaces to consider other kinds of balance (for example the ways in which the processes of evolution ‘fitted’ organisms for their ecological niches, or the ‘balance’ of specific ecosystems), 23 for the physiologist, doctor and psychologist high altitude, the polar regions and similar ‘natural laboratories’ were spaces in which the human body was exposed to extreme conditions: extremes of temperature, altitude, fatiguing physical work and mental strain, and extreme isolation. Here, as in the barometric chamber, the milieu extérieur could force the milieu intérieur to the very limits of its capacity to adapt – that is, to the point at which it became irreparably unbalanced. This imbalance explains the attractiveness of these spaces to scientists, and opens up another form of balance and moderation for consideration: the limits of ethical and reasonable experiments on human beings. There is an interlacing of the legal experiments conducted on mountaineers and explorers, the potentially exploitative studies that used military recruits and the clearly abusive work done using the bodies of prisoners and concentration camp victims – this will be picked out later in this chapter. But one of the advantages of studying explorers and sportsmen (and this was exclusively men until the middle of the twentieth century, a bias that existed in civilian as well as military research) was that they were willing to put themselves into environments, and carry out activities, that ‘normal’ human guinea pigs would not tolerate, or which would not be considered safe and ethical by review boards. 24
This is not a situation limited to the pre-Nuremberg research past: when the American Medical Research Expedition to Everest gained funding from the National Heart, Lung, and Blood Institute and the American Thoracic Society for their 1981 expedition, the death rate for summiteers on Everest was around one in fifteen; it is extremely doubtful whether any laboratory or clinic-based research practice would have been approved if it offered such a significant risk of morbidity, let alone mortality, for young, healthy, adult male participants. 25
Sportspeople of all kinds were useful not just for their willingness to enter uncomfortable environments, but also for their ability to deliver extreme, reliable and repeated physical efforts. Almost as soon as the marathon became a regular sporting event, doctors and physiologists crowded the start line in order to study the effects of the race on participants. Effectively invented as an event at the 1896 revival of the modern Olympic Games, marathons began to be run elsewhere, the first in Boston in 1897. Boston immediately became a site for physiologists as well as runners, with the first studies (concentrating on cardiovascular work) published in 1899. 26 Where else, after all, could a physiologist find not just one, but many human guinea pigs willing to run twenty-six miles non-stop? As the Nobel prize-winning British physiologist A. V. Hill put it in 1927, the advantages of experimenting on athletes were that ‘athletes themselves, being in a state of health and dynamic equilibrium, can be experimented on without danger and can repeat their performances exactly again and again’. 27
The use of explorers and elite sports performers as subjects had a significant effect on the study of extreme physiology, as it reinforced the erasure of the female body. Even where scientists acknowledged that women's physiology was poorly understood, they made little effort to rectify their ignorance. For example, in 1959 a major symposium on Polar Medicine in Cambridge, with an all-male speaker list, concluded that ‘the time had come to observe the reaction of women as well as of men’, but none of the attendees went on to design field studies that would include women in extreme environments. 28 Studying men was not just an intellectual default, it was the easiest option; women's participation in elite sport was extremely limited in the first half of the century, and they were effectively barred – through legal means, soft power and social pressure – from routinely accessing sites at high altitude or Antarctica until the last decades of the twentieth century. 29 That is, Western European and North American white women were excluded; ‘Sherpani’ and other female porters were routinely used as part of long treks in mountainous regions, and of course women had been living in the Arctic for millennia. But few of these women participated in Western scientific experiments as subjects, and none ran the experiments themselves. 30 Therefore the narratives of balance and moderation at extremes explicitly framed the white adult male body as the standard form: theirs was the ‘normal’ homeostasis, which was disrupted by extreme environments; theirs were the ‘normal’ physiological reactions that responded to this disruption.
Moderate gentlemen and scientific ethics
These earliest investigations into human adaptation – that is, rebalancing – to altitude discovered that altitude caused an apparently universal, ‘normal’ physiological reaction in the blood. Significant changes, outlined below, appeared rapidly in the blood profile of those who moved from sea level to mid-altitude; attempts to manipulate these processes to improve performances in extreme environments provoked questions not only about the balance of the homeostatic system, but also about moderation and fairness – the ethics of sport and of science.
The immediate homeostatic responses of the human body to altitude – those that occur within hours or days – mostly involve the cardiorespiratory system, increasing breathing rates, increasing heart rate and so on. The next stage of response, after several days or weeks, is the development of polycythaemia – a higher red blood cell count per millilitre of blood than is considered ‘normal’. As with so many of the body's balance systems, there are two counterbalanced ways to ‘concentrate’ the blood – either increase the production rate of red blood cells, or decrease the amount of fluid (plasma) in the blood. Early research into this phenomenon was complicated by the fact that mountaineers were often dehydrated, which meant their bodies might be responding to the lack of fluids by reducing plasma. It was therefore difficult to prove experimentally how the polycythaemia was produced – whether it was a response to altitude or to dehydration, or a combination of both. In 1906, two French physiologists, Paul Carnot and Clotilde-Camille Deflandre, developed a theory that a hormone might stimulate the red blood cell production process, and they named this theoretical signalling hormone ‘hémopoïétine’. Renamed erythropoietin, it was eventually specifically identified and isolated by Eugene Goldwasser and his team in the late 1950s and 1960s. 31 By the early twentieth century, then, it was clear that altitude polycythaemia might be the result of active hormone-stimulated cell production, rather than merely a side effect of dehydration. 32
This offered another potential solution to the problem of mountain sickness and climbing fatigue: as well as being able to supplement the respiratory system with oxygen, perhaps it would also be possible to create a state of polycythaemia artificially. This ‘blood packing’ could either be used to ‘pre-acclimatise’ someone to altitude, so that they did not have to wait the week or more for their body's systems to respond; or it could be pushed further, to create a state of super-polycythaemia, giving an individual an advantage over their ‘natural’ level of red blood cell production. By the end of the twentieth century this theory had become a reality, and it became the basis of ‘blood doping’ systems used by athletes; but in the first half of the century it remained only a theory and a rumour – there were whispers in the British climbing community in the 1950s that the Germans had tried blood transfusions during their attempts at the high Himalaya in the 1930s. 33
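The arithmetic behind both acclimatisation and artificial ‘blood packing’ is the same: raising the proportion of red cells in a given volume of blood. A minimal sketch (the volumes are illustrative assumptions, not measurements from any expedition) shows why added cells and lost plasma were so hard to tell apart experimentally, since both raise the haematocrit:

```python
def haematocrit(red_cell_ml, plasma_ml):
    """Fraction of whole-blood volume occupied by red cells."""
    return red_cell_ml / (red_cell_ml + plasma_ml)

# Illustrative baseline volumes (assumed figures, for the arithmetic only)
rcv, pv = 2000.0, 2800.0            # ml of red cells, ml of plasma
baseline = haematocrit(rcv, pv)     # about 0.42

# Route 1: hormone-stimulated production adds red cells
more_cells = haematocrit(rcv * 1.2, pv)

# Route 2: dehydration removes plasma, cell count unchanged
less_plasma = haematocrit(rcv, pv * 0.8)

print(f"baseline {baseline:.2f}, extra cells {more_cells:.2f}, "
      f"lost plasma {less_plasma:.2f}")
```

Measuring concentration alone cannot distinguish the two routes; only by tracking total cell mass or plasma volume separately could researchers show which mechanism was at work.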
While the British focused their efforts on Everest in the 1920s and 1930s, the Germans looked to Nanga Parbat, the ninth highest mountain in the world, and in an area to which, unlike Everest, the Germans could negotiate access. The German teams did conduct physiological research on the mountain, but the technology for blood transfusions was nowhere near effective and safe enough in the 1930s for blood doping to be a realistic prospect at high altitude, and transfusions of blood long before the expedition were unlikely to aid climbing. It is probable that the British rumours of the 1950s were fuelled by a misunderstanding of research into polycythaemia that was done on Nanga Parbat, particularly that by Ulrich Luft (1910–91). 34 Luft was a doctor and research physiologist with a particular interest in the respiratory system and in respiratory distress, which involved suffocation, low oxygen pressure responses and so on. As a keen climber, he managed to get himself on the 1937 expedition to Nanga Parbat, led by the physiologist and mountaineer Karl Wien. 35 Early in the expedition Luft was left behind at Base Camp to do some routine observations while the rest of the team, seven German climbers and nine Sherpa porters, went on to set up camps higher up the mountain; they created Camp IV at about 6,100 m, and began preparing the way to Camp V. Three days later Luft and five porters resumed the climb, intending to go on to Camp IV – except that they could not find it. Where Camp IV should have been there was nothing but fresh, flat snow. They dug, and found three rucksacks belonging to members of the team, but the snow was too hard and packed for further digging without the right tools. A new climbing team was flown out and, with Luft, dug up the tents and the crushed remains of the entire expedition – wiped out in a single massive avalanche. They retrieved the bodies of five of the German climbers (finding that their smashed wristwatches recorded the time of the disaster as just after midnight), and carefully collected as much of the scientific work, in the form of notebooks and equipment, as was possible. 36
Because of his research interests, Luft was recruited into the Luftwaffe to work on anoxia, oxygen systems and aviation. The significant exchanges between military and civilian research are exemplified by the crossover between mountaineering and aviation, not least because they demonstrate that the relationship was clearly a two-way street, and included important crossovers in the study of mental, as well as physical, stress. 37 As I have explored elsewhere, altitude physiology also highlights occasions when military aviation experience was dismissed as irrelevant or unhelpful by civilian explorers. 38 At the end of the war, Luft was targeted and extracted by the Americans during Operation Paperclip, again because of his research expertise. 39 While other German scientists from this project went to work on the atomic bomb and rocket science, Luft went to the National Aeronautics and Space Administration, translating his expertise in extreme survival physiology to designing systems to test and support astronauts. 40 As a beloved teacher and widely admired scientist he was celebrated in his lifetime, and immediately after his death in 1991, with honours and buildings named after him; but with the opening of East German archives, documents were discovered suggesting that his expertise in what happened to human bodies exposed to extreme conditions had come from work done by others in the Nazi concentration camps. 41 While there is no evidence of him as an active participant, it is clear that he knew about and profited from the hypothermia experiments in Dachau, and possibly the murder of ‘undesirables’ in decompression chambers. 42
Ironically then, the study of homeostasis, extreme physiology, and thus balance in human biology, inevitably raises questions of balance in research ethics – not just in the design and regulation of new experiments, but also the balance between the potential to save current lives (or win important political space races) and the ethical problems of using data from murderous experiments of the past. Expeditionary science is a risky form of scientific practice, and therefore one of its significant advantages is that its participants have been willing to enter environmental situations and engage in physical practices that offer a small but serious risk of physical harm, and even death. In terms of high-altitude science, the first death on a British expedition to Everest was a doctor – Alexander Kellas, a pioneer of oxygen systems in the very early twentieth century, who died from dysentery on the trek to Everest in 1921. Famously, George Mallory and Sandy Irvine died somewhere near the summit in 1924, bringing to an end the expeditions of the 1920s; less famously, seven Sherpa porters had fallen to their deaths on the British expedition in 1922, largely due to an error of judgement by Mallory. The involvement of Sherpa people was a matter of concern to the Europeans who relied on them in the high Himalaya. It was Kellas who first popularised the use of local people at high altitude (as opposed to bringing Alpine porters), and, yet again, we find a fine balance necessary in the discussion of indigenous support. On the one hand, more than sixty Sherpa porters and guides have died while assisting foreign climbers on Everest in the twentieth century, and climbers themselves have asked whether it was fair to use financial bribes to persuade people to undertake this risk for the benefit of their own sporting, imperial or scientific goals (goals which, arguably, brought little benefit or credit to the Sherpa). On the other hand, the attitude of early climbers to their Sherpa guides was undeniably paternalistic and patronising (including the suggestion that the Sherpa were like children and did not fear death), and so ethical concerns could sometimes be framed in a condescending way that denied local people agency. 43
It is important to recognise that the Sherpa participants in high-altitude expeditions were also participants directly in the science, not just the exploration. They appeared as human guinea pigs in various published and unpublished experiments on fatigue, respiration and other elements of exercise physiology. Some of these practices certainly pushed the boundaries of acceptable research practice, most notably the testing of performance-enhancing drugs on Everest. The 1953 British expedition was trying a new route up the mountain, tackling the Lhotse Face. This was, overall, a longer route than had been used by the rival Swiss expedition in 1952, but had the advantage of a stepped ascent, allowing for more camps and depots of equipment to be laid. This in turn required a lot of to-and-fro trips by the Sherpa porters to set up the camps – all of which had to pass through the extremely technically challenging Khumbu Icefall. The expedition leader John Hunt feared this would tire the porters, and in the second week of May the team doctor, Mike Ward, ran a test of the amphetamine Benzedrine as an anti-fatigue medication. Two (unnamed) Sherpa participants ‘volunteered’, and took it while carrying food, tents, fuel and other supplies between camps. Several of the team were familiar with the use of amphetamines as a stimulant – it had been used by Allied military forces, particularly by pilots – and there were two highly qualified doctors in the climbing team, which controlled the risks; but without consent forms, without any record of what the Sherpa porters were told, it is difficult to understand how this negotiation of risk and reward occurred, if it occurred at all. 44 Luckily, no harm was done, and as the two participants reported back that the drug merely cured one Sherpa's headache and made the other sleepy, the team seem to have decided it would not be useful to repeat the dosing further up the mountain. 45
The use of amphetamines and other performance-enhancing substances is obviously a topic of heated debate in the sporting world. Most of the European high-altitude climbing teams had amphetamine and/or cocaine derivatives in their medical kits, and this use of stimulants and tonics may initially seem to conflict with the ‘gentlemanly amateur’ identity of the early twentieth-century climbing elite. 46 In fact, attitudes towards such performance enhancers were much more relaxed prior to mid-century – George Mallory himself offered to secure for the teams of the 1920s a stimulant ‘similar to caffeen [sic] and kola, but much better and absolutely innocuous’ from his Cambridge colleague J. B. S. Haldane. 47 This fact leaves historians with an apparent puzzle relating to debates about supplemental oxygen in the 1920s, 1930s and 1950s. Typically, at least for the British case, these debates have been represented as an ‘oxygen controversy’, a face-off between modern scientific rationalism (oxygen is necessary) and old-fashioned gentlemanly amateurism (oxygen is ‘cheating’). 48 In that context it is not obvious why the boost given by an amphetamine should be morally acceptable, if attempts to rebalance a homeostatic disturbance by using oxygen were considered ethically dubious; but as I have shown elsewhere, this puzzle is solved when we see this representation as at best a partial story, and at worst a misunderstanding of the debate. 49 Far more important than the issue of ethics in the ‘controversy’ was the fact that early oxygen systems did not work as well as they might, and there were serious scientific and experimental reasons to wonder whether heavy oxygen canisters and respiration-restricting masks might actually hinder a climber more than they helped him.
What this ‘oxygen controversy’ – and, indeed, many of the debates about technology in exploration science – demonstrates is that even if there was a clear scientific consensus on theories of balance and homeostatic regulation in the laboratory, it still took a great deal of work to turn solutions in theory, such as supplemental oxygen, into solutions in practice. Blood packing is another example of this process: although by the 1930s it was reasonable to believe that rival climbing teams might be trying the technology, the reality was that blood transfusions remained difficult and dangerous even in advanced medical facilities at sea-level locations. Attempts to increase the concentration of red blood cells in human subjects continued in laboratory-based experiments, but it seems to have taken another two or three decades before the technology was seriously used to improve sporting performance, and even then in the relative safety of training rooms and motels, 50 rather than on icy mountain slopes.
While elite sportspeople began to take blood packing seriously as an enhancement technology (before it was banned in the mid-1980s), German mountaineers began to consider the exact opposite: haemodilution. In the early 1980s, the American Medical Research Expedition to Everest applied to the American Lung Association for money to study haemodilution on their planned expedition in 1981, because:
the Germans have been fooling with this, but they have done no really scientific, controlled studies. The Germans state that hemodilution is great – makes you feel like a million and enables you to climb like the wind. We're interested in seeing if this is so, and also because the findings will have implications for managing patients with hypoxic disease at sea level. 51
This practice might seem counterintuitive – how could both increasing and decreasing the concentration of red blood cells in a climber's blood improve their performance? The answer lies in the concept of balance, or rather homeostasis, where for every adjustment there is a counter-adjustment. One of the healthy human body's responses to increased altitude is, it turns out, to create polycythaemia by up-regulating the production of red blood cells; this increases the oxygen-carrying capacity of the blood, which means that when there is less oxygen in the atmospheric air, and therefore less in the lungs on each breath, the red blood cells are capable of capturing as much of that scarce oxygen as possible and transporting it around the body. But this response does not come without its own side effects, the most pernicious of which is that increasing the number of cells in a millilitre of blood makes that blood more viscous. This thicker, stickier blood travels with much more difficulty around the capillaries and the areas of microcirculation in the body, which results in an increasing risk of losing circulation in the peripheries of the body, and of suffering from the results of blockages and clots. Further, the process is exacerbated by another alteration in the body's homeostatic balance: the ‘right shifting’ of the oxygen-haemoglobin dissociation curve. This curve describes the chemical response of red blood cells to lower-than-normal oxygen concentrations, which is to have more affinity to oxygen – that is, to bind to oxygen more strongly. 52 At the lungs this is a positive trait, allowing even more of the oxygen in the lungs to be ‘captured’ by the blood instead of being exhaled, and therefore ‘wasted’. But this is counterbalanced – as are so many bodily processes – by a negative, as the red blood cells are also more resistant to releasing their oxygen where it is needed in the body. 
The more oxygen is transferred away from blood cells as they travel around the body, the lower the oxygen concentration in the blood becomes, until a point is reached, usually in the peripheries and small capillaries, where the red blood cells’ attachment to their oxygen is so strong that they cannot ‘do their job’ and deliver it. This exacerbates the challenges that thicker blood poses to the microcirculation, and compounds the risk of loss of circulation in some parts of the body.
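The affinity relationship sketched above is conventionally summarised by the Hill equation, which relates haemoglobin saturation to the partial pressure of oxygen. The following is an illustrative sketch only, using standard textbook parameter values rather than figures drawn from the expeditions discussed in this chapter:

```latex
% Hill equation for haemoglobin-oxygen saturation (illustrative sketch):
%   S = fractional saturation of haemoglobin
%   P_{O_2} = partial pressure of oxygen
%   P_{50} = pressure at which haemoglobin is 50% saturated
%            (roughly 27 mmHg for adult human blood at sea level)
%   n = Hill coefficient (roughly 2.7 for human haemoglobin)
\[
  S(P_{\mathrm{O_2}}) \;=\; \frac{P_{\mathrm{O_2}}^{\,n}}{P_{50}^{\,n} + P_{\mathrm{O_2}}^{\,n}}
\]
```

A shift of the dissociation curve corresponds to a change in $P_{50}$: increased affinity lowers $P_{50}$, so at any given oxygen tension a larger fraction of haemoglobin remains saturated. This is exactly the trade-off the text describes, since it means more oxygen is captured at the lung, but less is released where the tissues need it.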
One of the crucial lessons of extreme physiology is this realisation that the compensatory mechanisms of the body can only go so far, and that at the limits of adaptation they may become maladaptive, or cause harm to other bodily systems, or make bodily processes less efficient. This facet of adaptation was also, inadvertently, the spur that motivated the foundation of some of the world's most important and productive research centres into extreme physiology, when, as we will see below, due to the slippage between ‘blood’, ‘race’ and ‘ethnicity’, a Peruvian researcher interpreted a British researcher's statement about homeostasis as a national slur.
Blood and race
Western physiologists working in the nineteenth century maintained a long-standing, Eurocentric assumption that the ideal environment for humans was the temperate zone. While the (white) body might be able to survive in extreme environments, there would inevitably be a biological price to pay, both in an individual sense and also a racial one, evidenced by the fears of mortality and morbidity for civilians and soldiers in the ‘White Man's Grave’, and by anxieties that tropical environments would lead to the hereditary degeneration of more long-term colonial settlers. 53 The classic early twentieth-century restatement of this belief came from Joseph Barcroft (1872–1947), the British chemist and physiologist who researched extensively into respiration, circulation and altitude. In his seminal 1925 book, The Respiratory Function of the Blood, he suggested that ‘[a]ll dwellers at high altitude are persons of impaired physical and mental powers’. 54
In context, this statement anticipated demonstrations of the right-shifted curve of oxygen affinity – that is, the situation described above where animals, including humans, adapt to altitude by forcing their haemoglobin to bind more strongly to oxygen – good at the lung, but potentially disastrous at the extremities. More generally it was a statement about acclimatisation and adaptation: homeostatic responses of the body sometimes come at a price, especially at the extremes of adaptation. It was read, however, by the Peruvian physiologist, Carlos Monge Medrano (1884–1970), as a specific and racialised insult against high-altitude populations – mostly those of South America. Barcroft's work drew on his 1921 expedition to Cerro de Pasco in Peru, a mining town that, at 4300 m above sea level, was thought at the time to be the highest permanent human settlement. 55 So, in the following decade, Monge Medrano arranged a rival expedition to study the exercise capacity of such high-altitude residents to specifically rebuff Barcroft's insistence that residents at altitude were ‘impaired’ – indeed, Monge Medrano suggested that the eminent scientist must himself have been befuddled by altitude sickness to have made such a mistake of interpretation. 56
As the director of the Instituto de Biología y Patología Andina, Monge Medrano instituted a concerted programme of research into altitude physiology, which included collaborations across national boundaries; South America became a powerhouse of research into this area, in part due to its ability to provide convenient field sites, but also because of local scientific and biomedical expertise. 57 Monge Medrano himself contributed to new understandings of mountain sickness and adaptation, and later in his career developed a much broader interpretation of the consequences of altitude physiology, one which attempted to write what we would now term an environmentally determinist account of South American history. He wrote about what he called ‘climatic aggression’, which was the damage done on the one hand to people who failed to adapt to new environments and, on the other, to those who were perfectly adapted but then uprooted from their ‘natural’ homes. 58 This theory has obvious resonances with earlier nineteenth-century fears about white racial degeneration in the tropics, and with late nineteenth- and twentieth-century concerns about the potential ‘extinction’ of indigenous people, expedited by their removal from native lands as well as by the interference of Western civilisation – from alcohol to fatty foods and tuberculosis.
There was, however, a significant contextual difference between Monge Medrano's ‘climatic aggression’ and the degeneracy fears of colonial late Victorians, because ideas of the natural human body, in terms of adaptation and origin, had moved from the temperate zone to the tropics. This is of course also echoed in broader research into stress, neurasthenia and other related disorders, which increasingly from the end of the nineteenth century highlighted the conditions of modern life – overcrowding, mechanical work, ‘unnatural’ life rhythms – as a cause of sickness and decline. 59 As work elsewhere in this volume shows – particularly Chapter 4 on anti-obesity campaigns, Chapter 3 on ‘problem’ drinking and Chapter 10 on the renegotiation of ‘normal’ in the treatment of Parkinson's Disease – public health programmes in Western countries increasingly framed challenges to human health as the products of ‘civilization’. What extreme physiology shows us is that the same assumptions were built into our study of even the most basic environmental difficulties people faced: heat, cold and altitude.
Therefore, while the white body had, in 1880, been perfectly adapted to the temperate zone and therefore at risk in the tropics, in the mid-twentieth century the naked (thus ‘natural’) primitive human form was considered to be perfectly organised for survival in hot and humid climates. Logically, then, our ability to survive outside the temperate zone was a matter of technology, not biology; we wore clothes, built shelters, mined coal to keep ourselves warm (processes which then, as forms of civilisation, placed new pressures on our physical and mental health). The origins of this shift from the temperate zone to the tropics are too complex to be detailed here, but came from multiple sources from around 1880 to the middle of the twentieth century: anthropologists and archaeologists reconsidered the path of human origins; race scientists and geneticists reconsidered the relationships between ethnic groups, and the age of the human race; physiologists actively studied a range of different ethnic groups and their responses to cold, heat, deprivation and disease. 60
This shift in understanding of the normal natural human body had significant consequences for physiological research into homeostasis and balance. First, it maintained a racial hierarchy, despite disrupting the centrality of the temperate zone; as adaptations to heat and altitude were biological, they were primitive, while adaptations to cold (which included the whole temperate zone, if we assume that the tropics were the ‘normal’ site for human life) were technological, innovative, inventive and civilising. 61 Or, to put it another way, studying adaptations to heat or altitude was physiology; studying adaptations to cold was bioprospecting for technology. This presumption was confirmed by early studies into cold adaptation, which proved difficult, contradictory and inconclusive through the 1930s and 1940s, and which by the 1950s seemed to be creating a consensus that most people did not significantly biologically adapt to cold (that is, they did not have as strong a physiological defence mechanism for cold climates as they did for increasing altitude or heat), with the possible exception of local adaptations in the bodily peripheries: hands, feet and face. 62
The move away from studies of cold adaptation is most obvious in large-scale projects. As an example, the International Biological Programme, which was founded in 1964 (in emulation of the International Geophysical Year), included from the start a ‘Human Adaptability’ theme. But among this research, studies of cold adaptation were a minority, and physiologists working at the Poles instead turned their attention to issues of daylight and isolation – circadian rhythms, sleep and the psychology of exploration. 63 Indeed, at the first International Symposium on Polar Human Biology in 1972, attendees did not even mention the only large physiological expedition to the Antarctic – the INternational PHysiological EXpedition to ANtarctica (INPHEXAN) in 1957–58. 64 INPHEXAN was an Anglo-American expedition, timed to coincide with both the International Geophysical Year and the late stages of the attempt by (Sir) Edmund Hillary (1919–2008) and (Sir) Vivian Fuchs (1908–99) to cross Antarctica as part of the Commonwealth Trans-Antarctic Expedition. Dr Lewis Griffith Cresswell Evans Pugh (1909–94), the physiologist in part responsible for the British success on Everest in 1953, was the lead British researcher and, as outlined above, the team conducted very varied work into cold adaptation, metabolism, cold injury and stress. 65 By the time of the next physiological expedition to Antarctica – a full twenty years later in 1977–78 – the focus had clearly shifted; the 1970s International Biomedical Expedition to the Antarctic did consider adaptation to cold, but only as part of a psychological as well as physiological analysis. 66 Consequently, the Arctic and Antarctic were figured as useful ‘natural laboratories’ to study ‘freak’ phenomena such as human responses to 24-hour daylight; the question of human adaptation to cold was, by the 1970s, no longer a pressing physiological research issue.
The ‘rebalancing’ of the ‘natural’ human environment to a tropical, rather than temperate, base meant that the inhabitants of warmer countries remained a focus of both physiological and anthropological fascination. Studies through the middle of the twentieth century still used terms such as ‘primitive’ to describe the indigenous people of the tropics, and their study was racially coded, as they were used (and are still used) as ‘proxies’ for earlier stages of human adaptation. Studies of homeostasis, of the natural balance of the human body, were marked by complicated webs of assumptions about racial science, indigenous rights and human evolution. So, for example, early twentieth-century studies of Australian Aboriginal peoples were shaped by concerns about ‘White Australia’; while some thought Aboriginal people would be the victims of racial decline, and eventually ‘die out’, others wondered at their ability to survive harsh environments that decimated settler colonies. By the middle of the twentieth century, one of the most prominent researchers into environmental physiology was arguing that, while Aboriginal peoples were better adapted than white populations, their adaptation was to hot humid climates, not to the hot dry climate of central Australia – evidence of a migration (the date of which was in great dispute) from the tropical islands of South East Asia. 67 This meant that both their physiology, and what were considered ‘primitive’ customs – such as organising societies around water sources – were in fact useful lessons for Europeans also trying to survive in the harsh Australian interior; this was an argument that proved difficult for physiologists, and politicians, to accept. 68
Even in the later work of the International Biological Programme, adaptation physiology routinely used a methodology that compared tropical and subtropical indigenous populations to white incomers (such as ‘white’ and ‘negro’ sharecroppers and American soldiers, or Balan and Chaamba Arabs versus French servicemen). 69 While temperate and Arctic peoples were largely thought to have adapted to their environment using ingenuity and technology, those in tropical and subtropical areas were still depicted as surviving as a result of biology and superstitious, or at least irrational, custom. In this way, research into extreme physiology managed to maintain an imbalance – namely an established hierarchy of civilisations.
Making a ‘balanced team’: finding the right men
In addition to being a literal property of the human body, balance also carried metaphorical weight, providing ways of understanding the world and shaping the nature of research programmes and expeditionary teams. In particular, there were parallels between the ecosystem of the body and that of the research group. Expeditions to places such as Everest or the South Pole functioned with a fixed set of resources – whether rations, person-hours or gasoline – and therefore all functions of the team were elements in a zero-sum game: resources spent on scientific work were not available to be spent on travel, exploration or survival. At every stage of planning and execution issues of balance were in question, from the design of ration boxes (more fat or more carbohydrates, or more calories at the expense of more weight) to the loading of aeroplanes and Sno-Cats.
When it came to the human component of an expedition, leaders, whether explorers or scientists, tended to use the term ‘harmony’ rather than ‘homeostasis’ or ‘balance’. Here the concern was not only to maximise the skill sets available across the chosen group, but also to ensure that the team worked together efficiently in dangerous, isolated conditions. As with bodily homeostasis, it could be challenging to compensate for extreme changes in circumstances, such as the death or serious illness or disablement of a team member, which could lead to evacuation, the premature ending of the expedition or, in the worst cases, further deaths. For ‘milder’ disturbances, leaders hoped that their choice of team members would be able to compensate for temporary absences from work or other unexpected disruptions. While it was theoretically easy to ensure a balance of skills on paper, ensuring harmony in terms of interpersonal relationships proved to be a much greater challenge.
In the first half of the twentieth century, and in many cases well into the 1970s and 1980s, the major technique for picking reliable expedition members was to rely on experience. Above all other factors, personal knowledge of a potential recruit on the part of the expedition leader or a trusted colleague appears to have been crucial to their engagement on an expedition. This could even outweigh specific experience of the environment in question; for example, successful participation in high-altitude mountaineering could be taken as evidence of suitability for an Antarctic expedition. Attempts were made from the mid-1960s to put this experiential practice on a more empirical footing by studying the psychological factors that correlated with successful overwintering and teamwork, and resilience in the face of environmental challenges and isolation. 70 But, echoing the experience of attempts to create ‘scientific’ and objective personnel selection tests in the military, psychological questionnaires, physiological stress tests and other interventions functioned either as an adjunct to, or a refinement of, the more informal and ‘gut instinct’ processes of selection and role allocation on expeditions throughout the twentieth century. 71
As a consequence, extreme physiology – at least in the Anglophone world – tended to be a closed shop, based around cliques, either those centred on individuals (such as Edmund Hillary) or organisations (such as the Royal Geographical Society). This process undoubtedly created well-bonded and successful expeditionary teams; but it also reduced the diversity of such work, and in particular acted as yet another barrier to female participation. Women routinely applied to be part of such expeditions. In 1935, Kathleen M. Taylor (then living in Nanking, China) wrote to ask to be included in the British expedition to Everest. As evidence of her capability she claimed two solo ascents of Mount Fuji (‘It is a very easy mountain, but you start at sea level, and do the whole 13,000 ft. in one night’), and many other climbs with a pack weighing ‘not less than 20lbs’. The response letter said she had applied too late to be considered. 72 In 1957, a female geologist, Dawn Rodley, succeeded in persuading not only her male expedition colleagues, but also their wives, that she would be a good colleague. Unfortunately the US Navy refused to transport her to Antarctica; as a site of military activity, Antarctica was forbidden to women from 1956, until the US Congress removed its ban in 1969. 73
While there were no laws banning women from British bases, Fuchs, head of the British Antarctic Survey from 1959 to 1973, emphatically opposed female participation on the grounds that they would disrupt the ‘harmony’ of Antarctic stations. 74 Here, ‘balance’ and ‘harmony’ were key; routinely throughout the twentieth century women were figured as a disruptive influence – even their letters from home, let alone their physical presence, could bring unwanted emotional stress to a closed homosocial homeostatic universe. There is an irony here that the desire for ‘balance’ (or ‘harmony’) resulted in extremely unbalanced research teams – and, consequently, extremely disrupted patterns of physiological research. Almost all physiology practised in Antarctica or on Everest – and in other extreme environments – was self-experiment and ‘citizen science’, with the bodies of the explorers themselves functioning as human subjects. By excluding women from this practice, not only were they unable to gain the ‘experience’ that was so crucial to future inclusion in expeditionary teams, but the bodies of women were systematically ignored in research projects. This imbalance fed back into laboratory studies; even though these occurred in ‘safe’ sea-level locations, women were not a focus of serious interest until the 1970s, when some early studies seemed to suggest women actually adapted more efficiently to altitude than men. 75
This is not to say that women were not involved in research into extreme physiology. As I have shown elsewhere, they have participated in extreme physiology, but often in unacknowledged roles. Indeed, it was not until a female ‘computer’ was hired that a major part of the INPHEXAN data could be analysed, showing the lack of adaptation to cold. 76 And, of course, none of the European trips to the high Himalaya could have been undertaken without women; among the thousands of porters hired to carry vital survival gear, food and scientific equipment were many Sherpani. Their double invisibility – not only women, but non-white women – is a stark illustration that our histories of expeditionary science remain unbalanced.
Conclusion: balancing which ‘self’?
Through the first two-thirds of the twentieth century – perhaps until the International Biological Programme began significantly reshaping research practices – biomedical experimentation in extreme environments was frequently experiment, literally and metaphorically, on the self; it was conducted by white Western men, on the bodies of white Western men – often their own bodies. Physiologists and expedition doctors took samples of their own blood and breath, encouraged their children and laboratory assistants into pressure chambers and onto treadmills, and recruited colleagues and co-explorers into experimental studies of the effects of cold, heat, fatigue and low oxygen pressure. 77 While this use of the self and of close colleagues as experimental organisms might seem a rational response to spaces like Antarctica where the potential experimental population is small, or to those like Everest where access is expensive and limited, there was obviously also extensive use of self- and colleague experimentation in temperate zone laboratory studies – for example, at the Harvard Fatigue Laboratory, or earlier studies of gas poisoning by J. S. Haldane on his son J. B. S. Haldane. 78 This chapter can only gesture to the possibility that such ‘heroic’ self-experimentation, whether on a treadmill or a glacier, might be related to notions of the self that connect scientific experimentation with exploration, with ideas of robust masculinity, science as conquest over feminine nature and the value of personal testimony and experience in generating truth about a complicated natural world. 79 What it has more extensively laid out is how those practices reinforced, if not necessarily created, a normalised, homeostatic, balanced (sea-level and temperate) white adult male body, which was read as the standard – the ideal – from which others deviated. 
While this is not a novel argument for histories of medicine, what is apparent is how broadly the notion of balance could be applied, and how extraordinarily self-reinforcing it was – to stretch the metaphor, it is itself a homeostatic mechanism, which prioritised certain kinds of self, and created practices and assumptions that functionally excluded other kinds of bodies, and even personalities, from the creation and application of standards. When other selves were included, such as the use of Sherpa guides to test amphetamines or of Inuit peoples to study physiological adaptation using radioactive tracers, their inclusion was often on an unequal, and sometimes exploitative basis. 80 Women's bodies, identified as disruptive or inadequate to the task, were simply not studied; at the same time as adaptational physiology asserted the difference of women, it failed to recognise this difference as a potential source of research findings.
The balanced self in an extreme earthly environment possesses a set of moral and behavioural characteristics, as well as a specific, scientifically defined, body. The ability to respond to environmental stressors is a question not only of metabolism, body chemistry and inherited capacity, but also personality: team spirit and self-reflection are necessary for a climber or explorer to recognise, for example, when they are overtired or suffering from anoxic mental fog and beginning to pose a risk to their team; sportsmanship and honesty are necessary to know when it is and is not acceptable to use a technological or pharmacological supplement to aid a sporting goal; stoicism and a strong stomach are necessary for an explorer to eat sufficient calories for survival, even when that is presented as unpalatable blocks of pemmican or hard biscuit. These practices are gendered and racialised in such ways that it is not always possible for all ‘selves’ to successfully engage in them. 81 They also highlight the ways in which responsibility for self-balance could be divided between teams, leaders and individuals: explorers were expected to demonstrate self-awareness, to spot fatigue or hunger before they could cause problems; to judge whether illness or injury were serious or not; to act in prompt, preventative ways to situations such as mild frostbite or anoxia. And, again, the belief that this sort of self-responsibility and self-regulation was a particular property of the white male was part of the justification for excluding other people – particularly women – from expeditionary teams. 
It was also sometimes an excuse for using manipulative or misleading approaches to try to get non-white participants to ‘volunteer’ for studies: writing as late as the 1970s, one researcher examining adaptation and physiology among Inuit people complained of the challenge of using even standard exercise tests as ‘there are difficulties in motivating primitive and non-competitive people to perform an all-out effort’ – while the sporting, competitive, civilised white man offered no such challenge to scientific experiment. 82
Despite their exotic nature, then, extreme environments emerge from this account as prime ‘natural laboratories’ for the study of human balance – mental and physical. In part, this is because of the strong military interest in topics of cold and low-oxygen survival, which has provided funding and – when self-experiment was not sufficient – human guinea pigs for studies of adaptation and acclimatisation. Relatedly, it is also in part to do with issues of consent and performance – it is ‘nature’ that is causing the real physical risk and discomfort, not the scientists, which has appeared to allow experimentation that might not otherwise find volunteers or ethical clearance. But the Arctic, Antarctic and sites at high altitude have also been able to function as spaces to explore all kinds of balance. It is no coincidence that the focus of adaptational studies shifted over the mid-twentieth century away from physiological adjustments to cold, and towards more psychosocial aspects such as circadian rhythm studies, isolation and depression; this reflects a more general direction in human physiological, psychological and stress research, which increasingly took seriously the ‘pressures of civilization’ – shift work, jet lag, overcrowding, overstimulation – as sources of disruption in the milieu extérieur. Hard physical labour, lack of daylight and cramped, impoverished living conditions could be the lot of the low-paid shift worker in Detroit, or the mechanic doing a rotation at an Antarctic base. 83 Extreme environments prove to be useful for the historian of science, just as they have been for the scientist, because they were spaces in which all the pressures on the human being – both wild and civilised – could be studied.