Great Thoughts Treasury

This site is dedicated to the memory of Dr. Alan William Smolowe, who gave birth to this database.

Nick Bostrom

Swedish Transhumanist Philosopher at St. Cross College, University of Oxford, Director of Oxford's Future of Humanity Institute, Co-founder and chair of both the World Transhumanist Association and the Institute for Ethics and Emerging Technologies

"On the bank at the end of what was there before us, gazing over to the other side on what we can become. Veiled in the mist of naïve speculation we are busy here preparing rafts to carry us across before the light goes out leaving us in the eternal night of could-have-been."

"Before transhumanism, the only hope of evading death was through reincarnation or otherworldly resurrection. Those who viewed such religious doctrines as figments of our own imagination had no alternative but to accept death as an inevitable fact of our existence. Secular worldviews, including traditional humanism, would typically include some sort of explanation of why death was not such a bad thing after all. Some existentialists even went so far as to maintain that death was necessary to give life meaning!"

"Discovering traces of life on Mars would be of tremendous scientific significance: The first time that any signs of extraterrestrial life had ever been detected. Many people would also find it heartening to learn that we're not entirely alone in this vast, cold cosmos."

"Dignity as a Quality can be attributed to entities other than persons, including populations, societies, cultures, and civilizations. Some of the adverse consequences of enhancement that Kass predicts would pertain specifically to such collectives. "Homogeneity" is not a property of an individual; it is a characteristic of a group of individuals. It is not so clear, however, what Dignity as a Quality consists in when predicated to a collective. Being farther from the prototype application of the idea of dignity, such attributions of Dignity as a Quality to collectives may rely on value judgments to a greater extent than is the case when we apply it to individuals, where the descriptive components of the concept carry more of the weight."

"Does human enhancement threaten our dignity, as some prominent commentators have asserted? Or could our dignity perhaps be technologically enhanced? After disentangling several different concepts of dignity, this essay focuses on the idea of dignity as a quality, a kind of excellence admitting of degrees and applicable to entities both within and without the human realm. I argue that dignity in this sense interacts with enhancement in complex ways which bring to light some fundamental issues in value theory, and that the effects of any given enhancement must be evaluated in its appropriate empirical context. Yet it is possible that through enhancement we could become better able to appreciate and secure many forms of dignity that are overlooked or missing under current conditions. I also suggest that, in a posthuman world, dignity as a quality could grow in importance as an organizing moral/aesthetic idea."

"For healthy adult people, the really big things we can foresee are ways of intervening in the ageing process, either by slowing or reversing it."

"Enhancements" of drives, emotions, mood, and personality might pose special threats to dignity, tempting us to escape "the tensions of alienation by precipitate fusion and headlong surrender." An individual could opt to refashion herself so as to be content with reality as she finds it rather than standing firm in proud opposition. Such a choice could itself express a meretricious attitude. Worse, the transformation could result in a personality that has lost a great portion of whatever Dignity as a Quality it may have possessed before."

"Getting rid of the dragon would deflect us from realizing the aspirations to which our lives naturally point. I tell you: the nature of the dragon is to eat humans. And our own nature is truly and nobly fulfilled only by being eaten by it."

"Had Mother Nature been a real parent, she would have been in jail for child abuse and murder."

"Espousing a deathist viewpoint tends to go with a certain element of hypocrisy. It is to be hoped and expected that a good many of death's apologists, if they were one day presented with the concrete choice between (A) getting sick, growing old, and dying, and (B) being given a new shot of life, staying healthy and vigorous, and remaining in the company of friends and loved ones to participate in the unfolding of the future, would, when push came to shove, choose the latter alternative. If a few people would still choose death, that is a choice that is of course to be regretted, but nevertheless respected. The transhumanist position on the ethics of death is crystal clear: death should be voluntary. This means that everybody should be free to extend their life and to arrange for cryonic suspension of their deanimated bodies. It also means that voluntary euthanasia, under conditions of informed consent, is a basic human right."

"I personally don't think of myself as either an optimist or a pessimist."

"I think the definition of an existential risk goes beyond just extinction, in that it also includes the permanent destruction of our potential for desirable future development. Our permanent failure to develop the sort of technologies that would fundamentally improve the quality of human life would count as an existential catastrophe. I think there are vastly better ways of being than we humans can currently reach and experience. We have fundamental biological limitations, which limit the kinds of values that we can instantiate in our lives: our lifespans are limited, our cognitive abilities are limited, our emotional constitution is such that even under very good conditions we might not be completely happy. And even at the more mundane level, the world today contains a lot of avoidable misery and suffering and poverty and disease, and I think the world could be a lot better, both in the transhuman way, but also in this more economic way. The failure to ever realize those much better modes of being would count as an existential risk if it were permanent."

"If I wanted some sort of scheme that laid out the stages of civilization, the period before machine superintelligence and the period after machine superintelligence would be a more relevant dichotomy. When you look at what's valuable or interesting in examining these stages, it's going to be what is done with these future resources and technologies, as opposed to their structure. It's possible that the long-term future of humanity, if things go well, would from the outside look very simple. You might have Earth at the center, and then you might have a growing sphere of technological infrastructure that expands in all directions at some significant fraction of the speed of light, occupying larger and larger volumes of the universe: first in our galaxy, and then beyond as far as is physically possible. And then all that ever happens is just this continued increase in the spherical volume of matter colonized by human descendants, a growing bubble of infrastructure. Everything would then depend on what was happening inside this infrastructure, what kinds of lives people were being led there, what kinds of experiences people were having. You couldn't infer that from the large-scale structure, so you'd have to sort of zoom in and see what kind of information processing occurred within this infrastructure."

"It is possible to take a more optimistic view of the possibilities of secular change in the societal and cultural realms. One might believe that the history of humankind shows signs of moral progress, a slow and fluctuating trend toward more justice and less cruelty. Even if one does not detect such a trend in history, one might still hope that the future will bring more unambiguous amelioration of the human condition. But there are many variables other than Dignity as a Quality that influence our evaluation of possible cultures and societies (such as the extent to which Human Dignity is respected, to name but one). It may be that we have to content ourselves with hoping for improvements in these other variables, recognizing that Dignity as a Quality, when ascribed to forms of social organization rather than individuals, is too indeterminate a concept, and possibly too culture-relative, for even an optimist to feel confident that future society or future culture will appear highly dignified by current lights."

"In the nearer term I think various developments in biotechnology and synthetic biology are quite disconcerting. We are gaining the ability to create designer pathogens and there are these blueprints of various disease organisms that are in the public domain: you can download the gene sequence for smallpox or the 1918 flu virus from the Internet. So far the ordinary person will only have a digital representation of it on their computer screen, but we're also developing better and better DNA synthesis machines, which are machines that can take one of these digital blueprints as an input, and then print out the actual RNA string or DNA string. Soon they will become powerful enough that they can actually print out these kinds of viruses. So already there you have a kind of predictable risk, and then once you can start modifying these organisms in certain kinds of ways, there is a whole additional frontier of danger that you can foresee. In the longer run, I think artificial intelligence, once it gains human and then superhuman capabilities, will present us with a major risk area. There are also different kinds of population control that worry me, things like surveillance and psychological manipulation pharmaceuticals."

"It is too early to tell whether our days are necessarily numbered. Cosmology and fundamental physics are still incomplete and in theoretical flux; theoretical possibilities for infinite information processing (which might enable an upload to be immortal) seem to open and close every few years. We have to live with this uncertainty, along with the much greater uncertainty about whether any of us will manage to avoid dying prematurely, before technology has become mature."

"It may turn out to be impossible to live strictly speaking forever, even for those who are lucky enough to survive to such a time when technology has been perfected, and even under ideal conditions. The amount of matter and energy that our civilization can lay its hands on before they recede forever beyond our reach (due to the universe's expansion) is finite in the currently most favored cosmological models. The heat death of the universe is thus a matter of some personal concern to optimistic transhumanists!"

"Isn't death part of the natural order of things? Transhumanists insist that whether something is natural or not is irrelevant to whether it is good or desirable [see also "Isn't transhumanism tampering with nature?", "Won't extended life worsen overpopulation problems?", and "Why do transhumanists want to live longer?"]."

"Knowledge about limitations of your data collection process affects what inferences you can draw from the data."

"It's hard to know what that might look like, because our human experience might be just a small little crumb of what's possible. If you think of all the different modes of being, different kinds of feeling and experiencing, different ways of thinking and relating, it might be that human nature constrains us to a very narrow little corner of the space of possible modes of being. If we think of the space of possible modes of being as a large cathedral, then humanity in its current stage might be like a little cowering infant sitting in the corner of that cathedral having only the most limited sense of what is possible."

"Once a discovery has been published, there is no way of un-publishing it."

"Nanotechnology has been moving a little faster than I expected, virtual reality a little slower."

"One of the three propositions seems very highly likely to be true: 1.) Almost all, or all, civilizations like ours go extinct before reaching technological maturity. Technological maturity is defined as something like Ray Kurzweil's or Hans Moravec's wettest dreams: artificial intelligence carried to a profound degree, solving the death problem, the end of economic scarcity, etc. This proposition has been written alternately thus: no civilization will reach a level of technological maturity to the point where they can simulate reality so detailed that "that reality" could be mistaken for "reality." 2.) Almost all technologically mature civilizations (on any possible planet) lose interest in creating ancestor simulations, which are computer simulations so dizzyingly complex and nuanced that the simulated minds would be conscious, or believe they're conscious. Sophisticated beings so profoundly adept at technological manipulation aren't interested in, or don't do, simulations of reality for ancestors. If these beings DO do these simulations, they don't do many, for varying reasons having to do with wanting to use computational power for other things, or due to ethical objections about keeping simulated beings captive, etc. 3.) We're almost certainly living in a simulation. Now. You and me and everyone we know, our entire history and world, possibly."

"Since we are still far from being able to halt or reverse aging, cryonic suspension of the dead should be made available as an option for those who desire it. It is possible that future technologies will make it possible to reanimate people who have been cryonically suspended. While cryonics might be a long shot, it definitely carries better odds than cremation or burial."

"That people should make excuses for death is understandable. Until recently there was absolutely nothing anybody could do about it, and it made some degree of sense then to create comforting philosophies according to which dying of old age is a fine thing ("deathism"). If such beliefs were once relatively harmless, and perhaps even provided some therapeutic benefit, they have now outlived their purpose. Today, we can foresee the possibility of eventually abolishing aging and we have the option of taking active measures to stay alive until then, through life extension techniques and cryonics. This makes the illusions of deathist philosophies dangerous, indeed mortal, since they teach helplessness and encourage passivity."

"Our Dignity as a Quality would in fact be greater if some of our capacities were greater than they are. Yet one might hold that the act of enhancing our capacities would in itself lower our Dignity as a Quality. One might also hold that capacities obtained by means of some artificial enhancement would fail to contribute, or would not contribute as much, to our Dignity as a Quality as the same capacities would have done had they been obtained by "natural" means."

"The Orthogonality Thesis: Leaving aside some minor constraints, it is possible for any ultimate goal to be compatible with any level of intelligence. That is to say, intelligence and ultimate goals form orthogonal dimensions along which any possible agent (artificial or natural) may vary."

"The Internet is a big boon to academic research. Gone are the days spent in dusty library stacks digging for journal articles. Many articles are available free to the public in open-access journals or as preprints on the authors' websites."

"Posthumans [are possible future beings] whose basic capacities so radically exceed those of present humans as to be no longer unambiguously human by our standards."

"The quest for immortality is one of the most ancient and deep-rooted of human aspirations. It has been an important theme in human literature from the very earliest preserved written story, The Epic of Gilgamesh, and in innumerable narratives and myths ever since. It underlies the teachings of the world religions about spiritual immortality and the hope of an afterlife. If death is part of the natural order, so too is the human desire to overcome death."

"The Instrumental Convergence Thesis: Agents with different ultimate goals will pursue similar intermediate or sub-goals [because such intermediate goals are either: (a) necessary preconditions for achieving the ultimate goal; or, alternatively (b) "good tricks" for achieving the ultimate goal.]"

"The chances that a species at our current level of development can avoid going extinct before becoming technologically mature is negligibly small; almost no technologically mature civilizations are interested in running computer simulations of minds like ours; or we are "almost certainly" a simulation. All three could be equally possible, he wrote, but if the first two are false, the third must be true."

"The Unfriendliness Thesis: Because some intermediate goals are unfriendly to humans, and because of instrumental convergence, even artificial superintelligences with seemingly benign ultimate goals can do things that are unfriendly to human existence."

"There are some problems that technology can't solve."

"There is more scholarly work on the life-habits of the dung fly than on existential risks [to humanity]."

"Traits acquired during one's lifetime - muscles built up in the gym, for example - cannot be passed on to the next generation. Now with technology, as it happens, we might indeed be able to pass some of our acquired traits on to our selected offspring by genetic engineering."

"Transhumanism stresses the moral urgency of saving lives, or, more precisely, of preventing involuntary deaths among people whose lives are worth living. In the developed world, aging is currently the number one killer. Aging is also the biggest cause of illness, disability and dementia. (Even if all heart disease and cancer could be cured, life expectancy would increase by merely six to seven years.) Anti-aging medicine is therefore a key transhumanist priority. The goal, of course, is to radically extend people's active health-spans, not to add a few extra years on a ventilator at the end of life."

"Transhumanism advocates the well-being of all sentience, whether in artificial intellects, humans, or non-human animals (including extraterrestrial species, if there are any). Racism, sexism, speciesism, belligerent nationalism and religious intolerance are unacceptable. In addition to the usual grounds for deeming such practices objectionable, there is also a specifically transhumanist motivation for this. In order to prepare for a time when the human species may start branching out in various directions, we need to start now to strongly encourage the development of moral sentiments that are broad enough to encompass within the sphere of moral concern sentiences that are constituted differently from ourselves."

"We are the dumbest possible species that can maintain a technological civilization."

"We humans like to pride ourselves on being the smartest and most advanced species on the planet. Perhaps this position gives us a kind of Dignity as a Quality, one that could be shared by all humans, including mediocrities and even those who fall below some nonhuman animals in terms of cognitive ability. We would have this special Dignity as a Quality through our belonging to a species whose membership has included such luminaries as Michelangelo and Einstein. We might then worry that we would risk losing this special dignity if, through the application of radical enhancement technologies, we created another species (or intelligent machines) that surpassed human genius in all dimensions. Becoming a member of the second-most advanced species on the planet (supposing one were not among the radically enhanced) sounds like a demotion."

"When we are headed the wrong way, the last thing we need is progress."