I have a review in The Guardian of Superintelligence by Nick Bostrom and A Rough Ride to the Future by James Lovelock. I wasn't sure it would work to pair these books, but it seems to have turned out OK as far as it goes. Here are a few additional comments and notes.
An interesting piece on Roko's Basilisk. "The combination of messianic ambitions, being convinced of your own infallibility, and a lot of cash never works out well."
Bostrom recently outlined his ideas at the RSA. You can listen to the recording here.
Once we begin to celebrate... this phrase is from Thomas Berry's essay The Ecozoic Era. In the western mystical tradition see also, inter alia, Thomas Traherne. A state of awareness that unites elevated cognition and affect might enable what the writer Tim Robinson calls the good step -- though he doubts this is durably achievable for humans: “Can such contradictions be forged into a state of consciousness even fleetingly worthy of its ground?”
New machines could one day have almost unlimited impact on humanity and the rest of life See Turing's Cathedral: The Origins of the Digital Universe by George Dyson (2012).
killing remotely - already, notes The Economist, America is arguing about whether to give medals to pilotless drones.
singularity... by around 2030 [discredited] See, for example, the resounding meh from Bruce Sterling and this by Alan Winfield. Some analysis suggests consciousness may be intractable to mathematics, and the forms of intelligence we identify as most well developed in human societies appear to depend on consciousness.
The argument that a superintelligent system will shape the world according to its “preferences” is developed in chapters 5 and 6 of Bostrom's book. The argument that most preferences that such an agent could have will...involve the complete destruction of human life and most plausible human values is developed in chapters 7 and 8.
balance of risks here are the five biggest risks to humanity according to Sandberg et al.
Lovelock thinks...in the very long term...we should welcome machine-based consciousness. Sara Imari Walker and Paul Davies speculate that “life forms that ‘go digital’ may be the only systems that survive in the long run and are thus the only remaining product of the processes that led to life.”
For a far-out scenario for life in the very very very long term see this.
[superintelligence] will live and experience thousands of times as fast as we can Here is more from Turing's Cathedral (page 302):

...Organisms that evolve in a digital universe are going to be very different from us. To us, they will appear to be evolving ever faster, but to them, our evolution will appear to have been decelerating at their moment of creation – the way our universe appears to have suddenly begun to cool after the big bang. Ulam's speculations were correct. Our time is become the prototime for something else.

catastrophic risk See It could be worse and this profile by Ross Andersen.
judgement on right or wrong. Bostrom writes at the beginning of Superintelligence that it is likely that his book is seriously wrong and
misleading. He adds, however, that alternative views, including the
idea that we can safely ignore the prospect of superintelligence, are
more wrong.
There may (or may not) be mileage in comparing these scenarios with ones in which superintelligence arrives from outer space. Stephen Hawking is among those who suggest this would probably be a catastrophe for humanity, analogous to the slaughter of indigenous Americans by Europeans. In The Beginning of Infinity (Chapter 9) David Deutsch counters that any
civilisation sufficiently advanced to transport itself across
interstellar distances would, necessarily, have no need of the raw
materials, or anything else, in our solar system. Deutsch continues:
“Would we seem like insects to [an advanced alien civilisation]?
This can seem plausible only if one forgets that there can only be
one type of person: universal explainers and constructors. The idea
that there could be beings that are to us as we are to animals is a
belief in the supernatural.”
stupidity The first story in Stanislaw Lem's Cyberiad is about a machine which its inventor intends to be fantastically intelligent but which turns out to be incorrigibly stupid. And, of course, in Douglas Adams's Hitchhiker's Guide to the Galaxy, Deep Thought calculates that the answer to the ultimate question of life, the universe and everything is 42. When the receivers of the Ultimate Answer demur, Deep Thought replies that "[he] checked it very thoroughly, and that quite definitely it is the answer. I think the problem, to be quite honest with you, is that you've never actually known what the question was."
Image: natural stone arch near Þingvellir in Iceland, site of an early Parliament. Jacob Bronowski warned "we must not perish by the distance between people and government, between people and power."