Actually, some of the singularity people regard the singularity as both inevitable, and coming soon(ish), and as their worst nightmare. Not desirable at all.
The superintelligent machines will take over the world and probably destroy humankind as a side effect. Consequently, they think we urgently need philosophical musings over possible ways to ensure that the inevitably created superintelligent AI overlords would be built on principles that make them friendly to humankind. This is (according to them) the only hope of saving humankind from extinction in the near future.