Stephen Hawking (perhaps the most famous living physicist), Stuart Russell (co-author of the most important artificial intelligence textbook) and two other famous professors (at MIT) recently wrote an article entitled "Transcending Complacency on Superintelligent Machines" which happens to match my thoughts fairly closely. They do a good job of summarizing the main ideas, risks and potential benefits of the singularity in a very short article and call on humanity to start thinking about it seriously.
I think that is a positive sign. High profile thought leaders are starting to take notice and call for action. It will be hard to dismiss Stephen Hawking as a singularity "nut" in the way that Ray Kurzweil is routinely dismissed. On the other hand, I think it's worth considering this video of Noam Chomsky declaring that the singularity is science fiction, we are nowhere near it and perhaps never will be.
He believes that capturing the nature of human intelligence is a colossal problem well beyond the limits of contemporary science, indeed possibly beyond the capacity of humanity to ever achieve, and that recent advances such as Watson, self-driving cars, etc. don't signal any significant breakthroughs. In his view, there is no reason to think about or prepare for (much less fear) superintelligence, and doing so would be a distraction from legitimate threats to human survival such as climate change.
Chomsky's singularity denial is instructive about what to expect as more and more high profile thinkers like Hawking et al. start trying to raise awareness about the extreme risks and possible proximity of the singularity.
The path from "what's the singularity?" to "let's prepare" is unlikely to be smooth or rapid, at least at first. For each high profile thinker who comes out with a message that it is time to take the singularity seriously, there is likely to be another who derides it as science fiction.
It's interesting to compare singularity denial with climate change denial. Both climate change theory and singularity theory extrapolate existing trends to make predictions about future risks, and both carry significant uncertainty. The most dire climate change predictions, for instance, assume that we will not, in the near future, develop extremely inexpensive alternative sources of energy that allow us to halt and possibly even reverse carbon emissions. Continued progress in fusion or solar energy would upend that assumption. Whether you believe such progress will arrive in the near future may depend on whether you are a climate change activist or a singularitarian. A climate change activist might suffer heart palpitations at the mere thought of climate change theory and singularity theory being compared, and reply indignantly that climate change is actual science while the singularity is foolish speculation. A singularitarian, though, might look at the consistent downward trend in the cost per kilowatt-hour of solar energy, extend the curve out a few years, and declare that we are on the cusp of a cheap energy revolution. At the most basic level, both climate change theory and singularity theory extrapolate long-term, ongoing trends into the future and examine the results.
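To make the extrapolation argument concrete, here is a minimal sketch of the kind of curve-fitting the singularitarian in this comparison is doing: fit an exponential decline to a cost series and extend it a few years forward. The cost numbers below are made up for illustration; they are not real solar price data.

```python
import math

# Hypothetical $/kWh series, assumed (for illustration only) to fall
# by a roughly constant fraction each year -- the premise behind
# "extend the curve out a few years".
years = [2010, 2011, 2012, 2013, 2014]
cost = [0.40, 0.34, 0.29, 0.25, 0.21]  # illustrative, not real data

# Least-squares fit of log(cost) = a + b * year, i.e. an exponential trend.
n = len(years)
x_mean = sum(years) / n
y_mean = sum(math.log(c) for c in cost) / n
b = sum((x - x_mean) * (math.log(c) - y_mean)
        for x, c in zip(years, cost)) / sum((x - x_mean) ** 2 for x in years)
a = y_mean - b * x_mean

def projected_cost(year):
    """Extrapolate the fitted exponential trend to a future year."""
    return math.exp(a + b * year)

print(f"fitted annual decline factor: {math.exp(b):.3f}")
print(f"projected $/kWh in 2024: {projected_cost(2024):.3f}")
```

The sketch also shows why both extrapolations carry uncertainty: the projection is only as good as the assumption that the fitted trend keeps holding outside the observed range.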
One big difference is that climate change denial is funded by large oil interests, whereas singularity denial may never have any significant source of funding. I can't think of any powerful interests whose profits would be threatened by increased concern about the singularity (aside from the tech companies working to bring it about ... but it's hard to imagine how funding a denial campaign would serve their interests). Indeed, one of the few groups that comes to mind as having something to lose is climate change activists. If you think there is a decent chance that the singularity happens in the next 50 years or so, then many of the projected risks of climate change will not yet have occurred by that point (see risk summary starting on page 26), and thus likely never will. Perhaps more importantly, activists, academics and other thinkers have limited bandwidth. Right now climate change is an intense focus, but if people start writing and working on the singularity, they will be writing and working less on other things.
Obviously, I do not believe that climate change work and singularity work are directly in conflict. Personally, I think climate responsibility makes sense. Insofar as I agree with Chomsky that we do not yet understand the nature of intelligence, it makes sense to reserve definitive judgement about the timeline of the singularity--even if recent advances have pushed my intuition strongly towards the "sooner than later" end of the spectrum. I don't think there is any reason to bet everything on the singularity happening within 50 years. That would be foolish. So we ought to take care of our Earth while we wait to see what happens.
Nevertheless, I wonder if the massive energy being poured into climate change study, law, etc., will ultimately serve to distract us from a much more pressing and serious risk: superintelligence.