Here is another article comparing concern about, and interest in, the singularity to religion. The argument boils down to this: early AI researchers, at the very beginning of investigating the challenges of building intelligent machines, predicted the singularity would happen very soon, but it didn't. Therefore, the singularity will never happen, or is so far off that it is not worth thinking about and planning for. That is precisely the author's argument and, as weak as it is, I know a lot of smart people who feel that way.
I would concede that people who speculate the singularity is coming soon don't have much solid evidence to point to. But the same is true of people who speculate that it is a long way off. One difference is that those who expect it soon do at least articulate coherent arguments about the trends that support their speculation.
For instance, deep learning neural nets appear to be scaling very well with increased data and processing power, and they have made brain-inspired AI architectures a part of everyday life. Long-standing trends in the growth of information technology suggest that computers will soon have power equivalent to our best estimates of the processing power of the human brain. And two massive projects to understand the brain, on the scale of sequencing the genome, are just getting underway. Is it foolish to think that a combination of (1) a better understanding of how the brain works and (2) computing power equivalent to that of the brain might allow us to build computers that can perceive and think as well as humans do?
A twenty- or thirty-year time frame for the singularity may not be a sure thing, but it certainly isn't foolish to think it may happen in that time. What is foolish is certainty that it won't happen in that time frame, merely because it hasn't happened yet. As comforting as that certainty may be, it is pure faith.