I've also frequently pondered and discussed with friends what a good singularity outcome would be. My current opinion is that ideally "I" (and my loved ones) would survive and get to experience a transformation of our minds, but it is not clear how different that is from not surviving. I'll explain.
In the scenario where I personally survive and get to make choices about how to evolve my consciousness, I would almost certainly choose to augment it. Who wouldn't choose to be a little bit smarter; to have a slightly better memory; to have greater insight into their lives and the world around them; to be able to learn new skills and hobbies more quickly and deeply? Every day I bump up against the limits of my mental capabilities. If there were a safe way to improve my mind, I would do it. I already make this choice daily. For instance, one motivation for my daily exercise routine is that I feel it improves the clarity and sharpness of my thinking. Certainly, I would be more hesitant to change or augment the structure of my brain, but if, over time, the safety and effectiveness of such modifications had been sufficiently demonstrated, then I wouldn't have any philosophical objection.
At first, however, I would probably be reluctant to make large modifications that would undermine my sense of having survived the improvement. Going directly from me to a godlike intelligence seems indistinguishable from dying. The person I am now would not survive. The godlike intelligence would be a very different "person", with a radically different purpose and perspective on the world. So I might choose not to take that jump. But a small jump that lets me perform better and make more progress on the tasks and agenda that I have right now wouldn't be that concerning (just as exercising every day to improve my mental clarity doesn't make me suspect that the couch potato I otherwise could be has committed suicide).
Nevertheless, if I continually chose to make small improvements to my intellect, over time it would have the cumulative effect of a fundamental transformation of my personality. Many small changes over time eventually add up to very large changes. The me I am today would be dead.
Of course that is already true to some degree. I am not the same person I was twenty years ago, and I generally do not mourn the death of my twenty-years-younger self. The existence and continuity of the self can be thought of as largely an illusion. The sense of self and the drive for survival are useful tools of evolution that facilitate the propagation of our genetic material, but upon closer inspection it has always been difficult to pin down a coherent philosophical justification for them.
But, since I am human, and I do suffer from (or revel in) the illusion of self, I would prefer a singularity where that illusion is not immediately and completely destroyed. I would prefer the opportunity to gradually augment myself and experience a slow transition (and the awe of a rapidly growing understanding of the universe) rather than a sudden transition that destroys who I am today in a single moment. I would prefer this, even knowing that the me of today will ultimately die in both scenarios.
Now consider a slightly different singularity scenario where the me of today is simply eliminated rather than evolved. Instead of my consciousness being suddenly or gradually augmented, it is immediately extinguished and the matter of my body is recycled for use in the consciousness of someone else, whose consciousness does get augmented. In this scenario I end up dead just as I end up dead in the scenario of sudden transformation, so there isn't much to distinguish them. Perhaps some trace of me would survive in the sudden augmentation scenario, but if I died today, some trace of me would also continue to exist in the hearts and minds of my loved ones, and that provides me little comfort. A faint trace of my existence is cold comfort for death. So does it matter whether I "survive" the singularity? In all the scenarios above (gradual transformation, sudden transformation, and simple elimination) I die--even in the gradual transformation scenario the me of today eventually dies.
I prefer the scenario of gradual transformation, though it's hard to come up with a good justification beyond the fact that it sounds fun and gratifying. One might argue that with the gradual transformation I haven't died at all, just as I don't really feel like the me of twenty years ago is dead. I'm changed, but there has been continuity of my personality throughout, and in fact, there are important elements of my personality that have not changed. These constant elements of my personality, however, undermine the analogy. Human biology places constraints on how much someone changes over the course of twenty years... a core personality generally survives. There is no particular reason to suspect that a gradual transformation in the singularity, unconstrained by human biology, will leave any of my personality intact. Thus, post-singularity, the me of today may be nearly 100% gone, with virtually none of my distinct personality surviving. In that case, the only thing that a gradual transformation achieves is an exciting, interesting and wonderful death. Though to be fair, a good death is nothing to be scoffed at, especially since I've already said that I would choose it over living forever unchanged.
(One thing I might choose to do if I survived would be to appropriate the memories of the rest of humanity. My own personal memories are certainly cherished and useful to me, but there is no reason everyone else's memories wouldn't also be valuable. If I could acquire everyone's memories with a trivial expenditure of my resources I might as well do that. But in that case it really makes no difference at all that "I" survived the singularity, because I then become an amalgamation of all humanity. Similarly, even if I don't survive the singularity, if someone else does, and she incorporates my memories, then perhaps in the only way I could survive, I have survived.)
Interestingly, if the line of thinking in this blog post is valid, it really calls into question the enthusiasm of singularity proponents like Ray Kurzweil, who are hoping to achieve immortality in the singularity. Immortality is an illusion, and Ray Kurzweil will die no matter what. What he should be excited about is having an exciting and wonderful death.
What does all this have to do with how belief in the singularity should impact our day-to-day lives right now? Well, if you believe that the singularity is coming in your lifetime, it implies that you will die and the human race will go extinct. Even if humanity "survives" as the seeds of future intelligences, there is a good chance those intelligences will bear little resemblance to anything human. So these are the last decades of human existence. Whatever happens, there probably will not be a human appreciation of the beautiful moments of existence. We are a species moments away from extinction. Exactly what impact that understanding has on an individual will probably vary significantly from person to person. For me, it evokes a sense of love and expansiveness. These are our final moments... let's make them our best. Let's be kind to one another and see the beauty of each person's unique take on what it means to be human. Let's revel in our own peculiar human appreciation of existence. And let's work together to launch the next stage of intelligence with a purpose and motivation that we can be proud of. This is our opportunity to leave our mark on the universe. The next stage of intelligence can either be a monomaniacal chess-playing robot or something else that is more deeply moving to our human sensibilities.
At base, what are those human sensibilities that I care about? Kindness, love, curiosity, exploration, joy, wonder. If that is the mark I want us to leave on the universe, then, in these final moments of human existence, what better than to work towards embodying those attributes on a day-to-day basis? Perhaps I would hope to do that regardless of my beliefs about the singularity, but I think it does make a difference. Should I prioritize saving for retirement or taking time to do someone an extra kindness? I think my beliefs about whether there will be a retirement make a big difference there. (Though perhaps that isn't the best example, because I think there is a decent chance that I will just barely eke out a retirement before the singularity.)