One of these days I'll have to do some serious reading and thinking on the notion of the Singularity. I recently listened to a podcast interview with Vernor Vinge, which reminded me to re-read his original article on the concept. I find much to question in his argument.

At root, Vinge claims (based on the continued advance in the power and speed of computer hardware) that superhuman intelligence will emerge by 2030. But who will write the software that powers these superhuman intelligences? Vinge simply assumes that if the hardware is fast enough, the software will follow. Given that we can't even design reliable operating systems (let alone usable encryption software), I have my doubts.

For Vinge, the Singularity just is the emergence of superhumanity -- without that, we would experience only "a glut of technical riches". Yet I find the prospect of extremely fast technical/economic change -- a phase shift -- far more likely than the emergence of highly intelligent, non-human agents (we're good at making tools, but we have had no success at creating self-acting artificial agents). But, as I say, I need to do more research on the idea of the Singularity before I come to any definitive conclusions (these essays and these links look like good places to start).