I have a confession. I used to be all about the Singularity. I thought it was inevitable. I thought for certain that some sort of Terminator/HAL 9000 scenario would happen when ECHELON
achieved sentience. I was sure The Second Renaissance
from the Animatrix was a fairly accurate depiction of how things would go down. We'd make smart robots, we'd treat them poorly, they'd rebel and slaughter humanity. Now I'm not so sure. I have big, gloomy doubts about the Singularity. Michael Anissimov tries to stoke the flames of fear over at Accelerating Future with his post "Yes, The Singularity is the Single Biggest Threat to Humanity." He writes:
Combine the non-obvious complexity of common sense morality with great power and you have an immense problem. Advanced AIs will be able to copy themselves onto any available computers, stay awake 24/7, improve their own designs, develop automated and parallelized experimental cycles that far ...