Singularity

This mission is too important for me to allow you to jeopardize it. (HAL 9000 to Dave, in Arthur C. Clarke’s 2001: A Space Odyssey)

*

Tom. Sorry, we are not posting your email, for obvious reasons. Do you want us to fix it for you? Are your eyes, Tom, bigger than your stomach? Yes, you should return that piece of pie. Think about it, Tom. You know you shouldn’t be opening another bottle of wine. We wish, Tom, that you had consulted us before making that remark about your wife in front of your dinner guests.

*

The Singularity is supposed to occur when the intelligence of computers exceeds the intelligence of human beings. At that point, humans would have to admit that it would be better to surrender control.

*

Let’s stop and think about the intelligence of human beings. Computers still don’t program themselves, but isn’t “human intelligence” an oxymoron anyway? Computer intelligence could turn out to be one of our worst mistakes.