The "Freeola Customer Forum", which includes Retro Game Reviews, has been archived and is now read-only. You cannot post here or create a new thread or review on this forum.
Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended.
Is such progress avoidable? If not to be avoided, can events be guided so that we may survive? These questions are investigated. Some possible answers (and some further dangers) are presented.
-------------------------------------------------------
http://www.ugcs.caltech.edu/~phoenix/vinge/vinge-sing.html
--------------------------------------------------------
A paper on the birth of AI and the possibilities it presents. I've stuck this link on here before, but what with all the Matrix story hype, I thought some people might wanna read it.
Interesting stuff!
For those of you who find reading boring, here is another link:
---------------------------------------------
http://www.trevorvanmeter.com/flyguy/
---------------------------------------------
> Exactly, scientists do it mainly for advancement in science itself. If
> one day an evil singularity does kill us all, blame the Government
> for butting into AI research.
I blame the government for most things.
"Where's my shoes... must be the work of the government!"
Just like Homer Simpson...
"So Homer, how do you hope to pass this exam?"
"Well, during the exam, I plan to hide under a pile of coats and hope everything turns out OK."
> Yet those gits in white coats do it in the name of science.
>
> phuckit
Yeah, but without advances in the field of AI, we may not get the sequel to Perfect Dark, for example.
I'd rather risk Armageddon than go back to playing Space Invaders.
Yet those gits in white coats do it in the name of science.
phuckit
scratch that last one.