Posted: Thu Dec 21, 2006 8:54 am
Brynet-Inc: you could also see potential threats. Imagine, for example, that based on a statistical mapping scheme you could upload your mind (possibly the personality too), and from that data predict every single decision the uploaded person will make during their life (as long as the same choices and circumstances are tested on the map; this leads to a potential paradox: the uploading itself, and all knowledge about the system, must also happen inside the map, which makes it kind of reciprocal). At first it would be a way to help you think (maybe offload some thoughts), but imagine if advertisers used it to test ways to get your attention.
It could become a form of government. A hyper-pseudodemocracy, where every citizen is mapped and the rules of society are based on the predictions of the majority (it would be almost exactly what the majority wants). Then it could become a method of censorship: since your brain is mapped, people with power could eliminate people with diverging thoughts.
And finally, what if it becomes a path to immortality? We all become data, and suddenly nature doesn't matter anymore (because we no longer need it to propagate our species, which can be done by creating more data in whatever form possible).
Anyway, just thinking outside the box. Adding some spice to it =)
Back to the present... Intelligence is a term that is too abstract. What is a smart thing? For example, smart could mean something that can use all it has learned (and possibly learn some more). Intelligent could mean smart and creative (i.e. trying something outside its scope of information; "thinking outside the box").
Do we need smart machines? It sure helps... Do we need self-aware machines? Maybe they only need to know about us, humans. Maybe by being self-aware they can be creative (by knowing their limitations).
Could it have a personality? I think not. But it could learn to hate. In a sense, feelings in us can be described as chemical reactions (rather than electrical ones); that's why you become happy, sad, etc. But a machine could, by logic, reach the conclusion that we are inefficient (or bad), and that could be a form of hate.
Perhaps the simplest form of a "possible" AI is based on natural selection: trial and error (a toy sketch of the idea is below). Dunno. Rambling....
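To make the natural-selection idea a bit more concrete, here's a toy sketch in Python. It's purely my own illustration, so the target string, population size, and mutation rate are all made-up assumptions: candidates are generated, scored, and the fittest half survives to breed mutated copies. Pure trial and error.

```python
import random

# All of these parameters are illustrative assumptions, not a real AI design.
TARGET = "HELLO WORLD"                    # hypothetical goal to evolve toward
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "  # characters a candidate may use
POP_SIZE = 100                            # candidates per generation
MUTATION_RATE = 0.05                      # chance each character mutates

def fitness(candidate):
    # Score: how many characters already match the target.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate):
    # Trial-and-error step: randomly change some characters.
    return "".join(
        random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
        for c in candidate
    )

def random_candidate():
    return "".join(random.choice(ALPHABET) for _ in TARGET)

population = [random_candidate() for _ in range(POP_SIZE)]
generation = 0
while max(population, key=fitness) != TARGET:
    # Natural selection: the fittest half survives and breeds mutated copies.
    survivors = sorted(population, key=fitness, reverse=True)[:POP_SIZE // 2]
    population = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    generation += 1

print(f"Reached '{TARGET}' after {generation} generations")
```

There is no "understanding" anywhere in there, but given enough generations it reliably stumbles onto the goal, which is kind of the point.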
JVFF