Artificial Intelligence

Nov 03, 2009 13:06

We're discussing artificial intelligence in my philosophy class right now, and I was just curious as to what all you guys think about it.

artificial intelligence, philosophy, thinking


Comments 7

raisexyourxhead November 3 2009, 18:49:06 UTC
I think you can artificially copy another person's intelligence, but an AI is always going to be constrained by how its programmer thinks it should think. But if it's a copy that works the same as a natural intelligence, would it really matter to the person interacting with it?

And if our brains began to be replaced with computers... Well. I really want Ghost in the Shell to happen. :3

earthykitty November 3 2009, 22:13:27 UTC
I never thought of the constraints of AI like that, and now it seems really obvious.

Is Ghost in the Shell the one where some spirit-like thingy possesses a body or something?

satrugha November 4 2009, 04:37:40 UTC
one day, we're gonna create evolving AI that's gonna TAKE OVER THE WORLD!
j/k. srs.

AI's getting really smart now. Right now, computers don't really have consciousness, but the way AI is written can make a computer seem like it's alive. Maybe one day it'll get smart enough to seem more emotional?

earthykitty November 4 2009, 23:01:00 UTC
But does it just seem like it's thinking, or does it really think for itself? A person can look like they're thinking, but really be spacing out and have nothing going on inside.

I'm not saying it's impossible, though in my opinion it looks like we're a long way away from getting computers to think like we do.

satrugha November 5 2009, 00:09:02 UTC
currently they only appear to be thinking. the "thought" is just based on a set of logical rules, plus a bit of what's called fuzzification, which turns hard yes/no inputs into degrees (like "somewhat near" vs. "very near") so the AI's choices look less mechanical. throw in some randomness with a weighted distribution of choices and the AI appears even more organic. a computer can also appear to be learning by saving the past actions of the player or user and guessing what the next move will be based on those past behaviors and statistics.
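The two tricks in that comment can be sketched in a few lines of Python. Everything here is invented for illustration: the names (`fuzzify_distance`, `MoveGuesser`), the 0-to-10 distance scale, and the rock/paper/scissors moves are just stand-ins, not any particular game's code.

```python
import random
from collections import Counter

def fuzzify_distance(d):
    """Fuzzification sketch: turn a crisp distance into degrees of
    membership in 'near' and 'far' (each in [0, 1]), instead of a
    hard near/far cutoff."""
    near = max(0.0, min(1.0, (10.0 - d) / 10.0))  # fully 'near' at 0, gone by 10
    return {"near": near, "far": 1.0 - near}

class MoveGuesser:
    """Guesses a player's next move from the frequency of past moves."""
    def __init__(self):
        self.history = Counter()

    def record(self, move):
        self.history[move] += 1

    def guess(self):
        """Most frequent past move so far (None if no history yet)."""
        if not self.history:
            return None
        return self.history.most_common(1)[0][0]

    def guess_weighted(self):
        """Sample in proportion to past frequency: informed by the
        player's habits, but not perfectly predictable."""
        moves = list(self.history)
        return random.choices(moves, weights=[self.history[m] for m in moves])[0]

g = MoveGuesser()
for m in ["rock", "rock", "paper", "rock", "scissors"]:
    g.record(m)
print(g.guess())              # "rock": the player's favorite so far
print(fuzzify_distance(2.5))  # {'near': 0.75, 'far': 0.25}
```

The point of the weighted version is exactly the "seems more organic" effect: an AI that always plays the statistically best response feels robotic, while one that samples from the player's own habit distribution still looks smart but is harder to exploit.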

i don't think it's impossible either, just not happening yet.

earthykitty November 5 2009, 02:01:20 UTC
I like your vocabulary. I'm going to use fuzzification on a regular basis now. ^_^
