STUFF AND THINGS

Nov 17, 2008 12:34

I just came across this, and I wanted to see what people thought about it. Read the introductory paragraphs carefully.

http://discovermagazine.com/2007/brain/i-chat-therefore-i-am/article_view?b_start:int=0&-C=

Comments 7

mousse November 17 2008, 17:39:18 UTC
Wow. That's actually pretty cool.

so_low November 17 2008, 17:53:40 UTC
Hahaha oh man, the conversations between the two robots are highly entertaining. I used to play around with a chatbot yeeeaaaars ago called MegaHAL, but its memory only held a week or two's worth of information, and it could only use words and phrases people had already fed it. You got some pretty crazy stuff out of it that way.
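
A bot like MegaHAL learns by building Markov chains over whatever text people feed it, then walks those chains to generate replies, which is why it could only recombine words and phrases it had already seen. A toy sketch of that idea in Python follows; the real MegaHAL used higher-order models, but the principle is the same:

    import random
    from collections import defaultdict

    # Toy bigram babbler in the spirit of MegaHAL: it can only
    # recombine words that someone has already fed it.
    class Babbler:
        def __init__(self):
            self.chain = defaultdict(list)  # word -> observed next words

        def learn(self, sentence):
            words = sentence.split()
            for a, b in zip(words, words[1:]):
                self.chain[a].append(b)

        def babble(self, seed, max_words=20):
            word, out = seed, [seed]
            while word in self.chain and len(out) < max_words:
                word = random.choice(self.chain[word])  # random observed successor
                out.append(word)
            return " ".join(out)

    bot = Babbler()
    bot.learn("the robots are highly entertaining")
    bot.learn("the robots talk to each other")
    print(bot.babble("the"))  # e.g. "the robots talk to each other"

Feed it a week or two of chatter and the chains cross between sentences, which is where the crazy output comes from.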

mousse November 17 2008, 18:02:41 UTC
Vic and I once programmed one to regard the word "watermelon" as horribly dirty profanity.

richterca November 17 2008, 19:00:51 UTC
Yeah, that was an ELIZA-style program; ELIZA was the precursor to the ALICE bot in this article. They work on the same basic principle.

Didn't we also program it to yell at you if you mentioned Shaq, or something?
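
The "same basic principle" here is keyword and pattern matching: scan the input for trigger patterns and answer from canned templates. A minimal ELIZA-style sketch in Python; the watermelon rule is invented to match the joke above, not anything from the real ELIZA script:

    import re

    # Minimal ELIZA-style responder: the first matching rule wins.
    RULES = [
        (re.compile(r"\bwatermelon\b", re.I), "How DARE you use that word here."),
        (re.compile(r"\bi feel (.+)", re.I), "Why do you feel {0}?"),
        (re.compile(r"\bmother\b", re.I), "Tell me more about your family."),
    ]

    def respond(text):
        for pattern, template in RULES:
            match = pattern.search(text)
            if match:
                return template.format(*match.groups())
        return "Please go on."  # content-free default that keeps the user talking

    print(respond("I feel like eating watermelon"))  # the watermelon rule fires first

Adding a Shaq rule would be one more line in the table, which is why these bots are so easy to sabotage.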

timmymac1978 November 17 2008, 19:20:31 UTC
When I was in around 8th grade our school library got a computer with a virtual "counselor" program on it. The idea was that students would go into this little room in the library and type their problems to the program, which would use speakers and text to respond. I think it was supposed to give advice and to keep the student talking until they worked out their problems on their own. (I wonder what my psychology-related friends would have thought of it.)

I only messed with it a few times, mostly because a) it was a computer program which could speak and respond, which was novel at the time, and b) I spent a lot of my free time in the library that year for various reasons.

Anyway, long story short, within a few weeks students who had been "conversing" with it had "taught" it to respond to even relatively ordinary statements with profanity. (In retrospect I find this ironic, because my memories of that school suggest that they could have gotten the same response walking down the hallways.) Still, interesting.

timmymac1978 November 17 2008, 19:13:55 UTC
Interesting, I suppose. I wonder whether they could have a longer conversation if you seeded them with a less philosophical question. I haven't really tried chatting with any of these chatbots myself, so I have no real way of knowing.

This article also reminds me of something from my information ethics class. It'd take too long to get into here, but a big part of the paradigm we've been learning is that other people deserve respect and equal treatment, and that we come to understand this through something called a Fundamental Moral Experience: an experience in which we recognize the personhood of another person. (I had to write a paper on it; too long to post here.) So far, nothing I have read for the class has established criteria for "personhood," so I'm wondering: at what point can a "thinking" program be considered a "person" with ethical rights and obligations? I may have to pass this along to the class...

flurid_cube November 17 2008, 21:38:09 UTC
Pfft, the chat bots that make Jerk City have been huaghlguahglhuagling and calling each other faggots for almost 10 years.

But yeah it is pretty cool, especially when they start emoting to each other :3

You can see a lot of this advanced chatbot technology being used in spam bots, to the point where spambot comments on YouTube are actually becoming more coherent and topical than the real comments.
