You wanna talk about the BAD stuff?

Jul 07, 2011 19:37

Well, let's get nuts, because I get the distinct impression that my cheerful embrace of technology is sometimes perceived as stemming from a dismissal of the hazards of new technologies.

OH HOW WRONG ANY OF YOU THINKING THIS ARE.

Let's get started.


Comments 6

erelin July 8 2011, 04:02:40 UTC
Since you mentioned me specifically in the AI piece, I thought I'd bring up one interesting facet that makes 'clamping down' on AI a hard game.

Because sure, if you create a mind, to some degree you can control it. The real problem at that point becomes hackers. Having a really smart sentient AI that you can control? Really useful. But hacktivism could easily lead to efforts to 'uncage' it, which... yeah, that probably wouldn't work out so well for its previous controllers.


maskedretriever July 8 2011, 04:21:10 UTC
Yyyyyyyeah-- if Anonymous ever rescues an AI from military servitude, there is (to my thinking at least) a VERY high chance that it will then commit a vast act of military farce openly referential to the Terminator series, ending with, say, honking the President's nose.

One of my favorite bits of Future Problemology is the No Box Can Hold Me paradox, which states that any sufficiently advanced AI is advanced enough to convince you to let it out.


erelin July 8 2011, 07:45:09 UTC
Really, I'm not sure it would happen that way though. We've already shown that positive reinforcement works better than negative, and there is little reason to think that this wouldn't work for AI. I mean, if you really want your pet AI strategist to come up with great business/stock market/military strategies for you, and you don't want it to needlessly backfire on you when some hackers get to it, you give it an incentive for good work with the promise of more for more of it. (This is a key advantage of paid labor over slavery, particularly in skilled labor.)


erelin July 8 2011, 06:29:05 UTC
I get this mental image of PETA uncaging a tiger and getting mauled... Of course, there is a matter of how you controlled the AI, and what you did with it/to it... There are hardwired controls that you will please kindly obey, and then there are various forms of training, mentoring, and/or rearing to try... Of course, all of that may or may not go out the window when the hacktivist arrives, but it's a thought.

Of course, the hacktivist also may not be "PETA," just out to release it; that's bad enough, but there are other options. Much worse ones.

anonymous July 8 2011, 06:01:02 UTC
You forgot that the stultification of space travel (and dwindling resources) means we won't get a viable breeding population going on some other planet or colony, so we'll go extinct at the next major meteor strike (if nothing else gets us first), and we won't have sufficient warning to really do much about a moderate-sized one, either.

Or climate shifts leading to population displacement, and additional wars.

Oh, and there is a very serious risk of a nuclear war between India and Pakistan if either picks a fight. So, include nuclear winter in those calculations.

We IS gonna die!

XD

Also - one of the ads (AdChoices) on your blog is for a free key logger. As in, someone will get a free trial of having a company look at everything they type, with the option to pay for the service later, I assume. If that's accurate, it probably belongs in this list as a minor sign of doom...


maskedretriever July 8 2011, 14:54:35 UTC
Nuclear war already included, jeez.

LiveJournal handles the ads on this site and is solely responsible for their content, which goes for that goddamn Sim Hospital thing too.


