Ways the singularity could fail to happen - the poll

Aug 22, 2012 12:26

Coo, free accounts can do LJ polls now! Following on from this post, a poll.

I often find that discussion of the idea of superintelligence quickly devolves into a less interesting discussion about how people feel about those who talk about superintelligence. I'm interested to know what you think will happen.

Read carefully: three important caveats ( Read more... )


Comments 59

zotz August 22 2012, 11:40:14 UTC
A yes/no poll doesn't really cover my position, which is that the statements made are more unascertained than actually wrong: we don't know enough to properly formulate hypotheses.

The first one is largely true, but not in the sense I think you mean - biological systems are uglier and messier. This doesn't stop us understanding them, but one of the reasons I have difficulty with discussions of creating human(+?) level intelligences is that the topic tends to be treated as if this isn't the case.

The second and third may well be true. We don't know. As far as I can tell, we currently have no way of knowing based on the evidence we have, so having a position is premature.

None of the above? Well, there are certainly other reasons to disbelieve any specific predictions I've come across.


ciphergoth August 22 2012, 12:08:39 UTC
Does de Finetti's way of giving such probabilistic questions meaning help? We don't know who will win the US elections - but if you're offered a choice now between winning a desirable prize if Obama wins, or the same prize on the same day if a coin comes up heads, which do you choose?
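A minimal sketch of how de Finetti's device turns "how convinced are you?" into a number: repeat the choice against reference coins of different biases and narrow down the point of indifference. This is only an illustration under assumptions - the ask_preference callback and the 0.62 figure are hypothetical stand-ins for a person answering each question.

```python
# Sketch of de Finetti-style probability elicitation: repeatedly ask which
# gamble is preferred -- "prize if the event happens" vs "prize if a coin
# with heads-probability p lands heads" -- and binary-search p until the
# respondent is roughly indifferent.

def elicit_probability(ask_preference, tolerance=0.01):
    lo, hi = 0.0, 1.0
    while hi - lo > tolerance:
        p = (lo + hi) / 2
        if ask_preference(p):
            lo = p  # prefers the event gamble: event judged more likely than p
        else:
            hi = p  # prefers the coin gamble: event judged less likely than p
    return (lo + hi) / 2

# Hypothetical respondent who acts as if P(event) = 0.62.
hidden_belief = 0.62
estimate = elicit_probability(lambda p: hidden_belief > p)
print(f"elicited probability ~ {estimate:.2f}")
```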


zotz August 22 2012, 12:16:10 UTC
Yes, but how far down that road do we go before it's just a version of Pascal's wager? We actually know that there's a sizable chance that Obama will win, and there are statistical tools for estimating it. If it's just a way of quantifying opinion, then it only tells us how convinced we are, not how good our reasons are.


ciphergoth August 22 2012, 12:27:57 UTC
It's how convinced you are - your subjective estimates - that I'm interested in knowing.



wight1984 August 22 2012, 12:09:20 UTC
"Human minds are fundamentally different to other physical things, and not subject to thinking about like an engineer"

The first claim is probably partially true (at least to the extent that human minds are probably very different to any kind of physical system/computer that we're used to working with), but the second claim seems more dubious.

Given that the second claim is pretty essential to it being a 'way the singularity could fail to happen', I select that option :o)


ciphergoth August 22 2012, 12:14:13 UTC
It's difficult to phrase that one. We apply our engineering understanding to all sorts of other evolved mechanisms, so I think that if you disbelieve in souls and accept materialism, there shouldn't be much room to tick it, but it depends on how you interpret the question. Better ways of putting it welcome!


pozorvlak August 22 2012, 12:21:48 UTC
I think the meaning of "engineering" is shifting in response to our increased understanding of and power over biological systems. But this understanding is still at nothing like the level of our understanding of, say, thermodynamics. So while I think that present day engineering culture isn't enough to get us there, it's not impossible that it will evolve into a culture that's better able to attack such problems.



pozorvlak August 22 2012, 12:25:06 UTC
Human minds are within a few orders of magnitude of the most efficient minds possible in principle in our corner of the Universe

"A few orders of magnitude" is still rather large - an AI that's 100x as intelligent as the average human (whatever that means, precisely) may not be enough for us to achieve techno-Nirvana, but it would still have a huge transformative effect on society.


reddragdiva August 22 2012, 12:42:49 UTC
For comparison, in present-day computing, and engineering in general, speed up any process 10x and it isn't the same process any more. (Think how big a town you get when people drive at 30mph rather than walking at 3mph; how much more often you try stuff out with a Photoshop filter that takes 30 seconds rather than one that takes 5 minutes.)
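A rough back-of-the-envelope on the town-size point, purely illustrative (the half-hour one-way trip is an assumed figure, not from the comment): reachable area grows with the square of speed, so a 10x speedup is roughly a 100x change in what's within reach.

```python
# Back-of-the-envelope: how much more ground a 10x speedup puts within reach,
# assuming (hypothetically) people will tolerate a half-hour one-way trip.
trip_hours = 0.5
walk_mph, drive_mph = 3, 30

walk_radius = walk_mph * trip_hours    # 1.5 miles on foot
drive_radius = drive_mph * trip_hours  # 15 miles by car

# Reachable area scales with radius squared: 10x speed -> ~100x area.
area_ratio = (drive_radius / walk_radius) ** 2
print(f"walk: {walk_radius} mi, drive: {drive_radius} mi, "
      f"area ratio: {area_ratio:.0f}x")
```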


steer August 22 2012, 13:49:45 UTC
"in present-day computing, and engineering in general, speed up any process 10x and it isn't the same process any more"

I'm not too convinced by this claim -- particularly in my field of network engineering. The computer I have is ten times faster than the one I replaced. I notice little if any difference -- indeed I use it very similarly to the one which was 100 times slower with 1/100th of the HD space. Basic network substrates (such as layer 3) have remained robust to many orders of magnitude of change. One of my colleagues recently posited that perhaps the most impressive thing about the internet as a machine is that it works in a very similar way, many orders of magnitude later -- it seems almost impervious. Especially at layers 3 and 4 we've seen nearly no changes for an age ( ... )


reddragdiva August 25 2012, 20:49:02 UTC
Yeah, yeah, some processes then :-)



steer August 22 2012, 13:38:53 UTC
Human minds are fundamentally different to other physical things,

Ticking yes to this is slightly misleading. I suspect that there is something about minds we really do not yet know, and that this is the reason our fundamental understanding of consciousness is not greatly different to what it was thousands of years ago (we know a lot more details, but little about the really important question: "what process generates consciousness"). If this is discovered then it may be that minds would be capable of being approached with engineering-like thinking. Of course I may be wholly wrong and something more Hofstadter-like is enough of an explanation. At the moment, though, I suspect that an engineering explanation of the mind is going to be like a pre-atomic-theory explanation of emission spectra.



zwol August 22 2012, 16:50:08 UTC

My choices, with rationales:
The idea of one mind being greatly more efficient than another isn't meaningful.

I wouldn't have phrased it this way, but this is the closest option to my educated opinion on the meta-result of the past 60 years of research into computing and cognition. The way I would put it is: we have no reason to think it is appropriate to model whatever-it-is-the-brain-does as a Turing machine. I emphasize model because there's plenty of reason to think that a universal Turing machine could in principle simulate a brain… but there's also plenty of reason to think that we wouldn't learn anything from such a simulation. It follows from this assertion that everything we know about making computers more efficient is not necessarily applicable to making minds more efficient, and indeed that there may not be any meaningful "more efficient" for minds.

Human minds are within a few orders of magnitude of the most efficient minds possible in principle in our corner of the Universe.

Supposing that I'm wrong about the above, the ( ... )



