Dec 13, 2015 00:15
I'm reading Ray Kurzweil's The Singularity Is Near, and there's something I don't get about the theory of exponential growth, or even of exponentially growing exponential growth: physical and historical flukes.
This cycle of machine intelligence’s iteratively improving its own design will become faster and faster. This is in fact exactly what is predicted by the formula for continued acceleration of the rate of paradigm shift. One of the objections that has been raised to the continuation of the acceleration of paradigm shift is that it ultimately becomes much too fast for humans to follow, and so therefore, it’s argued, it cannot happen. However, the shift from biological to nonbiological intelligence will enable the trend to continue.
Kurzweil, Ray (2005-09-22). The Singularity Is Near: When Humans Transcend Biology (p. 28). Penguin Publishing Group. Kindle Edition.
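As I read it (my own gloss, not Kurzweil's notation), "exponentially growing exponential growth" just means the growth rate is itself growing exponentially, which gives a double exponential rather than a plain one:

$$\frac{dC}{dt} = r(t)\,C, \qquad r(t) = r_0 e^{bt} \quad\Longrightarrow\quad C(t) = C_0 \exp\!\left(\tfrac{r_0}{b}\left(e^{bt} - 1\right)\right)$$

That's the shape of the curve the book keeps pointing at; my question is about what happens to it when the world interferes.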
So what I don't get is that this traces trends at a very, very high level, which seems to ignore the effects of chaos at the individual level, even when that chaos is significant. In the aggregate, advancement was slow long ago and is much faster now. Part of that is individual processing (how fast someone can think and how much knowledge they have access to), but part of it is also communication between individuals, especially in an era of distributed and federated intelligence. I seem smarter, am smarter, when I have access to the internet. Without it, I would be hard-pressed to build a shack to live in or make clothes for my body. Without the larger distributed intelligence of civilization, I'd have a real hard time with the clothes problem.
So where that comes back to singularity theory is that the theorizing assumes that once something has happened somewhere, it has happened everywhere: once it's invented, the whole race has advanced. This ignores that technological advancements took decades or even centuries to spread in the past, and that historical events, flukes, could erase those advancements entirely or set them back for a long time.
So as advancement within the individual becomes much, much faster, the cost of losing that individual, and their particular intellectual wandering, becomes that much greater. What if the Singularity met a natural disaster? What if communities pursuing substantially different lines of inquiry were light-years apart with no communication? What if nanobots weren't accessible to a particular population? Couldn't that disrupt this exponentially exponential progress pretty substantially, or at least slow it down?
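Here's a toy sketch of what I mean, in Python. Everything in it is made up by me for illustration (the rates, the setback sizes, the threshold); it just models a capability whose growth rate itself accelerates, lets occasional flukes erase a chunk of the accumulated progress, and counts how long it takes to cross some arbitrary "Singularity" threshold.

```python
import random

def years_to_threshold(setback_prob=0.0, setback_frac=0.5,
                       threshold=1e12, seed=None):
    """Toy model: capability grows at a rate that itself accelerates
    (Kurzweil-style 'exponential growth of exponential growth'),
    but each year a fluke may erase a fraction of the progress.
    Returns the number of years until capability crosses the threshold.
    All of the numbers here are arbitrary, for illustration only."""
    rng = random.Random(seed)
    capability = 1.0
    rate = 1.01          # annual growth factor
    rate_growth = 1.005  # the growth factor itself grows each year
    year = 0
    while capability < threshold:
        capability *= rate
        rate *= rate_growth
        if rng.random() < setback_prob:
            # a disaster, an isolated community, a lost line of inquiry
            capability *= (1.0 - setback_frac)
        year += 1
    return year

if __name__ == "__main__":
    print("no flukes:        ", years_to_threshold(setback_prob=0.0, seed=1), "years")
    print("occasional flukes:", years_to_threshold(setback_prob=0.05, seed=1), "years")
    print("frequent flukes:  ", years_to_threshold(setback_prob=0.25, seed=1), "years")
```

The curve still goes vertical eventually in every run, but when it does shifts a lot depending on how many flukes hit and when they land, which is exactly the kind of slippage I don't see accounted for.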
singularity,
emergence,
mnemoscene