Ramez Naam, author of Nexus and Crux (two books I enjoyed and recommend), has recently put together a few guest posts for Charlie Stross (another author I love). The posts are The Singularity Is Further Than It Appears and Why AIs Won’t Ascend in the Blink of an Eye.
They’re both excellent posts, and I’d recommend reading them in full before continuing here.
I’d like to offer a slight rebuttal and explain why I think the singularity is still closer than it appears.
via William Hertling’s Thoughtstream: The Singularity is Still Closer than it Appears.
William Hertling has given a reply to Ramez Naam’s articles that I linked to this morning. I don’t know whether Naam will reply, but I have a short response of my own, which I think applies to the thinking of both to some extent.
A lot of singularitarians assume that Moore’s Law will proceed indefinitely. As I’ve written before, I think this is a questionable assumption. Moore’s Law is not an indefinite proposition, but an S curve.
I mentioned the S curve in another post earlier this week, but the main idea is that it represents a situation where progress is rapid for a while but eventually levels out. Whether we’re talking about market share, scientific knowledge, or any other rapidly growing quantity, that growth almost always reaches a point of depletion or saturation.
When you’re in the steep upslope of the S curve, it’s often very difficult to see when it will level off. Indeed, if it lasts for a long time, it’s very tempting to assume that it will never level off.
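As a rough illustration (my own sketch, not something from either post), a logistic function captures both halves of this story: sampled early, it is indistinguishable from an exponential, yet it eventually flattens out.

```python
import math

def logistic(t, ceiling=1.0, rate=1.0, midpoint=0.0):
    """Classic S curve: near-exponential early growth, then saturation."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Early in the curve, successive values grow by a nearly constant ratio,
# which is exactly what an exponential looks like from the inside.
early_ratio = logistic(-9) / logistic(-10)   # close to e (~2.7)

# Late in the curve, growth has all but stopped.
late_ratio = logistic(10) / logistic(9)      # barely above 1
```

The point of the sketch is that an observer measuring only the early ratios has no way to tell the S curve apart from a true exponential.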
In the case of Moore’s Law, that leveling off is likely to happen when processor technology butts up against the laws of physics. When will that be? I don’t know, but silicon technology is expected to hit those limits around 2020. Will quantum computing come to the rescue? Perhaps, but that’s a matter of faith among singularity thinkers rather than any kind of certainty. And even quantum computing will eventually reach its limit.
The question then is, within the laws of physics, how much can the human brain be improved upon? Almost certainly it can be improved dramatically, but it is a very large assumption that it can be improved upon to the many orders of magnitude implied by singularity thinking.
Remove that assumption, the assumption of limitless improvement in processing power and capacity, and the idea of a runaway singularity becomes much more of a long shot.
h/t John Blackman
6 thoughts on “Hertling says singularity closer than appears, and a brief comment”
I thought Moore’s Law was already different at this point, in that recent gains have been coming from better parallel processing as opposed to sticking more and more transistors on chips. Does it even really apply?
I think Moore’s law is still in play, and will remain so as long as we can continue to make things smaller. How small can we go? Eventually we’ll be down to the width of an atom. Parallel processing is important, but it’s actually happening now on the chip itself, with the number of cores increasing. Eventually a single chip may have thousands of cores.
I think it must be an S-curve because at some point you run into really hard limits like the amount of energy in the solar system and so on. The speed of light limits your ability to access other sources of energy and matter in a meaningful way.
But the apparent exponential increase could go on for a long time before any of those limits are reached. We might never have computers that are a billion times more powerful than those we have now, or we might not hit a limit for another 30 orders of magnitude. At this point in time, we can’t know. But if someone suggested that we can only increase by a few more orders of magnitude, then I think that would definitely be understating the potential.
Exponential progress doesn’t even have to depend on more computing power. Even if we are forever stuck with “just” human brains, exponential progress can still take us to the stars and beyond.
We’re down to tens of nanometers for component sizes. The width of an atom is measured in angstroms, or tenths of a nanometer. So we may be less than two orders of magnitude from the limit. As I said above, quantum computing might move us past that limit, but that may also just be wishful thinking.
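To put rough numbers on that (my own back-of-the-envelope figures, assuming a 14 nm process node and a silicon atomic diameter of about 2 angstroms):

```python
import math

# Approximate scales for the estimate above (illustrative values only).
feature_size_nm = 14.0     # a contemporary process node, in nanometers
silicon_atom_nm = 0.2      # silicon atomic diameter, roughly 2 angstroms

shrink_factor = feature_size_nm / silicon_atom_nm   # about 70x
orders_of_magnitude = math.log10(shrink_factor)     # about 1.85
```

A factor of about 70, i.e. a bit under two orders of magnitude of shrinkage left before features are a single atom wide.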
No exponential curve can continue indefinitely. The growth in available computer processing is no different.
But the current trend has been going on an awfully long time. In 1985, as a teenager, I remember reading about the release of Intel’s then-new 386 processor. Everyone heralded the new levels of performance but lamented that we were nearing the limits of what was possible and that the end of growth in processing speeds was just around the corner. In the decades since then, I’ve heard the refrain countless times. At this point, it’s easier to believe that growth will continue until it doesn’t, and then we can make some new estimates based on what we learn.
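The trend is easy to sanity-check with back-of-the-envelope arithmetic (my own approximate figures; the 386 shipped with roughly 275,000 transistors):

```python
# Rough Moore's-law projection from the 386 era (all figures approximate).
transistors_386 = 275_000          # Intel 386, 1985
years = 2015 - 1985
doublings = years / 2              # the canonical two-year doubling period
projected = transistors_386 * 2 ** doublings   # on the order of 9 billion
```

Three decades of two-year doublings lands in the billions of transistors, roughly the ballpark of today’s largest chips, which is why the end-is-near refrain has been so easy to dismiss so far.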
Certainly, it doesn’t matter what form the growth in available computing processing takes. Faster single-task performance? That’s slowed, but it’s still there. More cores? Awesome. We know the human brain is widely distributed; there’s no reason to believe that’s not a viable AI architecture. Aggregating multiple processors, going 3D, using new materials, quantum computing... there are still so many research avenues.
No arguments from me on any of that. But it pays to remember that past performance is no guarantee of future performance. I’ve been in computing for decades myself, but I’ve never heard as many knowledgeable people, including Gordon Moore himself, say that we’re approaching the limits within 10-20 years.
Within that limit, I’m comfortable we’ll find the power to build an artificial brain, or emulate a human one. But I’m far less confident that we’ll have the power for god-like AIs.