The Singularity – IEEE Spectrum Special Report



Tech Luminaries Address Singularity

This is an excerpt from IEEE Spectrum’s SPECIAL REPORT: THE SINGULARITY

Douglas Hofstadter

PHOTO: Chris Meyer/Indiana University
WHO HE IS Pioneer in computer modeling of mental processes; director of the Center for Research on Concepts and Cognition at Indiana University, Bloomington; winner of the 1980 Pulitzer Prize for general nonfiction.

SINGULARITY WILL OCCUR Someday in the distant future



THOUGHTS “It might happen someday, but I think life and intelligence are far more complex than the current singularitarians seem to believe, so I doubt it will happen in the next couple of centuries. [The ramifications] will be enormous, since the highest form of sentient beings on the planet will no longer be human. Perhaps these machines–our ‘children’–will be vaguely like us and will have culture similar to ours, but most likely not. In that case, we humans may well go the way of the dinosaurs.”

Jeff Hawkins

PHOTO: Numenta
WHO HE IS Cofounder of Numenta, in Menlo Park, Calif., a company developing a computer memory system based on the human neocortex. Also founded Palm Computing, Handspring, and the Redwood Center for Theoretical Neuroscience. Considered the father of handheld computing.

SINGULARITY WILL OCCUR “If you define the singularity as a point in time when intelligent machines are designing intelligent machines in such a way that machines get extremely intelligent in a short period of time–an exponential increase in intelligence–then it will never happen. Intelligence is largely defined by experience and training, not just by brain size or algorithms. It isn’t a matter of writing software. Intelligent machines, like humans, will need to be trained in particular domains of expertise. This takes time and deliberate attention to the kind of knowledge you want the machine to have.”

MACHINE CONSCIOUSNESS WILL OCCUR “Machines will understand the world using the same methods humans do; they will be creative. Some will be self-aware, they will communicate via language, and humans will recognize that machines have these qualities. Machines will not be like humans in all aspects, emotionally or physically. If you think dogs and other mammals are conscious, then you will probably think some machines are conscious. If you think consciousness is a purely human phenomenon, then you won’t think machines are conscious.”

THOUGHTS “I don’t like the term ‘singularity’ when applied to technology. A singularity is a state where physical laws no longer apply because some value or metric goes to infinity, such as the curvature of space-time at the center of a black hole. No one can predict what happens at a singularity. There are no examples of singularities in biology or technology that I know of. Even if humans created a new virus, biological or otherwise, that rapidly killed all life on Earth, it wouldn’t be a singularity–very unfortunate, yes, but not a singularity.

“The term ‘singularity’ applied to intelligent machines refers to the idea that when intelligent machines can design intelligent machines smarter than themselves, it will cause an exponential growth in machine intelligence leading to a singularity of infinite (or at least extremely large) intelligence. Belief in this idea is based on a naive understanding of what intelligence is. As an analogy, imagine we had a computer that could design new computers (chips, systems, and software) faster than itself. Would such a computer lead to infinitely fast computers or even computers that were faster than anything humans could ever build? No. It might accelerate the rate of improvements for a while, but in the end there are limits to how big and fast computers can run. We would end up in the same place; we’d just get there a bit faster. There would be no singularity.

“Exponential growth requires the exponential consumption of resources (matter, energy, and time), and there are always limits to this. Why should we think intelligent machines would be different? We will build machines that are more ‘intelligent’ than humans, and this might happen quickly, but there will be no singularity, no runaway growth in intelligence. There will be no single godlike intelligent machine. Like today’s computers, intelligent machines will come in many shapes and sizes and be applied to many different types of problems.

“Intelligent machines need not be anything like humans, emotionally and physically. An extremely intelligent machine need not have any of the emotions a human has, unless we go out of our way to make it so. No intelligent machine will ‘wake up’ one day and say ‘I think I will enslave my creators.’ Similar fears were expressed when the steam engine was invented. It won’t happen. The age of intelligent machines is starting. Like all previous technical revolutions, it will accelerate as more and more people work on it and as the technology improves. There will be no singularity or point in time where the technology itself runs away from us.”

John Casti

PHOTO: Juan Esteves
WHO HE IS Senior Research Scholar at the International Institute for Applied Systems Analysis, in Laxenburg, Austria, and cofounder of the Kenos Circle, a Vienna-based society for exploration of the future. Builds computer simulations of complex human systems, like the stock market, highway traffic, and the insurance industry. Author of popular books about science, both fiction and nonfiction, including The Cambridge Quintet, a fictional account of a dinner-party conversation about the creation of a thinking machine.



MOORE’S LAW WILL CONTINUE FOR 20 more years with current technology

THOUGHTS “I think it’s scientifically and philosophically on sound footing. The only real issue for me is the time frame over which the singularity will unfold. [The singularity represents] the end of the supremacy of Homo sapiens as the dominant species on planet Earth. At that point a new species appears, and humans and machines will go their separate ways, not merge one with the other. I do not believe this necessarily implies a malevolent machine takeover; rather, machines will become increasingly uninterested in human affairs just as we are uninterested in the affairs of ants or bees. But it’s more likely than not in my view that the two species will comfortably and more or less peacefully coexist–unless human interests start to interfere with those of the machines.”

T.J. Rodgers

PHOTO: Cypress Semiconductor
WHO HE IS Founder and CEO of Cypress Semiconductor Corp., in San Jose, Calif., known for his brash opinions about the business world and politics. Owner of the Clos de la Tech winery and vineyards, in California, where he’s trying to make the best American pinot noir.


THOUGHTS “I don’t believe in technological singularities. It’s like extraterrestrial life–if it were there, we would have seen it by now (there are actually rigorous papers on that point of view). However, I do believe in something that is more powerful because it is real–namely, exponential learning. An exponential function has the property that its slope is proportional to its value. The more we know, the faster we can learn. High school students today quickly learn the mathematical tool of calculus that Newton struggled to invent.

“Technological transitions are required to maintain an exponential rate of learning. The first airplanes were certainly not as good as well-appointed trains in moving masses comfortably, but the transition later proved essential to maintaining our progress in human mobility. Gene splicing is a breakthrough technology, but it has not yet done (or been allowed to do) a lot for mankind. That will change in the future.

“I don’t believe in the good old days. We live longer and better than our predecessors did–and that trend will continue in the future. We will also be freer, better educated, and even smarter in the future–but exponentially so, not as a result of some singularity.”
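Rodgers’s defining property of the exponential, that its slope is proportional to its value, can be checked numerically. A minimal sketch; the growth constant k and the sample points are arbitrary values chosen only for illustration:

```python
import math

# Rodgers's property: for f(x) = e^(k*x), the derivative is f'(x) = k * f(x),
# so slope divided by value is the same constant k everywhere on the curve.
k = 0.3

def f(x):
    return math.exp(k * x)

def slope(x, h=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

for x in (0.0, 1.0, 5.0):
    ratio = slope(x) / f(x)  # ~k at every point
    assert abs(ratio - k) < 1e-6
```

The same check fails for any non-exponential; for f(x) = x**2, for instance, slope/value is 2/x and varies with x, which is exactly what distinguishes exponential learning in Rodgers’s sense.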

Eric Hahn

PHOTO: Timothy Archibald
WHO HE IS Serial entrepreneur and early-stage investor who founded Collabra Software (sold to Netscape) and Lookout Software (sold to Microsoft) and backed Red Hat, Loudcloud, and Zimbra. CTO of Netscape during the browser wars.


MACHINE CONSCIOUSNESS WILL OCCUR “Yes, in that they eventually pass the Turing Test for ‘Is it thinking?’”


THOUGHTS “I think that machine intelligence is one of the most exciting ‘great problems’ left in computer science. For all its promise, however, it pales compared with the advances we could make in the next few decades in improving the health and education of the existing human intelligences already on the planet. I believe the first thing a tabula rasa intelligence (machine or otherwise) would conclude is that humans are very poor stewards of their own condition.

“[The ramifications will be] less than is often contemplated. I think they will be more along the lines of what happened during the prior ‘revolutions’ (agricultural, industrial, information age, etc.), that is, incremental, albeit dramatic, changes to humanity. I’m not worried about The Matrix or The Day the Earth Stood Still. But I do hope the new intelligence doesn’t run Windows.”

Gordon Bell

PHOTO: Microsoft
WHO HE IS Principal researcher at Microsoft Research, Silicon Valley. Led the development of or helped design a long list of time-share computers and minicomputers at Digital Equipment Corp., including the PDP-6 and the VAX. A founder of Encore Computer; Ardent Computer; the Computer Museum, in Boston; and the Computer History Museum, in Mountain View, Calif.

SINGULARITY WILL OCCUR Someday in the distant future



THOUGHTS “Singularity is that point in time when computing is able to know all human and natural-systems knowledge and exceed it in problem-solving capability, with a diminished need for humankind as we know it. I basically support the notion, but I have trouble seeing the specific transitions or break points that let the exponential take over and move to the next transition. [If it does occur,] there’ll be a hierarchy of machines versus having a separate race. [But] it is unlikely to happen, because the population will destroy itself before the technological singularity.”

Steven Pinker

PHOTO: Rebecca Goldstein
WHO HE IS Professor of psychology at Harvard; previously taught in the Department of Brain and Cognitive Sciences at MIT, with much of his research addressing language development. Writes best sellers about the way the brain works, like The Blank Slate (2002) and The Stuff of Thought (2007).


MACHINE CONSCIOUSNESS WILL OCCUR “In one sense–information routing–they already have. In the other sense–first-person experience–we’ll never know.”


THOUGHTS “There is not the slightest reason to believe in a coming singularity. The fact that you can visualize a future in your imagination is not evidence that it is likely or even possible. Look at domed cities, jet-pack commuting, underwater cities, mile-high buildings, and nuclear-powered automobiles–all staples of futuristic fantasies when I was a child that have never arrived. Sheer processing power is not a pixie dust that magically solves all your problems.”

Gordon E. Moore

PHOTO: joSon
WHO HE IS Cofounder and chairman emeritus of Intel Corp., cofounder of Fairchild Semiconductor, winner of the 2008 IEEE Medal of Honor, chairman of the board of the Gordon and Betty Moore Foundation. Made the prediction about the increasing number of components on a semiconductor chip that came to be known as Moore’s Law.


THOUGHTS “I am a skeptic. I don’t believe this kind of thing is likely to happen, at least for a long time. And I don’t know why I feel that way. The development of humans, what evolution has come up with, involves a lot more than just the intellectual capability. You can manipulate your fingers and other parts of your body. I don’t see how machines are going to overcome that overall gap, to reach that level of complexity, even if we get them so they’re intellectually more capable than humans.”

Jim Fruchterman

PHOTO: Michael Callopy/The Skoll Foundation
WHO HE IS Founder and CEO of the Benetech Initiative, in Palo Alto, Calif., one of the first companies to focus on social entrepreneurship. Former rocket scientist and optical-character-recognition pioneer. Winner of a 2006 MacArthur Fellowship, the so-called genius grant.




THOUGHTS “I believe the singularity theory is plausible in that there will be a major shift in the rate of technology change. I am less convinced by projections of what it will mean to humans and humanity, such as human downloading in our lifetimes.

“Two things that rarely come up are the bug and algorithm questions. As Patrick Ball, Benetech’s chief scientist, has pointed out to me, Douglas Hofstadter has more or less proved that perfect programs are not practically possible. And algorithms don’t scale as nicely as processing power does: n log(n) is not our friend. As Patrick said: a Linux system that needs rebooting only every three years is a modern technological marvel. But do you want to reboot your brain regularly?

“I think that futurists are much more successful in projecting simple measures of progress (such as Moore’s Law) than they are in projecting changes in human society and experience.”
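Fruchterman’s scaling remark can be made concrete: an n log(n) algorithm does more than twice the work when its input doubles, so doubling processing power never quite keeps pace with doubling problem size. A small sketch; the input sizes are arbitrary:

```python
import math

# Work growing as n*log2(n) outpaces linear growth in capacity:
# doubling n multiplies the work by 2 + 2/log2(n), which is always
# greater than 2 for any n > 1.
def nlogn_work(n):
    return n * math.log2(n)

for n in (1_000, 1_000_000, 1_000_000_000):
    growth = nlogn_work(2 * n) / nlogn_work(n)
    assert growth > 2  # doubling the input more than doubles the work
```

The gap narrows as n grows (the extra factor is 2/log2(n)) but never closes, which is the sense in which n log(n) "is not our friend."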

Esther Dyson

PHOTO: Rick Smolan
WHO SHE IS Commentator and evangelist for emerging technologies, investor and board member for start-ups; currently focused on health care, genetics, private aviation, and commercial space. Ran PC Forum conference until 2007; currently hosts the annual Flight School conference.

THOUGHTS “The singularity I’m interested in will come from biology rather than machines. We won’t be building things; we’ll be growing and cultivating them, and then they will grow on their own.”

