The technological singularity is a hypothetical event associated with the advent of genuine artificial general intelligence. Such a computer, computer network, or robot would theoretically be capable of recursive self-improvement (redesigning itself), or of autonomously designing and building computers or robots better than itself. Repetitions of this cycle would likely result in a runaway effect, where smart machines design successive generations of increasingly powerful machines, creating intelligence far exceeding human intellectual capacity and control. Because the capabilities of such a superintelligence may be impossible to comprehend, the technological singularity is the point beyond which events may become unfathomable to human intelligence.
Although technological progress has been accelerating, it has been limited by the basic intelligence of the human brain, which has not changed significantly for millennia. With the increasing power of computer technologies, it might eventually be possible to build a machine that is more intelligent than humanity. If a superhuman intelligence were to be invented, it might be able to bring to bear greater problem-solving and inventive skills than humans are capable of. It might then design an even more capable machine, which could in turn design a machine of still greater capability. This recursive self-improvement could accelerate, potentially allowing enormous qualitative change before any upper limits imposed by the laws of physics or theoretical computation are reached.
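The runaway loop described above can be sketched as a toy model: each generation builds a successor some fixed factor more capable, until a hypothetical physical ceiling is reached. The growth factor and ceiling here are illustrative assumptions, not figures from the text.

```python
def self_improvement(capability=1.0, gain=1.5, ceiling=1e6):
    """Toy model of recursive self-improvement.

    Each generation designs a successor `gain` times more capable
    than itself; the loop stops once capability reaches a
    hypothetical ceiling imposed by physics or computation.
    Returns the number of generations and the final capability.
    """
    generations = 0
    while capability < ceiling:
        capability *= gain
        generations += 1
    return generations, capability
```

Even with a modest 50% gain per generation, capability crosses a millionfold improvement in a few dozen generations, which is the intuition behind the "runaway effect."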
The exponential growth in computing technology suggested by Moore's law is commonly cited as a reason to expect a singularity in the relatively near future. Moore's law is the observation that the number of transistors in a dense integrated circuit doubles approximately every two years.
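The doubling behavior of Moore's law can be written as a simple projection. The baseline figures used here (the Intel 4004's roughly 2,300 transistors in 1971) are illustrative and not taken from the text above.

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project transistor count under Moore's law: the count doubles
    every `doubling_years` years from a chosen baseline chip."""
    return base_count * 2 ** ((year - base_year) / doubling_years)
```

For example, `transistors(1973)` returns twice the 1971 baseline, and extending the projection by fifty years yields growth by a factor of over thirty million, which illustrates why exponential hardware trends figure so prominently in singularity arguments.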