n_huffhines
What's it gonna cost?
Has anybody ever heard of this? I don't really know what forum it belongs in. To some people it's kind of a technology-based religion, but the concepts have political implications.
It's pretty heavy stuff. Kind of weird to wrap your mind around. One of many implications is that individuals will have near-omniscience through technology. Do other people feel like this sort of thing is inevitable?
Technological singularity - Wikipedia, the free encyclopedia
Kurzweil writes that, due to paradigm shifts, a trend of exponential growth extends Moore's law from integrated circuits to earlier transistors, vacuum tubes, relays, and electromechanical computers. He predicts that the exponential growth will continue, and that in a few decades the computing power of all computers will exceed that of human brains, with superhuman artificial intelligence appearing around the same time.
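The quoted passage is making a quantitative claim: if growth in computing power stays exponential, the gap to brain-scale computation closes in a predictable number of doublings. A toy extrapolation makes the arithmetic concrete (the doubling period and the operations-per-second figures below are illustrative assumptions, not numbers from Kurzweil):

```python
import math

def years_until(target_ops, current_ops, doubling_years=2.0):
    """Years for exponential growth to carry current_ops up to target_ops."""
    doublings = math.log2(target_ops / current_ops)  # how many doublings are needed
    return doublings * doubling_years

# Assume ~1e15 ops/s available today and ~1e16 ops/s as a rough
# brain-equivalent estimate (both hypothetical placeholders).
print(round(years_until(1e16, 1e15), 1))  # one order of magnitude ≈ 3.3 doublings
```

The point of the sketch is only that exponential trends turn large absolute gaps into small numbers of doublings, which is why such forecasts land "in a few decades" rather than centuries.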
Many of the most recognized writers on the singularity, such as Vernor Vinge and Ray Kurzweil, define the concept in terms of the technological creation of superintelligence, and argue that it is difficult or impossible for present-day humans to predict what a post-singularity world would be like, due to the difficulty of imagining the intentions and capabilities of superintelligent entities.
A technological singularity includes the concept of an intelligence explosion, a term coined in 1965 by I. J. Good.[8] Although technological progress has been accelerating, it has been limited by the basic intelligence of the human brain, which has not, according to Paul R. Ehrlich, changed significantly for millennia.[9] However, with the increasing power of computers and other technologies, it might eventually be possible to build a machine that is more intelligent than humanity.[10] If a superhuman intelligence were invented, either through the amplification of human intelligence or through artificial intelligence, it would bring to bear greater problem-solving and inventive skills than humans possess; it could then design a yet more capable machine, or rewrite its own source code to become more intelligent. That more capable machine could in turn design a machine of even greater capability. These iterations could accelerate, leading to recursive self-improvement, potentially allowing enormous qualitative change before any upper limits imposed by the laws of physics or theoretical computation set in.
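The feedback loop described above can be sketched in a few lines: each machine designs a successor whose capability depends on its own. The constant gain factor is a deliberately crude assumption (real returns could diminish or compound), but it shows why the loop is geometric rather than linear:

```python
def intelligence_explosion(start=1.0, gain=1.5, generations=10):
    """Each generation designs a successor `gain` times as capable.

    `start`, `gain`, and `generations` are illustrative parameters,
    not estimates of anything real.
    """
    history = [start]
    capability = start
    for _ in range(generations):
        capability *= gain          # successor designed by the current machine
        history.append(capability)  # record the trajectory
    return history

trajectory = intelligence_explosion()
# Growth is geometric: capability after n generations is start * gain**n,
# until limits from physics or computation (not modeled here) bind.
```

Note that with any gain factor below 1.0 the same loop fizzles out instead of exploding, which is one way skeptics frame the counterargument.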