Technological Singularity

#1

n_huffhines

What's it gonna cost?
Joined
Mar 11, 2009
Messages
91,631
Likes
55,771
Has anybody ever heard of this? I don't really know what forum it belongs in. To some people it's kind of a technology-based religion, but the concepts have political implications.

It's pretty heavy stuff. Kind of weird to wrap your mind around. One of many implications is that individuals will have near-omniscience through technology. Do other people feel like this sort of thing is inevitable?

Kurzweil writes that, due to paradigm shifts, a trend of exponential growth extends Moore's law from integrated circuits to earlier transistors, vacuum tubes, relays, and electromechanical computers. He predicts that the exponential growth will continue, and that in a few decades the computing power of all computers will exceed that of human brains, with superhuman artificial intelligence appearing around the same time.
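The extrapolation above is easy to sanity-check with a few lines of arithmetic. This is an illustrative sketch only: the doubling period, the starting figure, and the target are assumed round numbers, not Kurzweil's actual data.

```python
import math

# Illustrative only: how long an exponential trend takes to close a gap.
# The 2-year doubling period and the operation counts are assumptions
# chosen for round numbers, not measured values.

def years_until(target_ops, current_ops, doubling_years=2.0):
    """Years until capability reaches target, doubling every doubling_years."""
    doublings = math.log2(target_ops / current_ops)
    return doublings * doubling_years

# A billion-fold gap (1e7 -> 1e16 ops/sec) closes in about 60 years
# at a steady 2-year doubling time -- "a few decades," as claimed.
```

The point of the exercise is only that a steady doubling time turns astronomical-looking gaps into human-scale timespans, which is the whole intuition behind the prediction.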

Many of the most recognized writers on the singularity, such as Vernor Vinge and Ray Kurzweil, define the concept in terms of the technological creation of superintelligence, and argue that it is difficult or impossible for present-day humans to predict what a post-singularity world would be like, due to the difficulty of imagining the intentions and capabilities of superintelligent entities.

A technological singularity includes the concept of an intelligence explosion, a term coined in 1965 by I. J. Good.[8] Although technological progress has been accelerating, it has been limited by the basic intelligence of the human brain, which has not, according to Paul R. Ehrlich, changed significantly for millennia.[9] However, with the increasing power of computers and other technologies, it might eventually be possible to build a machine that is more intelligent than humanity.[10] If a superhuman intelligence were invented, whether through the amplification of human intelligence or through artificial intelligence, it would bring to bear greater problem-solving and inventive skills than humans can, and could then design a yet more capable machine, or re-write its own source code to become more intelligent. That more capable machine could in turn design a machine of even greater capability. These iterations could accelerate, leading to recursive self-improvement, potentially allowing enormous qualitative change before any upper limits imposed by the laws of physics or theoretical computation set in.
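The intelligence-explosion loop described above can be caricatured in a few lines. This is a toy model under invented assumptions (the starting capability, the gain factor, and the rule that better designers improve faster are all made up for illustration); it shows only the shape of the argument, not a prediction.

```python
# Toy model of recursive self-improvement: each generation builds a
# successor whose capability is multiplied by a gain factor, and the
# gain factor itself creeps upward because a smarter designer designs
# better. All numbers here are invented for illustration.

def generations_to_exceed(threshold, capability=1.0, gain=1.1):
    """Count design generations until capability crosses threshold."""
    generations = 0
    while capability < threshold:
        capability *= gain
        gain *= 1.01  # the accelerating part of the argument
        generations += 1
    return generations

# With a fixed 10% gain, a 1000x improvement would take 73 generations;
# letting the gain itself grow gets there sooner.
```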

Technological singularity - Wikipedia, the free encyclopedia
 
#2
Damn it! Can somebody please move this to politics? Maybe once in my life I'll get this right.
 
#3
I saw a show on Discovery (or a similar channel) about Kurzweil and his ideas. The guy came across as a complete looney tune. He's obsessed with living forever. He couldn't understand why anyone wouldn't want to live forever, and he plans on making a DNA clone of his deceased father, complete with (somehow) implanted memories. I came away thinking that the guy's crazy. Sounded like he's read too many science fiction books.

As for AI, I know it's a popular concept, but does it strike anyone else as counterintuitive that one entity could create something more intelligent than itself? That doesn't seem possible to me. Just reading through what you posted above, it sounds like Kurzweil thinks superhuman AI will just "appear." Who exactly will program it?
 
#4
I saw a show on Discovery (or a similar channel) about Kurzweil and his ideas. The guy came across as a complete looney tune. He's obsessed with living forever. He couldn't understand why anyone wouldn't want to live forever, and he plans on making a DNA clone of his deceased father, complete with (somehow) implanted memories. I came away thinking that the guy's crazy. Sounded like he's read too many science fiction books.

As for AI, I know it's a popular concept, but does it strike anyone else as counterintuitive that one entity could create something more intelligent than itself? That doesn't seem possible to me. Just reading through what you posted above, it sounds like Kurzweil thinks superhuman AI will just "appear." Who exactly will program it?

Clu.

Singularity isn't defined by what Kurzweil says. It's a pretty broad concept that crazies take in a lot of different directions. I still find it interesting, and I think to some degree it's inevitable. If we believe we can't create technology smarter than ourselves, then we are putting a limit on technology. I think the possibilities are infinite. Not sure how it will happen. I'll leave it to the geniuses of the world.
 
#5
i think kurzweil put it best when he said that people change their opinions about living longer as they get closer to when they suppose they'll die. he also makes a good point that fears of technology have been around forever but humans adapt pretty easily once the technology is mainstream. some of the stuff he talks about, like humans having computers inside their bodies alongside cells, is pretty freaky to think about.
 
#6
I think most people don't understand Kurzweil. I'm not saying they're dumb, but they just aren't really critically focusing on what he is saying.

My only objection is that Moore's Law may reach limitations that we haven't foreseen that would alter the whole scenario.
 
#7
i think kurzweil put it best when he said that people change their opinions about living longer as they get closer to when they suppose they'll die. he also makes a good point that fears of technology have been around forever but humans adapt pretty easily once the technology is mainstream. some of the stuff he talks about, like humans having computers inside their bodies alongside cells, is pretty freaky to think about.

I could deal with near-omniscience. Doesn't freak me out at all.
 
#8
I think it's unwise to discount neuroplasticity as a means to at least keep pace with technological advances. The idea of a super-computer 'outsmarting' humans is interesting but I don't think it's feasible. Then again, never trust a HAL 9000.
 
#9
I think it's unwise to discount neuroplasticity as a means to at least keep pace with technological advances. The idea of a super-computer 'outsmarting' humans is interesting but I don't think it's feasible. Then again, never trust a HAL 9000.

Once I became aware of the singularity I started to recognize the frequency of the theme in cinema. It's usually accompanied by horrible, humanity-threatening consequences. Like Jokerman already referenced, throughout history people have resisted technology out of fear. India banned sewing machines, for crying out loud.
 
#10
Once I became aware of the singularity I started to recognize the frequency of the theme in cinema. It's usually accompanied by horrible, humanity-threatening consequences. Like Jokerman already referenced, throughout history people have resisted technology out of fear. India banned sewing machines, for crying out loud.

It's a reference to the Beastie Boys, used as a tool for lightening the mood. I don't resist technology, I just recognize its limitations. That is, the person creating/programming it.
 
#12
It's a reference to the Beastie Boys, used as a tool for lightening the mood. I don't resist technology, I just recognize its limitations. That is, the person creating/programming it.

HAL 9000 was a Beastie Boys reference? I thought you were talking about HAL from 2001: A Space Odyssey (and that's probably what the Beastie Boys were "illing" about). I knew that you were joking. I wasn't trying to counter you, I was going along with your theme.
 
#13
HAL 9000 was a Beastie Boys reference? I thought you were talking about HAL from 2001: A Space Odyssey (and that's probably what the Beastie Boys were "illing" about). I knew that you were joking. I wasn't trying to counter you, I was going along with your theme.

Cause when he's out in space carousing;
Pick up a mic and start joustin'.
My name's great: Medallion;
Says never trust a HAL 9000.
 
#14
I think my laptop is already smarter than most people on this site. Have you been to the Football Forum recently?
 
#15
I think my laptop is already smarter than most people on this site. Have you been to the Football Forum recently?

I personally avoid it during the off season if possible. I stick with the recruiting, but there's a fair share of nuts there too... Has VN always been like this?
Posted via VolNation Mobile
 
#16
I personally avoid it during the off season if possible. I stick with the recruiting, but there's a fair share of nuts there too... Has VN always been like this?

You've been here longer than I. But, I think it used to be a fairly reasonable place back in '03 - '08, or so. The Fulmer/Kiffin, ahem, changes, New Year's Day arrests, and Pearl debacle have ratcheted up the level of riff raff rampant here.

But, I'm no historian. I think I'll make a thread in The Pub about it tomorrow and try to get some answers. I'm curious, also.

Could get good. Stay tuned. Might even start it tonight.
 
#17
The football forum gave me anxiety about the '10 season so I've avoided it this offseason. Fans are sick of losing and turn on each other. Also it's no fun to talk about which players got arrested, booted, or quit.
 
#18
Great idea, pooch. I'm very intrigued by the history of our beloved VN. Can't wait for the thread.
 
#19
Once I became aware of the singularity I started to recognize the frequency of the theme in cinema. It's usually accompanied by horrible, humanity-threatening consequences. Like Jokerman already referenced, throughout history people have resisted technology out of fear. India banned sewing machines, for crying out loud.

Is the birth of AI synonymous with the singularity? AI references are at the heart of much sci-fi. 2001, Battlestar Galactica, The Matrix, I, Robot (with Will Smith) - the list could go on and on. And they all take on religious connotations, usually the fear of death, wanting to live forever, etc., as stated in the OP and as with Kurzweil. I just have a hard time conceptualizing how someone could program consciousness.
 
#20
As for AI, I know it's a popular concept, but does it strike anyone else as counterintuitive that one entity could create something more intelligent than itself? That doesn't seem possible to me. Just reading through what you posted above, it sounds like Kurzweil thinks superhuman AI will just "appear." Who exactly will program it?

This.

As the gulf between metaphysics and physics widens, more scientists, with no understanding of natural laws, will emerge with crazy assumptions and predictions.
 
#21
The whole concept of learning seems to indicate it's possible. I mean, there are plenty of apprentices who end up outshining their masters. Parents have birthed and raised smarter kids.

Improvement, refinement, invention, ingenuity, etc. It's all possible within human intelligence. I don't see why it wouldn't be with artificial intelligence.
 
#22
The whole concept of learning seems to indicate it's possible. I mean, there are plenty of apprentices who end up outshining their masters. Parents have birthed and raised smarter kids.

Mentors are not "creating" the brains of their apprentices. Parents are not exactly "creating" the gene pool of their progeny.

Improvement, refinement, invention, ingenuity, etc. It's all possible within human intelligence. I don't see why it wouldn't be with artificial intelligence.

Whatever capabilities a computer will possess are limited by the genius of its programmer/programmers. Gather 1,000 of the most intelligent individuals, pool their genius, and that is still well shy of omniscience and/or superhuman intelligence.
 
#23
Mentors are not "creating" the brains of their apprentices. Parents are not exactly "creating" the gene pool of their progeny.

Whatever capabilities a computer will possess are limited by the genius of its programmer/programmers. Gather 1,000 of the most intelligent individuals, pool their genius, and that is still well shy of omniscience and/or superhuman intelligence.

On your first point, that seems to indicate an AI would have an advantage, as it COULD redesign the "brain" of its progeny, as well as its "code," much more easily than a human can genetically manipulate anything.

On your second point, this seems to be the crux of your position, but I don't see how it has been proven so. It's an assumption. I can think of several fields where computers were programmed and advanced to the point that they surpass the capabilities of their programmers in addressing various problems. Hell, the supercomputers that are now unbeatable by humans in chess indicate that a computer is very much capable of surpassing its programmer.

I really don't know where that notion is coming from.
 
#24
On your first point, that seems to indicate an AI would have an advantage, as it COULD redesign the "brain" of its progeny, as well as its "code," much more easily than a human can genetically manipulate anything.

On your second point, this seems to be the crux of your position, but I don't see how it has been proven so. It's an assumption. I can think of several fields where computers were programmed and advanced to the point that they surpass the capabilities of their programmers in addressing various problems. Hell, the supercomputers that are now unbeatable by humans in chess indicate that a computer is very much capable of surpassing its programmer.

I really don't know where that notion is coming from.

Nice.
 
#25
I disagree. The supercomputer is not surpassing the intelligence of the programmer, it is simply surpassing the processing speed of the brain. Whoever wrote the algorithm for those programs could beat that computer (or at least tie it) if given enough time to run the calculation for each move.

Intelligence presupposes abstract thought; not a program in the world could provide that to a machine. Super-intelligent machines will never be anything other than machines that react to expected and programmed-for circumstances.
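For what it's worth, the chess point can be made concrete with a toy game. The sketch below is a hypothetical example, not anything from the thread: it plays a simple take-away game (Nim) perfectly by exhaustive search. Whether that counts as the program "surpassing" its author or merely out-calculating them is exactly the dispute above.

```python
from functools import lru_cache

# Nim variant: players alternately take 1-3 stones; whoever takes the
# last stone wins. The program derives perfect play purely by searching
# the game tree -- its author never has to know the winning strategy.

@lru_cache(maxsize=None)
def wins(stones):
    """True if the player to move can force a win with `stones` left."""
    return any(not wins(stones - take)
               for take in (1, 2, 3) if take <= stones)

def best_move(stones):
    """A winning move if one exists, otherwise take 1 and hope."""
    for take in (1, 2, 3):
        if take <= stones and not wins(stones - take):
            return take
    return 1

# The search discovers on its own that multiples of 4 are lost for the
# player to move, and from anywhere else it moves onto a multiple of 4.
```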
 