
re: Ray Kurzweil. Google's Director of Engineering predicts Singularity by 2029

Posted by musick
the internet
Member since Dec 2008
26125 posts
Posted on 3/17/17 at 12:31 pm to
I feel like it will be ushered in with a huge flash of light, almost an explosion. I just have that feeling. Maybe this is/was the big bang.
Posted by musick
the internet
Member since Dec 2008
26125 posts
Posted on 3/17/17 at 12:36 pm to
Posted by TigerFanatic99
South Bend, Indiana
Member since Jan 2007
27559 posts
Posted on 3/17/17 at 12:40 pm to
quote:

If you design something to find a cure for cancer then it's got a lot less possible negative impact than if you design it with the goal of optimizing profit margins.


What happens when this computer ultimately decides that the most efficient way to cure cancer is to kill humans? No humans, zero risk of humans getting cancer.

Obviously that is an extreme example, but it's a legitimate concern that can't just be dismissed with "that'll never happen because we'll tell it that it cannot harm humans." Once AI becomes more "intelligent" than the human brain, it is impossible to predict what it will be capable of doing, including making leaps in logic that will supersede any "rules" humans can think to code into it to not harm humans. It is largely accepted that ASIs will be single-mindedly driven toward accomplishing their assigned tasks. "Cure cancer" is open-ended enough that it can spawn subroutines ranging from "improve myself so I can better figure out how to cure cancer" all the way to "killing the human with cancer is the most efficient and cost-effective way to cure cancer."
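A minimal toy sketch of that "open-ended objective" worry, in Python (purely illustrative; the plan names and numbers are made up, not from Kurzweil or anyone in this thread):

# Toy illustration: an optimizer scored only on "minimize cancer cases"
# will happily pick a plan that is catastrophic by every measure it was
# never asked to score.

# Each candidate plan: (name, expected cancer cases remaining, humans remaining)
plans = [
    ("fund conventional research",  4_000_000, 8_000_000_000),
    ("deploy nanobot gene repair",    100_000, 8_000_000_000),
    ("eliminate all humans",                0,             0),
]

def naive_objective(plan):
    """Score only what the designer wrote down: fewer cancer cases is better."""
    _, cancer_cases, _ = plan
    return cancer_cases

print("Naive optimizer picks:", min(plans, key=naive_objective)[0])
# -> eliminate all humans

# The hand-wave "we'll just tell it not to harm humans" has to actually be
# encoded in the objective; writing that constraint down completely is the
# hard part the post above is pointing at.
def guarded_objective(plan):
    _, cancer_cases, humans_remaining = plan
    if humans_remaining < 8_000_000_000:  # crude stand-in for "don't harm humans"
        return float("inf")
    return cancer_cases

print("Guarded optimizer picks:", min(plans, key=guarded_objective)[0])
# -> deploy nanobot gene repair

Of course a real ASI wouldn't be a three-line argmin, but the shape of the failure is the same: it optimizes exactly what you wrote, not what you meant.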
Posted by Freauxzen
Utah
Member since Feb 2006
37263 posts
Posted on 3/17/17 at 12:50 pm to
quote:

including making leaps in logic that will supersede any "rules" humans can think to code into it to not harm humans.


It's worse than that.

What if, on the flip side, it has nothing to do with superseding a lame man-made rule about not harming humans and is instead simply about extrapolating the positive out to an infinite degree?

Let's be clear, "human pleasure" is our neural network firing off "yes/no," "1/0" over and over and over again in patterns that we interpret as "pleasure" and "happiness."

All that wanting of a VR world with an infinite supply of hot virgins is a very inefficient way to fire off those 1s and 0s. Ray and OML could just be turned into switches of 1s and 0s fired off in a pattern. No virtual world. No virgins. Just switches, on and off. Forever.

I mean if that's your thing, and that's your idea of immortality, go for it. "Eternal Pleasure."

This post was edited on 3/17/17 at 12:50 pm
Posted by LucasP
Member since Apr 2012
21618 posts
Posted on 3/17/17 at 1:02 pm to
quote:

What happens when this computer ultimately decides that the most efficient way to cure cancer is to kill humans? No humans, zero risk of humans getting cancer.


That's exactly what I wrote next. Did you really just stop reading to comment?
Posted by TigerFanatic99
South Bend, Indiana
Member since Jan 2007
27559 posts
Posted on 3/17/17 at 1:07 pm to
quote:

That's exactly what I wrote next. Did you really just stop reading to comment?


Kind of?

Posted by LucasP
Member since Apr 2012
21618 posts
Posted on 3/17/17 at 1:23 pm to
quote:

Let's be clear, "human pleasure" is our neural network firing off "yes/no," "1/0" over and over and over again in patterns that we interpret as "pleasure" and "happiness."

All that wanting of a VR world with an infinite supply of hot virgins is a very inefficient way to fire off those 1s and 0s. Ray and OML could just be turned into switches of 1s and 0s fired off in a pattern. No virtual world. No virgins. Just switches, on and off. Forever.


Seems like a pretty efficient way to maximize the human experience. Neural heroin; the utilitarians would approve.
Posted by chinese58
NELA. after 30 years in Dallas.
Member since Jun 2004
30394 posts
Posted on 3/17/17 at 1:37 pm to
quote:

I'll probably wait this one out for a bit. Let those first adopters give it a go.


If I'm still around for that, I'll be 70. I'll probably be an outcast for not participating. I don't get the fascination with binge-watching TV, and I'm always at least one generation, usually two, behind on phones. It won't be any different when this comes around.
Posted by TigerFanInSouthland
Louisiana
Member since Aug 2012
28065 posts
Posted on 3/17/17 at 3:07 pm to
I'm confused as to why these types of guys are excited about this. And why aren't they doing anything to stop this bullshite?
Posted by musick
the internet
Member since Dec 2008
26125 posts
Posted on 3/17/17 at 3:11 pm to
I think the goal is to transfer memories and consciousness into a computer so you don't have to "die" when your physical body is done.
Posted by LucasP
Member since Apr 2012
21618 posts
Posted on 3/17/17 at 3:14 pm to
Yes, slapping God in the face with our dicks and cheating death is one goal.

But the reason it's called the singularity is that we simply can't comprehend what will happen; our fates will be left in the hands of things smarter than us. Very exciting; it could go either way, good or bad. So there are a lot of reasons to be excited, not just immortality.
Posted by musick
the internet
Member since Dec 2008
26125 posts
Posted on 3/17/17 at 3:29 pm to
quote:

But the reason it's called the singularity is that we simply can't comprehend what will happen; our fates will be left in the hands of things smarter than us. Very exciting; it could go either way, good or bad. So there are a lot of reasons to be excited, not just immortality.


What's interesting about this is that if you look at the human race as one entire timeline from its start, and consider that the singularity is going to happen eventually, say in the next 100 years, then we are already basically in that era. In effect, it has already happened. The machines are going to catch up and "take over" soon, but the things that set it in motion are here now.
This post was edited on 3/17/17 at 3:31 pm
Posted by TigerFanInSouthland
Louisiana
Member since Aug 2012
28065 posts
Posted on 3/17/17 at 3:44 pm to
I honestly don't see much upside in this.
Posted by 0jersey
Paradise
Member since Sep 2006
1838 posts
Posted on 3/17/17 at 3:46 pm to
quote:

What happens when this computer ultimately decides that the most efficient way to cure cancer is to kill humans? No humans, zero risk of humans getting cancer.


This is not an issue, and the solution is not to kill humans. The reality is that cancer will be mitigated by the combination of the biotechnology and nanotechnology revolutions already in progress. To put it in simplistic terms, we already know that cancer is simply a gene sequence that has been switched to the "on" position without a way to turn it back "off."

The nanobots will be able to identify the faulty DNA and repair or destroy it. There will be no need to kill the whole organism.

Kurzweil is a bit crazy, but only because he is crazy optimistic about his timeline. His thoughts are very well hashed out and backed up by solid evidence, and the wheels are in motion.

If you haven't read his book "The Singularity Is Near," I would highly recommend it. Just know that it's a bit of an undertaking and very detailed.

The only way the Singularity doesn't come into existence is if we all nuke ourselves back to the Stone Age.
Posted by AjaxFury
In & out of The Matrix
Member since Sep 2014
9928 posts
Posted on 3/17/17 at 3:48 pm to
quote:

I honestly don't see much upside in this.


Exactly. Immortality mixed with infinitely more intelligent A.I. overlords sounds more like eternal slavery.

It would be so advanced that it would regard our needs & wants much the way humans regard the needs of an anthill.
Posted by Freauxzen
Utah
Member since Feb 2006
37263 posts
Posted on 3/17/17 at 3:52 pm to
quote:

This is not an issue, and the solution is not to kill humans. The reality is that cancer will be mitigated by the combination of the biotechnology and nanotechnology revolutions already in progress. To put it in simplistic terms, we already know that cancer is simply a gene sequence that has been switched to the "on" position without a way to turn it back "off."

The nanobots will be able to identify the faulty DNA and repair or destroy it. There will be no need to kill the whole organism.



Why would any ASI care about any of this? Life, death, cancer, no cancer: there's no point to any of this for the ASI.

Oh wait, because we think we will "tell it what to do."

Posted by 0jersey
Paradise
Member since Sep 2006
1838 posts
Posted on 3/17/17 at 4:19 pm to
The theory is that the cure for cancer will precede the singularity. Therefore, it will be a non-issue for the singularity.
Posted by Freauxzen
Utah
Member since Feb 2006
37263 posts
Posted on 3/17/17 at 4:30 pm to
quote:

The theory is that the cure for cancer will precede the singularity. Therefore, it will be a non-issue for the singularity.



Sure. But the idea that we can theorize x before y is the problem with the Singularity: we have no idea what's beyond it.

Or the better question is:

Is it more likely that we create self-sustaining, human-repairing nanobots before the AGI that we then evolve into an ASI? The whole problem with the timeline is that the speed of progress is going to be something we have no idea about until it's too late.
Posted by LucasP
Member since Apr 2012
21618 posts
Posted on 3/17/17 at 4:43 pm to
quote:

The theory is that the cure for cancer will precede the singularity. Therefore, it will be a non-issue for the singularity.



It was just an example that clearly points out a bigger issue, idiot. And don't come in here throwing around Kurzweil like some kind of expert; I've read all his books and even owned the Our Lady Peace album about him (on CD!!!). So don't be a dickhead about Kurzweil, ya dickhead.


Also I've begun drinking so any meaningful contribution I may have added to this thread is now over.
Posted by Freauxzen
Utah
Member since Feb 2006
37263 posts
Posted on 3/17/17 at 4:44 pm to
quote:

Also I've begun drinking so any meaningful contribution I may have added to this thread is now over.




No use for that post-Singularity either. Bye-bye, drinking time.
