Let's talk singularity.....

Posted by white perch
the bright, happy side of hell
Member since Apr 2012
7165 posts
Posted on 5/29/14 at 8:41 am
.....I'm not a tech guy, but it seems we are getting closer and closer. The human-machine interface is getting more advanced with neurorobotics and brain chip implants. Any of y'all that are more current than I care to share some inside info?
Posted by SidewalkDawg
Chair
Member since Nov 2012
9850 posts
Posted on 5/29/14 at 8:51 am to
Ray Kurzweil is considered by some to be a nut. It's been a while since I viewed any of his material, but if I remember correctly, he has had some pretty accurate predictions about technology. He seems to think the Technological Singularity will occur around 2040.

My only fear and hesitation about a singularity is the assumption that AI (if this is the route which brings us to it) would be benevolent enough to allow us the chance to assimilate. Chances are, through sheer logic, a supreme intelligence would just wipe us out.
Posted by TigerinATL
Member since Feb 2005
61645 posts
Posted on 5/29/14 at 9:49 am to
I'm only peripherally familiar with Kurzweil's views, but my understanding is he thinks that everything will change once machines become as smart as humans. At that point, things we had never considered before become possible and you could see ridiculous advancements. While this may technically be possible, I don't think it takes politics into account. Yesterday in the self-driving car thread it was pointed out how many powerful lobbies (law enforcement, attorneys, and insurance) have a vested interest in not letting safer self-driving cars become the norm.

I think there will quite possibly be a backlash against enhancing humans and creating obviously smart machines. Will it be something that takes just a generation or two to get over the new and unfamiliar, or will it be something like on Star Trek DS9, where the genetic enhancements Dr. Bashir had were outlawed?

Just because we have the capacity to better ourselves doesn't mean we will. People in power do their damnedest to maintain the status quo, and if a technology is disruptive enough that it's seen as a threat to their status quo, they will try to demonize and squash the technology.
Posted by BottomlandBrew
Member since Aug 2010
27225 posts
Posted on 5/29/14 at 10:48 am to
Singularity scares the shite out of me. Maybe I'm just scared of the unknown. I don't know.
Posted by Ye_Olde_Tiger
Member since Oct 2004
1200 posts
Posted on 5/29/14 at 11:06 am to
Even with a supreme intelligence, how does a single machine/being go about taking over the world?

Even if it was able to walk around I just can't see a single entity being that powerful.

I'll be concerned once we see 100% fully automated industry pipelines from concept to product rolling out of a factory. I'm talking buildings/factories/processes being designed and constructed by machines.

Once, and if, we reach that point, then I could see serious concerns for the continued existence of humans. But one individual machine, super smart or not, can be destroyed pretty easily.

But that's just, like, my opinion, man...