re: Let's talk singularity.....
Posted on 5/29/14 at 11:50 am to SidewalkDawg
quote:
Well, a being of supreme intelligence capable of improving upon itself at a rate which humans could never attain, would certainly think of a way to enslave an entire race or wipe them out.
Formulating a plan and implementing said plan are 2 entirely different things.
The supreme intelligence would have to play the long con game and wait for us to turn over significant amounts of our infrastructure to it before it could act.
Posted on 5/29/14 at 11:58 am to TigerinATL
And how easy would it be for a being of supreme intelligence to con a mere mortal?
About as easy as me convincing my toddler that I did indeed steal her nose.
Again, I'm not saying this is going to happen. But I do have to stress the exponential gains made by an AI during the Singularity.
By definition, the Singularity occurs when advancement in technology and computing power reaches a point where it is so absurdly fast we can no longer predict the rate reliably.
Feasibly, an intelligence could jump from human-level intelligence to omniscience in an afternoon. What human systems could we design to keep such a being in check?
Posted on 5/29/14 at 12:26 pm to TigerinATL
quote:
The supreme intelligence would have to play the long con game and wait for us to turn over significant amounts of our infrastructure to it before it could act.
Like we're doing now?