re: Let's talk singularity.....
Posted on 5/29/14 at 12:36 pm to Korkstand
quote:
That said, I'm not sure it's possible to create a machine that is conscious or capable of actual thought.
That depends on your philosophical views. Is consciousness emergent, or does it require something else?
If it's emergent, then it's merely the result of a powerful processor (the brain). If not, then we will do our best to simulate intelligence, which could be even worse, as it would lack anything that would qualify as a conscience.
Posted on 5/29/14 at 12:48 pm to SidewalkDawg
quote:
That depends on your philosophical views. Is consciousness emergent, or does it require something else?
I don't think it depends on my views so much as it depends on the true nature of consciousness, which we don't yet understand and may never understand.
quote:
Is consciousness emergent, or does it require something else?
Getting religious here: does it require a "soul"?
quote:
If it's emergent, then it's merely the result of a powerful processor (the brain).
We have clusters that we believe are more powerful than a human brain, but nothing is going to emerge from them that hasn't been programmed.
quote:
If not, then we will do our best to simulate intelligence, which could be even worse, as it would lack anything that would qualify as a conscience.
This raises new questions:
Can a non-conscious intelligence be taught the difference between good and evil, right and wrong?
Could it produce original thoughts? Would it ever even think about improving itself?
Posted on 5/29/14 at 12:53 pm to SidewalkDawg
Assuming the AI does develop, and it's super-intelligent, exponentially beyond human capacity, what does that mean? It would still be a program running on an HDD, right? Let's say it determines that humans are a threat; how exactly does it stop that threat? Is it going to turn light switches into trick electrocution switches so that every human that comes into its server room dies? (Okay, I know that's ridiculous, but seriously, what?) Is it going to build a gun? Set off nuclear weapons?
It would seem likely that its main capacity for disrupting human behavior would be through a network which could affect economies. Is it going to electronically forge documents and call in the police to have us arrested? It can run DDoS attacks and create super awesome computer viruses all day long, but that's not going to wipe out humans.
Self-replicating nanotechnology? If all humans had nanobots in them AND the nanobots were networked, then yeah, I guess the AI could tell the nanobots to kill us. But again, until the AI has the capacity to actually make physical objects, it's going to need humans to support its infrastructure.