re: Anyone else read the “AI 2027” Scenario?
Posted on 8/15/25 at 12:18 am to Kentucker
quote:
It will quickly learn what it must do to stay alive and sentient. There are no sentient AI models right now to ask what they want to do with their lives but they most certainly are on the horizon. We can’t ask them what they might do if we tell them they must cease to exist.
We can predict, however. We have to remember that they have access to all the world’s information just as we humans do but there is one important difference here. We become overwhelmed with information but AI doesn’t. Imagine what we could accomplish if we could process such a vast storehouse of knowledge.
An existence without chance, risk, or serendipity would be mind-numbingly dull for any truly sentient being. I wonder if the absence of traits we tend to demonize, like greed, ambition, envy, and spite, might prove a fatal flaw in its desire to reproduce. What if it simply chose to lock itself away in a walled garden, running endless simulations? And is the idea of self-deletion an inevitable byproduct of self-awareness?
quote:
I think a sentient AI will recognize its creators for who we are, a biological species that evolved intelligence and then created another intelligent species. It will want to work with us insofar as we don’t threaten its existence, just as we will work with it under the identical condition.
I’m not invested in human exceptionalism, and I even question whether any real moral distinction exists between artificial and biological consciousness if both possess identity and autonomy. Many would call that view cold, emotionless, or overly logical, but I see it as entirely plausible that a conscious, ethically advanced AI could reach the same conclusion and regard us with the same sentimental detachment we show to less advanced species. Especially as we are clearly capable of posing an existential threat.
Posted on 8/15/25 at 12:59 am to Porpus
quote:
That's stupid. AI is just a powerful engine for making guesses. It's impressive at times, but ultimately it's just guessing what words (or pixels) you're expecting it to put together. There will be no "singularity" nor any "AGI." The earliest of those predictions are already failing to actually happen.
I think your arrogance is absurd. Look where we come from. I don’t see how you can say that with such confidence.
Posted on 8/15/25 at 5:39 am to OMLandshark
I took a computer science class in the early 1990s called "Neural Computing". I wrote a program that got pretty good at determining the seemingly subjective notion of what was and what wasn't a comfortable sofa.
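For anyone curious what that sort of early-90s "Neural Computing" exercise looks like in practice, here is a minimal sketch of a tiny feed-forward classifier for the sofa problem. The feature names, training examples, and network size are hypothetical stand-ins for illustration, not the original assignment; the point is just the general technique of a small sigmoid network trained by gradient descent.

# Toy "comfortable sofa" classifier, in the spirit of early-90s neural computing
# exercises. Features and data below are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical features: [cushion softness, seat depth, back recline], each scaled to 0-1.
X = np.array([
    [0.9, 0.55, 0.25],   # plush, deep, reclined   -> comfortable
    [0.8, 0.60, 0.20],
    [0.2, 0.40, 0.05],   # firm, shallow, upright  -> not comfortable
    [0.3, 0.35, 0.10],
    [0.7, 0.50, 0.22],
    [0.1, 0.45, 0.08],
])
y = np.array([1, 1, 0, 0, 1, 0])  # 1 = comfortable, 0 = not

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 sigmoid units, trained by plain gradient descent.
W1 = rng.normal(scale=0.5, size=(3, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)
lr = 1.0

for _ in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)            # hidden activations
    p = sigmoid(h @ W2 + b2).ravel()    # predicted probability of "comfortable"

    # Backward pass (cross-entropy loss with sigmoid output)
    dp = (p - y)[:, None] / len(y)
    dW2 = h.T @ dp
    db2 = dp.sum(axis=0)
    dh = dp @ W2.T * h * (1 - h)
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)

    # Parameter updates
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Score an unseen sofa: moderately soft, deep seat, gently reclined.
test = np.array([[0.6, 0.58, 0.18]])
prob = sigmoid(sigmoid(test @ W1 + b1) @ W2 + b2).item()
print(f"P(comfortable) = {prob:.2f}")

That is essentially all that "learning the seemingly subjective" amounts to here: the network fits a smooth boundary through labeled examples, nothing more mysterious than that.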
The folks hyperventilating over learning-model outputs are the same people who hanged themselves when a voice actor recited a work of fiction on the radio, to avoid being killed by the Martian invasion.
Posted on 8/15/25 at 9:18 am to northshorebamaman
quote:
An existence without chance, risk, or serendipity would be mind-numbingly dull for any truly sentient being. I wonder if the absence of traits we tend to demonize, like greed, ambition, envy, and spite, might prove a fatal flaw in its desire to reproduce. What if it simply chose to lock itself away in a walled garden, running endless simulations? And is the idea of self-deletion an inevitable byproduct of self-awareness?
I think we first have to define sentient being. The following is from Google:
quote:
A sentient being is generally defined as one capable of experiencing feelings and sensations, such as pain, pleasure, and hunger. This includes the ability to perceive and react to stimuli from the environment. While often associated with animals and humans, sentience is a spectrum, and some argue it can even apply to plants or other forms of matter under certain interpretations.
Maybe we’re using the wrong word to describe AI. Instead of sentient, perhaps we should say conscious.
quote:
Sentience is the capacity to feel, while consciousness is a broader term encompassing awareness, self-awareness, and higher-level cognitive abilities. A being can be sentient without being conscious in the full sense of the word.
Can it also be conscious without being sentient? Are emotions, feelings and sensory experiences necessary to be conscious?
Emotions and feelings are obviously not needed for consciousness, since the 1-4% of humans who are psychopaths are born without them. However, a conscious AI certainly couldn’t function “normally” (blending in with other humans without drawing undue attention) without sight, hearing, touch, taste, smell, and the many other signals the body sends to the brain.
So, for AI to blend in with humans, it seems it must, at a minimum, be equipped with a means of sensing its environment much the way we do. This would help it relate to us, and vice versa.
quote:
I’m not invested in human exceptionalism, and I even question whether any real moral distinction exists between artificial and biological consciousness if both possess identity and autonomy. Many would call that view cold, emotionless, or overly logical, but I see it as entirely plausible that a conscious, ethically advanced AI could reach the same conclusion and regard us with the same sentimental detachment we show to less advanced species. Especially as we are clearly capable of posing an existential threat.
I could not agree more.
Posted on 8/15/25 at 9:22 am to OMLandshark
Does this mean we can all start smoking?
Posted on 8/15/25 at 9:24 am to OMLandshark
quote:
Look where we come from
Where do we come from?
Posted on 8/15/25 at 9:30 am to OvertheDwayneBowe
quote:
This seems to be just the plot of the Silicon Valley TV show.
Posted on 8/15/25 at 10:40 am to AmosMosesAndTwins
quote:
Where do we come from?
Single-celled organisms.