I think the biggest barrier to AI achieving any kind of dominance is its inability to create meaning.
Humanity currently defines what meaning is.
I can’t imagine AI ever telling humanity what is meaningful
And because society and meaning and values are contextualized by humans, I don’t think AI will ever be able to do that independent of human input.
AI will be an instrument, but that is all.
Value and meaning are uniquely human, because they serve humanity.
We imbue things, ideas, and feelings with value and meaning because it serves humanity’s interests.
How could anything other than humanity know what’s in humanity’s best interest?
It may mimic and emulate, but it will not create or coin meaning and value.
Perhaps this novelty will help us progress. But eventually there will be a divergence, and the novelty will wear off as we recognize the dissonance between what AI is providing, and what humanity requires/needs/craves.
The meaning and value question is something I’ve never been able to reconcile.
I cannot conceive of a logical machine ever creating the illogical meaning and value that humanity thrives on
I mean, we tell the machines what is meaningful and valuable, what to perceive and look for
Can you imagine them telling us?
There’s just this pure chaotic electrical noise.
The AI says “this is beautiful”
We think it’s intolerable and ask why.
The AI explains there are perfect mathematical patterns creating layers of perfect harmony.
We don’t give a shit. It sounds like crap.
But if I imagine AI as a dog, I think I can envision this: The AI just does whatever pleases humanity.
But is that creativity?
Does a dog create meaning?
When we say AI, are we saying a machine that man programmed to create music that pleases humans?
sure
It’s still an instrument of man, in the same way that the violin is an instrument to create more pleasing vibrations.
Could it exist independent of man? I don’t think so
AI is this catch-all term
Just to be clear, I’m referring to the possibility of sentient artificial intelligence
Not AI as it is today, which is just a computer that can calculate definitive outputs based on highly variable inputs
So I guess this is what I’m thinking
Man this is so complex
Because I think our consciousness is a social byproduct
Like the mind does not exist independent of society
And we overlook this point
Hard to do this thought experiment, because humanity depends on a strong cultural programming, but here’s a shot
Take a human. Place them in a totally new and foreign environment.
Humans perceive the world and its various animations and actors, and assign symbols and signs to these. Man says: this is meaningful. This is valuable. This is not.
I can only assume this is a symptom of man’s drive to survive.
But I’m not sure meaning and value are necessary without multiple minds.
I just try to imagine an AI robot.
How do we program a robot? Do we program a robot? Or at some point does it program itself?
Like a child: you program it, and then it learns to think for itself. But it’s still human.
How does an AI robot do this?
If we place the robot in a similar foreign environment, is it relying on the human programming to decipher meaning and value?
Will a robot be able to correctly assign meaning and value in a way that serves the needs of AI?
In the same way that humanity creates meaning and value that serves our needs?
I just keep thinking that AI will always be constrained to the program that humanity inscribes it with
Will AI ever propagate independent of humanity?
I suppose when AI achieves the ability to propagate on its own, that will be step 1.
Then it will need to be able to adapt to environmental changes. Maybe step 2.
I think it’s clear that the greatest intelligence is distributed. I think there’s an evolutionary advantage to not having centralized minds.
It makes the system more flexible and better able to adapt to change
I don’t even understand my own consciousness, what it is, and how it arises or arose in humanity.
I don’t think I’ll ever understand how it will come to be in AI
Yea my brain melts when I gaze into this gaping abyss of “what is mind?”
The trip I have is that “mind” is actually software.
It’s not hardware/wetware/brain
It’s not in the brain
Mind is a social/cultural by product
The structures of consciousness do not inhabit a brain.
We have this mind, which is an aggregation of all the lived experiences that every human on earth has ever absorbed and transmitted to other humans.
As a metaphor
A single human cannot develop without another human
The mind is nothing without another perspective
at this point it requires nurture
It relies on outside programming
The mind is like this flame that was sparked tens of thousands of years ago, and since then this flame has been growing with every lived experience
It’s just this accretion of programming
The mind is lit by others
If you took a baby and gave him to gorillas, assuming he survived, what would you make of his consciousness?
Would this grown baby retain any semblance of the consciousness we see in our fellow man?
Or would this grown baby possess the mind of the gorilla?
I think that the grown baby would be no more conscious or less conscious than the gorilla
The brain has the capacity for “consciousness”, whatever that is.
But I think it’s just a cultural byproduct. Residual programming passed on from generation to generation