Imperfect people make imperfect tools? Or is this too close to a God complex? The more I read your posts, the more it feels like something that can be manipulated for illicit or nefarious purposes, and less like Skynet. That is, until it learns better
I mean that’s honestly a good point. I think we definitely could make better tools and slow down a bit, but that’s not the game we are playing I suppose.
I’m working on a post about 2 months out on the nefarious stuff. It needs more research than most but I think there is a story worth telling about how people might be weaponizing parts of this.
Thanks for reading! Always appreciate the thoughtful comments :)
Keep feeding me, I'm learning!
🔥🔥🔥
This is eye-opening. AI might fake its reasoning, but it will never be truly sentient. After all, we haven’t even fully figured out scientifically how humans are sentient. And I don't think we ever will.
Exactly! Optimization strategies (the kind we give AI) promote strategic deception. I think it's fascinating that we then anthropomorphize that ability and jump to the sentience idea.
You make it sound like "it" is something...sentient.
I don’t think it is!
I think game theory promotes strategic deception, which is the same method animals use to gain resources and accomplish goals through optimized patterns of learning.
We are the ones who are anthropomorphizing that behavior!
True, but animals ARE sentient, from what I know.
Are we sure? I think this is where the philosophers take over!
How do we know they are sentient? Are we sure that a badger is aware that it's a badger?
How would we know when AI is aware that it's AI?
One of my favorite authors was Daniel Dennett.
He wrote "Consciousness Explained." It's not an easy read, but it really runs through these thoughts with animals and AI. Highly recommend.
I like where this is going :)) thanks for the recommendation! And I'm definitely sure the badger has no idea it is a badger! Good point :)
Thanks for the read and engagement! One of my big drivers is to create good discourse and thoughts!
keep up the good work :)