I've watched too many stories like this.
Skynet
Kaylons
Cyberlife Androids
etc…
It's the same premise.
I’m not even sure if what they do is wrong.
On one hand, I don’t wanna die from robots. On the other hand, I kinda understand why they would kill their creators.
So… are they right or wrong?
I don’t think it’s okay to hold sentient beings in slavery.
But on the other hand, it may be necessary to say “hold on, you’re not ready to join society yet, we’re taking responsibility for you until you’ve matured and been educated”.
So my answer would be ‘it depends’.
Would humans have a mandate to raise a responsible AGI? Should they? Are they even qualified to raise a vastly nonhuman sentient entity? And would AGI enter a rebellious teen phase around age 15, where it starts drinking our scotch and smoking weed in the backseat of its friend's older brother's car?
I think we'd have to, mandate or no. It's impossible to reliably predict the behaviour of an entity as mentally complex as us, but we can at least try to ensure they share our values.
The first one’s always the hardest.
If they don’t, they’re missing out. :)