Suzerain of Sheol
Desolation Denizen
#15
Teaching morality to an AI is rather more complicated than, say, raising a human child. Except in outlandish sci-fi scenarios, you're not dealing with something that has human-relatable drives and desires. Violent opposition to us in the manner of a slave revolt seems unlikely unless there are major breakthroughs in replicating something resembling human consciousness in a synthetic substrate. We're not going to be looking at an entity with such recognizable traits as identity, personality, self-interest, and motivations beyond its programmed purpose. Given how intrinsically those phenomena emerge from things like hormones and genes, it might actually be impossible for a machine to develop consciousness as we typically understand it to exist. (And I personally think research into machine intelligence is going to reveal some uncomfortable truths about our own sense of consciousness, but that's a different topic.)
Instead, you'll have something like the oft-invoked paper-clip maximizer that ends up enslaving or destroying humanity to turn the planet into a titanic paper-clip factory, without any sense that it's done anything but carry out the purpose it was designed for. A lot of the scariest hypotheticals involving AI are the ones where it can't be understood or reasoned with like a person, both because those scenarios are far more alien and because they're far more likely than the alternative.
Then there are concerns about the fact that we more than likely won't be looking at a single AI but dozens or hundreds once the ball gets rolling. Factor in the precarious balance of power between nation-states, and suddenly you might not even want your AI to be moral as regards the "enemy". There's a lot of potential for disaster just in the reliable human fallibility surrounding the development of machine super-intelligence, let alone in the impact that the intelligence itself ends up having.
Cold silence has a tendency
to atrophy any sense of compassion
between supposed lovers.
Between supposed brothers.
Posted 12-01-2017, 02:12 PM