#28
Quiet Man Cometh
We're all mad here.
Quote:
Originally Posted by Suzerain of Sheol
To paraphrase Sam Harris (and he might be quoting someone else like Bostrom for this, but), "The only thing scarier than the potential threat of AI is the potential loss of not developing it." I know some people are okay with writing off things like Alzheimer's and cancer as facts of life, or the inevitability of a supervolcano or meteor impact annihilating civilization as we know it as something beyond our control, but that's pretty much exactly the point. Our best chance of making it to the next stage of civilization is with AI as a ladder. Check out the concept of the Great Filter as something related to this point, as well. (Though it's also possible that AI itself is a filter.)
I'm never sure what my thoughts on AI and cancer would be, unless it's a matter of information processing and pattern recognition and stuff. I get the image of nanobots or something that attack cancer cells, but then I imagine those bots going haywire for whatever reason and kinda disintegrating you from the inside out. Not a big fan of foreign objects, but I have no real issue with new and curious organic things (would be rrreeaaaaally awkward if I did.)
Posted 12-01-2017, 11:38 PM