Quote:
Originally Posted by Quiet Man Cometh
There is no limitation on time frame in the class chat we're having. If an AI can produce an answer on its own, how would we know it is wrong unless we do the calculations ourselves, and then what's the point of having the AI do it? This assumes we are using an AI to get to answers we haven't figured out yet.
Insight into the human brain and its processes is one goal of artificial intelligence (I forget who to refer to for that one). The self-problem-solving sort (so to speak) doesn't help in that regard, because we can't see the path the AI takes.
On efficiency: the fact that I'm sitting in front of a piece of technology that serves solely to amuse me would seem to argue against that, especially as I spent more money on a bigger and shinier one whose sole purpose is to amuse me with bigger and shinier versions of the same things as the last one.
There are many, many, MANY problems in this world that are difficult to FIND an answer for, but much more feasible to VERIFY the answer once you have it. You set the computer up to figure out what mankind could never figure out on its own, and then you have humans check it.
To take AI out of the picture for an example: It requires an engineer to figure out how to make a car frame stronger, but any old schmoe can smash a car to see if it worked.
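That find-versus-verify asymmetry is, loosely, the intuition behind complexity classes like NP: some problems are expensive to solve but cheap to check. A minimal Python sketch (the numbers and function names here are just illustrative, not from the discussion above):

```python
import math

def find_factor(n):
    """The hard direction: search for a nontrivial factor by trial division.
    Work grows with sqrt(n), which becomes infeasible for very large n."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d
    return None  # n is prime

def verify_factor(n, d):
    """The easy direction: checking a proposed factor is a single
    modulo operation, no matter how hard it was to find."""
    return 1 < d < n and n % d == 0

n = 2021                     # = 43 * 47
f = find_factor(n)           # the engineer's job: finding the answer
print(verify_factor(n, f))  # the schmoe's job: smashing the car to check
```

The machine (or the engineer) does the search; anyone can run the one-line check afterward.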
It would be exceedingly foolish to trust AI to solve problems that we CAN'T test the results of. At that point, your AI is God, and your science has become faith. And in that case you'd be tasking an AI with jobs that humans can't do in the first place, so it won't have a significant impact on the job market.
If we DO start moving into a world where automation is seriously displacing jobs and it looks like they're never coming back, then we as a society will need to adapt to that. We will need to give up on the idea that all able-bodied adults need to have a job. We will need to make sure that people who don't have a job are provided for without stigma or discrimination -- but that
shouldn't be hard because you'll have automated machines producing the goods that they'll need.
Of course, this is handwaving the fact that humans are apex predators.
I'm not sure that Homo sapiens sapiens can psychologically tolerate post-scarcity. As a whole, Man is not satisfied unless it is fighting for power. There are plenty of things that it accepts as stand-ins for power -- wealth, a large family, success, a full belly -- but in a world where AI is always going to be more powerful and wealth is meaningless, we're going to see hoarders, gluttons, tribal cliques, et cetera.