In this short essay I argue that human intelligence will keep beating AI in the future, just as it does in the present.
Let's suppose, as in the movies, that we have won the Terminator over to our side. Let's also suppose, as in the movies, that we are on a mission to take down Skynet. If we follow the movie scripts to the letter, Skynet will bite the dust at the end of the day. Skynet must not win, because no one is going to make a ton of money without human beings around. Movie makers understand this. You may not agree with them, though; after all, it is anybody's game when it comes to scripting a movie.
Seriously, Skynet can't do more harm than we humans are already able to do to ourselves. Does the nuclear option ring a bell? To terminate every last human being, Skynet would have to nuke every last corner of the world. But if the world is no more, neither is Skynet, and I bet Skynet is not suicidal. Even so, I don't see how it can sustain itself in the long run: atrophy will catch up with it.
The worst-case scenario for humanity is that AI manages to download our brains to serve as its think tank. But our brains are Darwinian enough to evolve for the sake of survival. Yes, AI may evolve, but it cannot evolve emotionally. At most, AI may be capable of mirroring human emotions, and mirroring is not evolving. Emotion is part and parcel of human intelligence (the so-called EQ), irreplaceable and irreducible. Move over, AI.
We don't have to worry about being controlled by AI. We do have to worry about being controlled by HI (Human Intelligence).
Author: Ren Qiulan