The recent clamour over the claim that Artificial Intelligence (AI) is more dangerous than nuclear weapons has raised many questions and pushed prominent individuals to take sides on whether it is true. But hasn't AI still fallen short of the capabilities of the human mind, which is exactly what it aims to match, whereas nuclear weapons have proven their destructive power throughout the course of our history?
Elon Musk, the founder of Tesla Motors and SpaceX, was the one who raised the idea when he tweeted that we need to be “super careful with AI” because it is “potentially more dangerous than nukes.” Shortly before saying this, he had himself invested in Vicarious, an AI company that builds software meant to resemble human thinking and learning, based on theorized computational principles of the brain.
In that tweet, he recommended the book Superintelligence by Nick Bostrom. Bostrom argues that if AI ever surpasses human thinking, the result could be catastrophic. Such machines would be able to improve their own capabilities and could replace us humans as the dominant force in the world. He also points out that even with safeguards in place, there is still room for error, and “control problems” would be inevitable.
If this AI acts just as a human being does, complete with human emotions, would that not be a win-win for us?
For me, nuclear weapons are still far more dangerous than AI. A nuclear weapon presents an imminent danger to people, and its effects are instantaneous. The heat at the heart of the explosion literally vaporises everything. It collapses buildings within the blast perimeter and sets flammable materials alight, which can trigger a firestorm with winds of lethal force. Outside the immediate area, people suffer fatal burns and massive injuries. Weeks afterward, the land remains uninhabitable. The effects are not limited to humans: animals and plants cannot thrive in the irradiated environment, and the lingering radiation causes long-term health damage, including cancer and other diseases.
The nuclear bombings of Hiroshima and Nagasaki, Japan in 1945 claimed roughly 250,000 lives. Imagine lifeless lands. Imagine people annihilated without ever knowing what hit them. Nuclear bombs do not only mean instant death; they also mean a prolonged struggle for everyone who survives the blast. Those bombings were almost 70 years ago, and we have since gone a long way toward building modern nuclear weapons with far greater explosive power and a much larger scale of damage. One more feud between the world's biggest powers could trigger a nuclear war, even as they all deny any intention of using their arsenals. That is far more terrifying than something we still do not have.
In a CNBC interview, Elevation Partners co-founder Roger McNamee argued that we should worry about the dangers already at hand. AI is far from fully developed. Even the capabilities of the human brain, which AI aims to match, have not been fully explored. If we do not know the utmost potential of the human mind, how can AI be built on such a premise? A program may have passed the Turing test, which measures a machine's ability to exhibit intelligent behaviour indistinguishable from a human's, but that is still far from proof that AI can harm us more than nukes.
As McNamee put it, "There's a good chance we will have polluted the Earth beyond repair before they get any of this AI stuff to work."
I agree. We have more pressing things to worry about, with far greater repercussions and dangers to the human race. Besides, researchers are trying to develop AI that can imitate, or be programmed with, human morality. What danger does that present? It is human morality that can be twisted beyond control, to the point of building a weapon of mass destruction and dropping it on millions of civilians.
Nuclear weapons have been with us all along, yet we worry about something that may never be as successful as we imagine. If the day comes when AI actually works and poses a hazard to people, then I think we will simply have to suck it up, shut everything down, and return to the old ways of living without electronics around us. That is the extent of AI's power, right?
AI will have a lot to offer in the future, but alongside its development, nukes will improve as well. Nuclear weapons were built with equipment that is primitive compared with today's, yet they still proved their destructive power. That is why nuclear weapons are more dangerous than Artificial Intelligence.
Works Cited:
Bostrom, Nick. Superintelligence: Paths, Dangers, Strategies. Oxford: Oxford University Press, 2014. Print.
Copeland, Jack. "The Turing Test." The Turing Archive for the History of Computing. University of Canterbury. Web.
Russell, Stuart, and Peter Norvig. Artificial Intelligence: A Modern Approach. New Jersey: Pearson Education, 1995. Print.