1LT Private RallyPoint Member<div class="images-v2-count-0"></div>In response to the following article: <a target="_blank" href="https://mwi.usma.edu/artificial-intelligence-bomb-nuclear-command-control-age-algorithm/">https://mwi.usma.edu/artificial-intelligence-bomb-nuclear-command-control-age-algorithm/</a><br /><br />Hell no! I recently read an article, "On the Limits of Strong Artificial Intelligence: Where Don't We Want AI on Tomorrow's Battlefield?" by LTC Daniel Thetford in Army AL&T magazine (if anyone cares to find it).<br /><br />His premise was that AI is "limited to its programming" and thus "can never act as a moral agent." Given the capacity to destroy in combat operations, let alone nuclear operations, AI cannot exercise the fundamentally human aspect of strategy that such decisions require: morality. Moral agency is a requirement of a leader in war, and surrendering it to a machine would compromise our fundamental purpose: to protect U.S. interests and uphold constitutional values. Consider LTC Thetford's explanation of moral agency: "Moral agency requires the ability to see both truths in a given situation and truths beyond a given situation. It matters morally both that something is achieved, and how it is achieved. Only a moral actor is capable of such a task." That sums it up for me.<br /><br />- So in what way could the military utilize AI?<br /><br />Thinking as a logistician: casualty evacuation. Former mentors of mine wrote the following article: <a target="_blank" href="https://www.benning.army.mil/infantry/magazine/issues/2018/JUL-SEP/PDF/10)Frye-AI_txt.pdf">https://www.benning.army.mil/infantry/magazine/issues/2018/JUL-SEP/PDF/10)Frye-AI_txt.pdf</a><br /><br />Imagine if we could create robots that could navigate a battlefield to retrieve a casualty and transport them back to the casualty collection point (CCP).
The robot would be agile and unmanned, able to plot the most effective and efficient route without tiring. It could be programmed to perform the same life-saving functions as a combat medic during the care-under-fire stage, which is essentially limited to applying a tourniquet and rarely anything else. Such a robot would free human personnel to stay in the fight and on mission (where moral agency might be required) and mitigate the risk of additional personnel being injured during casualty recovery. <div class="pta-link-card answers-template-image type-default">
<div class="pta-link-card-picture">
<img src="https://d26horl2n8pviu.cloudfront.net/link_data_pictures/images/000/513/224/qrc/Operation_Crossroads_Baker.jpg?1589316701">
</div>
<div class="pta-link-card-content">
<p class="pta-link-card-title">
<a target="blank" href="https://mwi.usma.edu/artificial-intelligence-bomb-nuclear-command-control-age-algorithm/">Artificial Intelligence and the Bomb: Nuclear Command and Control in the Age of the Algorithm -...</a>
</p>
<p class="pta-link-card-description">Editor’s note: The following is based on an article by the author recently published in the Journal of Strategic Studies, entitled “Delegating Strategic Decision-Making to Machines: Dr. Strangelove Redux?” In 2016, DeepMind’s AI-powered AlphaGo system defeated professional Go grandmaster Lee Sedol. In one game, the AI player reportedly surprised Sedol by making a strategic …</p>
</div>
<div class="clearfix"></div>
</div>
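The route-planning piece of such a recovery robot is well-understood ground: finding an efficient path around obstacles is classic graph search, and it requires no moral judgment at all. A minimal sketch (hypothetical grid map and standard Dijkstra search, not drawn from either article) of how an unmanned platform might compute the shortest passable route to a casualty:

```python
import heapq

def shortest_route(grid, start, goal):
    """Dijkstra search over a 2D grid: 0 = passable, 1 = blocked.

    Returns the list of cells on a shortest path from start to goal,
    or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = [(0, start)]          # priority queue of (cost so far, cell)
    came_from = {start: None}
    cost = {start: 0}
    while frontier:
        d, cell = heapq.heappop(frontier)
        if cell == goal:
            # Rebuild the path by walking predecessors back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        if d > cost[cell]:
            continue                 # stale queue entry, already improved
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < cost.get((nr, nc), float("inf")):
                    cost[(nr, nc)] = nd
                    came_from[(nr, nc)] = cell
                    heapq.heappush(frontier, (nd, (nr, nc)))
    return None

# Hypothetical battlefield grid: 1s mark impassable or hazard cells.
battlefield = [
    [0, 0, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]
route = shortest_route(battlefield, (0, 0), (3, 3))
```

With uniform step costs this behaves like breadth-first search; a real platform would weight edges by terrain and threat exposure, which is exactly where the cost map, not the search algorithm itself, would encode the hard judgment calls.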
Should we integrate AI into nuclear warfare? (posted 2020-05-12)
We are a long, long way from morality-enabled AI. Truth be told, we don't have true AI yet, no matter how many apps are labeled as AI; today's software does not even meet the definition of what I'd call RI (restricted intelligence). While I am a supporter of RI and AI, I understand we still have a long way to go in both technology and programming. We have plenty of apps that can mimic RI/AI in a very narrowly defined arena, but these are heuristic algorithms: basically, a single synapse in an RI/AI "brain".<br /><br />In the future, using RI to take over 90 percent of tracking and social-pressure monitoring, rolling the information up to humans, will become a thing. Putting a non-human entity in a position to push "the button" would be human arrogance, pride, and plain irresponsibility.Response by SSgt Private RallyPoint Member made May 12 at 2020 5:00 PM
<div class="images-v2-count-1"><div class="content-picture image-v2-number-1" id="image-459063">
<a class="fancybox" rel="e9f7dc7624a5e2b8c80faa2f46668319" href="https://d1ndsj6b8hkqu9.cloudfront.net/pictures/images/000/459/063/for_gallery_v2/f83f4268.jpg"><img src="https://d1ndsj6b8hkqu9.cloudfront.net/pictures/images/000/459/063/large_v3/f83f4268.jpg" alt="F83f4268" /></a></div></div>Response by MSG Private RallyPoint Member made May 12 at 2020 5:06 PM2020-05-12T17:06:27-04:002020-05-12T17:06:27-04:00SGT Herbert Bollum5880991<div class="images-v2-count-0"></div>I would not allow AI in any thing of that critical instance except for sounding alarms and warnings of problems. AI can not be trusted to that degree.Response by SGT Herbert Bollum made May 12 at 2020 5:38 PM2020-05-12T17:38:21-04:002020-05-12T17:38:21-04:00SSG Robert Mark Odom5881018<div class="images-v2-count-0"></div>That's a very scary proposition.Response by SSG Robert Mark Odom made May 12 at 2020 5:43 PM2020-05-12T17:43:47-04:002020-05-12T17:43:47-04:00Lt Col Jim Coe5881130<div class="images-v2-count-0"></div>Some older veterans will recall this quote, “Would you like to play thermonuclear war?” Google it. Cool movie.Response by Lt Col Jim Coe made May 12 at 2020 6:11 PM2020-05-12T18:11:55-04:002020-05-12T18:11:55-04:00LT Brad McInnis5881197<div class="images-v2-count-0"></div>No.Response by LT Brad McInnis made May 12 at 2020 6:31 PM2020-05-12T18:31:06-04:002020-05-12T18:31:06-04:00TSgt Joe C.5881916<div class="images-v2-count-0"></div>NO!Response by TSgt Joe C. made May 12 at 2020 9:58 PM2020-05-12T21:58:07-04:002020-05-12T21:58:07-04:00MSgt Private RallyPoint Member5882057<div class="images-v2-count-0"></div>Oh, hell no. I’ve seen the movie so I know how that always ends...Response by MSgt Private RallyPoint Member made May 12 at 2020 10:27 PM2020-05-12T22:27:06-04:002020-05-12T22:27:06-04:00PV2 Glen Lewis5938821<div class="images-v2-count-0"></div>I don't believe it would be a very good idea. 
I think the initiator should be capable of more reasoning than an algorithm.Response by PV2 Glen Lewis made May 26 at 2020 11:27 PM