AI: who should develop the morals and what should those morals include?
https://www.rallypoint.com/answers/ai-who-should-develop-the-morals-and-what-should-those-morals-include

As tech companies throughout the world toy with artificial intelligence, who should develop the values, morals, or guidelines, and what should they include?
SFC Marc W. — Sat, 16 Jun 2018 20:07:48 -0400

Response by SPC Joseph Wojcik, Jun 16, 2018, 8:17 PM
https://www.rallypoint.com/answers/ai-who-should-develop-the-morals-and-what-should-those-morals-include?n=3717882&urlhash=3717882

Well, the moral thing to do would be not to program AI in the first place.
But if they insist on it, AI should be programmed to respect natural law (think golden rule).
SPC Joseph Wojcik — Sat, 16 Jun 2018 20:17:45 -0400

Response by LTC Stephan Porter, Jun 17, 2018, 9:42 AM
https://www.rallypoint.com/answers/ai-who-should-develop-the-morals-and-what-should-those-morals-include?n=3719005&urlhash=3719005

Two questions then:

1) How would you describe the development of AI as a "moral" issue?

2) The golden rule can mean different things to different people. Can you be more specific?
LTC Stephan Porter — Sun, 17 Jun 2018 09:42:26 -0400

Response by Barry Davidson, Jun 17, 2018, 10:16 AM
https://www.rallypoint.com/answers/ai-who-should-develop-the-morals-and-what-should-those-morals-include?n=3719125&urlhash=3719125

True AI is decades to centuries away, if we ever attain it at all. That said, what is moral? Each person has their own definition, and that definition would carry over into the programming. In my opinion, that would make such systems hopelessly unstable, especially since they learn on their own and humanity isn't a good example.

Some would say the Three Laws of Robotics are a good starting place. I'd disagree.
Barry Davidson — Sun, 17 Jun 2018 10:16:25 -0400