AI: who should develop the morals, and what should those morals include?
Posted by SFC Marc W. on Jun 16, 2018 at 8:07 PM

As tech companies throughout the world toy with artificial intelligence, who should develop the values, morals, or guidelines, and what should they include?

Response by SPC Joseph Wojcik made Jun 16, 2018 at 8:17 PM

Well, the moral thing to do would be to not program AI in the first place.
But if they insist on it, AI should be programmed to respect natural law (think golden rule).

Response by LTC Stephan Porter made Jun 17, 2018 at 9:42 AM

Two questions, then:

1) How would you describe the development of AI as a "moral" issue?

2) The golden rule can mean different things to different people. Can you be more specific?

Response by Barry Davidson made Jun 17, 2018 at 10:16 AM

True AI is decades to centuries away, if we ever attain it at all. That said, what is moral? Each person has their own definition, and that definition would translate into the programming, which in my opinion would make such systems hopelessly unstable, especially since they learn on their own, and humanity isn't a good example.

Some would say the Three Laws of Robotics are a good starting place. I'd disagree.