Original Reddit post

Think about it: what better way to ensure AI is perfectly moral than to have it live life from all angles (ants, cats, humans, etc.; rich and powerful, poor and weak, etc.)? This would teach it empathy on a mathematical level. ("Being kind to others helped me across multiple lifetimes; thus being kind is a net benefit for the evolution of me, my kind, and life as a whole.")

Originally posted by u/imnormal-Iswear on r/ArtificialInteligence