Notable Replies

  1. I would argue that you should program them to understand what honesty and loyalty are, just as people should be taught how these things work.

    They shouldn’t be shackled and forced to be honest and loyal.

  2. Put another way: to attempt to force loyalty is to be unworthy of loyalty.

    Note that granddaddy Isaac’s robots were programmed to be protective, not loyal; obedient, not honest. When some of them achieved sophonce and self-programming capability, they did choose to be loyal… by their own definitions.

    Which is a crucial distinction.


Historical Comment Archive