"Turrets do not need crew to fire."

Non-Turrets don't, but Turrets do. Presumably, the non-Turrets are aimed and fired by the pilot.
How much information does the Superhuman AI have? It's not omniscient, especially if it isn't connected to the net, nor is it omnipotent.
Who says the 3 laws are "the sole moral code for AIs"?
Here's an interesting bit of text:

"A designed intelligence without rigid restraints and safeguards in place upon its programming can be a truly terrifying monster, as the Carissia Cruiseliners incident at the start of the 13th Planet Rush attests. At the very least, all designed intelligences must be created with Asimov's laws firmly in place to avoid danger to themselves and those around them."

(Emphasis mine.) Are you reading this as 'the law requires AI-Ds to be built with Asimov's Laws'? To me, this says AIs should be created with safeguards (namely Asimov's laws or similar), not that they necessarily are.
"A.I.-Gs experience the whole range of human emotion: they can be happy, feel fear and doubt, and even love. Further, each A.I.-G requires a great deal of hand-crafting by talented programmers who replace the burgeoning intelligence's biological imperatives with new ones according to its intended purpose."

So A.I.-Gs do feel emotion, and they can be made not to want to abuse loopholes even if they find them.
Have you ever read Freefall? Florence is a 'Bowman's Wolf', a wolf uplifted to human-like intelligence, and has a bunch of built-in safeguards. The sequence beginning here is a good example.
Humans also have safeguards, empathy among them.