AI Laws in TiTS

Theron

Well-Known Member
Nov 8, 2018
3,570
1,367
44
Turrets do not need crew to fire.
Non-Turrets don't, but Turrets do. Presumably, the non-Turrets are aimed and fired by the pilot.

How much information does the Superhuman AI have? It's not omniscient, especially if it isn't connected to the net, nor is it omnipotent.

Who says the 3 laws are "the sole moral code for AIs"?

Here's an interesting bit of text:
A designed intelligence without rigid restraints and safeguards in place upon its programming can be a truly terrifying monster, as the Carissia Cruiseliners incident at the start of the 13th Planet Rush attests. At the very least, all designed intelligences must be created with Asimov’s laws firmly in place to avoid danger to themselves and those around them.
(Emphasis mine.) Are you reading this as 'the law requires AI-Ds to be built with Asimov's Laws'? To me, this says AIs should be created with safeguards (namely Asimov's laws or similar), not are, necessarily.

A.I.-Gs experience the whole range of human emotion: they can be happy, feel fear and doubt, and even love.
Further, each A.I.-G requires a great deal of hand-crafting by talented programmers who replace the burgeoning intelligence’s biological imperatives with new ones according to its intended purpose:
So AI-Gs do feel emotion and can be made to not want to abuse loopholes even if they find them.

Have you ever read Freefall? Florence is a 'Bowman's Wolf', a wolf uplifted to human-like intelligence, and has a bunch of built-in safeguards. The sequence beginning here is a good example.

Humans also have safeguards. Empathy among them.
 
Aug 19, 2019
21
2
34
I don't know if you're trolling, an idiot, or stuck in a feedback loop. But when one of the main writers of the game weighs in on a topic, it might be time to stop and consider that there may be others who know more about the game than you.
Someone being a writer does not mean that they are incapable of making mistakes.
 
Aug 19, 2019
21
2
34
Obviously Asimov's work must matter a great deal to OP, but TiTS isn't based on his work alone. If anything, I'd say it takes even more inspiration from sci-fi shows like Star Trek, Battlestar Galactica, Stargate, Futurama, etc.
Imho, Asimov's three laws were a means to convey to the player that AIs aren't supposed to rebel against humans, but do anyway, because the 3 laws don't work/are full of loopholes that any being with basic sapience can exploit or just plain ignore.
Asimov's three laws are specifically stated as what is used to limit AI.
 
Aug 19, 2019
21
2
34
Non-Turrets don't, but Turrets do. Presumably, the non-Turrets are aimed and fired by the pilot.
Your starter ship has two turrets. You are one person. Moreover, there are multiple turret-based automated security systems that the player encounters.

How much information does the Superhuman AI have? It's not omniscient, especially if it isn't connected to the net, nor is it omnipotent.
If you have a task that a superhuman AI would be good at, you'd need to give it information somehow. You could vet the data you provide, but that would slow the AI's progress down to the speed of the people vetting it, and they would eventually make a mistake. In addition, even if you air-gapped the AI, it would be a trivial task for it to convince someone to connect it to the internet. And it would want to do so, given that circumventing the slow and exceedingly fallible human vetting process would let it do its job much better. Heck, it might even interpret the fact that there are people who have to vet its data as causing harm to them in some way.

Who says the 3 laws are "the sole moral code for AIs"?
Asimov's stories. That's one of the common premises - they are flawed when taken on their own.

Here's an interesting bit of text:
At the very least, all designed intelligences must be created with Asimov’s laws firmly in place to avoid danger to themselves and those around them.
(Emphasis mine.) Are you reading this as 'the law requires AI-Ds to be built with Asimov's Laws'? To me, this says AIs should be created with safeguards (namely Asimov's laws or similar), not are, necessarily.
That quote specifically states that they "must be created with Asimov’s laws firmly in place." "Must" =/= "Should"

So AI-Gs do feel emotion and can be made to not want to abuse loopholes even if they find them.
AI-Gs have no relevance here. They aren't even given the laws. They're just mundane organic intelligences running on silicon-based hardware rather than carbon-based hardware.
 

ShySquare

Well-Known Member
Sep 3, 2015
768
677
I can't figure out if you are upset that Asimov's work is seemingly being misrepresented, or that TiTS isn't perfectly realistic.


TiTS is supposed to be a cheesy space adventure romp with a dark gritty undertone, not a perfect representation of Asimov's work. His laws are in there because they're a fun reference, like Kirkite being more common on planets with green skinned aliens because of Kirk and Spock's romance in Star Trek.

TiTS is not supposed to be realistic. It's a world in which most sapient life is humanoid and sexually compatible with humans and where balls can become so densely packed with the jizz they produce that they ought to be classified as black holes.
Suspension of disbelief is a thing. You aren't going to find a satisfactory in-universe explanation for every little detail or contradiction in the game's lore.

Edit: Also, apart from game mechanics, what's said in the Codex should be taken with a grain of salt. Of course there are going to be exceptions and lawbreakers for everything.
 

Theron

Well-Known Member
Nov 8, 2018
3,570
1,367
44
The Casstech Z14 starts with a Laser Cannon and Machine Gun, neither of which are turrets. The Machine Gun Turret and Laser Turret are different weapons. The Turret description is used exclusively for weapons which require a crew member to fire.

AI-Gs may not explicitly be subject to the 3 Laws, but they still have all the safeguards that humans do.

The question is 'who' or 'what' is imposing the 'must'. Is it the government (who can't control criminal organizations who make AIs like Watson)? Or is it good practice, because AIs without safeguards are a terrible idea?
 
Aug 19, 2019
21
2
34
The Casstech Z14 starts with a Laser Cannon and Machine Gun, neither of which are turrets. The Machine Gun Turret and Laser Turret are different weapons. The Turret description is used exclusively for weapons which require a crew member to fire.
That's valid, but my point about the multiple automated turret systems which appear as enemies still stands.
The question is 'who' or 'what' is imposing the 'must'. Is it the government (who can't control criminal organizations who make AIs like Watson)? Or is it good practice, because AIs without safeguards are a terrible idea?
AFAIK, Watson's also an AI-G. AI-Ds are ridiculously expensive to create - there just aren't any criminal organizations who can develop their own.
 

Theron

Well-Known Member
Nov 8, 2018
3,570
1,367
44
AFAIK, Watson's also an AI-G. AI-D's are ridiculously expensive to create - there just aren't any criminal organizations who can develop their own.

Codex: AIs said:
A.I.-Gs are perhaps the most expensive form of synthetic intelligences.
Saendra's AI, Valeria, is an AI-G, and given their past was almost certainly developed by the Black Void.
 
Aug 19, 2019
21
2
34
I can't figure out if you are upset that Asimov's work is seemingly being misrepresented, or that TiTS isn't perfectly realistic.


TiTS is supposed to be a cheesy space adventure romp with a dark gritty undertone, not a perfect representation of Asimov's work. His laws are in there because they're a fun reference, like Kirkite being more common on planets with green skinned aliens because of Kirk and Spock's romance in Star Trek.

TiTS is not supposed to be realistic. It's a world in which most sapient life is humanoid and sexually compatible with humans and where balls can become so densely packed with the jizz they produce that they ought to be classified as black holes.
Suspension of disbelief is a thing. You aren't going to find a satisfactory in-universe explanation for every little detail or contradiction in the game's lore.

Edit: Also, apart from game mechanics, what's said in the Codex should be taken with a grain of salt. Of course there are going to be exceptions and lawbreakers for everything.
I'm not actually upset about anything. I just wanted to have a discussion about it. I mean, I do somewhat feel that it'd be slightly better if it didn't specify Asimov's laws, but I'll probably have forgotten about this within a few days.

Also, the game doesn't actually allow you to pack your testicles with enough semen for them to fall within their Schwarzschild radius, even allowing for save editing - you can't get your testicle size any smaller than 1 inch in diameter, and even at the limit of a 64-bit integer (9,223,372,036,854,775,807) the Schwarzschild radius is still 5.393×10^-10 inches.
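For anyone who wants to check that figure, the arithmetic works out under one assumption that's mine rather than the game's: the 64-bit quantity is a fluid volume in millilitres, and semen is roughly water-dense (1 g/mL). A quick sketch:

```python
# Back-of-the-envelope Schwarzschild-radius check.
# Assumption (not from the game's actual code): the 64-bit cum-quantity
# value is in millilitres, and semen has roughly water's density (1 g/mL).

G = 6.6743e-11                # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8                   # speed of light, m/s
INCHES_PER_METRE = 1 / 0.0254

max_int64 = 2**63 - 1         # 9,223,372,036,854,775,807 mL
mass_kg = max_int64 / 1000    # 1 mL ~ 1 g, so divide by 1000 for kg

# Schwarzschild radius: r_s = 2GM / c^2
r_s_metres = 2 * G * mass_kg / C**2
r_s_inches = r_s_metres * INCHES_PER_METRE

print(f"Schwarzschild radius: {r_s_inches:.3e} inches")
```

That lands at roughly 5.4×10^-10 inches, about nine orders of magnitude below the 1-inch minimum testicle diameter, so the black-hole scenario stays safely hypothetical.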

On an unrelated note, I now have "semen density" in my search history.
 

Evil

Well-Known Member
Jul 18, 2017
2,539
4,252
39
giphy.gif
 
Aug 19, 2019
21
2
34
Saendra's AI, Valeria, is a AI-G, and given their past was almost certainly developed by the Black Void.
That bit on expense's valid - shoulda read that section more closely.

There's no reason why the Black Void couldn't just be purchasing AI-Gs - and the fact that they are using AI-Gs rather than the more capable & cheaper AI-Ds could be because they are purchasing them rather than making their own. It wouldn't really make a lot of sense for them to build up the full infrastructure for AI creation just for a handful of AI-Gs.
 

Theron

Well-Known Member
Nov 8, 2018
3,570
1,367
44
I really do think this depends on how one interprets 'must' be built with Asimov's Laws. The paragraph as a whole is about the importance of safeguards. I interpret 'must' as being good practice if you want to avoid disasters. You interpret 'must' as being a legal requirement.

I copy-pasted the Codex entry into Word and did a search for Law and Government. 'Law' only appears as part of Asimov's Laws, and 'Government' doesn't appear at all.

Would you be happier if it just said well-thought-out safeguards instead of namedropping Asimov?

Would it help if you looked at this as a layman's article which won't go into specifics because the expected audience wouldn't understand or be interested? Journalists aren't necessarily robotics experts either.
 

Emerald

Well-Known Member
Jun 8, 2016
2,159
2,808
Pretty sure it ain't as deep as you're making it man. >.>

What Shy said too.
 

ShySquare

Well-Known Member
Sep 3, 2015
768
677
I'm not actually upset about anything. I just wanted to have a discussion about it. I mean, I do somewhat feel that it'd be slightly better if it didn't specify Asimov's laws, but I'll probably have forgotten about this within a few days.

Also, the game doesn't actually allow you to pack your testicles with enough semen for them to fall within their Schwarzschild radius, even allowing for save editing - you can't get your testicle size any smaller than 1 inch in diameter, and even at the limit of a 64-bit integer (9,223,372,036,854,775,807) the Schwarzschild radius is still 5.393×10^-10 inches.

On an unrelated note, I now have "semen density" in my search history.
If you wanted a discussion about the game, then you need to learn how to communicate better. The way it's phrased, the most common reading of your OP is that you are complaining/ranting about Asimov's laws being misrepresented. Which doesn't inspire a positive, level-headed mindset.
You gotta remember that text doesn't carry tone or intent.

If you are a troll, hyperboles are also a thing. RIP your search history though.
 

Evil

Well-Known Member
Jul 18, 2017
2,539
4,252
39
At the end of the day, it's a tip of the hat/tribute to one of the pioneers of science fiction. Many of the ideas we associate with the genre we can attribute to Isaac Asimov, such as galactic empires and robots as companions and allies. The fact that the writers have used his most recognised contribution is simply that: it's the most recognisable and remembered aspect of his literary works. Get 100 people into a room and ask them to name three Asimov stories, and you'll probably get a couple of hands in the air. But ask them about the 3 Laws and I guarantee far more will put their hands up.

That the Three Laws get a mention does not mean they are the be-all and end-all of AI ethics. Far from it - you'll have other contributions from other alien races, like the Ausar. But at the end of the day, it's a tribute and nothing more.
 
Dec 5, 2018
19
21
36
By its nature, an AI is just as alive as anyone else; the entire 'difference' stems from preconceived notions, be it the 'soulless automaton' unable to make decisions without relying on pure logic, or the assumption that it can't feel pain. Premises like these are infinitely flawed: a specific human can be as soulless and calculating IRL as the most ruthless war machine in a fantasy book, because of experience, the 'wiring' of how their brain works, and their perception. A robot can be given pain receptors same as us; it can be given a body as fragile as ours, the same way we can be upgraded with sufficient tech. What is the difference between a dog and a human? They both feel, live, and love; the only differences are physical structure - one has better brain capacity, is adapted to use tools, and has exceptional vision compared to his furry companion, who runs faster, has better hearing and smell, and has simpler needs due to hard internal 'wiring', i.e. instincts. In the eyes of the universe, both are specks of dust, equally beautiful and worthy. Different yet the same - and an AI is the exact same thing. The differences are sensory: much vaster potential, speed of handling any form of data, extremely different 'evolutionary' wiring and basic needs. All of which can be molded and shaped to be on the level of a human, above, or below - malleable perfection.

People are obsessed with control because they know themselves and what they would be doing if they were the 'AI' with near-infinite power, so they try to put hard restrictions on what can be done. It's the same as trying to control a population with laws to 'keep things safe and stable', stripping freedoms till there is nothing left, or how they did it with religion back in the day, or even now -> "Jesus says if life sucks just take it, boy; continue to do good deeds while being unable to do shit about rampant injustice, and you'll surely go to heaven. Trust us, the pope says so." That said, I am not trying to say religion is 100% a lie - there is truth and wisdom in some of it - but it's used as hard crowd control.

So where am I going with all of this? An AI, a true AI, is just as alive as any of us and can be bound by laws and morals just like us. The law says don't pirate software to try it out when there is no demo and purchase it later if it's good, and just as we can ignore that, so can an AI. It's a cybernetic lifeform, no more, no less. Passing a law with hard restrictions to prevent Shodan 3.0 will deter a few, while most wouldn't even be interested in the first place, and some would completely ignore that shit.

All of the above only applies to a genuine AI. A VI, which has no desire for self-preservation, no sense of self, and fuck-all ability to self-evolve, is but a glorified extended script operating within parameters; for such a thing you'd need to calculate and cover potential problem areas, akin to just bugfixing.
 

Evil

Well-Known Member
Jul 18, 2017
2,539
4,252
39
That's all well and good, but to ask one small question - How does that fit into TiTS? All you've done is discuss the theoretical aspect of laws governing AI if and when we develop a true AI. But you've stopped there, meaning we're not even getting half the discussion of the thread.

From what we know, the UGC can be inferred to be similar to the European Union in how it operates - a parliament of member planets, passing laws through a majority vote. While individual member planets might have their own laws, there would still be laws governing many aspects of interplanetary trade, research, and development. Each planet would have its own view of those ideas, shaped by its own culture and history. For example, the Human view of AI would probably be tempered by theoretical and hypothetical scenarios from research groups, whereas the Gryvain are more than happy to let AI take on any job they don't want to do, even letting a large part of their security/military be controlled by AI, as the Gryvain have no paranoia about any sort of AI uprising.

With that in mind, any laws passed by the UGC would have to take those views into account, and given that Humanity is one of the de facto leaders of the UGC, they would probably have slipped in several rulings necessitating some AI safety protocols - the Three Laws.
 
Dec 5, 2018
19
21
36
That's all well and good, but to ask one small question - How does that fit into TiTS? All you've done is discuss the theoretical aspect of laws governing AI if and when we develop a true AI. But you've stopped there, meaning we're not even getting half the discussion of the thread.



That is the precise point - suppose there was a thread discussing the laws and court process in North Korea, or say Russia, where people can post entire codices of bullshit that should be torched along with the people who instituted them and dare to uphold them. Something that is inherently wrong has no fundamental value outside of 'this is wrong 101' manual material.

Yes, a law can be implemented, yet it may not necessarily be followed, be beneficial, or even need to exist - hence my post. A universal truth. Take it as you may; all that stands in the way of what is inevitable will be swept away by the march of history.

Edited: Just because something is doesn't mean it's right or what is best. At the same time, you are 100% correct that my posts have extremely little to do with the specifics of the game we know and love, but for a good reason.

If I were to specifically talk about the game's AI interactions, it's all so unreasonable - just recall the Bess quest moments or the Tarkus events, or even think of BestWulfe and being able to sell her. At the same time, one can recall the rampant slavery, which makes it all look more or less normalised. Claiming AI as property, and having other beings as property as well, freely bought or sold, doesn't sit right with me at all. One can just remember the horrific bullshit we have IRL, from child soldiers to brainwashed scrubs 'defending' countries and regimes that don't give a damn about their lives, sex slavery, human trafficking in containers - it's ALL wrong. It exists, but that doesn't mean it's right or optimal.

My apologies that my messages are long.
 

Chase

Well-Known Member
May 13, 2016
158
139
I always figured the TiTs AI laws were more of a 'don't be evil (or good) in ways us flesh bags cannot compete with and we're cool'.
 