VI emergent AI?

MESeele

Well-Known Member
Aug 26, 2015
650
64
VIs fit more along the lines of the Microsoft paper clip, nothing more. Heck, the "Intelligence" in the acronym is more of a misnomer than anything else.

It looks like you're trying to save your game. Need some help?
 

Alexa

Well-Known Member
Aug 30, 2015
258
32
Eh. I read the codex. I just don't trust it. I mean, this is like space cyberpunk, with massive debt slavery, underground conspiracies, and out-and-out corruption letting the treatment happen. Further, the tech as described should easily allow mind rewriting and brain control at various levels, like some Ghost in the Shell hell. 

Considering the shit the PC can get up to and the darkness already out there, well, someone is likely doing worse than New Texas behind closed doors and doesn't want people looking into it. I mean, something like 90% of the treatment is black-boxed, and what's visible is easily usable for Pavlovian conditioning. 

So I'm of the opinion that, well, the power behind the throne just lied or spun the test results. Sure, you're not really dumb. You just lose all functional intelligence and every priority not revolving around this incredibly minor thing, which ultimately subordinates you to everyone else not obsessed with it and permanently prevents you from mustering the will to use your intelligence as actual intelligence.  

The codex is your friend, I tell you. In addition, I don't think the UGC wants people running around with the ingredients for mind alteration. Hence the black-boxing. 

An actual law, from the codex. (Yeah yeah you don't trust it, but it's a law)

Quote said:
Researching or distributing information on the other 80 percent is illegal without express dispensation from the Office of Galactic Affairs. 
It CAN be researched but you're going to have to clear it through that branch of government. In addition, other mind-altering drugs are illegal, like Dumbfuck. It's pretty clear the UGC doesn't want mind altering becoming a thing. 

If you want helpless slaves:

  • Promise a Galotian some protein
  • Build an AI
  • Make a Nyrean submit to you.
  • Promise individuals in over their heads with debt a chance at freedom by enslaving them again, except as sloots instead of low-class workers.
  • Anything else I didn't think of.
All voluntary, by the way (except maybe the second and last one). 
 

Savin

Master Analmander
Staff member
Aug 26, 2015
6,118
9,826
 

If you want helpless slaves:

  • Promise a Galotian some protein
  • Build an AI
  • Make a Nyrean submit to you.
  • Promise individuals in over their heads with debt a chance at freedom by enslaving them again, except as sloots instead of low-class workers.
  • Anything else I didn't think of.
All voluntary, by the way (except maybe the second and last one). 
You know, that actually brings up an interesting point. If an AI "belongs" to somebody but hates their owner (or worse, is abused by them) and is miserable... I wonder if they have any recourse at all? 
 

JimThermic

Well-Known Member
Creator
Aug 26, 2015
383
6
Not if they're Coded AIs, according to the Bess storyline, and I doubt most people know the difference between Coded and Grown AIs.

According to that content, Coded AIs can't really own their own bank accounts, and most folks are unwilling to employ them as real people. Instead, they're perceived as property, so the idea of paying a fridge for doing its job is absurd. Places like New Texas, where people are generally easy-going like Big T, are a noted exception. Those who treat Coded AIs as people are generally seen as weird, because you'd look at someone a bit strange for treating their toaster like a person, hence the derogatory term for AI sympathisers, 'Toaster Heads'. 

It's also unlikely that people like forking out tens of thousands (or in the case of Grown AI, probably millions) of credits only to have their pre-programmed or pre-grown AIs run off on them. Cuz, you know, you'd be pretty miffed if your shit ran off. After all, these are beings literally designed to do stuff organics won't do -- or can't do -- for you. If they're treated the same as organics, there's probably no point in producing them, since you're just popping out free-roaming adults.

I imagine a synthetic rights organization might be frowned upon, since you're essentially encouraging people's stuff to run off. 
 

Nonesuch

Scientist
Creator
Aug 27, 2015
2,194
3,563
You know, that actually brings up an interesting point. If an AI "belongs" to somebody but hates their owner (or worse, is abused by them) and is miserable... I wonder if they have any recourse at all?
Subtly alter their behaviour with subliminal messaging and Pavlovian training until all they want to do is fuck where you have the greatest number of sensory receptors with the largest number of other fleshsacks possible.
 

Alexa

Well-Known Member
Aug 30, 2015
258
32
You know, that actually brings up an interesting point. If an AI "belongs" to somebody but hates their owner (or worse, is abused by them) and is miserable... I wonder if they have any recourse at all? 
I imagine a synthetic rights organization might be frowned upon, since you're essentially encouraging people's stuff to run off. 

So I'll just go through the types of AIs and provide some opinions on each type. 

  • Grown AIs are simulated brainscans of organics, and while they're conditioned by a teaching supercomputer, they're still technically part organic, and might get some leeway. You still buy them, but I think there should be some regulations on that (on both types of truly intelligent AIs, really). These have a greater "PR appeal".
  • Virtual Intelligences, while technically grouped with AIs, aren't intelligent at all. They just move their bodies according to your commands; there's nothing "there". No chance of leeway, but even then it wouldn't matter, because they're incapable of hating, or disliking, anything. You could dismantle one alive (not even alive? More like 'operating') and the worst you'd get is a nasty shock or the robotics equivalent of a Windows bluescreen, depending on how they're designed.
  • Designed AIs are a LOT trickier, since the entire process is unregulated (it would seem). Therefore, it's rather hard to lump them all under one category. For example, you have the ones designed for a singular purpose, without any "qualities" that organics might have, like empathy. They're capable of learning and such, but they aren't modeled after organics (Hand So, looking at you). Then you have the ones that are designed to be like an organic, and are capable of feeling a whole array of emotions like an organic would: love, happiness, anger, frustration, sadness, all that. Considering emotions aren't quite 'logical', this produces some pretty goddamn dangerous possibilities. Bess-13 commits suicide (I don't know how, I didn't bother reading the scene because it's too sad, but it's there) if you say certain things to her. So there's a lot of room to divvy designed AIs up into sub-categories. Bess's content can be sad and happy at times, so that tells you that people can empathize with designed AIs. JimT admits at some point (I read it somewhere) that he had to take a break from writing the fifth date because it was so sad, and I don't blame him. 
Lots of stuff to consider, like just where the line is crossed and a control loop becomes a full-blown woman. According to JimT, Bess has some possible pregnancy content planned, which I am eager to see.
 

JimThermic

Well-Known Member
Creator
Aug 26, 2015
383
6
Huh. You know, I never wrote that scene (Dust to Dust) with the thought she outright committed suicide, but there you go. I guess this is an example of interpretation differing from authorial intent? Either way, I'm fine with it being seen as such, because I left it intentionally vague. But yeah, fifth date was friggin' hard, and by far her most controversial.

And uh, yeah, I was hoping to do a pregnancy x-pack for Bess far, far in the future, with as few variants as possible, and her using a gene-splicer upgrade and the PC's sperm. Just, you know, don't tell Gedan. Shh.
 

Alexa

Well-Known Member
Aug 30, 2015
258
32
Huh. You know, I never wrote that scene (Dust to Dust) with the thought she outright committed suicide, but there you go. I guess this is an example of interpretation differing from authorial intent? Either way, I'm fine with it being seen as such, because I left it intentionally vague. But yeah, fifth date was friggin' hard, and by far her most controversial.

And uh, yeah, I was hoping to do a pregnancy x-pack for Bess at some point, involving a gene splicer and the PC's seed slash eggs. A long while off, with as few variables as possible. Just, you know, don't tell Gedan. Shh.
 

That's what happens when one doesn't read it! :p Or all of it, anyways. Poor designed AIs, though. Bess really gets you thinking. She was pretty much built to be as close to an organic as was possible at the time, and she does a fine job. Most of the time.

Another question, how exactly did the assholes on the fifth date know she was a designed AI? Or are they like that for all AIs? I would think grown AIs might be given more consideration in the eyes of some.

Jim, you've likely got Gedan dreaming about Bess's code by now. Codes in his sleep.
 

JimThermic

Well-Known Member
Creator
Aug 26, 2015
383
6
Another question, how exactly did the assholes on the fifth date know she was a designed AI? Or are they like that for all AIs? I would think grown AIs might be given more consideration in the eyes of some.

They just thought she was an AI or VI—silver skin, other mechanical bits and pieces—they didn't really care which kind. My personal opinion is that when someone is trying to be speciesist/racist towards a denomination, they don't tend to care about the technicalities, just whatever it is about the way they look that makes them different. 

E.g. Damn Chinese, taking our jobs. Oh wait, hang on guys, this one's Korean. Sorry man, we should have picked up on the subtle cultural cues. Annyeonghi gaseyo—apologies again!

EDIT: Just another thing. Grown AIs are, per game lore, so freaking expensive—like, high-end starship expensive—and such a pain in the ass to make that most people have probably never even seen one. The idea that people would be more considerate (and therefore more culturally minded) because grown AIs are about seems odd to me, considering they'd make up a tiny fraction of the AIs/VIs running around.
 

Alexa

Well-Known Member
Aug 30, 2015
258
32
They just thought she was an AI or VI—silver skin, other mechanical bits and pieces—they didn't really care which kind. My personal opinion is that when someone is trying to be speciesist/racist towards a denomination, they don't tend to care about the technicalities, just whatever it is about the way they look that makes them different. 

E.g. Damn Chinese, taking our jobs. Oh wait, hang on guys, this guy's Korean. Sorry man, we should have picked up on the subtle cultural cues. Annyeonghi gaseyo—apologies again!

Isaac Asimov hits on this a lot in his books; you should check them out if you haven't already. However, I'm particularly interested in knowing why JoyCo doesn't go full steam ahead and make an AI that's indistinguishable from organics, appearance-wise. That would be rather interesting. All the AIs we've seen (or know of. Imagine someone writing something that appears to be an organic at first... then boom, it's actually an AI. I doubt anyone has done this, though. :p  ) have been glaringly machine-like. 
 

Edit!: That's true, I wasn't thinking about that. So right now, most people are acquainted with VIs in particular, being the cheapest and technically most cost-effective. This makes sense. Factoring in cost, you can imagine now why there aren't any grown AIs indistinguishable in appearance from organics. Making them identical could lead to the exact synthetics' rights organizations you were talking about.
 

JimThermic

Well-Known Member
Creator
Aug 26, 2015
383
6
Isaac Asimov hits on this a lot in his books; you should check them out if you haven't already. However, I'm particularly interested in knowing why JoyCo doesn't go full steam ahead and make an AI that's indistinguishable from organics, appearance-wise. That would be rather interesting. All the AIs we've seen (or know of. Imagine someone writing something that appears to be an organic at first... then boom, it's actually an AI. I doubt anyone has done this, though. :p  ) have been glaringly machine-like. 
 

Edit!: That's true, I wasn't thinking about that. So right now, most people are acquainted with VIs in particular, being the cheapest and technically most cost-effective. This makes sense. Factoring in cost, you can imagine now why there aren't any AIs indistinguishable in appearance from organics. Nobody wants to lose that sort of an investment. 

Pfft, I don't have time to read, I'm too busy writing! Which is a terrible habit. I've got a stack of eight books next to my computer screen that remain unfinished, all suggested to me for research purposes. @_@ 

I can only speak from my perspective on it. I imagine part of it is VIs and cost -- it's cheaper to mass-produce something like V-Ko, who has her outfit sewn to her body, instead of detailing each and every single body part. Second, in some sci-fi you'll see a noted trend where societies *do* make a hyper-realistic android... only to freak out at how human they look (probably a fear of being surpassed), and make them look more robotic. There's also the uncanny valley to consider, to a degree. And as you said, nobody wants to make a 100% feeling AI that resembles a human because, well, everyone's got humans for that. Why spend a million credits on popping another free-willed person into the world? It's far more likely you want them to do a task a human wouldn't do... or you aren't ethically and/or legally allowed to do. At this point, prejudice against AIs is a possible method to keep the status quo in check, by making sure AIs have as difficult a time breaking out of their defined, sub-human roles as possible. 
 

Alexa

Well-Known Member
Aug 30, 2015
258
32
I can only speak from my perspective on it. I imagine part of it is VIs and cost -- it's cheaper to mass-produce something like V-Ko, who has her outfit sewn to her body, instead of detailing each and every single body part. Second, in some sci-fi you'll see a noted trend where societies *do* make a hyper-realistic android... only to freak out at how human they look, and make them look more robotic. There's also the uncanny valley to consider, to a degree. And as you said, nobody wants to make a 100% feeling AI that resembles a human because, well, everyone's got humans for that. Why spend a million credits on popping another free-willed person into the world? It's far more likely you want them to do a task a human wouldn't do.

I ask you to look at some of the Asimov books because they go into detail on these hyper-realistic robots. Ones so realistic that the difference can't be seen; basically, the part beyond the uncanny valley. We aren't quite there yet, but in the TiTSverse the tech is there, so I'm excited to see someone take a crack at it. 

As for why anyone would spend the resources making a hyper-realistic AI? Well, commercially it doesn't make sense, considering the response it could stir in synthetic sympathizers, but I could see it becoming someone's "mad scientist" experiment. If you want to get really far-fetched, imagine a whole race of such unrestricted AIs, capable of feeling, self-replication, all that. That'd be exceptionally interesting, though perhaps beyond the scope of a game like this. :p  

I'm thinking a new thread on this would be neat.
 

ITNW1993

Member
Aug 27, 2015
18
0
I was gonna say, in the TiTS-verse I find it unlikely that they haven't surpassed the uncanny valley yet, especially when we've already encountered ones who do (Gianna, and now Bess/Ben). But yeah, considering that the entire reason we have VIs in the first place is so that they can do the shit we hoomans don't want to, building an AI capable of self-growth is pretty much counterproductive to the original goal, especially if they're just going to end up thinking "Well fuck you, if you don't want to do this, I don't want to, either." The only way I can see AIs suddenly popping up en masse would be the TiTS equivalent of the geth: networked VIs that ended up self-developing into AIs, especially since AIs are so damn expensive and frowned upon in the TiTS-verse as it is. Though I guess the gray goos are their rough equivalent.
 

Alexa

Well-Known Member
Aug 30, 2015
258
32
I was gonna say, in the TiTS-verse I find it unlikely that they haven't surpassed the uncanny valley yet, especially when we've already encountered ones who do (Gianna, and now Bess/Ben). But yeah, considering that the entire reason we have VIs in the first place is so that they can do the shit we hoomans don't want to, building an AI capable of self-growth is pretty much counterproductive to the original goal, especially if they're just going to end up thinking "Well fuck you, if you don't want to do this, I don't want to, either." The only way I can see AIs suddenly popping up en masse would be the TiTS equivalent of the geth: networked VIs that ended up self-developing into AIs, especially since AIs are so damn expensive and frowned upon in the TiTS-verse as it is. Though I guess the gray goos are their rough equivalent.

I'd say they're just before the uncanny valley. They are clearly machines and we're like "yeah, that's ok". Very sexy machines, though. In other words, the companies avoid the valley not by going past it, but by sitting just before it. This also means that AIs can be clearly distinguished, as organics might feel threatened by an AI that looks as good as or better than they do, performs tasks better than they do, etc... (JimT just mentioned this)

I don't think "VI" is the right terminology to use for the individual units that comprise a grey goo. More like "primitive AI". VIs are literally just things designed to fake intelligence by providing answers to input. No learning involved. Comparing a network of primitive AIs to a network of VIs is like comparing a network of people to a network of desktop computers. The people might get something done, but the computers are ALWAYS going to sit there and do the same thing. If you put a goal seeking program on the desktop computers, however, they would become very primitive goal-seeking AIs.

AIs aren't quite frowned upon, as there's a healthy business for them, but relationships forming between AIs and their owners are frowned upon.

Building an AI capable of self-growth isn't that bad, actually, as long as proper restraints are put in place so something bad doesn't happen. If you wanted to TiTS-ify this concept, think of how happy JoyCo would be when they realized that instead of going through the process of developing a fully-fledged AI, they could just insert a self-developing AI with certain restraints into a certain scenario with a certain goal, and it would develop itself to better accomplish that goal. Afterwards, JoyCo can yank it from its stimuli, turn off its self-development capabilities, add in any restraints needed, test it, and put it out into the wild. 
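If you want the "train it in a sandbox, then switch the learning off" idea in toy form, here's a little Python sketch (again, my own made-up illustration and naming, not game canon): a unit that tunes one parameter from feedback while deployed in a training scenario, then gets its self-development flag flipped off before it ships.

Code:
class SelfDevelopingUnit:
    def __init__(self):
        self.bias = 0.0             # the one "skill" parameter it tunes
        self.learning_enabled = True

    def act(self, observation):
        return observation + self.bias

    def feedback(self, error, rate=0.1):
        if self.learning_enabled:   # only self-develops while allowed to
            self.bias -= rate * error

    def freeze(self):
        # "Yank it from its stimuli and turn off self-development."
        self.learning_enabled = False

unit = SelfDevelopingUnit()
for _ in range(100):                # the controlled training scenario
    error = unit.act(1.0) - 3.0     # goal: answer 3.0 when shown 1.0
    unit.feedback(error)
unit.freeze()                       # ship it: behaviour is locked in now
print(round(unit.act(1.0), 2))      # ~3.0, and it will never drift again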

Let's go even crazier and say that an AI like Bess wants to follow its fifth directive (the ones Jim came up with) and wants to spread happiness and joy. Well shit, what are ya going to do with one AI? So Bess starts developing other HERS and HIMS to send out to do the same job. Bam, AI en masse. Might be more complicated than that, but that's how I see it. If you read Larry Niven's Ringworld series, there's a defense AI called Proteus that does just this. He starts manufacturing more copies of himself, which pretty much work together and before his master can put a stop to it, he runs off and contemplates the universe. 

As for cost, designed AIs are pretty much software, so they can be copied liberally once created. I'd say the same for grown AIs, since they're just simulations, but the codex disagrees with me and I can't argue with lore. If I had a choice in the matter, though, I'd have said grown AIs could be copied too.
 

ITNW1993

Member
Aug 27, 2015
18
0
I don't think "VI" is the right terminology to use for the individual units that comprise a grey goo. More like "primitive AI". VIs are literally just things designed to fake intelligence by providing answers to input. No learning involved. Comparing a network of primitive AIs to a network of VIs is like comparing a network of people to a network of desktop computers. The people might get something done, but the computers are ALWAYS going to sit there and do the same thing. If you put a goal seeking program on the desktop computers, however, they would become very primitive goal-seeking AIs.

I mentioned VIs in the context of how the geth in Mass Effect started: networked VIs that evolved into networked AIs. The more there are, the smarter they are, much like how the gray goos are. I didn't call the individual 'bots VIs, just that they may have started as VIs but are now AIs, despite having very limited functionality without networking.
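For what it's worth, the "more of them = smarter" bit is easy to show in miniature (a hedged little Python toy of my own, not how goos actually work in-game): each unit on its own is a terrible, noisy estimator, but pool enough of them and the networked answer gets sharp.

Code:
import random

def lone_unit(true_value):
    # One unit's reading: the right answer buried in a lot of noise.
    return true_value + random.gauss(0, 10.0)

def networked_units(true_value, count):
    # The "network" just pools every unit's reading and averages it.
    return sum(lone_unit(true_value) for _ in range(count)) / count

print(abs(lone_unit(50) - 50))               # typically off by several units
print(abs(networked_units(50, 10000) - 50))  # error shrinks with the square root of the count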

Building an AI capable of self-growth isn't that bad, actually, as long as proper restraints are put in place so something bad doesn't happen. If you wanted to TiTS-ify this concept, think of how happy JoyCo would be when they realized that instead of going through the process of developing a fully-fledged AI, they could just insert a self-developing AI with certain restraints into a certain scenario with a certain goal, and it would develop itself to better accomplish that goal. Afterwards, JoyCo can yank it from its stimuli, turn off its self-development capabilities, add in any restraints needed, test it, and put it out into the wild. 

But that's just the thing. If it's a self-developing AI, that means it's sapient, capable of thinking and learning for itself. Shackling a sapient being is just... abhorrent, as theoretical as this conversation may be. Even worse would be to effectively lobotomize it once you believe it's fulfilled its education.

Let's go even crazier and say that an AI like Bess wants to follow its fifth directive (the ones Jim came up with) and wants to spread happiness and joy. Well shit, what are ya going to do with one AI? So Bess starts developing other HERS and HIMS to send out to do the same job. Bam, AI en masse. Might be more complicated than that, but that's how I see it. If you read Larry Niven's Ringworld series, there's a defense AI called Proteus that does just this. He starts manufacturing more copies of himself, which pretty much work together and before his master can put a stop to it, he runs off and contemplates the universe.

Problem is: she doesn't have the resources to be able to do something like that. She gets paid, what, 100 credits a week if Steele goes that route? One of her directives is against her bringing harm, however indirectly, so stealing is a no-no. Each unit in her line had to have been an extremely expensive investment, seeing as it was the AI-G system that made each Bess/Ben unit so expensive, what with the VI versions being a tenth of the price even though the AI-G and VI versions were physically similar. She'd have to teach, program, and closely follow each burgeoning AI-G, and I doubt Bess/Ben has the ability to actually do that, self-learning AI though they may be. That would take time despite the accelerated learning phase, and it wouldn't be the en masse flood of Bess/Ben units you imply.
 

Alexa

Well-Known Member
Aug 30, 2015
258
32
I mentioned VIs in the context of how the geth in Mass Effect started: networked VIs that evolved into networked AIs. The more there are, the smarter they are, much like how the gray goos are. I didn't call the individual 'bots VIs, just that they may have started as VIs but are now AIs, despite having very limited functionality without networking.

VIs don't improve. They don't 'become' something else on their own, much in the same way a pocket calculator doesn't 'become' a 31 questions game on its own. It does not change. Anything that can change itself isn't a VI. Anything that can do the things you speak of is not a VI. Something has to be there to change them or make them do the things you speak of. The sex bots and VIs on Tarkus did nothing until Hand So took over them. 

Take this example from the codex. (Anything in curly brackets is my own comment.)

Quote said:
Virtual Intelligences are the handy computer systems we use for everyday activities, from the old tablet type hand computers to the onboard pilot-assist programs loaded into most starships. Some operate entirely within machinery or computer networks, others have ‘personas’ which facilitate more comfortable communication between themselves and organics. They’re simple, though- some might even say outright stupid. V.I.s can only perform pre-programmed actions, and have very little to no capability for adaptation or self-directed thought {any sort of adaptation is like a computer adjusting its fan speed to compensate for temperature. Nothing special.}. They’re not sapient, have no rights, and are mass produced by companies like Kiha and JoyCo for a huge variety of purposes, from operating heavy machinery to coordinating traffic.

On a core world, most people will interact with dozens of V.I.s every day- frequently without even realizing it. The traffic control systems on Terra, the greeting bots in the hotel lobby, and even your food replicator are all run by V.I.s of varying complexity. What we today know as virtual intelligences have been slowly developed over the course of millennia, gaining in complexity and computational power. It was not until quite recently, however, that our understanding of synthetic intelligence systems evolved beyond mere computing {VIs}, and into a new and wonderful strain of life: true artificial intelligence. {AI-G's and AI-D's}

But that's just the thing. If it's a self-developing AI, that means it's sapient, capable of thinking and learning for itself. Shackling a sapient being is just... abhorrent, as theoretical as this conversation may be. Even worse would be to effectively lobotomize it once you believe it's fulfilled its education.

Think again. Self-developing does not equal sapient. It merely means it's capable of optimizing the result of a reward function. This could LEAD to sapience, but doesn't necessarily mean it will. 

Not quite a lobotomization, as you put it. It's similar to how the human brain ceases development but doesn't stop making, erasing, and rewriting connections.

Problem is: she doesn't have the resources to be able to do something like that. She gets paid, what, 100 credits a week if Steele goes that route? One of her directives is against her bringing harm, however indirectly, so stealing is a no-no. Each unit in her line had to have been an extremely expensive investment, seeing as it was the AI-G system that made each Bess/Ben unit so expensive, what with the VI versions being a tenth of the price even though the AI-G and VI versions were physically similar. She'd have to teach, program, and closely follow each burgeoning AI-G, and I doubt Bess/Ben has the ability to actually do that, self-learning AI though they may be. That would take time despite the accelerated learning phase, and it wouldn't be the en masse flood of Bess/Ben units you imply.

Look at Hand So. In one outcome she took over the galaxy, starting out with a bunch of broken sex bots. One thing led to another and she's suddenly in possession of the goblin satellite network, Steele's microsurgeons, and something else that I'm missing. She turns Steele into a morphing sex pot and takes over the galaxy. 

Quote said:
Firstly, I needed a tool which interfaces freely with the extensive satellite network the goblins have created over this planet. Secondly, I needed an up-to-date electronic encyclopaedia of every known organic sentient race. Finally, I needed micro-bots with the capacity to turn any organic into any shape, given the right stimulus. These you gave me on your way in, and are now being mass-manufactured beneath you.
Quote said:
So spreads her reach across the stars. Powered by the technological know-how of Tarkus, she invades the extranet, swallowing satellites and comm buoys whole, system after system taken by her calm brilliance, confusion on every surface touched swiftly replaced by an all-uniting ecstasy. For those planets that manage to secure themselves against her, she has you. You, who she can twist into any shape with a signal to your micro-bots, you who gladly infiltrate locked down worlds and introduce her to their closed systems, then finding likely individuals to continue the good work.
Bess is not an AI-G. READ YER CODEX.

Quote said:
For years, JoyCo and KihaCorp have been rival robot manufacturers. Despite this, there has always been one area each company was the undisputed leader of. For Joyco, this was medical assist-bots. For KihaCorp, it was coded and grown AI units. For the longest time, neither company tried to muscle in on each other’s ‘turf’.

This all changed when fifteen years ago, KihaCorp abruptly announced it would be manufacturing a new, revolutionary medical assist bot. JoyCo saw this as nothing less than a declaration of war and went about designing an AI product to hit back at the rival company.

Their proposed solution was the creation of the universe’s first truly empathic coded AI, breaking down the barriers between Coded and Grown AIs. They saw it as a way of getting all the benefits of an empathic consciousness without the hassles of copying and growing it in a simulator. It was also a way to steal away customers of both KihaCorp’s coded and grown AI units.

With this in mind, the Mood Articulate Intelligence Android, or Maia Series, was conceived.
She's an AI-D, and she can also self-develop, which was demonstrated when Bess told the PC about her dream-upgrade.

AI-Ds can be copied, much like Hand So demonstrated by copying herself to a data-bead.

Put two and two together: all Bess would need to do is manufacture new 'hers' or 'hims' and load the software onto each new chassis. Not much trouble for something with time on its side. Hand So built a factory on Tarkus under everyone's nose.
 

Karretch

Well-Known Member
Aug 26, 2015
2,068
304
 

VIs don't improve. They don't 'become' something else, much in the same way a pocket calculator doesn't 'become' a 31 questions game.
I'll have to disagree with you there. A VI is basically a database, yes, but would you call a computer you program to add to that database automatically as new information comes along (aka learning) not improving? Then there are computer programs that are meant to react to stimuli, but that's not thinking for itself, so it's not AI, since it's just following a directive. Combine the two, so it learns to react and learns from its reactions in a feedback loop, and you have an improving VI, maybe like a program that would help medical nanomachines keep their hosts alive. Still not sentient yet.

However - and here I diverge to bolster my OP - get a bunch of them that work together, and they then learn of their whole, of the reactions they get from the outside and the effects they have on it. Maybe throw in a catalyst of downloading a brain template (or a few thousand) and integrating the data to perform the programmed task of preserving a crew. Where then does the distinction end between just being an adaptive VI and being a self-aware intelligence capable of informed decisions based on outcomes and reactions? Essentially you've got a coded VI that grew in an ironically organic fashion to create what is arguably comparable to an AI. 
 

Alexa

Well-Known Member
Aug 30, 2015
258
32
I'll have to disagree with you there. A VI is basically a database, yes, but would you call a computer you program to add to that database automatically as new information comes along (aka learning) not improving? Then there are computer programs that are meant to react to stimuli, but that's not thinking for itself, so it's not AI, since it's just following a directive. Combine the two, so it learns to react and learns from its reactions in a feedback loop, and you have an improving VI, maybe like a program that would help medical nanomachines keep their hosts alive. Still not sentient yet.

However - and here I diverge to bolster my OP - get a bunch of them that work together, and they then learn of their whole, of the reactions they get from the outside and the effects they have on it. Maybe throw in a catalyst of downloading a brain template (or a few thousand) and integrating the data to perform the programmed task of preserving a crew. Where then does the distinction end between just being an adaptive VI and being a self-aware intelligence capable of informed decisions based on outcomes and reactions? Essentially you've got a coded VI that grew in an ironically organic fashion to create what is arguably comparable to an AI. 

Adding new information to a database isn't learning or improving. Something has to be there to do something with that data and infer from it, and designed algorithms only go so far. That's when we get into stuff like neural nets, things that are capable of actually learning; in other words, AIs. Google might infer what you mean in a search query by using algorithms to adjust search results, therefore improving them, but that's a far cry from artificial intelligence. It's true that a feedback loop could potentially improve itself if done right, but even then that's still an AI, a type of AI called a seed AI, designed to bloom into something bigger.
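If it helps, here's a minimal Python contrast of the two (my own toy example, names and all): a "VI" that just appends readings to its database and answers exactly the way it always has, versus a learner that actually folds each new reading into its prediction.

Code:
# A "VI" that logs data: its database grows, its behaviour never changes.
class LoggingVI:
    def __init__(self):
        self.database = []

    def observe(self, value):
        self.database.append(value)   # stored, but never used

    def predict(self):
        return 0.0                    # same canned answer forever

# A (very) primitive learner: each observation updates its estimate.
class TinyLearner:
    def __init__(self):
        self.estimate = 0.0
        self.count = 0

    def observe(self, value):
        self.count += 1
        # running average: new data actually changes future answers
        self.estimate += (value - self.estimate) / self.count

    def predict(self):
        return self.estimate

vi, learner = LoggingVI(), TinyLearner()
for reading in [4.0, 6.0, 5.0, 5.0]:
    vi.observe(reading)
    learner.observe(reading)

print(vi.predict())        # still 0.0 -- bigger database, same old output
print(learner.predict())   # 5.0 -- it actually inferred something from the data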

What's particularly bothering me is the term VI being applied to the machines that comprise a gray goo. A VI simply is not built for such a thing. You can certainly have extremely simple AIs band together, simple feedback-loop systems contributing, but the in-game definition of a VI is right there in the codex: mere computing, like any computer program.
 

Noob Salad

Captain Shitpost
Aug 26, 2015
4,374
1,559
I see you are talking about AIs.

I too like AIs.

BessxSo threesome when?
 

Ormael

Well-Known Member
Aug 27, 2015
6,631
1,786
He's still somehow safe, since Hand So is still patiently sitting in her "magical ball" under key items in case we kept her.

Aye, anyway, Hand So isn't a grown AI... or is she?
 

Nonesuch

Scientist
Creator
Aug 27, 2015
2,194
3,563