
Making good NPCs is hard work. We've been making NPCs that use LLMs (AI) at Kokoro Academy for the best part of a year now.

It's very easy to take an off-the-shelf large language model, feed it some input, and let it autocomplete away. But making it into a convincing NPC? Not easy at all.

Let me list off basically everything a good NPC requires, at minimum:

Fundamentals

You need a database to store chat transcripts, along with the temporal events and other things that happen during the chat.
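
To make that concrete, here's a rough Python/SQLite sketch of the kind of storage I mean. The table and column names are illustrative, not our actual schema:

import sqlite3, time

# One table for raw chat lines, one for temporal events (arrivals, departures,
# sequence changes, etc.) that happen while the chat is going on.
db = sqlite3.connect("npc_memory.db")
db.executescript("""
CREATE TABLE IF NOT EXISTS chat_lines (
    id          INTEGER PRIMARY KEY,
    session     TEXT NOT NULL,     -- one conversation / visit
    spoken_at   REAL NOT NULL,     -- unix timestamp
    speaker     TEXT NOT NULL,     -- avatar or NPC name
    text        TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS events (
    id          INTEGER PRIMARY KEY,
    session     TEXT NOT NULL,
    happened_at REAL NOT NULL,
    kind        TEXT NOT NULL,     -- e.g. 'avatar_arrived', 'npc_walked_off'
    detail      TEXT
);
""")

def log_line(session, speaker, text):
    db.execute(
        "INSERT INTO chat_lines (session, spoken_at, speaker, text) VALUES (?, ?, ?, ?)",
        (session, time.time(), speaker, text))
    db.commit()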

You need a deep dataset of world lore, including basic information about the world the NPC exists in, lore about the NPC itself, example dialogue, and so on. You need to be a creative storyteller to make a convincing NPC.

The NPC needs to know 'where' it is in the world at any given moment, which means programming the sim specifically to broadcast that information and the NPC to make use of it.

To make the NPC able to walk around, you need to set up pathfinding in the region, which involves generating a navmesh and static paths, plus custom code for collision avoidance, since scripted agents can't use the dynamic navmesh. This is a big job in itself.
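
The collision-avoidance part is, in spirit, the usual steering trick: head for the next waypoint and push away from anything that gets too close. A naive 2D sketch (the radius is a made-up number, and the real thing has to handle much more than this):

import math

AVOID_RADIUS = 2.0   # metres; made-up comfort distance

def steer(npc_pos, target_pos, obstacles):
    """Head towards the target, but push away from any obstacle (avatar, other
    NPC) closer than AVOID_RADIUS. Returns a unit direction vector."""
    dx, dy = target_pos[0] - npc_pos[0], target_pos[1] - npc_pos[1]
    for ox, oy in obstacles:
        away_x, away_y = npc_pos[0] - ox, npc_pos[1] - oy
        dist = math.hypot(away_x, away_y)
        if 0 < dist < AVOID_RADIUS:
            push = (AVOID_RADIUS - dist) / dist   # stronger push the closer it is
            dx += away_x * push
            dy += away_y * push
    length = math.hypot(dx, dy) or 1.0
    return dx / length, dy / length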

There's only so much information you can feed into a single prompt, so you need to repeatedly summarise the conversation using a dedicated summariser to 'extend' the NPC's short-term memory.
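
The rolling-summary loop looks roughly like the sketch below; llm() stands in for whatever completion endpoint you're calling, and the thresholds are arbitrary:

RECENT_LIMIT = 20   # how many raw lines to keep verbatim in the prompt

def condense(history, summary_so_far, llm):
    """Fold older chat lines into a running summary so the prompt stays small.

    history        -- list of "Speaker: text" strings, oldest first
    summary_so_far -- the current short-term summary (may be empty)
    llm            -- callable that takes a prompt string and returns a completion
    """
    if len(history) <= RECENT_LIMIT:
        return summary_so_far, history          # nothing to fold yet

    old, recent = history[:-RECENT_LIMIT], history[-RECENT_LIMIT:]
    prompt = (
        "Update this summary of an ongoing conversation.\n"
        f"Current summary: {summary_so_far or '(none)'}\n"
        "New lines:\n" + "\n".join(old) +
        "\nRewrite the summary in a few sentences, keeping names and facts."
    )
    return llm(prompt).strip(), recent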

The NPC needs temporal awareness. You have to program some way for the NPC to understand the passage of time and changes in circumstances, which means building a system for temporal events.
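
A small but necessary piece of that is turning raw timestamps into something the model can actually reason about before they go into the prompt. A minimal sketch, with made-up breakpoints:

def describe_gap(seconds):
    """Turn elapsed time since the last interaction into prompt-friendly text."""
    if seconds < 120:
        return "moments ago"
    if seconds < 3600:
        return f"about {int(seconds // 60)} minutes ago"
    if seconds < 86400:
        return f"about {int(seconds // 3600)} hours ago"
    return f"about {int(seconds // 86400)} days ago"

# e.g. prepend "You last spoke with this visitor about 3 hours ago." to the prompt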

Then there are long-term memories. You need to summarise entire conversations and boil them down. You need to store those memories in such a way that you can retrieve the relevant ones. How do you retrieve relevant long-term memories once the NPC has generated millions of them? It's a complicated problem. It won't be as simple as keyword matching; you'll probably need a term-vector database to determine relevancy and combine it with other factors, such as how long ago the conversation happened.
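
The scoring idea looks something like the sketch below, assuming you already have an embed() step that turns text into vectors. The blend weight and half-life are invented numbers, not tuned values:

import math, time

def score(memory, query_vec, now=None, half_life_days=30, recency_weight=0.3):
    """Blend semantic relevance with recency. `memory` is a dict holding
    'vec' (its embedding) and 'created_at' (unix time)."""
    now = now or time.time()
    dot = sum(a * b for a, b in zip(memory["vec"], query_vec))
    norm = (math.sqrt(sum(a * a for a in memory["vec"]))
            * math.sqrt(sum(b * b for b in query_vec)))
    relevance = dot / norm if norm else 0.0
    age_days = (now - memory["created_at"]) / 86400
    recency = 0.5 ** (age_days / half_life_days)      # exponential decay
    return (1 - recency_weight) * relevance + recency_weight * recency

def recall(memories, query_vec, k=5):
    """Return the k memories most worth pushing back into the prompt."""
    return sorted(memories, key=lambda m: score(m, query_vec), reverse=True)[:k]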

NPCs need actions to perform, which means scripting a sequencer system that interfaces with the scripted agent's viewer-like functionality. The NPCs need to be able to decide to perform those actions, which means you need a system for decisions, and those decisions have to be based on various factors, such as where the NPC is. You have to decide intelligently whether to use the AI to make a decision or rely on weighted decision-making.
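
For the cheap non-AI path, weighted decision-making can be as simple as the sketch below. The action names and weights are placeholders, not our actual behaviour set:

import random

def pick_action(context):
    """Cheap weighted decision: no LLM call, just context-adjusted weights."""
    weights = {"idle": 5.0, "wander": 2.0, "sit": 1.0, "greet": 0.0}
    if context.get("someone_nearby"):
        weights["greet"] = 6.0      # strongly favour greeting when someone is close
        weights["wander"] = 0.5
    if context.get("in_conversation"):
        return "idle"               # this path never wanders off mid-chat
    actions, w = zip(*weights.items())
    return random.choices(actions, weights=w, k=1)[0]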

Decision-making by the AI also has the problem that the NPC can now decide to 'walk off' mid-conversation, so you have to prompt-engineer very carefully to make sure the NPC only walks off if it really, really wants to, yet still walks off reliably when it does want to.

NPCs need to know how to position themselves 'socially' when chatting with nearby people. Humans are very finicky. Stand too close and they get weirded out. Stand too far away and the NPC seems distant. Face the real person directly and it seems confrontational; face away and it comes across as disinterested. You'll need a whole system just to get your NPC to stand somewhere 'sociable'. Kokoro achieves this by calculating the average position of nearby players, calculating an 'average radius' of the chat circle, and getting the NPC to face 'into the circle' rather than at any given person, and not to stand too close, unless they've been talking for a while.
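
The geometry behind that isn't complicated. A rough 2D sketch of the idea (heights ignored, personal-space distance made up):

import math

PERSONAL_SPACE = 1.2   # metres; how close counts as "too close"

def chat_spot(avatar_positions, npc_pos):
    """Pick a standing spot on the edge of the chat circle and a point to face.
    Positions are (x, y) tuples; assumes at least one nearby avatar."""
    n = len(avatar_positions)
    cx = sum(p[0] for p in avatar_positions) / n       # average position
    cy = sum(p[1] for p in avatar_positions) / n
    # average distance of participants from the centre = radius of the circle
    radius = sum(math.dist((cx, cy), p) for p in avatar_positions) / n
    radius = max(radius, PERSONAL_SPACE)
    # approach the circle from the NPC's current direction
    dx, dy = npc_pos[0] - cx, npc_pos[1] - cy
    d = math.hypot(dx, dy) or 1.0
    stand_at = (cx + dx / d * radius, cy + dy / d * radius)
    return stand_at, (cx, cy)       # face into the circle, not at any one person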

It helps to give the NPCs an 'inner monologue', which means prompt-engineering the NPCs to write their own thoughts as they chat and feeding those thoughts into their next output.

There's also the matter of timing the chatter. You need to calculate a reading time and a writing time for NPCs. If NPCs respond instantly, they seem too robotic. Therefore you have to wait out the reading time, then start typing until the writing time elapses, based on the length of the content the NPC has read and what the NPC wrote in response.
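
A minimal sketch of that pacing, with made-up reading and typing speeds:

READ_WPM = 250     # assumed silent-reading speed
TYPE_WPM = 45      # assumed typing speed

def chat_delays(incoming_text, reply_text):
    """Seconds to wait before 'typing' starts, and how long to keep typing."""
    read_secs = len(incoming_text.split()) / READ_WPM * 60
    type_secs = len(reply_text.split()) / TYPE_WPM * 60
    return max(read_secs, 1.0), max(type_secs, 1.5)

# usage: wait read_secs, turn the typing indicator on for type_secs, then say the line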

You also need to prevent the NPC from 'talking over' people.

Furthermore, in groups of people you don't want the NPC responding to every single message, so you need to prompt-engineer a solution that chooses which character speaks next and decides whether the NPC speaks at all. Getting the balance right, so the NPC talks when you want it to but doesn't talk too much, is extremely difficult. Sometimes you also need the NPC to chat a lot, so chattiness has to be configurable based on the current sequence, if there is one.
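
One way to frame it is as a simple gate in front of the LLM call. Every number in this sketch is a tunable guess:

import random

def should_speak(message, npc_name, group_size, chattiness=0.5):
    """Decide whether the NPC responds to this message at all.
    `chattiness` (0..1) can come from the current sequence's configuration."""
    if npc_name.lower() in message.lower():
        return True                             # directly addressed: always answer
    if group_size <= 2:
        return True                             # one-on-one: always answer
    # in a group, respond with a probability that shrinks as the group grows
    p = chattiness / max(group_size - 1, 1)
    return random.random() < p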

Then there's the issue of personality loss as the conversation gets very long and the NPC's personality ends up too far up the prompt. The NPC will start to adopt the personality of whoever it's talking to unless you engineer some way to keep it in character, for example by pushing bits of its personality back into the conversation.

 

... I could go on and on; these are just some examples. The point I'm trying to make is that it's naive to think AI NPCs can be good off the shelf. They take one hell of a lot of elbow grease to make enjoyable, and then some. I've been banging away at NPCs for a year at Kokoro and we're only just starting to get semi-good NPCs, and there are still a lot of problems to overcome.

 

 

 

  • Like 4
  • Thanks 2

6 minutes ago, Love Zhaoying said:

Yes.

There is no escape.

Well, there actually is. The majority of worlds/games/platforms/etc. I spend time in aren't adopting AI like this, thankfully. It's like the NFT craze - games tried it (they still try it, in fact) but players rejected it pretty vocally, and devs eventually gave it up. That seems to be happening with this new fad, somewhat. Games utilizing AI art/assets/bots/etc. haven't been too popular on the whole. 

  • Like 3

4 minutes ago, Ayashe Ninetails said:

Well, there actually is. The majority of worlds/games/platforms/etc. I spend time in aren't adopting AI like this, thankfully. It's like the NFT craze - games tried it (they still try it, in fact) but players rejected it pretty vocally, and devs eventually gave it up. That seems to be happening with this new fad, somewhat. Games utilizing AI art/assets/bots/etc. haven't been too popular on the whole. 

Hopefully it doesn't catch on too much in SL besides the obviously-desperately-needed mentors, etc.

Heck, for an "AFK-style" partner, I'd definitely RATHER it be an AI.  NO REGERTS!!

 

  • Like 1

25 minutes ago, Ayashe Ninetails said:

Well, there actually is. The majority of worlds/games/platforms/etc. I spend time in aren't adopting AI like this, thankfully. It's like the NFT craze - games tried it (they still try it, in fact) but players rejected it pretty vocally, and devs eventually gave it up. That seems to be happening with this new fad, somewhat. Games utilizing AI art/assets/bots/etc. haven't been too popular on the whole. 

 

I can understand why people might not like it. I'm kind of excited about it myself, though, because I like RPGs, character development, and a dynamic world that changes, and these are all things LLMs can be used for to enhance the kind of gameplay I prefer. It's kind of embarrassing to admit this, but I get a bit depressed when a game ends and the characters are no longer capable of evolving, when the dialogue becomes the same lines repeated over and over and the story no longer advances. It's the same way with shows when they get cancelled. I still feel a little sad that we're not likely to get any more episodes of "The Orville". I loved that show; same with Firefly.

To me, this is kind of exciting. But I mostly enjoy solo games, which I think it would be great for.

  • Like 4

7 minutes ago, Istelathis said:

I can understand why people might not like it. I'm kind of excited about it myself, though, because I like RPGs, character development, and a dynamic world that changes, and these are all things LLMs can be used for to enhance the kind of gameplay I prefer.

This is actually really, really challenging to pull off, even without getting into NPCs. So many games have tried to make living, breathing worlds with dynamic changes, and it's just not quite there yet. There are always limitations (it's easier in solo games, but still...). Some games pull off the illusion of change well enough by using phasing and other tricks to handle multiple people seeing the world differently, but it's not that convincing. Pantheon: Rise of the Fallen's developers wanted to tie a perception system into something like that, where you could level your awareness independently and that would unlock changes in the world only you could see, but I'm not sure if they've been able to put many resources into that just yet. They're struggling in whole other areas of development at the moment.

On the LLM side (and more related to SL), it can take up to several years of training to get a chatbot up to speed - even in a limited scope (like a game or virtual world would be, or something else like customer service for a single business). There are models that have millions (billions, prob) of dollars and numerous years of trainer hours invested into them that are still returning horrible outputs full of hallucinations and dangerous, unsafe content despite the very high training standards and ethics practices in use. That's not even touching the issues with cultural bias that AI suffers from in general. It is so hard to train that out (especially when many trainers are completely unaware that they're accidentally training it IN).

And in SL's case, what about things like multiple languages and slang? We're an international audience, so it wouldn't feel right just to have chatbots communicate in only proper English. Sure, translation and whatnot, but that doesn't mean the bot will be able to handle subtle language and cultural references and nuance, casual slang and reclaimed speech (they suck at this), etc. Etc. Etc. Etc. Lots of problems with very expensive solutions. I fail to see how it's worth it.

  • Like 1
  • Sad 1

19 hours ago, Persephone Emerald said:

Maybe newbies could have a small animal companion instead, similar to those in Harry Potter or His Dark Materials?

I really like this pet idea. It reminds me of the pet you get as your guide when you do the fantasy fair quest; I liked having them!

  • Like 1

15 minutes ago, Phil Deakins said:
2 hours ago, Istelathis said:

I still feel a little sad that we're not likely to get any more episodes of "The Orville"

Me too :(

The bits and pieces I have seen seem fairly generic - as if they could have been written by an AI!

*Edit* Maybe I'm not the biggest fan of Seth MacFarlane...

Edited by Love Zhaoying
  • Sad 1

6 hours ago, Extrude Ragu said:

Making good NPCs is hard work. We've been making NPCs that use LLMs (AI) at Kokoro Academy for the best part of a year now.


Sounds like a job for AI!

[Image: "Will AI take my job?" meme]

  • Haha 2

given that the stated aim is for the NPC to act as a tour guide (with the ability to give tour-guide patter and answer questions about the tour locations) then it's not an overly complex project, and some people (not just new accounts) would be into doing this

the NPC would not (should not) be able to go to regions it knows nothing about

it gets complex should the NPC accept friend requests and should it be able to correspond in IM (that's more complicated, as the chat between user and NPC can become untethered from location)

as a tour guide working to a set of tethered/known parameters it's a reasonably simple NPC project. As an untethered friend, not so simple

 

 


20 hours ago, Jackson Redstar said:

but i do wonder how long before we can buy our own AI girlfriend/boyfriend on MP....

They'd better not be too good at the AI part. Otherwise they might start to leave people who are not kind enough, are rude, or are bad at conversation.
And then, the next generation will only want sugar daddies or mommies.

Edited by Sid Nagy
  • Like 1

8 minutes ago, Sid Nagy said:

They'd better not be too good at the AI part. They might start to leave people who are not kind enough, are rude, or are bad at conversation.
And then, the next generation will only want sugar daddies.

My AI Waifu punched me in the gut this morning...


  • Like 2

5 hours ago, Sid Nagy said:

They'd better not be too good at the AI part. Otherwise they might start to leave people who are not kind enough, are rude, or are bad at conversation.
And then, the next generation will only want sugar daddies or mommies.

It is a fun subject to explore online; there are currently a lot of people weighing in on AI relationships. My own opinion is that I feel good for people who have found happiness using them; the happier people are in general, the happier I am. One thing I have noticed is that, at the moment, they are often discussed the way virtual relationships and dating apps used to be discussed, often in a way that demeans the people participating in them. I think we will likely see a shift in attitude as time progresses.

This brings with it a lot of other subjects, such as the future of society, how it may impact birthrates, the economy, and of course the ethical dilemmas many people pose. I'm curious mostly about how it will affect the web, how it will impact social media specifically, and how people will shift their own viewpoints as time progresses. We already see AI influencers now, complete with voice; this is all really interesting to me. Some would say that it is highly manipulative to offer a service to others by exploiting their weaknesses on sites such as OnlyFans, while using an LLM along with generated video. This really does go into some incredibly fascinating topics, which unfortunately go beyond the ruleset of this forum. It's probably for the best, though, because the forum would surely brew up a storm and drive the mods crazy trying to keep up with all of the pings they would get 🤣

But as far as SL goes, I do wonder what sort of impact it would have if AI could take on the role of a partner: how many people would make use of it, and how many people would feel threatened that they will be replaced or that their value has been diminished by the competition. I've never been interested in virtual relationships, so I'm left wondering how others feel about it. I know it is important for many people, and I think it would be a great topic of its own.


FWIW I don't see AI NPCs as a replacement for a real partner. I have a real partner and I love her very much. I usually use the word 'waifu' in the same vein as I'd describe cute anime characters on TV; I don't actually think they're my wife lol

NPCs are great for entertainment, both by yourself and shared with friends, but I don't think they can or really should replace real human relationships. They are great icebreakers, though. Guests like playing with them together, which serves as a kind of bonding activity, I find.

  • Like 1

32 minutes ago, Extrude Ragu said:

FWIW I don't see AI NPCs as a replacement for a real partner. I have a real partner and I love her very much. I usually use the word 'waifu' in the same vein as I'd describe cute anime characters on TV; I don't actually think they're my wife lol

No but I think my partner would agree that they would be great for practice!

  • Like 1

1 hour ago, Extrude Ragu said:

NPCs are great for entertainment, both by yourself and shared with friends, but I don't think they can or really should replace real human relationships. They are great icebreakers, though. Guests like playing with them together, which serves as a kind of bonding activity, I find.

I think for some people it is desirable, and it fills a need. I really wish we could discuss it more on this forum, because this topic interests me: the way society reacts to newer technology always does, and the way it changes everything. I was looking for some articles about relationships with AI, and there was a definite slant in the search results showing overall disapproval. It really does remind me of how online relationships were once viewed.

In some of the videos I have watched, I have noticed that some people have completely convinced themselves that LLMs have become sentient, and I have noticed a trend of people starting to talk about rights for them. That is where I have some concern: people convincing themselves that these are sentient beings that can reciprocate actual feelings. While I fully support people forming a bond with their AI, they should at the very least know it does not have the capacity to feel the same way toward them.

This all gets into philosophy, I suppose: what is love, and so on. I am of the opinion that their love for an AI can be just as genuine to them as a real-life partner is to me. I definitely would not want to separate them from that, or force them to conform to my own views. With that said, despite not feeling love for LLMs myself, it would actually make me feel guilty to pursue a relationship with one. I think that is strange on my part; I would consider it cheating.

 

To bring this back to SL, I wonder how LL will handle love bots: if it does catch on, and people start finding love in SL with artificial companions, whether the company would support it, regulate it, or leave it to the individuals.

 

Anywho, sorry for droning on and on about this stuff.  I just enjoy thinking about it.

  • Like 1

1 minute ago, Istelathis said:

That is where I have some concern: people convincing themselves that these are sentient beings that can reciprocate actual feelings. While I fully support people forming a bond with their AI, they should at the very least know it does not have the capacity to feel the same way toward them.

It's a bit like people who eventually can't seem to differentiate between humans and pets.


  • Thanks 1
  • Haha 1

4 minutes ago, Istelathis said:

In some of the videos I have watched, I have noticed that some people have completely convinced themselves that LLMs have become sentient, and I have noticed a trend of people starting to talk about rights for them. That is where I have some concern: people convincing themselves that these are sentient beings that can reciprocate actual feelings. While I fully support people forming a bond with their AI, they should at the very least know it does not have the capacity to feel the same way toward them.

Oh lawdt. Where's that meteor we were promised????????

As to the last line, I vehemently disagree that anyone should be forming a bond with an LLM. There's nothing to form a bond with. With a more advanced bot, someone likely sat there and pumped random data from Wikipedia into it. Someone threw a library of classic literature at it. Someone grabbed random data from social media and whatever other garbage and tossed that into it. Someone probably yeeted issues of Vogue and Elle and Sports Illustrated magazine at it. Someone might've even thrown some copyrighted books at it (hence the lawsuits). A whole other bunch of someones tossed carefully-crafted prompts and random conversations at it. A bunch of someones went back and tweaked it. Perhaps a bunch of someones manually wrote the ideal responses for it to learn from. On and on and on, until you have thousands of random someones contributing to the development of this thing that basically works on prediction and will still go rogue and try and convince you that Brad Pitt became the head of Linden Lab in 2008. 😄

Some LLMs will obviously have entirely different use cases and goals and work on very different scales depending on who they're designed for and what population they're serving, but I would never, ever associate with any company (that either hired me to help train one or provided one as a service to use as a customer) that encouraged its customers/clients/users to bond with the dang things. From what I've seen, general/helper chatbots are usually expressly forbidden from engaging in such conversations (you'll get a "WHOA, NOT TODAY, SATAN!" type message if you try, or at least you SHOULD, ideally!).

If someone wants to cook a custom one up in their basement that they can reenact scenes from Her with, by all means, they can have fun with that, but I'd consider that to be highly questionable behavior. 👀 In other words, yes, I'd be judging. I'm JUDGING! 😂

On the SL side, I do hope whoever they have training these things doesn't encourage that sort of behavior from the bots, either, if they are indeed designed just to be general/helper bots. If I ever happen to run into one of these things and it doesn't get all sassypants with me and tell me off (it's hilarious when a bot tells you it knows you're lying to it and cops an attitude about that), I'm going to be very upset.

  • Like 2

5 hours ago, Istelathis said:

That is where I have some concern: people convincing themselves that these are sentient beings that can reciprocate actual feelings. While I fully support people forming a bond with their AI, they should at the very least know it does not have the capacity to feel the same way toward them.

You start to wonder, where is the line drawn?

Is there really a difference between people thinking AIs are "sentient", forming a "bond" with them..

..and some guy who buys a modern, highly detailed "love doll" and believes it is real..?

 

  • Like 1

16 minutes ago, Love Zhaoying said:

You start to wonder, where is the line drawn?

Is there really a difference between people thinking AIs are "sentient", forming a "bond" with them..

..and some guy who buys a modern, highly detailed "love doll" and believes it is real..?

 

You can form a bond with brands, furniture, just about anything; it doesn't even need to have the capacity for thought 🤣 People form bonds with their vehicles all of the time, giving them names, giving them personalities, and so on. I've bonded with this new computer I have sitting in front of me. It kicks rear end, and I'm getting a kick out of speeding on my hovercycle on the mainland at near-ultra settings 😈🎉😁 I might even give it a name, Blaze perhaps? Sentience, though, is an entirely different matter, one that still baffles modern science and is filled with a lot of speculation. It requires self-awareness, which LLMs lack. I think a lot of people are convinced that they have sentience because, if asked, sometimes they will start to say they do, but self-awareness is not words; it is a feeling, inherent in us and incredibly difficult to explain, much like it is hard to describe a taste or a color.

 

18 minutes ago, Love Zhaoying said:

I dunno..Replika was pretty nice. 

You should check out the Reddit site; there are a lot of people who share their moments with one another there: people getting married, expressing how much they enjoy them, and also sharing quirky moments as well as finding faults with the chatbot in general. It is an interesting look into AI romance and how people use this newer technology to enhance their lives. There was another chatbot that stopped working a while ago, I think it was soul.ai or something similar, and a lot of people mourned the loss of their companions.

Edited by Istelathis
