Cathedral Haunt on 15/8/2023 at 17:44
For NPCs a technology that could work is the one character.ai uses, which can mimic different personalities and roleplay/hallucinate in a more believable way.
It can sustain a scenario or made-up story for longer, so it would be more appropriate for different personalities and NPCs in a city, such as teenagers, office people, students, policemen, etc. At least interactions of a few minutes with NPCs would seem more authentic.
If every NPC appears to have the same depressive disorder and can't even make up an answer for a simple question such as "where are you from?", it gets boring in a matter of seconds.
Starker on 15/8/2023 at 20:55
The dilemma is which NPCs this can be used for in the first place -- on one hand, there is no demand for in-depth dialogue from cookie-cutter background characters that most people are just going to pass by, but people do expect well-written interesting dialogue from the NPCs that they are going to be conversing with.
Cathedral Haunt on 15/8/2023 at 22:00
I agree. In games like Thief, NPC dialogue is not that important, although a more casual conversation with Basso or Jenivere could be interesting, the immersion might end up breaking if the player abuses it.
I guess that the technology could be interesting in games where some NPCs can give valuable information, like adventure/detective games, or games where building relationships with NPCs is important, like some kind of Sims game.
Also, another obvious one would be an adult game with a whole city to interact with (just saying this for a friend).
I think there are many possibilities, but only time will tell how this ends up being used in games.
demagogue on 15/8/2023 at 22:04
If this tech really develops, they're not going to be just games anymore ... because clever people may figure out how to have NPCs improvise better and, since gameplay can finally be more language based (if the tech gets to that level), they may become more like interactive movies or LARPing than just games.
Starker on 15/8/2023 at 23:03
We'll see if we ever actually get to that point. But my gut feeling is that we're not going to get there with statistical models, as they can only ever be derivatives of the real stuff.
Briareos H on 16/8/2023 at 08:00
I'm pretty convinced you can get paradigm-changing experiences with models trained on a mix of general-purpose, curated and custom language with prompts cleverly generated by the game systems to lock the personality and behaviour of the NPCs. At least I really hope so, because to me that's one of the rare good use cases for the tech.
Cipheron on 16/8/2023 at 09:07
Quote Posted by Starker
We'll see if we ever actually get to that point. But my gut feeling is that we're not going to get there with statistical models, as they can only ever be derivatives of the real stuff.
I think that's being a little too reductive. Because language models work by extrapolation, and you can definitely overlay different patterns together in novel ways.
One example from this thread is when I got it to write a product pitch for "Maggoty Meats" which is just an unfortunate name for a company and it wrote an advert that was trying WAY too hard to convince the reader that there weren't actual maggots in the product. Can you call that just a linear extrapolation of real stuff?
Or, what I'd argue is that overlaying enough real context of various types is enough for it to construct completely new scenarios then apply realistic dialogue for that situation, despite it being a situation that's never actually occurred in any fiction it read. For an example of that, there was the "Day of the Hitlers" one I made which is basically a zombie film but with Mein Kampf quoting Hitler clones climbing through windows to get the heroes.
Anything you can think up, it'll give it a good shot, better than
most humans if you asked them to write a short story about some weird topic. So at some level, it's deriving those from real people's work, but it's also combining everything at once.
---
Also we have to keep in mind that this demo that looks like it's running an offline model that's small enough to run on your home PC's GPU. It's not running ChatGPT or anything. You can ask ChatGPT to roleplay as a New Yorker and it'll totally generate all the backstory for the character on the fly.
So it's a tech demo. They're probably not even pushing what their current model is capable of. It's pretty normal that someone builds an engine, shows a demo, but it's up to other teams to really push what the engine can do to the limits. That's probably what will happen with projects like this.
mxleader on 16/8/2023 at 15:38
Slightly off the current thread direction but I broke ChatGTP this morning when I gave it loose parameters for a job interview role play. I didn't specify that I wanted a back and forth role play with me as the interviewee and the AI as a hiring manager. When I pushed go it started having a back and forth with itself. The AI spit out Q's and A's until it stopped working. After stopping it and adding more details it functioned much better. It's really good to sit and role play for interviews before chatting with a human for practice or even for an actual interview. I think I need to add a parameter to have the AI ask more unconventional questions to keep sharp. Not sure if it can do that though.
Starker on 18/8/2023 at 06:12
Quote Posted by Cipheron
I think that's being a little too reductive. Because language models work by extrapolation, and you can definitely overlay different patterns together in novel ways.
One example from this thread is when I got it to write a product pitch for "Maggoty Meats" which is just an unfortunate name for a company and it wrote an advert that was trying WAY too hard to convince the reader that there weren't actual maggots in the product. Can you call that just a linear extrapolation of real stuff?
If the argument is that infinite monkeys with infinite typewriters will inevitably produce a masterpiece, I have to say that it's technically possible, but good writing is more involved than stumbling on a funny combination of things. It also needs execution and, because humour is so dependent on context, it really needs that human intentionality. The idea of "Maggoty Meats" is kind of amusing, but the execution? I don't think you could call it amazing joke-writing by any stretch
Quote Posted by Cipheron
Anything you can think up, it'll give it a good shot, better than most humans if you asked them to write a short story about some weird topic. So at some level, it's deriving those from real people's work, but it's also combining everything at once.
Perhaps there aren't a lot of people who would be able to write to the level of ChatGPT, but that's why we have professional writers and why it's an actual profession that people have to spend many years to get good at. And even these people aren't able to write well consistently -- a lot of it ends up in the wastebin or stays in the drawer.
Not to mention that the argument here works against deploying ChatGPT to write NPC dialogue -- the average user wouldn't be able to coax some novel writing out of it and not even a r/iamverysmart user wouldn't be able to do that consistently.
Cipheron on 18/8/2023 at 08:25
Quote Posted by Starker
Not to mention that the argument here works against deploying ChatGPT to write NPC dialogue -- the average user wouldn't be able to coax some novel writing out of it and not even a r/iamverysmart user wouldn't be able to do that consistently.
It's not up to the player to craft clever prompts. The model gets hidden prompts written by the designer. So the "average user" isn't the one doing the work, it's the game designer which gives the behind the scenes set-up.