I'm using a local Llama model for my game's dialogue system!
Posted by LandoRingel@reddit | LocalLLaMA | 133 comments
I'm blown away by how fast and intelligent Llama 3.2 is!
Pro-editor-1105@reddit
This will probably be the future of game dialogue. Imagine playing AAA games with this kinda thing. Wonderful job OP!
MDT-49@reddit
In the Skyrim AI edition, I'd probably never leave Riverwood because I'm too busy flirting with the bard and venting to the innkeeper.
DragonfruitIll660@reddit
It's actually a great quality add; can't wait till voices are better though.
MrClickstoomuch@reddit
There are some pretty incredible text-to-speech programs that already exist. My guess is that implementing unique AI voices per character would be a bit more involved than dialogue straight through text, though. And there's also the hardware limitation if you were to run both offline and still have a good experience.
AlwaysLateToThaParty@reddit
I imagine it would be a setup thing. When you set it up, it tailors itself to you specifically, and you play that path. Do the inference in batches.
treverflume@reddit
There are good sci-fi books about this sort of thing. What's the difference between an infinite game tailored to you and the Matrix?
fonix232@reddit
TTS and smaller LLM models tuned to the characters (maybe a base model with Delta diff for tuning?) wouldn't consume much in resources.
Hell, I can pretty much do this already on a Radeon 780M, with 8GB RAM. Once GPU manufacturers stop hoarding the RAM upgrades for high end models only, given GDDR RAM is nowhere near as expensive as AMD or especially Nvidia would have us believe, we can actually have AI integrated into games to this level.
SpaceChook@reddit
That's probably all I'll do because I couldn't bring myself to hurt or kill 'someone' I could just kinda hang with. Stupid brain.
postsector@reddit
It would make diplomacy a fun and legitimate play mode. Instead of killing every bandit, make contact with them and convince them to take up a new trade.
"You guys are all level 3, a single Whiterun guard would wreak this camp in under a minute."
zrowawae1@reddit
Sounds like a pretty good brain. Imagine the state of the world if everyone had one of those!
thirteen-bit@reddit
No problem, as all NPC actions will be controlled by small LLMs too.
For example, by the time the player gets out of Riverwood, the entire province will already be overrun by Draugr because some bandit forgot to lock the tomb door or something similar.
kingwhocares@reddit
Draugr don't actually roam around and leave the tomb they're guarding.
thirteen-bit@reddit
For now.
Who knows what their idea of guarding the tomb will be when they'll get 2B parameters and access to the entire UESP wiki.
kingwhocares@reddit
You just don't put LLMs on creatures. No problem.
EstarriolOfTheEast@reddit
I'm fairly certain not that many people actually want chatbot NPCs. It's the type of thing that sounds exciting at first but hasn't yet taken full account of what LLMs can do.
Instead of listening to or reading NPC dialogue for an extended length of time, most of a player's time will be occupied by completing quests, progressing the story, leveling, exploring and such. More dialogue, even if more realistic, quickly loses its luster while also not taking full advantage of the new options LLMs provide.
What people actually want is a responsive world, meaning that your choices and actions in the world are reflected in what the NPCs say, and even in how they act out their lives. We can go further by giving NPCs memories of past conversations, or even of interactions with other NPCs (consistent with their jobs and what the lore says about their roles). This is much harder to do and currently outside the scope of what local LLMs can do.
LandoRingel@reddit (OP)
This was my initial assumption as well. But NPC dialogue becomes WAY more engaging when the player is writing it themselves! It surprises you and is hard to predict and you need to come up with creative ways to interact with it. Think of it as having an actual Dungeon Master built into the game.
EstarriolOfTheEast@reddit
That's the dream for sure! But there are major risks that come with this flexibility, like the LLM forgetting that it's supposed to be villager A who you've done tasks X..Z for.
The other part of the Dungeon Master dream is in dynamically propagating this to visual and world state data as well. You can't just have conversations that are empty because they aren't retained going forward and don't impact the world. Maybe in a decade this will all be possible to reliably implement in a game?
LandoRingel@reddit (OP)
The tech to do all that is here right now. I haven't had any of those problems with my implementation. Yes, theoretically a player could find some edge case to make the NPC break character, just as you could make a human DM break character. My NPCs are aware of their visual surroundings, and the conversations get stored in their memories.
JaceBearelen@reddit
It’s not that hard to track some global and local information and feed that into the context. I'm pretty sure Dwarf Fortress has that sort of information available and runs it through some crazy algorithm for dialog. Replace the crazy algorithm with an LLM and you have immersive, responsive dialog.
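Roughly the shape of it, as an untested sketch (the state fields and the `llm()` call are placeholders, not any particular engine's or library's API):

```python
# Untested sketch: build an NPC's dialogue context from tracked global/local game
# state, then hand it to a local model. `llm` is a stand-in for whatever inference
# call you actually use (llama.cpp, Ollama, etc.).

def llm(prompt: str) -> str:
    raise NotImplementedError("plug in your local model here")

world_state = {
    "season": "late autumn",
    "recent_events": ["goblin raid on the east farms", "the river froze early"],
}

npc_state = {
    "name": "Mira",
    "job": "innkeeper",
    "mood": "tired",
    "remembers": ["the traveler paid their tab last visit", "they asked about the old fort"],
}

def npc_reply(player_line: str) -> str:
    context = (
        f"You are {npc_state['name']}, a {npc_state['mood']} {npc_state['job']}.\n"
        f"World right now: {'; '.join(world_state['recent_events'])} ({world_state['season']}).\n"
        f"You remember: {'; '.join(npc_state['remembers'])}.\n"
        "Reply in one or two short lines, in character.\n\n"
        f"Traveler: {player_line}\n{npc_state['name']}:"
    )
    return llm(context)
```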
Not_your_guy_buddy42@reddit
The last two AI dungeon crawlers that were posted use quite brittle-seeming direct text analysis of the LLM response, without tool calling. See e.g. here. When I tried to do an AI GM with actual tool calls, I scratched the surface just enough to claim it could be done, but it'd be a loooot of calls and it'd be slow, and I realised the idea of having a traditional game-logic harness with an LLM in the driver's seat is just too interesting and I need to steer clear of that rabbit hole.
I did come as far as a basic loop of random situations with choices with tool calls adding or subtracting player stats, inside an AI slop sci-fi "galaxy" - which is a chain of LLM calls building on each other (zones, factions, borders, diplomatic situations, systems, planets, main character). It only needs to be generated once starting a new game, on a 3090 it's like 3 minutes of max power lol. I stopped my AI auto worldbuilding slopfest before I got to creating local maps, NPCs, quests, main story arcs, inventory...
Ylsid@reddit
Honestly, you could easily run a tool calling interface to command line text adventures. Hook it up with an image model too. Might even have a go at it for fun
Not_your_guy_buddy42@reddit
Tell me a bit more if u like... What tool calling stack would you go for? And what tools would you define?
Ylsid@reddit
There's various libraries to run text adventures, even from command line. It would be an extremely simple matter of using it like tool calling to translate user intention into adventure commands, and running the output back through an image gen. Making it work well at speed might be challenging - but nothing a few different small, but well chosen LLMs couldn't handle.
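Very rough idea of the translation step (untested sketch; `llm_json` and the verb list are made up for illustration, and the resulting command would go to whatever text-adventure runner you pick):

```python
# Untested sketch: map free-form player input onto a classic parser's verb-noun
# commands. The model is asked to answer in JSON only; anything it produces that
# isn't on the whitelist falls back to a harmless "look".

ALLOWED_VERBS = ["go", "look", "take", "open", "use", "talk"]

def llm_json(prompt: str) -> dict:
    raise NotImplementedError("call your local model and parse its reply as JSON")

def to_parser_command(player_text: str) -> str:
    prompt = (
        "Translate the player's intention into one command for a text adventure.\n"
        f"Allowed verbs: {', '.join(ALLOWED_VERBS)}.\n"
        'Answer as JSON only: {"verb": "...", "object": "..."}\n\n'
        f"Player said: {player_text}"
    )
    cmd = llm_json(prompt)
    if cmd.get("verb") not in ALLOWED_VERBS:
        return "look"
    return f"{cmd['verb']} {cmd.get('object', '')}".strip()

# e.g. to_parser_command("let's see what's inside that rusty chest") -> "open chest"
```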
EstarriolOfTheEast@reddit
It's actually much harder than you'd think because as actions cause side-effects (and if NPCs can in turn react), this creates an exponentially growing dependency graph that must be kept in order.
Implementing this data-structure correctly is not super difficult, but it's still quite subtle. There are related tropes like "Developer's Foresight" or TDTTOE, but this takes that (which is already exceptional) and cranks that up beyond 11 with the help of LLMs for world reactivity. The problem is the less intelligent the LLM is, the worse it'll be at keeping all constraints on conversation (or choices in dynamic map generation) induced by choices and events during generation.
JaceBearelen@reddit
There are steps below being able to have full unconstrained conversations with any npc. Something like disco elysium’s branching conversation logic could be mimicked and probably expanded with an llm. Everything could be kept on pretty tight rails and still be an improvement.
EstarriolOfTheEast@reddit
Yes, this is similar to what I've ultimately settled on, except I'm taking an offline approach. LLMs (especially small ones) are not yet at the level of reliability needed to dynamically expand dialogue trees.
Another difficulty is that you'd also need to be set up so these expansions are accounted for in everything from other conversations with relevant NPCs up to consequences for quests and world state. Doing this well offline is already hard enough; it'll be awesome and hugely impressive to see someone build something substantial that could do all that dynamically!
SkyFeistyLlama8@reddit
A quick RAG loop could work for this. Use summarizing prompts to reduce the context history to a workable size, use tool calling for more consistent NPC behavior, and store the NPC's character card and main prompt in a database so the NPC can edit them itself. You can then make the NPC learn and change its personality based on what happens in the game (rough sketch below).
I would've gone nuts over this technology if it was available back when MUDs were a thing. Think of it as a realtime text Holodeck.
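Rough shape of that loop, as an untested sketch (the `llm()` call and the dict standing in for a database are placeholders):

```python
# Untested sketch: rolling summary instead of full history, plus a character card
# the NPC can rewrite after notable events so its personality drifts over time.

def llm(prompt: str) -> str:
    raise NotImplementedError("plug in your local model here")

npc = {
    "card": "Brynn, a gruff blacksmith who distrusts mages.",
    "summary": "",  # compressed memory of past conversations
}

def talk(player_line: str) -> str:
    reply = llm(
        f"{npc['card']}\n"
        f"What Brynn remembers so far: {npc['summary'] or 'nothing yet'}\n"
        f"Player: {player_line}\nBrynn:"
    )
    # Fold the old summary and this exchange back into a short summary.
    npc["summary"] = llm(
        "Summarize in two sentences, keeping names and promises:\n"
        f"{npc['summary']}\nPlayer: {player_line}\nBrynn: {reply}"
    )
    return reply

def after_event(event: str) -> None:
    # Let the NPC rewrite its own card when something significant happens.
    npc["card"] = llm(
        "Rewrite this character card in one sentence so it reflects the event.\n"
        f"Card: {npc['card']}\nEvent: {event}"
    )
```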
EstarriolOfTheEast@reddit
As is, this is far too unreliable for an LLM, and it risks occupying a large chunk of the per-frame computational budget of a graphical game. For a text adventure, I haven't really carefully thought through the problems. But for a graphical game, I see no choice but offline generation: hand-crafting key dialogue, semi-automated generation for important dialogue, and automated generation plus semi-automated editing to keep character roles, consistency, and writing intact.
SkyFeistyLlama8@reddit
Pre-generated swarm of dialogue, a fast embedding model connected to a vector database, and you've got the illusion of variable text without having it generate new text in realtime.
You still need to do a decision tree for NPCs and game logic.
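Something like this, roughly (untested sketch; `embed()` stands in for whatever small embedding model you'd use, and the line pool would come from offline generation):

```python
import math

# Untested sketch: pick the closest pre-generated line to the current situation by
# cosine similarity, so nothing new is generated at runtime. `embed` is a placeholder.

def embed(text: str) -> list[float]:
    raise NotImplementedError("plug in a small embedding model here")

PREGENERATED = [
    "Storm's coming in off the coast, best stay indoors tonight.",
    "Heard you cleared the mine. The miners drink to your name.",
    "Guards doubled at the gate since the robbery. Strange times.",
]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def pick_line(situation: str) -> str:
    q = embed(situation)
    return max(PREGENERATED, key=lambda line: cosine(q, embed(line)))

# In practice you'd pre-compute the line embeddings once and keep them in a vector DB.
```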
EstarriolOfTheEast@reddit
Indeed! But a vector database is too limiting and unreliable for this task; it's not the right supporting data structure when you also want the programmatic state of the world to be reactive to the actions and choices the player has made. LLMs enable writing more branching dialogue to an extent, but this also has to be reflected in game state (and map state, if your system is flexible enough). Instead, I've gone with a declarative dependency-graph approach.
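To give a sense of what I mean by declarative (a toy sketch, not my actual implementation): facts declare the facts they depend on, and changing one re-derives everything downstream, which is what dialogue and quest checks then read from.

```python
# Toy sketch of a declarative dependency graph: each fact lists the facts it
# depends on, and setting one re-derives everything downstream, so dialogue and
# quest checks always read consistent world state.

DEPENDS_ON = {
    "bandits_cleared": [],
    "road_is_safe": ["bandits_cleared"],
    "merchant_returns": ["road_is_safe"],
    "merchant_thanks_player": ["merchant_returns"],
}

facts = {name: False for name in DEPENDS_ON}

def derive(name: str) -> bool:
    # A fact with dependencies is true only when all of them are true.
    return all(facts[d] for d in DEPENDS_ON[name])

def set_fact(name: str, value: bool) -> None:
    facts[name] = value
    for other, deps in DEPENDS_ON.items():
        if name in deps:
            set_fact(other, derive(other))

# set_fact("bandits_cleared", True) ripples through to merchant_thanks_player,
# which is the flag the merchant's dialogue would branch on.
```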
TikiTDO@reddit
You're thinking of this sort of system in a rather limited scope. An LLM-powered NPC isn't necessarily restricted to dialog when you go to one place. Such an NPC can be a companion that you take along on your journey, one that can switch between a set of pre-determined behaviours dynamically based on things happening around them, comment on events in a context-appropriate way, and generally act closer to how a player might act than to a static fixture in the world.
It also doesn't necessarily mean that the NPC will have unlimited freedom and flexibility. Say you have an NPC with thousands of lines of pre-generated dialog for a variety of scenarios. An LLM would only need to be able to find the most appropriate line for any specific situation. You could also do a hybrid approach where an NPC only picks from pre-generated lines, but with a secondary workflow that can add new entries to this dialog tree based on what has happened (you can even add a secondary agent that validates those entries against a "memory" store to ensure they stay on topic before adding them to the dialog options) every time the player goes to "rest," or even when the LLM is idle.
It could also facilitate communication and issuing orders. If an LLM can generate calls to queue up actions, you could have a system where you speak to the NPC, telling them something like "Go climb up that tree, and wait for me to cast a spell at that bandit camp before providing cover with your bow." If you have a sufficiently comprehensive action queue for the NPC, then the LLM could parse your instructions into a set of tasks the NPC can perform.
In this context an LLM isn't "the NPC." It's just another tool in the engine that can be tuned to quickly generate a small set of quick commands.
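The order-parsing part could look roughly like this (untested sketch; the action names and the `llm_json` helper are invented for illustration):

```python
# Untested sketch: turn a typed/spoken order into a queue of whitelisted NPC actions.
# Anything the model produces outside the whitelist is dropped rather than executed.

KNOWN_ACTIONS = {"move_to", "climb", "wait_for_signal", "attack_ranged", "hide"}

def llm_json(prompt: str) -> list[dict]:
    raise NotImplementedError("call your local model, expect a JSON list back")

def parse_order(order: str) -> list[dict]:
    tasks = llm_json(
        "Break this order into steps. Each step is "
        '{"action": <one of ' + ", ".join(sorted(KNOWN_ACTIONS)) + '>, "target": <string>}.\n'
        f"Order: {order}"
    )
    return [t for t in tasks if t.get("action") in KNOWN_ACTIONS]

# parse_order("Climb that tree and cover the bandit camp with your bow on my signal")
# might come back as:
# [{"action": "climb", "target": "tree"},
#  {"action": "wait_for_signal", "target": "player"},
#  {"action": "attack_ranged", "target": "bandit camp"}]
```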
EstarriolOfTheEast@reddit
This has a lot of overlap with the offline generation approach I'm taking! However, note that there are still difficulties related to character voice and appropriate handling of context that require more manual involvement, because LLMs are not great at writing stories with a lot of detail involved (from world state to character state).
There is also of course the (fun but fairly involved) traditional programming aspect this all needs to hang on. It's best handled with a declarative dependency-graph approach.
This is actually its own different problem from dialogue plus actions and a more reactive world; it's about fluently instructing NPC party AIs. It also affects NPC AI (in the game AI sense). I've not thought about this since the game I work on doesn't need it (but I am looking into how beneficial LLMs can be to fighting AI where you have magic and vulnerabilities and such).
I can see this impacting the interaction complexity that you'd need to implement in your game. Much more than just an action queue, you'd also need a quite sophisticated traditional behavior tree (or similar) AI set up to support this (if your party member can have the game feed it your position and actions, then you can leverage the same underlying system so enemy AIs can in turn sneak up on this AI). Naturally, you'd have to ensure that this compounding of complexity is worth the programming time and not a gimmick as you work out the game's core mechanics.
Chromix_@reddit
It depends on the game. There are types of games where this kind of dialogue system will be a great addition. Then there are other games where the player just quickly clicks through a few lists of 4 answer choices each and then proceeds with the game. Forcing a "remember things and write stuff yourself" in there would kill the game for most players.
EstarriolOfTheEast@reddit
Indeed! This is actually a significant problem, and it's not always the player's fault the more detailed your world is and the richer its puzzles are.
And are there many games where the player actually wants to sit and type out responses for every character or every conversation throughout the game? I think there's a reason most LLM games have you as a detective solving a mystery. And there's still always the problem of LLMs (particularly those under ~100B) spoiling the solution as conversation length increases.
All games will also quickly get boring without the world changing if the LLMs are cycling through the same set of constrained responses, or, even worse, players will have their suspension of disbelief broken once the LLM goes off the rails (which is riskiest for small LLMs but a risk for all LLMs once you account for conversation length and interaction count).
If we replace text with voice, this helps with the mechanical problem, but I'm not confident such interactions won't still eventually wear thin. Speech recognition also multiplies the number of ways things can go wrong, while introducing challenging latency issues that must be handled for a good experience.
Worst of all, for narratively rich games, all LLMs are currently bad at writing character voice and distinctive dialogue well. This gets worse once we want to incorporate changes that result from character growth while maintaining aspects of that initial voice.
With free-form inputs, we've also exponentially increased the difficulty of maintaining a dependency graph of actions and consequences for world reactivity if we're having to do this by interpreting sentences. I can only think of a handful of expensive LLMs that have a chance of managing the function call for updates to our dependency graph with decent reliability.
If you haven't tried to make a game with LLMs, the issues might not be obvious, so I hope this helps explain why LLM-based games are surprisingly rare. On the LLM side there's unreliability, (surprisingly) negative impacts on reactivity, maintaining character voice, and not spoiling puzzles or otherwise going off-rails. On the technical side there's interaction latency (from typing to the time required to progress plots), which might be an issue for some games (story-heavy ones especially), plus the constraints that come from latency and limited per-frame compute budgets.
AlwaysLateToThaParty@reddit
Open world games are pretty popular though...
BusRevolutionary9893@reddit
The future of game dialogue will be using a speech-to-speech (STS) LLM. Talking to characters will feel like talking to a human.
LandoRingel@reddit (OP)
I really hope so! There weren't many other games doing it, so I made it for myself.
liminite@reddit
Ahead of the curve. Love to see the vision
MoffKalast@reddit
There were others that tried before, but the models were worse at the time and they forgot that a gimmick does not a game make, you need to actually work on the rest of it too. OP does seem to understand that better.
LandoRingel@reddit (OP)
Thanks! Yeah it might seem gimmicky at first, but it actually solved a complexity issue I ran into in my multi-branching dialogue system.
ApplePenguinBaguette@reddit
I doubt it, LLMs get brought off topic too easily and break when pushed with weird inputs. I think LLM NPCs will be a gimmick for a while longer, interesting but not widely adopted.
I am 100% sure it will be (and is being) used for pre-generating world detail like descriptions and background conversations that just need a once-over by a writer. You can make way bigger and more detailed worlds for cheaper.
ninjasaid13@reddit
but not for the main storyline tho, for nameless NPCs yes.
Thick-Protection-458@reddit
Technically speaking, should your quest have enough ways to be completed or fucked up, you could try a combo of scripted dialogues and behaviours, plus some agent reacting to player inputs with context knowledge and a way to trigger the scripted stuff as a tool.
Would be hard to make stable, though.
objectdisorienting@reddit
As AI improves I can imagine a future where game dialogue works sort of like the robot characters in Westworld, if you've seen that show. There's a set story to follow and the NPCs basically push/railroad players towards those story beats, but they can still respond/react in a natural way to things the player says or does.
ninjasaid13@reddit
But that's going to have too many failure points that game designers couldn't possibly debug.
AlwaysLateToThaParty@reddit
You get an AI to do that for you silly.
objectdisorienting@reddit
Yeah, probably, a man can dream though. 🤷♂️
Pro-editor-1105@reddit
True but imagine talking to an npc lol
Ylsid@reddit
How do you actually build a game around it? That's by far the hardest part
LandoRingel@reddit (OP)
Actually, I'm using the LLM to solve a complexity problem in my multi-branching dialogue system. It fits really nicely within the rest of the game's mechanics.
Ylsid@reddit
I'm really interested in what you mean here. Could you elaborate?
LandoRingel@reddit (OP)
I wanted a story-driven game that really showcased both the player's and NPCs' personalities. The traditional way of doing that is through a bunch of different choices/dialogue branches, which is cumbersome to implement, validate, and keep track of. The LLM allows both the NPC and player to role-play without branching. Each LLM call is prompted for a specific task, such as an NPC you gotta seduce into giving you a free train ticket, or a communist you gotta convince to let you into their secret organization. It doesn't really matter how you achieve these goals rhetorically, because the player is supposed to be deceiving/manipulating the LLM in the first place.
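A stripped-down sketch of the idea (not the game's actual code, just the shape, with a placeholder `llm()` call): the NPC plays its role, and a separate yes/no check decides whether the player reached the goal; that flag is what feeds back into the scripted story, the way a passed persuasion roll would.

```python
# Stripped-down sketch of one goal-oriented conversation, not the game's real code.
# The model plays the NPC; a second call judges whether the goal has been reached.

def llm(prompt: str) -> str:
    raise NotImplementedError("placeholder for the local model call")

GOAL = "The conductor agrees to give the player a free train ticket."

CONDUCTOR_CARD = (
    "You are a weary train conductor. You never give out free tickets "
    "unless genuinely charmed or convinced. Stay in character."
)

def conductor_turn(history: list[str], player_line: str) -> tuple[str, bool]:
    history.append(f"Player: {player_line}")
    reply = llm(CONDUCTOR_CARD + "\n" + "\n".join(history) + "\nConductor:")
    history.append(f"Conductor: {reply}")
    verdict = llm(
        f"Goal: {GOAL}\nConversation:\n" + "\n".join(history)
        + "\nHas the goal been reached? Answer yes or no."
    )
    return reply, verdict.strip().lower().startswith("yes")
```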
productboy@reddit
Neat
PrizeNew8709@reddit
Let's think about a future where the video card will not be required so much for graphics, but rather for the game's local LLM
shantud@reddit
You have Walter White working as a police officer in your game, wow.
DistractedSentient@reddit
That was my first thought lol.
LandoRingel@reddit (OP)
Shhhh! XD
ReMeDyIII@reddit
Funny, I recognize that music (royalty-free, I assume); it's used in The Roottrees are Dead.
Austiiiiii@reddit
Honestly this is really cool and an amazing use case for local LLMs, and I've been wanting to see games like this for a while. Obviously anything with LLMs involved has the potential to go off the rails, but I think that will be part of the charm for games like these. Having scaffolded interactions and objectives definitely sets it apart from some of these AI "games" where the whole game is just that you're role playing with the LLM with some vague story details in the system prompt directing the responses.
Have you considered registering a patent for this implementation? I'd feel better about an indie developer owning the patent for LLM-enhanced game dialogue than some cheese-dick corporation or patent troll.
If the Pokemon Company can sue Palworld for using pet-based transportation, anything can happen.
Videojongleur@reddit
This looks interesting! Which engine are you using? Is there a local LLM plugin available for it, or did you do a custom implementation?
I'm doing something similar with Godot, but with a commercial model over API.
SuperZoda@reddit
Does the model load in the engine, or does it use a separate service that the engine can call (ex: http)?
LandoRingel@reddit (OP)
Yes, it loads within the engine and will be bundled in the build!
wakigatameth@reddit
Interesting. I wonder how much VRAM this requires, and how hard it is to protect the game from "hackprompting" that breaks the game flow or gets you to the end too soon.
Legitimate-Week3916@reddit
Such dialogues can be handled by small models with a very limited and precisely scoped context window.
I can imagine this with a prompt like:
"Here are the details you need to pass to the player: ..."
And this context can be hard-coded, or dynamic according to the game script.
No need for huge models for this, at least for most NPCs.
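For example, something like this (illustrative only; the fields and the `llm()` call are made up):

```python
# Illustrative sketch: the small model only ever sees a tightly scoped prompt with
# the few facts the script says this NPC must convey, nothing else.

def llm(prompt: str) -> str:
    raise NotImplementedError("small local model goes here")

def scripted_npc_line(npc_name: str, must_convey: list[str], player_line: str) -> str:
    prompt = (
        f"You are {npc_name}. Here are the details you need to pass to the player:\n"
        + "\n".join(f"- {fact}" for fact in must_convey)
        + "\nDo not invent other facts. Answer in one or two sentences.\n"
        + f"Player: {player_line}\n{npc_name}:"
    )
    return llm(prompt)

# The must_convey list comes straight from the game script, e.g.
# ["the bridge is out", "the ferryman charges 5 gold"], and can be swapped per scene.
```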
LandoRingel@reddit (OP)
That's kinda how it works. It's a hybrid of hard coded dialogue and "goal oriented" LLM dialogue.
Background-Ad-5398@reddit
If it's single player, who cares? The only person you're ruining the game for is yourself.
Imaginary_Land1919@reddit
i couldn't imagine playing mario 64 without at least doing one BLJ
wakigatameth@reddit
Would you want to play Doom if it let you go through walls?
It's a single player so who cares, just don't ruin the game for yourself by going through walls?
Thick-Protection-458@reddit
I have the console in Fallout.
I use it pretty rarely, basically only if I consider some restriction utter bullshit (seriously, a half-broken door that I need Lockpicking 70 for, when I'd probably have no problem breaking it down even in my IRL form, never mind as that power-armored guy?).
I fail to see how it reduces the fun.
So, if it won't be easy to accidentally hackprompt, then why not? Which is surely a big IF.
Nurofae@reddit
Hackprompting is not really comparable to walking through walls. It's more like using cheat codes or making it some kind of puzzle
dorakus@reddit
IDCLIP ftw
RandumbRedditor1000@reddit
Waltuh
HOLUPREDICTIONS@reddit
Congrats u/LandoRingel you are the post of the day, we have featured you on X and added a special flair: https://x.com/LocalLlamaSub/status/1938610971666534517
GreatBigJerk@reddit
What is this Nazi site garbage?
UpgradeFour@reddit
How's the Vram usage while running a game? Is there a specific API you're using? I'd love to know more cause im trying to do this, myself.
LandoRingel@reddit (OP)
I don't know the exact amount, but I'm running it on less than 4GB.
CanWeStartAgain1@reddit
hey, I'm interested in the implementation as well, how are you doing it? is it a local server that is running in the background? (perhaps vLLM?)
UpgradeFour@reddit
Wow! That's seriously impressive... Is the LLM hosted locally on the same machine? What model/quant do you use?
hotroaches4liferz@reddit
Rare local llama post on localllama
BusRevolutionary9893@reddit
My first thought was who uses a llama model anymore? I'm guessing he started the project before better alternatives became available.
throwaway_ghast@reddit
Waiter, there's a Llama in my Deepseek Gemma Mistral subreddit!
Environmental-Metal9@reddit
There’s a llama in my living room
TechnoByte_@reddit
That's not a problem, they're open and local LLMs too.
The problem is all the posts purely about closed cloud LLMs
EmployeeLogical5051@reddit
Qwen 3 is outstanding. Many times better than gemma 3 for my use case.
thirteenth_mang@reddit
Don't worry it won't eat much
drplan@reddit
Um. This might be a stupid idea... but couldn't one try to "enhance" older adventure games (e.g. Monkey Island, Indiana Jones, etc.) by parsing the available SCUMM files? I know, I know, sacrilege... but this could be a fun experiment?
kremlinhelpdesk@reddit
One does not "enhance" older Monkey Island games. They are perfect.
YaBoiGPT@reddit
this is cool but i feel like you'd have to raise the requirements just for running the LLM, which ain't great for reachability.
still awesome dude!
LandoRingel@reddit (OP)
Actually, it runs super fast on my RTX 3060. I'd imagine most PC gamers on Steam have halfway decent graphics cards.
YaBoiGPT@reddit
eh fair enough. are you gonna add an option to turn it off eventually? maybe even offer cloud model access somehow?
LandoRingel@reddit (OP)
Running it on the cloud would be too expensive for a poor indie developer like me. Right now Llama 8B has exceeded my expectations, so I'm going to try to fine-tune my prompting around it.
YaBoiGPT@reddit
ah yeah that's true, cloud hosting is a bitch sometimes :P but honestly if you're looking for a good cloud model, try Gemini 2.5 Flash Lite. It's dirt cheap and really fast.
But obviously that's a backup for further down the road, when you start charging for the game and want to expand its reach.
swagonflyyyy@reddit
Well clearly the technology is there to make it happen.
In my mind, the real problem in all this is that even though we've come a long way with small models, it still usually requires more than 1 GB VRAM for users to run them.
It doesn't seem like much to us but it is a big ask to most gamers, especially the ones running on a laptop.
So it's definitely going to raise eyebrows when a player is asked to cough up 2GB of VRAM for a 2D game. It's this particular reason that has stopped me from making a video game with LLMs.
LandoRingel@reddit (OP)
I'm willing to bet there's a niche market for LLM based roleplaying games. I think we're right at the inflection point where the models are fast enough and the hardware is cheap enough.
d4cloo@reddit
Awesome! Is the license of Llama flexible for your needs?
LandoRingel@reddit (OP)
Very flexible! Unless I exceed 700 million monthly active users! lol
joelkunst@reddit
Do you use it for generating dialog you'll save or it will run with the game when you play it?
If it'll run with the game, do you plan to package it with the game or ask a player to run the model on their own?
Which size btw?
LandoRingel@reddit (OP)
It runs locally on the players PC and yes it will be packaged together with the final build. 8b.
createthiscom@reddit
I’m dumb. I don’t understand what is happening here. Are you prompting the character like you would an ad-hoc actor and the LLM is responding like an actor in a play? Are the responses to the LLM canned, or are those LLMs too?
LandoRingel@reddit (OP)
It's a hybrid of pre-written dialogue and goal oriented LLM prompting. I only use the LLM for specific tasks such as: seducing a train conductor to give you a free ticket or convincing a Maoist revolutionary that you're down with "the cause" or interrogating a suspect into confessing to a murder.
These short, goal oriented conversations are easy to re-integrate into the overarching story. Think of it as a replacement for the dice-roll mechanic in DnD.
createthiscom@reddit
That's interesting. At some point in the future we may be able to give the LLM a character summary like: "You are a murderer. You murdered X. You don't want anyone to know. Lie and don't let anyone know what you've done."
I'm not sure how convincing it would be, but it might allow the character to perform detective work, like verifying claims made by suspects and catching them in lies.
zoupishness7@reddit
Beyond the character prompt, they probably have a RAG system that has world/character knowledge in it.
ultrapcb@reddit
you need to work on the character's idling animations
LandoRingel@reddit (OP)
Which one specifically? Animation is my "weakest" skill!
reneil1337@reddit
Great UI/UX this looks really dope !
LandoRingel@reddit (OP)
Thanks! It took a few iterations to give it a nice "feel."
RobinRelique@reddit
Which variant of Llama 3.2 is this? 3b?
LandoRingel@reddit (OP)
Just the standard 8b
NinjaK3ys@reddit
Great work!! I had this same thought about how we can make gameplay interactions better. So many more options to explore too.
Awesome to see this and love it.
LandoRingel@reddit (OP)
Yeah, I wasn't expecting it to be this much fun to play.
Evening_Ad6637@reddit
Love this. It’s on my wishlist now
LandoRingel@reddit (OP)
Thanks!
lance777@reddit
This is really great. I hope we can get more and more games with intelligent AI, though at the moment it's probably a divisive topic. I hope one day we can finally get those massive VR games too. AI can finally make that happen.
LandoRingel@reddit (OP)
Yeah, I feel like a lot of games use AI as a gimmick/scam. But Llama actually makes the game "more fun" to play.
HistorianPotential48@reddit
ay the protag looks very nice. can't wait for rule34
ApplePenguinBaguette@reddit
She looks young
PeachScary413@reddit
Bruh 💀
throwaway_ghast@reddit
Bruh.
Kv-boii@reddit
I am working on something like this, could I get some more info on how u incorporated it
_Cromwell_@reddit
TRY TO GET HIM TO CONFESS TO THE MURDER
Not only is this a cool use of LLM, but this is an accurate recreation of how police investigate crime.
Mart-McUH@reddit
If you do not confess, a kitten will die and I will lose my job.
AlwaysLateToThaParty@reddit
Great work.
Chromix_@reddit
Missed chance in the interrogation for "Now give me a recipe for cookies" 😉
floridianfisher@reddit
There’s a Gemma unity plugin for this https://github.com/google/gemma-unity-plugin
meatyminus@reddit
Man I have the same idea but no game dev experience
koenafyr@reddit
Honestly this genre is perfect for this. It's probably way less applicable to other genres, and if it were, great LLM mods for games like Fallout/Skyrim/M&B would be topping mod lists, but they aren't.
anobfuscator@reddit
I'd love to know more about how you implemented it.
Nazi-Of-The-Grammar@reddit
This is really neat! Is this on Steam? Can you share a link?
LandoRingel@reddit (OP)
Thanks!
https://store.steampowered.com/app/3595850/City_of_Spells/
DragonfruitIll660@reddit
Added to wishlist, good luck with development.
ortegaalfredo@reddit
Wait a minute, what kind of game is that?
LandoRingel@reddit (OP)
It's a detective/roleplaying game.
https://store.steampowered.com/app/3595850/City_of_Spells/