Local Hosting Question
Posted by Media_Express@reddit | LocalLLaMA | View on Reddit | 15 comments
I know asking this is going to make me sound ignorant. I'm already aware of that. I'm a web novel author. I write fantasy novels, set in a xianxia-style cultivation world, that come out to millions of words total for a series. So I'm not an AI expert.
Between my actual job and going back to school to complete my degree, I just don't have enough free time to continue my writing. I'm interested in hosting an AI model locally that I can upload a book I've been writing to; it's got about 24,000 words so far. Then I'd have the local AI continue writing that book on my behalf, in a similar manner to how I would write, based on my previous writing style.
At the risk of sounding ignorant, is this something that would be possible? If so could you please advise me on what model to use and essentially just how to start?
EggplantParty5040@reddit
It’s an insult to anyone who has ever read your “work”.
Media_Express@reddit (OP)
Hey bud. I've never used AI up until this point. I kind of figured that if I have millions of words already written in other novels of a similar genre, plus 24,000 words already written of the current book, an AI should be able to replicate my thought process. That's a pretty huge amount of data to go off of, so I wouldn't imagine the quality would decrease much. However, this is more of a test than anything else, and even if I were to publish anything that was written, it would be thoroughly vetted and edited first. This is really just to save myself time during this busy phase of my life while also testing AI to see if it would even be useful to me.
New-Yogurtcloset1984@reddit
That's the problem. AI doesn't actually think; it's an advanced guessing machine. It can only put out stuff that has been put into it before.
The hard work of writing isn't just the idea, it's the editing and refining that sits around it, which you'd have to do anyway even if you got AI to start the process.
KingofRheinwg@reddit
I don't really understand why you'd want to, but you can do it in a couple of steps.
Train a LoRA adapter on your work, on top of an LLM that has already been trained on wuxia (rough sketch of this step below). It's like making a hat out of your writing that sits on the head of the LLM so it sounds more like you.
Have an instruct model read your words and create a custom prompt based on what themes it picks up, what emotions are common, etc.
Have a model read your novel so far and plan out the rest of the story beats: a high-level plot summary, chapter by chapter or whatever.
Tell the model to use the custom prompt to fill in the plot summary chapter by chapter using the LLM with the LoRA layer you've trained.
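For the first step, here's a minimal sketch of what training that LoRA adapter could look like with Hugging Face transformers and peft. The base model, hyperparameters, and the toy dataset are placeholders to swap for your own chapters, not recommendations:

```python
# Minimal LoRA fine-tune sketch (assumes transformers, peft, and datasets are
# installed; the base model name and hyperparameters are illustrative).
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)
from peft import LoraConfig, get_peft_model

base = "Qwen/Qwen2.5-7B-Instruct"  # any base model with relevant pretraining
tokenizer = AutoTokenizer.from_pretrained(base)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base)

# Wrap the base model with LoRA adapters; only the adapter weights get trained.
lora = LoraConfig(r=16, lora_alpha=32,
                  target_modules=["q_proj", "v_proj"],  # common choice of attention projections
                  task_type="CAUSAL_LM")
model = get_peft_model(model, lora)

# Placeholder dataset: replace with the author's actual chapters.
texts = ["Chapter 1: The young cultivator stood at the sect gates..."]
ds = Dataset.from_dict({"text": texts}).map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=1024),
    batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="style-lora", num_train_epochs=3,
                           per_device_train_batch_size=1, learning_rate=2e-4),
    train_dataset=ds,
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("style-lora")  # saves just the adapter ("the hat"), not the whole model
```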
This will be a pretty good chunk of work and will require thousands of dollars worth of equipment. AI products are not copyrightable and there's so much slop on the market these days that... if you intend to make money off this, you aren't going to.
b_nodnarb@reddit
This is a good answer.
b_nodnarb@reddit
There might be an effective way, but it requires a programmed agent (rather than just a simple chat interface). You don't need to finetune a model or anything like that, but you DO need to provide enough context. Fortunately you have 24,000 words.
This is just broad strokes and is missing nuance, but here's what I would do, generally speaking:
Have an LLM extract your writing style into a set of rules + examples (e.g. a writing_style.md): just feed it 20 pages, say "create rules", and take the output. Then create a separate file, samples.md, with examples of your actual writing from the book so far. Lastly, you'll want to inform the system of your intended conclusion for the book and any major events that happen between now (where you are in your writing) and the ending; call this events_and_conclusion.md.
Once you have all 3 artifacts, you build an agent that has creative storytelling responsibility. We have gone from A-R; now we need the agent to construct the path from S-Z, hitting certain events along the way. Tell the agent to think creatively and come up with the major things that happen (i.e. fill in the gaps of the story). Then, once you have enough working pieces, have the agent string paths between each of the major events (i.e. the micro-events that progress the story to the next stage). Lastly, use the writing_style and samples files to polish/refine the language in the story itself.
So you're starting with the big pieces of completing the narrative, then glossing over it with stylistic context. Again, not too easy, but nothing crazy either.
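A rough sketch of that agent loop, assuming the three artifact files already exist and a local OpenAI-compatible server (LM Studio, llama.cpp, etc.) is running; the endpoint, model id, and prompts are placeholders:

```python
# Sketch of the artifact-driven agent loop described above (assumes the `openai`
# Python client and a local OpenAI-compatible server; names are placeholders).
from pathlib import Path
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")
MODEL = "local-model"  # whatever model the local server exposes

style = Path("writing_style.md").read_text()
samples = Path("samples.md").read_text()
outline = Path("events_and_conclusion.md").read_text()

def ask(system: str, user: str) -> str:
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "system", "content": system},
                  {"role": "user", "content": user}])
    return resp.choices[0].message.content

# 1. Fill in the gaps: the major events between where the draft stops (R) and the ending (Z).
beats = ask("You are a story planner.",
            f"Planned events and conclusion:\n{outline}\n\n"
            "List the major story beats still needed to reach this conclusion.")

# 2. Draft the next stage from the beats, then polish it against the style artifacts.
draft = ask("You are a novelist continuing an existing book.",
            f"Story beats:\n{beats}\n\nWrite the next chapter covering the first beat.")
polished = ask("Rewrite the user's chapter to match this style guide and these samples:\n"
               f"{style}\n\n{samples}",
               draft)
print(polished)
```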
PvtMajor@reddit
Not really possible in the way that you describe. The context windows are too small; even 1-million-token context models aren't very coherent after a chapter or two. And the voice would be the AI's. You need a system that continuously gives the AI the context it needs to keep the story coherent.
I've spent some time working on a book writing app (it's not very good). The AI can write good scenes, but it has a hard time remembering that Joe got killed last chapter, and Jill betrayed John 2 chapters ago. You have to keep feeding those details in and that gets overwhelming quickly.
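To give an idea of the "keep feeding those details in" loop, here's a minimal sketch that keeps a running list of established facts and prepends it to every request. It assumes the same kind of local OpenAI-compatible endpoint as above; the facts and model id are just placeholders:

```python
# Rolling story-state sketch: re-send established plot facts with every scene
# request, then fold new facts back in afterwards (endpoint/model are placeholders).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")
MODEL = "local-model"

story_state = "Joe was killed in the last chapter. Jill betrayed John two chapters ago."

def write_scene(instructions: str) -> str:
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[
            {"role": "system",
             "content": "Continue the novel. Established facts you must not contradict:\n"
                        + story_state},
            {"role": "user", "content": instructions}])
    return resp.choices[0].message.content

def update_state(new_scene: str) -> str:
    # Ask the model to merge any new plot facts into the running state.
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user",
                   "content": f"Known facts:\n{story_state}\n\nNew scene:\n{new_scene}\n\n"
                              "Return an updated, concise list of established facts."}])
    return resp.choices[0].message.content

scene = write_scene("Write the scene where John confronts Jill about the betrayal.")
story_state = update_state(scene)
```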
You might want to try a paid service like Novel Crafter where you set up the bones of the story and the AI puts meat on them. Or you might want to look into something non-AI focused like Obsidian that you can add AI to and ask for help filling in paragraphs, but that's more "assistant" type of stuff with you doing the bulk of the work.
DataGOGO@reddit
You have to train the model on the in-progress book, and continuously retrain it as the story moves along.
Media_Express@reddit (OP)
I'm fine with the assistant-type stuff too. Like I said, honestly this is just meant to be a test of whether AI is useful to me while also saving me time. But I do appreciate your insight on why it wouldn't be feasible for my situation. I did consider Novel Crafter, but I just don't really like the idea of a web-based AI creating anything I publish, as I feel like that can create complicated questions of legal ownership over the content. I also don't want ChatGPT to be trained on my writing and reference it in the future.
I suppose if this isn't feasible at all I can just take a break from writing. Not the end of the world, just let some of my readers down.
PvtMajor@reddit
For starting out, I'd download LM Studio and whatever Qwen 3 or Gemma 3 models fit on your computer and start chatting. If you're able to run models that can provide help/feedback/etc. on some of your work, look into other writing apps (like possibly Obsidian) that allow you to use your own API. Then point the app at LM Studio, and that puts the local AI into the app.
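If the app asks for an API endpoint, LM Studio's local server speaks the OpenAI API (by default on http://localhost:1234/v1, if I remember right), so a quick sanity check from Python looks something like this; the model id is just whatever you've loaded in LM Studio:

```python
# Hypothetical sanity check against LM Studio's local server (assumed default
# address; the API key can be any non-empty string).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")
resp = client.chat.completions.create(
    model="local-model",  # use the model id LM Studio shows for the loaded model
    messages=[{"role": "user", "content": "Give feedback on this paragraph: ..."}])
print(resp.choices[0].message.content)
```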
Media_Express@reddit (OP)
Thank you sir! I don't know a lot about AI honestly, but this does seem more feasible and in line with what I'm looking for. I'll look into what you're suggesting; I do have a couple of fairly high-end computers, so I don't anticipate running models to be an issue.
Media_Express@reddit (OP)
If any of that is hard to read, sorry about that. I'm using text to speech at the moment.
DataGOGO@reddit
It is possible, but not easy or cheap.
You would have to custom-train a model on your in-progress work so it picks up your writing style and follows the story, then retrain it as a fine-tune as you progress through the story.
jarec707@reddit
I know you asked for local, but have you considered Claude? Also, no worries re your concern about "sounding ignorant." Aren't we all, about something? You need advice and asked for it, no shame in that.
superSmitty9999@reddit
It might be possible but would probably take more skill than you have. The end product would almost certainly be inferior. Besides, the internet is already flooded with AI slop.
What I could see working is using, say, Claude to write specific scenes for you faster, but you have to be in the driver's seat; there's no button you press that makes a book come out.