Anyone else using small “prompt modules” with local models? Here are a few I’ve been testing.
Posted by Professional-Rest138@reddit | LocalLLaMA | 1 comment
I’ve been playing with local models a lot recently (LLaMA, Mistral, Qwen, Hermes, etc.) and something interesting happened:
the thing that improved my workflow the most wasn’t a new model — it was building a few reusable prompt modules.
Not big chains.
Not agents.
Just small reusable blocks I paste in when I hit the same kind of task.
A few that have actually stuck:
1. Message Polisher
Great for turning a rough note or reply into something calm and clear.
2. Notes → Structured Summary
I paste raw bullets and get:
• decisions
• tasks
• next steps
• open questions
Local models handle this surprisingly well.
3. Idea Expander
One idea → a few directions: short, long, narrative, or more technical.
4. Template Starter
This saves me from the blank-page moment.
I give it 3–4 points and it creates a simple outline I can build on.
5. Weekly Layout
Feed it constraints + tasks → it produces a layout that’s actually reasonable.
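For anyone wondering how these "modules" actually live in my setup: they're literally just template strings with a slot for whatever I paste in. Here's a minimal sketch of that idea in Python — the module wording, the model name, and the local Ollama endpoint are just what my machine happens to look like, so swap in llama.cpp, LM Studio, or whatever runner you use.

```python
# Minimal sketch of the "prompt modules" idea: each module is a reusable
# template string with a {text} slot. Assumes a local Ollama server on the
# default port; adjust call_model() for whatever backend you actually run.
import requests

MODULES = {
    "polish": (
        "Rewrite the following message so it is calm, clear, and polite. "
        "Keep the meaning, drop the rambling.\n\n{text}"
    ),
    "summarize": (
        "Turn these raw notes into four sections: Decisions, Tasks, "
        "Next steps, Open questions.\n\n{text}"
    ),
}

def call_model(prompt: str, model: str = "mistral") -> str:
    # Example only: Ollama's default generate endpoint; replace with your runner's API.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

def run_module(name: str, text: str) -> str:
    # Fill the chosen module's template with the pasted text and send it off.
    return call_model(MODULES[name].format(text=text))

if __name__ == "__main__":
    print(run_module("polish", "hey so about that thing, idk, maybe push the deadline??"))
```

Nothing clever going on — that's the point. Plain templates survive model switches way better than long chains do.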
These tiny routines made my local setup much more comfortable to use day-to-day, especially when switching between models.
I’ve been collecting all of them in one spot so I don’t lose them.
If you want to peek at them, here’s where I keep everything (optional):
ChatGPT Automations
NNN_Throwaway2@reddit
Congratulations, you've discovered that prompting is how you get a model to do what you want.