Runpod, Hugging Face, or what for super-simple uncensored LLM-in-the-cloud setup?
Posted by goldenapple212@reddit | LocalLLaMA | View on Reddit | 4 comments
What's the simplest way to get an uncensored LLM with image generation set up in the cloud? If one doesn't need much customization and to play with many options, but just wants speed and ease-of-use, what's the best way?
Evening_Ad6637@reddit
An LLM that generates images? I'm only aware of Janus from DeepSeek so far. Runpod isn't too hard, but still doesn't fit your requirements. Maybe a Hugging Face Space? Otherwise, why not API services? OpenRouter for the LLM and Replicate for image generation
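To illustrate the API-services route: a minimal Python sketch of a call to OpenRouter's OpenAI-compatible chat endpoint. The model slug is a placeholder and the response shape is assumed to match the standard OpenAI chat-completions format; check OpenRouter's docs for current details.

```python
# Hedged sketch: calling OpenRouter's OpenAI-compatible chat endpoint.
# The model slug below is a placeholder, not a recommendation.
import json
import urllib.request

API_URL = "https://openrouter.ai/api/v1/chat/completions"


def build_chat_request(api_key: str, model: str, prompt: str):
    """Build the (url, headers, body) triple for a chat-completion call."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,  # e.g. whatever uncensored model slug you pick
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return API_URL, headers, body


def chat(api_key: str, model: str, prompt: str) -> str:
    """Send the request and return the assistant's reply text
    (assumes the standard OpenAI-style response layout)."""
    url, headers, body = build_chat_request(api_key, model, prompt)
    req = urllib.request.Request(url, data=body, headers=headers)
    with urllib.request.urlopen(req) as resp:
        data = json.loads(resp.read().decode("utf-8"))
    return data["choices"][0]["message"]["content"]
```

Replicate works the same way for the image side: one authenticated POST per generation, so no GPU rental or setup on your end.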
TheRealMasonMac@reddit
There is https://github.com/ByteDance-Seed/Bagel, but these unified models are all inferior to Flux ATM.
Evening_Ad6637@reddit
Thx for the link. Looks very promising!
henk717@reddit
In the cloud as in a private instance hosted remotely?
We have a free colab template at https://koboldai.org/colabcpp or you can do GPU rental from hosts such as https://koboldai.org/runpodcpp
KoboldCpp being one of the few that includes both the LLM engine and a basic image gen engine.
If you need help getting it set up and have Discord, https://koboldai.org/discord is a good place to ask; if you don't have Discord, I'll try to assist as best I can.