TextGen (formerly text-generation-webui) is now a native desktop app. An open-source alternative to LM Studio.

Posted by oobabooga4@reddit | LocalLLaMA

Hi all,

I have been making a lot of updates to my project, and I wanted to share them here.

TextGen (previously text-generation-webui, also known by my username oobabooga, or ooba) has been in development since December 2022, before LLaMA and llama.cpp existed.

In the last two months, the project has evolved from a web UI into a no-install desktop app for Windows, Linux, and macOS with a polished UI. I have created a very minimal and elegant Electron integration for this. (Not many people realize that LM Studio is also a web UI running over Electron.)

It works like this:

  1. You download a portable build from the releases page
  2. Unzip it
  3. Double-click textgen
  4. A window appears

There is no installation, and no files are ever created outside the extracted folder; it's fully self-contained. All your chat histories and settings are stored in a user_data folder shipped with the build.

There are builds for CUDA, Vulkan, CPU-only, Mac (Apple Silicon and Intel), and ROCm.

Some differentiating features:

I develop this as a passion project/hobby. It's free and open source (AGPLv3), as always:

https://github.com/oobabooga/textgen