I made a free site with file tools + a local AI chat that connects to Ollama
Posted by opal-emporium@reddit | LocalLLaMA | 8 comments
I've been working on a side project called Practical Web Tools and figured I'd share it here.
It's basically a collection of free browser-based utilities: PDF converters, file compressors, format changers, that kind of stuff. Nothing groundbreaking, but I got tired of sites that either paywall basic features or make you upload files to god-knows-where. Most of the processing happens in your browser so your files stay on your device.
The thing I'm most excited about is a local AI chat interface I just added. It connects directly to Ollama so you can chat with models running on your own machine. No API keys, no usage limits, no sending your conversations to some company's servers. If you've been curious about local LLMs but don't love the command line, it might be worth checking out.
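For anyone curious how that kind of connection works, a chat request to Ollama's local HTTP API is a single fetch call. A minimal sketch, assuming a pulled example model and Ollama's default port (browsers may also need OLLAMA_ORIGINS set on the Ollama side to allow cross-origin requests):

```typescript
// Minimal sketch of a browser-side chat call to a local Ollama server.
// Assumes Ollama is running on its default port (11434); the model name
// below is just an example. Browsers enforce CORS, so Ollama may need
// OLLAMA_ORIGINS set to permit requests from the site's origin.
async function chatWithOllama(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2", // example model; any pulled model works
      messages: [{ role: "user", content: prompt }],
      stream: false, // one JSON response instead of a token stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  return data.message.content; // /api/chat nests the reply under `message`
}
```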
Anyway, it's completely free — no accounts, no premium tiers, none of that. Just wanted to make something useful.
Happy to answer questions or take feedback if anyone has suggestions.
valosius@reddit
Hi,
Hmm... for me this isn't really offline. As long as I still have to load a page from the internet, it doesn't make sense for me; I'd rather use proven open-source solutions. Ollama in your setup doesn't work for me either: my Ollama instances aren't on my work laptop, for example, but on a home server. A good integration would mean simply entering the URL of the API and then being able to connect, instead of showing Ollama installation instructions underneath it, which in my eyes are redundant anyway. And if I don't like the command line, I just install a simple open-source web UI, e.g. as a Docker container, and that's that, or I use Pinokio.
Regards, Olav
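The integration described here, pointing the chat at a user-supplied API URL instead of assuming localhost, is a small amount of code. A sketch, using Ollama's real /api/tags model-listing endpoint; the home-server address in the usage note is hypothetical:

```typescript
// Sketch: connect to an Ollama instance at a user-supplied URL instead of
// assuming localhost. GET /api/tags lists the models the server has pulled,
// so a successful call doubles as a connection check.
async function listModels(baseUrl: string): Promise<string[]> {
  const res = await fetch(`${baseUrl.replace(/\/$/, "")}/api/tags`);
  if (!res.ok) throw new Error(`Could not reach Ollama at ${baseUrl}`);
  const data = await res.json();
  return data.models.map((m: { name: string }) => m.name);
}

// e.g. listModels("http://192.168.1.50:11434") // hypothetical home server
```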
Substantial_Ad5570@reddit
Nice. Fucking subscriptions and paywalls. We will build it ourselves.
random-tomato@reddit
I really like the philosophy/design of the website!!! Just hope you won't add any ads :)
henk717@reddit
Instead of Ollama, it's much better to support the OpenAI API with a custom URL. That is vendor-agnostic.
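What that could look like in practice: a sketch against the OpenAI-compatible chat completions endpoint, which Ollama (under /v1), llama.cpp's server, vLLM, and most other local runtimes expose. The base URL and model in the usage note are placeholders:

```typescript
// Sketch of a vendor-agnostic chat call: any server that implements the
// OpenAI chat completions API works if the base URL is configurable.
async function chatCompletion(
  baseUrl: string,
  model: string,
  prompt: string
): Promise<string> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer not-needed", // local servers usually ignore the key
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`Server returned ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content; // standard OpenAI response shape
}

// e.g. chatCompletion("http://localhost:11434/v1", "llama3.2", "Hello")
```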
MitsotakiShogun@reddit
What was wrong with other open-source alternatives? E.g.:
* https://github.com/Stirling-Tools/Stirling-PDF
* https://github.com/C4illin/ConvertX
* https://github.com/CorentinTh/it-tools
Own_Professional6525@reddit
This is a really thoughtful project! Love the focus on privacy and local AI, plus the free, browser-based tools make it super accessible.
an80sPWNstar@reddit
So it's not something that runs offline in our own network? I clicked on "Get started" and it scrolled down to a feature. I tapped on it and it asked me to upload a file.
opal-emporium@reddit (OP)
All the processing happens on your own device. If you're talking about the file tools, you could theoretically load a page, disconnect from the internet, and the tools will still work... you just can't leave the page you're on. You need the internet to load the site's pages, but the files you select never leave your computer.
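For context, that is just the standard browser File API at work: selecting a file hands the page a File object that scripts read locally, without any network request. A minimal sketch, with a hypothetical input element id:

```typescript
// Sketch of browser-local file handling: the <input type="file"> element
// hands the page a File object, and reading it never touches the network.
const input = document.querySelector<HTMLInputElement>("#file-input")!; // hypothetical id
input.addEventListener("change", async () => {
  const file = input.files?.[0];
  if (!file) return;
  const bytes = new Uint8Array(await file.arrayBuffer()); // read entirely in the browser
  console.log(`${file.name}: ${bytes.length} bytes, processed locally`);
});
```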
The AI chat integration with Ollama works the same way. You can disconnect from the internet after loading the ai-chat page and connecting to Ollama. Ollama is what runs the LLM on your computer.