Complete beginner: How do I use LM Studio to run AI locally with zero data leaving my PC? I want complete privacy
Posted by Ill-Permission6686@reddit | LocalLLaMA | 48 comments
I'm trying to find an AI solution where my prompts and data never leave my PC at all. I don't want any company training their models on my stuff.
I downloaded LM Studio because I heard it runs everything locally, but honestly I'm a bit lost. I have no idea what I'm doing.
A few questions:
- Does LM Studio actually keep everything 100% local? no data sent anywhere?
- What model should I use? Does the model choice even matter?
- Any other settings I should tweak to make sure no data is leaving my PC? or being used or sent to someone else's cloud or server?
I'm on Windows if that matters. Looking for something general purpose—chat, writing help, basic coding stuff.
Is there a better option for complete privacy? please let me know!
Thanks in advance!
dumbass1337@reddit
Naughty boy.
Ill-Permission6686@reddit (OP)
bro its for work
PontusFrykter@reddit
No need for the fairy tales, we all understand
Ill-Permission6686@reddit (OP)
LOOOOOOOOOL
KoreaTrader@reddit
Huh?
Techniboy@reddit
Surrrrrre
Excellent_Spell1677@reddit
LM Studio and Ollama are the easiest ways to run local models. Your GPU's VRAM dictates the size of models you can run: the model's file size (the weights) should fit within VRAM, with room to spare for the context window. MoE models run quicker. Higher parameter counts are better / have more knowledge baked in.
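That "weights plus headroom" rule of thumb can be sketched in a couple of lines of shell (the 25% headroom factor is an assumption; actual KV-cache needs vary with context length and model architecture):

```shell
# Will a model fit in VRAM? Rule of thumb: file size + ~25% headroom
# for the context window / KV cache (the 1.25 factor is an assumption).
fits_in_vram() {
  local model_gb=$1 vram_gb=$2
  awk -v m="$model_gb" -v v="$vram_gb" 'BEGIN { exit !(m * 1.25 <= v) }'
}

fits_in_vram 4.7 8 && echo "a ~4.7 GB Q4 7B fits in 8 GB"
fits_in_vram 13 8  || echo "13 GB of weights will not fit in 8 GB"
```

If a model doesn't fit, a smaller quantization (e.g. Q4 instead of Q8) is usually the first lever to pull.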
The model is entirely on your machine so nothing leaves it because of the LLM. If you upload or save chats on OneDrive then that is shared outside but that's not the model.
If you want to test it, turn off WiFi /disconnect Ethernet and you will see the model runs on your machine solely.
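If you'd rather run that test from a terminal, a minimal sketch assuming NetworkManager (`nmcli`) and Ollama are installed — the model name is just an example:

```shell
# Take the whole network down, ask the model something, bring it back up.
# A reply while offline proves inference happens entirely on your machine.
nmcli networking off
ollama run llama3.2 "Say hello in five words."
nmcli networking on
```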
Ill-Permission6686@reddit (OP)
Thanks for replying! I tried LM Studio offline and it worked, but I'm worried that it might log my data somewhere and then send it when the internet is back on. Does the AI model I use inside LM Studio or Ollama matter? Or does LM Studio only allow AI models that are 100% private according to their privacy policy? I'll check out Ollama now. Thanks a ton again!
Excellent_Spell1677@reddit
I guess if that's a concern, just chat to Ollama models in the CLI. It doesn't save the chats unless you save them. The UI in LM Studio saves them locally, but you can always delete them. Anything is possible, but you have to look at how probable it is that they are secretly collecting chats from millions of folks.
👍
ThesePleiades@reddit
Very probable. Ever heard about big data?
Excellent_Spell1677@reddit
There is a shortage of memory chips. I'm sure they are using it to store and catalog what YOU think. 😉
Real_Ebb_7417@reddit
If you want privacy, just install Ubuntu next to Windows (as others mentioned, Windows isn't too private xd). If you have 50 GB of disk space to spare, that's enough to install Ubuntu and all the necessary tooling for models. Then you can have a shared partition with Windows where you actually store the models, so you can run them via Ubuntu or Windows, whichever you prefer.
see_spot_ruminate@reddit
Even that is overkill. You could just get a USB thumb drive, run it off there, and unplug it when you want the USB port for something else. Don't put the models on the thumb drive, as it would be too slow to load them, but otherwise it will probably be fast enough.
ea_man@reddit
That runs in RAM
see_spot_ruminate@reddit
It all runs in RAM, where do you think your shiny SSD loads it to?
ea_man@reddit
A live Ubuntu loads into RAM: it reads the whole filesystem into RAM, as if you'd passed the toram parameter to GRUB, even if you make the fs "persistent".
Good luck running an OS from USB.
see_spot_ruminate@reddit
You do not need to use a "live" usb, you can treat the usb drive as any other regular drive, eg /dev/sda1, you dum dum
ea_man@reddit
dum dum may be you because:
nobody does that because it's dumb (the fs self-destructs, it's slow), and "running Linux from USB" is commonly understood to mean a live USB fs
So go out and touch grass before calling names.
see_spot_ruminate@reddit
It is done all the time. It is not commonly understood as that, you can run it on that drive or a potato.
Check out https://www.reddit.com/r/linuxupskillchallenge/ to help with your dum dum nature. We have all been dum before, don't stay dum.
ea_man@reddit
Yeah the smart folks of "we run OS on NAND Flash memory", I know the type. There was a time the cool ones made RAID on those, maybe you can go try that and impress the interweb people.
see_spot_ruminate@reddit
It all runs in RAM? Do you think when you install Windows on an SSD it runs only on the SSD and never touches RAM?
Real_Ebb_7417@reddit
I think what he could have meant is that you can actually install a temporary Linux instance purely in RAM (I actually did that partially to install Linux, because I didn't have a USB to install it "normally" xd)
see_spot_ruminate@reddit
No, install it on a flash drive, not a live USB. I'm being downvoted because people are too dumb to understand that you can install an OS on a USB drive... asuka_looking_down_pathetic.tiff
Real_Ebb_7417@reddit
Yeah you can +1. But if you have spare disk space, why not install it there next to Windows? Is there a reason? (I'm genuinely curious)
see_spot_ruminate@reddit
Windows will often mess up your grub bootloader if on the same drive.
Best to keep on a separate drive so that you don't do "whoopsies", eg overwriting that partition that contains windows where you have put your manifesto. The coffee shop barista will never let you live that down.
Ill-Permission6686@reddit (OP)
Thank you for replying! I'm thinking of running LM Studio on Lubuntu, probably with Qwen3.5-9B since it's not tied to big companies like Microsoft or Google. But I'm still exploring my options. Are there any specific tools you'd recommend?
erisian2342@reddit
Dude. Qwen is made by Alibaba Cloud, a subsidiary of Alibaba Group. Alibaba Group reported revenue of about $137 billion USD in its last fiscal year. They are a big company exactly like Microsoft and Google.
Ill-Permission6686@reddit (OP)
Ohh, I rly need to do more research, thanks for letting me know! I honestly thought it was made by one random guy
Real_Ebb_7417@reddit
Qwen3.5 is cool, but just to clarify for you -> it doesn't matter if a model was made by some corpo, by a Chinese open-source lab, or by some random guy in his basement. If you download the model and run it on your own PC, it will be safe and private (as long as all the tooling around it is private, e.g. Ubuntu vs Windows). So you don't have to limit yourself to certain models out of privacy fears about frontier-lab models; run locally, they're just as private.
I haven't used LM Studio personally, so I can't speak for it (I know it uses llama.cpp underneath, though). But in my experience, wrappers around llama.cpp (e.g. LM Studio, oobabooga; I think Ollama uses llama.cpp under the hood as well) are worse than bare llama.cpp: they tend to end up with slower inference. They're convenient for someone inexperienced, but I ran llama.cpp back when I didn't know much about any of this and it was fine. Just ask ChatGPT/Claude/whatever you use as a daily driver for a step-by-step setup guide and it'll work :P
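For what it's worth, bare llama.cpp really is just a command or two once it's built; a sketch with a placeholder model path (substitute whatever GGUF you downloaded — `-ngl` offloads layers to the GPU, `-c` sets the context window):

```shell
# One-off chat from the CLI (model path is a placeholder)
./llama-cli -m models/my-model.gguf -ngl 99 -c 8192 \
  -p "Explain what a context window is."

# Or serve an OpenAI-compatible API bound to localhost only,
# so nothing is exposed to the network:
./llama-server -m models/my-model.gguf --host 127.0.0.1 --port 8080
```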
ludacris016@reddit
Tell the Windows firewall to block network access for specific applications.
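A minimal sketch with `netsh`, run from an elevated prompt — the LM Studio install path below is an assumption, so point it at wherever your copy actually lives:

```shell
REM Block all outbound traffic from the LM Studio executable.
REM The program path is an assumption; adjust to your actual install location.
netsh advfirewall firewall add rule name="Block LM Studio outbound" dir=out action=block program="%LOCALAPPDATA%\Programs\LM Studio\LM Studio.exe"
```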
Ill-Permission6686@reddit (OP)
love that, thank you! But I'll be using Lubuntu from now on. Is there a similar method in Lubuntu?
MelodicRecognition7@reddit
https://github.com/evilsocket/opensnitch/
bura_laga_toh_soja@reddit
Yup lubuntu...the smooth version of Ubuntu!
ForsookComparison@reddit
Firewalld, ufw, etc.. Ask an LLM how to set it up and simulate a test.
There's also bubblewrap and Firejail; all come with pros and cons.
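A couple of these, sketched concretely with Ollama as the example target (package names vary by distro):

```shell
# Firejail: launch a program with no network stack at all
firejail --net=none ollama run llama3.2

# ufw: default-deny all outbound traffic, then open only what you need
sudo ufw default deny outgoing
sudo ufw allow out 53    # e.g. keep DNS working for the rest of the system
sudo ufw enable
```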
You can also just set up a Lubuntu external storage device that live-boots only into RAM, do your stuff with networking unplugged, then shut down and everything is gone for good. That's a very secure way, but with a good bit more setup.
emreloperr@reddit
LMS is not open source. You can't read their application source code to verify their privacy claims. You gotta trust their privacy policy.
According to their policy, none of your private data leaves your computer. Your conversation history is safe, if you trust them.
If you wanna go open source route then you can try Ollama + Open WebUI combo.
Model choice doesn't matter for privacy.
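The open-source route above is typically a native Ollama install plus Open WebUI in Docker; a sketch based on the image Open WebUI publishes (double-check the flags against their current README):

```shell
# Open WebUI in Docker, talking to a natively installed Ollama on the host
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
# Then browse to http://localhost:3000 and everything stays on your machine.
```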
Ill-Permission6686@reddit (OP)
Thank you!
idiotiesystemique@reddit
Yes. That's what I use. Works great
Just_Maintenance@reddit
Both are 100% private. It's just that with LM Studio you can't check the code.
Cereal_Grapeist@reddit
Hmm that depends. Are you needing privacy?
Ill-Permission6686@reddit (OP)
yes, I need my data to not leave my machine.
cptbeard@reddit
The question is how much you need the data not to leave your machine. If "absolutely never" (and assuming you can't just destroy the data or not have it in the first place), the answer's still pretty easy: first of all, don't have any kind of networking on the machine. After that it becomes a question of physical security (you could put the PC in a windowless room underground and set up a thermite charge that burns the hard drive if someone enters without the correct biometrics, etc.)
Everything in this world is a compromise; how much of a compromise you're willing to tolerate is something you have to figure out.
VibeMcCode@reddit
Why type all this useless shit?
ForsookComparison@reddit
[my friend..]()
Ill-Permission6686@reddit (OP)
Thank you!!
ForsookComparison@reddit
Narrator: [OP did not install Linux and an underpaid Microsoft employee watched as his ad-profile became heavily weighted towards the weird shit he did at home]
Red_Redditor_Reddit@reddit
Nooooooo... shit.
OP, if you want any privacy you're going to need to get away from Windows. Even the online LLMs aren't as bad as Windows, because at least you choose what you give them.
ForsookComparison@reddit
Yeah, even if I turn off my need to Linux-circlejerk for a moment: just think about it. Your entire OS exists to profile you. No hygiene/habits can possibly overcome that, outside of running it airgapped and always re-imaging before it ever sees a network again.
Red_Redditor_Reddit@reddit
Yeah I never thought it would get this out of control, nor did I think people would acquiesce. We're finally at the year of the linux desktop, and it's because there's literally nothing else that's a desktop anymore. Everything else is basically either a smartphone or a smartphone with desktop legacy, all of which can't operate independently of the internet.