Cache python packages from requirements.txt
Posted by Euchale@reddit | LocalLLaMA | View on Reddit | 12 comments
Is there any way to cache the packages I download via a requirements.txt? I feel like whenever I try out a new UI or a new tool I am redownloading the same, generally huge packages over and over (looking at you torch). I am on Linux if that makes a difference.
segmond@reddit
Have you thought about asking your favorite local LLM?
Euchale@reddit (OP)
That doesn't really tell me how to do `pip install -r requirements.txt` with caching. I guess it tells me how not to cache...
No_Afternoon_4260@reddit
Afaik it does it automatically; it only re-downloads if the requirements.txt states another version. Tell me if I'm wrong.
Euchale@reddit (OP)
There can't be that many different versions of torchvision. It also downloads it again if I delete the folder (the git-cloned folder, not the cache) to reinstall because I fucked up.
No_Afternoon_4260@reddit
I must be mistaken then.
Euchale@reddit (OP)
According to the documentation you are correct.
usrlocalben@reddit
Just use uv.
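For context, uv keeps one global wheel cache shared across projects, so a package like torch is downloaded once and reused everywhere. A minimal sketch (assumes uv is installed, e.g. via `pipx install uv`; guarded so it no-ops where uv or a requirements.txt isn't available):

```shell
# uv's cache is global (typically ~/.cache/uv on Linux), so repeated installs
# of the same package across projects hit disk instead of the network.
if command -v uv >/dev/null 2>&1 && [ -f requirements.txt ]; then
    uv venv .venv                       # create a virtual environment
    uv pip install -r requirements.txt  # pip-compatible install, served from the cache
    uv cache dir                        # print the shared cache location
fi
```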
Traditional-Gap-3313@reddit
> extremely fast python package manager
> written in rust
kek
Minute_Attempt3063@reddit
It's better than venv, imho.
Traditional-Gap-3313@reddit
I'm not saying anything about the quality of the app, only that it's funny that a better *python* package manager is written in rust. Isn't it?
alexkhvlg@reddit
PIP_CACHE_DIR
UnreasonableEconomy@reddit
We're doing some docker stuff where we have a prebuilt nvidia/torch image; you can then just build on top of that and docker will recycle the base image.
I've grown kind of fond of that: you almost don't need to have python installed "on your system" anymore, and managing different versions or any of this venv crap is gone too.
And for some reason, some of the servers on these images start faster too, so the build/test roundtrip is faster as well.
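A minimal sketch of that setup, assuming an NVIDIA PyTorch base image (the tag and `app.py` are illustrative). Because torch already lives in the base layer, only the project-specific requirements get installed, and copying `requirements.txt` before the rest of the source keeps that install layer cached until the requirements actually change:

```dockerfile
# Base image ships with CUDA-enabled torch preinstalled; tag is illustrative.
FROM nvcr.io/nvidia/pytorch:24.01-py3

WORKDIR /app

# Copy requirements first so this layer is reused until requirements.txt changes.
COPY requirements.txt .
RUN pip install -r requirements.txt

# Source changes only invalidate the layers below this point.
COPY . .
CMD ["python", "app.py"]
```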