Tired of bloated requirements.txt files? Meet genreq
Posted by TheChosenMenace@reddit | Python | 48 comments
Genreq – A smarter way to generate requirements files.
What My Project Does:
I built GenReq, a Python CLI tool that:
- Scans your Python files for import statements
- Cross-checks with your virtual environment
- Outputs only the used and installed packages into requirements.txt
- Warns you about installed packages that are never imported
Works recursively (default depth = 4), ignores venv/, and supports custom virtualenv names with --add-venv-name.
Install it now:
pip install genreq
genreq .
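The feature list above amounts to static import scanning plus a check against what's installed. As a rough, illustrative sketch of the scanning half (this is not genreq's actual code, just the standard library's ast module):

```python
# Minimal, illustrative import scanner (not genreq's actual implementation).
import ast
from pathlib import Path

def find_imports(root: str, max_depth: int = 4) -> set[str]:
    """Collect top-level module names imported by .py files under root."""
    found: set[str] = set()
    root_path = Path(root)
    for path in root_path.rglob("*.py"):
        rel = path.relative_to(root_path)
        # Mimic a recursion-depth limit and skip virtualenv directories.
        if len(rel.parts) > max_depth or "venv" in rel.parts:
            continue
        tree = ast.parse(path.read_text(encoding="utf-8"), filename=str(path))
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                found.update(alias.name.split(".")[0] for alias in node.names)
            elif isinstance(node, ast.ImportFrom) and node.module and node.level == 0:
                found.add(node.module.split(".")[0])
    return found

if __name__ == "__main__":
    print(sorted(find_imports(".")))
```

A real tool also has to cross-check these names against the virtual environment and map import names to distribution names, which is where most of the subtlety lies (see the comments below).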
Target Audience:
Developers working on production code as well as hobby programmers should find it useful.
Comparison:
It has no dependencies and is lightweight and standalone.
FrontAd9873@reddit
It’s been a while since I’ve felt the need to “freeze” my dependencies in a requirements.txt file. Can anyone help me understand why this is such a common thing?
thisismyfavoritename@reddit
Let's say you want to use your software somewhere else. What happens if a library you are using, or one of its dependencies, has a new latest version?
FrontAd9873@reddit
Interesting! It’s odd they don’t support the standard pyproject.toml file too.
thisismyfavoritename@reddit
No, don't use that thing. There are other, better solutions out there.
FrontAd9873@reddit
Why did you edit your original comment? You said something about “Google Cloud Functions” requiring requirements files.
Why wouldn’t you use pyproject.tomls? Aren’t they the official file to track dependencies and other metadata for Python packaging?
thisismyfavoritename@reddit
i think you're confused buddy
FrontAd9873@reddit
OK buddy, thanks for your concern!
Here’s the PEP dictating use of pyproject.toml:
PEP 621: https://peps.python.org/pep-0621/
_squik@reddit
I create quite a few Google Cloud Functions at work and those require a requirements.txt file. I use
uv export -o src/requirements.txt
to freeze deps, then deploy the src folder.
muneriver@reddit
Use `uv` with a `pyproject.toml`, then run
`uv pip compile pyproject.toml -o requirements.txt`
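For context, the `pyproject.toml` side can be as small as the following (project name and dependency are placeholders); `uv pip compile` then resolves the direct dependencies it lists, plus their transitive dependencies, into pinned versions in requirements.txt:

```toml
# Minimal pyproject.toml (names are placeholders)
[project]
name = "myapp"
version = "0.1.0"
dependencies = [
    "requests>=2.31",
]
```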
yc_hk@reddit
But why even bother compiling? Just use the uv.lock file for syncing.
_squik@reddit
You don't even need to go to `uv pip` for this. Just run `uv export`: https://docs.astral.sh/uv/reference/cli/#uv-export
muneriver@reddit
Even better! I just pasted straight from the docs lol. But same idea: let uv do the work since it makes it so easy.
thisismyfavoritename@reddit
Or pip-compile from pip-tools.
Amazing_Learn@reddit
I think this may be dangerous (for example, see https://pypi.org/project/rest-framework-simplejwt/ ): there's no guarantee that the import name is the same as the package name on PyPI. Also, people generally favor `pyproject.toml` over `requirements.txt`; it solves the problem of it being "bloated" since it only contains direct dependencies.
FrontAd9873@reddit
I assumed this tool translated from the import name to the distribution name (somehow). If it doesn’t, that makes this tool a non-starter.
Also, pyproject.toml and requirements.txt serve two different purposes. The first lists project dependencies (think of it like ingredients for a recipe). The second lists a specific set of packages and versions which meets the requirements set out by the dependencies (think of it like a grocery list).
pyproject.toml might say I need some_lib~=1.2.0. It says nothing about where to find a suitable version. requirements.txt might say some_lib==1.4.6, or contain a link to a private Git repo or local file path (which you can’t put in pyproject.toml). So it specifies a specific version and often a place to find it.
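To make that distinction concrete with the hypothetical some_lib from the comment:

```toml
# pyproject.toml -- the "recipe": what the project needs
[project]
name = "myapp"
version = "0.1.0"
dependencies = ["some_lib~=1.2.0"]
```

```
# requirements.txt -- the "grocery list": an exact, installable set
some_lib==1.4.6
# or a specific source (URL is purely illustrative):
# some_lib @ git+https://example.com/yourorg/some_lib.git@v1.4.6
```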
Justicia-Gai@reddit
In other languages, you can get the dependency tree from the toml file, which is more useful IMO.
And you can put specific versions in the toml file.
We’re not there yet but toml might become as ubiquitous as git, hopefully. It would be nice.
FrontAd9873@reddit
Unsure what you’re getting at. I never said you can’t put specific versions in the pyproject.toml.
mfitzp@reddit
You can, or at least it works with uv
FrontAd9873@reddit
Thanks for the correction! I guess in my mind it was impossible because it seems like poor practice.
Amazing_Learn@reddit
requirements.txt doesn't have to list all the packages with their specific versions; you have lockfiles for that.
FrontAd9873@reddit
Lockfiles are a more recent thing. I’m just referring to the old distinction. requirements.txt files don’t need to refer to anything, indeed they are totally optional. I’m just delineating the standard understanding of how they differ from a dependency list as you’d find in pyproject.toml.
Amazing_Learn@reddit
Well, you're right, I can only collect opinions and feedback from my coworkers and friends.
Historically you didn't really have anything similar to lockfiles, and `requirements.txt` was the only way to declare dependencies; some people only specified direct dependencies, some did `pip freeze`.
I only started programming in 2018 and working in \~2020, quickly jumping from: `pip` -> `pipfile` -> `poetry` -> `pdm` -> `uv`, all of which except pip used a toml configuration file and generated lockfiles.
Coming back to the topic of genreq/pipreqs itself: I don't see a benefit to it in anything besides small scripts which you may want to run without installing all the requirements manually. Neither project solves the "bloat" of the `requirements.txt` file, since that only occurs if you want to pin everything, including your project's transitive dependencies. You also run into the problem of dependency confusion. For example, I maintain a fork of `passlib` under the `libpass` name, but to maintain backwards compatibility it distributes the files under the `passlib` package, not `libpass`. The aforementioned `rest-framework-simplejwt` is another good example, where the project had a different distribution package name and project name on PyPI from the start.
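One way a tool could bridge that gap is the installed-package metadata in the standard library, which maps top-level import names back to the distributions that provide them. A minimal sketch (Python 3.10+; output depends on what's installed in your environment):

```python
# Map import names to the distributions that provide them (Python 3.10+).
from importlib.metadata import packages_distributions

mapping = packages_distributions()
# e.g. the `passlib` import name may be provided by the `libpass` distribution,
# so pinning "passlib" in requirements.txt would install a different project.
print(mapping.get("passlib"))
```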
martinky24@reddit
I’ve never felt like my requirements file was “bloated”
TheChosenMenace@reddit (OP)
I guess, rather than bloated, it would be complicated when you have hundreds of packages and need a tool that warns you about installed packages that are never imported and ones that are imported but not installed. In a sense, it is a more fine-tuned alternative to pip freeze, which can add packages you are not even using anymore and doesn't warn you if you are missing some.
FrontAd9873@reddit
Why are installed but never imported packages a problem? Wouldn’t any project with a few dependencies have dozens of such indirect dependencies?
I don’t see why I would want to be warned about these. I likely even wouldn’t want them in my requirements.txt.
zacker150@reddit
Because they make your docker images unnecessarily large.
FrontAd9873@reddit
How? An installed package is usually installed because it is necessary, even if it is not imported by my code.
zacker150@reddit
Code rot, which inevitably happens in large complex codebases. For example, a library gets pulled in for one feature, the feature is later removed, but the package stays installed and pinned.
FrontAd9873@reddit
What you’re describing isn’t what I asked about. I asked why installed but not imported packages are a “problem,” ie why they should raise a warning in this tool.
Yes the situation you’re describing does lead to installed but not imported packages, but the presence of installed but not imported packages is not a guarantee that the situation you’re describing has occurred. It could occur because… transitive dependencies are a thing.
Transitive dependencies are still dependencies, so they're hardly unnecessary, as implied by your comment about them leading to "unnecessarily large" Docker images.
zacker150@reddit
Transitive dependencies shouldn't be defined in your requirements.in file. Pip will automatically install them when you do pip install. If you want to pin transitive dependencies, you should use pip-compile.
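A minimal sketch of that split (the package name is a placeholder):

```
# requirements.in -- direct dependencies only
requests>=2.31
```

Running `pip-compile requirements.in` (from pip-tools) then writes a fully pinned requirements.txt with the transitive dependencies included.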
This tool does the exact same thing as Deptry.
FrontAd9873@reddit
I agree about requirements.txt and transitive dependencies.
This tool does not do what Deptry does since it only works on requirements.txt files.
TheChosenMenace@reddit (OP)
A warning is exactly just that, a warning. If you're optimizing for disk space (which I actually suffer from), having useless packages might be critical. If you decide to replace fastapi with astral, it would be nice to be warned about the (very much still existing) fastapi package.
FrontAd9873@reddit
Sure, but a package not being imported doesn’t mean you’re not using it. I guess you meant “recursively imported” or something.
Spitfire1900@reddit
If you want to make a tool that scans for extra requirements that’s a fine idea, but it should use the installed metadata to do that.
The correct fix for a bloated requirements.txt is to move to pyproject.toml or requirements.in.
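For what it's worth, that installed metadata is readable from the standard library; a minimal sketch (this only lists distributions and their declared requirements, not a full unused-dependency checker):

```python
# List installed distributions with their versions and declared requirements.
from importlib.metadata import distributions

for dist in distributions():
    print(dist.metadata["Name"], dist.version, dist.requires or [])
```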
mrswats@reddit
Declaring your dependencies in pyproject.toml and compiling into a requirements.txt with pip-tools is more than enough. No bloat. Easy to use.
anentropic@reddit
This seems to be solving a non-problem that is already better handled by existing tools
ReachingForVega@reddit
Wait until you see a uv toml if you think requirements.txt files are bloated.
Coretaxxe@reddit
How does it handle extras and mismatched packages?
For example, pycord imports as discord, and the pycord[voice] extra never appears as an import at all.
_MicroWave_@reddit
This isn't a good idea.
You should be using the pyproject.toml as specified in the standard.
UV is the vogue tool for doing this.
daemonengineer@reddit
Just... No. Yet another way to manage python dependencies is not what I need, and I don't think the ecosystem needs it.
TopSwagCode@reddit
I use requirements.in to compile my requirements.txt
FrontAd9873@reddit
Btw, I think deptry is an obvious comparison to this tool, but it works where you define your dependencies and not just on requirements.txt files.
TheChosenMenace@reddit (OP)
Well, you don't even need a requirements.txt! You set the directory, the recursion depth, and the virtual env, and it will automatically scan all Python files and create one for you, plus warn you about installed packages that are never imported and ones that are imported but not installed.
FrontAd9873@reddit
If I don’t have a requirements.txt it is because I do not want one… I rarely see the use for one.
Wouldn’t your tool be more useful if it worked on dependencies listed in pyproject.toml?
requirements.txt is not meant for dependencies, really.
TheChosenMenace@reddit (OP)
I see your point, and this is actually a good feature to keep in mind: a flag to enable using pyproject.toml. However, a lot of developers, including me, still have great use for a requirements.txt, which is what this project was (initially) targeted at.
DuckSaxaphone@reddit
I actually think this is a solid idea for a tool, despite some of the comments you've been getting.
That said, pyproject.toml files are the industry standard so your library needs to support them.
ou_ryperd@reddit
Does "ignores venv/" mean it will also work if a setup doesn't use venv?
PurepointDog@reddit
This is a good check, thanks