sbg / dante
Dante is a Python dependency validation tool
License: Apache License 2.0
Unless I have grossly misunderstood something, I reckon there is a bug in the requirements_files list reported when running dante config. For the following setup.cfg...
[dante]
allow_named_versions = true
named_version_patterns =
0.*version
lock_file_path = requirements-dev.lock
requirements_files =
requirements-dev.txt
lock_files =
requirements-dev.lock
...I'd expect requirements-dev.txt to be there instead of requirements-dev.lock.
{
    "dante": {
        "requirements_files": [
            "requirements-dev.lock"
        ],
        "lock_files": [
            "requirements-dev.lock"
        ]
    }
}
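To illustrate where the swap might happen, here is a minimal, hypothetical reproduction (this is not dante's code): Python's configparser, reading a trimmed-down version of the setup.cfg above, returns the values exactly as written, so the requirements_files/lock_files mix-up would have to occur later, in dante's own config handling.

```python
import configparser

# Trimmed-down version of the setup.cfg from the report above.
cfg_text = """
[dante]
lock_file_path = requirements-dev.lock
requirements_files =
    requirements-dev.txt
lock_files =
    requirements-dev.lock
"""

parser = configparser.ConfigParser()
parser.read_string(cfg_text)

# configparser preserves the multi-line option values as written;
# splitting on whitespace yields one entry per line.
requirements = parser.get("dante", "requirements_files").split()
locks = parser.get("dante", "lock_files").split()

print(requirements)  # ['requirements-dev.txt']
print(locks)         # ['requirements-dev.lock']
```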
When creating a list of packages and their requirements (e.g. for a dependency graph), extras (as in the extras_require argument of setuptools.setup) are ignored. This results in missing package requirements.
A project has pkga with the pkgc extra as a requirement in requirements.txt:
pkga[pkgc]
The (simplified) setup.py of pkga would be the following:
setup(
name="pkga",
version="1.0.0",
install_requires=["pkgb"],
extras_require={
"pkgc": ["pkgc"]
}
)
This means that pkga requires pkgb, but if installed as pkga[pkgc] it also requires pkgc.
Dante ignores the extras and compiles the requirements as:
pkga
|-> pkgb
|-> ...
...while the expected result would be:
pkga
|-> pkgb
|-> ...
|-> pkgc
|-> ...
However, this is a simple example; there might be multiple extras
(e.g. pkga[pkgc,pkgz]), so gathering all the requirements gets more complex.
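For what it's worth, pkg_resources (which dante already builds on) parses extras out of a requirement string directly, so even the multi-extra case is easy to read back:

```python
import pkg_resources

# Parse a requirement line as it would appear in requirements.txt;
# the extras are exposed as a tuple on the parsed Requirement.
req = pkg_resources.Requirement.parse("pkga[pkgc,pkgz]")

print(req.key)             # 'pkga'
print(sorted(req.extras))  # ['pkgc', 'pkgz']
```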
Getting the extras is not difficult. dante.core.models.Dependency
could have an extras
property, so that in blocks like dante.core.models.Package.requirements
you could pass the extras
around and get the correct requirements:
return RequirementCollection(sorted([
    Requirement(
        key=requirement.key.lower(),
        name=requirement.name,
        obj=requirement,
        version=RequiredVersion(obj=requirement.specifier),
        extras=requirement.extras,
        _ignore_list=self._ignore_list
    )
    for requirement in self.obj.requires(extras=self.extras)
    if requirement.key.lower() not in self._ignore_list
]))
However, it seems to me that the creation of the graph starts with pkg_resources.working_set,
which is a set of packages that does not contain the information on which extras were used for each package.
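One hypothetical workaround (a sketch, not a proposal for dante's actual implementation): since working_set lacks the extras information but the requirements files carry it, the files could be parsed up front and the extras keyed by project name, ready to be passed into Distribution.requires(extras=...):

```python
import pkg_resources

# Lines as they would be read from requirements.txt (illustrative).
lines = ["pkga[pkgc,pkgz]", "pkgb"]

# Build a name -> extras map; req.key is the normalized (lowercase)
# project name and req.extras is a tuple of extra names.
extras_by_name = {
    req.key: req.extras
    for req in pkg_resources.parse_requirements(lines)
}

print(sorted(extras_by_name["pkga"]))  # ['pkgc', 'pkgz']
print(extras_by_name["pkgb"])          # ()
```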
I think it would be a good idea to add dante check
and dante lock
as pre-commit hooks, using the pre-commit framework.
By default, the hooks could use requirements-dev.txt
and requirements.txt
as requirements files, and requirements-dev.lock
and requirements.lock
as lock files.
While running the hook, this configuration could be overridden by the developers, either through pre-commit's args
option, or through their own setup.cfg.
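A sketch of what a user's .pre-commit-config.yaml entry might look like under this proposal. The hook ids and the rev tag are illustrative assumptions; they presume dante would publish a .pre-commit-hooks.yaml defining them:

```yaml
repos:
  - repo: https://github.com/sbg/dante
    rev: v2.0.0  # illustrative tag, not an actual release reference
    hooks:
      - id: dante-check   # hypothetical hook id
      - id: dante-lock    # hypothetical hook id
        args: ["-r", "requirements-dev.txt", "-f", "requirements-dev.lock"]
```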
If you agree with this, I can write the PR myself.
If I have multiple requirements files, with packages from all of them installed in the current environment, then running dante lock -s
locks all installed packages into lock_file_path,
even those that are listed in a different requirements file. So instead I need to perform multiple lock calls:
dante lock -s -r requirements.txt -f requirements.lock
dante lock -s -r requirements-dev.txt -f requirements-dev.lock
...
With the existing support for multiple requirements files and multiple lock files, wouldn't it be better if there were a clear relation between each requirements file and its lock file, so that dante lock -s
could modify multiple lock files at once?
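The pairing proposed here could be as simple as zipping the two configured lists positionally. A sketch of the intended behaviour (illustrative only, not dante's implementation):

```python
# Pair each configured requirements file with the lock file at the
# same position, then lock each pair in a single `dante lock -s` run.
requirements_files = ["requirements.txt", "requirements-dev.txt"]
lock_files = ["requirements.lock", "requirements-dev.lock"]

commands = [
    f"dante lock -s -r {req} -f {lock}"
    for req, lock in zip(requirements_files, lock_files)
]
for cmd in commands:
    print(cmd)
# dante lock -s -r requirements.txt -f requirements.lock
# dante lock -s -r requirements-dev.txt -f requirements-dev.lock
```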