pip-tools's Introduction

pip-tools = pip-compile + pip-sync

A set of command line tools to help you keep your pip-based packages fresh, even when you've pinned them. You do pin them, right? (In building your Python application and its dependencies for production, you want to make sure that your builds are predictable and deterministic.)

Installation

Similar to pip, pip-tools must be installed in each of your project's virtual environments:

$ source /path/to/venv/bin/activate
(venv) $ python -m pip install pip-tools

Note: all of the remaining example commands assume you've activated your project's virtual environment.

Example usage for pip-compile

The pip-compile command lets you compile a requirements.txt file from your dependencies, specified in pyproject.toml, setup.cfg, setup.py, or requirements.in.

Run it with pip-compile or python -m piptools compile (or pipx run --spec pip-tools pip-compile if pipx was installed with the appropriate Python version). If you use multiple Python versions, you can also run py -X.Y -m piptools compile on Windows and pythonX.Y -m piptools compile on other systems.
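
For instance, these invocations are equivalent (assuming a Python 3.11 target; the version is illustrative):

$ pip-compile requirements.in
$ python -m piptools compile requirements.in
$ py -3.11 -m piptools compile requirements.in      # Windows
$ python3.11 -m piptools compile requirements.in    # other systems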

pip-compile should be run from the same virtual environment as your project so conditional dependencies that require a specific Python version, or other environment markers, resolve relative to your project's environment.

Note: If pip-compile finds an existing requirements.txt file that fulfils the dependencies then no changes will be made, even if updates are available. To compile from scratch, first delete the existing requirements.txt file, or see Updating requirements for alternative approaches.
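
For example, to recompile from scratch:

$ rm requirements.txt
$ pip-compile requirements.in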

Requirements from pyproject.toml

The pyproject.toml file is the latest standard for configuring packages and applications, and is recommended for new projects. pip-compile supports both your project.dependencies and your project.optional-dependencies. Because this is an official standard, you can use pip-compile to pin the dependencies of projects that use modern, standards-adhering packaging tools such as Setuptools, Hatch, or flit.

Suppose you have a 'foobar' Python application that is packaged using Setuptools, and you want to pin it for production. You can declare the project metadata as:

[build-system]
requires = ["setuptools", "setuptools-scm"]
build-backend = "setuptools.build_meta"

[project]
requires-python = ">=3.9"
name = "foobar"
dynamic = ["dependencies", "optional-dependencies"]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements.in"] }
optional-dependencies.test = { file = ["requirements-test.txt"] }

Suppose you have a Django application that is packaged using Hatch, and you want to pin it for production, while also pinning your development tools in a separate file. You declare django as a dependency and create an optional dependency dev that includes pytest:

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "my-cool-django-app"
version = "42"
dependencies = ["django"]

[project.optional-dependencies]
dev = ["pytest"]

You can produce your pin files as easily as:

$ pip-compile -o requirements.txt pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --output-file=requirements.txt pyproject.toml
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.3
    # via django

$ pip-compile --extra dev -o dev-requirements.txt pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --extra=dev --output-file=dev-requirements.txt pyproject.toml
#
asgiref==3.6.0
    # via django
attrs==22.2.0
    # via pytest
django==4.1.7
    # via my-cool-django-app (pyproject.toml)
exceptiongroup==1.1.1
    # via pytest
iniconfig==2.0.0
    # via pytest
packaging==23.0
    # via pytest
pluggy==1.0.0
    # via pytest
pytest==7.2.2
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.3
    # via django
tomli==2.0.1
    # via pytest

This is great both for pinning your applications and for keeping the CI of your open-source Python package stable.

Requirements from setup.py and setup.cfg

pip-compile also has full support for setup.py- and setup.cfg-based projects that use setuptools.

Just define your dependencies and extras as usual and run pip-compile as above.
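
For instance, a minimal setup.cfg might look like this (the package and dependency names are illustrative):

[metadata]
name = foobar

[options]
install_requires =
    django

[options.extras_require]
test =
    pytest

You can then run pip-compile setup.cfg, or pip-compile --extra test setup.cfg to include the test extra.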

Requirements from requirements.in

You can also use plain text files for your requirements (e.g. if you don't want your application to be a package). To use a requirements.in file to declare the Django dependency:

# requirements.in
django

Now, run pip-compile requirements.in:

$ pip-compile requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile requirements.in
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via -r requirements.in
sqlparse==0.4.3
    # via django

And it will produce your requirements.txt, with all the Django dependencies (and all underlying dependencies) pinned.

Updating requirements

pip-compile generates a requirements.txt file using the latest versions that fulfil the dependencies you specify in the supported files.

If pip-compile finds an existing requirements.txt file that fulfils the dependencies then no changes will be made, even if updates are available.

To force pip-compile to update all packages in an existing requirements.txt, run pip-compile --upgrade.

To update a specific package to the latest or a specific version use the --upgrade-package or -P flag:

# only update the django package
$ pip-compile --upgrade-package django

# update both the django and requests packages
$ pip-compile --upgrade-package django --upgrade-package requests

# update the django package to the latest, and requests to v2.0.0
$ pip-compile --upgrade-package django --upgrade-package requests==2.0.0

You can combine --upgrade and --upgrade-package in one command, to provide constraints on the allowed upgrades. For example, to upgrade all packages whilst constraining requests to the latest version below 3.0:

$ pip-compile --upgrade --upgrade-package 'requests<3.0'

Using hashes

If you would like to use the Hash-Checking Mode available in pip since version 8.0, pip-compile offers the --generate-hashes flag:

$ pip-compile --generate-hashes requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --generate-hashes requirements.in
#
asgiref==3.6.0 \
    --hash=sha256:71e68008da809b957b7ee4b43dbccff33d1b23519fb8344e33f049897077afac \
    --hash=sha256:9567dfe7bd8d3c8c892227827c41cce860b368104c3431da67a0c5a65a949506
    # via django
django==4.1.7 \
    --hash=sha256:44f714b81c5f190d9d2ddad01a532fe502fa01c4cb8faf1d081f4264ed15dcd8 \
    --hash=sha256:f2f431e75adc40039ace496ad3b9f17227022e8b11566f4b363da44c7e44761e
    # via -r requirements.in
sqlparse==0.4.3 \
    --hash=sha256:0323c0ec29cd52bceabc1b4d9d579e311f3e4961b98d174201d5622a23b85e34 \
    --hash=sha256:69ca804846bb114d2ec380e4360a8a340db83f0ccf3afceeb1404df028f57268
    # via django

Output File

To output the pinned requirements to a filename other than requirements.txt, use --output-file. This might be useful for compiling multiple files with different constraints on django, for example to test a library against both versions using tox:

$ pip-compile --upgrade-package 'django<1.0' --output-file requirements-django0x.txt
$ pip-compile --upgrade-package 'django<2.0' --output-file requirements-django1x.txt

Or to output to standard output, use --output-file=-:

$ pip-compile --output-file=- > requirements.txt
$ pip-compile - --output-file=- < requirements.in > requirements.txt

Forwarding options to pip

Any valid pip flags or arguments may be passed on with pip-compile's --pip-args option, e.g.

$ pip-compile requirements.in --pip-args "--retries 10 --timeout 30"

Configuration

You can define project-level defaults for pip-compile and pip-sync by writing them to a configuration file in the same directory as your requirements input files (or the current working directory if piping input from stdin). By default, both pip-compile and pip-sync will look first for a .pip-tools.toml file and then in your pyproject.toml. You can also specify an alternate TOML configuration file with the --config option.
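
For instance, to point at a configuration file outside the default search locations (the path is illustrative):

$ pip-compile --config ci/pip-tools.toml requirements.in
$ pip-sync --config ci/pip-tools.toml requirements.txt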

It is possible to specify configuration values both globally and per command. For example, to generate pip hashes in the resulting requirements file output by default, you can specify in a configuration file:

[tool.pip-tools]
generate-hashes = true

Options to pip-compile and pip-sync that may be used more than once must be defined as lists in a configuration file, even if they only have one value.
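
For example, --extra can be passed more than once on the command line, so in a configuration file it must be written as a list even when it holds a single value (the extra name is illustrative):

[tool.pip-tools]
extra = ["dev"]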

pip-tools supports default values for all valid command-line flags of its subcommands. Configuration keys may contain underscores instead of dashes, so the above could also be specified in this format:

[tool.pip-tools]
generate_hashes = true

Configuration defaults specific to pip-compile and pip-sync can be put beneath separate sections. For example, to perform a dry run by default with pip-compile:

[tool.pip-tools.compile] # "sync" for pip-sync
dry-run = true

This does not affect the pip-sync command, which also has a --dry-run option. Note that local settings take precedence over global ones of the same name whenever both are declared; thus the following would also make pip-compile generate hashes, but discard the global dry-run setting:

[tool.pip-tools]
generate-hashes = true
dry-run = true

[tool.pip-tools.compile]
dry-run = false

You might be wrapping the pip-compile command in another script. To avoid confusing consumers of your custom script, you can override the update command generated at the top of requirements files by setting the CUSTOM_COMPILE_COMMAND environment variable.

$ CUSTOM_COMPILE_COMMAND="./pipcompilewrapper" pip-compile requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    ./pipcompilewrapper
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via -r requirements.in
sqlparse==0.4.3
    # via django

Workflow for layered requirements

If you have different environments that you need to install different but compatible packages for, then you can create layered requirements files and use one layer to constrain the other.

For example, if you have a Django project where you want the newest 2.1 release in production, and you want to use the Django debug toolbar when developing, then you can create two *.in files, one for each layer:

# requirements.in
django<2.2

At the top of the development requirements dev-requirements.in you use -c requirements.txt to constrain the dev requirements to packages already selected for production in requirements.txt.

# dev-requirements.in
-c requirements.txt
django-debug-toolbar<2.2

First, compile requirements.txt as usual:

$ pip-compile
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile
#
django==2.1.15
    # via -r requirements.in
pytz==2023.3
    # via django

Now compile the dev requirements; the requirements.txt file is used as a constraint:

$ pip-compile dev-requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile dev-requirements.in
#
django==2.1.15
    # via
    #   -c requirements.txt
    #   django-debug-toolbar
django-debug-toolbar==2.1
    # via -r dev-requirements.in
pytz==2023.3
    # via
    #   -c requirements.txt
    #   django
sqlparse==0.4.3
    # via django-debug-toolbar

As you can see above, even though a 2.2 release of Django is available, the dev requirements only include a 2.1 version of Django because they were constrained. Now both compiled requirements files can be installed safely in the dev environment.

To install requirements in production, use:

$ pip-sync

To install requirements in development, run:

$ pip-sync requirements.txt dev-requirements.txt

Version control integration

You might use pip-compile as a pre-commit hook. See the pre-commit docs for instructions. Sample .pre-commit-config.yaml:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile

You might want to customize pip-compile args by configuring args and/or files, for example:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile
        files: ^requirements/production\.(in|txt)$
        args: [--index-url=https://example.com, requirements/production.in]

If you have multiple requirement files, make sure you create a hook for each file:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile
        name: pip-compile setup.py
        files: ^(setup\.py|requirements\.txt)$
      - id: pip-compile
        name: pip-compile requirements-dev.in
        args: [requirements-dev.in]
        files: ^requirements-dev\.(in|txt)$
      - id: pip-compile
        name: pip-compile requirements-lint.in
        args: [requirements-lint.in]
        files: ^requirements-lint\.(in|txt)$
      - id: pip-compile
        name: pip-compile requirements.in
        args: [requirements.in]
        files: ^requirements\.(in|txt)$

Example usage for pip-sync

Now that you have a requirements.txt, you can use pip-sync to update your virtual environment to reflect exactly what's in there. This will install/upgrade/uninstall everything necessary to match the requirements.txt contents.

Run it with pip-sync or python -m piptools sync. If you use multiple Python versions, you can also run py -X.Y -m piptools sync on Windows and pythonX.Y -m piptools sync on other systems.

pip-sync must be installed into and run from the same virtual environment as your project to identify which packages to install or upgrade.

Be careful: pip-sync is meant to be used only with a requirements.txt generated by pip-compile.

$ pip-sync
Uninstalling flake8-2.4.1:
    Successfully uninstalled flake8-2.4.1
Collecting click==4.1
    Downloading click-4.1-py2.py3-none-any.whl (62kB)
    100% |................................| 65kB 1.8MB/s
    Found existing installation: click 4.0
    Uninstalling click-4.0:
        Successfully uninstalled click-4.0
Successfully installed click-4.1

To sync multiple *.txt dependency lists, just pass them in via command line arguments, e.g.

$ pip-sync dev-requirements.txt requirements.txt

Passing no arguments causes it to default to requirements.txt.

Any valid pip install flags or arguments may be passed with pip-sync's --pip-args option, e.g.

$ pip-sync requirements.txt --pip-args "--no-cache-dir --no-deps"

Note: pip-sync will not upgrade or uninstall packaging tools like setuptools, pip, or pip-tools itself. Use python -m pip install --upgrade to upgrade those packages.
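
For example:

$ python -m pip install --upgrade pip setuptools pip-tools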

Should I commit requirements.in and requirements.txt to source control?

Generally, yes. If you want a reproducible environment installation available from your source control, then yes, you should commit both requirements.in and requirements.txt to source control.

Note that if you are deploying on multiple Python environments (read the section below), then you must commit a separate output file for each Python environment. We suggest using the {env}-requirements.txt format (e.g. win32-py3.7-requirements.txt, macos-py3.10-requirements.txt, etc.).

Cross-environment usage of requirements.in/requirements.txt and pip-compile

The dependencies of a package can change depending on the Python environment in which it is installed. Here, we define a Python environment as the combination of Operating System, Python version (3.7, 3.8, etc.), and Python implementation (CPython, PyPy, etc.). For an exact definition, refer to the possible combinations of PEP 508 environment markers.

As the resulting requirements.txt can differ between environments, users must execute pip-compile on each Python environment separately to generate a requirements.txt valid for each such environment. The same requirements.in can be used as the source file for all environments, using PEP 508 environment markers as needed, the same way it would be done for regular pip cross-environment usage.
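
For example, a single requirements.in can restrict dependencies to particular environments with markers (the packages are illustrative):

# requirements.in
pywin32; sys_platform == "win32"
dataclasses; python_version < "3.7"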

If the generated requirements.txt remains exactly the same for all Python environments, then it can be used across Python environments safely. But users should be careful, as any package update can introduce environment-dependent dependencies, making any newly generated requirements.txt environment-dependent too. As a general rule, users should still execute pip-compile on each targeted Python environment to avoid issues.

Maximizing reproducibility

pip-tools is a great tool to improve the reproducibility of builds. But there are a few things to keep in mind.

  • pip-compile will produce different results in different environments as described in the previous section.
  • pip must be used with the PIP_CONSTRAINT environment variable to lock dependencies in build environments, as documented in #8439 (see the sketch after the lock file output below).
  • Dependencies come from many sources.

Continuing the pyproject.toml example from earlier, creating a single lock file could be done like this:

$ pip-compile --all-build-deps --all-extras --output-file=constraints.txt --strip-extras pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.9
# by the following command:
#
#    pip-compile --all-build-deps --all-extras --output-file=constraints.txt --strip-extras pyproject.toml
#
asgiref==3.5.2
    # via django
attrs==22.1.0
    # via pytest
backports-zoneinfo==0.2.1
    # via django
django==4.1
    # via my-cool-django-app (pyproject.toml)
editables==0.3
    # via hatchling
hatchling==1.11.1
    # via my-cool-django-app (pyproject.toml::build-system.requires)
iniconfig==1.1.1
    # via pytest
packaging==21.3
    # via
    #   hatchling
    #   pytest
pathspec==0.10.2
    # via hatchling
pluggy==1.0.0
    # via
    #   hatchling
    #   pytest
py==1.11.0
    # via pytest
pyparsing==3.0.9
    # via packaging
pytest==7.1.2
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.2
    # via django
tomli==2.0.1
    # via
    #   hatchling
    #   pytest

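With the constraints file generated above, you can make pip honor those pins even inside build environments via the PIP_CONSTRAINT environment variable mentioned earlier (a minimal sketch):

$ PIP_CONSTRAINT=constraints.txt python -m pip install .
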
Some build backends may also request build dependencies dynamically using the get_requires_for_build_ hooks described in PEP 517 and PEP 660. This will be indicated in the output with one of the following suffixes:

  • (pyproject.toml::build-system.backend::editable)
  • (pyproject.toml::build-system.backend::sdist)
  • (pyproject.toml::build-system.backend::wheel)

Other useful tools

Deprecations

This section lists pip-tools features that are currently deprecated.

  • In the next major release, the --allow-unsafe behavior will be enabled by default (#989). Use --no-allow-unsafe to keep the old behavior. It is recommended to pass --allow-unsafe now to adapt to the upcoming change.
  • The legacy resolver is deprecated and will be removed in future versions. The new default is --resolver=backtracking.
  • In the next major release, the --strip-extras behavior will be enabled by default (#1613). Use --no-strip-extras to keep the old behavior.

A Note on Resolvers

You can choose between the default backtracking resolver and the deprecated legacy resolver.

The legacy resolver will occasionally fail to resolve dependencies. The backtracking resolver is more robust, but can take longer to run in general.

You can continue using the legacy resolver with --resolver=legacy although note that it is deprecated and will be removed in a future release.
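
For example, to select a resolver explicitly:

$ pip-compile --resolver=backtracking requirements.in
$ pip-compile --resolver=legacy requirements.in    # deprecated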

pip-tools's People

Contributors

andydecleyre, atugushev, auvipy, blueyed, brutasse, chludwig-haufe, chrysle, davidmreed, davidovich, di, florentjeannot, graingert, hauntsaninja, hramezani, hugovk, hynek, jdufresne, jezdez, l1storez, mktums, nicoa, nvie, pre-commit-ci[bot], q0w, richafrank, ssbarnea, svetlyak40wt, techalchemy, vphilippon, webknjaz

pip-tools's Issues

[future] Error parsing spec "Unidecode>=0.04.12,<0.05"

Tried putting django-oscar in requirements.in and running pip-compile:

Traceback (most recent call last):
  File "/Users/prophet/Envs/lava/lib/python2.7/site-packages/ipdb/__main__.py", line 138, in main
    pdb._runscript(mainpyfile)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/pdb.py", line 1233, in _runscript
    self.run(statement)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/bdb.py", line 387, in run
    exec cmd in globals, locals
  File "<string>", line 1, in <module>
  File "/Users/prophet/Envs/lava/bin/pip-compile", line 3, in <module>
    __requires__ = 'pip-tools==1.0'
  File "/Users/prophet/Envs/lava/src/pip-tools/bin/pip-compile", line 141, in <module>
    main()
  File "/Users/prophet/Envs/lava/src/pip-tools/bin/pip-compile", line 132, in main
    compile_specs(src_files, dry_run=args.dry_run)
  File "/Users/prophet/Envs/lava/src/pip-tools/bin/pip-compile", line 90, in compile_specs
    pinned_spec_set = resolver.resolve()
  File "/Users/prophet/Envs/lava/src/pip-tools/piptools/resolver.py", line 44, in resolve
    if not self.resolve_one_round():
  File "/Users/prophet/Envs/lava/src/pip-tools/piptools/resolver.py", line 28, in resolve_one_round
    new_deps = self.find_new_dependencies()
  File "/Users/prophet/Envs/lava/src/pip-tools/piptools/resolver.py", line 91, in find_new_dependencies
    all_deps = self.find_all_dependencies()
  File "/Users/prophet/Envs/lava/src/pip-tools/piptools/resolver.py", line 72, in find_all_dependencies
    for spec in spec_set.normalize():
  File "/Users/prophet/Envs/lava/src/pip-tools/piptools/datastructures.py", line 386, in normalize
    new_spec_set.add_spec(self.normalize_specs_for_name(name))
  File "/Users/prophet/Envs/lava/src/pip-tools/piptools/datastructures.py", line 336, in normalize_specs_for_name
    if ops['<='](less_than, greater_than):
  File "/Users/prophet/Envs/lava/src/pip-tools/piptools/datastructures.py", line 15, in _normalized
    nv1 = NormalizedVersion(v1)
  File "/Users/prophet/Envs/lava/src/pip-tools/piptools/version.py", line 99, in __init__
    self._parse(s, error_on_huge_major_num)
  File "/Users/prophet/Envs/lava/src/pip-tools/piptools/version.py", line 116, in _parse
    block = self._parse_numdots(groups['version'], s, 2)
  File "/Users/prophet/Envs/lava/src/pip-tools/piptools/version.py", line 166, in _parse_numdots
    "version number segment: '%s' in %r" % (n, full_ver_str))
IrrationalVersionError: cannot have leading zero in version number segment: '05' in '0.05'

Installing pip-compile

Seems that the future branch hasn't been merged in, so pip-compile is still very much a WIP and not installable unless you specify a branch and install from git directly. Is that intended?

Alias pip and add --ignore flag

I don't know if this fits in the scope of this project, but I think it would certainly be a nice enhancement: making it possible to do something like this:

 pip install requests --ignore

And that would automatically add requests to .pipignore. What do you think?

pip-review should compare versions, not test equality

$ pip-review
pelican==3.0.1 is available (you have 3.1)

I'm locally testing this package, and pip-review just tests whether the currently installed version is the same as the latest version available via pip, which causes the problem shown above.

Support Python 3

Things to be done:

  • Make the bin/pip-* files Python 3 compatible
  • Add ,py31,py32,py33 to Tox's envlist
  • Add Python 3.1, 3.2, 3.3 to .travis.yml
  • Update Trove classifiers in setup.py

pip-review in pip-tools 0.3.1 can't deal with pytz version numbers

pip-review in pip-tools 0.3.1 reports:

Cannot work with pytz==2013b because version number can't be normalized.

If I downgrade to pytz==2012j and then run pip-review, it indeed does not indicate that pytz==2013b is available.

The previous version of pip-tools, 0.2.1 as I recall, did not have this problem.

Add support for dependencies between requirement files

In regular requirements files you can include other requirement files by specifying them like this:

dev-requirements.txt:

-r requirements.txt

I suggest that in-files should support this with the following syntax:

dev-requirements.in:

-r requirements.in

This could either result in the requirements.txt shown above, or just include all the packages from it directly. If the latter approach is used, an additional benefit would be that pip-sync can be implemented as the following bash script:

pip freeze | grep -v -f $1 - | xargs pip uninstall -y
pip install -r $1

Add -i to pip-dump

The -i option (interactive mode) could, for each new/undeclared package, ask which requirements file it should go in.

Doesn't work in Windows

While I don't mind using Linux, unfortunately it's not an option at work. I'm using PortablePython-2.7.5. pip works but pip-review does not. Note that the ../Scripts folder has pip.exe, which is part of the issue (versus pip-review with no .exe), but when I added the .exe to it I just got some weird message (can't copy-paste it; it says the NTVDM CPU has encountered an illegal instruction).

pip-dump: add --dry-run option to pip-dump / dry-run by default

I have just tried pip-dump, and it did not work very well (report might follow, probably caused mainly by vcs-style/editable installs).

I would like to have a --dry-run option, which would not rewrite / add to the existing files, but display what it would output.

Maybe "dry-run" should even be the default, and a switch like "-f" would be required to write the files?!

Support private PyPI servers

We have a private PyPI server for non-public packages and it would be great if pip-tools would support that.

It’s basically about:

[install]
index-url = http://pypi.private.invalid/
extra-index-url = http://pypi.python.org/simple/

in ~/.pip/pip.conf

Pretty please? :)

handle archives with pip-dump

Hi,

for some special cases, I need to install an archive.
Here pip install PyChart==1.39 returns an error, so until the PyPI repo is fixed, the solution is to install with
pip install http://download.gna.org/pychart/PyChart-1.39.tar.gz

unfortunately, pip-dump won't understand it, and will remove the tar.gz file from the requirements and replace it with the pip package, which I don't want.

Can it be fixed?

Thanks for your hard work, very handy tool

Idea: have pip-review stream the output

Currently pip-review doesn't show you the results until it completes everything. Verbose mode is not that helpful.

It would be nice to have it output right away so you can start checking asap why your packages are outdated :)

Bad output

I'm getting this:
(reports) C:\dev\reports>pip-review
)jango==1.6 is available (you have 1.2.1
)istribute==0.7.3 is available (you have 0.7.3
)ip-tools==0.3.4 is available (you have 0.3.4
)sycopg2==2.5.1 is available (you have 2.5.1
)ywin32==214 is available (you have 218
)eportlab==2.7 is available (you have 2.7
)irtualenv==1.10.1 is available (you have 1.10.1
)lwt==0.7.5 is available (you have 0.7.2

  • Something is going wrong with character output.
  • It seems to think all of these are old versions, even though most are not

pip 1.5 support

Private APIs probably got moved. The future branch doesn't work now.

Traceback (most recent call last):
  File "/usr/local/bin/pip-compile", line 10, in <module>
    execfile(__file__)
  File "/Users/prophet/projects/djangodash2013/gopython3/src/pip-tools/bin/pip-compile", line 13, in <module>
    from piptools.package_manager import PackageManager
  File "/Users/prophet/projects/djangodash2013/gopython3/src/pip-tools/piptools/package_manager.py", line 24, in <module>
    from pip.download import _download_url, _get_response_from_url
ImportError: cannot import name _get_response_from_url

Support tar.gz packages

If you are using a gevent beta, you will have http://gevent.googlecode.com/files/gevent-1.0b4.tar.gz in your requirements file.

Currently pip-dump will remove it.

Indentation error in pip-review

Hi,

I installed pip-tools via pip from the repository, but I am getting an unexpected indent error at line 195 in bin/pip-review. I fixed the error locally, but you should fix the code in the repository.

pip-dump error message if no requirements.txt file

If there's no requirements.txt file (but there are $VAR-requirements.txt files) in the directory, pip-dump appears to output an error message:

$ pip-dump
cat: requirements.txt: No such file or directory

Otherwise, it seems to behave correctly.

py26 compatibility

I've been playing a bit with pip-compile from the future branch, with py26 (it says in the setup.py that it's compatible: https://github.com/nvie/pip-tools/blob/future/setup.py#L43).

However, there are a few things that make it not compatible (change to use set()):

There may be other issues; I haven't gone further than that.

pip-compile produces case-variant duplicates

requirements.in

Django==1.6.1
django-jinja

pip-compile produces the following requirements.txt:

Django==1.6.1
django-jinja==0.23.1
django==1.6.1
jinja2==2.7.2
markupsafe==0.18

which pip install chokes on:

Double requirement given: django==1.6.1 (from -r requirements.txt (line 3)) (already in Django==1.6.1 (from -r requirements.txt (line 1)), name='django')

Perhaps django-jinja is expressing its dependencies incorrectly, but since pip works with it, pip-compile probably should as well.

Support "raw" output

Besides human-readable output, output in the following form is useful, too:

$ pip-review --raw
gunicorn==0.14.6
pymongo==2.3
python-dateutil==2.1
raven==2.0.6
redis==2.6.2
requests==0.14.0
simplejson==2.6.2
times==0.5
No update information found for vim-bridge           <— goes to stderr

So that it can be fed to pip install directly:

$ pip install $(pip-review --raw)
No update information found for vim-bridge         <— echo of stderr, does not mess up args to pip

Dependency handling in requirements when updating packages

So, here is the promised brain dump, sorry for the length.

Right now naively updating requirements can lead to dependency conflicts. For instance, let's say I want to add raven to my project but pinned to a specific version:

$ pip install raven==1.9.4
…
Successfully installed raven simplejson

So raven needs simplejson. Now I run pip freeze and get in my requirements.txt:

raven==1.9.4
simplejson==2.4.0

Some time later I run pip-review and get (this is not what you'd get right now):

raven==2.0.2 is available (you have 1.9.4)
simplejson==2.6.2 is available (you have 2.4.0)

Note that the newer simplejson was already available when I initially installed raven, but raven needed simplejson>=2.3.0,<2.5.0. Raven 2.0.2 does as well, but this still encourages me to upgrade simplejson when I shouldn't.

The current version of raven dropped the >=2.3.0,<2.5.0 part so now we can get the latest and greatest raven and simplejson safely.

My point is that when updating dependencies, checking for conflicts is very hard to do by hand. This needs to be automated with a tool that yells at the developer when an update leads to a version conflict.

Ruby gets this right with Bundler. Run gem install bundler, then create a Gemfile with the following content:

source :rubygems
gem 'compass-less-plugin'

And run bundle install. This installs the required package and its dependencies and creates a Gemfile.lock file:

GEM
  remote: http://rubygems.org/
  specs:
    chunky_png (1.2.6)
    compass (0.12.2)
      chunky_png (~> 1.2)
      fssm (>= 0.2.7)
      sass (~> 3.1)
    compass-less-plugin (1.0)
      compass (>= 0.10)
    fssm (0.2.9)
    sass (3.2.1)

PLATFORMS
  ruby

DEPENDENCIES
  compass-less-plugin

Gemfile.lock is like requirements.txt with pinned versions (not everything is pinned here but should probably be): when creating a new environment and running bundle install, bundler looks at the .lock file to install what's specified.

Then there are a bunch of commands that bundle provides. For instance, to list available updates (running this on a bundle created months ago):

$ bundle outdated
Fetching gem metadata from http://rubygems.org/.....

Outdated gems included in the bundle:
  * chunky_png (1.2.6 > 1.2.5)
  * fssm (0.2.9 > 0.2.8.1)
  * sass (3.2.1 > 3.1.12)
  * compass (0.12.2 > 0.11.7)

Updating compass-less-plugin and its dependencies can be done in one command (bundle update compass-less-plugin) and does so while checking for version conflicts.

Sorry if you're already familiar with all this. Now I'll try to explain how we can improve requirements.txt by using this approach.

First, instead of putting all the requirements in requirements.txt, people would only list first-level deps, pinned. So for raven:

raven==1.9.4

Then some tool provided by pip-tools compiles this into the full requirements list, into another file (like Gemfile and Gemfile.lock but with less noise):

raven==1.9.4
simplejson==2.4.0

The key point is that this tool builds the whole dependency tree for all the top-level requirements and dumps it as a safely-installable-with-no-conflicts requirements file, which pip can just use.

So next time raven is updated and doesn't require an old simplejson, the tool can update the simplejson requirement. When raven drops simplejson to use python's built-in json implementation, the 2nd-level requirement can be dropped as well, automatically.

Other use case: requests, which used to have dependencies on oauthlib, certifi, chardet and doesn't anymore (and oauthlib needed rsa or pyasn1 or whatever). If I just need requests, I'll list it in my top-level requirements and the tool will pin or drop the dependencies if they're not needed when I upgrade requests itself.

And finally, this tool could prevent me from installing package X and Y which need Z<1.0 and Z>1.1.

That's the theory and I think pip already does some version conflict checks but that's not enough to guarantee safe updates. Now in practice, I think the dependency information is not provided by the PyPI API and requires the whole package to be fetched to actually extract it (or maybe create.io provides that info). So that's annoying but doable, and pip-tools seems like a nice place to experiment with such things.

I think buildout does check for dependency conflicts but I never managed to wrap my head around it.

What do you think? I'm happy to start a proof-of-concept that could be integrated in this project.

Requirements file inheriting another file gets checked twice by pip-dump

Found by accident when looking into #31. If you have multiple requirements files that inherit from a base file, the base file gets checked twice. In my case it was a base requirements.txt and a requirements_dev.txt that inherited the first file with -r requirements.txt. The output (related to #31) was:

Requirement file contains django==1.5, but that package is not installed
Requirement file contains fabric, but that package is not installed
Requirement file contains Psycopg2, but that package is not installed
Requirement file contains Psycopg2, but that package is not installed
Requirement file contains django==1.5, but that package is not installed
Requirement file contains fabric, but that package is not installed

Notice there are 3 packages, each listed twice.

Find links causes a ValueError

It's possible this is also related to turning on wheel-related settings; here's the traceback:

Traceback (most recent call last):
  File "/Users/george/.venvs/test/bin/pip-review", line 270, in <module>
    main()
  File "/Users/george/.venvs/test/bin/pip-review", line 230, in main
    installed = list(get_installed_pkgs(local=args.local))
  File "/Users/george/.venvs/test/bin/pip-review", line 148, in get_installed_pkgs
    name, version = line.split('==')
ValueError: need more than 1 value to unpack

I printed out line just before the error is thrown and get:

-f /Users/george/.pip/wheels

Relevant pip.conf settings:

wheel-dir = /Users/george/.pip/wheels
find-links = /Users/george/.pip/wheels

[install]
use-wheel = true

pip-compile unable to find PIL distribution

On a simple requirements.in like this

PIL

it fails with this traceback:

Traceback (most recent call last):
  File "/Users/art/tmp/env/bin/pip-compile", line 8, in <module>
    execfile(__file__)
  File "/Users/art/tmp/env/src/pip-tools/bin/pip-compile", line 154, in <module>
    main()
  File "/Users/art/tmp/env/src/pip-tools/bin/pip-compile", line 145, in main
    compile_specs(src_files, dry_run=args.dry_run)
  File "/Users/art/tmp/env/src/pip-tools/bin/pip-compile", line 103, in compile_specs
    pinned_spec_set = resolver.resolve()
  File "/Users/art/tmp/env/src/pip-tools/piptools/resolver.py", line 44, in resolve
    if not self.resolve_one_round():
  File "/Users/art/tmp/env/src/pip-tools/piptools/resolver.py", line 28, in resolve_one_round
    new_deps = self.find_new_dependencies()
  File "/Users/art/tmp/env/src/pip-tools/piptools/resolver.py", line 91, in find_new_dependencies
    all_deps = self.find_all_dependencies()
  File "/Users/art/tmp/env/src/pip-tools/piptools/resolver.py", line 73, in find_all_dependencies
    version = pkgmgr.find_best_match(spec)
  File "/Users/art/tmp/env/src/pip-tools/piptools/package_manager.py", line 288, in find_best_match
    version, source = _find_cached_match(spec)
  File "/Users/art/tmp/env/src/pip-tools/piptools/package_manager.py", line 264, in _find_cached_match
    link = finder.find_requirement(requirement, False)
  File "/Users/art/tmp/env/lib/python2.7/site-packages/pip/index.py", line 267, in find_requirement
    raise DistributionNotFound('No distributions at all found for %s' % req)
pip.exceptions.DistributionNotFound: No distributions at all found for PIL

I investigated the problem, and found that it is because pip-compile doesn't pass the --allow-external flag to pip when searching for dependencies.

Add support for new version of setuptools >=0.8

There is a new release of setuptools that merges the distribute changes:

https://pypi.python.org/pypi/setuptools/0.8#upgrading-from-distribute

It breaks some behaviour when updating in pip-tools, generating some UserWarnings and a failing egg_info command. An example:

django-tastypie==0.9.15 is available (you have 0.9.14)
Upgrade now? [Y]es, [N]o, [A]ll, [Q]uit y
Downloading/unpacking django-tastypie==0.9.15
  Downloading django-tastypie-0.9.15.tar.gz (722kB): 722kB downloaded
  Running setup.py egg_info for package django-tastypie
    /usr/local/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'zip_safe'
      warnings.warn(msg)
    /usr/local/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'install_requires'
      warnings.warn(msg)
    /usr/local/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'tests_require'
      warnings.warn(msg)
    usage: -c [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...]
       or: -c --help [cmd1 cmd2 ...]
       or: -c --help-commands
       or: -c cmd --help

    error: invalid command 'egg_info'
    Complete output from command python setup.py egg_info:
    /usr/local/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'zip_safe'

  warnings.warn(msg)

/usr/local/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'install_requires'

  warnings.warn(msg)

/usr/local/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'tests_require'

  warnings.warn(msg)

usage: -c [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...]

   or: -c --help [cmd1 cmd2 ...]

   or: -c --help-commands

   or: -c cmd --help



error: invalid command 'egg_info'

----------------------------------------
Command python setup.py egg_info failed with error code 1 in /tmp/pip-build/django-tastypie
Storing complete log in /root/.pip/pip.log

VCS Support

I pin many of my packages from git repositories, like so: -e git+git://github.com/jezdez/django_compressor.git@1a7f1ff7cbd2605dc5697a9f2df649d97ff3a669#egg=django_compressor.

It would be great if pip-review would check for an updated commit, but I am not sure if it would be possible to implement this. Or, if the VCS-backed requirement points to a version behind the version on PyPI, then it would suggest an upgrade to the PyPI version.

Also, if -e git+git://github.com/jezdez/django_compressor.git#egg=django_compressor is in my requirements, then pip-dump should add the installed commit hash. However, I am not sure if it is possible to know which commit is installed with a package. And this is only looking at git-based VCS packages; other backends, such as svn or hg, may require different support.

Error when running pip-review on base Python

Today I tried to run pip-review on a production server, a CentOS box with system-wide Python 2.6 (no virtualenv), but this is the output I get every time I run it:

Package "s3cmd" has wrong version. It was transformed from 1.1.0-beta3 into 1.1.0b3 for interoperability.
Traceback (most recent call last):
 File "/usr/bin/pip-review", line 217, in <module>
   main()
 File "/usr/bin/pip-review", line 178, in main
   installed = list(get_installed_pkgs(local=args.local))
 File "/usr/bin/pip-review", line 109, in get_installed_pkgs
   name = line.split('#egg=', 1)[1]
IndexError: list index out of range

pip-compile fails when using -f in requirements.in

Traceback (most recent call last):
  File "/home/graingert/.virtualenvs/test/bin/pip-compile", line 150, in <module>
    main()
  File "/home/graingert/.virtualenvs/test/bin/pip-compile", line 141, in main
    compile_specs(src_files, dry_run=args.dry_run)
  File "/home/graingert/.virtualenvs/test/bin/pip-compile", line 81, in compile_specs
    top_level_specs = list(collect_source_specs(source_files))
  File "/home/graingert/.virtualenvs/test/bin/pip-compile", line 75, in collect_source_specs
    for spec in walk_specfile(filename):
  File "/home/graingert/.virtualenvs/test/bin/pip-compile", line 66, in walk_specfile
    spec = Spec.from_line(line, source='{0}:{1}'.format(filename, lineno))
  File "/home/graingert/.virtualenvs/test/local/lib/python2.7/site-packages/piptools/datastructures.py", line 67, in from_line
    req = Requirement.parse(line)
  File "/home/graingert/.virtualenvs/test/local/lib/python2.7/site-packages/pkg_resources.py", line 2914, in parse
    reqs = list(parse_requirements(s))
  File "/home/graingert/.virtualenvs/test/local/lib/python2.7/site-packages/pkg_resources.py", line 2839, in parse_requirements
    line, p, specs = scan_list(VERSION,LINE_END,line,p,(1,2),"version spec")
  File "/home/graingert/.virtualenvs/test/local/lib/python2.7/site-packages/pkg_resources.py", line 2807, in scan_list
    raise ValueError("Expected "+item_name+" in",line,"at",line[p:])
ValueError: ('Expected version spec in', '-f http://example.com/pypi/', 'at', ' http://example.com/pypi/')

Ordering issue

Some packages seem to be ordered incorrectly (perhaps by date?)

$ pip-review                                                                                  
Django==1.3.3 is available (you have 1.4)     

pip-dump returns non-zero exit status 2

I have my requirements in a ./requirements directory, with base.txt and development.txt (using -r base.txt at the top).

Using pip-dump in my virtualenv, I get the following trace:

(verde)peter@Blizzardme:~/PycharmProjects/Verde$ pip-dump
Requirement file contains argparse, but that package is not installed
Exception:
Traceback (most recent call last):
  File "/home/peter/.virtualenvs/verde/local/lib/python2.7/site-packages/pip-1.3.1-py2.7.egg/pip/basecommand.py", line 139, in main
    status = self.run(options, args)
  File "/home/peter/.virtualenvs/verde/local/lib/python2.7/site-packages/pip-1.3.1-py2.7.egg/pip/commands/freeze.py", line 99, in run
    line_req = InstallRequirement.from_line(line)
  File "/home/peter/.virtualenvs/verde/local/lib/python2.7/site-packages/pip-1.3.1-py2.7.egg/pip/req.py", line 118, in from_line
    return cls(req, comes_from, url=url)
  File "/home/peter/.virtualenvs/verde/local/lib/python2.7/site-packages/pip-1.3.1-py2.7.egg/pip/req.py", line 43, in __init__
    req = pkg_resources.Requirement.parse(req)
  File "/home/peter/.virtualenvs/verde/local/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/pkg_resources.py", line 2510, in parse
    reqs = list(parse_requirements(s))
  File "/home/peter/.virtualenvs/verde/local/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/pkg_resources.py", line 2436, in parse_requirements
    line, p, specs = scan_list(VERSION,LINE_END,line,p,(1,2),"version spec")
  File "/home/peter/.virtualenvs/verde/local/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/pkg_resources.py", line 2414, in scan_list
    "Expected ',' or end-of-list in",line,"at",line[p:]
ValueError: ("Expected ',' or end-of-list in", 'wsgiref==0.1.2-r base.txt-r base.txt', 'at', ' base.txt-r base.txt')

Storing complete log in /home/peter/.pip/pip.log
Traceback (most recent call last):
  File "/home/peter/.virtualenvs/verde/bin/pip-dump", line 159, in <module>
    main()
  File "/home/peter/.virtualenvs/verde/bin/pip-dump", line 155, in main
    dump_requirements(args.files)
  File "/home/peter/.virtualenvs/verde/bin/pip-dump", line 118, in dump_requirements
    _, new = pip_info(tmpfile)
  File "/home/peter/.virtualenvs/verde/bin/pip-dump", line 91, in pip_info
    raw = check_output(cmd)
  File "/usr/lib/python2.7/subprocess.py", line 575, in check_output
    raise CalledProcessError(retcode, cmd, output=output)
subprocess.CalledProcessError: Command 'pip freeze -lr /tmp/tmpguafnJ' returned non-zero exit status 2

pip-review outputs to stderr

pip-review outputs its "Everything up-to-date" message to stderr. This output should go to stdout, and reports of packages with updates should go to either stderr or stdout (depending on whether or not you consider that an error.)

This issue is important because it can cause problems when running pip-review from cron scripts, and because it's non-standard unix/linux behavior.

The workaround (to run pip-review from cron and silence output when there are no updates) is the following:

pip-review 2>&1 | grep -v "^Everything up-to-date$"

Add --user switch

I generally use pip install --user to install packages to my own home directory on Mac OS X.

pip-review needs to either accept a --user switch itself, or better respect where the package it is upgrading is installed with regard to pip's --user switch.

pip-dump forces dependencies to be sorted alphabetically

It would be nice to let developers leave some comments in requirements.txt or rearrange packages in groups by their category.

Might be tricky to handle, but helpful for large applications that don't split dependencies into multiple files (they're all required at once).

pip-dump is case sensitive

Not sure if it's intentional, but when pip-dump checks requirements files, it's behaving in case-sensitive fashion. This is probably relevant only to environments where requirements files were created manually and you are running pip-dump for the first time. Example:

django==1.5 in requirements.txt gives me Requirement file contains django==1.5, but that package is not installed. If I change it to Django==1.5 (with capital D), it behaves as expected.

bash: pip-review: command not found

After installing pip-tools with pip install pip-tools, I can't access the pip-tools commands.

zak$ pip-review
bash: pip-review: command not found

I've double checked with pip that pip-tools is installed:

zak$ pip list
distribute (0.6.40)
git-remote-helpers (0.1.0)
pip-tools (0.3.4)
virtualenv-clone (0.2.4)
wsgiref (0.1.2)

Why can't I use pip-tools commands from the terminal?
