clarete / curdling
Concurrent package manager for Python
Home Page: http://clarete.li/curdling
License: GNU General Public License v3.0
It's good practice to pin a commit in requirements.txt when using a VCS version of a dependency. This is the format used by pip (there's another issue for `-e`):
curd install git+git://github.com/espeed/bulbs.git@1e1739199bdd84b8a992af597cbb6b7782dd6e2e#egg=bulbs-dev
Retrieving: [ ]
Some milk was spilled in the process:
git+git://github.com/espeed/bulbs.git@1e1739199bdd84b8a992af597cbb6b7782dd6e2e#egg=bulbs-dev
* git+git://github.com/espeed/bulbs.git@1e1739199bdd84b8a992af597cbb6b7782dd6e2e#egg=bulbs-dev (explicit requirement): Exception:
fatal: remote error:
espeed/bulbs.git@1e1739199bdd84b8a992af597cbb6b7782dd6e2e#egg=bulbs-dev is not a valid repository name
Email [email protected] for help
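For reference, the pinned-VCS format splits into a VCS scheme, a repository URL, a revision, and an egg name. A minimal sketch of pulling those apart (this helper is illustrative, not curdling's API):

```python
def parse_vcs_requirement(req):
    """Split a pip-style VCS requirement into its parts (sketch only)."""
    vcs, _, rest = req.partition('+')        # 'git' / 'git://host/repo...'
    url, _, fragment = rest.partition('#')   # strip the '#egg=name' part
    egg = fragment[4:] if fragment.startswith('egg=') else None
    repo, revision = url, None
    head, sep, tail = url.rpartition('@')
    if sep and '/' not in tail:              # '@' in user@host keeps a '/' after it
        repo, revision = head, tail
    return {'vcs': vcs, 'repo': repo, 'revision': revision, 'egg': egg}
```

Applied to the bulbs requirement above, this yields the repo `git://github.com/espeed/bulbs.git`, the pinned revision, and the egg `bulbs-dev` — which is what the downloader would need to clone and check out correctly.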
We run (and I imagine many companies run) a local PyPI server with two sides to it: a cache for the actual PyPI, and a local repository for our own internal packages. We use these with pip via the `--index-url` and `--extra-index-url` parameters.
Curdling supports the short form `-i`, but it should understand the long form `--index-url` as well, and should also understand `--extra-index-url`.
Less importantly, Curdling is persnickety about the parameter order:
curd -i $PIP_INDEX_URL install -r requirements.txt
usage: curd [-h] [-l {WARN,ERROR,DEBUG,INFO,WARNING,CRITICAL,NOTSET}]
[--log-file LOG_FILE] [--log-name LOG_NAME] [-q] [-v]
{install,uninstall} ...
curd: error: invalid choice: 'http://pypi.mycompany.com/local/' (choose from u'install', u'uninstall')
It needs the `-i` parameter to come after `install`, or else it gets confused.
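One way to make the flag position-independent (a sketch with argparse, not curdling's actual CLI code) is to register it on both the top-level parser and the subcommand parser; `SUPPRESS` stops the subparser's default from clobbering a value given before the subcommand:

```python
import argparse

parser = argparse.ArgumentParser(prog='curd')
parser.add_argument('-i', '--index-url', dest='index_url')

subparsers = parser.add_subparsers(dest='command')
install = subparsers.add_parser('install')
# Registering the option here lets it appear after the subcommand too;
# SUPPRESS keeps the subparser from overwriting an already-parsed value
install.add_argument('-i', '--index-url', dest='index_url',
                     default=argparse.SUPPRESS)
install.add_argument('-r', '--requirements')

# Both orderings now parse to the same namespace
before = parser.parse_args(['-i', 'http://pypi.local/', 'install', '-r', 'reqs.txt'])
after = parser.parse_args(['install', '-i', 'http://pypi.local/', '-r', 'reqs.txt'])
```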
It's definitely a good thing that distlib provides a simple (maybe too simple?) API for locating packages on remote indexes; Curdling uses and extends that feature by implementing a `CurdlingLocator` to play well with distlib. What I didn't know is that the ironically named `SimpleScrapingLocator` class doesn't have a simple implementation at all. There's too much indirection and a handful of side effects in the process of scraping a PyPI index.
After debugging curdling for a while, I noticed that running the distlib locator inside threads doesn't seem to work. I think I found the problem: we should override the `_get_project` method in our `SimpleLocator` (let's also change that name, since it won't be simple anymore) and work around this bug:
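A sketch of the proposed override: serialize `_get_project` behind a lock so concurrent worker threads don't trip over the scraper's shared state. The base class here is a stand-in for distlib's `SimpleScrapingLocator` (kept local so the sketch runs on its own); the subclass name is illustrative.

```python
import threading


class SimpleScrapingLocator(object):
    """Stand-in for distlib.locators.SimpleScrapingLocator, whose
    _get_project() is not safe to enter from several threads at once."""
    def _get_project(self, name):
        return {'name': name}


class ThreadSafeLocator(SimpleScrapingLocator):
    """Serialize _get_project() calls so concurrent workers don't
    corrupt the scraper's shared state (the bug described above)."""
    def __init__(self):
        super(ThreadSafeLocator, self).__init__()
        self._project_lock = threading.Lock()

    def _get_project(self, name):
        with self._project_lock:
            return super(ThreadSafeLocator, self)._get_project(name)
```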
(6a38805cda91be32)starenka ~ % curd -lDEBUG install https://www.djangoproject.com/download/1.6c1/tarball/ [72% 05:23:57]
2013-11-03 06:20:26,944:curdling.services.base:DEBUG:finder.start()
2013-11-03 06:20:26,948:curdling.services.base:DEBUG:downloader.start()
2013-11-03 06:20:26,949:curdling.services.base:DEBUG:curdler.start()
2013-11-03 06:20:26,952:curdling.services.base:DEBUG:dependencer.start()
2013-11-03 06:20:26,953:curdling.services.base:DEBUG:downloader.queue(from="main", data="{u'url': 'https://www.djangoproject.com/download/1.6c1/tarball/', 'requirement': 'https://www.djangoproject.com/download/1.6c1/tarball/'}")
2013-11-03 06:20:26,954:curdling.services.base:DEBUG:downloader[Thread-11].run(data="{u'url': 'https://www.djangoproject.com/download/1.6c1/tarball/', 'requirement': 'https://www.djangoproject.com/download/1.6c1/tarball/'}")
Retrieving: [ ] 0% (1 requested, 0 retrieved, 0 processed)2013-11-03 06:20:26,955:urllib3.connectionpool:INFO:Starting new HTTPS connection (1): www.djangoproject.com
Retrieving: [ ] 0% (1 requested, 0 retrieved, 0 processed)2013-11-03 06:20:27,550:urllib3.connectionpool:DEBUG:"GET /download/1.6c1/tarball/ HTTP/1.1" 301 0
2013-11-03 06:20:27,551:urllib3.poolmanager:INFO:Redirecting https://www.djangoproject.com/download/1.6c1/tarball/ -> https://www.djangoproject.com/m/releases/1.6/Django-1.6c1.tar.gz
2013-11-03 06:20:27,551:urllib3.connectionpool:INFO:Starting new HTTPS connection (2): www.djangoproject.com
Retrieving: [ ] 0% (1 requested, 0 retrieved, 0 processed)2013-11-03 06:20:28,295:urllib3.connectionpool:DEBUG:"GET /m/releases/1.6/Django-1.6c1.tar.gz HTTP/1.1" 200 6501419
Retrieving: [ ] 0% (1 requested, 0 retrieved, 0 processed)2013-11-03 06:20:36,664:curdling.services.base:ERROR:downloader[Thread-11].run(from="main", data="{u'url': 'https://www.djangoproject.com/download/1.6c1/tarball/', 'requirement': 'https://www.djangoproject.com/download/1.6c1/tarball/'}") failed:
/data/.envs/6a38805cda91be32/local/lib/python2.7/site-packages/curdling/services/base.py:79 (_worker) result = self(requester, **sender_data) or {}
Traceback (most recent call last):
File "/data/.envs/6a38805cda91be32/local/lib/python2.7/site-packages/curdling/services/base.py", line 79, in _worker
result = self(requester, **sender_data) or {}
File "/data/.envs/6a38805cda91be32/local/lib/python2.7/site-packages/curdling/services/base.py", line 66, in __call__
return self.handle(requester, kwargs)
File "/data/.envs/6a38805cda91be32/local/lib/python2.7/site-packages/curdling/services/downloader.py", line 269, in handle
field_name, location = self.download(data['url'], data.get('locator_url'))
File "/data/.envs/6a38805cda91be32/local/lib/python2.7/site-packages/curdling/services/downloader.py", line 315, in download
return protocol_mapping[handler](url)
File "/data/.envs/6a38805cda91be32/local/lib/python2.7/site-packages/curdling/services/downloader.py", line 335, in _download_http
response.read(cache_content=True, decode_content=False))
File "/data/.envs/6a38805cda91be32/local/lib/python2.7/site-packages/curdling/index.py", line 97, in from_data
with open(destination, 'wb') as fobj:
IOError: [Errno 21] Is a directory: u'/home/starenka/.curds/.'
Retrieving: [##########] 100% (1 requested, 0 retrieved, 0 built, 1 failed)
Some milk was spilled in the process:
https://www.djangoproject.com/download/1.6c1/tarball/
* https://www.djangoproject.com/download/1.6c1/tarball/ (explicit requirement): IOError:
[Errno 21] Is a directory: u'/home/starenka/.curds/.'
(works in pip)
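The root cause seems to be that the destination file name is derived from a URL path ending in `/`, which yields `.` and makes `open()` hit the directory. A minimal guard (helper name hypothetical) would take the basename of the final, post-redirect URL and fall back to a placeholder:

```python
from urllib.parse import urlparse


def filename_from_url(url, fallback='package'):
    """Derive a file name from a download URL; a URL like the
    /download/1.6c1/tarball/ one above would otherwise yield '.'."""
    path = urlparse(url).path.rstrip('/')
    name = path.rsplit('/', 1)[-1]
    return name or fallback
```

Using the redirect target (`.../Django-1.6c1.tar.gz`) rather than the original `/tarball/` URL would give an even better name, but either way the result is a file, not a directory.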
2013-11-03 18:23:48,520:curdling.services.base:DEBUG:finder.start()
2013-11-03 18:23:48,521:curdling.services.base:DEBUG:downloader.start()
2013-11-03 18:23:48,523:curdling.services.base:DEBUG:curdler.start()
2013-11-03 18:23:48,524:curdling.services.base:DEBUG:dependencer.start()
2013-11-03 18:23:48,525:curdling.services.base:DEBUG:downloader.queue(from="main", data="{u'url': 'hg+ssh://[email protected]//home/mycompany/repos/3rdparty/template_utils@default', 'requirement': 'hg+ssh://[email protected]//home/mycompany/repos/3rdparty/template_utils@default'}")
2013-11-03 18:23:48,525:curdling.services.base:DEBUG:downloader[Thread-11].run(data="{u'url': 'hg+ssh://[email protected]//home/mycompany/repos/3rdparty/template_utils@default', 'requirement': 'hg+ssh://[email protected]//home/mycompany/repos/3rdparty/template_utils@default'}")
2013-11-03 18:23:49,189:curdling.services.base:DEBUG:downloader[Thread-11].run(data="{u'url': 'hg+ssh://[email protected]//home/mycompany/repos/3rdparty/template_utils@default', 'requirement': 'hg+ssh://[email protected]//home/mycompany/repos/3rdparty/template_utils@default'}"): {u'directory': '/tmp/tmp9Zmflc', u'requirement': 'hg+ssh://[email protected]//home/mycompany/repos/3rdparty/template_utils@default'}
2013-11-03 18:23:49,190:curdling.services.base:DEBUG:curdler.queue(from="downloader", data="{u'directory': '/tmp/tmp9Zmflc', u'requirement': 'hg+ssh://[email protected]//home/mycompany/repos/3rdparty/template_utils@default'}")
2013-11-03 18:23:49,190:curdling.services.base:DEBUG:curdler[Thread-21].run(data="{u'directory': '/tmp/tmp9Zmflc', u'requirement': 'hg+ssh://[email protected]//home/mycompany/repos/3rdparty/template_utils@default'}")
2013-11-03 18:23:49,327:curdling.services.base:DEBUG:curdler[Thread-21].run(data="{u'directory': '/tmp/tmp9Zmflc', u'requirement': 'hg+ssh://[email protected]//home/mycompany/repos/3rdparty/template_utils@default'}"): {u'wheel': u'/home/starenka/.curds/template_utils-mycompany-py27-none-any.whl', u'requirement': 'hg+ssh://[email protected]//home/mycompany/repos/3rdparty/template_utils@default'}
2013-11-03 18:23:49,328:curdling.services.base:DEBUG:dependencer.queue(from="curdler", data="{u'wheel': u'/home/starenka/.curds/template_utils-mycompany-py27-none-any.whl', u'requirement': 'hg+ssh://[email protected]//home/mycompany/repos/3rdparty/template_utils@default'}")
2013-11-03 18:23:49,328:curdling.services.base:DEBUG:dependencer[Thread-31].run(data="{u'wheel': u'/home/starenka/.curds/template_utils-mycompany-py27-none-any.whl', u'requirement': 'hg+ssh://[email protected]//home/mycompany/repos/3rdparty/template_utils@default'}")
2013-11-03 18:23:49,328:curdling.services.base:ERROR:dependencer[Thread-31].run(from="curdler", data="{u'wheel': u'/home/starenka/.curds/template_utils-mycompany-py27-none-any.whl', u'requirement': 'hg+ssh://[email protected]//home/mycompany/repos/3rdparty/template_utils@default'}") failed:
/data/.envs/ae662e1269652509/local/lib/python2.7/site-packages/curdling/services/base.py:79 (_worker) result = self(requester, **sender_data) or {}
Traceback (most recent call last):
File "/data/.envs/ae662e1269652509/local/lib/python2.7/site-packages/curdling/services/base.py", line 79, in _worker
result = self(requester, **sender_data) or {}
File "/data/.envs/ae662e1269652509/local/lib/python2.7/site-packages/curdling/services/base.py", line 66, in __call__
return self.handle(requester, kwargs)
File "/data/.envs/ae662e1269652509/local/lib/python2.7/site-packages/curdling/services/dependencer.py", line 16, in handle
wheel = Wheel(data['wheel'])
File "/data/.envs/ae662e1269652509/local/lib/python2.7/site-packages/distlib/wheel.py", line 159, in __init__
'filename: %r' % filename)
DistlibException: Invalid name or filename: u'template_utils-mycompany-py27-none-any.whl'
The version pinning makes curdling a nightmare to package without patching. Exact versions shouldn't be pinned in setup.py; instead, if needed, a minimum version should be specified.
I created a curd-server docker container and was curious if you would be interested in listing it in your project?
Currently it lives here:
https://index.docker.io/u/sherzberg/curdserver/
Let me know and I can send a pull request.
How to reproduce:
# install curdling globally (outside of any venv)
# install virtualenvwrapper
# change $HOME/.virtualenvs/postmkvirtualenv to run: curd install curdling
mkvirtualenv foobar
which curd  # should return $HOME/.virtualenvs/foobar/bin/curd,
            # but actually points to /usr/local/bin/curd
Retrieving: [##########] 100% (107 requested, 9 retrieved, 104 built, 3 failed)2013-11-06 15:06:39,400:curdling.install:ERROR:best_version('django'): /var/lib/jenkins/workspace/scorp_default_curdlingtest/.env/local/lib/python2.7/site-packages/curdling/install.py:200 (load_installer) _, chosen_requirement = self.mapping.best_version(package_name)
Traceback (most recent call last):
File "/var/lib/jenkins/workspace/scorp_default_curdlingtest/.env/local/lib/python2.7/site-packages/curdling/install.py", line 200, in load_installer
_, chosen_requirement = self.mapping.best_version(package_name)
File "/var/lib/jenkins/workspace/scorp_default_curdlingtest/.env/local/lib/python2.7/site-packages/curdling/mapping.py", line 143, in best_version
', '.join(sorted(self.available_versions(package_name), reverse=True)),
VersionConflict: Requirement: django (>= 1.4), Available versions: 1.6c1
yipit-client@masterॐ curd install -r requirements.txt
Traceback (most recent call last):
File "/Users/gabrielfalcao/.virtualenvs/yipit-client/bin/curd", line 9, in <module>
load_entry_point('curdling==0.3.2', 'console_scripts', 'curd')()
File "/Users/gabrielfalcao/.virtualenvs/yipit-client/lib/python2.7/site-packages/curdling/tool/__init__.py", line 202, in main
return command.run()
File "/Users/gabrielfalcao/.virtualenvs/yipit-client/lib/python2.7/site-packages/curdling/install.py", line 241, in run
packages = self.retrieve_and_build()
File "/Users/gabrielfalcao/.virtualenvs/yipit-client/lib/python2.7/site-packages/curdling/install.py", line 190, in retrieve_and_build
total, retrieved, built, failed)
File "/Users/gabrielfalcao/.virtualenvs/yipit-client/lib/python2.7/site-packages/curdling/signal.py", line 34, in emit
callback(*args, **kwargs)
File "/Users/gabrielfalcao/.virtualenvs/yipit-client/lib/python2.7/site-packages/curdling/tool/__init__.py", line 84, in build_and_retrieve_progress
percent = int((processed) / float(total) * 100.0)
ZeroDivisionError: float division by zero
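A minimal guard for the empty-requirements case (a sketch of the fix; the function name is simplified from the callback in the traceback):

```python
def progress_percent(processed, total):
    """Progress percentage that tolerates an empty requirement set;
    the callback in the traceback above divides by `total` unconditionally."""
    if not total:
        return 0
    return int(processed / float(total) * 100.0)
```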
Output of `curd install plant`:
Retrieving: [##########] 100% (1 requested, 0 retrieved, 0 built, 1 failed)
Some milk was spilled in the process:
plant
* plant (explicit requirement): BuildError:
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "setup.py", line 40, in <module>
packages=get_packages())
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/core.py", line 152, in setup
dist.run_commands()
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/dist.py", line 953, in run_commands
self.run_command(cmd)
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "/Users/gabrielfalcao/.virtualenvs/yipit-docs/lib/python2.7/site-packages/wheel/bdist_wheel.py", line 170, in run
self.run_command('build')
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/cmd.py", line 326, in run_command
self.distribution.run_command(command)
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/command/build.py", line 127, in run
self.run_command(cmd_name)
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/cmd.py", line 326, in run_command
self.distribution.run_command(command)
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "/Users/gabrielfalcao/.virtualenvs/yipit-docs/lib/python2.7/site-packages/setuptools/command/build_py.py", line 89, in run
self.build_packages()
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/command/build_py.py", line 372, in build_packages
self.build_module(module, module_file, package)
File "/Users/gabrielfalcao/.virtualenvs/yipit-docs/lib/python2.7/site-packages/setuptools/command/build_py.py", line 106, in build_module
outfile, copied = _build_py.build_module(self, module, module_file, package)
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/command/build_py.py", line 333, in build_module
"'package' must be a string (dot-separated), list, or tuple")
TypeError: 'package' must be a string (dot-separated), list, or tuple
Sometimes I run into issues with my dependencies having conflicting dependencies. When you have lots of dependencies, it can be difficult to debug exactly which ones conflict.
When you run `curd install --dry-run`, it would print out the entire dependency tree that it has built internally, but not actually install anything. Alternatively, you could make this a separate command.
It would also be cool to have a `validate` command that verifies the dependency tree makes sense (or the `--dry-run` command could return an appropriate exit code).
The dependency checker is too greedy: it doesn't care whether a dependency was declared inside an extra section, so all dependencies, including `test` and `development`, are being installed by default.
The only extra sections installed should be the ones present inside the section markers of a requirement (`[]`), like this:
$ curd install curdling[server]
The above command should only install packages from the `server` section. Those sections are configured inside a package's `setup.py` file. Curdling itself uses that feature, so supporting it is a must.
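For reference, extras are declared under `extras_require` in a package's `setup.py`. A sketch of telling requested extras apart (the dict mirrors what setuptools expects; the parsing helper is hypothetical, not curdling's code):

```python
# What a package's setup.py declares (section names illustrative):
extras_require = {
    'server': ['flask'],        # pulled in only by `curd install curdling[server]`
    'tests': ['mock', 'nose'],  # must NOT be installed by default
}


def requested_extras(requirement):
    """Return the extra section names in a 'pkg[a,b]' style requirement."""
    if '[' not in requirement:
        return []
    inside = requirement.split('[', 1)[1].rstrip(']')
    return [extra.strip() for extra in inside.split(',') if extra.strip()]
```

With this, `requested_extras('curdling[server]')` yields `['server']`, so only the `server` section's packages would join the dependency walk; a plain `curdling` yields no extras at all.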
On Python 3.3, curd currently fails during the install sub-command with:
File "/home/.../curdling/tool/__init__.py", line 70, in progress_bar
progress_bar = ('#' * percent_count) + (' ' * (10 - percent_count))
TypeError: can't multiply sequence by non-int of type 'float'
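The breakage is Python 3's true division: `percent / 10` yields a float, and multiplying a string by a float raises the TypeError above. A sketch of the fix, assuming `percent_count` comes from such a division:

```python
def progress_bar(percent):
    """Render the 10-slot progress bar used by the tool."""
    percent_count = int(percent / 10)  # int() restores the Python 2 behavior
    return ('#' * percent_count) + (' ' * (10 - percent_count))
```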
It happens once in a while on Travis :/
Just a reminder of a conversation we had, where you mentioned implementing a requirements.lock with the calculated dependencies.
Executing `pip install gutter-django=0.1.5` works.
Some milk was spilled in the process:
describe
* describe (from gutter-django (0.1.5)): list:
[{u'exception': ReportableError(u"Requirement `describe (1.0.0beta1)' not found",), u'dependency_of': [u'exam (0.3.1)'], u'requirement': u'describe (1.0.0beta1)'}]
* describe (1.0.0beta1) (from exam (0.3.1)): KeyError:
u'describe (1.0.0beta1)'
If you are not able to reproduce it, I can also paste the output with -l DEBUG.
By the way, thanks for this effort! :)
Am I missing something in the docs, or are `hg+ssh` URLs and `@revision` pins not supported?
(foo'd, but otherwise a real portion of a requirements file):
hg+ssh://[email protected]//home/foocorp/repos/3rdparty/typogrify@default
* hg+ssh://[email protected]//home/foocorp/repos/3rdparty/typogrify@default (explicit requirement): Exception:
abort: no suitable response from remote hg!
git+git://github.com/mfogel/django-settings-context-processor.git@eec8a8b24e80b8cc64858aa822c6d70872502e3d
* git+git://github.com/mfogel/django-settings-context-processor.git@eec8a8b24e80b8cc64858aa822c6d70872502e3d (explicit requirement): Exception:
fatal: remote error:
mfogel/django-settings-context-processor.git@eec8a8b24e80b8cc64858aa822c6d70872502e3d is not a valid repository name
Email [email protected] for help
both of them work like a charm w/ pip
This change will handle cases where the same package is required more than once in the same installation. It should support two main strategies:
By "compilation of all requirements evaluated" I mean that a single requirement might be requested several times, with different constraints, but one version of the package can be enough to match all the constraints from the different requests. Just like this:
>>> request_install('library (>= 0.2.0)')
>>> request_install('library (== 0.2.5)')
As you can see, the package `library` was requested twice in the pseudo-code above, but it still complies with the strict strategy because the constraints don't conflict with each other.
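The check described above can be sketched with the modern `packaging` library (not what curdling used in 2013 — that was distlib — but the idea is the same): intersect the specifier sets from every request and test candidate versions against the result.

```python
from packaging.specifiers import SpecifierSet

# The two requests from the pseudo-code above, as PEP 440 specifiers
combined = SpecifierSet('>=0.2.0') & SpecifierSet('==0.2.5')

# A single install of 0.2.5 satisfies both requests...
assert '0.2.5' in combined
# ...while anything else is rejected by the '==0.2.5' constraint
assert '0.2.4' not in combined
```

If no version satisfies the intersection, the requests genuinely conflict and the strict strategy should fail early.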
Curdling should parse the fragment of any downloaded URL and check for hashes. Each available hash (specified in the format `algorithm=hash`) should be compared against the hash of the content after the download. The exception `HashVerificationFailed` should be raised on any mismatch.
Just like in pip, the following hashes should be supported: `sha1`, `sha224`, `sha384`, `sha256`, `sha512`, `md5`.
https://pypi.python.org/packages/source/c/curdling/curdling-0.3.6.tar.gz#md5=4caac1cee5c5c629a0d3b40496382b13
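A sketch of the verification described above (the exception name comes from this issue; the function name and structure are illustrative):

```python
import hashlib
from urllib.parse import urlparse

SUPPORTED_HASHES = ('sha1', 'sha224', 'sha384', 'sha256', 'sha512', 'md5')


class HashVerificationFailed(Exception):
    pass


def verify_fragment_hash(url, content):
    """Compare the `algorithm=hash` URL fragment, if any, against the
    downloaded `content`; raise HashVerificationFailed on mismatch."""
    algorithm, _, expected = urlparse(url).fragment.partition('=')
    if algorithm not in SUPPORTED_HASHES or not expected:
        return  # nothing to verify
    actual = hashlib.new(algorithm, content).hexdigest()
    if actual != expected:
        raise HashVerificationFailed(
            '%s mismatch: expected %s, got %s' % (algorithm, expected, actual))
```

For the curdling tarball URL above, the `md5=4caac1...` fragment would be checked against the downloaded bytes before the file is handed to the curdler.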
Consider a license compatible with pip.
As a python developer
In order to find and discover python packages
I want to use a `search` command
Given I have curdling installed
When I type `curd search sqlite` in the console
Then I should see a list of package names with their descriptions
This is how pip outputs it:
PyMOTW - Python Module of the Week Examples: sqlite3
HTSQL - A Database Query Language (core & SQLite backend)
pod - A Python Object Database Implemented Using cPickle and SQLite - An easy way to store and fetch Python objects
dumptruck - Relaxing interface to SQLite
pyormish - A simple, ultra-lightweight ORM for MySQL, SQLite, and Postgres
sqlkit - GUI to edit databases. Many backend supported (Postgres, Sqlite, MySql,...). Based on python, GTK, sqlalchemy. It's split into a GUI and a very rich package to create database interfaces
sqlitedict - Persistent dict in Python, backed up by sqlite3 and pickle, multithread-safe.
migranto - Simple SQL migration tool for SQLite and PostgreSQL
xls2db - Convert excel files following a particular schema into sqlite database files.
cruzdb - Interface to UCSC genomic databases. Also allows things like up/downstream/k-nearest-neighbor queries and mirroring of tables to local sqlite databases
jblite - J-Ben SQLite parsing scripts
dbtools - Lightweight SQLite interface
pysqlite - DB-API 2.0 interface for SQLite 3.x
djorm-ext-pool - DB-API2 connection pool for Django (for postgresql, mysql and sqlite)
sqlite3dbm - sqlite-backed dictionary
pyspatialite - DB-API 2.0 interface for SQLite 3.x with Spatialite 3.x
nose-perfdump - Dump per-test performance metrics to an SQLite database for querying.
macaron - Simple object-relational mapper for SQLite3, includes plugin for Bottle web framework
RecSQL - Treat SQLite tables as recarrays
PyDbLite - A pure-Python db engine + Pythonic interface to SQLite and MySQL
sqlitebck - Sqlite3 online backup API implementation.
sostore - SQLite Object Store - An absurdly simple object "database" for Python
kvlite - key-value database wrapper for SQL database (MySQL, SQLite)
pystaggrelite3 - Pure Python sqlite3 statistics aggregate functions
squillo - manipulate microarray datasets contained in SQLITE3 databases.
LiteMap - Mapping class which stores in SQLite database.
hamster-sqlite - Minimal dependency nicely abstracted sqlite backend of hamster time tracker - lets you connect to your hamster db and do stuff in python
caribou - python migrations for sqlite databases
sqmediumlite - SQLite network connection
redmine_migrator - Migrate Redmine data from SQLite to Postgres with consistent type conversion.
cloud_wiki - A wiki engine backed by sqlite that provides its own http server, user authentication and is easy to administer from the command line.
Products.ZSQLiteDA - SQLite database adapter for Zope2
minidb - A simple SQLite3 store for Python objects
redturtle.entiterritoriali - Package contatins all italian enti territoriali in sqlite db
easydb - Simple SQLite wrapper to make it easier to manage your database
bottle-sqlite - SQLite3 integration for Bottle.
sqlitefktg - SQLite foreign key trigger generator
DatabasePipe - A pipe to connect various SQL databases to a PipeStack application. Supports PostgreSQL, SQLite and SQLServer 2000 via plugins.
sqlShort - A tiny wrapper for the Python modules MySQLdb and Sqlite
kv - KV provides a dictionary-like interface on top of SQLite.
SQLiteFKTG4SA - SQLite Foreign Key Trigger Generator for SQLAlchemy
rdflib-sqlite - rdflib extension adding SQLite as back-end store
pydap.handlers.sqlite - A SQLite handler for Pydap
goatfish - A small, schemaless ORM that is backed by SQLite.
litesimple - Simple python ORM micro-framework for sqlite3
scrapy-dblite - Simple library for storing Scrapy Items in sqlite database
SQLite3Database - sqlite3 driver for the DatabasePipe package
DBWrapper - Thread-safe wrapper for Python sqlite3 bindings.
upsert - Upsert for MySQL, PostgreSQL, SQLite3.
y_serial - warehouse compressed Python objects with SQLite
DbMother - A python library to manage sqlite and postgresl connectivity
db-sqlite3 - sqlite3 driver for db
We currently hardcode this value in the `__main__.py` module.
I'm trying to reuse `curdling.web.Server` to set up a curdling server in a WSGI app engine:
args = Namespace(curddir=curddir, user_db=None, debug=False)
server = Server(args)
After one of the workers handles an upload, it updates its `server.index.storage`, while the `index.storage` of the other workers stays the same.
It's like this:
$ curd -v -l DEBUG uninstall kombu
curd 0.3.6
Is it on the roadmap for curdling to be able to handle external binary dependencies like PyQt?
Hi,
curd doesn't support the `install -e` option.
With pip, I can do:
$ pip install -e foobar/mylib/
see http://www.pip-installer.org/en/latest/usage.html#options
I can also put this `-e` option in a requirements.txt file, like this:
flask
-e ../mylib/
With curd, the `-e` option isn't implemented.
Requirements files can contain lines like:
--index-url https://pypi.mycompany.com/local
--extra-index-url https://pypi.mycompany.com/simple
but `curd install -r requirements.txt` tries to install these as if they were package names:
--index-url https://pypi.mycompany.com/local
* --index-url https://pypi.mycompany.com/local (explicit requirement): UnknownURL:
"--index-url https://pypi.mycompany.com/local"
Your URL looks wrong. Make sure it's a valid HTTP
link or a valid VCS link prefixed with the name of
the VCS of your choice. Eg.:
$ curd install https://pypi.python.org/simple/curdling/curdling-0.1.2.tar.gz
$ curd install git+ssh://github.com/clarete/curdling.git
--extra-index-url https://pypi.mycompany.com/simple
* --extra-index-url https://pypi.mycompany.com/simple (explicit requirement): UnknownURL:
"--extra-index-url https://pypi.mycompany.com/simple"
Your URL looks wrong. Make sure it's a valid HTTP
link or a valid VCS link prefixed with the name of
the VCS of your choice. Eg.:
$ curd install https://pypi.python.org/simple/curdling/curdling-0.1.2.tar.gz
$ curd install git+ssh://github.com/clarete/curdling.git
It should treat these lines like pip does, by acting on the subsequent package declarations as if it had been invoked with the `-i` and/or `--extra-index-url` parameters (see #41 :) )
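A sketch of the requested behavior (helper name hypothetical; real pip handles many more options per line):

```python
def split_requirements(lines):
    """Separate index options from package requirements in a
    requirements.txt, instead of treating options as package names."""
    index_urls, extra_index_urls, requirements = [], [], []
    for line in lines:
        line = line.strip()
        if not line or line.startswith('#'):
            continue  # skip blanks and comments
        if line.startswith('--index-url'):
            index_urls.append(line.split(None, 1)[1])
        elif line.startswith('--extra-index-url'):
            extra_index_urls.append(line.split(None, 1)[1])
        else:
            requirements.append(line)
    return index_urls, extra_index_urls, requirements
```

The collected URLs would then feed the same code path as the `-i` flag, applying to every requirement that follows.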
Let's just kill curdling.logging and use the Python logging module. Let's also rename the `--debug` parameter to `--log-level`, since the custom logging it controlled will be gone.
Implement `curd --version` and `curd -v`.
Change the `download.get_opener()` function to also add a handler for HTTPS URLs.
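The original function is Python 2 / urllib2-era code; a sketch of the same idea with the modern `urllib.request` module:

```python
import urllib.request


def get_opener():
    """Build an opener that can follow both http and https URLs;
    the bug above is that only an HTTP handler was registered."""
    return urllib.request.build_opener(
        urllib.request.HTTPHandler(),
        urllib.request.HTTPSHandler(),
    )
```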
Curdling should be able to install packages from `http`, `git`, `svn`, `hg`, and other commonly used VCS systems.
The curdling server doesn't support authentication yet.
tox is often used to run a multi-environment test suite, especially to make sure a lib works with an older version of Python like 2.6.
I tried to use curdling to speed up my tox build, but look what happened:
HTTPretty@masterॐ ./.tox/py26/bin/curd install ipdb
Traceback (most recent call last):
File "./.tox/py26/bin/curd", line 9, in <module>
load_entry_point('curdling==0.2.3', 'console_scripts', 'curd')()
File "/Users/gabrielfalcao/Dropbox/projects/personal/HTTPretty/.tox/py26/lib/python2.6/site-packages/pkg_resources.py", line 357, in load_entry_point
return get_distribution(dist).load_entry_point(group, name)
File "/Users/gabrielfalcao/Dropbox/projects/personal/HTTPretty/.tox/py26/lib/python2.6/site-packages/pkg_resources.py", line 2394, in load_entry_point
return ep.load()
File "/Users/gabrielfalcao/Dropbox/projects/personal/HTTPretty/.tox/py26/lib/python2.6/site-packages/pkg_resources.py", line 2108, in load
entry = __import__(self.module_name, globals(),globals(), ['__name__'])
File "/Users/gabrielfalcao/Dropbox/projects/personal/HTTPretty/.tox/py26/lib/python2.6/site-packages/curdling/tool/__init__.py", line 5, in <module>
from ..install import Install
File "/Users/gabrielfalcao/Dropbox/projects/personal/HTTPretty/.tox/py26/lib/python2.6/site-packages/curdling/install.py", line 10, in <module>
from .services.downloader import Downloader
File "/Users/gabrielfalcao/Dropbox/projects/personal/HTTPretty/.tox/py26/lib/python2.6/site-packages/curdling/services/downloader.py", line 170
return {v['version']: self._get_distribution(v) for v in data}
^
SyntaxError: invalid syntax
HTTPretty@masterॐ
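The failing line uses a dict comprehension, which only exists from Python 2.7 on; the 2.6-compatible spelling passes a generator of pairs to `dict()`. The wrapper function and its arguments here are illustrative, stripped down from the downloader line in the traceback:

```python
def distributions_by_version(data, get_distribution):
    # Python 2.6-safe equivalent of
    #   {v['version']: get_distribution(v) for v in data}
    return dict((v['version'], get_distribution(v)) for v in data)
```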
On Python 3.3:
$ curd
Traceback (most recent call last):
File "/Users/myint/Library/Python/3.3/bin/curd", line 9, in <module>
load_entry_point('curdling==0.3.6', 'console_scripts', 'curd')()
File "/Users/myint/Library/Python/3.3/lib/python/site-packages/curdling/tool/__init__.py", line 215, in main
}[args.command](args)
AttributeError: 'Namespace' object has no attribute 'command'
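On Python 3, argparse subcommands are optional by default, so a bare `curd` parses successfully and the later `args.command` lookup blows up. A sketch of the fix:

```python
import argparse

parser = argparse.ArgumentParser(prog='curd')
subparsers = parser.add_subparsers(dest='command')
subparsers.required = True  # make a missing subcommand an argparse error,
                            # not an AttributeError later on
subparsers.add_parser('install')
subparsers.add_parser('uninstall')
```

With this, `curd` with no arguments exits with a usage error instead of the traceback above. (On newer Pythons, `add_subparsers(required=True)` expresses the same thing.)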
When curdling fails to install a given dependency, the package that requested it is never marked as either built or failed, which makes curdling hang forever.
user$ curd install numpy
Retrieving: [##########] 100% (1 requested, 1 retrieved, 1 built)
Some milk was spilled in the process:
* numpy:
BuildError:
Running from numpy source directory.
/bin/sh: svnversion: command not found
/bin/sh: svnversion: command not found
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "setup.py", line 214, in <module>
setup_package()
File "setup.py", line 207, in setup_package
configuration=configuration )
File "/private/var/folders/r7/d51_mkdx13gg39dghzmcf18r0000gn/T/tmp45Q2AA/numpy-1.7.1/numpy/distutils/core.py", line 186, in setup
return old_setup(**new_attr)
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/core.py", line 152, in setup
dist.run_commands()
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/dist.py", line 953, in run_commands
self.run_command(cmd)
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "/usr/local/lib/python2.7/site-packages/wheel/bdist_wheel.py", line 170, in run
self.run_command('build')
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/cmd.py", line 326, in run_command
self.distribution.run_command(command)
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "/private/var/folders/r7/d51_mkdx13gg39dghzmcf18r0000gn/T/tmp45Q2AA/numpy-1.7.1/numpy/distutils/command/build.py", line 37, in run
old_build.run(self)
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/command/build.py", line 127, in run
self.run_command(cmd_name)
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/cmd.py", line 326, in run_command
self.distribution.run_command(command)
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "/private/var/folders/r7/d51_mkdx13gg39dghzmcf18r0000gn/T/tmp45Q2AA/numpy-1.7.1/numpy/distutils/command/build_src.py", line 152, in run
self.build_sources()
File "/private/var/folders/r7/d51_mkdx13gg39dghzmcf18r0000gn/T/tmp45Q2AA/numpy-1.7.1/numpy/distutils/command/build_src.py", line 163, in build_sources
self.build_library_sources(*libname_info)
File "/private/var/folders/r7/d51_mkdx13gg39dghzmcf18r0000gn/T/tmp45Q2AA/numpy-1.7.1/numpy/distutils/command/build_src.py", line 298, in build_library_sources
sources = self.generate_sources(sources, (lib_name, build_info))
File "/private/var/folders/r7/d51_mkdx13gg39dghzmcf18r0000gn/T/tmp45Q2AA/numpy-1.7.1/numpy/distutils/command/build_src.py", line 385, in generate_sources
source = func(extension, build_dir)
File "numpy/core/setup.py", line 646, in get_mathlib_info
st = config_cmd.try_link('int main(void) { return 0;}')
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/command/config.py", line 251, in try_link
libraries, library_dirs, lang)
File "/private/var/folders/r7/d51_mkdx13gg39dghzmcf18r0000gn/T/tmp45Q2AA/numpy-1.7.1/numpy/distutils/command/config.py", line 149, in _link
libraries, library_dirs, lang))
File "/private/var/folders/r7/d51_mkdx13gg39dghzmcf18r0000gn/T/tmp45Q2AA/numpy-1.7.1/numpy/distutils/command/config.py", line 89, in _wrap_method
ret = mth(*((self,)+args))
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/command/config.py", line 143, in _link
(src, obj) = self._compile(body, headers, include_dirs, lang)
File "/private/var/folders/r7/d51_mkdx13gg39dghzmcf18r0000gn/T/tmp45Q2AA/numpy-1.7.1/numpy/distutils/command/config.py", line 99, in _compile
(body, headers, include_dirs, lang))
File "/private/var/folders/r7/d51_mkdx13gg39dghzmcf18r0000gn/T/tmp45Q2AA/numpy-1.7.1/numpy/distutils/command/config.py", line 89, in _wrap_method
ret = mth(*((self,)+args))
File "/usr/local/Cellar/python/2.7.5/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/command/config.py", line 138, in _compile
self.compiler.compile([src], include_dirs=include_dirs)
File "/private/var/folders/r7/d51_mkdx13gg39dghzmcf18r0000gn/T/tmp45Q2AA/numpy-1.7.1/numpy/distutils/ccompiler.py", line 203, in CCompiler_compile
self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts)
File "/private/var/folders/r7/d51_mkdx13gg39dghzmcf18r0000gn/T/tmp45Q2AA/numpy-1.7.1/numpy/distutils/unixccompiler.py", line 34, in UnixCCompiler__compile
self.spawn(self.compiler_so + cc_args + [src, '-o', obj] +
TypeError: coercing to Unicode: need string or buffer, list found
Retrieving: [######### ] 93% (107 requested, 8 retrieved, 100 processed)
2013-11-06 15:06:13,131:curdling.services.base:ERROR:curdler[Thread-5].run(from="dependencer", data="{u'tarball': u'/var/lib/jenkins/.curds/Django-1.6c1.tar.gz', 'requirement': u'django (>= 1.4.2)', 'dependency_of': u'django-model-utils (1.5.0)'}") failed:
/var/lib/jenkins/workspace/scorp_default_curdlingtest/.env/local/lib/python2.7/site-packages/curdling/services/base.py:80 (_worker) result = self(requester, **sender_data) or {}
Traceback (most recent call last):
File "/var/lib/jenkins/workspace/scorp_default_curdlingtest/.env/local/lib/python2.7/site-packages/curdling/services/base.py", line 80, in _worker
result = self(requester, **sender_data) or {}
File "/var/lib/jenkins/workspace/scorp_default_curdlingtest/.env/local/lib/python2.7/site-packages/curdling/services/base.py", line 67, in __call__
return self.handle(requester, kwargs)
File "/var/lib/jenkins/workspace/scorp_default_curdlingtest/.env/local/lib/python2.7/site-packages/curdling/services/curdler.py", line 136, in handle
raise BuildError(str(exc))
BuildError: unpack requires a string argument of length 4
2013-11-06 15:06:13,163:curdling.services.base:ERROR:curdler[Thread-4].run(from="dependencer", data="{u'tarball': u'/var/lib/jenkins/.curds/Django-1.6c1.tar.gz', 'requirement': u'django (>= 1.2)', 'dependency_of': u'raven (3.5.0)'}") failed:
/var/lib/jenkins/workspace/scorp_default_curdlingtest/.env/local/lib/python2.7/site-packages/curdling/services/base.py:80 (_worker) result = self(requester, **sender_data) or {}
Traceback (most recent call last):
File "/var/lib/jenkins/workspace/scorp_default_curdlingtest/.env/local/lib/python2.7/site-packages/curdling/services/base.py", line 80, in _worker
result = self(requester, **sender_data) or {}
File "/var/lib/jenkins/workspace/scorp_default_curdlingtest/.env/local/lib/python2.7/site-packages/curdling/services/base.py", line 67, in __call__
return self.handle(requester, kwargs)
File "/var/lib/jenkins/workspace/scorp_default_curdlingtest/.env/local/lib/python2.7/site-packages/curdling/services/curdler.py", line 136, in handle
raise BuildError(str(exc))
BuildError: unpack requires a string argument of length 4
My system cannot reach the internet without an HTTP proxy, but the "http_proxy" environment variable isn't passed on to Curdling.
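As a sketch of the desired behavior, the downloader could collect proxy settings from the environment and hand them to whatever HTTP client it uses. The `proxy_settings` helper below is purely illustrative; it is not part of Curdling's API.

```python
import os

# Illustrative sketch only: Curdling has no proxy_settings helper.
# The idea is to read http_proxy/https_proxy from the environment and
# forward the mapping to the HTTP client that performs the downloads.
def proxy_settings(environ=None):
    environ = os.environ if environ is None else environ
    proxies = {}
    for scheme in ("http", "https"):
        value = environ.get(scheme + "_proxy")
        if value:
            proxies[scheme] = value
    return proxies
```

An HTTP library such as requests would then accept this mapping directly via its `proxies` argument.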
The Uploader
service uses requests.put
because we haven't polished those details just yet.
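For illustration, the shape of such a PUT upload can be sketched with the standard library alone. The `/p/<name>` URL layout and the `build_put_request` name below are invented for the example; Curdling's actual Uploader relies on requests.put.

```python
import urllib.request

# Hypothetical sketch: build (but don't send) a PUT request like the one
# an uploader service would issue to a local package index. The URL
# layout here is an assumption made up for the example.
def build_put_request(index_url, package_name, payload):
    url = "{0}/p/{1}".format(index_url.rstrip("/"), package_name)
    return urllib.request.Request(url, data=payload, method="PUT")
```

Building the request object separately from sending it also makes the upload logic easy to unit-test without a live index server.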
I tried to create a new django project virtual env and wanted to use curdling to install the dependencies.
This was my workflow:
mkvirtualenv projectname
easy_install curdling
curd install django
Result:
workॐ curd install django
Traceback (most recent call last):
File "/Users/gabrielfalcao/.virtualenvs/projectname/bin/curd", line 9, in <module>
load_entry_point('curdling==0.2.2', 'console_scripts', 'curd')()
File "/Users/gabrielfalcao/.virtualenvs/projectname/lib/python2.7/site-packages/curdling-0.2.2-py2.7.egg/curdling/tool/__init__.py", line 120, in main
File "/Users/gabrielfalcao/.virtualenvs/projectname/lib/python2.7/site-packages/curdling-0.2.2-py2.7.egg/curdling/tool/__init__.py", line 83, in get_install_command
File "/Users/gabrielfalcao/.virtualenvs/projectname/lib/python2.7/site-packages/curdling-0.2.2-py2.7.egg/curdling/install.py", line 155, in request_install
File "/Users/gabrielfalcao/.virtualenvs/projectname/lib/python2.7/site-packages/curdling-0.2.2-py2.7.egg/curdling/database.py", line 15, in check_installed
File "/Users/gabrielfalcao/.virtualenvs/projectname/lib/python2.7/site-packages/distlib-0.1.2-py2.7.egg/distlib/database.py", line 208, in get_distribution
File "/Users/gabrielfalcao/.virtualenvs/projectname/lib/python2.7/site-packages/distlib-0.1.2-py2.7.egg/distlib/database.py", line 133, in _generate_cache
File "/Users/gabrielfalcao/.virtualenvs/projectname/lib/python2.7/site-packages/distlib-0.1.2-py2.7.egg/distlib/database.py", line 123, in _yield_distributions
File "/Users/gabrielfalcao/.virtualenvs/projectname/lib/python2.7/site-packages/distlib-0.1.2-py2.7.egg/distlib/database.py", line 851, in __init__
File "/Users/gabrielfalcao/.virtualenvs/projectname/lib/python2.7/site-packages/distlib-0.1.2-py2.7.egg/distlib/database.py", line 928, in _get_metadata
File "/Users/gabrielfalcao/.virtualenvs/projectname/lib/python2.7/site-packages/distlib-0.1.2-py2.7.egg/distlib/metadata.py", line 287, in __setitem__
File "/Users/gabrielfalcao/.virtualenvs/projectname/lib/python2.7/site-packages/distlib-0.1.2-py2.7.egg/distlib/metadata.py", line 533, in set
File "/Users/gabrielfalcao/.virtualenvs/projectname/lib/python2.7/site-packages/distlib-0.1.2-py2.7.egg/distlib/version.py", line 669, in is_valid_matcher
File "/Users/gabrielfalcao/.virtualenvs/projectname/lib/python2.7/site-packages/distlib-0.1.2-py2.7.egg/distlib/version.py", line 95, in __init__
ValueError: Not valid: u'='