admin-requests's Issues
Orphaned Feedstock
Comment:
I am quite confident this feedstock has been orphaned by its maintainer, as demonstrated by their lack of activity on any of the recent PRs. Can you help merge this PR?
Thanks.
I'm also willing to volunteer as an additional maintainer for the aforementioned feedstock.
add ability to restart ci build on master?
YAML file instead of txt files in folders
Now that we have lots of different folders, maybe it's time to use a YAML file instead of a .txt file with a list. For example:
packages:
  - linux-64/win-64/cf-autotick-bot-test-package-0.4-py38_0.tar.bz2
  - win-64/cf-autotick-bot-test-package-0.4-py27_0.tar.bz2
action: broken/not_broken

feedstocks:
  - cf-autotick-bot-test-package
action: archive/unarchive

feedstocks:
  - cf-autotick-bot-test-package
action: grant_access
resource:
  - gpu-runner
  - gpu-runner-pr
  - cpu-runner

feedstocks:
  - cf-autotick-bot-test-package
action: token_reset
ci:
  - github_actions
  - travis
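If this format were adopted, the receiving automation could validate each request before acting on it. A minimal sketch in Python (the action names and required keys mirror the proposal above and are purely illustrative, not an actual admin-requests schema; assume the YAML has already been parsed into a dict):

```python
# Hypothetical validator for the proposed YAML request format.
# Action names and required keys below are illustrative only.
REQUIRED_KEYS = {
    "broken": {"packages"},
    "not_broken": {"packages"},
    "archive": {"feedstocks"},
    "unarchive": {"feedstocks"},
    "grant_access": {"resource", "feedstocks"},
    "token_reset": {"ci"},
}

def validate_request(request: dict) -> list:
    """Return a list of human-readable problems (empty if the request is valid)."""
    problems = []
    action = request.get("action")
    if action not in REQUIRED_KEYS:
        problems.append(f"unknown action: {action!r}")
        return problems
    for key in sorted(REQUIRED_KEYS[action]):
        value = request.get(key)
        if not isinstance(value, list) or not value:
            problems.append(f"action {action!r} needs a non-empty list under {key!r}")
    return problems
```

A request would then be rejected in the PR check rather than failing later in the automation.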
Default installation of JAX + Tensorflow produces inconsistent environment
Solution to issue cannot be found in the documentation.
- I checked the documentation.
Issue
Installing default versions of both JAX and Tensorflow from conda-forge currently results in an inconsistent environment in which neither of them can be imported. Using either of the two separately is fine.
I can recreate this in the miniforge3 docker image like this
FROM condaforge/miniforge3
RUN conda install tensorflow jax
RUN conda list
RUN conda info
RUN python -c 'import tensorflow'
# this also fails
# RUN python -c 'import jax'
which gives the following error:
#7 [5/5] RUN python -c 'import tensorflow'
#7 sha256:c7f1bae8ee24e9614e6e17d5be2614442a2df455b7b3733c3f1e4b8b7a40132a
#7 2.331 Traceback (most recent call last):
#7 2.331 File "<string>", line 1, in <module>
#7 2.331 File "/opt/conda/lib/python3.9/site-packages/tensorflow/__init__.py", line 55, in <module>
#7 2.331 from ._api.v2 import compat
#7 2.331 File "/opt/conda/lib/python3.9/site-packages/tensorflow/_api/v2/compat/__init__.py", line 39, in <module>
#7 2.331 from . import v1
#7 2.331 File "/opt/conda/lib/python3.9/site-packages/tensorflow/_api/v2/compat/v1/__init__.py", line 34, in <module>
#7 2.331 from . import compat
#7 2.331 File "/opt/conda/lib/python3.9/site-packages/tensorflow/_api/v2/compat/v1/compat/__init__.py", line 39, in <module>
#7 2.333 from . import v1
#7 2.333 File "/opt/conda/lib/python3.9/site-packages/tensorflow/_api/v2/compat/v1/compat/v1/__init__.py", line 51, in <module>
#7 2.334 from tensorflow._api.v2.compat.v1 import lite
#7 2.334 File "/opt/conda/lib/python3.9/site-packages/tensorflow/_api/v2/compat/v1/lite/__init__.py", line 11, in <module>
#7 2.334 from . import experimental
#7 2.334 File "/opt/conda/lib/python3.9/site-packages/tensorflow/_api/v2/compat/v1/lite/experimental/__init__.py", line 10, in <module>
#7 2.334 from . import authoring
#7 2.334 File "/opt/conda/lib/python3.9/site-packages/tensorflow/_api/v2/compat/v1/lite/experimental/authoring/__init__.py", line 10, in <module>
#7 2.334 from tensorflow.lite.python.authoring.authoring import compatible
#7 2.334 File "/opt/conda/lib/python3.9/site-packages/tensorflow/lite/python/authoring/authoring.py", line 43, in <module>
#7 2.335 from tensorflow.lite.python import convert
#7 2.335 File "/opt/conda/lib/python3.9/site-packages/tensorflow/lite/python/convert.py", line 33, in <module>
#7 2.335 from tensorflow.lite.python import util
#7 2.335 File "/opt/conda/lib/python3.9/site-packages/tensorflow/lite/python/util.py", line 55, in <module>
#7 2.335 from jax import xla_computation as _xla_computation
#7 2.335 File "/opt/conda/lib/python3.9/site-packages/jax/__init__.py", line 37, in <module>
#7 2.335 from jax import config as _config_module
#7 2.335 File "/opt/conda/lib/python3.9/site-packages/jax/config.py", line 18, in <module>
#7 2.335 from jax._src.config import config
#7 2.335 File "/opt/conda/lib/python3.9/site-packages/jax/_src/config.py", line 27, in <module>
#7 2.335 from jax._src import lib
#7 2.335 File "/opt/conda/lib/python3.9/site-packages/jax/_src/lib/__init__.py", line 101, in <module>
#7 2.335 version = check_jaxlib_version(
#7 2.335 File "/opt/conda/lib/python3.9/site-packages/jax/_src/lib/__init__.py", line 90, in check_jaxlib_version
#7 2.336 raise RuntimeError(msg)
#7 2.336 RuntimeError: jaxlib is version 0.1.75, but this version of jax requires version >= 0.3.2.
#7 ERROR: executor failed running [/bin/sh -c python -c 'import tensorflow']: exit code: 1
Installed packages
#5 [3/5] RUN conda list
#5 sha256:1b4a2bef8e285ea7908ee767af6ddccd2b54750f9f686f79b9cdf637ae8dbacc
#5 1.412 # packages in environment at /opt/conda:
#5 1.412 #
#5 1.412 # Name Version Build Channel
#5 1.412 _libgcc_mutex 0.1 conda_forge conda-forge
#5 1.412 _openmp_mutex 4.5 1_gnu conda-forge
#5 1.412 abseil-cpp 20210324.2 h9c3ff4c_0 conda-forge
#5 1.412 absl-py 1.0.0 pyhd8ed1ab_0 conda-forge
#5 1.412 aiohttp 3.8.1 py39hb9d737c_1 conda-forge
#5 1.412 aiosignal 1.2.0 pyhd8ed1ab_0 conda-forge
#5 1.412 astunparse 1.6.3 pyhd8ed1ab_0 conda-forge
#5 1.412 async-timeout 4.0.2 pyhd8ed1ab_0 conda-forge
#5 1.412 attrs 21.4.0 pyhd8ed1ab_0 conda-forge
#5 1.412 blinker 1.4 py_1 conda-forge
#5 1.412 brotlipy 0.7.0 py39h3811e60_1003 conda-forge
#5 1.412 bzip2 1.0.8 h7f98852_4 conda-forge
#5 1.412 c-ares 1.18.1 h7f98852_0 conda-forge
#5 1.412 ca-certificates 2021.10.8 ha878542_0 conda-forge
#5 1.412 cached-property 1.5.2 hd8ed1ab_1 conda-forge
#5 1.412 cached_property 1.5.2 pyha770c72_1 conda-forge
#5 1.412 cachetools 4.2.4 pyhd8ed1ab_0 conda-forge
#5 1.412 certifi 2021.10.8 py39hf3d152e_2 conda-forge
#5 1.412 cffi 1.15.0 py39h4bc2ebd_0 conda-forge
#5 1.412 charset-normalizer 2.0.12 pyhd8ed1ab_0 conda-forge
#5 1.412 click 8.1.3 py39hf3d152e_0 conda-forge
#5 1.412 colorama 0.4.4 pyh9f0ad1d_0 conda-forge
#5 1.412 conda 4.12.0 py39hf3d152e_0 conda-forge
#5 1.412 conda-package-handling 1.8.0 py39hb9d737c_0 conda-forge
#5 1.412 cryptography 36.0.2 py39hd97740a_0 conda-forge
#5 1.412 frozenlist 1.3.0 py39hb9d737c_1 conda-forge
#5 1.412 gast 0.4.0 pyh9f0ad1d_0 conda-forge
#5 1.412 giflib 5.2.1 h36c2ea0_2 conda-forge
#5 1.412 google-auth 1.35.0 pyh6c4a22f_0 conda-forge
#5 1.412 google-auth-oauthlib 0.4.6 pyhd8ed1ab_0 conda-forge
#5 1.412 google-pasta 0.2.0 pyh8c360ce_0 conda-forge
#5 1.412 grpc-cpp 1.42.0 ha1441d3_1 conda-forge
#5 1.412 grpcio 1.42.0 py39hff7568b_0 conda-forge
#5 1.412 h5py 3.6.0 nompi_py39h7e08c79_100 conda-forge
#5 1.412 hdf5 1.12.1 nompi_h2386368_104 conda-forge
#5 1.412 icu 69.1 h9c3ff4c_0 conda-forge
#5 1.412 idna 3.3 pyhd8ed1ab_0 conda-forge
#5 1.412 importlib-metadata 4.11.3 py39hf3d152e_1 conda-forge
#5 1.412 jax 0.3.7 pyhd8ed1ab_0 conda-forge
#5 1.412 jaxlib 0.1.75 py39hde0f152_0 conda-forge
#5 1.412 jpeg 9e h166bdaf_1 conda-forge
#5 1.412 keras 2.7.0 pyhd8ed1ab_0 conda-forge
#5 1.412 keras-preprocessing 1.1.2 pyhd8ed1ab_0 conda-forge
#5 1.412 keyutils 1.6.1 h166bdaf_0 conda-forge
#5 1.412 krb5 1.19.3 h3790be6_0 conda-forge
#5 1.412 ld_impl_linux-64 2.36.1 hea4e1c9_2 conda-forge
#5 1.412 libblas 3.9.0 14_linux64_openblas conda-forge
#5 1.412 libcblas 3.9.0 14_linux64_openblas conda-forge
#5 1.412 libcurl 7.83.0 h7bff187_0 conda-forge
#5 1.412 libedit 3.1.20191231 he28a2e2_2 conda-forge
#5 1.412 libev 4.33 h516909a_1 conda-forge
#5 1.412 libffi 3.4.2 h7f98852_5 conda-forge
#5 1.412 libgcc-ng 11.2.0 h1d223b6_14 conda-forge
#5 1.412 libgfortran-ng 11.2.0 h69a702a_16 conda-forge
#5 1.412 libgfortran5 11.2.0 h5c6108e_16 conda-forge
#5 1.412 libgomp 11.2.0 h1d223b6_14 conda-forge
#5 1.412 liblapack 3.9.0 14_linux64_openblas conda-forge
#5 1.412 libnghttp2 1.47.0 h727a467_0 conda-forge
#5 1.412 libnsl 2.0.0 h7f98852_0 conda-forge
#5 1.412 libopenblas 0.3.20 pthreads_h78a6416_0 conda-forge
#5 1.412 libpng 1.6.37 h21135ba_2 conda-forge
#5 1.412 libprotobuf 3.19.4 h780b84a_0 conda-forge
#5 1.412 libssh2 1.10.0 ha56f1ee_2 conda-forge
#5 1.412 libstdcxx-ng 11.2.0 he4da1e4_16 conda-forge
#5 1.412 libuuid 2.32.1 h7f98852_1000 conda-forge
#5 1.412 libzlib 1.2.11 h36c2ea0_1013 conda-forge
#5 1.412 markdown 3.3.7 pyhd8ed1ab_0 conda-forge
#5 1.412 multidict 6.0.2 py39hb9d737c_1 conda-forge
#5 1.412 ncurses 6.3 h9c3ff4c_0 conda-forge
#5 1.412 numpy 1.22.3 py39hc58783e_2 conda-forge
#5 1.412 oauthlib 3.2.0 pyhd8ed1ab_0 conda-forge
#5 1.412 openssl 1.1.1o h166bdaf_0 conda-forge
#5 1.412 opt_einsum 3.3.0 pyhd8ed1ab_1 conda-forge
#5 1.412 pip 22.0.4 pyhd8ed1ab_0 conda-forge
#5 1.412 protobuf 3.19.4 py39he80948d_0 conda-forge
#5 1.412 pyasn1 0.4.8 py_0 conda-forge
#5 1.412 pyasn1-modules 0.2.7 py_0 conda-forge
#5 1.412 pycosat 0.6.3 py39h3811e60_1009 conda-forge
#5 1.412 pycparser 2.21 pyhd8ed1ab_0 conda-forge
#5 1.412 pyjwt 2.3.0 pyhd8ed1ab_1 conda-forge
#5 1.412 pyopenssl 22.0.0 pyhd8ed1ab_0 conda-forge
#5 1.412 pysocks 1.7.1 py39hf3d152e_4 conda-forge
#5 1.412 python 3.9.10 h85951f9_2_cpython conda-forge
#5 1.412 python-flatbuffers 2.0 pyhd8ed1ab_0 conda-forge
#5 1.412 python_abi 3.9 2_cp39 conda-forge
#5 1.412 pyu2f 0.1.5 pyhd8ed1ab_0 conda-forge
#5 1.412 re2 2021.11.01 h9c3ff4c_0 conda-forge
#5 1.412 readline 8.1 h46c0cb4_0 conda-forge
#5 1.412 requests 2.27.1 pyhd8ed1ab_0 conda-forge
#5 1.412 requests-oauthlib 1.3.1 pyhd8ed1ab_0 conda-forge
#5 1.412 rsa 4.8 pyhd8ed1ab_0 conda-forge
#5 1.412 ruamel_yaml 0.15.80 py39h3811e60_1006 conda-forge
#5 1.412 scipy 1.8.0 py39hee8e79c_1 conda-forge
#5 1.412 setuptools 60.10.0 py39hf3d152e_0 conda-forge
#5 1.412 six 1.16.0 pyh6c4a22f_0 conda-forge
#5 1.412 snappy 1.1.8 he1b5a44_3 conda-forge
#5 1.412 sqlite 3.37.1 h4ff8645_0 conda-forge
#5 1.412 tensorboard 2.6.0 pyhd8ed1ab_1 conda-forge
#5 1.412 tensorboard-data-server 0.6.0 py39hd97740a_2 conda-forge
#5 1.412 tensorboard-plugin-wit 1.8.1 pyhd8ed1ab_0 conda-forge
#5 1.412 tensorflow 2.7.0 cpu_py39h4655687_0 conda-forge
#5 1.412 tensorflow-base 2.7.0 cpu_py39hf4995fd_0 conda-forge
#5 1.412 tensorflow-estimator 2.7.0 cpu_py39ha241409_0 conda-forge
#5 1.412 termcolor 1.1.0 py_2 conda-forge
#5 1.412 tk 8.6.12 h27826a3_0 conda-forge
#5 1.412 tqdm 4.63.0 pyhd8ed1ab_0 conda-forge
#5 1.412 typing-extensions 4.2.0 hd8ed1ab_1 conda-forge
#5 1.412 typing_extensions 4.2.0 pyha770c72_1 conda-forge
#5 1.412 tzdata 2022a h191b570_0 conda-forge
#5 1.412 urllib3 1.26.9 pyhd8ed1ab_0 conda-forge
#5 1.412 werkzeug 2.1.2 pyhd8ed1ab_1 conda-forge
#5 1.412 wheel 0.37.1 pyhd8ed1ab_0 conda-forge
#5 1.412 wrapt 1.14.1 py39hb9d737c_0 conda-forge
#5 1.412 xz 5.2.5 h516909a_1 conda-forge
#5 1.412 yaml 0.2.5 h7f98852_2 conda-forge
#5 1.412 yarl 1.7.2 py39hb9d737c_2 conda-forge
#5 1.412 zipp 3.8.0 pyhd8ed1ab_0 conda-forge
#5 1.412 zlib 1.2.11 h36c2ea0_1013 conda-forge
Environment info
#6 [4/5] RUN conda info
#6 sha256:e04dec7bc9e71b96d819465250e444e4080f1874b5342685a25c0744c32e910d
#6 0.730
#6 0.730 active environment : None
#6 0.730 user config file : /root/.condarc
#6 0.730 populated config files : /opt/conda/.condarc
#6 0.730 conda version : 4.12.0
#6 0.730 conda-build version : not installed
#6 0.730 python version : 3.9.10.final.0
#6 0.730 virtual packages : __linux=5.10.47=0
#6 0.730 __glibc=2.31=0
#6 0.730 __unix=0=0
#6 0.730 __archspec=1=x86_64
#6 0.730 base environment : /opt/conda (writable)
#6 0.730 conda av data dir : /opt/conda/etc/conda
#6 0.730 conda av metadata url : None
#6 0.730 channel URLs : https://conda.anaconda.org/conda-forge/linux-64
#6 0.730 https://conda.anaconda.org/conda-forge/noarch
#6 0.730 package cache : /opt/conda/pkgs
#6 0.730 /root/.conda/pkgs
#6 0.730 envs directories : /opt/conda/envs
#6 0.730 /root/.conda/envs
#6 0.730 platform : linux-64
#6 0.730 user-agent : conda/4.12.0 requests/2.27.1 CPython/3.9.10 Linux/5.10.47-linuxkit ubuntu/20.04.4 glibc/2.31
#6 0.730 UID:GID : 0:0
#6 0.730 netrc file : None
#6 0.730 offline mode : False
#6 0.730
Use organization level STAGING_BINSTAR_TOKEN in github actions
Instead of setting a repository level token, we can make STAGING_BINSTAR_TOKEN
available to all repos on conda-forge and use that.
Update CI tokens for `jupyterlab-github-feedstock`
Comment:
As suggested over on conda-forge/jupyterlab-github-feedstock#4 (comment), @conda-forge/jupyterlab-github isn't currently uploading after two build-number attempts, due to an invalid token.
Is this the right place to ask? Is there a bot command to streamline this?
Thanks!
checking for valid packages in the CI is currently broken
I am running the command from the CI script to check for valid packages, and I think conda is ignoring CONDA_SUBDIR:
$ CONDA_SUBIDR=win-64 conda search python=3.9
Loading channels: done
# Name Version Build Channel
python 3.9.0 h1821ab9_0_cpython conda-forge
python 3.9.0 h1821ab9_1_cpython conda-forge
python 3.9.0 h1821ab9_2_cpython conda-forge
When looking at anaconda.org, these appear to be the osx-64 packages
conda info
$ conda info
active environment : base
active env location : /Users/beckermr/miniconda3
shell level : 1
user config file : /Users/beckermr/.condarc
populated config files : /Users/beckermr/.condarc
conda version : 4.8.5
conda-build version : 3.20.3
python version : 3.7.8.final.0
virtual packages : __osx=10.14.6
base environment : /Users/beckermr/miniconda3 (writable)
channel URLs : https://conda.anaconda.org/conda-forge/osx-64
https://conda.anaconda.org/conda-forge/noarch
https://repo.anaconda.com/pkgs/main/osx-64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/r/osx-64
https://repo.anaconda.com/pkgs/r/noarch
package cache : /Users/beckermr/miniconda3/pkgs
/Users/beckermr/.conda/pkgs
envs directories : /Users/beckermr/miniconda3/envs
/Users/beckermr/.conda/envs
platform : osx-64
user-agent : conda/4.8.5 requests/2.24.0 CPython/3.7.8 Darwin/18.7.0 OSX/10.14.6
UID:GID : 501:20
netrc file : /Users/beckermr/.netrc
offline mode : False
Enable automerge on this repo
IIUC automerge doesn't currently work here. Would be good to add.
noarch/nbformat-5.1.2-* issue
I wasn't sure how to approach this issue, but from jupyter/nbformat#217, it was shown that the https://anaconda.org/conda-forge/nbformat/files 5.1.2 release was generating inappropriate random ids. We changed 5.1.3 to generate hash ids and yanked the PyPI release. The package isn't broken per se, but it's highly problematic. Would the recourse be to mark it as broken for non-technical reasons?
Should mislabeled `noarch` packages be marked as `broken`?
Comment:
The r-textrecipes-feedstock has been generating `noarch` builds which were actually getting compiled. Effectively, all builds since v0.2.0 (3 years 9 months ago) are really `linux-64`, and this is only being corrected now in conda-forge/r-textrecipes-feedstock#28.
What is the proper solution here? Should we mark these builds as `broken`? Or is there a way to move them into the `linux-64` subdirectory?
Support for (un)archiving feedstocks
Would be useful to have an admin request here for handling archiving/unarchiving feedstocks.
Add ability to turn maintenance branches into tags (and back)
I brought up conda-forge/conda-forge.github.io#1972 in a core call recently, and @jaimergp mentioned that this should be done through admin-requests for traceability reasons.
I'm game to take a shot at implementing this. Is there a dummy feedstock we can use to develop against (i.e. actually execute things like branch deletion)?
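Mechanically, the branch-to-tag conversion is just a few git operations. Here is a sketch run against a throwaway local repo so it is safe to execute; the branch name `1.x` and tag name `archived/1.x` are illustrative, not an agreed convention:

```shell
# Sketch of the branch -> tag mechanics against a throwaway repo.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare origin.git
git clone -q origin.git feedstock && cd feedstock
git config user.email bot@example.com && git config user.name bot
git commit -q --allow-empty -m "initial commit"
git push -q origin HEAD:main HEAD:1.x   # pretend 1.x is a maintenance branch

# Turn the branch into a tag:
git fetch -q origin
git tag archived/1.x origin/1.x         # record the branch tip as a tag
git push -q origin archived/1.x         # publish the tag
git push -q origin --delete 1.x         # then delete the branch

# And back: recreate the branch from the tag.
git push -q origin refs/tags/archived/1.x:refs/heads/1.x
```

The real implementation would of course run against conda-forge feedstock repos via an authenticated token rather than a local bare repo.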
Token reset for nibabel-feedstock
Comment:
Following up on conda-forge/status#137: the nibabel-feedstock is unable to publish its package, with the error "invalid feedstock token".
Token reset for datavzrd-feedstock
Comment:
The datavzrd-feedstock seems to be affected by the token issue (conda-forge/status#137), see the build logs.
As instructed in conda-forge/status#137, this is the request for resetting the token.
mark broken is broken
The action on master is not currently copying packages between labels correctly
mark singularityce > 3.9.5 and singularity > 3.8.6 as broken
Solution to issue cannot be found in the documentation.
- I checked the documentation.
Issue
mark singularityce > 3.9.5 and singularity > 3.8.6 as broken
Using GHA's `concurrency` syntax
Currently we have this logic to prevent multiple runs on `main` for different commits:
admin-requests/.github/workflows/main.yml, lines 14 to 20 in cd36865
Another option would be to use GHA's `concurrency` syntax. Here's an example.
If this seems reasonable, GHA configs in other repos may be able to employ the same strategy.
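For reference, the suggested `concurrency` syntax looks roughly like this (a sketch; the group key groups runs per workflow and ref, and `cancel-in-progress: true` is only appropriate if cancelling superseded runs is acceptable):

```yaml
# Sketch: allow at most one active run per workflow + ref,
# cancelling any in-progress run that a newer commit supersedes.
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true
```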
Adding access control to CI providers and other resources
This comes from conda-forge/conda-forge.github.io#1875
I've been thinking about how to implement opt-in CI access control for Travis and other services we might have in the future. I think a potential solution might be formulated like this:
- We will open a new `resource-access/` directory (or something like that) in this repository.
- This directory will contain a number of files (e.g. TXT files), one per resource where we are providing opt-in mechanisms.
- Each TXT file will simply list which feedstocks should have access to that resource, similar to what we do with the osx-arm64 migration file.
For example:
resource-access/
travis_ci.txt
a_gpu_provider.txt
some_aws_credits.txt
long_running_jobs.txt
# travis_ci.txt
numpy
scipy
# long_running_jobs.txt
tensorflow
pytorch
The workflow would be something like this:
- A feedstock maintainer opens a PR adding their feedstock to the desired service file. Different services might need different application requirements; these could be added as comments at the top of the file or something.
- The core team will review the application and merge if accepted.
- Once merged, automation machinery will get the diff of the new change and register the feedstock for the new service. Each service might require different automation.
- A cronjob will run every now and then to make sure that all listed feedstocks have access to the service, and will fix it if needed.
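The cronjob step could be sketched roughly like this (pure Python; `register_feedstock` and the one-feedstock-per-line TXT layout are assumptions taken from the proposal above, not existing conda-forge tooling):

```python
# Hypothetical reconciliation cronjob for the proposed resource-access/ layout.
from pathlib import Path

def listed_feedstocks(resource_file: Path) -> set:
    """Read feedstock names, skipping blank lines and '#' comments."""
    lines = resource_file.read_text().splitlines()
    return {ln.strip() for ln in lines if ln.strip() and not ln.lstrip().startswith("#")}

def reconcile(resource_dir: Path, currently_registered: dict, register_feedstock):
    """Ensure every feedstock listed for a service is actually registered."""
    for resource_file in sorted(resource_dir.glob("*.txt")):
        service = resource_file.stem            # e.g. "travis_ci"
        wanted = listed_feedstocks(resource_file)
        missing = wanted - currently_registered.get(service, set())
        for feedstock in sorted(missing):
            register_feedstock(service, feedstock)
```

`currently_registered` would come from querying each service's API; each service would plug in its own `register_feedstock` implementation.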
The cronjob might be tricky to set up, so an alternative is to have a three-file set-up for each service, similar to what we do with the broken/not_broken infra:
- `the_service.txt` -- to be modified by the automation infrastructure, not the user; maybe in a different repository to make sure human errors are prevented. This is here for our own record keeping, so we can quickly see who has access to what service.
- `grant-access/the_service.txt` -- the input file to signal the automation to add this feedstock to the service.
- `revoke-access/the_service.txt` -- the input file to signal the automation to remove this feedstock's access.
Thoughts?
add ability to reset feedstock tokens
R-forge token reset not fully successful
Solution to issue cannot be found in the documentation.
- I checked the documentation.
Issue
R-forge-feedstock is currently failing because the packages posted to staging can't be copied over successfully (xref: conda-forge/r-forge-feedstock#8). We've tried resetting the token to clear this up (#741), but the problem persists. This same issue arose last year (#496). I suspect someone will either need to manually create a project on Travis or temporarily hack around this issue like last year.
@beckermr You temporarily worked around this issue last year, do you have any thoughts about how to best proceed?
Installed packages
NA
Environment info
NA
Add static hosted HTML page with PR builder forms
Comment:
elevator pitch
Provide a low-barrier way to make precise, pre-validated admin requests.
motivation
After looking at the GH PR templates feature (suggested in #535), I was unsatisfied with the specificity of the language (as usual, not quite JSON schema).
design ideas
So I wrote a thing that:
- takes in JSON schema
- builds a form with rjsf
- makes a YAML/JSON/TOML of the form
- makes a link for a new PR based on the input, which can include exactly one file
Here's a demo for an outrageously long schema:
https://deathbeds.github.io/jupyak/shaver.html
challenges
The downside: to get the nice UI (dropdown/autocomplete), all the feedstock names would need to be embedded in the schema, e.g.
"feedstocks": {
"type": "array",
"items": {
"type": "string",
"enum": ["aalto-boss", "a-few-others", "zziplib"]
}
}
But this might be something that could be generated in one place...
{
  "$id": "https://conda-forge.org/schema/feedstocks.schema.json",
  "type": "string",
  "enum": ["aalto-boss", "a-few-others", "zziplib"]
}
And then referenced here:
{
  "feedstocks": {
    "type": "array",
    "items": {
      "$ref": "https://conda-forge.org/schema/feedstocks.schema.json"
    }
  }
}
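Generating that shared fragment could be as small as the following sketch (the `$id` URL is taken from the example above and is not a real endpoint):

```python
# Hypothetical generator for the shared feedstock-name schema fragment.
import json

def build_feedstock_schema(feedstock_names,
                           schema_id="https://conda-forge.org/schema/feedstocks.schema.json"):
    """Emit the shared JSON-schema fragment enumerating known feedstock names."""
    return {"$id": schema_id, "type": "string", "enum": sorted(feedstock_names)}

print(json.dumps(build_feedstock_schema(["zziplib", "aalto-boss"]), indent=2))
```

The feedstock list itself could be refreshed from the conda-forge org on a schedule, so every form stays in sync from one source.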
implementation ideas
After the... experience... with pydantic over on conda-smithy, it seems like schema-first design (but perhaps authored in YAML) to get to a well-typed `TypedDict` might be easier and give strictly better validation.
alternatives
- use the semi-decent Issue form to generate PRs
  - the specification language is also not-quite-schema, and not portable to anything else
  - an action would have to re-parse the generated markdown
  - the generated PR is not editable by the original owner without more complexity
- use a bot pidgin grammar
  - these are harder to discover, and don't have autocomplete (other than the bot name)
Cellpose conda package not made by Cellpose authors
Comment:
My apologies if this is the wrong place for this question. Thanks to all of you for your work on conda-forge!
I am one of the authors and developers of Cellpose, which is distributed on PyPI. There is a conda package on conda-forge for Cellpose (link), but we did not make this conda-forge recipe. Was it automatically made from our pip packages somehow? We're worried that, if it's not automatically made, someone else is making these packages and could include malicious or different code.
Thanks for the help,
Carsen Stringer
Is it possible to overwrite a version tag?
Comment:
I've looked through the broken packages docs but am not finding an answer to whether I can overwrite/rename a mistaken package version.
In this case there are no issues with the dependencies; we just got the version tag wrong. This causes issues in scripts that sort calver version strings (see below; we should've published `2023.01.04`, not `2023.1.4`):
pangeo-dask 2022.12.17 hd8ed1ab_0 conda-forge
pangeo-dask 2023.1.4 hd8ed1ab_0 conda-forge
pangeo-dask 2023.01.11 hd8ed1ab_0 conda-forge
pangeo-dask 2023.01.13 hd8ed1ab_0 conda-forge
In this case it would also be ok to delete the `2023.1.4` version entirely, since this is a metapackage that simply pins other package combinations over time...
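The breakage is plain lexicographic ordering: with zero-padded calver, string sort matches chronological order, but the unpadded `2023.1.4` sorts after both January releases:

```python
# String sort vs. chronological order for the versions listed above.
versions = ["2022.12.17", "2023.1.4", "2023.01.11", "2023.01.13"]
print(sorted(versions))
# "2023.1.4" (published 2023-01-04) sorts last, because "0" < "1" byte-wise
```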
feedstock issue: conda-forge/pangeo-dask-feedstock#110
Rename / remove redis-feedstock
I'm trying to add a new recipe for redis (the full package, not the Python wrapper) here; however, the linter is failing because a feedstock for a duplicate (#142) `redis-py` package already exists here: https://github.com/conda-forge/redis-feedstock
Could the archived `redis-feedstock` be renamed / removed? Or is there another preferred path forward here?
New feedstocks not finishing migration due to expired token
Solution to issue cannot be found in the documentation.
- I checked the documentation.
Issue
As noted on gitter, new recipes are not finishing feedstockification; due to great foresight, of course, they are still in `staged-recipes`.
A user helpfully dug through the logs and found messages like:
vsts.exceptions.VstsServiceError: Access Denied: The Personal Access Token used has expired.
Seems like a button push for someone with perms!
Installed packages
CI
Environment info
GHA
use unpatched repodata for checks
It might be useful to use the unpatched repodata for checking packages exist. We would have to pull the labels by hand.
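As a starting point, a sketch of fetching a label's repodata by hand (the URL pattern is the standard anaconda.org channel layout; whether the repodata served there is patched or raw is exactly the question to investigate, so treat this as an assumption, not a solution):

```python
# Hypothetical helper for pulling per-label repodata by hand.
import json
import urllib.request

def repodata_url(channel, subdir, label=None):
    """Build the standard anaconda.org repodata URL for a channel/subdir/label."""
    base = f"https://conda.anaconda.org/{channel}"
    if label:
        base += f"/label/{label}"
    return f"{base}/{subdir}/repodata.json"

def fetch_repodata(channel, subdir, label=None):
    with urllib.request.urlopen(repodata_url(channel, subdir, label)) as resp:
        return json.load(resp)
```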
BUG mixup between .conda and .tar.bz2
There appears to be some weird mixup between .conda and .tar.bz2 in the logic that checks whether artifacts exist when marking packages broken.
xref: #790
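For context: the same build can exist in either package format, differing only in extension, so any existence check has to consider both names. A trivial sketch (the function name is hypothetical):

```python
# Hypothetical helper: both possible artifact paths for one dist.
def artifact_candidates(subdir, dist_name):
    """Return both possible artifact paths for a 'name-version-build' dist."""
    return [f"{subdir}/{dist_name}.tar.bz2", f"{subdir}/{dist_name}.conda"]

# e.g. artifact_candidates("linux-64", "numpy-1.22.3-py39hc58783e_2")
```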
rename to admin-requests
Add GH check to staged-recipes commits linking to conversion log
Recently, `staged-recipes` feedstock conversion was migrated to admin-requests (#542). As a result, it is no longer possible to see on `staged-recipes` why a particular conversion failed, since the status is not updated there. It would be helpful to have the status check on `staged-recipes` link to the conversion jobs run here, to make it easier to understand what went wrong when diagnosing issues.
Rename weave-feedstock
Comment:
A while ago, the `weave` package was split off from `scipy.weave` in maintenance-only mode (in particular, it was Python2-only with no ongoing development). Recently, the package has again seen a bit of development; in particular, it now has experimental support for Python 3. More importantly, it has been renamed to `scipy-weave`, to release the name `weave` on PyPI. For more details, see https://mail.python.org/archives/list/[email protected]/message/AGFIP5TMTIYDZAF6B5WAFYCXJ66LAHDW/
The conda-forge recipe for `weave` still refers to `weave` on PyPI, which is now an unrelated package. I guess the best approach would be to rename the `weave` package to `scipy-weave` on conda-forge as well? I don't know what the correct procedure is in this case – I can open a new PR on staged recipes for `scipy-weave`, but I don't know how to delete/deprecate an existing package. Otherwise I could of course update the package recipe so that it refers to `scipy-weave` on PyPI, but I feel that would only lead to more confusion in the future.
Thanks for your help!
[EDIT: forgot a few links]
Current weave feedstock: https://github.com/conda-forge/weave-feedstock
scipy-weave on PyPI (formerly known as weave): https://pypi.org/project/scipy-weave/
PR that renamed weave to scipy-weave: scipy/weave#18
New, unrelated weave package on PyPI: https://pypi.org/project/weave/
Deleting a single release on anaconda.org for https://anaconda.org/conda-forge/interpret
Comment:
A few days ago I published a new package on conda-forge (https://anaconda.org/conda-forge/interpret). My intention was to make this package a noarch package. For reasons that aren't important to get into here, I made the initial release "win-64" with the intention of putting out a noarch update shortly after the feedstock was created. That noarch update has been released. I've noticed now though that I can't get rid of the "win-64" platform label on https://anaconda.org/conda-forge/interpret
In my local environment I noticed that the conda channels list specific architectures first, before the noarch "platform". I'm concerned that on Windows it might attempt to download the obsolete version before the newer noarch package. Even if that is not the case, this is the kind of thing that feels like it might someday be an issue for someone with a configuration I'm not currently aware could exist.
I think the safest thing to do here would be to delete the initial win-64 specific file on anaconda.org (FILE: win-64/interpret-0.2.7-py38haa244fe_0.conda). At this point nobody will have taken a dependency on that file, and there are very few downloads (20-ish). I imagine all of those have been from automated systems so far and from my own testing.
I did note there is a process to mark packages as "broken" ( https://conda-forge.org/docs/maintainer/updating_pkgs.html#removing-broken-packages ). Is that the right approach here? Would that be sufficient to remove the "win-64" platform label? Outright deletion still feels like a cleaner solution to me given this isn't depended on yet.
Archive weave-feedstock
Please archive https://github.com/conda-forge/weave-feedstock/
The upstream package has been renamed to `scipy-weave`; the package currently published as `weave` on PyPI is unrelated.
See discussion here #766
Add option to require extras in bot `update-grayskull`
Comment:
I am really happy about the `update-grayskull` option in the bot section of the `conda-forge.yml` file:
https://conda-forge.org/docs/maintainer/conda_forge_yml.html#bot
Still, there are a couple of packages where I would prefer to include all extra requirements in the dependencies of the conda-forge package, and to my understanding this is not happening at the moment. The `grayskull` package provides the option `--extras-require-all`, but I am not sure how to activate this for the conda-forge grayskull updates.
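For reference, the existing knob (per the conda-forge.yml docs linked above) is the `inspection` key in the `bot` section; to my knowledge nothing there currently maps to grayskull's `--extras-require-all`:

```yaml
# Existing conda-forge.yml bot section enabling grayskull-based updates;
# there is (as of this issue) no documented key for extras handling.
bot:
  inspection: update-grayskull
```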
Broken `token_reset` pipeline
The last jobs have been failing for a few hours:
Traceback (most recent call last):
File "/usr/share/miniconda3/envs/cf/lib/python3.8/site-packages/conda_smithy/feedstock_tokens.py", line 455, in _register_token
func(user, project, feedstock_token, clobber, *args)
File "/usr/share/miniconda3/envs/cf/lib/python3.8/site-packages/conda_smithy/feedstock_tokens.py", line 658, in add_feedstock_token_to_travis
r.raise_for_status()
File "/usr/share/miniconda3/envs/cf/lib/python3.8/site-packages/requests/models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 403 Client Error: Forbidden for url: https://api.travis-ci.com/repo/24240621/env_var/814bc388-3cc7-452b-87b1-47d270ef0e81
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/share/miniconda3/envs/cf/bin/conda-smithy", line 10, in <module>
sys.exit(main())
File "/usr/share/miniconda3/envs/cf/lib/python3.8/site-packages/conda_smithy/cli.py", line 669, in main
args.subcommand_func(args)
File "/usr/share/miniconda3/envs/cf/lib/python3.8/site-packages/conda_smithy/cli.py", line 826, in __call__
register_feedstock_token_with_providers(
File "/usr/share/miniconda3/envs/cf/lib/python3.8/site-packages/conda_smithy/feedstock_tokens.py", line 520, in register_feedstock_token_with_providers
raise e
File "/usr/share/miniconda3/envs/cf/lib/python3.8/site-packages/conda_smithy/feedstock_tokens.py", line 494, in register_feedstock_token_with_providers
_register_token(
File "/usr/share/miniconda3/envs/cf/lib/python3.8/site-packages/conda_smithy/feedstock_tokens.py", line 464, in _register_token
raise FeedstockTokenError(err_msg)
conda_smithy.feedstock_tokens.FeedstockTokenError: Failed to register feedstock token for conda-forge/mpltoolbox-feedstock on travis for args ()!
failed to reset token for 'mpltoolbox': CalledProcessError(1, ['conda', 'smithy', 'register-feedstock-token', '--without-circle', '--without-drone', '--without-github-actions', '--feedstock_directory', '/tmp/tmpfudbg0je/mpltoolbox-feedstock', '--organization', 'conda-forge', '--token_repo', 'https://x-access-token:${GITHUB_TOKEN}@github.com/conda-forge/feedstock-tokens'])
[main be3d73a] Keeping token_reset/mpltoolbox.txt after failed token reset
Is this an expired token?
cc @beckermr
Typo in PR template
Solution to issue cannot be found in the documentation.
- I checked the documentation. (Not really, I didn't know where to look for this.)
Issue
When opening a pull request, I saw a text that said:
What will happen when a package is marked broken?
* Our bots will add the `broken` label to the package. The `main` label will remain on the package and this is normal.
* Our bots will rebuild our repodata pacthes to remove this package from the repodata.
* In a few hours after the `anaconda.org` CDN picks up the new patches, you will no longer be able to install the package from the `main` channel.
Note that the second bullet point has a typo: "pacthes".
I couldn't find this string in the repository. Where does it come from?
Are broken packages supposed to be removed from main?
Solution to issue cannot be found in the documentation.
- I checked the documentation.
Issue
It seems that the recently marked broken qt-webengine package is being downloaded.
My hunch is that the main label isn't being removed.
https://anaconda.org/conda-forge/qt-webengine/files
Documentation states that they should be removed from the main label.
https://conda-forge.org/docs/orga/guidelines.html?highlight=broken#fixing-broken-packages
Installed packages
qt-webengine
Environment info
mamba
yank setuptools_scm 6.1.0?
It appears setuptools_scm 6.1.0 was yanked from PyPI: https://pypi.org/project/setuptools-scm/#history
Shall we do the same, @conda-forge/core?
Investigate the usage of multiple PR templates
Apparently GitHub allows several PR templates to be selected with the template
query parameter (see docs). It's not as intuitive as the choice UI for issues, but it's a start.
This would allow us to customize the default PR template so it reflects the task needed for broken, archival or token regeneration.
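As a sketch of how such a link could be built (the branch and template file names below are hypothetical; per GitHub's docs, extra PR templates live in `.github/PULL_REQUEST_TEMPLATE/` and are selected via the `template` query parameter on a compare URL):

```python
from urllib.parse import urlencode

# Hypothetical branch and template names, for illustration only.
base = "https://github.com/conda-forge/admin-requests/compare/main...my-branch"
params = urlencode({"quick_pull": 1, "template": "mark_broken.md"})
pr_url = f"{base}?{params}"
print(pr_url)
```

Such links could then be embedded in the README so that each admin task (broken, archival, token reset) opens a PR pre-filled with the right checklist.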
Package built with wrong package name
Comment:
I'm a maintainer of the package pyam (https://github.com/conda-forge/pyam-feedstock), but for reasons dating back to its early days, this package was published as pyam-iamc on PyPI.
I recently played around with grayskull and used it to update the recipe - however, I did not notice that grayskull wasn't aware of the PyPI-vs-conda name inconsistency, and conda-forge/pyam-feedstock#53 changed the package name by mistake.
I fixed this via conda-forge/pyam-feedstock#54, but now there is a duplicate of the package on conda-forge:
I know that you usually do not remove packages from conda-forge, but as this is 1) a silly mistake, 2) a duplicate of an existing package, and 3) not on the platform for more than a few hours, could you remove that package to avoid confusion by our users?
Invalid versions in conda-forge/label/broken lead to installation failure
Basically conda create -n test-env -c conda-forge/label/broken pyside2
will fail with:
InvalidVersionSpec: Invalid version '2.0.0~alpha0': invalid character(s)
This is because, in this line, conda constructs a list of available packages and parses their versions here. This list contains PackageRecords whose version is version="2.0.0~alpha0". This string cannot be parsed according to conda's rules.
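The failure can be reproduced without conda by mimicking its version-character check. The regex below is an assumption based on the error message; conda's actual validation lives in conda.models.version and may differ in detail:

```python
import re

# Simplified mimic of conda's version-string validation: only these
# characters are accepted (an assumption inferred from the
# InvalidVersionSpec "invalid character(s)" error above).
_VERSION_CHECK = re.compile(r"^[\*\.\+!_0-9a-z]+$")

def is_valid_conda_version(version: str) -> bool:
    # conda lowercases versions before parsing, so do the same here.
    return bool(_VERSION_CHECK.match(version.strip().lower()))

print(is_valid_conda_version("2.0.0~alpha0"))  # False: '~' is not allowed
print(is_valid_conda_version("2.0.0a0"))       # True
```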
The packages that are violating the version rules (and will need to be deleted) are:
[
PackageRecord(
_hash=3870681504580325642,
name="pyside2",
version="2.0.0~alpha0",
build="py27_0",
build_number=0,
channel=Channel("conda-forge/label/broken/linux-64"),
subdir="linux-64",
fn="pyside2-2.0.0~alpha0-py27_0.tar.bz2",
md5="934d955394d4fe55df52389c3a24fb0b",
url="https://conda.anaconda.org/conda-forge/label/broken/linux-64/pyside2-2.0.0~alpha0-py27_0.tar.bz2",
arch="x86_64",
platform="linux",
depends=("libgcc", "libxml2", "libxslt", "python 2.7*", "qt 5.6.*"),
license="LGPL3",
size=8482448,
),
PackageRecord(
_hash=3428084316280495246,
name="pyside2",
version="2.0.0~alpha0",
build="py36_0",
build_number=0,
channel=Channel("conda-forge/label/broken/linux-64"),
subdir="linux-64",
fn="pyside2-2.0.0~alpha0-py36_0.tar.bz2",
md5="601efc1d67cda89332fd3e51dc8f1faf",
url="https://conda.anaconda.org/conda-forge/label/broken/linux-64/pyside2-2.0.0~alpha0-py36_0.tar.bz2",
arch="x86_64",
platform="linux",
depends=("libgcc", "libxml2", "libxslt", "python 3.6*", "qt 5.6.*"),
license="LGPL3",
size=8474771,
),
PackageRecord(
_hash=8670771201715925028,
name="pyside2",
version="2.0.0~alpha0",
build="py35_0",
build_number=0,
channel=Channel("conda-forge/label/broken/linux-64"),
subdir="linux-64",
fn="pyside2-2.0.0~alpha0-py35_0.tar.bz2",
md5="741a030b6c25705123c8d7dd8c9e08bf",
url="https://conda.anaconda.org/conda-forge/label/broken/linux-64/pyside2-2.0.0~alpha0-py35_0.tar.bz2",
arch="x86_64",
platform="linux",
depends=("libgcc", "libxml2", "libxslt", "python 3.5*", "qt 5.6.*"),
license="LGPL3",
size=8478530,
),
]
Could these packages be deleted?
Trouble updating environment variables on Travis CI (token reset)
As part of the token reset process, we set environment variables on CI providers. Lately this has been running into 403s on Travis CI. For example (also on CI):
Traceback (most recent call last):
File "/usr/share/miniconda3/envs/cf/lib/python3.8/site-packages/conda_smithy/feedstock_tokens.py", line 455, in _register_token
func(user, project, feedstock_token, clobber, *args)
File "/usr/share/miniconda3/envs/cf/lib/python3.8/site-packages/conda_smithy/feedstock_tokens.py", line 658, in add_feedstock_token_to_travis
r.raise_for_status()
File "/usr/share/miniconda3/envs/cf/lib/python3.8/site-packages/requests/models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 403 Client Error: Forbidden for url: https://api.travis-ci.com/repo/12190770/env_var/8e33ebfd-c0ad-4801-bc53-abb7b0f416ea
Filing this to document and track the issue.
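For reference, the failing call is roughly a Travis API v3 env_var update. A minimal sketch follows; the repo/env-var IDs and token handling are assumptions, and a 403 here typically means the token is expired or lacks access to the repo:

```python
import requests

TRAVIS_API = "https://api.travis-ci.com"

def travis_env_var_url(repo_id, env_var_id):
    # URL shape taken from the 403 in the traceback above.
    return f"{TRAVIS_API}/repo/{repo_id}/env_var/{env_var_id}"

def update_env_var(repo_id, env_var_id, name, value, token):
    """Hedged sketch of the env_var update conda-smithy performs."""
    r = requests.patch(
        travis_env_var_url(repo_id, env_var_id),
        headers={"Travis-API-Version": "3", "Authorization": f"token {token}"},
        json={"env_var.name": name, "env_var.value": value,
              "env_var.public": False},
    )
    r.raise_for_status()  # surfaces exactly the HTTPError shown above
    return r.json()
```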
Scheduled runs are failing
Solution to issue cannot be found in the documentation.
- I checked the documentation.
Issue
Both repodata_patching and create_feedstocks are failing:
Create feedstocks error below. Fixed by conda-forge/staged-recipes#24488
++ python .github/workflows/scripts/create_feedstocks.py
Traceback (most recent call last):
File "/home/runner/work/admin-requests/admin-requests/.github/workflows/scripts/create_feedstocks.py", line 216, in <module>
gh_remaining = print_rate_limiting_info(gh, 'GH_TOKEN')
File "/home/runner/work/admin-requests/admin-requests/.github/workflows/scripts/create_feedstocks.py", line 142, in print_rate_limiting_info
gh_api_reset_time -= datetime.utcnow()
TypeError: can't subtract offset-naive and offset-aware datetimes
Repodata patching error below. Fixed by #860
Traceback (most recent call last):
File "show_diff.py", line 161, in <module>
from gen_patch_json import SUBDIRS
File "/tmp/tmpt0c4w9vu/conda-forge-repodata-patches-feedstock/recipe/gen_patch_json.py", line 22, in <module>
from patch_yaml_utils import (
File "/tmp/tmpt0c4w9vu/conda-forge-repodata-patches-feedstock/recipe/patch_yaml_utils.py", line 21, in <module>
from patch_yaml_model import PatchYaml # noqa
File "/tmp/tmpt0c4w9vu/conda-forge-repodata-patches-feedstock/recipe/patch_yaml_model.py", line 8, in <module>
from typing import Annotated
ImportError: cannot import name 'Annotated' from 'typing'
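The repodata-patching failure comes from `typing.Annotated` only existing on Python 3.9+. A common shim (assuming `typing_extensions` is available on older interpreters; the alternative fix, used in #860, is simply running on a new enough Python) looks like:

```python
try:
    from typing import Annotated  # Python 3.9+
except ImportError:
    # Backport for Python < 3.9; assumes typing_extensions is installed.
    from typing_extensions import Annotated

# Annotated attaches arbitrary metadata to a type without changing it.
PositiveInt = Annotated[int, "must be > 0"]
print(PositiveInt.__metadata__)  # ('must be > 0',)
```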
Installed packages
NA
Environment info
NA
add ability to unmark broken