mamba-org / micromamba-releases
Micromamba executables mirrored from conda-forge as Github releases
install.sh prints [Y/n], which means Y should be the default, but it uses CONDA_FORGE_YES="${CONDA_FORGE_YES:-no}", so an empty answer is treated as "no".
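For reference, a minimal sketch of a prompt whose default actually matches the [Y/n] hint; the function name and prompt text are illustrative, only the CONDA_FORGE_YES variable name comes from the script:

```shell
#!/bin/sh
# Sketch: with a "[Y/n]" hint, an empty answer must resolve to "yes".
prompt_default_yes() {
    printf 'Do you wish to initialize? [Y/n] '
    read -r answer || answer=""
    # ${answer:-yes} makes "yes" the default, consistent with the capital Y.
    CONDA_FORGE_YES="${answer:-yes}"
}
```
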
The provided .sha256 files are always hashes of the micromamba binaries, never of the archives. In other words, micromamba-linux-64.tar.bz2.sha256 is not the sha256sum of micromamba-linux-64.tar.bz2; relatedly, micromamba-linux-64.sha256 and micromamba-linux-64.tar.bz2.sha256 are identical. Assuming this isn't an artifact of how anaconda.org operates, I think these should be the hashes of the archive files instead of the binaries they contain.
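Given the current behavior, verification has to run against the extracted binary rather than the archive. A hedged sketch (verify_sha256 is a hypothetical helper; the file names are the release asset names):

```shell
#!/bin/sh
# Compare a file against a .sha256 file whose first field is the hex digest.
# With today's assets, the file to check is bin/micromamba after
# `tar -xjf micromamba-linux-64.tar.bz2 bin/micromamba`, not the archive.
verify_sha256() {
    file="$1"
    sumfile="$2"
    expected="$(awk '{print $1}' "$sumfile")"
    actual="$(sha256sum "$file" | cut -d' ' -f1)"
    [ "$expected" = "$actual" ]
}
# verify_sha256 bin/micromamba micromamba-linux-64.sha256
```
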
The workflow file is invalid as described in #12 (comment)
There was a problem with the Test micro.mamba.pm workflow, please investigate.
critical libmamba The package "/tmp/linux-64/xxxx.tar.bz3" is not available for the specified platform
Falling back to 1.4.9 everything works, so some change in 1.5.0 caused this.
I install a package from conda-forge and then install locally built plugins for that package using micromamba, but I cannot install the local package since the latest release.
We recently started having build failures after 1.5.1-1 was released, because we use https://github.com/mamba-org/setup-micromamba with version set to the default latest on linux-aarch64 runners. The errors are 404s like:
Install micromamba
Error: Error installing micromamba: Unexpected HTTP response: 404
Error: Unexpected HTTP response: 404
They seem to be caused by there being no linux-aarch64 files in the 1.5.1-1 assets list. Could you please add the linux-aarch64 files to the release?
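A possible fail-fast check before installing, sketched under the assumption that release assets follow the micromamba-&lt;platform&gt; naming visible on the release pages (asset_url is a hypothetical helper):

```shell
#!/bin/sh
# Build the expected asset URL for a given release tag and platform, so a
# missing platform (e.g. linux-aarch64 in 1.5.1-1) can be detected up front.
asset_url() {
    version="$1"
    platform="$2"
    printf 'https://github.com/mamba-org/micromamba-releases/releases/download/%s/micromamba-%s\n' \
        "$version" "$platform"
}
# Probe without downloading; a 404 means the asset is missing:
# curl -fsIL "$(asset_url 1.5.1-1 linux-aarch64)" >/dev/null || echo "asset missing (404)"
```
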
Invoke-Expression ((Invoke-WebRequest -Uri https://micro.mamba.pm/install.ps1).Content)
can be simplified to
irm https://micro.mamba.pm/install.ps1 | iex
where irm -> Invoke-RestMethod and iex -> Invoke-Expression are built-in aliases.
xref mamba-org/mamba#2536
Who can do this?
In the current version of install.ps1 (946ea16), although the installation script asks the user if they want to initialize using a custom prefix, this variable is not used during the actual initialization process. This can lead to results that are inconsistent with the user's expectations after installation.
See https://github.com/mamba-org/micromamba-releases/blame/946ea168f610f3d8f7cdb25c12561ba286a59d75/install.ps1#L41 and below.
Right now https://micro.mamba.pm/install.sh is an old version of https://micromamba.pfx.dev/install.sh. Can we redirect to the new place? @wolfv
From readme:
On Windows, the executable micromamba.exe is installed into $Env:LocalAppData\micromamba\micromamba.exe.
Shouldn't it honour the MAMBA_ROOT_PREFIX environment variable if it's defined?
My shared hosting provider uses a pretty locked-down shell. I got the following output when trying to install python using micromamba. Perhaps there are things micromamba is looking for in /bin or /usr/bin that I don't have? However, the first errors I'm seeing are about not being able to create files/folders in ~/lib, which is odd. I set the permissions on ~/lib to 775.
Transaction starting
critical libmamba Can't create 'lib/libz.so.1'
[garbled]-3.45.2-h2797004_0.conda extraction failed
error libmamba Error when extracting package: std::bad_alloc
[garbled].0.1-hd590300_0.conda extraction failed
error libmamba Error when extracting package: std::bad_alloc
critical libmamba Can't create 'bin/bzcmp'
critical libmamba Can't create 'lib/libquadmath.so.0'
error libmamba Error opening for reading "/home3/arielbal/micromamba/pkgs/libzlib-1.2.13-hd590300_5/info/index.json": No such file or directory
error libmamba Error opening for reading "/home3/arielbal/micromamba/pkgs/bzip2-1.0.8-hd590300_5/info/index.json": No such file or directory
[garbled]d590300_5.conda extraction failed
Any ideas for what the problems are or possible workarounds?
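One low-tech probe that may help narrow this down, assuming the failures are plain permission or ownership problems on the shared host (check_writable is a hypothetical helper; mode 775 only helps if your user or group owns the directory):

```shell
#!/bin/sh
# Check whether the current account can actually create files in a
# directory, independent of what micromamba reports.
check_writable() {
    if touch "$1/.mm_write_test" 2>/dev/null; then
        rm -f "$1/.mm_write_test"
        echo "writable"
    else
        echo "not writable"
    fi
}
# check_writable "$HOME/lib"
```
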
Here is my environment.yml file:
name:
channels:
  - conda-forge
  - pytorch
  - nvidia
dependencies:
  - python >= 3.8, <3.9
  - pytorch
  - pytorch-cuda=11.8
  - torchvision
  - pip
  - pip:
      - ultralytics
      - pillow
      - numpy
      - pandas
      - git+https://github.com/openai/CLIP.git
When running the following command:
micromamba env create -f environment.yml -p ./.conda
it fails with
Installing pip packages: ultralytics, pillow, numpy, pandas, git+https://github.com/openai/CLIP.git
critical libmamba Cannot activate, prefix does not exist at: '{ SNIP }\micromambaenv\envs.conda'
critical libmamba pip failed to install packages
It seems to be trying to find an environment called .conda instead of using the environment I have locally in my project. I'm using micromamba version 1.5.6 on Windows.
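A possible workaround sketch, assuming the pip step mis-resolves the relative prefix ./.conda as an environment name (absolutize is a hypothetical helper): pass an absolute prefix instead.

```shell
#!/bin/sh
# Turn a possibly-relative prefix into an absolute one before handing it
# to micromamba, so every later step resolves the same path.
absolutize() {
    case "$1" in
        /*) printf '%s\n' "$1" ;;
        *)  printf '%s/%s\n' "$PWD" "${1#./}" ;;
    esac
}
# micromamba env create -f environment.yml -p "$(absolutize ./.conda)"
```
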
I'm working on an open-source repo that uses micromamba to create/configure environments for running unit tests, but the latest release candidate seems to be causing a command parsing error. The unit tests (a job in a GHA workflow) run daily and succeeded 2 days ago but failed 14 hours ago despite no changes in between.
micromamba env create --yes --file environment.yml
In case helpful, environment.yml: https://github.com/BCG-X-Official/artkit/blob/1.0.x/environment.yml
critical libmamba Error parsing version "". Empty version.
Full error log: https://github.com/BCG-X-Official/artkit/actions/runs/9967061757/job/27570489821
If 2.0.0rc0-1 truly is the culprit, is there a way to prevent release candidates from getting pulled via curl -Ls https://micro.mamba.pm/api/micromamba/linux-64/latest?
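Until that's answered, pinning an explicit version in place of latest should avoid pulling pre-releases; a sketch assuming the micro.mamba.pm API accepts a version segment where latest goes (the pinned version below is illustrative):

```shell
#!/bin/sh
# Pin a known-good release instead of "latest" so pre-releases such as
# 2.0.0rc0-1 are never pulled.
version="1.5.8"   # illustrative pin; choose the release you have tested
url="https://micro.mamba.pm/api/micromamba/linux-64/${version}"
# curl -Ls "$url" | tar -xvj bin/micromamba
```
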
Micromamba: ps unknown option
Error using recommended micromamba install script on Windows Git Bash:
"${SHELL}" <(curl -L micro.mamba.pm/install.sh)
ps: unknown option -- o
Indeed, the ps installed in my Git Bash does not have -o. The installation continues and appears to work, though.
$ ps -h
Usage: ps [-aefls] [-u UID] [-p PID]
Report process status
-a, --all show processes of all users
-e, --everyone show processes of all users
-f, --full show process uids, ppids
-h, --help output usage information and exit
-l, --long show process uids, ppids, pgids, winpids
-p, --process show information for specified PID
-s, --summary show process summary
-u, --user list processes owned by UID
-V, --version output version information and exit
-W, --windows show windows as well as cygwin processes
With no options, ps outputs the long format by default
As of: https://github.com/mamba-org/micromamba-releases/releases/tag/1.5.3-0
Windows artifacts (.exe extension)

I have problems activating the environment by default within a container.
Is micromamba using a different installation directory than conda?
Bootstrap: docker
From: mambaorg/micromamba:1.5.8

%files
    environment.yaml /environment.yaml
    requirements.txt /requirements.txt

%post
    apt-get update && apt-get install -y libopenmpi-dev curl wget vim watch procps ncdu tree
    micromamba create -n __apptainer__ && \
        micromamba install -n __apptainer__ --file environment.yaml && \
        micromamba clean --all --yes
    # Update PATH to include micromamba binaries and other environment variables
    echo 'export PATH="/usr/bin/micromamba:$PATH"' >> $SINGULARITY_ENVIRONMENT
    # Add micromamba shell hook and activation to Singularity environment script
    echo 'micromamba shell init --shell bash' >> $SINGULARITY_ENVIRONMENT
    echo '/usr/bin/micromamba activate __apptainer__' >> $SINGULARITY_ENVIRONMENT

# Run script
srun apptainer exec \
    --nv \
    --env-file ~/.env \
    -B /home/:/home/ \
    $APPTAINER_SIF \
    python script.py
source: open /opt/conda/bin/activate: no such file or directory
critical libmamba Shell not initialized
'micromamba' is running as a subprocess and can't modify the parent shell.
Thus you must initialize your shell before using activate and deactivate.
To initialize the current shell, run:
$ eval "$(micromamba shell hook --shell )"
and then activate or deactivate with:
$ micromamba activate
To automatically initialize all future () shells, run:
$ micromamba shell init --shell --root-prefix=~/micromamba
If your shell was already initialized, reinitialize your shell with:
$ micromamba shell reinit --shell
Otherwise, this may be an issue. In the meantime you can run commands. See:
$ micromamba run --help
Supported shells are {bash, zsh, csh, xonsh, cmd.exe, powershell, fish}.
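Inside a container, activation can often be sidestepped entirely with micromamba run, which needs no shell initialization; a sketch (run_in_env is a hypothetical wrapper, __apptainer__ is the environment name from the definition file above):

```shell
#!/bin/sh
# Avoid `micromamba activate` (which needs an initialized parent shell)
# by prefixing commands with `micromamba run` instead.
run_in_env() {
    micromamba run -n __apptainer__ "$@"
}
# Example (not executed here):
# run_in_env python script.py
```
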
2.0.0rc0-1 seems to treat --root-prefix differently than prior versions. The code below used to work, but now breaks with the latest release.
Scenario
micromamba create --channel conda-forge --name Python3.11 --root-prefix ~/micromamba --yes "python~=3.11.0"
micromamba activate Python3.11
Result
critical libmamba Cannot activate, prefix does not exist at: '/home/user/.local/share/mamba/envs/Python3.11'
Expected
The use of --root-prefix would cause activation to work as expected (as it did with previous versions).
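A possible workaround sketch, assuming the regression is that --root-prefix given at create time no longer influences later activate calls: export MAMBA_ROOT_PREFIX so both commands resolve environments under the same root (the micromamba invocations are left commented).

```shell
#!/bin/sh
# With the root prefix in the environment, create and activate agree on
# where envs live, instead of one command defaulting to ~/.local/share/mamba.
export MAMBA_ROOT_PREFIX="$HOME/micromamba"
# micromamba create --channel conda-forge --name Python3.11 --yes "python~=3.11.0"
# micromamba activate Python3.11
```
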