Comments (7)
Hi @Todiq
Thanks for your feedback.
In fact, during the test stage (test Release & test Debug jobs), I first run `conan cache restore` to load the cache previously created by the matching build job. Unfortunately, that means that if any third-party library was compiled through `--build=missing`, it no longer remains in the initial location it was built in. This prevents anything that has been built in the build folder from running at all.
I am not sure I understand this, or how this can be an issue. In the normal `conan upload` + `conan install` process, the build folders are never uploaded and never installed. Furthermore, the locations and names of the folders don't match either when you install them, compared with the "build" locations and names.
So I suspect there might be something different here. For example, if locally it works because you are relying on the RPATHs from the "build" stage, but you are not using the `VirtualXXXEnv` generators to locate the binary artifacts, then yes, it works in the "build" tree in the local cache, but it will fail in the same way in the cache save/restore flow if you don't use the virtual envs to find some of the dependencies' artifacts, like shared libs.
I think we would need here a more detailed and minimal reproducible example to understand it better and try it out.
> While the `CONAN_HOME` folder would contain both the packages from the caches of Release & Debug, the `cache.sqlite3` would simply be overwritten by the latest one downloaded.
This would be a different issue. If the DB is not correctly storing the result of multiple consecutive `conan cache restore` commands, then something might be failing; this might be a bug. Do you mean that if I do:
- in CONAN_HOME1: `conan create . -s build_type=Release` + `conan cache save ...`
- in CONAN_HOME2: `conan create . -s build_type=Debug` + `conan cache save ...`
- in CONAN_HOME3: `conan cache restore` (from home1)
- in CONAN_HOME3: `conan cache restore` (from home2)
then a `conan list "mypkg:*"` will only list 1 binary, because the second `conan cache restore` removed the entry from the first one? Is this the behavior you are seeing? If you can provide more details to reproduce this one too, that would help a lot.
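The scenario above could be scripted roughly like this (a minimal sketch, assuming a `mypkg` recipe in the current directory; the three homes are just throwaway folders):

```shell
# Build Release and Debug in two separate Conan homes, saving each cache
CONAN_HOME="$PWD/home1" conan create . -s build_type=Release
CONAN_HOME="$PWD/home1" conan cache save "*/*:*" --file=release.tgz

CONAN_HOME="$PWD/home2" conan create . -s build_type=Debug
CONAN_HOME="$PWD/home2" conan cache save "*/*:*" --file=debug.tgz

# Restore both saves, consecutively, into a third clean home
CONAN_HOME="$PWD/home3" conan cache restore release.tgz
CONAN_HOME="$PWD/home3" conan cache restore debug.tgz

# If consecutive restores are handled correctly, this should list two binaries
CONAN_HOME="$PWD/home3" conan list "mypkg:*"
```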
from conan.
Hello @memsharded,
Apologies. It was not very clear indeed.
`conan cache restore` works perfectly for me. I was talking about the case where I don't use `conan cache save` & `conan cache restore` and rely entirely on the built-in GitLab functionality to create artifacts. The `.gitlab-ci.yml` should help illustrate:
```yaml
stages:
  - build
  - test
  - upload

variables:
  CONAN_HOME: "${CI_PROJECT_DIR}/.conan"
  CONAN_LOGIN_USERNAME: "${ARTIFACTORY_USERNAME}"
  CONAN_PASSWORD: "${ARTIFACTORY_TOKEN}"
  BUILD_DIR: "${CI_PROJECT_DIR}/build"
  GIT_DEPTH: 1

.build:
  stage: build
  cache:
    key: ${CI_COMMIT_REF_NAME}
    paths:
      - "${BUILD_DIR}"
      - "${CONAN_HOME}"
  before_script:
    - conan config install "/opt/linux-profiles" --target-folder "profiles"
    - conan remote add mycompany https://artifactory.corp.mycompany.com/artifactory/api/conan/conan --force --insecure
    - conan remove "companypkgs*/*" --confirm
    - conan cache clean --source --download --build --temp
  after_script:
    - conan cache clean --source --download --build --temp
    - conan cache save "*/*:*" --file="${CI_JOB_NAME}.tgz"
  artifacts:
    name: ${CI_JOB_NAME_SLUG}
    when: always
    paths:
      - "${BUILD_DIR}"
      - "${CI_JOB_NAME}.tgz"

.test:
  stage: test
  variables:
    GIT_STRATEGY: none
  script:
    - ls -lA
    - find ${BUILD_DIR} -type f -executable -name "*tests" -exec {} --gtest_output=xml:${CI_PROJECT_DIR}/${CI_JOB_NAME_SLUG}/ \;
  artifacts:
    when: always
    paths:
      - ${CONAN_HOME}
    reports:
      junit: "${CI_JOB_NAME_SLUG}/*.xml"

linux-x86_64-release:
  extends: .build
  script:
    - conan install . --settings:all "arch=x86_64" --settings:all "&:build_type=Release" --profile:all linux --build=missing --remote mycompany
    - conan build . --settings:all "arch=x86_64" --settings:all "&:build_type=Release" --profile:all linux --no-remote
    - conan export-pkg . --settings:all "arch=x86_64" --settings:all "&:build_type=Release" --profile:all linux --no-remote

linux-x86_64-debug:
  extends: .build
  script:
    - conan install . --settings:all "arch=x86_64" --settings:all "&:build_type=Debug" --profile:all linux --build=missing --remote mycompany
    - conan build . --settings:all "arch=x86_64" --settings:all "&:build_type=Debug" --profile:all linux --no-remote
    - conan export-pkg . --settings:all "arch=x86_64" --settings:all "&:build_type=Debug" --profile:all linux --no-remote

test-x86_64-release:
  extends: .test
  needs:
    - "linux-x86_64-release"
  before_script:
    - conan cache restore "linux-x86_64-release.tgz"

test-x86_64-debug:
  extends: .test
  needs:
    - "linux-x86_64-debug"
  before_script:
    - conan cache restore "linux-x86_64-debug.tgz"

upload-linux:
  stage: upload
  variables:
    GIT_STRATEGY: none
  dependencies:
    - "linux-x86_64-release"
    - "linux-x86_64-debug"
  before_script:
    - conan remote add mycompany https://artifactory.corp.mycompany.com/artifactory/api/conan/conan --force --insecure
    - conan remote login mycompany
  script:
    - conan cache restore "linux-x86_64-release.tgz"
    - conan cache restore "linux-x86_64-debug.tgz"
    - conan upload "*" --remote mycompany --check --confirm
```
All variables specified in the `variables:` section will be applied to all jobs, which means the Conan home will always be set to `.conan` (relative to the current job's directory).
In the build jobs, I compile not only my project, but also the third-party dependencies, if required. I decoupled `conan install` & `conan build` because it is more explicit this way, in my opinion.
If I understood correctly, the third parties that needed a recompile for whatever reason would be in `${CONAN_HOME}/p/b/`.
Once `conan install` finishes gathering and building them, I run `conan build` to build my own code. This creates the `build` folder in my project's tree. My library and the executable that tests it are in that folder. Thanks to RPATH, `ldd` correctly locates the libraries, even without activating any virtualenv. Everything works fine as long as nothing moves. One can also fine-tune the RPATH in the `CMakeLists.txt`.
Then, for the future upload job, I export my project to the cache thanks to `conan export-pkg`. However, I do not want to upload it just yet; I first want to test it.
To do so, I want to pass my `build` folder from job to job (either Release or Debug). I first `conan cache save ...`, then create a zip (GitLab calls it an artifact) containing both my `build` folder (for test purposes) and the just-archived Conan cache.
In the `test-x86_64-release` job, I want to test the output of `linux-x86_64-release` (same thing for the debug scenario). The artifact previously created is automatically unzipped. I end up with both the `build` folder (and everything under it) and the Conan cache of `linux-x86_64-release`. The `CONAN_HOME` is set to the correct location.
Since `conan cache save ...` does not keep the `${CONAN_HOME}/p/b/` folder, but automatically moves (exports?) the third parties to `${CONAN_HOME}/p/`, the lib and the test that are in the `build` folder cannot find the third parties anymore. Even activating the venv through `conanbuild.sh` & `conanrun.sh` does not help, since the PATH would also point to the previous location of the third parties.
Unless I misunderstood, the combo of `conan cache save` & `conan cache restore` is not compatible with this workflow.
However, instead of using these commands, I could simply include the whole `CONAN_HOME` in the artifact of each job alongside the `build` folder (though I would rather use the dedicated commands for their lightweight output). This way, everything in the `build` folders in the test jobs properly finds its dependencies.
Now, this does not work for the upload job. When gathering the artifacts from both the Release & Debug jobs (the GitLab way), the `CONAN_HOME` folders merge correctly, but files sharing the same name are simply overwritten by the latest one downloaded. I found no way to change this behaviour. Thus, even though my project is correctly present in the merged folder in both the Release & Debug versions, the database only reports one of them.
`conan cache save` + consecutive `conan cache restore`, however, work perfectly fine for the upload job, since the database is properly updated.
So, I simply cannot have the best of both worlds. Either I do not use `conan cache save` + `conan cache restore`, but then uploading everything at once (third parties & my project) is not possible (I want to avoid race conditions); or I use these commands, but then testing my project won't work.
Please let me know if this is clearer.
Some hints:

- It is not necessary to `conan cache save "*/*:*" --file="${CI_JOB_NAME}.tgz"`. You can capture the `graph.json` of the `install` operations, create a package list out of it, and call `conan cache save --list=xxxx` to save exclusively the packages you want, not the full Conan cache.
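That graph-to-package-list flow could look roughly like this (a sketch; the file names are illustrative):

```shell
# Capture the dependency graph of the install as JSON
conan install . --build=missing --format=json > graph.json

# Turn the graph into a package list (recent Conan 2 versions also accept
# --graph-binaries=build to select only the packages built in this run)
conan list --graph=graph.json --format=json > pkglist.json

# Save only the listed packages instead of the whole cache
conan cache save --list=pkglist.json --file="${CI_JOB_NAME}.tgz"
```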
> Since `conan cache save ...` does not keep the `${CONAN_HOME}/p/b/` folder, but automatically moves (exports?) the third parties to `${CONAN_HOME}/p/`, the lib and the test that are in the `build` folder cannot find the third parties anymore. Even when activating the venv through `conanbuild.sh` & `conanrun.sh`, since the PATH would point to the previous location of the third parties as well.
>
> Unless I misunderstood, the combo of `conan cache save` & `conan cache restore` is not compatible with this workflow.
Once the packages have been `conan cache restore`d, it would be necessary to run `conan install` to generate new, updated `conanbuild.sh` files pointing to the just-restored packages. With this, the restored packages should be usable, in the same way an `upload` + `install` makes them usable.
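In this CI, that would mean something like the following in the test jobs (a sketch; the generators folder location and the `mytests` binary name depend on the project layout and are assumptions here):

```shell
# Restore the packages saved by the matching build job
conan cache restore "linux-x86_64-release.tgz"

# Re-run the install with the same settings/profiles as the build job, so the
# environment scripts are regenerated to point at the restored package folders
conan install . --settings:all "arch=x86_64" --settings:all "&:build_type=Release" \
    --profile:all linux --no-remote

# Source the regenerated run environment before launching the tests
. build/Release/generators/conanrun.sh
./build/Release/mytests
```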
> Now, this does not work for the upload job. When gathering the artifacts from both the Release & Debug jobs (the GitLab way), the `CONAN_HOME` folders merge correctly, but files sharing the same name are simply overwritten by the latest one downloaded. I found no way to change this behaviour. Thus, even though my project is correctly present in the merged folder in both the Release & Debug versions, the database only reports one of them.
The other alternative would be to use a secondary server-side repo to accumulate the binaries, and then promote them when ready. If you create a "build" server repo, you can upload the Debug and Release binaries when their respective jobs finish. Then, instead of uploading them to the permanent repo after the `conan cache restore`, you run a "promotion": basically a copy of both the Debug and Release binaries from the "build" repo to the "develop" repo. There is an extension command in the extensions repo that allows promoting a package list easily.
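A promotion could be sketched like this (assuming two remotes named `build` and `develop` are configured; the `*` pattern is illustrative):

```shell
# Each build job uploads its freshly built binaries to the transient repo
conan upload "*" -r=build --confirm

# Promotion: download everything to promote from "build", capturing the
# package list, then re-upload that list to the permanent repo (a client-side
# copy; for Artifactory, the conan-extensions repo provides `conan art:promote`
# to do a server-side copy instead)
conan download "*" -r=build --format=json > promote.json
conan upload --list=promote.json -r=develop --confirm
```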
Thanks for your answer!
> Once the packages have been `conan cache restore`d, it would be necessary to `conan install` to generate new, updated `conanbuild.sh` files, that will be pointing to the just restored packages.
That indeed works this way. However, I wanted to exclude any `conan install` command from the test jobs, making them easier to read and understand. It would force me to pass the exact same arguments that I did during the build jobs, which is not particularly clean or easy. Plus, if I end up calling `conan install`, I might as well upload only the third parties in the build jobs and fetch them from Artifactory in the test jobs.
I forgot to mention that running a script that loads the environment (`conanrun.sh` here) does not propagate to the current shell that runs the subsequent commands (at least in GitLab CI).
> The other alternative would be to use a secondary server-side repo to accumulate the binaries, and then promote them when ready.
I have not tested that just yet, but it feels unnecessarily complex in my opinion and needs extra setup on my side, when it could simply be managed through the existing commands.
The initial proposal was to maximise the usefulness of the `conan cache save` & `conan cache restore` combo, since it was specifically made to help during CI and avoid any complex setup.
> The initial proposal was to maximise the usefulness of the `conan cache save` & `conan cache restore` combo, since it was specifically made to help during CI and avoid any complex setup.
I am afraid it is not possible to handle the "build" folder. `conan cache save/restore` is designed and intended to manage packages, not the temporary build folders. There are issues in the addressing and creation of these folders that make this quite challenging, so this isn't planned.
> I have not tested that just yet, but it feels unnecessarily complex in my opinion and needs extra setup on my side, when it could simply be managed through the existing commands.
This might require some extra setup, but it is not unnecessarily complex; it is already kind of a standard approach to building and CI. Especially the moment different configurations (like Windows and Linux) are to be built in parallel, it might even be way easier than trying to transfer packages with `conan cache save/restore`, and it can be way more convenient at scale, when the build of the dependency graph is parallelized according to `conan graph build-order`.
So, to summarize, I am sorry, but unfortunately it doesn't seem possible to extend `conan cache save/restore` to include the build folders. It would be necessary to use the same flow as `conan upload` + `conan install`; in this case that means a new `conan install` will be necessary after the `conan cache restore`.
Thanks for the explanation and especially the hints! I will try them out in order to achieve what I want.
Thanks to you for the feedback!