
arcticdb-feedstock's Introduction

About arcticdb-feedstock

Feedstock license: BSD-3-Clause

Home: https://arcticdb.io/

Package license: BUSL-1.1

Summary: ArcticDB is a high performance, serverless DataFrame database built for the Python Data Science ecosystem.

Development: https://github.com/man-group/ArcticDB

Documentation: https://docs.arcticdb.io

ArcticDB is a high performance, serverless DataFrame database built for the Python Data Science ecosystem. Launched in March 2023, it is the successor to Arctic. ArcticDB offers an intuitive Python-centric API enabling you to read and write Pandas DataFrames to S3 or LMDB utilising a fast C++ data-processing and compression engine.
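
As a quick illustration of that API (a minimal sketch; the storage URI, library and symbol names below are arbitrary):

import pandas as pd
from arcticdb import Arctic

# Connect to a local LMDB-backed instance; an S3 URI such as
# "s3://HOST:BUCKET?..." works the same way.
ac = Arctic("lmdb:///tmp/arcticdb_demo")
lib = ac.get_library("demo", create_if_missing=True)

df = pd.DataFrame({"price": [1.0, 2.0, 3.0]})
lib.write("my_symbol", df)          # persist the DataFrame
print(lib.read("my_symbol").data)   # read it back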

Current build status

Azure

Build variants (the per-variant status badges are not reproduced here):

  • linux_64_numpy1.22python3.10.____cpython
  • linux_64_numpy1.22python3.8.____cpython
  • linux_64_numpy1.22python3.9.____cpython
  • linux_64_numpy1.23python3.11.____cpython
  • osx_64_numpy1.22python3.10.____cpython
  • osx_64_numpy1.22python3.9.____cpython
  • osx_64_numpy1.23python3.11.____cpython
  • osx_arm64_numpy1.22python3.10.____cpython
  • osx_arm64_numpy1.22python3.9.____cpython
  • osx_arm64_numpy1.23python3.11.____cpython

Current release info

(Anaconda.org badges, not reproduced here: Conda Recipe, Conda Downloads, Conda Version, Conda Platforms.)

Installing arcticdb

Installing arcticdb from the conda-forge channel can be achieved by adding conda-forge to your channels with:

conda config --add channels conda-forge
conda config --set channel_priority strict

Once the conda-forge channel has been enabled, arcticdb can be installed with conda:

conda install arcticdb

or with mamba:

mamba install arcticdb

It is possible to list all of the versions of arcticdb available on your platform with conda:

conda search arcticdb --channel conda-forge

or with mamba:

mamba search arcticdb --channel conda-forge

Alternatively, mamba repoquery may provide more information:

# Search all versions available on your platform:
mamba repoquery search arcticdb --channel conda-forge

# List packages depending on `arcticdb`:
mamba repoquery whoneeds arcticdb --channel conda-forge

# List dependencies of `arcticdb`:
mamba repoquery depends arcticdb --channel conda-forge

About conda-forge

Powered by NumFOCUS

conda-forge is a community-led conda channel of installable packages. In order to provide high-quality builds, the process has been automated into the conda-forge GitHub organization. The conda-forge organization contains one repository for each of the installable packages. Such a repository is known as a feedstock.

A feedstock is made up of a conda recipe (the instructions on what and how to build the package) and the necessary configurations for automatic building using freely available continuous integration services. Thanks to the awesome service provided by Azure, GitHub, CircleCI, AppVeyor, Drone, and TravisCI, it is possible to build and upload installable packages to the conda-forge anaconda.org channel for Linux, Windows and OSX.

To manage the continuous integration and simplify feedstock maintenance, conda-smithy has been developed. Using the conda-forge.yml within this repository, it is possible to re-render all of this feedstock's supporting files (e.g. the CI configuration files) with conda smithy rerender.
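
For example, a maintainer might re-render locally with (a sketch, assuming conda-smithy is installed in the base environment):

conda install -n base -c conda-forge conda-smithy
conda smithy rerender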

For more information please check the conda-forge documentation.

Terminology

feedstock - the conda recipe (raw material), supporting scripts and CI configuration.

conda-smithy - the tool which helps orchestrate the feedstock. Its primary use is in the construction of the CI .yml files and in simplifying the management of many feedstocks.

conda-forge - the place where the feedstock and smithy live and work to produce the finished article (built conda distributions).

Updating arcticdb-feedstock

If you would like to improve the arcticdb recipe or build a new package version, please fork this repository and submit a PR. Upon submission, your changes will be run on the appropriate platforms to give the reviewer an opportunity to confirm that the changes result in a successful build. Once merged, the recipe will be re-built and uploaded automatically to the conda-forge channel, whereupon the built conda packages will be available for everybody to install and use from the conda-forge channel. Note that all branches in the conda-forge/arcticdb-feedstock are immediately built and any created packages are uploaded, so PRs should be based on branches in forks and branches in the main repository should only be used to build distinct package versions.

In order to produce a uniquely identifiable distribution:

  • If the version of a package is not being increased, please add or increase the build/number.
  • If the version of a package is being increased, please remember to return the build/number back to 0, as illustrated in the sketch below.
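
A hypothetical excerpt from recipe/meta.yaml (field values here are illustrative, assuming the standard conda-build recipe layout):

{% set version = "4.2.1" %}

package:
  name: arcticdb
  version: {{ version }}

build:
  number: 0  # increase for a rebuild of the same version; reset to 0 when bumping the version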

arcticdb-feedstock's People

Contributors

alexowens90, conda-forge-admin, conda-forge-curator[bot], derthorsten, github-actions[bot], h-vetinari, ivodd, jamesmunro, jjerphan, joe-iddon, johanmabille, mehertz, muhammadhamzasajjad, phoebusm, poodlewars, regro-cf-autotick-bot, vasil-pashov


arcticdb-feedstock's Issues

Run the Python tests for Azure Blob Storage

Describe the bug

The tests for Azure, which use Azurite in the feedstock's containerized environment, are currently failing.

The failure appears to happen during library creation and access.

Relevant log excerpts:

Azurite Blob service is starting at http://127.0.0.1:43121
Azurite Blob service is successfully listening at http://127.0.0.1:43121
Azurite Queue service is starting at http://127.0.0.1:0
Azurite Queue service is successfully listening at http://127.0.0.1:38037
Azurite Table service is starting at http://127.0.0.1:0
Azurite Table service is successfully listening at http://127.0.0.1:36899
2024-01-11 16:11:27,337 - INFO - Request URL: 'http://127.0.0.1:43121/devstoreaccount1/testbucket1?restype=REDACTED'
Request method: 'GET'
Request headers:
    'x-ms-version': 'REDACTED'
    'Accept': 'application/xml'
    'User-Agent': 'azsdk-python-storage-blob/12.19.0 Python/3.9.18 (Linux-6.6.9-200.fc39.x86_64-x86_64-with-glibc2.17)'
    'x-ms-date': 'REDACTED'
    'x-ms-client-request-id': '12e3dcf4-b09c-11ee-b8b3-0242ac110002'
    'Authorization': 'REDACTED'
No body was attached to the request
2024-01-11 16:11:27,340 - DEBUG - Starting new HTTP connection (1): 127.0.0.1:43121
2024-01-11 16:11:27,365 - DEBUG - http://127.0.0.1:43121 "GET /devstoreaccount1/testbucket1?restype=container HTTP/1.1" 404 None
2024-01-11 16:11:27,366 - INFO - Response status: 404
Response headers:
    'Server': 'Azurite-Blob/3.29.0'
    'x-ms-error-code': 'ContainerNotFound'
    'x-ms-request-id': '3bc0ea98-6518-4fdf-b3da-992c49af8ddf'
    'content-type': 'application/xml'
    'Date': 'Thu, 11 Jan 2024 16:11:27 GMT'
    'Connection': 'keep-alive'
    'Keep-Alive': 'REDACTED'
    'Transfer-Encoding': 'chunked'
2024-01-11 16:11:27,367 - INFO - Request URL: 'http://127.0.0.1:43121/devstoreaccount1/testbucket1?restype=REDACTED'
Request method: 'PUT'
Request headers:
    'x-ms-version': 'REDACTED'
    'Accept': 'application/xml'
    'User-Agent': 'azsdk-python-storage-blob/12.19.0 Python/3.9.18 (Linux-6.6.9-200.fc39.x86_64-x86_64-with-glibc2.17)'
    'x-ms-date': 'REDACTED'
    'x-ms-client-request-id': '12e876b0-b09c-11ee-b8b3-0242ac110002'
    'Authorization': 'REDACTED'
No body was attached to the request
2024-01-11 16:11:27,371 - DEBUG - http://127.0.0.1:43121 "PUT /devstoreaccount1/testbucket1?restype=container HTTP/1.1" 201 0
2024-01-11 16:11:27,371 - INFO - Response status: 201
Response headers:
    'Server': 'Azurite-Blob/3.29.0'
    'etag': '"0x21652EFD9EDD8C0"'
    'last-modified': 'Thu, 11 Jan 2024 16:11:27 GMT'
    'x-ms-client-request-id': '12e876b0-b09c-11ee-b8b3-0242ac110002'
    'x-ms-request-id': 'd3805824-0548-4795-9741-822f84f862ef'
    'x-ms-version': 'REDACTED'
    'Date': 'Thu, 11 Jan 2024 16:11:27 GMT'
    'Connection': 'keep-alive'
    'Keep-Alive': 'REDACTED'
    'Content-Length': '0'
ERROR2024-01-11 16:11:27,481 - DEBUG - Starting new HTTP connection (1): localhost:60315
2024-01-11 16:11:27,508 - INFO - 127.0.0.1 - - [11/Jan/2024 16:11:27] "POST /moto-api/reset HTTP/1.1" 200 -
2024-01-11 16:11:27,508 - DEBUG - http://localhost:60315 "POST /moto-api/reset HTTP/1.1" 200 16
Killing Azurite

==================================== ERRORS ====================================
__ ERROR at setup of test_library_creation_deletion[Azure-EncodingVersion.V1] __

request = <SubRequest 'arctic_client' for <Function test_library_creation_deletion[Azure-EncodingVersion.V1]>>
moto_s3_uri_incl_bucket = 's3://localhost:test_bucket_0?access=awd&secret=awd&port=60315'
tmp_path = PosixPath('/tmp/pytest-of-conda/pytest-0/test_library_creation_deletion0')
encoding_version = <EncodingVersion.V1: 0>

    @pytest.fixture(
        scope="function",
        params=[
            "S3",
            "LMDB",
            "MEM",
            pytest.param("Azure", marks=AZURE_TESTS_MARK),
            pytest.param("Mongo", marks=MONGO_TESTS_MARK),
            pytest.param("Real_S3", marks=REAL_S3_TESTS_MARK),
        ],
    )
    def arctic_client(request, moto_s3_uri_incl_bucket, tmp_path, encoding_version):
        if request.param == "S3":
            ac = Arctic(moto_s3_uri_incl_bucket, encoding_version)
        elif request.param == "Azure":
>           ac = Arctic(request.getfixturevalue("azurite_azure_uri_incl_bucket"), encoding_version)

python/tests/conftest.py:240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeho/lib/python3.9/site-packages/arcticdb/arctic.py:181: in __init__
    self._library_manager = LibraryManager(self._library_adapter.config_library)
../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeho/lib/python3.9/site-packages/arcticdb/adapters/azure_library_adapter.py:85: in config_library
    lib = NativeVersionStore.create_store_from_config(
../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeho/lib/python3.9/site-packages/arcticdb/version_store/_store.py:264: in create_store_from_config
    lib = cls.create_lib_from_lib_config(lib_cfg, env, open_mode)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

lib_cfg = lib_desc {
  name: "_arctic_cfg"
  storage_ids: "_arctic_cfg_store"
  version {
  }
}
storage_by_id {
  key: "_arctic_...VErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://127.0.0.1:43121/devstoreaccount1*\013_arctic_cfg"
    }
  }
}

env = 'local', open_mode = <OpenMode.DELETE: 7>

    @staticmethod
    def create_lib_from_lib_config(lib_cfg, env, open_mode):
        envs_cfg = _env_config_from_lib_config(lib_cfg, env)
        cfg_resolver = _create_mem_config_resolver(envs_cfg)
        lib_idx = _LibraryIndex.create_from_resolver(env, cfg_resolver)
>       return lib_idx.get_library(lib_cfg.lib_desc.name, _OpenMode(open_mode))
E       arcticdb_ext.exceptions.InternalException: std::length_error(basic_string::_M_create)

I cannot reproduce the error locally in a non-containerized environment.

#141 aims to reproduce, understand, and eventually solve this issue.

Steps/Code to Reproduce

Build ArcticDB using the feedstock and conda-forge's Docker image with the Azure tests selected, e.g. using #141.
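
Locally, that is roughly the following (a sketch using the feedstock's standard build-locally.py helper; the variant name should match one of the configs listed in the build table above):

python build-locally.py linux_64_numpy1.22python3.9.____cpython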

Expected Results

All the Azure tests should pass in the containerized environment.

OS, Python Version and ArcticDB Version

Python: all versions
ArcticDB: 4.2.1

Backend storage used

Azure

Additional Context

Full Trace
+ pytest -s python/tests -vvv -k 'test_library_creation_deletion and Azure' --maxfail=1
============================= test session starts ==============================
platform linux -- Python 3.9.18, pytest-7.4.4, pluggy-1.3.0 -- $PREFIX/bin/python3.9
cachedir: .pytest_cache
hypothesis profile 'dev' -> database=DirectoryBasedExampleDatabase('$SRC_DIR/.hypothesis/examples')
rootdir: $SRC_DIR
plugins: anyio-4.2.0, hypothesis-6.72.4, cpp-2.5.0, rerunfailures-13.0, timeout-2.2.0, xdist-3.5.0
collecting ... collected 3764 items / 3762 deselected / 2 selected

python/tests/integration/arcticdb/test_arctic.py::test_library_creation_deletion[Azure-EncodingVersion.V1] 2024-01-11 16:11:24,841 - INFO - WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
 * Running on all addresses (0.0.0.0)
 * Running on http://127.0.0.1:60315
 * Running on http://172.17.0.2:60315
2024-01-11 16:11:24,841 - INFO - Press CTRL+C to quit
2024-01-11 16:11:25,044 - DEBUG - Starting new HTTP connection (1): localhost:60315
2024-01-11 16:11:25,181 - INFO - 127.0.0.1 - - [11/Jan/2024 16:11:25] "GET / HTTP/1.1" 200 -
2024-01-11 16:11:25,181 - DEBUG - http://localhost:60315 "GET / HTTP/1.1" 200 205
2024-01-11 16:11:25,182 - DEBUG - Changing event name from creating-client-class.iot-data to creating-client-class.iot-data-plane
2024-01-11 16:11:25,183 - DEBUG - Changing event name from before-call.apigateway to before-call.api-gateway
2024-01-11 16:11:25,184 - DEBUG - Changing event name from request-created.machinelearning.Predict to request-created.machine-learning.Predict
2024-01-11 16:11:25,184 - DEBUG - Changing event name from before-parameter-build.autoscaling.CreateLaunchConfiguration to before-parameter-build.auto-scaling.CreateLaunchConfiguration
2024-01-11 16:11:25,184 - DEBUG - Changing event name from before-parameter-build.route53 to before-parameter-build.route-53
2024-01-11 16:11:25,185 - DEBUG - Changing event name from request-created.cloudsearchdomain.Search to request-created.cloudsearch-domain.Search
2024-01-11 16:11:25,185 - DEBUG - Changing event name from docs.*.autoscaling.CreateLaunchConfiguration.complete-section to docs.*.auto-scaling.CreateLaunchConfiguration.complete-section
2024-01-11 16:11:25,186 - DEBUG - Changing event name from before-parameter-build.logs.CreateExportTask to before-parameter-build.cloudwatch-logs.CreateExportTask
2024-01-11 16:11:25,186 - DEBUG - Changing event name from docs.*.logs.CreateExportTask.complete-section to docs.*.cloudwatch-logs.CreateExportTask.complete-section
2024-01-11 16:11:25,186 - DEBUG - Changing event name from before-parameter-build.cloudsearchdomain.Search to before-parameter-build.cloudsearch-domain.Search
2024-01-11 16:11:25,186 - DEBUG - Changing event name from docs.*.cloudsearchdomain.Search.complete-section to docs.*.cloudsearch-domain.Search.complete-section
2024-01-11 16:11:25,188 - DEBUG - Loading JSON file: $PREFIX/lib/python3.9/site-packages/botocore/data/endpoints.json
2024-01-11 16:11:25,258 - DEBUG - Loading JSON file: $PREFIX/lib/python3.9/site-packages/botocore/data/sdk-default-configuration.json
2024-01-11 16:11:25,258 - DEBUG - Event choose-service-name: calling handler <function handle_service_name_alias at 0x7f0acf425a60>
2024-01-11 16:11:25,264 - DEBUG - Loading JSON file: $PREFIX/lib/python3.9/site-packages/botocore/data/s3/2006-03-01/service-2.json
2024-01-11 16:11:25,274 - DEBUG - Loading JSON file: $PREFIX/lib/python3.9/site-packages/botocore/data/s3/2006-03-01/endpoint-rule-set-1.json
2024-01-11 16:11:25,276 - DEBUG - Loading JSON file: $PREFIX/lib/python3.9/site-packages/botocore/data/partitions.json
2024-01-11 16:11:25,277 - DEBUG - Event creating-client-class.s3: calling handler <function add_generate_presigned_post at 0x7f0acf491d30>
2024-01-11 16:11:25,277 - DEBUG - Event creating-client-class.s3: calling handler <function lazy_call.<locals>._handler at 0x7f0acb66b5e0>
2024-01-11 16:11:25,293 - DEBUG - Event creating-client-class.s3: calling handler <function add_generate_presigned_url at 0x7f0acf491af0>
2024-01-11 16:11:25,295 - DEBUG - Setting s3 timeout as (60, 60)
2024-01-11 16:11:25,297 - DEBUG - Loading JSON file: $PREFIX/lib/python3.9/site-packages/botocore/data/_retry.json
2024-01-11 16:11:25,297 - DEBUG - Registering retry handlers for service: s3
2024-01-11 16:11:25,297 - DEBUG - Registering S3 region redirector handler
2024-01-11 16:11:25,297 - DEBUG - Registering S3Express Identity Resolver
2024-01-11 16:11:25,298 - DEBUG - Event before-parameter-build.s3.CreateBucket: calling handler <function validate_bucket_name at 0x7f0acf3c7160>
2024-01-11 16:11:25,298 - DEBUG - Event before-parameter-build.s3.CreateBucket: calling handler <function remove_bucket_from_url_paths_from_model at 0x7f0acf3c8f70>
2024-01-11 16:11:25,298 - DEBUG - Event before-parameter-build.s3.CreateBucket: calling handler <bound method S3RegionRedirectorv2.annotate_request_context of <botocore.utils.S3RegionRedirectorv2 object at 0x7f0ac9e37c10>>
2024-01-11 16:11:25,298 - DEBUG - Event before-parameter-build.s3.CreateBucket: calling handler <bound method S3ExpressIdentityResolver.inject_signing_cache_key of <botocore.utils.S3ExpressIdentityResolver object at 0x7f0ac9e37c40>>
2024-01-11 16:11:25,298 - DEBUG - Event before-parameter-build.s3.CreateBucket: calling handler <function generate_idempotent_uuid at 0x7f0acf428f70>
2024-01-11 16:11:25,298 - DEBUG - Event before-endpoint-resolution.s3: calling handler <function customize_endpoint_resolver_builtins at 0x7f0acf3cb160>
2024-01-11 16:11:25,298 - DEBUG - Event before-endpoint-resolution.s3: calling handler <bound method S3RegionRedirectorv2.redirect_from_cache of <botocore.utils.S3RegionRedirectorv2 object at 0x7f0ac9e37c10>>
2024-01-11 16:11:25,298 - DEBUG - Calling endpoint provider with parameters: {'Bucket': 'test_bucket_0', 'Region': 'us-east-1', 'UseFIPS': False, 'UseDualStack': False, 'Endpoint': 'http://localhost:60315', 'ForcePathStyle': True, 'Accelerate': False, 'UseGlobalEndpoint': True, 'DisableAccessPoints': True, 'DisableMultiRegionAccessPoints': False, 'UseArnRegion': True, 'UseS3ExpressControlEndpoint': True}
2024-01-11 16:11:25,299 - DEBUG - Endpoint provider result: http://localhost:60315/test_bucket_0
2024-01-11 16:11:25,299 - DEBUG - Selecting from endpoint provider's list of auth schemes: "sigv4". User selected auth scheme is: "None"
2024-01-11 16:11:25,299 - DEBUG - Selected auth type "v4" as "v4" with signing context params: {'region': 'us-east-1', 'signing_name': 's3', 'disableDoubleEncoding': True}
2024-01-11 16:11:25,299 - DEBUG - Event before-call.s3.CreateBucket: calling handler <function add_expect_header at 0x7f0acf3c74c0>
2024-01-11 16:11:25,299 - DEBUG - Event before-call.s3.CreateBucket: calling handler <bound method S3ExpressIdentityResolver.apply_signing_cache_key of <botocore.utils.S3ExpressIdentityResolver object at 0x7f0ac9e37c40>>
2024-01-11 16:11:25,299 - DEBUG - Event before-call.s3.CreateBucket: calling handler <function add_recursion_detection_header at 0x7f0acf428c10>
2024-01-11 16:11:25,299 - DEBUG - Event before-call.s3.CreateBucket: calling handler <function inject_api_version_header_if_needed at 0x7f0acf3c8820>
2024-01-11 16:11:25,299 - DEBUG - Making request for OperationModel(name=CreateBucket) with params: {'url_path': '', 'query_string': {}, 'method': 'PUT', 'headers': {'User-Agent': 'Boto3/1.34.16 md/Botocore#1.34.16 ua/2.0 os/linux#6.6.9-200.fc39.x86_64 md/arch#x86_64 lang/python#3.9.18 md/pyimpl#CPython cfg/retry-mode#legacy Botocore/1.34.16'}, 'body': b'', 'auth_path': '/test_bucket_0/', 'url': 'http://localhost:60315/test_bucket_0', 'context': {'client_region': 'us-east-1', 'client_config': <botocore.config.Config object at 0x7f0aca161760>, 'has_streaming_input': False, 'auth_type': 'v4', 's3_redirect': {'redirected': False, 'bucket': 'test_bucket_0', 'params': {'Bucket': 'test_bucket_0'}}, 'S3Express': {'bucket_name': 'test_bucket_0'}, 'signing': {'region': 'us-east-1', 'signing_name': 's3', 'disableDoubleEncoding': True}, 'endpoint_properties': {'authSchemes': [{'disableDoubleEncoding': True, 'name': 'sigv4', 'signingName': 's3', 'signingRegion': 'us-east-1'}]}}}
2024-01-11 16:11:25,299 - DEBUG - Event request-created.s3.CreateBucket: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x7f0aca1613a0>>
2024-01-11 16:11:25,299 - DEBUG - Event choose-signer.s3.CreateBucket: calling handler <bound method ClientCreator._default_s3_presign_to_sigv2 of <botocore.client.ClientCreator object at 0x7f0acb231c40>>
2024-01-11 16:11:25,300 - DEBUG - Event choose-signer.s3.CreateBucket: calling handler <function set_operation_specific_signer at 0x7f0acf428e50>
2024-01-11 16:11:25,300 - DEBUG - Event before-sign.s3.CreateBucket: calling handler <function remove_arn_from_signing_path at 0x7f0acf3cb0d0>
2024-01-11 16:11:25,300 - DEBUG - Event before-sign.s3.CreateBucket: calling handler <bound method S3ExpressIdentityResolver.resolve_s3express_identity of <botocore.utils.S3ExpressIdentityResolver object at 0x7f0ac9e37c40>>
2024-01-11 16:11:25,300 - DEBUG - Calculating signature using v4 auth.
2024-01-11 16:11:25,300 - DEBUG - CanonicalRequest:
PUT
/test_bucket_0

host:localhost:60315
x-amz-content-sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
x-amz-date:20240111T161125Z

host;x-amz-content-sha256;x-amz-date
e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
2024-01-11 16:11:25,300 - DEBUG - StringToSign:
AWS4-HMAC-SHA256
20240111T161125Z
20240111/us-east-1/s3/aws4_request
88f10e2824fa8d5941b93fb87fc31db19b584bd7a7e662f1dd3d745394d8a681
2024-01-11 16:11:25,300 - DEBUG - Signature:
5c491a8ec593f5c9880a693222d757fd9578bd61e56ffb24d4c88173c03f29c3
2024-01-11 16:11:25,300 - DEBUG - Event request-created.s3.CreateBucket: calling handler <function add_retry_headers at 0x7f0acf3c8ee0>
2024-01-11 16:11:25,300 - DEBUG - Sending http request: <AWSPreparedRequest stream_output=False, method=PUT, url=http://localhost:60315/test_bucket_0, headers={'User-Agent': b'Boto3/1.34.16 md/Botocore#1.34.16 ua/2.0 os/linux#6.6.9-200.fc39.x86_64 md/arch#x86_64 lang/python#3.9.18 md/pyimpl#CPython cfg/retry-mode#legacy Botocore/1.34.16', 'X-Amz-Date': b'20240111T161125Z', 'X-Amz-Content-SHA256': b'e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855', 'Authorization': b'AWS4-HMAC-SHA256 Credential=awd/20240111/us-east-1/s3/aws4_request, SignedHeaders=host;x-amz-content-sha256;x-amz-date, Signature=5c491a8ec593f5c9880a693222d757fd9578bd61e56ffb24d4c88173c03f29c3', 'amz-sdk-invocation-id': b'521f43d5-b81a-4f43-b03b-0e288032f55f', 'amz-sdk-request': b'attempt=1', 'Content-Length': '0'}>
2024-01-11 16:11:25,300 - DEBUG - Event before-send.s3.CreateBucket: calling handler <moto.core.botocore_stubber.BotocoreStubber object at 0x7f0acecb1d00>
2024-01-11 16:11:25,301 - DEBUG - Starting new HTTP connection (1): localhost:60315
2024-01-11 16:11:25,304 - INFO - 127.0.0.1 - - [11/Jan/2024 16:11:25] "PUT /test_bucket_0 HTTP/1.1" 200 -
2024-01-11 16:11:25,304 - DEBUG - http://localhost:60315 "PUT /test_bucket_0 HTTP/1.1" 200 167
2024-01-11 16:11:25,304 - DEBUG - Response headers: {'Server': 'Werkzeug/3.0.1 Python/3.9.18', 'Date': 'Thu, 11 Jan 2024 16:11:25 GMT', 'x-amzn-requestid': 'sVdP3BGOw7kNS7Yj7raeG5TRj3LZCwuMMP8u8Ck8qcyCew3kfBg9', 'Content-Type': 'text/html; charset=utf-8', 'Content-Length': '167', 'Access-Control-Allow-Origin': '*', 'Connection': 'close'}
2024-01-11 16:11:25,304 - DEBUG - Response body:
b'<CreateBucketResponse xmlns="http://s3.amazonaws.com/doc/2006-03-01"><CreateBucketResponse><Bucket>test_bucket_0</Bucket></CreateBucketResponse></CreateBucketResponse>'
2024-01-11 16:11:25,305 - DEBUG - Event needs-retry.s3.CreateBucket: calling handler <botocore.retryhandler.RetryHandler object at 0x7f0ac9e37b50>
2024-01-11 16:11:25,305 - DEBUG - No retry needed.
2024-01-11 16:11:25,305 - DEBUG - Event needs-retry.s3.CreateBucket: calling handler <bound method S3RegionRedirectorv2.redirect_from_error of <botocore.utils.S3RegionRedirectorv2 object at 0x7f0ac9e37c10>>
Azurite Blob service is starting at http://127.0.0.1:43121
Azurite Blob service is successfully listening at http://127.0.0.1:43121
Azurite Queue service is starting at http://127.0.0.1:0
Azurite Queue service is successfully listening at http://127.0.0.1:38037
Azurite Table service is starting at http://127.0.0.1:0
Azurite Table service is successfully listening at http://127.0.0.1:36899
2024-01-11 16:11:27,337 - INFO - Request URL: 'http://127.0.0.1:43121/devstoreaccount1/testbucket1?restype=REDACTED'
Request method: 'GET'
Request headers:
    'x-ms-version': 'REDACTED'
    'Accept': 'application/xml'
    'User-Agent': 'azsdk-python-storage-blob/12.19.0 Python/3.9.18 (Linux-6.6.9-200.fc39.x86_64-x86_64-with-glibc2.17)'
    'x-ms-date': 'REDACTED'
    'x-ms-client-request-id': '12e3dcf4-b09c-11ee-b8b3-0242ac110002'
    'Authorization': 'REDACTED'
No body was attached to the request
2024-01-11 16:11:27,340 - DEBUG - Starting new HTTP connection (1): 127.0.0.1:43121
2024-01-11 16:11:27,365 - DEBUG - http://127.0.0.1:43121 "GET /devstoreaccount1/testbucket1?restype=container HTTP/1.1" 404 None
2024-01-11 16:11:27,366 - INFO - Response status: 404
Response headers:
    'Server': 'Azurite-Blob/3.29.0'
    'x-ms-error-code': 'ContainerNotFound'
    'x-ms-request-id': '3bc0ea98-6518-4fdf-b3da-992c49af8ddf'
    'content-type': 'application/xml'
    'Date': 'Thu, 11 Jan 2024 16:11:27 GMT'
    'Connection': 'keep-alive'
    'Keep-Alive': 'REDACTED'
    'Transfer-Encoding': 'chunked'
2024-01-11 16:11:27,367 - INFO - Request URL: 'http://127.0.0.1:43121/devstoreaccount1/testbucket1?restype=REDACTED'
Request method: 'PUT'
Request headers:
    'x-ms-version': 'REDACTED'
    'Accept': 'application/xml'
    'User-Agent': 'azsdk-python-storage-blob/12.19.0 Python/3.9.18 (Linux-6.6.9-200.fc39.x86_64-x86_64-with-glibc2.17)'
    'x-ms-date': 'REDACTED'
    'x-ms-client-request-id': '12e876b0-b09c-11ee-b8b3-0242ac110002'
    'Authorization': 'REDACTED'
No body was attached to the request
2024-01-11 16:11:27,371 - DEBUG - http://127.0.0.1:43121 "PUT /devstoreaccount1/testbucket1?restype=container HTTP/1.1" 201 0
2024-01-11 16:11:27,371 - INFO - Response status: 201
Response headers:
    'Server': 'Azurite-Blob/3.29.0'
    'etag': '"0x21652EFD9EDD8C0"'
    'last-modified': 'Thu, 11 Jan 2024 16:11:27 GMT'
    'x-ms-client-request-id': '12e876b0-b09c-11ee-b8b3-0242ac110002'
    'x-ms-request-id': 'd3805824-0548-4795-9741-822f84f862ef'
    'x-ms-version': 'REDACTED'
    'Date': 'Thu, 11 Jan 2024 16:11:27 GMT'
    'Connection': 'keep-alive'
    'Keep-Alive': 'REDACTED'
    'Content-Length': '0'
ERROR2024-01-11 16:11:27,481 - DEBUG - Starting new HTTP connection (1): localhost:60315
2024-01-11 16:11:27,508 - INFO - 127.0.0.1 - - [11/Jan/2024 16:11:27] "POST /moto-api/reset HTTP/1.1" 200 -
2024-01-11 16:11:27,508 - DEBUG - http://localhost:60315 "POST /moto-api/reset HTTP/1.1" 200 16
Killing Azurite


==================================== ERRORS ====================================
__ ERROR at setup of test_library_creation_deletion[Azure-EncodingVersion.V1] __

request = <SubRequest 'arctic_client' for <Function test_library_creation_deletion[Azure-EncodingVersion.V1]>>
moto_s3_uri_incl_bucket = 's3://localhost:test_bucket_0?access=awd&secret=awd&port=60315'
tmp_path = PosixPath('/tmp/pytest-of-conda/pytest-0/test_library_creation_deletion0')
encoding_version = <EncodingVersion.V1: 0>

    @pytest.fixture(
        scope="function",
        params=[
            "S3",
            "LMDB",
            "MEM",
            pytest.param("Azure", marks=AZURE_TESTS_MARK),
            pytest.param("Mongo", marks=MONGO_TESTS_MARK),
            pytest.param("Real_S3", marks=REAL_S3_TESTS_MARK),
        ],
    )
    def arctic_client(request, moto_s3_uri_incl_bucket, tmp_path, encoding_version):
        if request.param == "S3":
            ac = Arctic(moto_s3_uri_incl_bucket, encoding_version)
        elif request.param == "Azure":
>           ac = Arctic(request.getfixturevalue("azurite_azure_uri_incl_bucket"), encoding_version)

python/tests/conftest.py:240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeho/lib/python3.9/site-packages/arcticdb/arctic.py:181: in __init__
    self._library_manager = LibraryManager(self._library_adapter.config_library)
../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeho/lib/python3.9/site-packages/arcticdb/adapters/azure_library_adapter.py:85: in config_library
    lib = NativeVersionStore.create_store_from_config(
../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeho/lib/python3.9/site-packages/arcticdb/version_store/_store.py:264: in create_store_from_config
    lib = cls.create_lib_from_lib_config(lib_cfg, env, open_mode)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

lib_cfg = lib_desc {
  name: "_arctic_cfg"
  storage_ids: "_arctic_cfg_store"
  version {
  }
}
storage_by_id {
  key: "_arctic_...VErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://127.0.0.1:43121/devstoreaccount1*\013_arctic_cfg"
    }
  }
}

env = 'local', open_mode = <OpenMode.DELETE: 7>

    @staticmethod
    def create_lib_from_lib_config(lib_cfg, env, open_mode):
        envs_cfg = _env_config_from_lib_config(lib_cfg, env)
        cfg_resolver = _create_mem_config_resolver(envs_cfg)
        lib_idx = _LibraryIndex.create_from_resolver(env, cfg_resolver)
>       return lib_idx.get_library(lib_cfg.lib_desc.name, _OpenMode(open_mode))
E       arcticdb_ext.exceptions.InternalException: std::length_error(basic_string::_M_create)

../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeho/lib/python3.9/site-packages/arcticdb/version_store/_store.py:272: InternalException
------------------------------ Captured log setup ------------------------------
DEBUG    urllib3.connectionpool:connectionpool.py:246 Starting new HTTP connection (1): localhost:60315
DEBUG    urllib3.connectionpool:connectionpool.py:474 http://localhost:60315 "GET / HTTP/1.1" 200 205
DEBUG    botocore.hooks:hooks.py:482 Changing event name from creating-client-class.iot-data to creating-client-class.iot-data-plane
DEBUG    botocore.hooks:hooks.py:482 Changing event name from before-call.apigateway to before-call.api-gateway
DEBUG    botocore.hooks:hooks.py:482 Changing event name from request-created.machinelearning.Predict to request-created.machine-learning.Predict
DEBUG    botocore.hooks:hooks.py:482 Changing event name from before-parameter-build.autoscaling.CreateLaunchConfiguration to before-parameter-build.auto-scaling.CreateLaunchConfiguration
DEBUG    botocore.hooks:hooks.py:482 Changing event name from before-parameter-build.route53 to before-parameter-build.route-53
DEBUG    botocore.hooks:hooks.py:482 Changing event name from request-created.cloudsearchdomain.Search to request-created.cloudsearch-domain.Search
DEBUG    botocore.hooks:hooks.py:482 Changing event name from docs.*.autoscaling.CreateLaunchConfiguration.complete-section to docs.*.auto-scaling.CreateLaunchConfiguration.complete-section
DEBUG    botocore.hooks:hooks.py:482 Changing event name from before-parameter-build.logs.CreateExportTask to before-parameter-build.cloudwatch-logs.CreateExportTask
DEBUG    botocore.hooks:hooks.py:482 Changing event name from docs.*.logs.CreateExportTask.complete-section to docs.*.cloudwatch-logs.CreateExportTask.complete-section
DEBUG    botocore.hooks:hooks.py:482 Changing event name from before-parameter-build.cloudsearchdomain.Search to before-parameter-build.cloudsearch-domain.Search
DEBUG    botocore.hooks:hooks.py:482 Changing event name from docs.*.cloudsearchdomain.Search.complete-section to docs.*.cloudsearch-domain.Search.complete-section
DEBUG    botocore.loaders:loaders.py:180 Loading JSON file: $PREFIX/lib/python3.9/site-packages/botocore/data/endpoints.json
DEBUG    botocore.loaders:loaders.py:180 Loading JSON file: $PREFIX/lib/python3.9/site-packages/botocore/data/sdk-default-configuration.json
DEBUG    botocore.hooks:hooks.py:238 Event choose-service-name: calling handler <function handle_service_name_alias at 0x7f0acf425a60>
DEBUG    botocore.loaders:loaders.py:180 Loading JSON file: $PREFIX/lib/python3.9/site-packages/botocore/data/s3/2006-03-01/service-2.json
DEBUG    botocore.loaders:loaders.py:180 Loading JSON file: $PREFIX/lib/python3.9/site-packages/botocore/data/s3/2006-03-01/endpoint-rule-set-1.json
DEBUG    botocore.loaders:loaders.py:180 Loading JSON file: $PREFIX/lib/python3.9/site-packages/botocore/data/partitions.json
DEBUG    botocore.hooks:hooks.py:238 Event creating-client-class.s3: calling handler <function add_generate_presigned_post at 0x7f0acf491d30>
DEBUG    botocore.hooks:hooks.py:238 Event creating-client-class.s3: calling handler <function lazy_call.<locals>._handler at 0x7f0acb66b5e0>
DEBUG    botocore.hooks:hooks.py:238 Event creating-client-class.s3: calling handler <function add_generate_presigned_url at 0x7f0acf491af0>
DEBUG    botocore.endpoint:endpoint.py:408 Setting s3 timeout as (60, 60)
DEBUG    botocore.loaders:loaders.py:180 Loading JSON file: $PREFIX/lib/python3.9/site-packages/botocore/data/_retry.json
DEBUG    botocore.client:client.py:285 Registering retry handlers for service: s3
DEBUG    botocore.utils:utils.py:1728 Registering S3 region redirector handler
DEBUG    botocore.utils:utils.py:1669 Registering S3Express Identity Resolver
DEBUG    botocore.hooks:hooks.py:238 Event before-parameter-build.s3.CreateBucket: calling handler <function validate_bucket_name at 0x7f0acf3c7160>
DEBUG    botocore.hooks:hooks.py:238 Event before-parameter-build.s3.CreateBucket: calling handler <function remove_bucket_from_url_paths_from_model at 0x7f0acf3c8f70>
DEBUG    botocore.hooks:hooks.py:238 Event before-parameter-build.s3.CreateBucket: calling handler <bound method S3RegionRedirectorv2.annotate_request_context of <botocore.utils.S3RegionRedirectorv2 object at 0x7f0ac9e37c10>>
DEBUG    botocore.hooks:hooks.py:238 Event before-parameter-build.s3.CreateBucket: calling handler <bound method S3ExpressIdentityResolver.inject_signing_cache_key of <botocore.utils.S3ExpressIdentityResolver object at 0x7f0ac9e37c40>>
DEBUG    botocore.hooks:hooks.py:238 Event before-parameter-build.s3.CreateBucket: calling handler <function generate_idempotent_uuid at 0x7f0acf428f70>
DEBUG    botocore.hooks:hooks.py:238 Event before-endpoint-resolution.s3: calling handler <function customize_endpoint_resolver_builtins at 0x7f0acf3cb160>
DEBUG    botocore.hooks:hooks.py:238 Event before-endpoint-resolution.s3: calling handler <bound method S3RegionRedirectorv2.redirect_from_cache of <botocore.utils.S3RegionRedirectorv2 object at 0x7f0ac9e37c10>>
DEBUG    botocore.regions:regions.py:498 Calling endpoint provider with parameters: {'Bucket': 'test_bucket_0', 'Region': 'us-east-1', 'UseFIPS': False, 'UseDualStack': False, 'Endpoint': 'http://localhost:60315', 'ForcePathStyle': True, 'Accelerate': False, 'UseGlobalEndpoint': True, 'DisableAccessPoints': True, 'DisableMultiRegionAccessPoints': False, 'UseArnRegion': True, 'UseS3ExpressControlEndpoint': True}
DEBUG    botocore.regions:regions.py:513 Endpoint provider result: http://localhost:60315/test_bucket_0
DEBUG    botocore.regions:regions.py:660 Selecting from endpoint provider's list of auth schemes: "sigv4". User selected auth scheme is: "None"
DEBUG    botocore.regions:regions.py:733 Selected auth type "v4" as "v4" with signing context params: {'region': 'us-east-1', 'signing_name': 's3', 'disableDoubleEncoding': True}
DEBUG    botocore.hooks:hooks.py:238 Event before-call.s3.CreateBucket: calling handler <function add_expect_header at 0x7f0acf3c74c0>
DEBUG    botocore.hooks:hooks.py:238 Event before-call.s3.CreateBucket: calling handler <bound method S3ExpressIdentityResolver.apply_signing_cache_key of <botocore.utils.S3ExpressIdentityResolver object at 0x7f0ac9e37c40>>
DEBUG    botocore.hooks:hooks.py:238 Event before-call.s3.CreateBucket: calling handler <function add_recursion_detection_header at 0x7f0acf428c10>
DEBUG    botocore.hooks:hooks.py:238 Event before-call.s3.CreateBucket: calling handler <function inject_api_version_header_if_needed at 0x7f0acf3c8820>
DEBUG    botocore.endpoint:endpoint.py:114 Making request for OperationModel(name=CreateBucket) with params: {'url_path': '', 'query_string': {}, 'method': 'PUT', 'headers': {'User-Agent': 'Boto3/1.34.16 md/Botocore#1.34.16 ua/2.0 os/linux#6.6.9-200.fc39.x86_64 md/arch#x86_64 lang/python#3.9.18 md/pyimpl#CPython cfg/retry-mode#legacy Botocore/1.34.16'}, 'body': b'', 'auth_path': '/test_bucket_0/', 'url': 'http://localhost:60315/test_bucket_0', 'context': {'client_region': 'us-east-1', 'client_config': <botocore.config.Config object at 0x7f0aca161760>, 'has_streaming_input': False, 'auth_type': 'v4', 's3_redirect': {'redirected': False, 'bucket': 'test_bucket_0', 'params': {'Bucket': 'test_bucket_0'}}, 'S3Express': {'bucket_name': 'test_bucket_0'}, 'signing': {'region': 'us-east-1', 'signing_name': 's3', 'disableDoubleEncoding': True}, 'endpoint_properties': {'authSchemes': [{'disableDoubleEncoding': True, 'name': 'sigv4', 'signingName': 's3', 'signingRegion': 'us-east-1'}]}}}
DEBUG    botocore.hooks:hooks.py:238 Event request-created.s3.CreateBucket: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x7f0aca1613a0>>
DEBUG    botocore.hooks:hooks.py:238 Event choose-signer.s3.CreateBucket: calling handler <bound method ClientCreator._default_s3_presign_to_sigv2 of <botocore.client.ClientCreator object at 0x7f0acb231c40>>
DEBUG    botocore.hooks:hooks.py:238 Event choose-signer.s3.CreateBucket: calling handler <function set_operation_specific_signer at 0x7f0acf428e50>
DEBUG    botocore.hooks:hooks.py:238 Event before-sign.s3.CreateBucket: calling handler <function remove_arn_from_signing_path at 0x7f0acf3cb0d0>
DEBUG    botocore.hooks:hooks.py:238 Event before-sign.s3.CreateBucket: calling handler <bound method S3ExpressIdentityResolver.resolve_s3express_identity of <botocore.utils.S3ExpressIdentityResolver object at 0x7f0ac9e37c40>>
DEBUG    botocore.auth:auth.py:425 Calculating signature using v4 auth.
DEBUG    botocore.auth:auth.py:426 CanonicalRequest:
PUT
/test_bucket_0

host:localhost:60315
x-amz-content-sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
x-amz-date:20240111T161125Z

host;x-amz-content-sha256;x-amz-date
e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
DEBUG    botocore.auth:auth.py:428 StringToSign:
AWS4-HMAC-SHA256
20240111T161125Z
20240111/us-east-1/s3/aws4_request
88f10e2824fa8d5941b93fb87fc31db19b584bd7a7e662f1dd3d745394d8a681
DEBUG    botocore.auth:auth.py:430 Signature:
5c491a8ec593f5c9880a693222d757fd9578bd61e56ffb24d4c88173c03f29c3
DEBUG    botocore.hooks:hooks.py:238 Event request-created.s3.CreateBucket: calling handler <function add_retry_headers at 0x7f0acf3c8ee0>
DEBUG    botocore.endpoint:endpoint.py:265 Sending http request: <AWSPreparedRequest stream_output=False, method=PUT, url=http://localhost:60315/test_bucket_0, headers={'User-Agent': b'Boto3/1.34.16 md/Botocore#1.34.16 ua/2.0 os/linux#6.6.9-200.fc39.x86_64 md/arch#x86_64 lang/python#3.9.18 md/pyimpl#CPython cfg/retry-mode#legacy Botocore/1.34.16', 'X-Amz-Date': b'20240111T161125Z', 'X-Amz-Content-SHA256': b'e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855', 'Authorization': b'AWS4-HMAC-SHA256 Credential=awd/20240111/us-east-1/s3/aws4_request, SignedHeaders=host;x-amz-content-sha256;x-amz-date, Signature=5c491a8ec593f5c9880a693222d757fd9578bd61e56ffb24d4c88173c03f29c3', 'amz-sdk-invocation-id': b'521f43d5-b81a-4f43-b03b-0e288032f55f', 'amz-sdk-request': b'attempt=1', 'Content-Length': '0'}>
DEBUG    botocore.hooks:hooks.py:238 Event before-send.s3.CreateBucket: calling handler <moto.core.botocore_stubber.BotocoreStubber object at 0x7f0acecb1d00>
DEBUG    urllib3.connectionpool:connectionpool.py:246 Starting new HTTP connection (1): localhost:60315
DEBUG    urllib3.connectionpool:connectionpool.py:474 http://localhost:60315 "PUT /test_bucket_0 HTTP/1.1" 200 167
DEBUG    botocore.parsers:parsers.py:240 Response headers: {'Server': 'Werkzeug/3.0.1 Python/3.9.18', 'Date': 'Thu, 11 Jan 2024 16:11:25 GMT', 'x-amzn-requestid': 'sVdP3BGOw7kNS7Yj7raeG5TRj3LZCwuMMP8u8Ck8qcyCew3kfBg9', 'Content-Type': 'text/html; charset=utf-8', 'Content-Length': '167', 'Access-Control-Allow-Origin': '*', 'Connection': 'close'}
DEBUG    botocore.parsers:parsers.py:241 Response body:
b'<CreateBucketResponse xmlns="http://s3.amazonaws.com/doc/2006-03-01"><CreateBucketResponse><Bucket>test_bucket_0</Bucket></CreateBucketResponse></CreateBucketResponse>'
DEBUG    botocore.hooks:hooks.py:238 Event needs-retry.s3.CreateBucket: calling handler <botocore.retryhandler.RetryHandler object at 0x7f0ac9e37b50>
DEBUG    botocore.retryhandler:retryhandler.py:211 No retry needed.
DEBUG    botocore.hooks:hooks.py:238 Event needs-retry.s3.CreateBucket: calling handler <bound method S3RegionRedirectorv2.redirect_from_error of <botocore.utils.S3RegionRedirectorv2 object at 0x7f0ac9e37c10>>
INFO     azure.core.pipeline.policies.http_logging_policy:_universal.py:514 Request URL: 'http://127.0.0.1:43121/devstoreaccount1/testbucket1?restype=REDACTED'
Request method: 'GET'
Request headers:
    'x-ms-version': 'REDACTED'
    'Accept': 'application/xml'
    'User-Agent': 'azsdk-python-storage-blob/12.19.0 Python/3.9.18 (Linux-6.6.9-200.fc39.x86_64-x86_64-with-glibc2.17)'
    'x-ms-date': 'REDACTED'
    'x-ms-client-request-id': '12e3dcf4-b09c-11ee-b8b3-0242ac110002'
    'Authorization': 'REDACTED'
No body was attached to the request
DEBUG    urllib3.connectionpool:connectionpool.py:246 Starting new HTTP connection (1): 127.0.0.1:43121
DEBUG    urllib3.connectionpool:connectionpool.py:474 http://127.0.0.1:43121 "GET /devstoreaccount1/testbucket1?restype=container HTTP/1.1" 404 None
INFO     azure.core.pipeline.policies.http_logging_policy:_universal.py:550 Response status: 404
Response headers:
    'Server': 'Azurite-Blob/3.29.0'
    'x-ms-error-code': 'ContainerNotFound'
    'x-ms-request-id': '3bc0ea98-6518-4fdf-b3da-992c49af8ddf'
    'content-type': 'application/xml'
    'Date': 'Thu, 11 Jan 2024 16:11:27 GMT'
    'Connection': 'keep-alive'
    'Keep-Alive': 'REDACTED'
    'Transfer-Encoding': 'chunked'
INFO     azure.core.pipeline.policies.http_logging_policy:_universal.py:514 Request URL: 'http://127.0.0.1:43121/devstoreaccount1/testbucket1?restype=REDACTED'
Request method: 'PUT'
Request headers:
    'x-ms-version': 'REDACTED'
    'Accept': 'application/xml'
    'User-Agent': 'azsdk-python-storage-blob/12.19.0 Python/3.9.18 (Linux-6.6.9-200.fc39.x86_64-x86_64-with-glibc2.17)'
    'x-ms-date': 'REDACTED'
    'x-ms-client-request-id': '12e876b0-b09c-11ee-b8b3-0242ac110002'
    'Authorization': 'REDACTED'
No body was attached to the request
DEBUG    urllib3.connectionpool:connectionpool.py:474 http://127.0.0.1:43121 "PUT /devstoreaccount1/testbucket1?restype=container HTTP/1.1" 201 0
INFO     azure.core.pipeline.policies.http_logging_policy:_universal.py:550 Response status: 201
Response headers:
    'Server': 'Azurite-Blob/3.29.0'
    'etag': '"0x21652EFD9EDD8C0"'
    'last-modified': 'Thu, 11 Jan 2024 16:11:27 GMT'
    'x-ms-client-request-id': '12e876b0-b09c-11ee-b8b3-0242ac110002'
    'x-ms-request-id': 'd3805824-0548-4795-9741-822f84f862ef'
    'x-ms-version': 'REDACTED'
    'Date': 'Thu, 11 Jan 2024 16:11:27 GMT'
    'Connection': 'keep-alive'
    'Keep-Alive': 'REDACTED'
    'Content-Length': '0'
=========================== short test summary info ============================
ERROR python/tests/integration/arcticdb/test_arctic.py::test_library_creation_deletion[Azure-EncodingVersion.V1]
!!!!!!!!!!!!!!!!!!!!!!!!!! stopping after 1 failures !!!!!!!!!!!!!!!!!!!!!!!!!!!
====================== 3762 deselected, 1 error in 4.00s =======================

Full logs are available on the feedstock in #141.

Over-linkage for some dependencies

For now, we mainly have warnings for over-linkage, but none for missing dependencies:

WARNING (arcticdb): plugin library (Python) package conda-forge::grpcio-tools-1.54.2-py310hc6cd4ac_1 in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)
WARNING (arcticdb): run-exports library package conda-forge::cyrus-sasl-2.1.27-h9033bb2_6 in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)
WARNING (arcticdb): run-exports library package conda-forge::krb5-1.20.1-h81ceb04_0 in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)
WARNING (arcticdb): run-exports library package conda-forge::libevent-2.1.10-h28343ad_4 in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)
WARNING (arcticdb): run-exports library package conda-forge::gtest-1.14.0-h00ab1b0_1 in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)
WARNING (arcticdb): run-exports library package conda-forge::msgpack-c-6.0.0-hfc55251_0 in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)
WARNING (arcticdb): run-exports library package conda-forge::xxhash-0.8.2-hd590300_0 in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)
WARNING (arcticdb): run-exports library package conda-forge::gflags-2.2.2-he1b5a44_1004 in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)
WARNING (arcticdb): plugin library (Python) package conda-forge::protobuf-4.21.12-py310heca2aa9_0 in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)
WARNING (arcticdb): run-exports library package conda-forge::double-conversion-3.2.0-h27087fc_1 in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)
WARNING (arcticdb): interpreter (Python) package conda-forge::python-3.10.12-hd12c33a_0_cpython in requirements/run but it is not used (i.e. it is overdepending or perhaps statically linked? If that is what you want then add it to `build/ignore_run_exports`)

We should activate strict under- and over-linkage checks on the feedstock and understand the root cause of those warnings.
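
Where a dependency is confirmed to be genuinely unused, the warnings themselves point at the mechanism to silence them; a hypothetical excerpt from recipe/meta.yaml (the package names are illustrative, taken from the warnings above):

build:
  ignore_run_exports:
    - gflags    # flagged above as an unused run-exports library
    - xxhash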
