googleapis / python-aiplatform

A Python SDK for Vertex AI, a fully managed, end-to-end platform for data science and machine learning.

License: Apache License 2.0
Fixed resources should be passed in via environment variables. See the External Resources section of the sample style guide.
Users and Googlers without access to these resources should be able to run tests on these samples without having to modify them.
Originally posted by @kurtisvg in #13 (comment)
See discussion in #17.
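A minimal sketch of the pattern being requested (the variable names and defaults here are illustrative, not the repo's actual ones):

```python
import os


def resource_from_env(var_name: str, default: str) -> str:
    """Read a fixed test resource ID from the environment, falling back to a default."""
    return os.environ.get(var_name, default)


# Illustrative usage: anyone without access to the shared test project can point the
# samples at their own resources by exporting these variables before running the tests.
PROJECT_ID = resource_from_env("GOOGLE_CLOUD_PROJECT", "your-project-id")
MODEL_ID = resource_from_env("MODEL_ID", "your-model-id")
```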
It is not clear at the moment how to do this via the microgenerator Docker image.
This test failed!
To configure my behavior, see the Build Cop Bot documentation.
If I'm commenting on this issue too often, add the buildcop: quiet label and I will stop commenting.
commit: e6acf37
buildURL: Build Status, Sponge
status: failed
args = (name: "projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluations/5615675837586029221",)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluations/5615675837586029221'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]}

@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
return callable_(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:
self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f07efb4d790>
request = name: "projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluations/5615675837586029221"
timeout = None
metadata = [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluations/5615675837586029221'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self, request, timeout=None, metadata=None, credentials=None, wait_for_ready=None, compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials, wait_for_ready, compression)
return _end_unary_response_blocking(state, call, False, None)
.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:923:
state = <grpc._channel._RPCState object at 0x7f07efb4d410>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07efde3f00>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907540.586318659","description":"Error received from peer ipv4:142.250.107.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826: _InactiveRpcError
The above exception was the direct cause of the following exception:
capsys = <_pytest.capture.CaptureFixture object at 0x7f07efc9d710>
def test_ucaip_generated_get_model_evaluation_sample(capsys):
    get_model_evaluation_sample.get_model_evaluation_sample(
project=PROJECT_ID, model_id=MODEL_ID, evaluation_id=EVALUATION_ID
)
get_model_evaluation_sample_test.py:27:
get_model_evaluation_sample.py:33: in get_model_evaluation_sample
response = client.get_model_evaluation(name=name)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:1022: in get_model_evaluation
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.:3: NotFound
Use conftest.py for repeated fixtures in sample tests
As suggested here, combine identical fixtures across tests in /samples/tests/ into a conftest.py file.

samples.snippets.create_batch_prediction_job_video_object_tracking_sample_test: test_ucaip_generated_create_batch_prediction_vcn_sample failed
This test failed!
To configure my behavior, see the Build Cop Bot documentation.
If I'm commenting on this issue too often, add the buildcop: quiet label and I will stop commenting.
commit: e6acf37
buildURL: Build Status, Sponge
status: failed

Test output
shared_state = {}

@pytest.fixture(scope="function", autouse=True)
def teardown(shared_state):
    yield
assert "/" in shared_state["batch_prediction_job_name"]
E KeyError: 'batch_prediction_job_name'
create_batch_prediction_job_video_object_tracking_sample_test.py:47: KeyError
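Both fixture problems above (duplicated fixtures, and a teardown that raises KeyError when the test fails before recording the job it created) can be addressed in one shared conftest.py. The sketch below is illustrative: the fixture and key names mirror the failing test, but the cleanup body is omitted, and note the use of `.get`, which avoids the masking KeyError:

```python
# conftest.py -- shared fixtures hoisted out of the individual sample tests (sketch).
import pytest


def pending_job_name(state: dict):
    # .get returns None instead of raising KeyError when the test failed before
    # it could record the batch prediction job it was about to create.
    return state.get("batch_prediction_job_name")


@pytest.fixture(scope="function")
def shared_state():
    # One dict per test, shared between the test body and its teardown.
    return {}


@pytest.fixture(scope="function", autouse=True)
def teardown(shared_state):
    yield
    job_name = pending_job_name(shared_state)
    if job_name is not None:
        assert "/" in job_name
        # ...cancel and delete the batch prediction job here...
```

This way a failing test reports its own error instead of the teardown's KeyError.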
empty export model response - 'google.cloud.aiplatform_v1beta1.types.model_service.ExportModelResponse'
@dizcology @morgandu While trying to export an edge model to a Cloud Storage bucket, I noticed that the export_model_sample, as well as the example shown in the AI Platform documentation, prints an empty response when the last line is executed:
print("export_model_response:", export_model_response)
The snippets section of this GitHub repository does not have a separate example to get the export status. Because of this, I was expecting that the last line, print("export_model_response:", export_model_response), would print the status of the operation like the example using the REST & command line, which looks like:
{
  "name": "projects/PROJECT/locations/LOCATION/models/MODEL_ID/operations/OPERATION_ID",
  "metadata": {
    "@type": "type.googleapis.com/google.cloud.aiplatform.v1beta1.ExportModelOperationMetadata",
    "genericMetadata": {
      "createTime": "2020-10-12T20:53:40.130785Z",
      "updateTime": "2020-10-12T20:53:40.793983Z"
    },
    "outputInfo": {
      "artifactOutputUri": "gs://OUTPUT_BUCKET/model-MODEL_ID/EXPORT_FORMAT/YYYY-MM-DDThh:mm:ss.sssZ"
    }
  },
  "done": true,
  "response": {
    "@type": "type.googleapis.com/google.cloud.aiplatform.v1beta1.ExportModelResponse"
  }
}
Using Python, I need to get the "artifactOutputUri" to know the full path where the exported model was stored.
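Since ExportModelResponse is an empty message, the URI has to come from the operation metadata instead of the response. A hedged sketch of that pattern (the field names follow the v1beta1 REST output shown above; verify them against your installed client version):

```python
def get_artifact_output_uri(operation, timeout: int = 300) -> str:
    """Wait for an ExportModel long-running operation and return the export path.

    The response message is empty by design; the path lives in the operation's
    ExportModelOperationMetadata under output_info.artifact_output_uri.
    """
    operation.result(timeout=timeout)  # block until the export finishes
    return operation.metadata.output_info.artifact_output_uri
```

In the sample above, this would replace the bare print of the (empty) response.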
Empty protobuf.Value
Some generated Python samples have empty parameters converted into protobuf.Value, for example https://github.com/googleapis/python-aiplatform/pull/158/files#r553670007.
We should make it so that the Python samples have these removed while still allowing the other languages to have it.
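For context, this is what such an empty parameter looks like when round-tripped through json_format (a sketch using the protobuf well-known Value type; it only illustrates the noise, not the generator fix):

```python
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Value

# An empty dict parses into a Value holding an empty Struct -- syntactic noise in a
# generated sample, since omitting the field entirely has the same effect.
empty_params = json_format.ParseDict({}, Value())
```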
Synthesis failed for python-aiplatform
Hello! Autosynth couldn't regenerate python-aiplatform. 💔
Here's the output from running synth.py:
lassification.proto google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_text_extraction.proto google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_text_sentiment.proto google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_video_action_recognition.proto google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_video_classification.proto google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_video_object_tracking.proto google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/export_evaluated_data_items_config.proto` failed (Exit 1)
protoc failed: error executing command bazel-out/host/bin/external/com_google_protobuf/protoc --experimental_allow_proto3_optional '--plugin=protoc-gen-python_gapic=bazel-out/host/bin/external/gapic_generator_python/gapic_plugin' ... (remaining 29 argument(s) skipped)
Use --sandbox_debug to see verbose messages from the sandbox
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/export_evaluated_data_items_config.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_forecasting.proto:20:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_image_classification.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_image_object_detection.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_image_segmentation.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_tables.proto:20:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_text_classification.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_text_extraction.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_text_sentiment.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_video_action_recognition.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_video_classification.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/schema/trainingjob/definition/automl_video_object_tracking.proto:19:1: warning: Import google/api/annotations.proto is unused.
Traceback (most recent call last):
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate_with_pandoc.py", line 3, in <module>
    from gapic.cli import generate
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate.py", line 23, in <module>
    from gapic import generator
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/__init__.py", line 21, in <module>
    from .generator import Generator
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/generator.py", line 24, in <module>
    from gapic.samplegen import manifest, samplegen
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/__init__.py", line 15, in <module>
    from gapic.samplegen import samplegen
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/samplegen.py", line 27, in <module>
    from gapic.schema import wrappers
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/__init__.py", line 23, in <module>
    from gapic.schema.api import API
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1140/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/api.py", line 29, in <module>
    from google.api_core import exceptions # type: ignore
ModuleNotFoundError: No module named 'google.api_core'
--python_gapic_out: protoc-gen-python_gapic: Plugin failed with status code 1.
Target //google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
[9 / 15] checking cached actions
INFO: Elapsed time: 2.311s, Critical Path: 1.72s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module) # type: ignore
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/kbuilder/.cache/synthtool/python-aiplatform/synth.py", line 34, in <module>
    bazel_target="//google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py",
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
    return self._generate_code(service, version, "python", **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 193, in _generate_code
    shell.run(bazel_run_args)
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
    encoding="utf-8",
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py']' returned non-zero exit status 1.
2021-01-21 05:20:57,780 autosynth [ERROR] > Synthesis failed
2021-01-21 05:20:57,780 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 6d3027b chore(deps): update dependency google-cloud-aiplatform to v0.4.0 (#172)
2021-01-21 05:20:57,788 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-01-21 05:20:57,795 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
    main()
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
    return _inner_main(temp_dir)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
    commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
    has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
  File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
    synthesizer.synthesize(synth_log_path, self.environ)
  File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
    synth_proc.check_returncode() # Raise an exception.
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
    self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
Google internal developers can see the full log here.
samples.snippets.export_model_tabular_classification_sample_test: test_ucaip_generated_export_model_tabular_classification_sample failed
This test failed!
To configure my behavior, see the Build Cop Bot documentation.
If I'm commenting on this issue too often, add the buildcop: quiet label and I will stop commenting.
commit: e6acf37
buildURL: Build Status, Sponge
status: failed

Test output
args = (name: "projects/ucaip-sample-tests/locations/us-central1/models/5359002081594179584" output_config { export_format_...output_uri_prefix: "gs://ucaip-samples-test-output/tmp/export_model_test_79652c7e-5f25-4aad-bd9a-ac40e029c6f0" } } ,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/5359002081594179584'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]}

@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
return callable_(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:
self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f07efc982d0>
request = name: "projects/ucaip-sample-tests/locations/us-central1/models/5359002081594179584"
output_config {
export_format_i... output_uri_prefix: "gs://ucaip-samples-test-output/tmp/export_model_test_79652c7e-5f25-4aad-bd9a-ac40e029c6f0"
}
}
timeout = None
metadata = [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/5359002081594179584'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self, request, timeout=None, metadata=None, credentials=None, wait_for_ready=None, compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials, wait_for_ready, compression)
return _end_unary_response_blocking(state, call, False, None)
.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:923:
state = <grpc._channel._RPCState object at 0x7f07efc98790>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07fe73f820>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907539.997788449","description":"Error received from peer ipv4:74.125.20.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826: _InactiveRpcError
The above exception was the direct cause of the following exception:
capsys = <_pytest.capture.CaptureFixture object at 0x7f07efb16e50>
def test_ucaip_generated_export_model_tabular_classification_sample(capsys):
    export_model_tabular_classification_sample.export_model_tabular_classification_sample(
        project=PROJECT_ID,
        model_id=MODEL_ID,
gcs_destination_output_uri_prefix=f"{GCS_BUCKET}/{GCS_PREFIX}",
)
export_model_tabular_classification_sample_test.py:44:
export_model_tabular_classification_sample.py:37: in export_model_tabular_classification_sample
response = client.export_model(name=name, output_config=output_config)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:937: in export_model
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.:3: NotFound
Fixed resources should be passed in by environment variables (2/2)
Essentially, what we want here is that I, as a developer, should be able to run the tests on these samples without modifying them to confirm they still work. However, unless I'm on the team maintaining the samples, I likely don't have access to the resources they are tested with. Using env variables means all I need to do when running locally is change the env var, rather than changing the model IDs whenever there is a problem.
So something like this is perfectly acceptable:
resource_id = os.getenv("YOUR-RESOURCE-NAME")

For bonus points, you can include a link to instructions for creating the resource, if you really want to make it accessible for other developers.
Originally posted by @kurtisvg in #13 (comment)
Action Required: Fix Renovate Configuration
There is an error with this repository's Renovate configuration that needs to be fixed. As a precaution, Renovate will stop PRs until it is resolved.
Error type: Cannot find preset's package (github>whitesource/merge-confidence:beta)
samples.snippets.get_model_evaluation_tabular_classification_sample_test: test_ucaip_generated_get_model_evaluation_tabular_classification_sample failed
This test failed!
To configure my behavior, see the Build Cop Bot documentation.
If I'm commenting on this issue too often, add the buildcop: quiet label and I will stop commenting.
commit: e6acf37
buildURL: Build Status, Sponge
status: failed

Test output
args = (name: "projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluations/5615675837586029221",)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluations/5615675837586029221'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]}

@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
return callable_(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:
self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f07efbfae90>
request = name: "projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluations/5615675837586029221"
timeout = None
metadata = [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluations/5615675837586029221'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self, request, timeout=None, metadata=None, credentials=None, wait_for_ready=None, compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials, wait_for_ready, compression)
return _end_unary_response_blocking(state, call, False, None)
.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:923:
state = <grpc._channel._RPCState object at 0x7f07efbfa350>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07efc12f00>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907541.391839330","description":"Error received from peer ipv4:74.125.20.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826: _InactiveRpcError
The above exception was the direct cause of the following exception:
capsys = <_pytest.capture.CaptureFixture object at 0x7f07efbdce10>
def test_ucaip_generated_get_model_evaluation_tabular_classification_sample(capsys):
    get_model_evaluation_tabular_classification_sample.get_model_evaluation_tabular_classification_sample(
project=PROJECT_ID, model_id=MODEL_ID, evaluation_id=EVALUATION_ID
)
get_model_evaluation_tabular_classification_sample_test.py:27:
get_model_evaluation_tabular_classification_sample.py:33: in get_model_evaluation_tabular_classification_sample
response = client.get_model_evaluation(name=name)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:1022: in get_model_evaluation
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.:3: NotFound
predict_image_object_detection_sample.py uses wrong parameter format
- OS type and version: Debian GNU/Linux 10
- Python version: 3.7
- pip version: 20.2.4
- google-cloud-aiplatform version: 0.3.1

Steps to reproduce
Line 45 in predict_image_object_detection_sample.py
The parameters have no effect because their keys use snake_case ("_"). They need to be changed to camelCase.
Code example
If available to you: https://github.com/googleapis/python-aiplatform/compare/master...Crash-GHaun:patch-1?expand=1
Following the process of filing a bug report before opening a pull request.
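The root cause is that predict parameters are serialized as a JSON-derived protobuf Value, so only camelCase keys are recognized by the service; snake_case keys survive the conversion but are ignored. A sketch of the difference (the exact parameter names follow the image object detection docs and should be treated as assumptions):

```python
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Value

# Correct: camelCase keys, as the prediction service expects in the JSON payload.
params = json_format.ParseDict(
    {"confidenceThreshold": 0.5, "maxPredictions": 5}, Value()
)

# Wrong: snake_case keys convert cleanly to a Value, but the service does not read them,
# which is why the sample's parameters silently had no effect.
ignored = json_format.ParseDict(
    {"confidence_threshold": 0.5, "max_predictions": 5}, Value()
)
```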
Samples should provide instructions for users on how to set variables.
Raising this issue because I saw @kurtisvg 's comment in #13.
A snippet like https://github.com/googleapis/python-aiplatform/blob/sample-final-stage/samples/snippets/get_model_evaluation_video_classification_sample.py, to take one example, does not tell the user what a valid value for project, model_id, or evaluation_id should look like. That's something we should fix.

Kurtis brings up two ideas: one is adding comments, the other is adding a run_sample() wrapper function. Both are valid ideas, and I want the team to make the choice here. My personal opinion is that comments work better in the Python ecosystem. But again, I'm happy to let the team decide.

I want to make sure the solution is one that we can scalably roll out across all ~100 of our samples, so I want the change made in the generator rather than as one-off edits to already-generated samples.
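A sketch of what the comment-based option could look like for the snippet mentioned above (the parameter descriptions and example values are illustrative, not vetted documentation):

```python
def get_model_evaluation_video_classification_sample(
    project: str,        # your GCP project ID, e.g. "my-project"
    model_id: str,       # numeric ID of the model, e.g. "1234567890123456789"
    evaluation_id: str,  # numeric ID of the evaluation, e.g. "9876543210987654321"
    location: str = "us-central1",
):
    """Fetch a model evaluation; body omitted, see the actual snippet."""
    ...
```

Because the comments sit on the signature, they would be emitted by the sample generator once and appear uniformly in every generated sample.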
Synthesis failed for python-aiplatform
Hello! Autosynth couldn't regenerate python-aiplatform. 💔
Here's the output from running synth.py:
oud/aiplatform/v1beta1/accelerator_type.proto is unused.
google/cloud/aiplatform/v1beta1/data_labeling_job.proto:24:1: warning: Import google/cloud/aiplatform/v1beta1/specialist_pool.proto is unused.
google/cloud/aiplatform/v1beta1/dataset.proto:25:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/operation.proto:22:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/deployed_model_ref.proto:21:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/model.proto:28:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/model.proto:21:1: warning: Import google/cloud/aiplatform/v1beta1/dataset.proto is unused.
google/cloud/aiplatform/v1beta1/pipeline_state.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/training_pipeline.proto:30:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/training_pipeline.proto:23:1: warning: Import google/cloud/aiplatform/v1beta1/machine_resources.proto is unused.
google/cloud/aiplatform/v1beta1/training_pipeline.proto:24:1: warning: Import google/cloud/aiplatform/v1beta1/manual_batch_tuning_parameters.proto is unused.
google/cloud/aiplatform/v1beta1/dataset_service.proto:28:1: warning: Import google/cloud/aiplatform/v1beta1/training_pipeline.proto is unused.
google/cloud/aiplatform/v1beta1/endpoint.proto:25:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/study.proto:25:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/study.proto:21:1: warning: Import google/protobuf/duration.proto is unused.
google/cloud/aiplatform/v1beta1/study.proto:24:1: warning: Import google/protobuf/wrappers.proto is unused.
google/cloud/aiplatform/v1beta1/hyperparameter_tuning_job.proto:27:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/job_service.proto:31:1: warning: Import google/protobuf/timestamp.proto is unused.
google/cloud/aiplatform/v1beta1/job_service.proto:27:1: warning: Import google/cloud/aiplatform/v1beta1/operation.proto is unused.
google/cloud/aiplatform/v1beta1/migratable_resource.proto:22:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/migration_service.proto:19:1: warning: Import google/cloud/aiplatform/v1beta1/dataset.proto is unused.
google/cloud/aiplatform/v1beta1/migration_service.proto:20:1: warning: Import google/cloud/aiplatform/v1beta1/model.proto is unused.
google/cloud/aiplatform/v1beta1/model_evaluation.proto:24:1: warning: Import google/api/annotations.proto is unused.
google/cloud/aiplatform/v1beta1/model_evaluation_slice.proto:23:1: warning: Import google/api/annotations.proto is unused.
Traceback (most recent call last):
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate_with_pandoc.py", line 3, in <module>
    from gapic.cli import generate
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate.py", line 23, in <module>
    from gapic import generator
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/__init__.py", line 21, in <module>
    from .generator import Generator
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/generator.py", line 24, in <module>
    from gapic.samplegen import manifest, samplegen
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/__init__.py", line 15, in <module>
    from gapic.samplegen import samplegen
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/samplegen.py", line 27, in <module>
    from gapic.schema import wrappers
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/__init__.py", line 23, in <module>
    from gapic.schema.api import API
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/1143/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/api.py", line 29, in <module>
    from google.api_core import exceptions # type: ignore
ModuleNotFoundError: No module named 'google.api_core'
--python_gapic_out: protoc-gen-python_gapic: Plugin failed with status code 1.
Target //google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 2.493s, Critical Path: 1.87s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module) # type: ignore
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/kbuilder/.cache/synthtool/python-aiplatform/synth.py", line 34, in <module>
    bazel_target="//google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py",
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
    return self._generate_code(service, version, "python", **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 197, in _generate_code
    shell.run(bazel_run_args)
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
    encoding="utf-8",
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py']' returned non-zero exit status 1. 2021-01-28 05:21:10,320 autosynth [ERROR] > Synthesis failed 2021-01-28 05:21:10,320 autosynth [DEBUG] > Running: git reset --hard HEAD HEAD is now at 6589383 fix(deps): remove optional dependencies (#187) 2021-01-28 05:21:10,327 autosynth [DEBUG] > Running: git checkout autosynth Switched to branch 'autosynth' 2021-01-28 05:21:10,333 autosynth [DEBUG] > Running: git clean -fdx Removing __pycache__/ Traceback (most recent call last): File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module> main() File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main return _inner_main(temp_dir) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest) File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch synthesizer.synthesize(synth_log_path, self.environ) File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize synth_proc.check_returncode() # Raise an exception. File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode self.stderr) subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
Google internal developers can see the full log here.
503 error and AttributeError: module 'google.auth.transport' has no attribute 'requests'
I'm getting the following error when running prediction with Python, even though I've added the JSON file for the cloud auth key. The same script works without any problem on some other machines, though.
runfile('C:/bot/googlex.py', wdir='C:/bot')
collecting data
Traceback (most recent call last):
  File "c:\programdata\anaconda3\lib\site-packages\google\api_core\grpc_helpers.py", line 58, in error_remapped_callable
    return callable_(*args, **kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\grpc\_channel.py", line 923, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "C:\ProgramData\Anaconda3\lib\site-packages\grpc\_channel.py", line 826, in _end_unary_response_blocking
    raise _InactiveRpcError(state)
_InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "502:Bad Gateway"
	debug_error_string = "{"created":"@1606950672.430000000","description":"Error received from peer ipv4:172.217.14.234:443","file":"src/core/lib/surface/call.cc","file_line":1062,"grpc_message":"502:Bad Gateway","grpc_status":14}"
>
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "C:\bot\googlex.py", line 278, in <module>
    predict_tabular_regression_sample(
  File "C:\bot\googlex.py", line 261, in predict_tabular_regression_sample
    response = client.predict(
  File "c:\programdata\anaconda3\lib\site-packages\google\cloud\aiplatform_v1beta1\services\prediction_service\client.py", line 438, in predict
    response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
  File "c:\programdata\anaconda3\lib\site-packages\google\api_core\gapic_v1\method.py", line 145, in __call__
    return wrapped_func(*args, **kwargs)
  File "c:\programdata\anaconda3\lib\site-packages\google\api_core\grpc_helpers.py", line 60, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
ServiceUnavailable: 503 502:Bad Gateway

runfile('C:/bot/googlex.py', wdir='C:/bot')
collecting data
Traceback (most recent call last):
  File "C:\bot\googlex.py", line 278, in <module>
    predict_tabular_regression_sample(
  File "C:\bot\googlex.py", line 251, in predict_tabular_regression_sample
    client = aiplatform.gapic.PredictionServiceClient(client_options=client_options)
  File "c:\programdata\anaconda3\lib\site-packages\google\cloud\aiplatform_v1beta1\services\prediction_service\client.py", line 327, in __init__
    self._transport = Transport(
  File "c:\programdata\anaconda3\lib\site-packages\google\cloud\aiplatform_v1beta1\services\prediction_service\transports\grpc.py", line 160, in __init__
    self._grpc_channel = type(self).create_channel(
  File "c:\programdata\anaconda3\lib\site-packages\google\cloud\aiplatform_v1beta1\services\prediction_service\transports\grpc.py", line 217, in create_channel
    return grpc_helpers.create_channel(
  File "c:\programdata\anaconda3\lib\site-packages\google\api_core\grpc_helpers.py", line 276, in create_channel
    composite_credentials = _create_composite_credentials(
  File "c:\programdata\anaconda3\lib\site-packages\google\api_core\grpc_helpers.py", line 223, in _create_composite_credentials
    request = google.auth.transport.requests.Request()
AttributeError: module 'google.auth.transport' has no attribute 'requests'
refactor: Increase library's default gRPC timeout to 300 seconds
Issue created from PR comment
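The general pattern behind this refactor, a library-wide default timeout that individual calls can still override, can be sketched as follows. The names `DEFAULT_TIMEOUT`, `call_with_timeout`, and `fake_rpc` are illustrative, not part of the Vertex AI SDK.

```python
# Sketch: a module-level default timeout with a per-call override.
DEFAULT_TIMEOUT = 300.0  # seconds; the larger library default proposed above

def call_with_timeout(func, *args, timeout=None, **kwargs):
    """Invoke func, falling back to the library default when no per-call
    timeout is given."""
    effective = DEFAULT_TIMEOUT if timeout is None else timeout
    return func(*args, timeout=effective, **kwargs)

# Stand-in for an RPC method:
def fake_rpc(name, timeout):
    return f"{name} (timeout={timeout}s)"

print(call_with_timeout(fake_rpc, "get_model"))            # falls back to 300.0
print(call_with_timeout(fake_rpc, "predict", timeout=30))  # per-call override
```

In the real SDK the per-call override corresponds to passing `timeout=` to a generated client method.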
Synthesis failed for python-aiplatform
Hello! Autosynth couldn't regenerate python-aiplatform. 💔
Here's the output from running
synth.py
:truct_54__handle_cancellation_from_core.tp_print = 0; ^~~~~~~~ tp_dict bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132284:72: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'? __pyx_type_7_cython_6cygrpc___pyx_scope_struct_55__schedule_rpc_coro.tp_print = 0; ^~~~~~~~ tp_dict bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132290:65: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'? __pyx_type_7_cython_6cygrpc___pyx_scope_struct_56__handle_rpc.tp_print = 0; ^~~~~~~~ tp_dict bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132296:67: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'? __pyx_type_7_cython_6cygrpc___pyx_scope_struct_57__request_call.tp_print = 0; ^~~~~~~~ tp_dict bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132302:71: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'? __pyx_type_7_cython_6cygrpc___pyx_scope_struct_58__server_main_loop.tp_print = 0; ^~~~~~~~ tp_dict bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132308:59: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'? __pyx_type_7_cython_6cygrpc___pyx_scope_struct_59_start.tp_print = 0; ^~~~~~~~ tp_dict bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132314:74: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'? 
__pyx_type_7_cython_6cygrpc___pyx_scope_struct_60__start_shutting_down.tp_print = 0; ^~~~~~~~ tp_dict bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132320:62: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'? __pyx_type_7_cython_6cygrpc___pyx_scope_struct_61_shutdown.tp_print = 0; ^~~~~~~~ tp_dict bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:132326:74: error: 'PyTypeObject {aka struct _typeobject}' has no member named 'tp_print'; did you mean 'tp_dict'? __pyx_type_7_cython_6cygrpc___pyx_scope_struct_62_wait_for_termination.tp_print = 0; ^~~~~~~~ tp_dict bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp: In function 'PyObject* __Pyx_decode_c_bytes(const char*, Py_ssize_t, Py_ssize_t, Py_ssize_t, const char*, const char*, PyObject* (*)(const char*, Py_ssize_t, const char*))': bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:136866:45: warning: 'PyObject* PyUnicode_FromUnicode(const Py_UNICODE*, Py_ssize_t)' is deprecated [-Wdeprecated-declarations] return PyUnicode_FromUnicode(NULL, 0); ^ In file included from bazel-out/host/bin/external/local_config_python/_python3/_python3_include/unicodeobject.h:1026:0, from bazel-out/host/bin/external/local_config_python/_python3/_python3_include/Python.h:97, from bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:4: bazel-out/host/bin/external/local_config_python/_python3/_python3_include/cpython/unicodeobject.h:551:42: note: declared here Py_DEPRECATED(3.3) PyAPI_FUNC(PyObject*) PyUnicode_FromUnicode( ^~~~~~~~~~~~~~~~~~~~~ bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp: In function 'void __pyx_f_7_cython_6cygrpc__unified_socket_write(int)': 
bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:72692:3: warning: ignoring return value of 'ssize_t write(int, const void*, size_t)', declared with attribute warn_unused_result [-Wunused-result] (void)(write(__pyx_v_fd, ((char *)"1"), 1)); ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp: At global scope: bazel-out/host/bin/external/com_github_grpc_grpc/src/python/grpcio/grpc/_cython/cygrpc.cpp:144607:1: warning: 'void __Pyx_PyAsyncGen_Fini()' defined but not used [-Wunused-function] __Pyx_PyAsyncGen_Fini(void) ^~~~~~~~~~~~~~~~~~~~~ Target //google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py failed to build Use --verbose_failures to see the command lines of failed build steps. INFO: Elapsed time: 3.271s, Critical Path: 2.64s INFO: 0 processes. FAILED: Build did NOT complete successfully FAILED: Build did NOT complete successfully Traceback (most recent call last): File "/usr/local/lib/python3.9/runpy.py", line 197, in _run_module_as_main return _run_code(code, main_globals, None, File "/usr/local/lib/python3.9/runpy.py", line 87, in _run_code exec(code, run_globals) File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module> main() File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 829, in __call__ return self.main(*args, **kwargs) File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 782, in main rv = self.invoke(ctx) File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File "/tmpfs/src/github/synthtool/env/lib/python3.9/site-packages/click/core.py", line 610, in invoke return callback(*args, **kwargs) File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main spec.loader.exec_module(synth_module) # type: ignore File "<frozen 
importlib._bootstrap_external>", line 790, in exec_module File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed File "/root/.cache/synthtool/python-aiplatform/synth.py", line 31, in <module> library = gapic.py_library( File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 45, in py_library return self._generate_code(service, version, "python", **kwargs) File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 182, in _generate_code shell.run(bazel_run_args) File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run raise exc File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 27, in run return subprocess.run( File "/usr/local/lib/python3.9/subprocess.py", line 524, in run raise CalledProcessError(retcode, process.args, subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py']' returned non-zero exit status 1. 2020-12-05 00:10:08,888 autosynth [ERROR] > Synthesis failed 2020-12-05 00:10:08,889 autosynth [DEBUG] > Running: git reset --hard HEAD HEAD is now at 5362a4d chore: update sample test resouce names (#120) 2020-12-05 00:10:08,895 autosynth [DEBUG] > Running: git checkout autosynth Switched to branch 'autosynth' 2020-12-05 00:10:08,901 autosynth [DEBUG] > Running: git clean -fdx Removing __pycache__/ Traceback (most recent call last): File "/usr/local/lib/python3.9/runpy.py", line 197, in _run_module_as_main return _run_code(code, main_globals, None, File "/usr/local/lib/python3.9/runpy.py", line 87, in _run_code exec(code, run_globals) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module> main() File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main return _inner_main(temp_dir) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer) File 
"/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest) File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch synthesizer.synthesize(synth_log_path, self.environ) File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize synth_proc.check_returncode() # Raise an exception. File "/usr/local/lib/python3.9/subprocess.py", line 456, in check_returncode raise CalledProcessError(self.returncode, self.args, self.stdout, subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
Google internal developers can see the full log here.
pass request object instead of using json_format.ParseDict
Issue raised in PR #15
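The refactor requested here is to construct the typed request message with keyword arguments instead of building a dict and running it through `json_format.ParseDict`. A minimal sketch of the two styles, using a dataclass as a stand-in for a generated request class (the real `GetModelRequest` requires the SDK, and the resource path below is illustrative):

```python
from dataclasses import dataclass

# Stand-in for a generated request message such as
# aiplatform_v1beta1.GetModelRequest.
@dataclass
class GetModelRequest:
    name: str = ""

# Before: assemble an untyped dict and parse it into the message
# (with protobuf this would be json_format.ParseDict(params, msg)).
params = {"name": "projects/my-project/locations/us-central1/models/123"}
request_from_dict = GetModelRequest(**params)

# After: construct the request object directly. Typos in field names fail
# immediately at the call site instead of at parse time.
request = GetModelRequest(name="projects/my-project/locations/us-central1/models/123")

assert request == request_from_dict
```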
samples.snippets.get_model_sample_test: test_ucaip_generated_get_model_sample failed
This test failed!
To configure my behavior, see the Build Cop Bot documentation.
If I'm commenting on this issue too often, add the
buildcop: quiet
label and
I will stop commenting.
commit: e6acf37
buildURL: Build Status, Sponge
status: failed
Test output
args = (name: "projects/ucaip-sample-tests/locations/us-central1/models/1478306577684365312" ,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/1478306577684365312'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]}

@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
return callable_(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:
self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f07efc2a190>
request = name: "projects/ucaip-sample-tests/locations/us-central1/models/1478306577684365312"
timeout = None
metadata = [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/1478306577684365312'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self, request, timeout=None, metadata=None, credentials=None, wait_for_ready=None, compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials, wait_for_ready, compression)
return _end_unary_response_blocking(state, call, False, None)
.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:923:
state = <grpc._channel._RPCState object at 0x7f07efbf1310>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07efdb7370>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907543.324315067","description":"Error received from peer ipv4:74.125.20.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826: _InactiveRpcError
The above exception was the direct cause of the following exception:
capsys = <_pytest.capture.CaptureFixture object at 0x7f07efb4d390>
def test_ucaip_generated_get_model_sample(capsys):
get_model_sample.get_model_sample(project=PROJECT_ID, model_id=MODEL_ID)
get_model_sample_test.py:26:
get_model_sample.py:30: in get_model_sample
response = client.get_model(name=name)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:580: in get_model
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.:3: NotFound
match snippet file names to their tests
Issue raised in PR #12
refactor: shorten duration of deploy_model timeout
Issue created from PR comment
Mark generated samples
- Generated sample files should have warnings to avoid manual updates.
- Sample generation configuration should live in this repo.
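One way to implement the first point is a post-processing step that stamps each generated sample with a warning header. The header text and function name below are hypothetical, not the repo's actual tooling:

```python
# Hypothetical post-processing step: stamp generated sample files with a
# warning so reviewers know not to edit them by hand.

GENERATED_HEADER = (
    "# DO NOT EDIT! This is a generated sample.\n"
    "# Changes should be made to the sample generation configuration instead.\n"
)

def mark_generated(source: str) -> str:
    """Prepend the warning header unless it is already present."""
    if source.startswith(GENERATED_HEADER):
        return source
    return GENERATED_HEADER + source

sample = 'print("hello")\n'
stamped = mark_generated(sample)
assert stamped.startswith("# DO NOT EDIT!")
assert mark_generated(stamped) == stamped  # idempotent
```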
Use schema types for predictions.
Example:
Investigate why isinstance() doesn't work in test_value_converter.py
When using
isinstance()
to compare an object to a class in the test_value_converter.py
file, the assertion fails with this message:
E AssertionError: assert False
E + where False = isinstance(test_str: "Omnia Gallia est divisa"\ntest_int64: 3\ntest_bool: true\n, SomeMessage)
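One plausible cause of this failure is a wrapper type: the object's repr prints the message fields, so it looks like a SomeMessage, but it is an instance of a wrapping class rather than of SomeMessage itself, and isinstance returns False. The sketch below reproduces the symptom with stand-in classes; in the real code the types would come from proto-plus / protobuf, and `_pb` is an assumption about where the raw message lives.

```python
# Stand-in classes illustrating the wrapped-message hypothesis.

class SomeMessage:  # stand-in for the raw generated message
    def __repr__(self):
        return 'test_str: "Omnia Gallia est divisa"\ntest_int64: 3\ntest_bool: true\n'

class WrappedMessage:  # stand-in for a wrapper around the raw message
    def __init__(self, pb):
        self._pb = pb

    def __repr__(self):  # the reprs look identical...
        return repr(self._pb)

wrapped = WrappedMessage(SomeMessage())

# ...but isinstance sees the wrapper class, not the message class:
assert not isinstance(wrapped, SomeMessage)

# Checking the underlying message (or comparing exact types) works:
assert isinstance(wrapped._pb, SomeMessage)
```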
samples.snippets.predict_text_sentiment_analysis_sample_test: test_ucaip_generated_predict_text_sentiment_analysis_sample failed
This test failed!
To configure my behavior, see the Build Cop Bot documentation.
If I'm commenting on this issue too often, add the
buildcop: quiet
label and
I will stop commenting.
commit: 5155dee
buildURL: Build Status, Sponge
status: failed
Test output
args = (endpoint: "projects/ucaip-sample-tests/locations/us-central1/endpoints/7811563922418302976" instances { struct_valu... string_value: "The Chicago Bears is a great football team!" } } } } parameters { struct_value { } } ,)
kwargs = {'metadata': [('x-goog-request-params', 'endpoint=projects/ucaip-sample-tests/locations/us-central1/endpoints/7811563922418302976'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]}

@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
return callable_(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:
self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f66280e58d0>
request = endpoint: "projects/ucaip-sample-tests/locations/us-central1/endpoints/7811563922418302976"
instances {
struct_value... string_value: "The Chicago Bears is a great football team!"
}
}
}
}
parameters {
struct_value {
}
}
timeout = None
metadata = [('x-goog-request-params', 'endpoint=projects/ucaip-sample-tests/locations/us-central1/endpoints/7811563922418302976'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self, request, timeout=None, metadata=None, credentials=None, wait_for_ready=None, compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials, wait_for_ready, compression)
return _end_unary_response_blocking(state, call, False, None)
.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:923:
state = <grpc._channel._RPCState object at 0x7f66280e5490>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f66280e6be0>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.FAILED_PRECONDITION
E details = "Endpoint projects/ucaip-sample-tests/locations/us-central1/endpoints/7811563922418302976 doesn't have traffic_split."
E debug_error_string = "{"created":"@1605785199.361124669","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"Endpoint projects/ucaip-sample-tests/locations/us-central1/endpoints/7811563922418302976 doesn't have traffic_split.","grpc_status":9}"
E >.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826: _InactiveRpcError
The above exception was the direct cause of the following exception:
capsys = <_pytest.capture.CaptureFixture object at 0x7f66280dcc10>
def test_ucaip_generated_predict_text_sentiment_analysis_sample(capsys):
    predict_text_sentiment_analysis_sample.predict_text_sentiment_analysis_sample(
        content=content, project=PROJECT_ID, endpoint_id=ENDPOINT_ID
    )
predict_text_sentiment_analysis_sample_test.py:29:
predict_text_sentiment_analysis_sample.py:41: in predict_text_sentiment_analysis_sample
endpoint=endpoint, instances=instances, parameters=parameters
../../google/cloud/aiplatform_v1beta1/services/prediction_service/client.py:438: in predict
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.FAILED_PRECONDITION
details = "Endpoint projects...ucaip-sample-tests/locations/us-central1/endpoints/7811563922418302976 doesn't have traffic_split.","grpc_status":9}"???
E google.api_core.exceptions.FailedPrecondition: 400 Endpoint projects/ucaip-sample-tests/locations/us-central1/endpoints/7811563922418302976 doesn't have traffic_split.:3: FailedPrecondition
model_type in create_training_pipeline samples should use Enum
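A minimal sketch of the requested change: replace the bare string with an Enum so invalid values fail at the call site rather than at the API. The enum members below are illustrative examples of AutoML-style model types, and `training_task_inputs` is a hypothetical helper, not the sample's actual code.

```python
from enum import Enum

# Hypothetical enum for the sample's model_type values.
class ModelType(Enum):
    CLOUD = "CLOUD"
    MOBILE_TF_LOW_LATENCY_1 = "MOBILE_TF_LOW_LATENCY_1"
    MOBILE_TF_HIGH_ACCURACY_1 = "MOBILE_TF_HIGH_ACCURACY_1"

def training_task_inputs(model_type: ModelType) -> dict:
    # Passing the enum instead of a bare string catches typos at the call
    # site; .value yields the string the API expects.
    return {"modelType": model_type.value}

inputs = training_task_inputs(ModelType.CLOUD)
assert inputs == {"modelType": "CLOUD"}
```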
samples.snippets.list_model_evaluation_slices_sample_test: test_ucaip_generated_get_model_evaluation_slices_sample failed
This test failed!
To configure my behavior, see the Build Cop Bot documentation.
If I'm commenting on this issue too often, add the
buildcop: quiet
label and
I will stop commenting.
commit: e6acf37
buildURL: Build Status, Sponge
status: failed
Test output
args = (parent: "projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluations/5615675837586029221" ,)
kwargs = {'metadata': [('x-goog-request-params', 'parent=projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluations/5615675837586029221'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]}

@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
return callable_(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:
self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f07efbc3a10>
request = parent: "projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluations/5615675837586029221"
timeout = None
metadata = [('x-goog-request-params', 'parent=projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluations/5615675837586029221'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self, request, timeout=None, metadata=None, credentials=None, wait_for_ready=None, compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials, wait_for_ready, compression)
return _end_unary_response_blocking(state, call, False, None)
.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:923:
state = <grpc._channel._RPCState object at 0x7f07efbc3950>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07ef1dea00>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606908373.506968741","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826: _InactiveRpcError
The above exception was the direct cause of the following exception:
capsys = <_pytest.capture.CaptureFixture object at 0x7f07efd19b50>
def test_ucaip_generated_get_model_evaluation_slices_sample(capsys):
    list_model_evaluation_slices_sample.list_model_evaluation_slices_sample(
        project=PROJECT_ID, model_id=MODEL_ID, evaluation_id=EVALUATION_ID
    )
list_model_evaluation_slices_sample_test.py:29:
list_model_evaluation_slices_sample.py:33: in list_model_evaluation_slices_sample
response = client.list_model_evaluation_slices(parent=parent)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:1264: in list_model_evaluation_slices
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.:3: NotFound
Synthesis failed for python-aiplatform
Hello! Autosynth couldn't regenerate python-aiplatform. 💔
Here's the output from running
synth.py
:g: git log -1 --pretty=%at 1085e9cf6394ca68f6f690844a7ef3622777e8c8 2020-11-04 05:28:45,278 autosynth [DEBUG] > Running: git log -1 --pretty=%at a783321fd55f010709294455584a553f4b24b944 2020-11-04 05:28:45,281 autosynth [DEBUG] > Running: git log -1 --pretty=%at 89c849ba5013e45e8fb688b138f33c2ec6083dc5 2020-11-04 05:28:45,284 autosynth [DEBUG] > Running: git log -1 --pretty=%at f68649c5f26bcff6817c6d21e90dac0fc71fef8e 2020-11-04 05:28:45,287 autosynth [DEBUG] > Running: git log -1 --pretty=%at fd3584b01cedd6c9f79d08be4e2365085f955aa5 2020-11-04 05:28:45,291 autosynth [DEBUG] > Running: git log -1 --pretty=%at ea52b8a0bd560f72f376efcf45197fb7c8869120 2020-11-04 05:28:45,294 autosynth [DEBUG] > Running: git log -1 --pretty=%at 6542bd723403513626f61642fc02ddca528409aa 2020-11-04 05:28:45,297 autosynth [DEBUG] > Running: git log -1 --pretty=%at b19b401571e77192f8dd38eab5fb2300a0de9324 2020-11-04 05:28:45,301 autosynth [DEBUG] > Running: git log -1 --pretty=%at ba9918cd22874245b55734f57470c719b577e591 2020-11-04 05:28:45,304 autosynth [DEBUG] > Running: git checkout 3c713d5cf47bf343bf53583296daed6161d4f4ed Note: checking out '3c713d5cf47bf343bf53583296daed6161d4f4ed'. You are in 'detached HEAD' state. You can look around, make experimental changes and commit them, and you can discard any commits you make in this state without impacting any branches by performing another checkout. If you want to create a new branch to retain commits you create, you may do so (now or later) by using -b with the checkout command again. Example: git checkout -b <new-branch-name> HEAD is now at 3c713d5 fix: re-add py sessions to noxfile (#22) 2020-11-04 05:28:45,315 autosynth [DEBUG] > Running: git checkout ba9918cd22874245b55734f57470c719b577e591 Note: checking out 'ba9918cd22874245b55734f57470c719b577e591'. You are in 'detached HEAD' state. 
You can look around, make experimental changes and commit them, and you can discard any commits you make in this state without impacting any branches by performing another checkout. If you want to create a new branch to retain commits you create, you may do so (now or later) by using -b with the checkout command again. Example: git checkout -b <new-branch-name> HEAD is now at ba9918c build(node): add KOKORO_BUILD_ARTIFACTS_SUBDIR to env (#834) 2020-11-04 05:28:45,321 autosynth [DEBUG] > Running: git branch -f autosynth-37 2020-11-04 05:28:45,325 autosynth [DEBUG] > Running: git checkout autosynth-37 Switched to branch 'autosynth-37' 2020-11-04 05:28:45,330 autosynth [INFO] > Running synthtool 2020-11-04 05:28:45,330 autosynth [INFO] > ['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--'] 2020-11-04 05:28:45,330 autosynth [DEBUG] > log_file_path: /tmpfs/src/logs/python-aiplatform/37/sponge_log.log 2020-11-04 05:28:45,332 autosynth [DEBUG] > Running: /tmpfs/src/github/synthtool/env/bin/python3 -m synthtool --metadata synth.metadata synth.py -- 2020-11-04 05:28:45,589 synthtool [DEBUG] > Executing /home/kbuilder/.cache/synthtool/python-aiplatform/synth.py. On branch autosynth-37 nothing to commit, working tree clean 2020-11-04 05:28:45,733 synthtool [DEBUG] > Using precloned repo /home/kbuilder/.cache/synthtool/synthtool 2020-11-04 05:28:45,741 synthtool [DEBUG] > Ensuring dependencies. 2020-11-04 05:28:45,745 synthtool [DEBUG] > Using precloned repo /home/kbuilder/.cache/synthtool/synthtool 2020-11-04 05:28:45,749 synthtool [DEBUG] > Cloning googleapis. 
2020-11-04 05:28:46,356 synthtool [DEBUG] > Pulling Docker image: gapic-generator-python:0.20 0.20: Pulling from gapic-images/gapic-generator-python 68ced04f60ab: Pulling fs layer d1d516f83cca: Pulling fs layer 84e300e06912: Pulling fs layer 4100d15544f4: Pulling fs layer 2cb85865485e: Pulling fs layer 3f4a9382f7a3: Pulling fs layer f4117b3f6872: Pulling fs layer 3a6ff6947881: Pulling fs layer 5f54ab71eb0e: Pulling fs layer a7b6d1c73b89: Pulling fs layer f4117b3f6872: Waiting 3f4a9382f7a3: Waiting 3a6ff6947881: Waiting 5f54ab71eb0e: Waiting a7b6d1c73b89: Waiting 4100d15544f4: Waiting 2cb85865485e: Waiting d1d516f83cca: Verifying Checksum d1d516f83cca: Download complete 68ced04f60ab: Download complete 4100d15544f4: Verifying Checksum 4100d15544f4: Download complete 2cb85865485e: Verifying Checksum 2cb85865485e: Download complete 84e300e06912: Download complete f4117b3f6872: Verifying Checksum f4117b3f6872: Download complete 3a6ff6947881: Verifying Checksum 3a6ff6947881: Download complete 5f54ab71eb0e: Verifying Checksum 5f54ab71eb0e: Download complete 3f4a9382f7a3: Verifying Checksum 3f4a9382f7a3: Download complete 68ced04f60ab: Pull complete d1d516f83cca: Pull complete a7b6d1c73b89: Verifying Checksum a7b6d1c73b89: Download complete 84e300e06912: Pull complete 4100d15544f4: Pull complete 2cb85865485e: Pull complete 3f4a9382f7a3: Pull complete f4117b3f6872: Pull complete 3a6ff6947881: Pull complete 5f54ab71eb0e: Pull complete a7b6d1c73b89: Pull complete Digest: sha256:b84796906d2f7f00b60563017b64b394320064d804f54a05900a001e3638e47b Status: Downloaded newer image for gcr.io/gapic-images/gapic-generator-python:0.20 2020-11-04 05:28:53,266 synthtool [DEBUG] > Generating code for: google/cloud/aiplatform/v1beta1. google/cloud/aiplatform/v1beta1/migratable_resource.proto:28:8: Option "(google.api.resource_definition)" unknown. 
2020-11-04 05:28:55,702 synthtool [ERROR] > Failed executing docker run --mount type=bind,source=/home/kbuilder/.cache/synthtool/googleapis/google/cloud/aiplatform/v1beta1/,destination=/in/google/cloud/aiplatform/v1beta1/,readonly --mount type=bind,source=/tmpfs/tmp/tmpw7mxfbi7/,destination=/out/ --rm --user 1000 gcr.io/gapic-images/gapic-generator-python:0.20: None Traceback (most recent call last): File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module> main() File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__ return self.main(*args, **kwargs) File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main rv = self.invoke(ctx) File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke return callback(*args, **kwargs) File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main spec.loader.exec_module(synth_module) # type: ignore File "<frozen importlib._bootstrap_external>", line 678, in exec_module File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed File "/home/kbuilder/.cache/synthtool/python-aiplatform/synth.py", line 38, in <module> library = gapic.py_library("aiplatform", "v1beta1", generator_version="0.20") File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_microgenerator.py", line 57, in py_library return self._generate_code(service, version, "python", **kwargs) File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_microgenerator.py", line 189, in _generate_code 
shell.run(docker_run_args, hide_output=False) File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run raise exc File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run encoding="utf-8", File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run output=stdout, stderr=stderr) subprocess.CalledProcessError: Command '['docker', 'run', '--mount', 'type=bind,source=/home/kbuilder/.cache/synthtool/googleapis/google/cloud/aiplatform/v1beta1/,destination=/in/google/cloud/aiplatform/v1beta1/,readonly', '--mount', 'type=bind,source=/tmpfs/tmp/tmpw7mxfbi7/,destination=/out/', '--rm', '--user', '1000', 'gcr.io/gapic-images/gapic-generator-python:0.20']' returned non-zero exit status 1. 2020-11-04 05:28:55,766 autosynth [ERROR] > Synthesis failed 2020-11-04 05:28:55,766 autosynth [DEBUG] > Running: git reset --hard HEAD HEAD is now at 3c713d5 fix: re-add py sessions to noxfile (#22) 2020-11-04 05:28:55,772 autosynth [DEBUG] > Running: git checkout autosynth Switched to branch 'autosynth' 2020-11-04 05:28:55,776 autosynth [DEBUG] > Running: git clean -fdx Removing __pycache__/ Traceback (most recent call last): File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module> main() File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main return _inner_main(temp_dir) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer) File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest) File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in 
synthesize_version_in_new_branch synthesizer.synthesize(synth_log_path, self.environ) File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize synth_proc.check_returncode() # Raise an exception. File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode self.stderr) subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
Google internal developers can see the full log here.
Fix import_data sample tests
import_data_*_sample
tests are not written consistently (some use mocks, some call the actual service). They are also flaky and sometimes fail with TimeoutError.
Add samples and sample tests
Describe the solution you'd like
Please add relevant code samples and tests.
Endpoint response size too large
I have a working model endpoint deployed on the AI Platform (Unified).
However, the response size is exceeding the maximum allowed by the gRPC max message setting.
How can I increase this limit of the TensorFlow server running on the endpoint?
Environment details
- OS type and version:
- Python version:
python --version
- pip version:
pip --version
google-cloud-aiplatform
version:
pip show google-cloud-aiplatform
Steps to reproduce
- Deployed an object detection model trained from the tensorflow/models repository on AI Platform (Unified)
- Sent a single base64-encoded image using the aiplatform prediction client.
Code example
with open(filename, "rb") as f:
    file_content = f.read()
encoded_content = base64.urlsafe_b64encode(file_content).decode("utf-8")
instances = [{"input_tensor": encoded_content}]
parameters_dict = {}
parameters = json_format.ParseDict(parameters_dict, Value())
endpoint = client.endpoint_path(
    project=project, location=location, endpoint=endpoint_id
)
response = client.predict(
    endpoint=endpoint, instances=instances, parameters=parameters,
)
Stack trace
InvalidArgument: 400 Failed to handle request. endpoint_id: 3414397020616523776, deployed_model_id: 7183628433748918272 with error: `Response size too large. Received at least 3288694 bytes; max is 2000000.`
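The 2,000,000-byte limit in the trace above is a gRPC maximum-message-size setting. For the managed endpoint this cap is enforced server-side, so it cannot be raised from the client; the practical fix is usually to shrink the payload (for example, resize the image before encoding). For completeness, these are the standard gRPC channel options that control message sizes when you run TensorFlow Serving yourself — the limit and address below are illustrative assumptions, not documented Vertex settings:

```python
# Standard gRPC channel options governing message sizes (client side).
# Note: the 2,000,000-byte cap in the error above is enforced by the managed
# prediction service, so these options alone will not lift it.
MAX_MESSAGE_BYTES = 10 * 1024 * 1024  # 10 MiB, illustrative

channel_options = [
    ("grpc.max_send_message_length", MAX_MESSAGE_BYTES),
    ("grpc.max_receive_message_length", MAX_MESSAGE_BYTES),
]

# Applying them to a self-hosted TF Serving gRPC endpoint (address hypothetical):
# import grpc
# channel = grpc.insecure_channel("localhost:8500", options=channel_options)
```

Raising the receive limit only helps where you control the server; against the managed endpoint, reduce the response size instead.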
samples.snippets.get_model_evaluation_tabular_regression_sample_test: test_ucaip_generated_get_model_evaluation_sample failed
This test failed!
To configure my behavior, see the Build Cop Bot documentation.
If I'm commenting on this issue too often, add the
buildcop: quiet
label and
I will stop commenting.
commit: e6acf37
buildURL: Build Status, Sponge
status: failed
Test output
args = (name: "projects/ucaip-sample-tests/locations/us-central1/models/3125638878883479552/evaluations/2025948722346981108" ,) kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/3125638878883479552/evaluations/2025948722346981108'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
return callable_(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:
self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f07efd8c5d0>
request = name: "projects/ucaip-sample-tests/locations/us-central1/models/3125638878883479552/evaluations/2025948722346981108"
timeout = None
metadata = [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/3125638878883479552/evaluations/2025948722346981108'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]
credentials = None, wait_for_ready = None, compression = None
def __call__(self, request, timeout=None, metadata=None, credentials=None, wait_for_ready=None, compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials, wait_for_ready, compression)
return _end_unary_response_blocking(state, call, False, None)
.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:923:
state = <grpc._channel._RPCState object at 0x7f07efd8c990>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07efd8b4b0>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907541.908400298","description":"Error received from peer ipv4:74.125.20.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826: _InactiveRpcError
The above exception was the direct cause of the following exception:
capsys = <_pytest.capture.CaptureFixture object at 0x7f07efaf7d90>
def test_ucaip_generated_get_model_evaluation_sample(capsys):
    get_model_evaluation_sample.get_model_evaluation_sample(
project=PROJECT_ID, model_id=MODEL_ID, evaluation_id=EVALUATION_ID
)
get_model_evaluation_tabular_regression_sample_test.py:27:
get_model_evaluation_sample.py:33: in get_model_evaluation_sample
response = client.get_model_evaluation(name=name)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:1022: in get_model_evaluation
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.:3: NotFound
Use latest generator version
This library is pinned to an older version of the microgenerator via synth.py (lines 38 to 42 in 42fa6fd).
Eventually, the library should move to using the latest version of the microgenerator to get new features and bug fixes.
I've confirmed that continuing to use the Docker image is fine for the time being, and AC Tools will continue to publish it in addition to supporting the Bazel workflow.
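Concretely, the pin is the `generator_version` argument in the `gapic.py_library` call that appears in the autosynth log above. A minimal sketch of the unpinned form — this is a synth.py fragment that assumes synthtool is installed and is not runnable on its own:

```python
import synthtool.gcp as gcp

gapic = gcp.GAPICMicrogenerator()

# Pinned today (as seen in the autosynth log above):
# library = gapic.py_library("aiplatform", "v1beta1", generator_version="0.20")

# Unpinned, so regeneration tracks the current microgenerator image:
library = gapic.py_library("aiplatform", "v1beta1")
```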
Fix Samples - Lint
Samples - Lint checks are failing. This needs to be fixed right after the current batch of sample PRs is merged. Example: https://source.cloud.google.com/results/invocations/86c56410-27c9-4fcb-ad03-6a3e45c23f2c
Synthesis failed for python-aiplatform
Hello! Autosynth couldn't regenerate python-aiplatform. 💔
Here's the output from running
synth.py
:Running: git config push.default simple 2020-10-29 22:33:03,874 autosynth [DEBUG] > Running: git branch -f autosynth 2020-10-29 22:33:03,877 autosynth [DEBUG] > Running: git checkout autosynth Switched to branch 'autosynth' 2020-10-29 22:33:03,888 autosynth [DEBUG] > Running: git rev-parse --show-toplevel 2020-10-29 22:33:03,892 autosynth [DEBUG] > Running: git log -1 --pretty=%H 2020-10-29 22:33:03,895 autosynth [DEBUG] > Running: git remote get-url origin 2020-10-29 22:33:04,209 autosynth [DEBUG] > Running: git log 487eba79f8260e34205d8ceb1ebcc65685085e19..HEAD --pretty=%H --no-decorate 2020-10-29 22:33:04,213 autosynth [DEBUG] > Running: git log -1 --pretty=%at 487eba79f8260e34205d8ceb1ebcc65685085e19 2020-10-29 22:33:04,217 autosynth [DEBUG] > Running: git log -1 --pretty=%at b6164c26a111f7f587099d31253abb96b5737bb2 2020-10-29 22:33:04,220 autosynth [DEBUG] > Running: git log -1 --pretty=%at e0ae456852bf22f38796deb79cff30b516fde244 2020-10-29 22:33:04,223 autosynth [DEBUG] > Running: git log -1 --pretty=%at befc24dcdeb8e57ec1259826fd33120b05137e8f 2020-10-29 22:33:04,226 autosynth [DEBUG] > Running: git log -1 --pretty=%at a850272b459e3578e6ec03f6a6d4a15c2fbb8082 2020-10-29 22:33:04,229 autosynth [DEBUG] > Running: git log -1 --pretty=%at ce61cf9ce6448082f56008aba3cee2aa77430063 2020-10-29 22:33:04,232 autosynth [DEBUG] > Running: git log -1 --pretty=%at 9b7fadcd01ca5acb03a23519b6afc2fd347df3b5 2020-10-29 22:33:04,236 autosynth [DEBUG] > Running: git log -1 --pretty=%at b65ef07d99946d23e900ef2cc490274a16edd336 2020-10-29 22:33:04,239 autosynth [DEBUG] > Running: git log -1 --pretty=%at 3c1fd09ba8d1c7b7092662f6f8330f521d4e7739 2020-10-29 22:33:04,242 autosynth [DEBUG] > Running: git log -1 --pretty=%at 477764cc4ee6db346d3febef2bb1ea0abf27de52 2020-10-29 22:33:04,245 autosynth [DEBUG] > Running: git log -1 --pretty=%at c81691e3116222154da62d979bf124b497ce66dc 2020-10-29 22:33:04,247 autosynth [DEBUG] > Running: git log -1 --pretty=%at 
27e0e916cbfdb3d5ff6639b686cc04f78a0b0386 2020-10-29 22:33:04,250 autosynth [DEBUG] > Running: git log -1 --pretty=%at d5d03413e1879108b5ade8839ce38de01be652da 2020-10-29 22:33:04,253 autosynth [DEBUG] > Running: git log -1 --pretty=%at 7c5370937dd9ba9dcf9cd7d2af880a58b389b4f1 2020-10-29 22:33:04,256 autosynth [DEBUG] > Running: git log -1 --pretty=%at 5451633881133e5573cc271a18e73b18caca8b1b 2020-10-29 22:33:04,259 autosynth [DEBUG] > Running: git log -1 --pretty=%at 31682399fd2cfce2d5856d31c6b328e839d5141c 2020-10-29 22:33:04,262 autosynth [DEBUG] > Running: git log -1 --pretty=%at da5c6050d13b4950c82666a81d8acd25157664ae 2020-10-29 22:33:04,265 autosynth [DEBUG] > Running: git log -1 --pretty=%at 77c5ba85e05950f5b19ce8a553c1c0db2fba9896 2020-10-29 22:33:04,268 autosynth [DEBUG] > Running: git log -1 --pretty=%at 2f8ac35b02608af41317873d8d663ecea91a524d 2020-10-29 22:33:04,271 autosynth [DEBUG] > Running: git log -1 --pretty=%at 6dc98b26a9c823e92fa21455fb6e5862613c49c2 2020-10-29 22:33:04,274 autosynth [DEBUG] > Running: git log -1 --pretty=%at 5a506ec8765cc04f7e29f888b8e9b257d9a7ae11 2020-10-29 22:33:04,277 autosynth [DEBUG] > Running: git log -1 --pretty=%at 6abb59097be84599a1d6091fe534a49e5c5cf948 2020-10-29 22:33:04,280 autosynth [DEBUG] > Running: git log -1 --pretty=%at f96d3b455fe27c3dc7bc37c3c9cd27b1c6d269c8 2020-10-29 22:33:04,283 autosynth [DEBUG] > Running: git log -1 --pretty=%at 901ddd44e9ef7887ee681b9183bbdea99437fdcc 2020-10-29 22:33:04,286 autosynth [DEBUG] > Running: git log -1 --pretty=%at 9593c3b5b714cc9b17c445aee8834ac2b4b9348b 2020-10-29 22:33:04,289 autosynth [DEBUG] > Running: git log -1 --pretty=%at 5f6ef0ec5501d33c4667885b37a7685a30d41a76 2020-10-29 22:33:04,292 autosynth [DEBUG] > Running: git log -1 --pretty=%at b7413d38b763827c72c0360f0a3d286c84656eeb 2020-10-29 22:33:04,295 autosynth [DEBUG] > Running: git log -1 --pretty=%at 721a7d2abb129029eca9d85a40da6eb7b8b1739a 2020-10-29 22:33:04,298 autosynth [DEBUG] > Running: git log -1 
--pretty=%at 1085e9cf6394ca68f6f690844a7ef3622777e8c8 2020-10-29 22:33:04,301 autosynth [DEBUG] > Running: git log -1 --pretty=%at a783321fd55f010709294455584a553f4b24b944 2020-10-29 22:33:04,304 autosynth [DEBUG] > Running: git log -1 --pretty=%at 89c849ba5013e45e8fb688b138f33c2ec6083dc5 2020-10-29 22:33:04,307 autosynth [DEBUG] > Running: git log -1 --pretty=%at f68649c5f26bcff6817c6d21e90dac0fc71fef8e 2020-10-29 22:33:04,310 autosynth [DEBUG] > Running: git log -1 --pretty=%at fd3584b01cedd6c9f79d08be4e2365085f955aa5 2020-10-29 22:33:04,313 autosynth [DEBUG] > Running: git log -1 --pretty=%at ea52b8a0bd560f72f376efcf45197fb7c8869120 2020-10-29 22:33:04,316 autosynth [DEBUG] > Running: git checkout 3c713d5cf47bf343bf53583296daed6161d4f4ed Note: checking out '3c713d5cf47bf343bf53583296daed6161d4f4ed'. You are in 'detached HEAD' state. You can look around, make experimental changes and commit them, and you can discard any commits you make in this state without impacting any branches by performing another checkout. If you want to create a new branch to retain commits you create, you may do so (now or later) by using -b with the checkout command again. Example: git checkout -b <new-branch-name> HEAD is now at 3c713d5 fix: re-add py sessions to noxfile (#22) 2020-10-29 22:33:04,326 autosynth [DEBUG] > Running: git checkout ea52b8a0bd560f72f376efcf45197fb7c8869120 Note: checking out 'ea52b8a0bd560f72f376efcf45197fb7c8869120'. You are in 'detached HEAD' state. You can look around, make experimental changes and commit them, and you can discard any commits you make in this state without impacting any branches by performing another checkout. If you want to create a new branch to retain commits you create, you may do so (now or later) by using -b with the checkout command again. 
Example: git checkout -b <new-branch-name> HEAD is now at ea52b8a docs: add proto-plus to intersphinx mapping (#832) 2020-10-29 22:33:04,333 autosynth [DEBUG] > Running: git branch -f autosynth-34 2020-10-29 22:33:04,336 autosynth [DEBUG] > Running: git checkout autosynth-34 Switched to branch 'autosynth-34' 2020-10-29 22:33:04,341 autosynth [INFO] > Running synthtool 2020-10-29 22:33:04,341 autosynth [INFO] > ['/usr/local/bin/python', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--'] 2020-10-29 22:33:04,341 autosynth [DEBUG] > log_file_path: /kokoro/artifacts/logs/python-aiplatform/34/sponge_log.log 2020-10-29 22:33:04,343 autosynth [DEBUG] > Running: /usr/local/bin/python -m synthtool --metadata synth.metadata synth.py -- 2020-10-29 22:33:04,550 synthtool [DEBUG] > Executing /root/.cache/synthtool/python-aiplatform/synth.py. On branch autosynth-34 nothing to commit, working tree clean 2020-10-29 22:33:04,677 synthtool [DEBUG] > Using precloned repo /root/.cache/synthtool/synthtool 2020-10-29 22:33:04,685 synthtool [DEBUG] > Ensuring dependencies. 
Traceback (most recent call last): File "/usr/local/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/usr/local/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/synthtool/synthtool/__main__.py", line 102, in <module> main() File "/usr/local/lib/python3.6/site-packages/click/core.py", line 829, in __call__ return self.main(*args, **kwargs) File "/usr/local/lib/python3.6/site-packages/click/core.py", line 782, in main rv = self.invoke(ctx) File "/usr/local/lib/python3.6/site-packages/click/core.py", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File "/usr/local/lib/python3.6/site-packages/click/core.py", line 610, in invoke return callback(*args, **kwargs) File "/synthtool/synthtool/__main__.py", line 94, in main spec.loader.exec_module(synth_module) # type: ignore File "<frozen importlib._bootstrap_external>", line 678, in exec_module File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed File "/root/.cache/synthtool/python-aiplatform/synth.py", line 25, in <module> gapic = gcp.GAPICMicrogenerator() File "/synthtool/synthtool/gcp/gapic_microgenerator.py", line 46, in __init__ self._ensure_dependencies_installed() File "/synthtool/synthtool/gcp/gapic_microgenerator.py", line 255, in _ensure_dependencies_installed f"Dependencies missing: {', '.join(failed_dependencies)}" OSError: Dependencies missing: docker 2020-10-29 22:33:04,749 autosynth [ERROR] > Synthesis failed 2020-10-29 22:33:04,749 autosynth [DEBUG] > Running: git reset --hard HEAD HEAD is now at 3c713d5 fix: re-add py sessions to noxfile (#22) 2020-10-29 22:33:04,754 autosynth [DEBUG] > Running: git checkout autosynth Switched to branch 'autosynth' 2020-10-29 22:33:04,759 autosynth [DEBUG] > Running: git clean -fdx Removing __pycache__/ Traceback (most recent call last): File "/usr/local/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File 
"/usr/local/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/synthtool/autosynth/synth.py", line 354, in <module> main() File "/synthtool/autosynth/synth.py", line 189, in main return _inner_main(temp_dir) File "/synthtool/autosynth/synth.py", line 334, in _inner_main commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer) File "/synthtool/autosynth/synth.py", line 65, in synthesize_loop has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest) File "/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch synthesizer.synthesize(synth_log_path, self.environ) File "/synthtool/autosynth/synthesizer.py", line 120, in synthesize synth_proc.check_returncode() # Raise an exception. File "/usr/local/lib/python3.6/subprocess.py", line 389, in check_returncode self.stderr) subprocess.CalledProcessError: Command '['/usr/local/bin/python', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
Google internal developers can see the full log here.
samples.snippets.get_model_evaluation_video_classification_sample_test: test_ucaip_generated_get_model_evaluation_sample failed
This test failed!
To configure my behavior, see the Build Cop Bot documentation.
If I'm commenting on this issue too often, add the
buildcop: quiet
label and
I will stop commenting.
commit: e6acf37
buildURL: Build Status, Sponge
status: failed
Test output
args = (name: "projects/ucaip-sample-tests/locations/us-central1/models/667940119734386688/evaluations/789396572185034752" ,) kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/667940119734386688/evaluations/789396572185034752'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
return callable_(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:
self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f07efc0fd90>
request = name: "projects/ucaip-sample-tests/locations/us-central1/models/667940119734386688/evaluations/789396572185034752"
timeout = None
metadata = [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/667940119734386688/evaluations/789396572185034752'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]
credentials = None, wait_for_ready = None, compression = None
def __call__(self, request, timeout=None, metadata=None, credentials=None, wait_for_ready=None, compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials, wait_for_ready, compression)
return _end_unary_response_blocking(state, call, False, None)
.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:923:
state = <grpc._channel._RPCState object at 0x7f07efc0f890>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07efd6e820>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907542.423832671","description":"Error received from peer ipv4:74.125.142.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826: _InactiveRpcError
The above exception was the direct cause of the following exception:
capsys = <_pytest.capture.CaptureFixture object at 0x7f07efb16490>
def test_ucaip_generated_get_model_evaluation_sample(capsys):
    get_model_evaluation_video_classification_sample.get_model_evaluation_video_classification_sample(
project=PROJECT_ID, model_id=MODEL_ID, evaluation_id=EVALUATION_ID
)
get_model_evaluation_video_classification_sample_test.py:27:
get_model_evaluation_video_classification_sample.py:33: in get_model_evaluation_video_classification_sample
response = client.get_model_evaluation(name=name)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:1022: in get_model_evaluation
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.:3: NotFound
style recommendation - wrap long urls in samples
Issue raised in PR #14
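One common way to satisfy lint line-length limits without changing a sample's behavior is Python's implicit string concatenation; the URL below is purely illustrative, not a real resource from this repository:

```python
# One long URL split across source lines; the resulting value is identical
# to the single-line form, so nothing the sample does changes.
DATASET_URI = (
    "gs://cloud-samples-data/ai-platform-unified/datasets/tabular/"
    "example-tabular-classification.csv"
)

assert "\n" not in DATASET_URI  # concatenation adds no whitespace
```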
Should use "tabular" in schema types.
This file (and maybe others too) refers to "tables" when it should be "tabular": https://github.com/googleapis/python-aiplatform/blob/master/google/cloud/aiplatform/v1beta1/schema/trainingjob/definition_v1beta1/types/automl_tables.py
export_model samples should show `export_format_id` in the request
export_model samples should show
export_format_id
in the request. However, the service should default to the first supported format.
Reference: #194
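A sketch of what such a sample could look like. Field names follow `ExportModelRequest.OutputConfig` in `aiplatform_v1beta1`; the model path, format id, and bucket below are placeholder assumptions, and the service call itself is shown commented out because it requires credentials:

```python
# Sketch of an export_model request that makes export_format_id explicit.
model_name = "projects/my-project/locations/us-central1/models/1234567890"

output_config = {
    # Stated explicitly in the sample, per this issue; when omitted, the
    # service is expected to default to the first supported format.
    "export_format_id": "tf-saved-model",
    "artifact_destination": {
        "gcs_destination": {"output_uri_prefix": "gs://my-bucket/model-export/"}
    },
}

# The actual call (requires credentials, so not executed here):
# from google.cloud import aiplatform_v1beta1
# client = aiplatform_v1beta1.ModelServiceClient()
# operation = client.export_model(name=model_name, output_config=output_config)
# operation.result(timeout=300)
```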
samples.snippets.create_batch_prediction_job_video_classification_sample_test: test_ucaip_generated_create_batch_prediction_vcn_sample failed
This test failed!
To configure my behavior, see the Build Cop Bot documentation.
If I'm commenting on this issue too often, add the
buildcop: quiet
label and
I will stop commenting.
commit: e6acf37
buildURL: Build Status, Sponge
status: failed
Test output
@pytest.fixture(scope="function")
def shared_state():
    shared_state = {}
    yield shared_state
assert "/" in shared_state["batch_prediction_job_name"]
E KeyError: 'batch_prediction_job_name'
create_batch_prediction_job_video_classification_sample_test.py:44: KeyError
samples.snippets.get_model_evaluation_video_object_tracking_sample_test: test_ucaip_generated_get_model_evaluation_sample failed
This test failed!
To configure my behavior, see the Build Cop Bot documentation.
If I'm commenting on this issue too often, add the
buildcop: quiet
label and
I will stop commenting.
commit: e6acf37
buildURL: Build Status, Sponge
status: failed
Test output
args = (name: "projects/ucaip-sample-tests/locations/us-central1/models/20547673299877888/evaluations/1165447141070471168" ,) kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/20547673299877888/evaluations/1165447141070471168'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
return callable_(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:
self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f07efb38a90>
request = name: "projects/ucaip-sample-tests/locations/us-central1/models/20547673299877888/evaluations/1165447141070471168"
timeout = None
metadata = [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/20547673299877888/evaluations/1165447141070471168'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]
credentials = None, wait_for_ready = None, compression = None
def __call__(self, request, timeout=None, metadata=None, credentials=None, wait_for_ready=None, compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials, wait_for_ready, compression)
return _end_unary_response_blocking(state, call, False, None)
.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:923:
state = <grpc._channel._RPCState object at 0x7f07efd52850>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f07efbc9550>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907542.929950362","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826: _InactiveRpcError
The above exception was the direct cause of the following exception:
capsys = <_pytest.capture.CaptureFixture object at 0x7f07efbac810>
def test_ucaip_generated_get_model_evaluation_sample(capsys):
    get_model_evaluation_video_object_tracking_sample.get_model_evaluation_video_object_tracking_sample(
project=PROJECT_ID, model_id=MODEL_ID, evaluation_id=EVALUATION_ID
)
get_model_evaluation_video_object_tracking_sample_test.py:27:
get_model_evaluation_video_object_tracking_sample.py:33: in get_model_evaluation_video_object_tracking_sample
response = client.get_model_evaluation(name=name)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:1022: in get_model_evaluation
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.:3: NotFound
samples.snippets.import_data_text_entity_extraction_sample_test: test_ucaip_generated_import_data_text_entity_extraction_sample failed
This test failed!
To configure my behavior, see the Build Cop Bot documentation.
If I'm commenting on this issue too often, add the
buildcop: quiet
label and
I will stop commenting.
commit: 860d12e
buildURL: Build Status, Sponge
status: failed
Test output
target = functools.partial(<bound method PollingFuture._done_or_raise of <google.api_core.operation.Operation object at 0x7fa736b2bcd0>>)
predicate = <function if_exception_type.<locals>.if_exception_type_predicate at 0x7fa73dcfaf80>
sleep_generator = <generator object exponential_sleep_generator at 0x7fa736b99b50>
deadline = 1800, on_error = None

def retry_target(target, predicate, sleep_generator, deadline, on_error=None):
    """Call a function and retry if it fails.

    This is the lowest-level retry helper. Generally, you'll use the
    higher-level retry helper :class:`Retry`.

    Args:
        target(Callable): The function to call and retry. This must be a
            nullary function - apply arguments with `functools.partial`.
        predicate (Callable[Exception]): A callable used to determine if an
            exception raised by the target should be considered retryable.
            It should return True to retry or False otherwise.
        sleep_generator (Iterable[float]): An infinite iterator that
            determines how long to sleep between retries.
        deadline (float): How long to keep retrying the target. The last
            sleep period is shortened as necessary, so that the last retry
            runs at ``deadline`` (and not considerably beyond it).
        on_error (Callable[Exception]): A function to call while processing
            a retryable exception. Any error raised by this function will
            *not* be caught.

    Returns:
        Any: the return value of the target function.

    Raises:
        google.api_core.RetryError: If the deadline is exceeded while retrying.
        ValueError: If the sleep generator stops yielding values.
        Exception: If the target raises a method that isn't retryable.
    """
    if deadline is not None:
        deadline_datetime = datetime_helpers.utcnow() + datetime.timedelta(
            seconds=deadline
        )
    else:
        deadline_datetime = None
    last_exc = None
    for sleep in sleep_generator:
        try:
return target()
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/retry.py:184:
self = <google.api_core.operation.Operation object at 0x7fa736b2bcd0>
retry = <google.api_core.retry.Retry object at 0x7fa73dd074d0>

def _done_or_raise(self, retry=DEFAULT_RETRY):
    """Check if the future is done and raise if it's not."""
    kwargs = {} if retry is DEFAULT_RETRY else {"retry": retry}
    if not self.done(**kwargs):
raise _OperationNotComplete()
E google.api_core.future.polling._OperationNotComplete
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/future/polling.py:86: _OperationNotComplete
The above exception was the direct cause of the following exception:
self = <google.api_core.operation.Operation object at 0x7fa736b2bcd0>
timeout = 1800, retry = <google.api_core.retry.Retry object at 0x7fa73dd074d0>

def _blocking_poll(self, timeout=None, retry=DEFAULT_RETRY):
    """Poll and wait for the Future to be resolved.

    Args:
        timeout (int): How long (in seconds) to wait for the operation to
            complete. If None, wait indefinitely.
    """
    if self._result_set:
        return
    retry_ = self._retry.with_deadline(timeout)
    try:
        kwargs = {} if retry is DEFAULT_RETRY else {"retry": retry}
retry_(self._done_or_raise)(**kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/future/polling.py:107:
args = (), kwargs = {}
target = functools.partial(<bound method PollingFuture._done_or_raise of <google.api_core.operation.Operation object at 0x7fa736b2bcd0>>)
sleep_generator = <generator object exponential_sleep_generator at 0x7fa736b99b50>

@general_helpers.wraps(func)
def retry_wrapped_func(*args, **kwargs):
    """A wrapper that calls target function with retry."""
    target = functools.partial(func, *args, **kwargs)
    sleep_generator = exponential_sleep_generator(
        self._initial, self._maximum, multiplier=self._multiplier
    )
    return retry_target(
        target,
        self._predicate,
        sleep_generator,
        self._deadline,
on_error=on_error,
)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/retry.py:286:
target = functools.partial(<bound method PollingFuture._done_or_raise of <google.api_core.operation.Operation object at 0x7fa736b2bcd0>>)
predicate = <function if_exception_type..if_exception_type_predicate at 0x7fa73dcfaf80>
sleep_generator = <generator object exponential_sleep_generator at 0x7fa736b99b50>
deadline = 1800, on_error = None

def retry_target(target, predicate, sleep_generator, deadline, on_error=None):
    """Call a function and retry if it fails.

    This is the lowest-level retry helper. Generally, you'll use the
    higher-level retry helper :class:`Retry`.

    Args:
        target(Callable): The function to call and retry. This must be a
            nullary function - apply arguments with `functools.partial`.
        predicate (Callable[Exception]): A callable used to determine if an
            exception raised by the target should be considered retryable.
            It should return True to retry or False otherwise.
        sleep_generator (Iterable[float]): An infinite iterator that
            determines how long to sleep between retries.
        deadline (float): How long to keep retrying the target. The last
            sleep period is shortened as necessary, so that the last retry
            runs at ``deadline`` (and not considerably beyond it).
        on_error (Callable[Exception]): A function to call while processing
            a retryable exception. Any error raised by this function will
            *not* be caught.

    Returns:
        Any: the return value of the target function.

    Raises:
        google.api_core.RetryError: If the deadline is exceeded while retrying.
        ValueError: If the sleep generator stops yielding values.
        Exception: If the target raises a method that isn't retryable.
    """
    if deadline is not None:
        deadline_datetime = datetime_helpers.utcnow() + datetime.timedelta(
            seconds=deadline
        )
    else:
        deadline_datetime = None
    last_exc = None
    for sleep in sleep_generator:
        try:
            return target()
        # pylint: disable=broad-except
        # This function explicitly must deal with broad exceptions.
        except Exception as exc:
            if not predicate(exc):
                raise
            last_exc = exc
            if on_error is not None:
                on_error(exc)
        now = datetime_helpers.utcnow()
        if deadline_datetime is not None:
            if deadline_datetime <= now:
                six.raise_from(
                    exceptions.RetryError(
                        "Deadline of {:.1f}s exceeded while calling {}".format(
                            deadline, target
                        ),
                        last_exc,
                    ),
last_exc,
)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/retry.py:206:
value = None, from_value = _OperationNotComplete()
???
E google.api_core.exceptions.RetryError: Deadline of 1800.0s exceeded while calling functools.partial(<bound method PollingFuture._done_or_raise of <google.api_core.operation.Operation object at 0x7fa736b2bcd0>>), last exception::3: RetryError
During handling of the above exception, another exception occurred:
capsys = <_pytest.capture.CaptureFixture object at 0x7fa736b23610>
dataset_name = 'projects/580378083368/locations/us-central1/datasets/1868888292242489344'

def test_ucaip_generated_import_data_text_entity_extraction_sample(capsys, dataset_name):
    dataset_id = dataset_name.split("/")[-1]
    import_data_text_entity_extraction_sample.import_data_text_entity_extraction_sample(
gcs_source_uri=GCS_SOURCE, project=PROJECT_ID, dataset_id=dataset_id
)
import_data_text_entity_extraction_sample_test.py:57:
import_data_text_entity_extraction_sample.py:40: in import_data_text_entity_extraction_sample
import_data_response = response.result(timeout=timeout)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/future/polling.py:129: in result
self._blocking_poll(timeout=timeout, **kwargs)
self = <google.api_core.operation.Operation object at 0x7fa736b2bcd0>
timeout = 1800, retry = <google.api_core.retry.Retry object at 0x7fa73dd074d0>

def _blocking_poll(self, timeout=None, retry=DEFAULT_RETRY):
    """Poll and wait for the Future to be resolved.

    Args:
        timeout (int): How long (in seconds) to wait for the operation to
            complete. If None, wait indefinitely.
    """
    if self._result_set:
        return
    retry_ = self._retry.with_deadline(timeout)
    try:
        kwargs = {} if retry is DEFAULT_RETRY else {"retry": retry}
        retry_(self._done_or_raise)(**kwargs)
    except exceptions.RetryError:
        raise concurrent.futures.TimeoutError(
"Operation did not complete within the designated " "timeout."
)
E concurrent.futures._base.TimeoutError: Operation did not complete within the designated timeout.
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/future/polling.py:110: TimeoutError
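The RetryError and the TimeoutError above come from api_core's low-level retry loop polling the long-running import operation past its 1800s deadline. Its behavior can be summarized in a few lines (a simplified sketch of the semantics, not the library's actual code; sleeping between attempts is omitted):

```python
import datetime

def retry_target(target, predicate, sleep_generator, deadline):
    """Simplified sketch of google.api_core.retry.retry_target: call
    `target` until it succeeds, re-raise non-retryable errors immediately,
    and raise once `deadline` seconds have elapsed."""
    end = datetime.datetime.utcnow() + datetime.timedelta(seconds=deadline)
    last_exc = None
    for _sleep in sleep_generator:
        try:
            return target()
        except Exception as exc:
            if not predicate(exc):
                raise  # non-retryable: propagate unchanged
            last_exc = exc
        if datetime.datetime.utcnow() >= end:
            # Mirrors "Deadline of 1800.0s exceeded while calling ..."
            raise TimeoutError(
                f"Deadline of {deadline:.1f}s exceeded"
            ) from last_exc
    raise ValueError("Sleep generator stopped yielding sleep values.")
```

In the failing test, every poll of the operation raises `_OperationNotComplete` (retryable), so the loop keeps going until the deadline trips.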
Pass fixed resources as environment variables
Some sample tests use hardcoded fixed resource values which external users won't be able to access.
We should pass these resources as environment variables, so that the user points to their own resources.
There's some discussion about trying to get publicly accessible fixed resources, but that's not feasible at this time.
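A minimal sketch of what the proposal above could look like in a sample test, with fallback defaults so the intent is obvious (the variable names and defaults are illustrative assumptions, not the repo's actual conventions):

```python
import os

# Hypothetical environment-driven fixed resources: external users export
# these to point the tests at resources they can access.
PROJECT_ID = os.environ.get("BUILD_SPECIFIC_GCLOUD_PROJECT", "your-project-id")
MODEL_ID = os.environ.get("MODEL_ID", "your-model-id")

def resource_name(project: str, model_id: str) -> str:
    # Assemble the fully qualified model resource name used by the samples.
    return f"projects/{project}/locations/us-central1/models/{model_id}"
```

Hardcoded names like `projects/ucaip-sample-tests/...` in the tracebacks above would then only appear as one user's environment values, not in the source.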
unable to import google.cloud.aiplatform.gapic.schema in version 0.3.1
Environment details
- OS type and version: Debian 10
- Python version: 3.8.5
- pip version: pip 20.3.3
google-cloud-aiplatform version: 0.3.1
Steps to reproduce
I'm trying to construct a client predict request for images in a deployed object detection model endpoint. Following this example.
https://github.com/googleapis/python-aiplatform/blob/master/samples/snippets/predict_image_object_detection_sample.py
Code example
from google.cloud.aiplatform.gapic.schema import predict
Stack trace
ModuleNotFoundError: No module named 'google.cloud.aiplatform.gapic.schema'
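If the schema helpers aren't present in the installed release (they appear to postdate 0.3.1, an assumption based on this report), a sample can probe for the module before using it and fall back to building the prediction payload as a plain dict. A hedged sketch:

```python
import importlib.util

def gapic_schema_available() -> bool:
    """Return True if google.cloud.aiplatform.gapic.schema can be imported.

    On installs like the reported 0.3.1 this returns False, in which case
    the predict request payload has to be assembled by hand instead of via
    the typed schema helpers.
    """
    try:
        return importlib.util.find_spec(
            "google.cloud.aiplatform.gapic.schema"
        ) is not None
    except ModuleNotFoundError:
        # Raised when a parent package (e.g. google.cloud.aiplatform
        # itself) is not installed at all.
        return False
```

Upgrading `google-cloud-aiplatform` to the release the sample was written against is the more direct fix.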
Fails to install alongside packages that depend on older versions of `mock`
Hi,
I'm using the latest pip version, where the default dependency resolver changed, and there are some issues installing google-cloud-aiplatform alongside apache-beam due to incompatible versions of mock.
(env) ➜ python --version
Python 3.8.5
(env) ➜ pip --version
pip 20.3.3 from /usr/local/google/home/dcavazos/src/sandbox/env/lib/python3.8/site-packages/pip (python 3.8)
(env) ➜ pip install apache-beam==2.27.0 google-cloud-aiplatform==0.4.0
# Installation fails after a very long time
I searched through the python-aiplatform repo and mock is only used for testing, so it could safely be part of tests_require instead of install_requires in the setup.py file.
➜ python-aiplatform git:(master) egrep -R '(import|from) mock' *
tests/unit/gapic/aiplatform_v1beta1/test_specialist_pool_service.py:import mock
tests/unit/gapic/aiplatform_v1beta1/test_dataset_service.py:import mock
tests/unit/gapic/aiplatform_v1beta1/test_endpoint_service.py:import mock
tests/unit/gapic/aiplatform_v1beta1/test_prediction_service.py:import mock
tests/unit/gapic/aiplatform_v1beta1/test_migration_service.py:import mock
tests/unit/gapic/aiplatform_v1beta1/test_model_service.py:import mock
tests/unit/gapic/aiplatform_v1beta1/test_job_service.py:import mock
tests/unit/gapic/aiplatform_v1beta1/test_pipeline_service.py:import mock
The mock dependency should also be part of tests_require in Apache Beam, but due to BEAM-8840 the setup_requires and tests_require sections were removed.
Using an older version of pip like 20.2.* throws an error/warning, but the installation still succeeds.
# This error/warning shows when using pip 20.2.4; both packages can still be installed.
ERROR: google-cloud-aiplatform 0.4.0 has requirement mock>=4.0.2, but you'll have mock 2.0.0 which is incompatible.
Starting with pip 20.3, the new dependency resolver cannot install both libraries together due to the mock version incompatibility.
Environment details
- OS type and version: Linux 5.7.17-1rodete4-amd64 #1 SMP Debian 5.7.17-1rodete4 (2020-10-01) x86_64
- Python version: Python 3.8.5
- pip version: pip 20.3.3 from /usr/local/google/home/dcavazos/src/sandbox/env/lib/python3.8/site-packages/pip (python 3.8)
- google-cloud-aiplatform version: 0.4.0, which cannot be installed alongside apache-beam==2.27.0
Steps to reproduce
1. Update pip to the latest version:
pip install -U pip
2. Install apache-beam and google-cloud-aiplatform (it takes a really long time to resolve dependencies as well, but that's out of scope here):
pip install apache-beam==2.27.0 google-cloud-aiplatform==0.4.0
Suggested fix
In the setup.py file, create a new section called tests_require and move the mock dependency to it.
Workaround
In the meantime, the only workaround is to force downgrading your pip version before installing your requirements, which is not always possible in some managed services.
pip install -U pip=='20.2.*'
pip install -r requirements.txt
GCP AI Platform (unified) Python export_model FailedPrecondition: 400 Exporting artifact in format `` is not supported
I am using the Google AiPlatform (Unified) Python client to export a trained model to a Google Cloud bucket. I am following the sample code from: export_model_sample.
The application has "owner" credentials at the moment because I want to make sure it is not a permissions issue. However, when I try to execute the sample code I am getting the following error:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
return callable_(*args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/grpc/_channel.py", line 923, in __call__
return _end_unary_response_blocking(state, call, False, None)
File "/usr/local/lib/python3.8/site-packages/grpc/_channel.py", line 826, in _end_unary_response_blocking
raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.FAILED_PRECONDITION
details = "Exporting artifact for model `projects/101010101010/locations/us-central1/models/123123123123123` in format `` is not supported."
debug_error_string = "{"created":"@1611864688.554145696","description":"Error received from peer ipv4:172.217.12.202:443","file":"src/core/lib/surface/call.cc","file_line":1067,"grpc_message":"Exporting artifact for model `projects/110101010101/locations/us-central1/models/123123123123123` in format `` is not supported.","grpc_status":9}"
The above exception was the direct cause of the following exception:
Traceback (most recent call last): File "/app/main.py", line 667, in
response = aiplatform_model_client.export_model(name=name, output_config=output_config) File
"/usr/local/lib/python3.8/site-packages/google/cloud/aiplatform_v1beta1/services/model_service/client.py",
line 937, in export_model
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) File
"/usr/local/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py",
line 145, in __call__
return wrapped_func(*args, **kwargs) File "/usr/local/lib/python3.8/site-packages/google/api_core/grpc_helpers.py",
line 59, in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
File "<string>", line 3, in raise_from
google.api_core.exceptions.FailedPrecondition: 400 Exporting artifact
for model
projects/111101010101/locations/us-central1/models/123123123123123123
in format `` is not supported.
(I have omitted the project ID and the model ID, using 10101... and 123123... in their place.)
I have verified my inputs but everything seems ok:
gcs_destination_output_uri_prefix = "gs://my-bucket-vcm/model-123123123123123/tflite/2021-01-28T16:00:00.000Z/"
gcs_destination = {"output_uri_prefix": gcs_destination_output_uri_prefix}
output_config = {"artifact_destination": gcs_destination}
name = "projects/10101010101/locations/us-central1/models/123123123123123"
response = aiplatform_model_client.export_model(name=name, output_config=output_config)
print("Long running operation:", response.operation.name)
export_model_response = response.result(timeout=300)
print("export_model_response:", export_model_response)
I am also using the latest version of google-cloud-aiplatform==0.4.0
The trained model that I am trying to export is of type MOBILE_TF_LOW_LATENCY_1.
I would like to just export the model to a cloud bucket.
My requirements.txt is:
firebase-admin==4.5.0
google-api-python-client==1.12.8
google-cloud-error-reporting==1.1.0
google-cloud-secret-manager==2.1.0
google-cloud-firestore==2.0.2
google-cloud-core==1.5.0
google-cloud-pubsub==2.2.0
google-cloud-aiplatform==0.4.0
python-dateutil
pandas
numpy
pyopenssl
requests
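The empty backticks in the error message suggest that no export format was selected in the request. The v1beta1 `ExportModelRequest.OutputConfig` also accepts an `export_format_id` field, and the ID must be one of the model's `supported_export_formats`; for an AutoML Edge model such as MOBILE_TF_LOW_LATENCY_1, `"tflite"` is the likely value. A sketch of the corrected config (the format ID is an assumption, not verified against this specific model):

```python
# Same request as above, but with an explicit export format. "tflite" is
# assumed here for a MOBILE_TF_LOW_LATENCY_1 model; check the model's
# supported_export_formats field for the authoritative list of IDs.
output_config = {
    "export_format_id": "tflite",
    "artifact_destination": {
        "output_uri_prefix": "gs://my-bucket-vcm/model-123123123123123/tflite/"
    },
}
# response = aiplatform_model_client.export_model(
#     name=name, output_config=output_config
# )
```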
samples.snippets.get_model_evaluation_slice_sample_test: test_ucaip_generated_get_model_evaluation_slice_sample failed
This test failed!
To configure my behavior, see the Build Cop Bot documentation.
If I'm commenting on this issue too often, add the
buildcop: quiet
label and
I will stop commenting.
commit: e6acf37
buildURL: Build Status, Sponge
status: failed
Test output
args = (name: "projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluations/5615675837586029221/slices/4322488217836113260" ,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/516225107287343...37586029221/slices/4322488217836113260'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]}

@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
return callable_(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:57:
self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7f07efce34d0>
request = name: "projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluations/5615675837586029221/slices/4322488217836113260"
timeout = None
metadata = [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/models/5162251072873431040/evaluati...837586029221/slices/4322488217836113260'), ('x-goog-api-client', 'gl-python/3.7.7 grpc/1.33.2 gax/1.23.0 gapic/0.3.1')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self, request, timeout=None, metadata=None, credentials=None, wait_for_ready=None, compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials, wait_for_ready, compression)
return _end_unary_response_blocking(state, call, False, None)
.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:923:
state = <grpc._channel._RPCState object at 0x7f07efc98e90>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f08039cc870>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.NOT_FOUND
E details = "The Model does not exist."
E debug_error_string = "{"created":"@1606907541.010522189","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"
E >.nox/py-3-7/lib/python3.7/site-packages/grpc/_channel.py:826: _InactiveRpcError
The above exception was the direct cause of the following exception:
capsys = <_pytest.capture.CaptureFixture object at 0x7f07fcc64410>
def test_ucaip_generated_get_model_evaluation_slice_sample(capsys):
    get_model_evaluation_slice_sample.get_model_evaluation_slice_sample(
        project=PROJECT_ID,
        model_id=MODEL_ID,
        evaluation_id=EVALUATION_ID,
slice_id=SLICE_ID,
)
get_model_evaluation_slice_sample_test.py:31:
get_model_evaluation_slice_sample.py:38: in get_model_evaluation_slice_sample
response = client.get_model_evaluation_slice(name=name)
../../google/cloud/aiplatform_v1beta1/services/model_service/client.py:1184: in get_model_evaluation_slice
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-7/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:59: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The Model does not exist."
...","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"The Model does not exist.","grpc_status":5}"???
E google.api_core.exceptions.NotFound: 404 The Model does not exist.:3: NotFound
Add comment with link to node hours pricing
Issue raised by PR #12
link to specific commit