This library has moved to https://github.com/googleapis/google-cloud-python/tree/main/packages/google-cloud-tasks
License: Apache License 2.0
python-tasks: Introduction
NOTE
This GitHub repository is archived. The repository contents and history have moved to google-cloud-python.
Python Client for Cloud Tasks API
Cloud Tasks API: a fully managed service that allows you to manage the execution, dispatch and delivery of a large number of distributed tasks. You can asynchronously perform work outside of a user request. Your tasks can be executed on App Engine or any arbitrary HTTP endpoint.
Install this library in a virtual environment using venv. venv is a tool that creates isolated Python environments. These isolated environments can have separate versions of Python packages, which allows you to isolate one project's dependencies from the dependencies of other projects.
With venv, it's possible to install this library without needing system install permissions, and without clashing with the installed system dependencies.
Code samples and snippets
Code samples and snippets live in the samples/ folder.
Supported Python Versions
Our client libraries are compatible with all current active and maintenance versions of Python.
Python >= 3.7
Unsupported Python Versions
Python <= 3.6
If you are using an end-of-life version of Python, we recommend that you update as soon as possible to an actively supported version.
state = <grpc._channel._RPCState object at 0x7ff6c8afe2b0>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7ff6c67a5f40>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
        raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626174370.877133679","description":"Error received from peer ipv4:173.194.202.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >
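An UNAUTHENTICATED status like the one above typically means the client found no usable Application Default Credentials. A minimal, hedged sketch of pointing ADC at a service-account key file (the path is hypothetical; running `gcloud auth application-default login` is the interactive alternative):

```python
import os

# Hypothetical key path. Set this before constructing CloudTasksClient so
# Application Default Credentials can locate a service-account key.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/key.json"
```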
Is there a documentation with full examples on how to create a task in the new version?
We've updated the library (to 2.2.0) and our code no longer works. I haven't found a full example of how to create a task and am stuck on setting headers.
We're supposed to use types.HttpRequest.HeadersEntry, but how do I actually set the header values?
I've tried HeadersEntry(content_type='application/json') and HeadersEntry(**{"Content-Type": "application/json"}), and neither works.
task = {
    'app_engine_http_request': {  # Specify the type of request.
        'http_method': tasks_v2.HttpMethod.POST,
        'relative_uri': '/example_task_handler'
    }
}
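On the headers question above: in the 2.x library the HttpRequest.headers field is a protobuf map, so it appears a plain Python dict can be passed directly, with no need to construct HeadersEntry by hand. A hedged sketch (the URL and body are hypothetical):

```python
# Hypothetical task payload. 'headers' is a plain dict; the proto marshaling
# layer converts it into the map field, so HeadersEntry is never
# instantiated directly.
task = {
    "http_request": {
        "http_method": "POST",  # or tasks_v2.HttpMethod.POST
        "url": "https://example.com/task_handler",
        "headers": {"Content-Type": "application/json"},
        "body": b'{"key": "value"}',
    }
}
```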
I modified it to be an http_request rather than an app_engine_http_request after reading through the docs at https://cloud.google.com/tasks/docs/reference/rest/v2/projects.locations.queues.tasks/create. I also want to use the OAuthToken feature so that Cloud Tasks can trigger a call to another Google Cloud API. Because the http_method above uses tasks_v2.HttpMethod.POST, I assumed I needed to use tasks_v2.OAuthToken() in my task definition. I ended up with this (I'm including a full snippet, but note the task definition):
But when I run that code I get the following error:
Traceback (most recent call last):
File "/path/to/example_tasks.py", line 18, in <module>
resp = client.create_task(parent=parent, task=task)
File "/path/to/.venv/lib/python3.9/site-packages/google/cloud/tasks_v2/services/cloud_tasks/client.py", line 1700, in create_task
request.task = task
File "/path/to/.venv/lib/python3.9/site-packages/proto/message.py", line 632, in __setattr__
pb_value = marshal.to_proto(pb_type, value)
File "/path/to/.venv/lib/python3.9/site-packages/proto/marshal/marshal.py", line 208, in to_proto
pb_value = rule.to_proto(value)
File "/path/to/.venv/lib/python3.9/site-packages/proto/marshal/rules/message.py", line 32, in to_proto
return self._descriptor(**value)
TypeError: Parameter to MergeFrom() must be instance of same class: expected google.cloud.tasks.v2.OAuthToken got OAuthToken.
That error seems odd, because it looks like I passed it exactly what it expected. I suspect this is a bug, but below you'll see I got it working with a dict, so I wasn't sure; I submitted this as a support request rather than a bug report.
After some fumbling around I noticed if I pass it a dictionary it seems to work. The example payload here is:
I'm working on a POC using Cloud Tasks, and I'm making progress by testing various things and reading through the code, but it has been a bit frustrating so far.
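For reference, a dict-based task of the shape that reportedly works in the OAuth case might look roughly like the following (all values are hypothetical). Nesting oauth_token as a plain dict lets the library marshal it into google.cloud.tasks.v2.OAuthToken itself, sidestepping the MergeFrom() class-mismatch error above:

```python
# Hypothetical values throughout; a sketch, not the library's documented
# example. The nested dicts are converted to proto messages by the client.
task = {
    "http_request": {
        "http_method": "POST",  # or tasks_v2.HttpMethod.POST
        "url": "https://www.googleapis.com/example/endpoint",
        "oauth_token": {
            "service_account_email": "sa@example-project.iam.gserviceaccount.com",
        },
    }
}
```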
According to the Cloud Tasks API, when the audience field is unset, it defaults to the URI specified in the
HttpRequest's target. However, this becomes a problem when the URI itself contains query parameters or fragments: having these in the audience field leads to a 401.
Part of this issue lies with the API itself, but as far as documentation and examples go, we should make the audience field explicit, so that customers who start from Google-provided examples avoid this problem.
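One way to make the audience explicit, sketched with only the standard library (the URL is hypothetical): derive it from the target URL with the query string and fragment stripped, rather than letting it default to the full URI.

```python
from urllib.parse import urlsplit

# Hypothetical target URL carrying a query string.
url = "https://example-service.a.run.app/handler?job=42"

# Build an audience without query parameters or fragments; leaving the
# audience unset makes it default to the full URI, which can yield a 401.
parts = urlsplit(url)
audience = f"{parts.scheme}://{parts.netloc}{parts.path}"
```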
The Types for Cloud Tasks API documentation seems to have a markdown formatting issue. There are a number of places where a property is specified in square brackets immediately followed by the same property fully specified with the library in square brackets, e.g. [AppEngineHttpRequest][google.cloud.tasks.v2.AppEngineHttpRequest]
Search for occurrences of a closing square bracket immediately followed by an opening square bracket i.e. ][
The first occurrence is at AppEngineHttpRequest. The issue also exists in max_burst_size. A search for "][" returns 138 results, but it's possible that some results are not indicative of markdown issues.
I'm running a Python app on GAE and since a few hours, all calls to create a task are returning the following error :
Traceback (most recent call last):
  File "/env/lib/python3.7/site-packages/flask/app.py", line 2446, in wsgi_app
    response = self.full_dispatch_request()
  File "/env/lib/python3.7/site-packages/flask/app.py", line 1951, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/env/lib/python3.7/site-packages/flask/app.py", line 1820, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/env/lib/python3.7/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/env/lib/python3.7/site-packages/flask/app.py", line 1949, in full_dispatch_request
    rv = self.dispatch_request()
  File "/env/lib/python3.7/site-packages/flask/app.py", line 1935, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/srv/main.py", line 616, in createDailyCheckins
    response = client_verifier.create_task(parent, task)
TypeError: create_task() takes from 1 to 2 positional arguments but 3 were given
My GAE requirements file for the app is the following:
Flask==1.1.1
google-api-python-client
requests
google-cloud-tasks
google-cloud-firestore
pytz
pyjwt
cryptography
hyper
The app has been creating tasks for months without error with the exact same code.
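The TypeError above matches the surface change in google-cloud-tasks 2.x: create_task no longer accepts (parent, task) as two positional arguments, and with unpinned requirements the app picked up the new major version. The client now takes a single request (dict or protobuf) or keyword arguments. A stand-in sketch of the calling convention, not the real client:

```python
# Illustrative stand-in mirroring the 2.x signature shape; the real method
# is CloudTasksClient.create_task.
def create_task(request=None, *, parent=None, task=None):
    if request is not None:
        parent, task = request["parent"], request["task"]
    return {"parent": parent, "task": task}

# Both spellings work in 2.x; create_task(parent, task) raises TypeError.
resp = create_task(request={"parent": "projects/p/locations/l/queues/q", "task": {}})
resp2 = create_task(parent="projects/p/locations/l/queues/q", task={})
```

Pinning google-cloud-tasks to a major version in requirements would have prevented the silent upgrade.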
I'm having an issue creating scheduled Cloud Tasks using Python 2.7: they fail with a TypeError.
It's probably worth noting that as long as I don't use in_seconds, everything works as expected.
Am I missing something ?
Thanks
Environment details
OS type and version: Linux / Ubuntu 20.10
Python version: 2.7
pip version: 20.3.3
google-cloud-tasks version: 1.5.0
Steps to reproduce
Follow the code sample to create a task using an app_engine_http_request
I also tried different combinations for 'schedule_time':
task['schedule_time'] = date.isoformat('T')
task['schedule_time'] = '2021-01-07T10:20:50.52Z'  # a future date, less than 30 days from now
# Using future.backports.datetime
task['schedule_time'] = date.isoformat('T')
Nothing worked.
Stack trace
ERROR 2020-12-31 11:28:56,442 handlers.py:120] Traceback (most recent call last):
File "/home/nathan/myproject/lib/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/home/nathan/myproject/lib/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/home/nathan/myproject/app/main/admin/migrations.py", line 34, in reindex_enterprise
migrate_by_pieces(400, _in_seconds=5 * 60)
File "/home/nathan/myproject/app/deferredv3.py", line 70, in wrapper_defer
path=u'/function-handler/{}'.format(f.__name__))
File "/home/nathan/myproject/app/deferredv3.py", line 113, in send_deferred_task
task = client.create_task(parent=parent, task=task)
File "/home/nathan/myproject/lib/google/cloud/tasks_v2/gapic/cloud_tasks_client.py", line 1492, in create_task
parent=parent, task=task, response_view=response_view
TypeError: Parameter to MergeFrom() must be instance of same class: expected google.protobuf.Timestamp got datetime.datetime.
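The MergeFrom error above indicates that in the 1.x client, schedule_time must be a google.protobuf Timestamp rather than a datetime or an ISO string. With protobuf installed the usual fix is Timestamp().FromDatetime(d); the equivalent seconds/nanos arithmetic, shown here with only the standard library, is:

```python
import calendar
import datetime

# With protobuf available, the idiomatic fix is:
#   from google.protobuf import timestamp_pb2
#   ts = timestamp_pb2.Timestamp()
#   ts.FromDatetime(d)
#   task['schedule_time'] = ts
# A Timestamp carries seconds + nanos since the Unix epoch:
d = datetime.datetime(2021, 1, 7, 10, 20, 50)  # hypothetical schedule time
seconds = calendar.timegm(d.utctimetuple())
nanos = d.microsecond * 1000
```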
Attempt to update the queue using the update_queue method with a dictionary to specify the queue name.
Observe exception: google.api_core.exceptions.InvalidArgument: 400 "x-goog-request-params" header is either missing or misformatted. "x-goog-request-params" must contain "queue.name=projects/[projectname]/locations/[location]/queues/[name]"
Replace the queue object in update_queue with an instance of google.cloud.tasks_v2beta3.types.Queue and observe success.
Code example
import google
from google.cloud import tasks_v2beta3

TASKS_CLIENT = tasks_v2beta3.CloudTasksClient()
TASKS_QUEUE_NAME = 'queuenamehere'
TASKS_CLIENT.update_queue({'name': TASKS_QUEUE_NAME})
vs.
queue=google.cloud.tasks_v2beta3.types.Queue(name=TASKS_QUEUE_NAME)
TASKS_CLIENT.update_queue(queue)
Note: after resolving this I used update_queue to change the retry policy, but the snippet above is the simplest demonstration of the failure: update_queue doesn't actually work when a dictionary is provided for the queue, even though the type annotation says Union[dict, Queue] is supported.
Stack trace
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
return callable_(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/grpc/_channel.py", line 565, in __call__
return _end_unary_response_blocking(state, call, False, None)
File "/usr/local/lib/python3.7/site-packages/grpc/_channel.py", line 467, in _end_unary_response_blocking
raise _Rendezvous(state, None, None, deadline)
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
status = StatusCode.INVALID_ARGUMENT
details = ""x-goog-request-params" header is either missing or misformatted. "x-goog-request-params" must contain "queue.name=projects/gecko-ops-sandbox/locations/us-east4/queues/loader-mapper""
debug_error_string = "{"created":"@1567636043.780459442","description":"Error received from peer ipv4:74.125.141.95:443","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":""x-goog-request-params" header is either missing or misformatted. "x-goog-request-params" must contain "queue.name=projects/gecko-ops-sandbox/locations/us-east4/queues/loader-mapper"","grpc_status":3}"
>
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.7/site-packages/google/cloud/tasks_v2beta3/gapic/cloud_tasks_client.py", line 640, in update_queue
request, retry=retry, timeout=timeout, metadata=metadata
File "/usr/local/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py", line 143, in __call__
return wrapped_func(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/google/api_core/retry.py", line 273, in retry_wrapped_func
on_error=on_error,
File "/usr/local/lib/python3.7/site-packages/google/api_core/retry.py", line 182, in retry_target
return target()
File "/usr/local/lib/python3.7/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
return func(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
File "<string>", line 3, in raise_from
google.api_core.exceptions.InvalidArgument: 400 "x-goog-request-params" header is either missing or misformatted. "x-goog-request-params" must contain "queue.name=[redacted]"
Fortunately, constructing the Queue object resolves this. But if a dict is not supported, the annotation should be updated, and ideally there should be a clear error letting the API user know that a dict isn't accepted. Right now the AttributeError raised when accessing .name is silently swallowed; it should probably be surfaced instead.
Thanks for stopping by to let us know something could be better!
PLEASE READ: If you have a support contract with Google, please create an issue in the support console instead of filing on GitHub. This will ensure a timely response.
Please run down the following list and make sure you've tried the usual "quick fixes":
Execute it with valid parameters: python create_app_engine_queue_task.py --project my-project --location us-central1 --in_seconds 30 --queue myqueue
See error:
Traceback (most recent call last):
File "/path/to/create_app_engine_queue_task.py", line 110, in <module>
create_task(
File "/path/to/create_app_engine_queue_task.py", line 68, in create_task
response = client.create_task(parent=parent, task=task)
File "/path/to/.venv/lib/python3.9/site-packages/google/cloud/tasks_v2/services/cloud_tasks/client.py", line 1700, in create_task
request.task = task
File "/path/to/.venv/lib/python3.9/site-packages/proto/message.py", line 632, in __setattr__
pb_value = marshal.to_proto(pb_type, value)
File "/path/to/.venv/lib/python3.9/site-packages/proto/marshal/marshal.py", line 208, in to_proto
pb_value = rule.to_proto(value)
File "/path/to/.venv/lib/python3.9/site-packages/proto/marshal/rules/message.py", line 32, in to_proto
return self._descriptor(**value)
TypeError: Parameter to MergeFrom() must be instance of same class: expected google.protobuf.Timestamp got datetime.datetime.
Making sure to follow these steps will guarantee the quickest resolution possible.
st/bin/external/com_google_protobuf/protoc --experimental_allow_proto3_optional --plugin=protoc-gen-python_gapic=bazel-out/host/bin/external/gapic_generator_python/gapic_plugin --python_gapic_out=retry-config=google/cloud/tasks/v2beta2/cloudtasks_grpc_service_config.json:bazel-out/k8-fastbuild/bin/google/cloud/tasks/v2beta2/tasks_py_gapic.srcjar.zip -Igoogle/cloud/tasks/v2beta2/cloudtasks.proto=google/cloud/tasks/v2beta2/cloudtasks.proto -Igoogle/cloud/tasks/v2beta2/queue.proto=google/cloud/tasks/v2beta2/queue.proto -Igoogle/cloud/tasks/v2beta2/target.proto=google/cloud/tasks/v2beta2/target.proto -Igoogle/cloud/tasks/v2beta2/task.proto=google/cloud/tasks/v2beta2/task.proto -Igoogle/api/annotations.proto=google/api/annotations.proto -Igoogle/api/http.proto=google/api/http.proto -Igoogle/protobuf/descriptor.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/descriptor_proto/google/protobuf/descriptor.proto -Igoogle/api/client.proto=google/api/client.proto -Igoogle/api/field_behavior.proto=google/api/field_behavior.proto -Igoogle/api/resource.proto=google/api/resource.proto -Igoogle/iam/v1/iam_policy.proto=google/iam/v1/iam_policy.proto -Igoogle/iam/v1/options.proto=google/iam/v1/options.proto -Igoogle/iam/v1/policy.proto=google/iam/v1/policy.proto -Igoogle/type/expr.proto=google/type/expr.proto -Igoogle/rpc/status.proto=google/rpc/status.proto -Igoogle/protobuf/any.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/any_proto/google/protobuf/any.proto -Igoogle/protobuf/duration.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/duration_proto/google/protobuf/duration.proto -Igoogle/protobuf/empty.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/empty_proto/google/protobuf/empty.proto -Igoogle/protobuf/field_mask.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/field_mask_proto/google/protobuf/field_mask.proto 
-Igoogle/protobuf/timestamp.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/timestamp_proto/google/protobuf/timestamp.proto google/cloud/tasks/v2beta2/cloudtasks.proto google/cloud/tasks/v2beta2/queue.proto google/cloud/tasks/v2beta2/target.proto google/cloud/tasks/v2beta2/task.proto` failed (Exit 1) protoc failed: error executing command bazel-out/host/bin/external/com_google_protobuf/protoc --experimental_allow_proto3_optional '--plugin=protoc-gen-python_gapic=bazel-out/host/bin/external/gapic_generator_python/gapic_plugin' ... (remaining 25 argument(s) skipped)
Use --sandbox_debug to see verbose messages from the sandbox
google/cloud/tasks/v2beta2/target.proto:20:1: warning: Import google/api/annotations.proto is unused.
google/cloud/tasks/v2beta2/queue.proto:24:1: warning: Import google/api/annotations.proto is unused.
google/cloud/tasks/v2beta2/task.proto:24:1: warning: Import google/api/annotations.proto is unused.
Traceback (most recent call last):
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate_with_pandoc.py", line 3, in <module>
from gapic.cli import generate
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate.py", line 23, in <module>
from gapic import generator
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/__init__.py", line 21, in <module>
from .generator import Generator
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/generator.py", line 24, in <module>
from gapic.samplegen import manifest, samplegen
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/__init__.py", line 15, in <module>
from gapic.samplegen import samplegen
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/samplegen.py", line 27, in <module>
from gapic.schema import wrappers
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/__init__.py", line 23, in <module>
from gapic.schema.api import API
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/api.py", line 29, in <module>
from google.api_core import exceptions # type: ignore
ModuleNotFoundError: No module named 'google.api_core'
--python_gapic_out: protoc-gen-python_gapic: Plugin failed with status code 1.
Target //google/cloud/tasks/v2beta2:tasks-v2beta2-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 1.006s, Critical Path: 0.80s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/kbuilder/.cache/synthtool/python-tasks/synth.py", line 36, in <module>
include_protos=True,
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 193, in _generate_code
shell.run(bazel_run_args)
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/cloud/tasks/v2beta2:tasks-v2beta2-py']' returned non-zero exit status 1.
2021-01-21 05:47:32,220 autosynth [ERROR] > Synthesis failed
2021-01-21 05:47:32,220 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 6da7304 chore(deps): update dependency google-cloud-tasks to v2.1.0 (#63)
2021-01-21 05:47:32,225 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-01-21 05:47:32,230 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
synthesizer.synthesize(synth_log_path, self.environ)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
Google internal developers can see the full log here.
OS type and version: Ubuntu 16.04.7 LTS (Xenial Xerus)
Python version: Python 3.6.10
pip version: pip 20.2.2
google-cloud-tasks version: 2.5.1
Additional details:
Steps to reproduce
The app is running on Google App Engine flexible on a Gunicorn server (gunicorn==20.0.4) with a gevent worker (gevent==20.6.2), grpcio==1.40.0. I recently bumped gevent and grpcio from earlier versions and the issue still happens.
About 1 in every 400 requests takes far too long (several hundred seconds) to complete, despite setting a timeout.
Screenshot from New Relic APM showing a request which took 380s mostly from create_task:
I also have a timer instrumented in the python function that calls create_task and confirms it takes 380s.
# GRPC must be monkeypatched to support gevent
# https://github.com/grpc/grpc/issues/4629#issuecomment-376962677
import grpc.experimental.gevent as grpc_gevent
grpc_gevent.init_gevent()
st/bin/external/com_google_protobuf/protoc --experimental_allow_proto3_optional --plugin=protoc-gen-python_gapic=bazel-out/host/bin/external/gapic_generator_python/gapic_plugin --python_gapic_out=retry-config=google/cloud/tasks/v2beta2/cloudtasks_grpc_service_config.json:bazel-out/k8-fastbuild/bin/google/cloud/tasks/v2beta2/tasks_py_gapic.srcjar.zip -Igoogle/cloud/tasks/v2beta2/cloudtasks.proto=google/cloud/tasks/v2beta2/cloudtasks.proto -Igoogle/cloud/tasks/v2beta2/queue.proto=google/cloud/tasks/v2beta2/queue.proto -Igoogle/cloud/tasks/v2beta2/target.proto=google/cloud/tasks/v2beta2/target.proto -Igoogle/cloud/tasks/v2beta2/task.proto=google/cloud/tasks/v2beta2/task.proto -Igoogle/api/annotations.proto=google/api/annotations.proto -Igoogle/api/http.proto=google/api/http.proto -Igoogle/protobuf/descriptor.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/descriptor_proto/google/protobuf/descriptor.proto -Igoogle/api/client.proto=google/api/client.proto -Igoogle/api/field_behavior.proto=google/api/field_behavior.proto -Igoogle/api/resource.proto=google/api/resource.proto -Igoogle/iam/v1/iam_policy.proto=google/iam/v1/iam_policy.proto -Igoogle/iam/v1/options.proto=google/iam/v1/options.proto -Igoogle/iam/v1/policy.proto=google/iam/v1/policy.proto -Igoogle/type/expr.proto=google/type/expr.proto -Igoogle/rpc/status.proto=google/rpc/status.proto -Igoogle/protobuf/any.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/any_proto/google/protobuf/any.proto -Igoogle/protobuf/duration.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/duration_proto/google/protobuf/duration.proto -Igoogle/protobuf/empty.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/empty_proto/google/protobuf/empty.proto -Igoogle/protobuf/field_mask.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/field_mask_proto/google/protobuf/field_mask.proto 
-Igoogle/protobuf/timestamp.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/timestamp_proto/google/protobuf/timestamp.proto google/cloud/tasks/v2beta2/cloudtasks.proto google/cloud/tasks/v2beta2/queue.proto google/cloud/tasks/v2beta2/target.proto google/cloud/tasks/v2beta2/task.proto` failed (Exit 1) protoc failed: error executing command bazel-out/host/bin/external/com_google_protobuf/protoc --experimental_allow_proto3_optional '--plugin=protoc-gen-python_gapic=bazel-out/host/bin/external/gapic_generator_python/gapic_plugin' ... (remaining 25 argument(s) skipped)
Use --sandbox_debug to see verbose messages from the sandbox
google/cloud/tasks/v2beta2/target.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/tasks/v2beta2/queue.proto:24:1: warning: Import google/api/annotations.proto is unused.
google/cloud/tasks/v2beta2/task.proto:23:1: warning: Import google/api/annotations.proto is unused.
Traceback (most recent call last):
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate_with_pandoc.py", line 3, in <module>
from gapic.cli import generate
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate.py", line 23, in <module>
from gapic import generator
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/__init__.py", line 21, in <module>
from .generator import Generator
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/generator.py", line 24, in <module>
from gapic.samplegen import manifest, samplegen
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/__init__.py", line 15, in <module>
from gapic.samplegen import samplegen
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/samplegen.py", line 27, in <module>
from gapic.schema import wrappers
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/__init__.py", line 23, in <module>
from gapic.schema.api import API
File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/api.py", line 29, in <module>
from google.api_core import exceptions # type: ignore
ModuleNotFoundError: No module named 'google.api_core'
--python_gapic_out: protoc-gen-python_gapic: Plugin failed with status code 1.
Target //google/cloud/tasks/v2beta2:tasks-v2beta2-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 1.040s, Critical Path: 0.83s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/kbuilder/.cache/synthtool/python-tasks/synth.py", line 36, in <module>
include_protos=True,
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 197, in _generate_code
shell.run(bazel_run_args)
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/cloud/tasks/v2beta2:tasks-v2beta2-py']' returned non-zero exit status 1.
2021-01-28 05:47:55,036 autosynth [ERROR] > Synthesis failed
2021-01-28 05:47:55,036 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 6da7304 chore(deps): update dependency google-cloud-tasks to v2.1.0 (#63)
2021-01-28 05:47:55,041 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-01-28 05:47:55,047 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
synthesizer.synthesize(synth_log_path, self.environ)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
Google internal developers can see the full log here.
state = <grpc._channel._RPCState object at 0x7ff6c6a01550>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7ff6c697d0c0>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626174372.475567666","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >
../../google/cloud/tasks_v2/services/cloud_tasks/client.py:814: in delete_queue
rpc(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:285: in retry_wrapped_func
return retry_target(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:188: in retry_target
return target()
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
We're trying to migrate from App Engine Standard to App Engine Flexible, and we're also changing frameworks from webapp2 to Pyramid, as webapp2 doesn't support Python 3 (and is quite old).
We have an endpoint that receives roughly 1 request per second, and each request kicks off a task. It seems the process of creating a task isn't releasing memory. I'll try to give as much info as I can.
Environment details
OS type and version: App Engine Flexible custom runtime using gcr.io/google-appengine/python
Python version: 3.7
google-cloud-tasks version: unsure how to check on App Engine, but probably 1.5
Steps to reproduce
Create a task
Code example
A Task class to facilitate creating tasks in various handlers. The bulk of this is from the Cloud Tasks documentation:
import json
import logging

from google.cloud import tasks_v2


class TasksQueue(object):
    def __init__(self):
        self.client = tasks_v2.CloudTasksClient()

    def add(self, url, payload=None, queue_name='default'):
        if type(payload) == dict:
            payload = json.dumps(payload)
        # Construct the fully qualified queue name.
        parent = self.client.queue_path('my-project', 'us-central1', queue_name)
        # Construct the request body.
        task = {
            'app_engine_http_request': {  # Specify the type of request.
                'http_method': 'POST',
                'relative_uri': url
            }
        }
        if payload is not None:
            # The API expects a payload of type bytes.
            converted_payload = payload.encode()
            # Add the payload to the request.
            task['app_engine_http_request']['body'] = converted_payload
            logging.info(task)
        # Use the client to build and send the task.
        response = self.client.create_task(parent, task)
        logging.info('Created task {}'.format(response.name))
        return response
The full handler itself has a connection to a Redis Memorystore instance, but I ran a bunch of requests through another handler that solely gets/sets Redis entries and saw no increase in memory usage on the instances, so that leads me to believe there's an issue with Cloud Tasks.
Further, I used tracemalloc to check memory usage before and after each request. These are unfortunately backwards (the bottom row is the biggest difference) because of the way logs are displayed in Logging under Flexible when newlines are present. As you can see, though, pyasn1/type/base.py and pyasn1/type/integer.py have objects that balloon in size after the request and never seem to go down.
Before:
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/grpc/_channel.py:770: size=960 B (+560 B), count=12 (+7), average=80 B
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/google/api_core/retry.py:286: size=576 B (-576 B), count=1 (-1), average=576 B
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/pyramid/router.py:139: size=576 B (+576 B), count=1 (+1), average=576 B
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/grpc/_channel.py:825: size=584 B (-584 B), count=1 (-1), average=584 B
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/urllib3/util/retry.py:196: size=284 KiB (+588 B), count=1980 (+4), average=147 B
2020-04-14T21:43:44Z /opt/python3.7/lib/python3.7/threading.py:235: size=1048 B (+592 B), count=5 (+2), average=210 B
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/google/protobuf/json_format.py:481: size=1200 B (-600 B), count=2 (-1), average=600 B
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/grpc/_channel.py:1353: size=1968 B (+712 B), count=17 (+5), average=116 B
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/grpc/_channel.py:99: size=744 B (+744 B), count=2 (+2), average=372 B
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/gunicorn/http/message.py:110: size=817 B (-766 B), count=12 (-11), average=68 B
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/gunicorn/http/wsgi.py:136: size=827 B (-827 B), count=12 (-12), average=69 B
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/pyasn1/compat/integer.py:99: size=461 KiB (+956 B), count=2470 (+5), average=191 B
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/grpc/_channel.py:1149: size=1728 B (+1008 B), count=12 (+7), average=144 B
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/pyasn1/type/base.py:373: size=418 KiB (+1008 B), count=2558 (+6), average=167 B
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/gunicorn/http/wsgi.py:139: size=1112 B (-1112 B), count=1 (-1), average=1112 B
2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/pyasn1/codec/ber/decoder.py:609: size=12.3 KiB (-1472 B), count=37 (-2), average=341 B
2020-04-14T21:43:44Z /home/vmagent/app/externalping/AppengineLogHandler.py:106: size=58.8 KiB (+1504 B), count=742 (+10), average=81 B
2020-04-14T21:43:44Z /opt/python3.7/lib/python3.7/tracemalloc.py:185: size=29.2 KiB (-60.6 KiB), count=466 (-970), average=64 B
2020-04-14T21:43:44Z /opt/python3.7/lib/python3.7/tracemalloc.py:113: size=1152 B (-97.0 KiB), count=12 (-1035), average=96 B
2020-04-14T21:43:44Z [ Top 20 differences ]
After roughly 1 hour:
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/pyasn1/type/base.py:54: size=1870 KiB (+480 B), count=21036 (+6), average=91 B
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/grpc/_common.py:70: size=936 B (+513 B), count=12 (+7), average=78 B
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/grpc/_channel.py:770: size=960 B (+560 B), count=12 (+7), average=80 B
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/pyramid/router.py:139: size=576 B (+576 B), count=1 (+1), average=576 B
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/urllib3/util/retry.py:196: size=1839 KiB (+588 B), count=12808 (+4), average=147 B
2020-04-14T22:39:23Z /opt/python3.7/lib/python3.7/threading.py:235: size=1048 B (+592 B), count=5 (+2), average=210 B
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/google/protobuf/json_format.py:481: size=1200 B (-600 B), count=2 (-1), average=600 B
2020-04-14T22:39:23Z /home/vmagent/app/externalping/AppengineLogHandler.py:106: size=451 KiB (+672 B), count=5760 (+4), average=80 B
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/grpc/_channel.py:1353: size=1968 B (+712 B), count=17 (+5), average=116 B
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/grpc/_channel.py:99: size=744 B (+744 B), count=2 (+2), average=372 B
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/gunicorn/http/message.py:110: size=818 B (-766 B), count=12 (-11), average=68 B
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/gunicorn/http/wsgi.py:136: size=827 B (-827 B), count=12 (-12), average=69 B
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/pyasn1/compat/integer.py:99: size=2988 KiB (+956 B), count=16005 (+5), average=191 B
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/grpc/_channel.py:1149: size=1728 B (+1008 B), count=12 (+7), average=144 B
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/gunicorn/http/wsgi.py:139: size=1112 B (-1112 B), count=1 (-1), average=1112 B
2020-04-14T22:39:23Z /env/lib/python3.7/functools.py:60: size=675 KiB (+1168 B), count=6627 (+7), average=104 B
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/pyasn1/type/base.py:373: size=3045 KiB (+1176 B), count=18740 (+7), average=166 B
2020-04-14T22:39:23Z /opt/python3.7/lib/python3.7/tracemalloc.py:185: size=67.7 KiB (-59.5 KiB), count=1083 (-952), average=64 B
2020-04-14T22:39:23Z /opt/python3.7/lib/python3.7/tracemalloc.py:113: size=1152 B (-98.7 KiB), count=12 (-1053), average=96 B
2020-04-14T22:39:23Z [ Top 20 differences ]
Here is a screenshot of the memory usage on the instance
At 11am I changed it to an instance with more memory. This is where it's sitting right now.
I've tried gc.collect() after the handler is done, del taskqueue after the task is created, and taskqueue.client.transport.channel.close(), but nothing keeps memory usage in check. I'm not sure what else I can do here or what other logs I can provide to help nail this down. Any help would be greatly appreciated.
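For what it's worth, one mitigation worth trying (a hypothetical sketch, not a confirmed fix for this leak) is to create the client once per process and reuse it, rather than constructing a new TasksQueue, and therefore a new CloudTasksClient and gRPC channel, on every request. The factory parameter below is an assumption made so the pattern can be shown without credentials; in the handler above it would be tasks_v2.CloudTasksClient.

```python
# Hypothetical sketch: keep one client for the process lifetime instead of
# building a new one (and a new gRPC channel) per request.
_client = None


def get_client(factory):
    """Return a process-wide client, creating it on first use.

    `factory` is any zero-argument callable that builds the client;
    in the issue's code it would be `tasks_v2.CloudTasksClient`.
    """
    global _client
    if _client is None:
        _client = factory()
    return _client
```

Each CloudTasksClient owns channel state, so per-request instantiation can accumulate memory faster than the garbage collector reclaims it; reusing one client removes that churn.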
After upgrading to 2.0.0, I get the following whenever I try to create tasks:
tasks_client = await sync_to_async(tasks_v2.CloudTasksClient)()
parent = tasks_client.queue_path(project, loc, name)
# Construct the request body.
task = {
    'http_request': {  # Specify the type of request.
        'http_method': 'POST',
        'url': url
    }
}
# The API expects a payload of type bytes.
converted_payload = payload.encode()
# Add the payload to the request.
task['http_request']['body'] = converted_payload
# Create the task.
response = await sync_to_async(tasks_client.create_task)(parent, task)
Reproducible in Cloud Run running Python 3.8.2, as well as locally under Windows.
pip 20.1.1 locally.
create_task() takes from 1 to 2 positional arguments but 3 were given
The exact same code works in 1.5.0.
I'm guessing something changed in 2.0.0, but at the time of writing the release notes are not published, and I didn't have time to debug 2.0.0 to figure it out on my own.
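For anyone hitting the same TypeError: the 2.0.0 client was regenerated with the microgenerator, and methods now take either a single request object/dict or keyword arguments rather than positional ones. A sketch of the updated calling convention (queue path, URL, and the build_create_task_request helper are placeholders for illustration):

```python
def build_create_task_request(parent, task):
    """Assemble the request mapping that 2.x create_task expects."""
    return {"parent": parent, "task": task}


parent = "projects/my-project/locations/us-central1/queues/default"
task = {
    "http_request": {
        "http_method": "POST",
        "url": "https://example.com/handler",
    }
}
request = build_create_task_request(parent, task)

# 2.x calling conventions (either form works):
#   client.create_task(request=request)
#   client.create_task(parent=parent, task=task)
# 1.x positional form, which now raises the TypeError above:
#   client.create_task(parent, task)
```

The same pattern applies to the other client methods; the 2.0.0 upgrade guide in the repository documents the full migration.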
Here's the documentation example for creating a task:
Is it just me, or is it missing the important parts?
Specifically, I would like to know, first, how to provide the URL of the handler I want the task to call, and second, how to send a payload.
I'm going to research the docs now. Thanks!
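In case it helps others landing here, the two pieces the example omits are the url field of http_request and a bytes body. A hedged sketch (the endpoint URL, payload, and build_http_task helper are placeholders, not from the official docs):

```python
import json


def build_http_task(url, payload=None):
    """Build a Cloud Tasks HTTP task dict targeting `url`, optionally with a JSON body."""
    task = {
        "http_request": {
            "http_method": "POST",
            "url": url,  # the handler the task should call
        }
    }
    if payload is not None:
        # The API expects the body as bytes.
        task["http_request"]["body"] = json.dumps(payload).encode()
        task["http_request"]["headers"] = {"Content-Type": "application/json"}
    return task


task = build_http_task("https://example.com/run", {"user_id": 42})
# Sending it requires a client and credentials, e.g.:
#   client.create_task(request={"parent": parent, "task": task})
```

The handler then reads the payload back from the request body of the POST it receives.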
state = <grpc._channel._RPCState object at 0x7ff6c8b04a00>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7ff6c8b3c340>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626174371.410723544","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >
list_queues.py:32: in list_queues
for queue in response:
../../google/cloud/tasks_v2/services/cloud_tasks/pagers.py:87: in iter
for page in self.pages:
../../google/cloud/tasks_v2/services/cloud_tasks/pagers.py:83: in pages
self._response = self._method(self._request, metadata=self._metadata)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:285: in retry_wrapped_func
return retry_target(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:188: in retry_target
return target()
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/timeout.py:102: in func_with_timeout
return func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
Cloning into 'working_repo'...
Switched to branch 'autosynth'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/synth.py.
On branch autosynth
nothing to commit, working tree clean
HEAD detached at FETCH_HEAD
nothing to commit, working tree clean
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:6aec9c34db0e4be221cdaf6faba27bdc07cfea846808b3d3b964dfce3a9a0f9b
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/cloud/tasks/artman_cloudtasks_v2beta2.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta2.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta2/cloudtasks.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta2/google/cloud/tasks_v2beta2/proto/cloudtasks.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta2/queue.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta2/google/cloud/tasks_v2beta2/proto/queue.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta2/target.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta2/google/cloud/tasks_v2beta2/proto/target.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta2/task.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta2/google/cloud/tasks_v2beta2/proto/task.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta2/google/cloud/tasks_v2beta2/proto.
synthtool > Replaced '(Google IAM .*?_) ' in google/cloud/tasks_v2beta2/gapic/cloud_tasks_client.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta2/proto/queue_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta2/proto/task_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta2/proto/target_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta2/proto/cloudtasks_pb2.py.
synthtool > Running generator for google/cloud/tasks/artman_cloudtasks_v2beta3.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta3.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta3/cloudtasks.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta3/google/cloud/tasks_v2beta3/proto/cloudtasks.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta3/queue.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta3/google/cloud/tasks_v2beta3/proto/queue.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta3/target.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta3/google/cloud/tasks_v2beta3/proto/target.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta3/task.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta3/google/cloud/tasks_v2beta3/proto/task.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta3/google/cloud/tasks_v2beta3/proto.
synthtool > Replaced '(Google IAM .*?_) ' in google/cloud/tasks_v2beta3/gapic/cloud_tasks_client.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta3/proto/queue_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta3/proto/task_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta3/proto/target_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta3/proto/cloudtasks_pb2.py.
synthtool > Running generator for google/cloud/tasks/artman_cloudtasks_v2.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2/cloudtasks.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2/google/cloud/tasks_v2/proto/cloudtasks.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2/queue.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2/google/cloud/tasks_v2/proto/queue.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2/target.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2/google/cloud/tasks_v2/proto/target.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2/task.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2/google/cloud/tasks_v2/proto/task.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2/google/cloud/tasks_v2/proto.
synthtool > Replaced '(Google IAM .*?_) ' in google/cloud/tasks_v2/gapic/cloud_tasks_client.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2/proto/queue_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2/proto/task_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2/proto/target_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2/proto/cloudtasks_pb2.py.
synthtool > Replaced '(in queue.yaml/xml) <\n\\s+' in google/cloud/tasks_v2beta2/proto/queue_pb2.py.
synthtool > Replaced '#retry_parameters>\n `__\\.' in google/cloud/tasks_v2/proto/queue_pb2.py.
synthtool > Replaced '>>> # TODO: Initialize `queue`:' in google/cloud/tasks_v2beta3/gapic/cloud_tasks_client.py.
synthtool > Replaced '^(\\s+)>>> queue = {}\n' in google/cloud/tasks_v2beta3/gapic/cloud_tasks_client.py.
synthtool > Replaced 'types\\.View' in google/cloud/tasks_v2beta2/gapic/cloud_tasks_client.py.
synthtool > Replaced 'types\\.View' in google/cloud/tasks_v2/gapic/cloud_tasks_client.py.
synthtool > Replaced 'types\\.View' in google/cloud/tasks_v2beta3/gapic/cloud_tasks_client.py.
synthtool > Replaced ' retry \\(Optional\\[google\\.api_core\\.retry\\.Retry\\]\\): A retry object used\n to retry requests\\. If ``None`` is specified, requests will\n be retried using a default configuration\\.\n timeout \\(Optional\\[float\\]\\): The amount of time, in seconds, to wait\n for the request to complete\\. Note that if ``retry`` is\n specified, the timeout applies to each individual attempt\\.\n metadata \\(Optional\\[Sequence\\[Tuple\\[str, str\\]\\]\\]\\): Additional metadata\n that is provided to the method\\.\n\n' in google/cloud/tasks_v2beta2/gapic/cloud_tasks_client.py.
synthtool > Replaced ' retry \\(Optional\\[google\\.api_core\\.retry\\.Retry\\]\\): A retry object used\n to retry requests\\. If ``None`` is specified, requests will\n be retried using a default configuration\\.\n timeout \\(Optional\\[float\\]\\): The amount of time, in seconds, to wait\n for the request to complete\\. Note that if ``retry`` is\n specified, the timeout applies to each individual attempt\\.\n metadata \\(Optional\\[Sequence\\[Tuple\\[str, str\\]\\]\\]\\): Additional metadata\n that is provided to the method\\.\n\n' in google/cloud/tasks_v2/gapic/cloud_tasks_client.py.
synthtool > Replaced ' retry \\(Optional\\[google\\.api_core\\.retry\\.Retry\\]\\): A retry object used\n to retry requests\\. If ``None`` is specified, requests will\n be retried using a default configuration\\.\n timeout \\(Optional\\[float\\]\\): The amount of time, in seconds, to wait\n for the request to complete\\. Note that if ``retry`` is\n specified, the timeout applies to each individual attempt\\.\n metadata \\(Optional\\[Sequence\\[Tuple\\[str, str\\]\\]\\]\\): Additional metadata\n that is provided to the method\\.\n\n' in google/cloud/tasks_v2beta3/gapic/cloud_tasks_client.py.
.coveragerc
.flake8
.github/CONTRIBUTING.md
.github/ISSUE_TEMPLATE/bug_report.md
.github/ISSUE_TEMPLATE/feature_request.md
.github/ISSUE_TEMPLATE/support_request.md
.github/PULL_REQUEST_TEMPLATE.md
.github/release-please.yml
.gitignore
.kokoro/build.sh
.kokoro/continuous/common.cfg
.kokoro/continuous/continuous.cfg
.kokoro/docs/common.cfg
.kokoro/docs/docs.cfg
.kokoro/presubmit/common.cfg
.kokoro/presubmit/presubmit.cfg
.kokoro/publish-docs.sh
.kokoro/release.sh
.kokoro/release/common.cfg
.kokoro/release/release.cfg
.kokoro/trampoline.sh
CODE_OF_CONDUCT.md
CONTRIBUTING.rst
LICENSE
MANIFEST.in
docs/_static/custom.css
docs/_templates/layout.html
docs/conf.py.j2
noxfile.py.j2
renovate.json
setup.cfg
Running session blacken
Creating virtual environment (virtualenv) using python3.6 in .nox/blacken
pip install black==19.3b0
Error: pip is not installed into the virtualenv, it is located at /tmpfs/src/git/autosynth/env/bin/pip. Pass external=True into run() to explicitly allow this.
Session blacken failed.
synthtool > Failed executing nox -s blacken:
None
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/synth.py", line 121, in <module>
s.shell.run(["nox", "-s", "blacken"], hide_output=False)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['nox', '-s', 'blacken']' returned non-zero exit status 1.
Synthesis failed
Google internal developers can see the full log here.
Traceback (most recent call last):
  File "/home/vmagent/app/cache/views/tasks/cron_task_viewset.py", line 647, in cache
    object_processing_uri=PATH_QUEUE_TASKS + "cache/",
  File "/home/vmagent/app/utils/tasks/__init__.py", line 148, in chunk
    task_generator.queue_task()
  File "/home/vmagent/app/utils/tasks/__init__.py", line 90, in queue_task
    response = self.client.create_task(self.parent, task)
  File "/env/lib/python3.6/site-packages/google/cloud/tasks_v2/gapic/cloud_tasks_client.py", line 1508, in create_task
    request, retry=retry, timeout=timeout, metadata=metadata
  File "/env/lib/python3.6/site-packages/google/api_core/gapic_v1/method.py", line 143, in __call__
    return wrapped_func(*args, **kwargs)
  File "/env/lib/python3.6/site-packages/google/api_core/retry.py", line 286, in retry_wrapped_func
    on_error=on_error,
  File "/env/lib/python3.6/site-packages/google/api_core/retry.py", line 184, in retry_target
    return target()
  File "/env/lib/python3.6/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
    return func(*args, **kwargs)
  File "/env/lib/python3.6/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.ServiceUnavailable: 503 Deadline Exceeded
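A 503 here is usually transient. Besides the retry/timeout arguments the generated client already accepts, one option is a caller-side backoff wrapper around create_task. This is a generic sketch (call_with_backoff is a hypothetical helper, not part of the library):

```python
import random
import time


def call_with_backoff(fn, *, attempts=5, base_delay=0.5, retriable=(Exception,)):
    """Call fn(), retrying with jittered exponential backoff on retriable errors."""
    for attempt in range(attempts):
        try:
            return fn()
        except retriable:
            if attempt == attempts - 1:
                raise  # out of attempts; surface the last error
            # Sleep base_delay * 2^attempt, plus up to 100 ms of jitter.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))


# Hypothetical usage in the queue_task method from the traceback:
#   response = call_with_backoff(
#       lambda: self.client.create_task(self.parent, task),
#       retriable=(exceptions.ServiceUnavailable,),
#   )
```

Passing an explicit retriable tuple keeps non-transient errors (bad arguments, permission failures) from being retried pointlessly.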
There appear to be breaking changes in gunicorn 20.x.x, and google-cloud-tasks needs to be updated accordingly.
zel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:209:1
Analyzing: target //google/cloud/tasks/v2beta2:tasks-v2beta2-py (1 packages loaded, 0 targets configured)
INFO: Call stack for the definition of repository 'go_sdk' which is a _go_download_sdk (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/io_bazel_rules_go/go/private/sdk.bzl:79:20):
- <builtin>
- /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/io_bazel_rules_go/go/private/sdk.bzl:92:5
- /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/io_bazel_rules_go/go/private/sdk.bzl:260:13
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:81:1
ERROR: While resolving toolchains for target @pypi_black//:black: invalid registered toolchain '@gapic_generator_python//:pyenv3_toolchain': no such package '@gapic_generator_python//': The repository '@gapic_generator_python' could not be resolved
ERROR: Analysis of target '//google/cloud/tasks/v2beta2:tasks-v2beta2-py' failed; build aborted: invalid registered toolchain '@gapic_generator_python//:pyenv3_toolchain': no such package '@gapic_generator_python//': The repository '@gapic_generator_python' could not be resolved
INFO: Elapsed time: 4.019s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (20 packages loaded, 14 targets configured)
FAILED: Build did NOT complete successfully (20 packages loaded, 14 targets configured)
ERROR:synthtool:Failed executing bazel --max_idle_secs=60 build //google/cloud/tasks/v2beta2:tasks-v2beta2-py:
Loading:
Loading: 0 packages loaded
Loading: 0 packages loaded
Loading: 0 packages loaded
INFO: SHA256 (https://github.com/googleapis/gapic-generator/archive/4cb5d58f258afdb8abc0b99706370b4a59252b22.zip) = 3cb59685c8a4ae3db1dec60b286f22d8d3aa3d3b36bb08bf003b5e088fac83cb
DEBUG: Rule 'com_google_api_codegen' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "3cb59685c8a4ae3db1dec60b286f22d8d3aa3d3b36bb08bf003b5e088fac83cb"
DEBUG: Call stack for the definition of repository 'com_google_api_codegen' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:62:1
Loading: 0 packages loaded
INFO: SHA256 (https://github.com/googleapis/protoc-java-resource-names-plugin/archive/5bd90a1f67c1c128291702cc320d667060f40f95.zip) = c3c0661b6c30fce5c63b1d5f473b1c6c4d59e19853ce3b9e8f5a447f953af906
DEBUG: Rule 'com_google_protoc_java_resource_names_plugin' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "c3c0661b6c30fce5c63b1d5f473b1c6c4d59e19853ce3b9e8f5a447f953af906"
DEBUG: Call stack for the definition of repository 'com_google_protoc_java_resource_names_plugin' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:155:1
INFO: SHA256 (https://github.com/googleapis/gapic-generator-go/archive/v0.13.2.tar.gz) = ab7a2ffd74e6a6dac6da38027d4acadb84d0075c055289e3335d86a46f9f3b22
DEBUG: Rule 'com_googleapis_gapic_generator_go' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "ab7a2ffd74e6a6dac6da38027d4acadb84d0075c055289e3335d86a46f9f3b22"
DEBUG: Call stack for the definition of repository 'com_googleapis_gapic_generator_go' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
- <builtin>
- /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:209:1
2020-06-20 08:40:45,152 synthtool [DEBUG] > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/kbuilder/.cache/synthtool/python-tasks/synth.py", line 35, in <module>
include_protos=True,
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 46, in py_library
return self._generate_code(service, version, "python", **kwargs)
File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 180, in _generate_code
shell.run(bazel_run_args)
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=60', 'build', '//google/cloud/tasks/v2beta2:tasks-v2beta2-py']' returned non-zero exit status 1.
2020-06-20 08:40:45,201 autosynth [ERROR] > Synthesis failed
2020-06-20 08:40:45,201 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 01304e7 chore: Migrate python-tasks synth.py from artman to bazel (#21)
2020-06-20 08:40:45,206 autosynth [DEBUG] > Running: git checkout autosynth-self
Switched to branch 'autosynth-self'
2020-06-20 08:40:45,211 autosynth [ERROR] > Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
2020-06-20 08:40:45,493 autosynth [INFO] > PR already exists: https://github.com/googleapis/python-tasks/pull/24
2020-06-20 08:40:45,493 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Removing google/__pycache__/
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 649, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 506, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 629, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 367, in synthesize_loop
synthesize_inner_loop(fork, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 411, in synthesize_inner_loop
synthesizer, len(toolbox.versions) - 1
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 266, in synthesize_version_in_new_branch
synthesizer.synthesize(synth_log_path, self.environ)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
Google internal developers can see the full log here.
I am using the Python Cloud Tasks client for communication between microservices.
For the last few days I have been running load tests to check for bottlenecks in my application, and I have noticed that CPU usage keeps growing during a long load test. This happens when I call the create_task function to submit a task to the queue.
When I call the / webhook endpoint at 2 requests per second (I use Locust for load tests), my CPU usage climbs every minute. After a load test of around 10 minutes, my Kubernetes limits/resources are exhausted and my app crashes.
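Not part of the original report, but one mitigation worth ruling out first: constructing a new CloudTasksClient for every request sets up a fresh gRPC channel each time, which is CPU-expensive. A minimal sketch of caching one client per process; the `factory` indirection and `fake_factory` are hypothetical stand-ins so the pattern runs without GCP credentials:

```python
import functools

def cached_client(factory):
    """Wrap a client factory so the client is built once and then reused."""
    @functools.lru_cache(maxsize=None)
    def get():
        return factory()
    return get

# In real code the factory would be e.g.:
#     lambda: tasks_v2.CloudTasksClient()
# Here a counting fake demonstrates that only one instance is created.
calls = []
def fake_factory():
    calls.append(1)
    return object()

get_client = cached_client(fake_factory)
a, b = get_client(), get_client()
assert a is b and len(calls) == 1  # one client (and one channel) per process
```

Under gunicorn, each worker process would then hold exactly one channel instead of one per request.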
Is your feature request related to a problem? Please describe.
We wish to migrate our app from Python 2 to Python 3, and we rely heavily on transactional enqueuing. The migration doc does not explain how this might work with Cloud NDB and Cloud Tasks.
Describe the solution you'd like
There should be some way of attaching actions to a transaction, with or without a limit on the number of allowed actions.
Describe alternatives you've considered
We considered storing the tasks in Datastore during the transaction and dequeuing them to run once the transaction completes, but that approach seems to consume an entity group (against the 25-entity-group limit), which is not feasible in our case either.
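A pattern we have been sketching in plain Python (this is not an NDB API; it assumes you can run a hook after the transaction commits, and `enqueue` is a hypothetical stand-in for client.create_task): buffer the tasks during the transaction and only enqueue them once the commit succeeds.

```python
class DeferredEnqueuer:
    """Buffer tasks during a transaction; enqueue them only after commit."""

    def __init__(self, enqueue):
        self._enqueue = enqueue  # e.g. a wrapper around client.create_task
        self._pending = []

    def add(self, task):
        self._pending.append(task)

    def commit(self):
        """Call once the datastore transaction has committed."""
        for task in self._pending:
            self._enqueue(task)
        self._pending.clear()

    def rollback(self):
        """Transaction failed: drop the buffered tasks."""
        self._pending.clear()

sent = []
buf = DeferredEnqueuer(sent.append)
buf.add({"url": "/send-sms"})
buf.add({"url": "/send-email"})
buf.commit()
assert len(sent) == 2
```

Unlike the old transactional taskqueue, this is only at-most-once if the process dies between the commit and the flush, so the task handlers still need to be idempotent.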
state = <grpc._channel._RPCState object at 0x7ff6c8ae7760>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7ff6c8b00b80>
with_call = False, deadline = None
def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
        raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626174369.983732944","description":"Error received from peer ipv4:173.194.203.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >
../../google/cloud/tasks_v2/services/cloud_tasks/client.py:814: in delete_queue
rpc(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:285: in retry_wrapped_func
return retry_target(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:188: in retry_target
return target()
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
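When UNAUTHENTICATED comes back like this, the usual first suspect is the Application Default Credentials. Here is a hypothetical stdlib-only helper (not part of the client library) to sanity-check the key file that GOOGLE_APPLICATION_CREDENTIALS points at:

```python
import json
import os

REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_adc_keyfile(path=None):
    """Return (ok, message) for a service-account JSON key file."""
    path = path or os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if not path:
        return False, "GOOGLE_APPLICATION_CREDENTIALS is not set"
    if not os.path.exists(path):
        return False, "key file not found: " + path
    try:
        with open(path) as f:
            key = json.load(f)
    except (OSError, ValueError) as exc:
        return False, f"could not parse key file: {exc}"
    if not isinstance(key, dict):
        return False, "key file is not a JSON object"
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        return False, f"key file is missing fields: {sorted(missing)}"
    return True, f"looks like a service-account key for {key['client_email']}"

print(check_adc_keyfile()[1])
```

If the file checks out, the credentials can also be passed explicitly, e.g. via service_account.Credentials.from_service_account_file() and CloudTasksClient(credentials=...), to rule out ADC discovery problems.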
We are using Google Cloud Tasks to offload asynchronous work. We have an API endpoint that publishes notifications to subscribers; it fetches the subscribers from the database and schedules multiple tasks that invoke another endpoint, say /send-sms or /send-email.
It seems that the process of creating a task isn't releasing memory.
Environment details
OS type and version: python:3.7-slim
Python version: 3.7
pip version: 20.2.4
google-cloud-tasks==1.3.0
Steps to reproduce
Create Cloud Task
Code example
LOGGER.info(f"Task create request")
payload = copy.deepcopy(payload)
client = tasks_v2beta3.CloudTasksClient()
# Construct the fully qualified queue name.
parent = client.queue_path(
    constants.PROJECT_ID,
    constants.LOCATION_ID, tasks_queue
)

if payload is not None:
    # The API expects a payload of type bytes.
    payload = json.dumps(payload).encode()

# Construct the request body.
task = {
    "http_request": {
        "headers": self.headers,
        "http_method": "POST",
        "url": url,
        "body": payload,
    }
}

# Default delay of 2 seconds
in_seconds = 5

if delay_hours is not None:
    # Convert into seconds
    in_seconds = delay_hours * 3600

# Convert "seconds from now" into an rfc3339 datetime string.
d = datetime.datetime.utcnow() + datetime.timedelta(seconds=in_seconds)

# Create Timestamp protobuf.
timestamp = timestamp_pb2.Timestamp()
timestamp.FromDatetime(d)
# Add the timestamp to the tasks.
task["schedule_time"] = timestamp

try:
    LOGGER.info(f"Creating task on queue {parent} with following config {task}")
    # Use the client to build and send the task.
    response = client.create_task(parent, task, retry=self.getRetryParams())
    LOGGER.info(f"Created task {response.name}")
except AlreadyExists as e:
    LOGGER.info(f"Task {task} already exist. Ignoring exception {str(e)}", exc_info=True)
Memory profile
Line # Mem usage Increment Occurrences Line Contents
============================================================
54 68.8 MiB 68.8 MiB 1 @profile
55 def create_task(self, delay_hours, payload, url, tasks_queue):
56 """
57 Create Task in respective Queue
58 """
59 68.8 MiB 0.0 MiB 1 LOGGER.info(f"Task create request")
60 68.8 MiB 0.0 MiB 1 payload = copy.deepcopy(payload)
61 68.8 MiB 0.0 MiB 1 client = tasks_v2beta3.CloudTasksClient()
62 # Construct the fully qualified queue name.
63 68.8 MiB 0.0 MiB 1 parent = client.queue_path(
64 68.8 MiB 0.0 MiB 1 constants.PROJECT_ID,
65 68.8 MiB 0.0 MiB 1 constants.LOCATION_ID, tasks_queue
66 )
67
68 68.8 MiB 0.0 MiB 1 if payload is not None:
69 # The API expects a payload of type bytes.
70 68.8 MiB 0.0 MiB 1 payload = json.dumps(payload).encode()
71
72 # Construct the request body.
73 task = {
74 68.8 MiB 0.0 MiB 1 "http_request": {
75 68.8 MiB 0.0 MiB 1 "headers": self.headers,
76 68.8 MiB 0.0 MiB 1 "http_method": "POST",
77 68.8 MiB 0.0 MiB 1 "url": url,
78 68.8 MiB 0.0 MiB 1 "body": payload,
79 }
80 }
81
82 # Default delay of 2 seconds
83 68.8 MiB 0.0 MiB 1 in_seconds = 5
84
85 68.8 MiB 0.0 MiB 1 if delay_hours is not None:
86 # Convert into seconds
87 in_seconds = delay_hours * 3600
88
89 # Convert "seconds from now" into an rfc3339 datetime string.
90 68.8 MiB 0.0 MiB 1 d = datetime.datetime.utcnow() + datetime.timedelta(seconds=in_seconds)
91
92 # Create Timestamp protobuf.
93 68.8 MiB 0.0 MiB 1 timestamp = timestamp_pb2.Timestamp()
94 68.8 MiB 0.0 MiB 1 timestamp.FromDatetime(d)
95 # Add the timestamp to the tasks.
96 68.8 MiB 0.0 MiB 1 task["schedule_time"] = timestamp
97
98 68.8 MiB 0.0 MiB 1 try:
99 68.8 MiB 0.0 MiB 1 LOGGER.info(f"Creating task on queue {parent} with following config {task}")
100 # Use the client to build and send the task.
101 69.8 MiB 1.0 MiB 1 response = client.create_task(parent, task, retry=self.getRetryParams())
102 69.8 MiB 0.0 MiB 1 LOGGER.info(f"Created task {response.name}")
103 except AlreadyExists as e:
104 LOGGER.info(f"Task {task} already exist. Ignoring exception {str(e)}", exc_info=True)
Following are the memory footprints of the instance.
I have tried deleting the client object manually and calling gc.collect(), but nothing keeps the memory in check. I am not sure what else we can do, or what other information is needed to nail down the issue.
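One stdlib-only way to narrow this down further (an illustration of the technique, not specific to the Tasks client; `suspect` below is a deliberately leaky stand-in for the create_task wrapper): diff tracemalloc snapshots taken around the suspect call, which points at the exact allocation site that is still reachable afterwards.

```python
import tracemalloc

def top_allocations(fn, limit=5):
    """Run fn() and return the biggest allocation diffs it caused."""
    tracemalloc.start()
    before = tracemalloc.take_snapshot()
    fn()
    after = tracemalloc.take_snapshot()
    tracemalloc.stop()
    return after.compare_to(before, "lineno")[:limit]

leak = []  # simulates state that survives the call
def suspect():
    leak.append(bytearray(512 * 1024))  # keeps 512 KiB alive per call

for stat in top_allocations(suspect):
    print(stat)  # the bytearray line should top the list
```

Unlike memory_profiler's per-line totals, the snapshot diff names the file and line of each surviving allocation, which helps distinguish a leak in the caller from one inside the client or gRPC.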
In the documentation for create_task, retry is documented as: (Optional[google.api_core.retry.Retry]) - A retry object used to retry client library requests. If None is specified, requests will be retried using a default configuration.
However, according to this feature request, task-level retry settings are currently not supported by Cloud Tasks.
I tried it, and the retry option appears to be ignored, so I'm guessing this is a problem with the documentation rather than the code.
If it is an issue with the code, the steps to reproduce amount to creating a task with a short deadline in a queue that has no deadline (or a longer one); the task-level deadline will be ignored.
_InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "Connection reset by peer"
debug_error_string = "{"created":"@1618704900.216123229","description":"Error received from peer ipv4:108.177.121.95:443","file":"src/core/lib/surface/call.cc","file_line":1067,"grpc_message":"Connection reset by peer","grpc_status":14}"
>
When attempting to create a Cloud Task via any official Python client (e.g. tasks_v2, tasks_v2beta2, or tasks_v2beta3), a 504 Deadline Exceeded error is consistently returned. This occurs regardless of the authentication technique used.
Note: if the same thing is attempted (using the same Cloud Tasks queue) with a Golang client, e.g. the create_task application and example from here, the task is created successfully, i.e. the problem only seems to affect Python clients.
Environment details
OS type and version: Ubuntu 18.04
Python version: Python 3.6.9 and Python 3.8.0
Output of pip freeze:
Alternatively, the issue can be reliably reproduced by following any Python example from the documentation which attempts to create a task, such as this one.
Traceback (most recent call last):
File "/path/to/python3.6/site-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
return callable_(*args, **kwargs)
File "/path/to/python3.6/site-packages/grpc/_channel.py", line 826, in __call__
return _end_unary_response_blocking(state, call, False, None)
File "/path/to/python3.6/site-packages/grpc/_channel.py", line 729, in _end_unary_response_blocking
raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.DEADLINE_EXCEEDED
details = "Deadline Exceeded"
debug_error_string = "{"created":"@1598472279.156412856","description":"Deadline Exceeded","file":"src/core/ext/filters/deadline/deadline_filter.cc","file_line":69,"grpc_status":4}"
>
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "create_app_engine_queue_task.py", line 110, in <module>
args.payload, args.in_seconds)
File "create_app_engine_queue_task.py", line 66, in create_task
response = client.create_task(parent, task)
File "/path/to/python3.6/site-packages/google/cloud/tasks_v2/gapic/cloud_tasks_client.py", line 1508, in create_task
request, retry=retry, timeout=timeout, metadata=metadata
File "/path/to/python3.6/site-packages/google/api_core/gapic_v1/method.py", line 145, in __call__
return wrapped_func(*args, **kwargs)
File "/path/to/python3.6/site-packages/google/api_core/retry.py", line 286, in retry_wrapped_func
on_error=on_error,
File "/path/to/python3.6/site-packages/google/api_core/retry.py", line 184, in retry_target
return target()
File "/path/to/python3.6/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
return func(*args, **kwargs)
File "/path/to/python3.6/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
File "<string>", line 3, in raise_from
google.api_core.exceptions.DeadlineExceeded: 504 Deadline Exceeded
Note: if the client.create_task() call is made with a timeout parameter specified, an almost-identical error and stack-trace is returned, with the only differences being the status code (status = StatusCode.UNAVAILABLE) and exception thrown (google.api_core.exceptions.ServiceUnavailable: 503 Deadline Exceeded). It's possible, especially in light of the API error code documentation, that there is a discrepancy worth investigating here.
Finally, it is probably worth mentioning this issue report which mentions that the gunicorn version used may be relevant. For what it's worth, I'm seeing the same error(s) regardless of the gunicorn version the task handler uses.
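For context on the retry.py frames in the traces above, the client's default behavior is roughly the following (a simplified sketch of exponential backoff under an overall deadline, not the actual google.api_core implementation):

```python
import random
import time

def retry_target(target, predicate, initial=0.1, maximum=10.0,
                 multiplier=2.0, deadline=60.0, sleep=time.sleep,
                 clock=time.monotonic):
    """Call target() until it succeeds, a non-retryable error is raised,
    or the overall deadline expires."""
    delay = initial
    give_up_at = clock() + deadline
    last_exc = None
    while clock() < give_up_at:
        try:
            return target()
        except Exception as exc:
            if not predicate(exc):
                raise  # not retryable: surface immediately
            last_exc = exc
        sleep(min(random.uniform(0, delay), maximum))  # jittered backoff
        delay = min(delay * multiplier, maximum)
    raise TimeoutError(f"deadline of {deadline}s exceeded") from last_exc

# Demo: succeeds on the third attempt; sleep is stubbed out for the example.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

result = retry_target(flaky, lambda e: isinstance(e, ConnectionError),
                      sleep=lambda s: None)
assert result == "ok" and attempts["n"] == 3
```

This split may also be why passing timeout= changes which error surfaces: a per-attempt timeout fires inside the call itself, while the overall deadline is enforced by the retry loop, so the two paths can report different status codes.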
In fact, on another REST API web page, it explicitly states:
RetryConfig
Settings that determine the retry behavior.
For tasks created using Cloud Tasks: the queue-level retry settings apply to all tasks in the queue that were created using Cloud Tasks. Retry settings cannot be set on individual tasks.
For tasks created using the App Engine SDK: the queue-level retry settings apply to all tasks in the queue which do not have retry settings explicitly set on the task and were created by the App Engine SDK. See App Engine documentation.
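Given the RetryConfig description above, retry behavior for tasks created through Cloud Tasks has to be configured on the queue rather than per task, for example with the gcloud CLI (the queue name is hypothetical; check `gcloud tasks queues update --help` for the exact flags available in your SDK version):

```
gcloud tasks queues update my-queue \
    --max-attempts=5 \
    --min-backoff=1s \
    --max-backoff=300s \
    --max-doublings=5
```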