
python-tasks's Introduction

NOTE

This GitHub repository is archived. The repository contents and history have moved to google-cloud-python.

Python Client for Cloud Tasks API


Cloud Tasks API: a fully managed service that allows you to manage the execution, dispatch and delivery of a large number of distributed tasks. You can asynchronously perform work outside of a user request. Your tasks can be executed on App Engine or any arbitrary HTTP endpoint.

Quick Start

In order to use this library, you first need to go through the following steps:

  1. Select or create a Cloud Platform project.
  2. Enable billing for your project.
  3. Enable the Cloud Tasks API.
  4. Set up Authentication.

Installation

Install this library in a virtual environment using venv. venv is a tool that creates isolated Python environments. These isolated environments can have separate versions of Python packages, which allows you to isolate one project's dependencies from the dependencies of other projects.

With venv, it's possible to install this library without needing system install permissions, and without clashing with the installed system dependencies.

Code samples and snippets

Code samples and snippets live in the samples/ folder.

Supported Python Versions

Our client libraries are compatible with all current active and maintenance versions of Python.

Python >= 3.7

Unsupported Python Versions

Python <= 3.6

If you are using an end-of-life version of Python, we recommend that you update as soon as possible to an actively supported version.

Mac/Linux

python3 -m venv <your-env>
source <your-env>/bin/activate
pip install google-cloud-tasks

Windows

py -m venv <your-env>
.\<your-env>\Scripts\activate
pip install google-cloud-tasks

Next Steps
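
Once the library is installed and authentication is configured, a minimal usage sketch might look like the following. This is a hedged example, not official documentation: the project, location, queue, and handler URL are placeholders, and the queue is assumed to already exist.

from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()

# Fully qualified queue name; the queue must already exist.
parent = client.queue_path("my-project", "us-central1", "my-queue")

task = {
    "http_request": {  # HTTP target task
        "http_method": tasks_v2.HttpMethod.POST,
        "url": "https://example.com/task_handler",
        "body": b"task payload",
    }
}

response = client.create_task(request={"parent": parent, "task": task})
print("Created task:", response.name)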

python-tasks's People

Contributors

arithmetic1728, averikitsch, busunkim96, crwilcox, dandhlee, dinagraves, dpebot, ehsan-karamad, gcf-owl-bot[bot], google-cloud-policy-bot[bot], justinbeckwith, msampathkumar, msdinit, parthea, pravindahal, release-please[bot], renovate-bot, surferjeffatgoogle, theacodes, tobked, tseaver, vam-google, ylil93, yoshi-automation


python-tasks's Issues

samples.snippets.delete_queue_test: test_delete_queue failed

This test failed!

To configure my behavior, see the Flaky Bot documentation.

If I'm commenting on this issue too often, add the flakybot: quiet label and
I will stop commenting.


commit: 47b9c1c
buildURL: Build Status, Sponge
status: failed

Test output
args = (parent: "projects/python-docs-samples-tests/locations/us-central1"
queue {
  name: "projects/python-docs-samples-tests/locations/us-central1/queues/my-queue-124d2f811e414a0bbecd9e920231638b"
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'parent=projects/python-docs-samples-tests/locations/us-central1'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:67:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7ff6c69b2790>
request = parent: "projects/python-docs-samples-tests/locations/us-central1"
queue {
name: "projects/python-docs-samples-tests/locations/us-central1/queues/my-queue-124d2f811e414a0bbecd9e920231638b"
}

timeout = None
metadata = [('x-goog-request-params', 'parent=projects/python-docs-samples-tests/locations/us-central1'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:946:


state = <grpc._channel._RPCState object at 0x7ff6c8afe2b0>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7ff6c67a5f40>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626174370.877133679","description":"Error received from peer ipv4:173.194.202.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

@pytest.fixture()
def test_queue():
    client = tasks_v2.CloudTasksClient()
    parent = f"projects/{TEST_PROJECT_ID}/locations/{TEST_LOCATION}"
    queue = {
        # The fully qualified path to the queue
        "name": client.queue_path(TEST_PROJECT_ID, TEST_LOCATION, TEST_QUEUE_NAME),
    }
  q = client.create_queue(request={"parent": parent, "queue": queue})

delete_queue_test.py:38:


../../google/cloud/tasks_v2/services/cloud_tasks/client.py:628: in create_queue
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAUTHENTICATED
details = "Request had invalid a...entication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"

???
E google.api_core.exceptions.Unauthenticated: 401 Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.

:3: Unauthenticated

Cloud Tasks version 2 documentation

Is there documentation with full examples of how to create a task in the new version?

We've updated the library (to 2.2.0) and our code no longer works. I haven't found a full example of how to create a task, and I'm stuck on setting headers.

We're supposed to use types.HttpRequest.HeadersEntry, but how do I actually set the header values?

I've tried HeadersEntry(content_type='application/json') and HeadersEntry(**{"Content-Type": "application/json"}), and neither works.
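
For what it's worth, one approach that is generally expected to work with the 2.x proto-plus types is to skip HeadersEntry (the generated map-entry type) and pass the headers as a plain dict inside the http_request. A hedged sketch, with placeholder project, queue, and URL values:

from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path("my-project", "us-central1", "my-queue")

task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": "https://example.com/task_handler",
        # headers is a map field, so a plain dict should marshal directly
        "headers": {"Content-Type": "application/json"},
        "body": b'{"key": "value"}',
    }
}

response = client.create_task(request={"parent": parent, "task": task})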

When to use types vs dicts with this library

I've had a hard time using this library. I first used an example I found to create a task using this simple dictionary

I found the example by going here https://cloud.google.com/tasks/docs/quickstart which linked me to here https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/appengine/flexible/tasks/create_app_engine_queue_task.py

The example looks like this:

    task = {
            'app_engine_http_request': {  # Specify the type of request.
                'http_method': tasks_v2.HttpMethod.POST,
                'relative_uri': '/example_task_handler'
            }
    }

I modified it to be an http_request rather than an app_engine_http_request after reading through the docs here https://cloud.google.com/tasks/docs/reference/rest/v2/projects.locations.queues.tasks/create. I also want to use the OAuthToken feature so I can have Cloud Tasks trigger a call to another Google Cloud API. Because the http_method above uses tasks_v2.HttpMethod.POST, I assumed I needed to use tasks_v2.OAuthToken() in my task definition. I ended up with this (I'm including a full snippet, but note the task definition):

from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()

task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": "https://pubsub.googleapis.com/v1/projects/my-project/topics/testtopic:publish",
        "body": b"eyJtZXNzYWdlcyI6IFt7ImRhdGEiOiAiVkdocGN5QnBjeUJoSUhSbGMzUUsifV19Cg==",
        "oauth_token": tasks_v2.OAuthToken(service_account_email='[email protected]'),
    }
}

parent = 'projects/my-project/locations/us-central1/queues/my-queue'

resp = client.create_task(parent=parent, task=task)

But when I run that code I get the following error:

Traceback (most recent call last):
  File "/path/to/example_tasks.py", line 18, in <module>
    resp = client.create_task(parent=parent, task=task)
  File "/path/to/.venv/lib/python3.9/site-packages/google/cloud/tasks_v2/services/cloud_tasks/client.py", line 1700, in create_task
    request.task = task
  File "/path/to/.venv/lib/python3.9/site-packages/proto/message.py", line 632, in __setattr__
    pb_value = marshal.to_proto(pb_type, value)
  File "/path/to/.venv/lib/python3.9/site-packages/proto/marshal/marshal.py", line 208, in to_proto
    pb_value = rule.to_proto(value)
  File "/path/to/.venv/lib/python3.9/site-packages/proto/marshal/rules/message.py", line 32, in to_proto
    return self._descriptor(**value)
TypeError: Parameter to MergeFrom() must be instance of same class: expected google.cloud.tasks.v2.OAuthToken got OAuthToken.

That error seems odd, because it looks like I passed it what it expected. I suspect that is a bug, but as you'll see below I got it working with a dict, so I wasn't sure; that's why I submitted this as a support request rather than a bug.

After some fumbling around I noticed that if I pass it a dictionary, it seems to work. The working payload is:

task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": "https://pubsub.googleapis.com/v1/projects/my-project/topics/testtopic:publish",
        "body": b"eyJtZXNzYWdlcyI6IFt7ImRhdGEiOiAiVkdocGN5QnBjeUJoSUhSbGMzUUsifV19Cg==",
        "oauth_token": {"service_account_email":"[email protected]"},
    }
}

The documentation here isn't very helpful either: https://googleapis.dev/python/cloudtasks/latest/

I'm working on a POC using Cloud Tasks, and I'm making progress by testing various things and trying to read through the code, but it has been a bit frustrating so far.
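
Reading between the lines of the MergeFrom error, the likely culprit (an assumption, not a confirmed diagnosis) is mixing layers: the outer dict is marshalled by plain protobuf, which does not recognise the proto-plus OAuthToken wrapper. Using either dicts all the way down (as in the working payload above) or proto-plus types all the way down should avoid the error. A hedged sketch of the fully typed form, with a placeholder service account:

from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()

task = tasks_v2.Task(
    http_request=tasks_v2.HttpRequest(
        http_method=tasks_v2.HttpMethod.POST,
        url="https://pubsub.googleapis.com/v1/projects/my-project/topics/testtopic:publish",
        body=b"eyJtZXNzYWdlcyI6IFt7ImRhdGEiOiAiVkdocGN5QnBjeUJoSUhSbGMzUUsifV19Cg==",
        # Placeholder service account email.
        oauth_token=tasks_v2.OAuthToken(
            service_account_email="my-sa@my-project.iam.gserviceaccount.com"
        ),
    )
)

parent = "projects/my-project/locations/us-central1/queues/my-queue"
resp = client.create_task(request={"parent": parent, "task": task})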

Authenticated (OIDC) HTTP task example should set an audience field

According to the Cloud Tasks API, when the audience field is unset, it takes the default value of the URI specified in the HttpRequest's target. However, this can become a problem when the URI itself contains parts such as query parameters or fragments: having these in the audience field leads to a 401.

Part of this issue lies with the API itself, but as far as documentation and examples go, we should set the audience field explicitly to avoid such problems for customers who start using the product from Google-provided examples.
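
A hedged sketch of what such an example might look like, with the project, queue, service account, handler URL, and audience as placeholders; the point is simply that the audience is set explicitly rather than inherited from a URL that carries query parameters:

from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path("my-project", "us-central1", "my-queue")

task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        # The target URL may carry query parameters or fragments...
        "url": "https://example.com/task_handler?source=cloud-tasks",
        "oidc_token": {
            "service_account_email": "my-sa@my-project.iam.gserviceaccount.com",
            # ...so pin the audience to the bare endpoint the handler validates.
            "audience": "https://example.com/task_handler",
        },
    }
}

response = client.create_task(request={"parent": parent, "task": task})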

Tasks: Broken links in docstring ReStructured text.

The Types for Cloud Tasks API documentation seems to have a formatting issue. There are a number of places where a property appears in square brackets immediately followed by the same property's fully qualified name in square brackets, e.g. [AppEngineHttpRequest][google.cloud.tasks.v2.AppEngineHttpRequest].

Steps to reproduce

  1. Visit Types for Cloud Tasks API
  2. Search for occurrences of a closing square bracket immediately followed by an opening square bracket i.e. ][

The first occurrence is at AppEngineHttpRequest. The issue also exists in max_burst_size. A search for "][" returns 138 results, but it's possible that some results are not indicative of markdown issues.

Calling create_task() from Python app is now failing

Hello,

I also posted this issue on the Gcloud Issue tracker. https://issuetracker.google.com/169252786

I'm running a Python app on GAE, and for the past few hours all calls to create a task have been returning the following error:

Traceback (most recent call last):
  File "/env/lib/python3.7/site-packages/flask/app.py", line 2446, in wsgi_app
    response = self.full_dispatch_request()
  File "/env/lib/python3.7/site-packages/flask/app.py", line 1951, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/env/lib/python3.7/site-packages/flask/app.py", line 1820, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/env/lib/python3.7/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/env/lib/python3.7/site-packages/flask/app.py", line 1949, in full_dispatch_request
    rv = self.dispatch_request()
  File "/env/lib/python3.7/site-packages/flask/app.py", line 1935, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/srv/main.py", line 616, in createDailyCheckins
    response = client_verifier.create_task(parent, task)
TypeError: create_task() takes from 1 to 2 positional arguments but 3 were given

My GAE requirements file for the app is the following:
Flask==1.1.1
google-api-python-client
requests
google-cloud-tasks
google-cloud-firestore
pytz
pyjwt
cryptography
hyper

The app has been creating tasks for months without error with the exact same code.
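
For context, the 2.x release changed the generated method signatures, so positional create_task(parent, task) calls no longer work. A hedged sketch of the two call forms the 2.x client accepts (queue path and relative URI are placeholders):

from google.cloud import tasks_v2

client_verifier = tasks_v2.CloudTasksClient()
parent = client_verifier.queue_path("my-project", "us-central1", "my-queue")
task = {
    "app_engine_http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "relative_uri": "/example_task_handler",
    }
}

# Keyword arguments instead of positional parent/task...
response = client_verifier.create_task(parent=parent, task=task)

# ...or a single request mapping.
response = client_verifier.create_task(request={"parent": parent, "task": task})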

Documentation on scheduling a app_engine_http_request seems incorrect

Hi,

I'm having an issue creating scheduled Cloud Tasks using Python 2.7: they won't work, throwing a TypeError.
It's probably worth noting that as long as I don't use in_seconds, everything works as expected.

Am I missing something?

Thanks

Environment details

  • OS type and version: Linux / Ubuntu 20.10
  • Python version: 2.7
  • pip version: 20.3.3
  • google-cloud-tasks version: 1.5.0

Steps to reproduce

  1. Follow the code sample to create a task using an app_engine_http_request
  2. Create a task using in_seconds

Code example

-> https://cloud.google.com/tasks/docs/samples/cloud-tasks-appengine-create-task

I also tried different combinations for 'schedule_time':

task['schedule_time'] = date.isoformat('T')
task['schedule_time'] = ' 2021-01-07T10:20:50.52Z'  # future date, less than 30 days from now

# Using future.backports.datetime
task['schedule_time'] = date.isoformat('T')

Nothing worked.
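
For reference, a hedged sketch of the conversion that appears to be required here: build a google.protobuf Timestamp from the datetime instead of assigning a datetime (or ISO string) directly. This is an assumption based on the MergeFrom error in the stack trace below, not a confirmed fix, and the project, location, and queue are placeholders.

import datetime

from google.cloud import tasks_v2
from google.protobuf import timestamp_pb2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path('my-project', 'us-central1', 'my-queue')

task = {
    'app_engine_http_request': {
        'http_method': 'POST',
        'relative_uri': '/example_task_handler',
    }
}

# Convert the desired run time into a protobuf Timestamp.
in_seconds = 5 * 60
d = datetime.datetime.utcnow() + datetime.timedelta(seconds=in_seconds)
timestamp = timestamp_pb2.Timestamp()
timestamp.FromDatetime(d)
task['schedule_time'] = timestamp

response = client.create_task(parent, task)  # 1.x positional signature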

Stack trace

ERROR    2020-12-31 11:28:56,442 handlers.py:120] Traceback (most recent call last):
  File "/home/nathan/myproject/lib/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/home/nathan/myproject/lib/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/home/nathan/myproject/app/main/admin/migrations.py", line 34, in reindex_enterprise
    migrate_by_pieces(400, _in_seconds=5 * 60)
  File "/home/nathan/myproject/app/deferredv3.py", line 70, in wrapper_defer
    path=u'/function-handler/{}'.format(f.__name__))
  File "/home/nathan/myproject/app/deferredv3.py", line 113, in send_deferred_task
    task = client.create_task(parent=parent, task=task)
  File "/home/nathan/myproject/lib/google/cloud/tasks_v2/gapic/cloud_tasks_client.py", line 1492, in create_task
    parent=parent, task=task, response_view=response_view
TypeError: Parameter to MergeFrom() must be instance of same class: expected google.protobuf.Timestamp got datetime.datetime.

Tasks: Update queue, "x-goog-request-params" header

Environment details

Running google-cloud-tasks==1.2.1.

Steps to reproduce

  1. Create a Cloud Tasks queue.
  2. Attempt to update the queue using the update_queue method with a dictionary to specify the queue name.
  3. Observe exception: google.api_core.exceptions.InvalidArgument: 400 "x-goog-request-params" header is either missing or misformatted. "x-goog-request-params" must contain "queue.name=projects/[projectname]/locations/[location]/queues/[name]"
  4. Replace the queue object in update_queue with an instance of google.cloud.tasks_v2beta3.types.Queue and observe success.

Code example

import google
from google.cloud import tasks_v2beta3

TASKS_CLIENT = tasks_v2beta3.CloudTasksClient()
TASKS_QUEUE_NAME = 'queuenamehere'
TASKS_CLIENT.update_queue({'name': TASKS_QUEUE_NAME})

vs.

queue = google.cloud.tasks_v2beta3.types.Queue(name=TASKS_QUEUE_NAME)
TASKS_CLIENT.update_queue(queue)

Note: After I resolved this issue, I was using this to update the retry policy, but this is the simplest example to demonstrate the failure -- namely that update_queue doesn't actually work if a dictionary is provided for the queue, even though the type annotation says that the Union[dict, Queue] is supported.

Stack trace

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
    return callable_(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/grpc/_channel.py", line 565, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "/usr/local/lib/python3.7/site-packages/grpc/_channel.py", line 467, in _end_unary_response_blocking
    raise _Rendezvous(state, None, None, deadline)
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
        status = StatusCode.INVALID_ARGUMENT
        details = ""x-goog-request-params" header is either missing or misformatted. "x-goog-request-params" must contain "queue.name=projects/gecko-ops-sandbox/locations/us-east4/queues/loader-mapper""
        debug_error_string = "{"created":"@1567636043.780459442","description":"Error received from peer ipv4:74.125.141.95:443","file":"src/core/lib/surface/call.cc","file_line":1052,"grpc_message":""x-goog-request-params" header is either missing or misformatted. "x-goog-request-params" must contain "queue.name=projects/gecko-ops-sandbox/locations/us-east4/queues/loader-mapper"","grpc_status":3}"
>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.7/site-packages/google/cloud/tasks_v2beta3/gapic/cloud_tasks_client.py", line 640, in update_queue
    request, retry=retry, timeout=timeout, metadata=metadata
  File "/usr/local/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py", line 143, in __call__
    return wrapped_func(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/google/api_core/retry.py", line 273, in retry_wrapped_func
    on_error=on_error,
  File "/usr/local/lib/python3.7/site-packages/google/api_core/retry.py", line 182, in retry_target
    return target()
  File "/usr/local/lib/python3.7/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.InvalidArgument: 400 "x-goog-request-params" header is either missing or misformatted. "x-goog-request-params" must contain "queue.name=[redacted]"

Fortunately, creating the Queue object resolves this, but if a dict is not supported then the annotation should be updated, and ideally there should be a clear error that lets the API user know a dict isn't supported. Right now the AttributeError raised when accessing .name is silently swallowed, but perhaps it should be surfaced.
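
For the retry-policy use case mentioned above, a hedged sketch of the workaround with a typed Queue, assuming the 1.x v2beta3 surface shown in this issue; the queue name and values are placeholders:

from google.cloud import tasks_v2beta3
from google.protobuf import field_mask_pb2

client = tasks_v2beta3.CloudTasksClient()

# A typed Queue lets the client read queue.name when it builds the
# x-goog-request-params routing header.
queue = tasks_v2beta3.types.Queue(
    name="projects/my-project/locations/us-east4/queues/my-queue",
    retry_config=tasks_v2beta3.types.RetryConfig(max_attempts=5),
)
update_mask = field_mask_pb2.FieldMask(paths=["retry_config.max_attempts"])

client.update_queue(queue, update_mask=update_mask)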

Creating a task fails with "expected google.protobuf.Timestamp got datetime.datetime."

Thanks for stopping by to let us know something could be better!

PLEASE READ: If you have a support contract with Google, please create an issue in the support console instead of filing on GitHub. This will ensure a timely response.

Please run down the following list and make sure you've tried the usual "quick fixes":

If you are still having issues, please be sure to include as much information as possible:

Environment details

  • OS type and version: macOS
  • Python version (python --version): 3.9.6
  • pip version (pip --version): 21.1.3
  • google-cloud-tasks version (pip show google-cloud-tasks): 2.5.1

Steps to reproduce

  1. Download the code sample found here: https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/appengine/flexible/tasks/create_app_engine_queue_task.py
  2. Execute it with valid parameters: python create_app_engine_queue_task.py --project my-project --location us-central1 --in_seconds 30 --queue myqueue
  3. See error:
Traceback (most recent call last):
  File "/path/to/create_app_engine_queue_task.py", line 110, in <module>
    create_task(
  File "/path/to/create_app_engine_queue_task.py", line 68, in create_task
    response = client.create_task(parent=parent, task=task)
  File "/path/to/.venv/lib/python3.9/site-packages/google/cloud/tasks_v2/services/cloud_tasks/client.py", line 1700, in create_task
    request.task = task
  File "/path/to/.venv/lib/python3.9/site-packages/proto/message.py", line 632, in __setattr__
    pb_value = marshal.to_proto(pb_type, value)
  File "/path/to/.venv/lib/python3.9/site-packages/proto/marshal/marshal.py", line 208, in to_proto
    pb_value = rule.to_proto(value)
  File "/path/to/.venv/lib/python3.9/site-packages/proto/marshal/rules/message.py", line 32, in to_proto
    return self._descriptor(**value)
TypeError: Parameter to MergeFrom() must be instance of same class: expected google.protobuf.Timestamp got datetime.datetime.

Code example

See python code sample here: https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/appengine/flexible/tasks/create_app_engine_queue_task.py
It is also linked to from the Google Cloud documentation.

Stack trace

Traceback (most recent call last):
  File "/path/to/create_app_engine_queue_task.py", line 110, in <module>
    create_task(
  File "/path/to/create_app_engine_queue_task.py", line 68, in create_task
    response = client.create_task(parent=parent, task=task)
  File "/path/to/.venv/lib/python3.9/site-packages/google/cloud/tasks_v2/services/cloud_tasks/client.py", line 1700, in create_task
    request.task = task
  File "/path/to/.venv/lib/python3.9/site-packages/proto/message.py", line 632, in __setattr__
    pb_value = marshal.to_proto(pb_type, value)
  File "/path/to/.venv/lib/python3.9/site-packages/proto/marshal/marshal.py", line 208, in to_proto
    pb_value = rule.to_proto(value)
  File "/path/to/.venv/lib/python3.9/site-packages/proto/marshal/rules/message.py", line 32, in to_proto
    return self._descriptor(**value)
TypeError: Parameter to MergeFrom() must be instance of same class: expected google.protobuf.Timestamp got datetime.datetime.

Making sure to follow these steps will guarantee the quickest resolution possible.

Thanks!

scripts/fixup*.py run as shell scripts

Thanks for stopping by to let us know something could be better!

PLEASE READ: If you have a support contract with Google, please create an issue in the support console instead of filing on GitHub. This will ensure a timely response.

Please run down the following list and make sure you've tried the usual "quick fixes":

If you are still having issues, please be sure to include as much information as possible:

Environment details

  • OS type and version: macOS
  • Python version (python --version): Python 3.7.8
  • pip version (pip --version): pip 20.1.1
  • google-cloud-tasks version (pip show google-cloud-tasks): 2.0.0

Steps to reproduce

  1. Run fixup_tasks_v2_keywords.py --input-directory .samples/ --output-directory samples/ in a terminal, as described in UPGRADING.md
  2. Observe output below
line 18: import: command not found
...

Code example

None

Stack trace

None

Making sure to follow these steps will guarantee the quickest resolution possible.

Thanks!

Synthesis failed for python-tasks

Hello! Autosynth couldn't regenerate python-tasks. 💔

Here's the output from running synth.py:

st/bin/external/com_google_protobuf/protoc --experimental_allow_proto3_optional --plugin=protoc-gen-python_gapic=bazel-out/host/bin/external/gapic_generator_python/gapic_plugin --python_gapic_out=retry-config=google/cloud/tasks/v2beta2/cloudtasks_grpc_service_config.json:bazel-out/k8-fastbuild/bin/google/cloud/tasks/v2beta2/tasks_py_gapic.srcjar.zip -Igoogle/cloud/tasks/v2beta2/cloudtasks.proto=google/cloud/tasks/v2beta2/cloudtasks.proto -Igoogle/cloud/tasks/v2beta2/queue.proto=google/cloud/tasks/v2beta2/queue.proto -Igoogle/cloud/tasks/v2beta2/target.proto=google/cloud/tasks/v2beta2/target.proto -Igoogle/cloud/tasks/v2beta2/task.proto=google/cloud/tasks/v2beta2/task.proto -Igoogle/api/annotations.proto=google/api/annotations.proto -Igoogle/api/http.proto=google/api/http.proto -Igoogle/protobuf/descriptor.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/descriptor_proto/google/protobuf/descriptor.proto -Igoogle/api/client.proto=google/api/client.proto -Igoogle/api/field_behavior.proto=google/api/field_behavior.proto -Igoogle/api/resource.proto=google/api/resource.proto -Igoogle/iam/v1/iam_policy.proto=google/iam/v1/iam_policy.proto -Igoogle/iam/v1/options.proto=google/iam/v1/options.proto -Igoogle/iam/v1/policy.proto=google/iam/v1/policy.proto -Igoogle/type/expr.proto=google/type/expr.proto -Igoogle/rpc/status.proto=google/rpc/status.proto -Igoogle/protobuf/any.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/any_proto/google/protobuf/any.proto -Igoogle/protobuf/duration.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/duration_proto/google/protobuf/duration.proto -Igoogle/protobuf/empty.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/empty_proto/google/protobuf/empty.proto -Igoogle/protobuf/field_mask.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/field_mask_proto/google/protobuf/field_mask.proto -Igoogle/protobuf/timestamp.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/timestamp_proto/google/protobuf/timestamp.proto google/cloud/tasks/v2beta2/cloudtasks.proto google/cloud/tasks/v2beta2/queue.proto google/cloud/tasks/v2beta2/target.proto google/cloud/tasks/v2beta2/task.proto` failed (Exit 1) protoc failed: error executing command bazel-out/host/bin/external/com_google_protobuf/protoc --experimental_allow_proto3_optional '--plugin=protoc-gen-python_gapic=bazel-out/host/bin/external/gapic_generator_python/gapic_plugin' ... (remaining 25 argument(s) skipped)

Use --sandbox_debug to see verbose messages from the sandbox
google/cloud/tasks/v2beta2/target.proto:20:1: warning: Import google/api/annotations.proto is unused.
google/cloud/tasks/v2beta2/queue.proto:24:1: warning: Import google/api/annotations.proto is unused.
google/cloud/tasks/v2beta2/task.proto:24:1: warning: Import google/api/annotations.proto is unused.
Traceback (most recent call last):
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate_with_pandoc.py", line 3, in <module>
    from gapic.cli import generate
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate.py", line 23, in <module>
    from gapic import generator
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/__init__.py", line 21, in <module>
    from .generator import Generator
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/generator.py", line 24, in <module>
    from gapic.samplegen import manifest, samplegen
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/__init__.py", line 15, in <module>
    from gapic.samplegen import samplegen
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/samplegen.py", line 27, in <module>
    from gapic.schema import wrappers
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/__init__.py", line 23, in <module>
    from gapic.schema.api import API
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/45/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/api.py", line 29, in <module>
    from google.api_core import exceptions  # type: ignore
ModuleNotFoundError: No module named 'google.api_core'
--python_gapic_out: protoc-gen-python_gapic: Plugin failed with status code 1.
Target //google/cloud/tasks/v2beta2:tasks-v2beta2-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 1.006s, Critical Path: 0.80s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully

Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/kbuilder/.cache/synthtool/python-tasks/synth.py", line 36, in <module>
    include_protos=True,
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
    return self._generate_code(service, version, "python", **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 193, in _generate_code
    shell.run(bazel_run_args)
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
    encoding="utf-8",
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/cloud/tasks/v2beta2:tasks-v2beta2-py']' returned non-zero exit status 1.
2021-01-21 05:47:32,220 autosynth [ERROR] > Synthesis failed
2021-01-21 05:47:32,220 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 6da7304 chore(deps): update dependency google-cloud-tasks to v2.1.0 (#63)
2021-01-21 05:47:32,225 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-01-21 05:47:32,230 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
    main()
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
    return _inner_main(temp_dir)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
    commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
    has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
  File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
    synthesizer.synthesize(synth_log_path, self.environ)
  File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
    synth_proc.check_returncode()  # Raise an exception.
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
    self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.

Google internal developers can see the full log here.

create_task takes too long to complete despite timeout

Environment details

  • OS type and version: Ubuntu 16.04.7 LTS (Xenial Xerus)
  • Python version: Python 3.6.10
  • pip version: pip 20.2.2
  • google-cloud-tasks version: 2.5.1

Additional details:

Steps to reproduce

  1. The app is running on Google App Engine flexible on a Gunicorn server (gunicorn==20.0.4) with a gevent (gevent==20.6.2) worker; grpcio==1.40.0. We recently bumped gevent and grpcio from earlier versions and the issue still happens.
  2. 1 in about every 400 requests takes way too long (several hundred seconds) to complete despite setting a timeout.

A screenshot from New Relic APM shows a request which took 380s, mostly from create_task.

I also have a timer instrumented in the Python function that calls create_task, and it confirms the call takes 380s.

Code example

CLIENT_TIMEOUT_SEC = 8
@timer(dict(name='cloud_task_enqueue'))
def enqueue_task(
    queue: str,
    callback_url_path: str,
    data: [Any] = None,
):
    parent = client.queue_path(GAE_SERVICE, GAE_REGION, queue)

    task = {
        'app_engine_http_request': {
            'http_method': tasks_v2.HttpMethod.POST,
            'relative_uri': callback_url_path,
            'body': json.dumps(data).encode(),
        },
    }

    response = client.create_task(request={'parent': parent, 'task': task}, timeout=CLIENT_TIMEOUT_SEC)
    return response

gRPC has been monkey-patched to support gevent:

# GRPC must be monkeypatched to support gevent
# https://github.com/grpc/grpc/issues/4629#issuecomment-376962677
import grpc.experimental.gevent as grpc_gevent
grpc_gevent.init_gevent()
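
One thing that may be worth checking (an assumption, not a confirmed explanation for the 380s requests): the timeout= argument bounds a single RPC attempt, while the overall retry budget is controlled separately, so passing an explicit Retry with a short total deadline can cap the end-to-end time. A hedged sketch, with the queue path and task as placeholders:

from google.api_core import retry as retries
from google.cloud import tasks_v2

CLIENT_TIMEOUT_SEC = 8

client = tasks_v2.CloudTasksClient()
parent = client.queue_path('my-project', 'us-central1', 'my-queue')
task = {
    'app_engine_http_request': {
        'http_method': tasks_v2.HttpMethod.POST,
        'relative_uri': '/example_task_handler',
    },
}

# Cap the overall retry deadline as well as the per-attempt timeout.
bounded_retry = retries.Retry(deadline=30)

response = client.create_task(
    request={'parent': parent, 'task': task},
    retry=bounded_retry,
    timeout=CLIENT_TIMEOUT_SEC,
)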

Synthesis failed for python-tasks

Hello! Autosynth couldn't regenerate python-tasks. 💔

Here's the output from running synth.py:

st/bin/external/com_google_protobuf/protoc --experimental_allow_proto3_optional --plugin=protoc-gen-python_gapic=bazel-out/host/bin/external/gapic_generator_python/gapic_plugin --python_gapic_out=retry-config=google/cloud/tasks/v2beta2/cloudtasks_grpc_service_config.json:bazel-out/k8-fastbuild/bin/google/cloud/tasks/v2beta2/tasks_py_gapic.srcjar.zip -Igoogle/cloud/tasks/v2beta2/cloudtasks.proto=google/cloud/tasks/v2beta2/cloudtasks.proto -Igoogle/cloud/tasks/v2beta2/queue.proto=google/cloud/tasks/v2beta2/queue.proto -Igoogle/cloud/tasks/v2beta2/target.proto=google/cloud/tasks/v2beta2/target.proto -Igoogle/cloud/tasks/v2beta2/task.proto=google/cloud/tasks/v2beta2/task.proto -Igoogle/api/annotations.proto=google/api/annotations.proto -Igoogle/api/http.proto=google/api/http.proto -Igoogle/protobuf/descriptor.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/descriptor_proto/google/protobuf/descriptor.proto -Igoogle/api/client.proto=google/api/client.proto -Igoogle/api/field_behavior.proto=google/api/field_behavior.proto -Igoogle/api/resource.proto=google/api/resource.proto -Igoogle/iam/v1/iam_policy.proto=google/iam/v1/iam_policy.proto -Igoogle/iam/v1/options.proto=google/iam/v1/options.proto -Igoogle/iam/v1/policy.proto=google/iam/v1/policy.proto -Igoogle/type/expr.proto=google/type/expr.proto -Igoogle/rpc/status.proto=google/rpc/status.proto -Igoogle/protobuf/any.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/any_proto/google/protobuf/any.proto -Igoogle/protobuf/duration.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/duration_proto/google/protobuf/duration.proto -Igoogle/protobuf/empty.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/empty_proto/google/protobuf/empty.proto -Igoogle/protobuf/field_mask.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/field_mask_proto/google/protobuf/field_mask.proto -Igoogle/protobuf/timestamp.proto=bazel-out/k8-fastbuild/bin/external/com_google_protobuf/_virtual_imports/timestamp_proto/google/protobuf/timestamp.proto google/cloud/tasks/v2beta2/cloudtasks.proto google/cloud/tasks/v2beta2/queue.proto google/cloud/tasks/v2beta2/target.proto google/cloud/tasks/v2beta2/task.proto` failed (Exit 1) protoc failed: error executing command bazel-out/host/bin/external/com_google_protobuf/protoc --experimental_allow_proto3_optional '--plugin=protoc-gen-python_gapic=bazel-out/host/bin/external/gapic_generator_python/gapic_plugin' ... (remaining 25 argument(s) skipped)

Use --sandbox_debug to see verbose messages from the sandbox
google/cloud/tasks/v2beta2/target.proto:19:1: warning: Import google/api/annotations.proto is unused.
google/cloud/tasks/v2beta2/queue.proto:24:1: warning: Import google/api/annotations.proto is unused.
google/cloud/tasks/v2beta2/task.proto:23:1: warning: Import google/api/annotations.proto is unused.
Traceback (most recent call last):
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate_with_pandoc.py", line 3, in <module>
    from gapic.cli import generate
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/cli/generate.py", line 23, in <module>
    from gapic import generator
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/__init__.py", line 21, in <module>
    from .generator import Generator
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/generator/generator.py", line 24, in <module>
    from gapic.samplegen import manifest, samplegen
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/__init__.py", line 15, in <module>
    from gapic.samplegen import samplegen
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/samplegen/samplegen.py", line 27, in <module>
    from gapic.schema import wrappers
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/__init__.py", line 23, in <module>
    from gapic.schema.api import API
  File "/home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/sandbox/linux-sandbox/47/execroot/com_google_googleapis/bazel-out/host/bin/external/gapic_generator_python/gapic_plugin.runfiles/gapic_generator_python/gapic/schema/api.py", line 29, in <module>
    from google.api_core import exceptions  # type: ignore
ModuleNotFoundError: No module named 'google.api_core'
--python_gapic_out: protoc-gen-python_gapic: Plugin failed with status code 1.
Target //google/cloud/tasks/v2beta2:tasks-v2beta2-py failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 1.040s, Critical Path: 0.83s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
FAILED: Build did NOT complete successfully

Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/kbuilder/.cache/synthtool/python-tasks/synth.py", line 36, in <module>
    include_protos=True,
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 52, in py_library
    return self._generate_code(service, version, "python", **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 197, in _generate_code
    shell.run(bazel_run_args)
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
    encoding="utf-8",
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/cloud/tasks/v2beta2:tasks-v2beta2-py']' returned non-zero exit status 1.
2021-01-28 05:47:55,036 autosynth [ERROR] > Synthesis failed
2021-01-28 05:47:55,036 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 6da7304 chore(deps): update dependency google-cloud-tasks to v2.1.0 (#63)
2021-01-28 05:47:55,041 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2021-01-28 05:47:55,047 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
    main()
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
    return _inner_main(temp_dir)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
    commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 65, in synthesize_loop
    has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
  File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 259, in synthesize_version_in_new_branch
    synthesizer.synthesize(synth_log_path, self.environ)
  File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
    synth_proc.check_returncode()  # Raise an exception.
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
    self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.

Google internal developers can see the full log here.

samples.snippets.list_queues_test: test_list_queues_present failed

This test failed!

To configure my behavior, see the Flaky Bot documentation.

If I'm commenting on this issue too often, add the flakybot: quiet label and
I will stop commenting.


commit: 47b9c1c
buildURL: Build Status, Sponge
status: failed

Test output
args = (name: "projects/python-docs-samples-tests/locations/us-central1/queues/my-queue-9529df6c3b894d05b5617c7da41f9f05"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/python-docs-samples-tests/locations/us-central1/queues/my-queue-9529df6c3b894d05b5617c7da41f9f05'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:67:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7ff6c69a6670>
request = name: "projects/python-docs-samples-tests/locations/us-central1/queues/my-queue-9529df6c3b894d05b5617c7da41f9f05"

timeout = None
metadata = [('x-goog-request-params', 'name=projects/python-docs-samples-tests/locations/us-central1/queues/my-queue-9529df6c3b894d05b5617c7da41f9f05'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:946:


state = <grpc._channel._RPCState object at 0x7ff6c6a01550>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7ff6c697d0c0>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626174372.475567666","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

@pytest.fixture()
def test_queue():
    client = tasks_v2.CloudTasksClient()
    parent = f"projects/{TEST_PROJECT_ID}/locations/{TEST_LOCATION}"
    queue = {
        # The fully qualified path to the queue
        "name": client.queue_path(TEST_PROJECT_ID, TEST_LOCATION, TEST_QUEUE_NAME),
    }
    q = client.create_queue(request={"parent": parent, "queue": queue})

    yield q
  client.delete_queue(request={"name": q.name})

list_queues_test.py:40:


../../google/cloud/tasks_v2/services/cloud_tasks/client.py:814: in delete_queue
rpc(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in call
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:285: in retry_wrapped_func
return retry_target(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:188: in retry_target
return target()
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAUTHENTICATED
details = "Request had invalid a...entication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"

???
E google.api_core.exceptions.Unauthenticated: 401 Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.

:3: Unauthenticated

Possible memory leak when creating tasks

We're trying to migrate from App Engine Standard to App Engine Flexible, and we're also changing frameworks from webapp2 to Pyramid, as webapp2 doesn't support Python 3 (and is quite old).

We have an endpoint that receives 1 request per second, roughly, and that endpoint kicks off a task. It seems that the process of creating a task isn't releasing memory. I'll try to give as much info as I can.

Environment details

  • OS type and version: Appengine Flexible custom runtime using gcr.io/google-appengine/python
  • Python version: 3.7
  • google-cloud-tasks version: unsure how to check on App Engine, but probably 1.5

Steps to reproduce

  1. Create a task

Code example

A Task class to help facilitate creating tasks in various handlers. The bulk of this is from the Cloud Tasks documentation:

import json
import logging

from google.cloud import tasks_v2


class TasksQueue(object):
    def __init__(self):
        self.client = tasks_v2.CloudTasksClient()

    def add(self, url, payload=None, queue_name='default'):
        if type(payload) == dict:
            payload = json.dumps(payload)

        # Construct the fully qualified queue name.
        parent = self.client.queue_path('my-project', 'us-central1', queue_name)

        # Construct the request body.
        task = {
            'app_engine_http_request': {  # Specify the type of request.
                'http_method': 'POST',
                'relative_uri': url
            }
        }
        if payload is not None:
            # The API expects a payload of type bytes.
            converted_payload = payload.encode()

            # Add the payload to the request.
            task['app_engine_http_request']['body'] = converted_payload
        logging.info(task)
        # Use the client to build and send the task.
        response = self.client.create_task(parent, task)

        logging.info('Created task {}'.format(response.name))
        return response

And our usage of the above class:

taskqueue = TasksQueue()
taskqueue.add(url='/other_endpoint', queue_name='my-queue', payload={'data': 1})

The full handler itself has a connection to a Redis Memorystore, but I ran a bunch of requests through another handler that solely gets/sets Redis entries and saw no increase in memory usage on the instances, so that leads me to believe there's an issue with Cloud Tasks.

Further, I used tracemalloc to check memory usage before and after each request. These are unfortunately backwards (the bottom row is the biggest difference) because of the way logs are displayed in Logging under Flexible when newlines are present. As you can see, though, pyasn1/type/base.py and pyasn1/type/integer.py have objects that balloon in size after the request and never seem to go down.

Before:

A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/grpc/_channel.py:770: size=960 B (+560 B), count=12 (+7), average=80 B
A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/google/api_core/retry.py:286: size=576 B (-576 B), count=1 (-1), average=576 B
A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/pyramid/router.py:139: size=576 B (+576 B), count=1 (+1), average=576 B
A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/grpc/_channel.py:825: size=584 B (-584 B), count=1 (-1), average=584 B
A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/urllib3/util/retry.py:196: size=284 KiB (+588 B), count=1980 (+4), average=147 B
A 2020-04-14T21:43:44Z /opt/python3.7/lib/python3.7/threading.py:235: size=1048 B (+592 B), count=5 (+2), average=210 B
A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/google/protobuf/json_format.py:481: size=1200 B (-600 B), count=2 (-1), average=600 B
A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/grpc/_channel.py:1353: size=1968 B (+712 B), count=17 (+5), average=116 B
A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/grpc/_channel.py:99: size=744 B (+744 B), count=2 (+2), average=372 B
A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/gunicorn/http/message.py:110: size=817 B (-766 B), count=12 (-11), average=68 B
A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/gunicorn/http/wsgi.py:136: size=827 B (-827 B), count=12 (-12), average=69 B
A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/pyasn1/compat/integer.py:99: size=461 KiB (+956 B), count=2470 (+5), average=191 B
A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/grpc/_channel.py:1149: size=1728 B (+1008 B), count=12 (+7), average=144 B
A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/pyasn1/type/base.py:373: size=418 KiB (+1008 B), count=2558 (+6), average=167 B
A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/gunicorn/http/wsgi.py:139: size=1112 B (-1112 B), count=1 (-1), average=1112 B
A 2020-04-14T21:43:44Z /env/lib/python3.7/site-packages/pyasn1/codec/ber/decoder.py:609: size=12.3 KiB (-1472 B), count=37 (-2), average=341 B
A 2020-04-14T21:43:44Z /home/vmagent/app/externalping/AppengineLogHandler.py:106: size=58.8 KiB (+1504 B), count=742 (+10), average=81 B
A 2020-04-14T21:43:44Z /opt/python3.7/lib/python3.7/tracemalloc.py:185: size=29.2 KiB (-60.6 KiB), count=466 (-970), average=64 B
A 2020-04-14T21:43:44Z /opt/python3.7/lib/python3.7/tracemalloc.py:113: size=1152 B (-97.0 KiB), count=12 (-1035), average=96 B
A 2020-04-14T21:43:44Z [ Top 20 differences ]

After roughly 1 hour:

2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/pyasn1/type/base.py:54: size=1870 KiB (+480 B), count=21036 (+6), average=91 BA 
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/grpc/_common.py:70: size=936 B (+513 B), count=12 (+7), average=78 BA 
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/grpc/_channel.py:770: size=960 B (+560 B), count=12 (+7), average=80 BA 
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/pyramid/router.py:139: size=576 B (+576 B), count=1 (+1), average=576 BA 
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/urllib3/util/retry.py:196: size=1839 KiB (+588 B), count=12808 (+4), average=147 BA 
2020-04-14T22:39:23Z /opt/python3.7/lib/python3.7/threading.py:235: size=1048 B (+592 B), count=5 (+2), average=210 BA 
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/google/protobuf/json_format.py:481: size=1200 B (-600 B), count=2 (-1), average=600 BA 
2020-04-14T22:39:23Z /home/vmagent/app/externalping/AppengineLogHandler.py:106: size=451 KiB (+672 B), count=5760 (+4), average=80 BA 
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/grpc/_channel.py:1353: size=1968 B (+712 B), count=17 (+5), average=116 BA 
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/grpc/_channel.py:99: size=744 B (+744 B), count=2 (+2), average=372 BA 
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/gunicorn/http/message.py:110: size=818 B (-766 B), count=12 (-11), average=68 BA 
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/gunicorn/http/wsgi.py:136: size=827 B (-827 B), count=12 (-12), average=69 BA 
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/pyasn1/compat/integer.py:99: size=2988 KiB (+956 B), count=16005 (+5), average=191 BA 
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/grpc/_channel.py:1149: size=1728 B (+1008 B), count=12 (+7), average=144 BA 
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/gunicorn/http/wsgi.py:139: size=1112 B (-1112 B), count=1 (-1), average=1112 BA 
2020-04-14T22:39:23Z /env/lib/python3.7/functools.py:60: size=675 KiB (+1168 B), count=6627 (+7), average=104 BA 
2020-04-14T22:39:23Z /env/lib/python3.7/site-packages/pyasn1/type/base.py:373: size=3045 KiB (+1176 B), count=18740 (+7), average=166 BA 
2020-04-14T22:39:23Z /opt/python3.7/lib/python3.7/tracemalloc.py:185: size=67.7 KiB (-59.5 KiB), count=1083 (-952), average=64 BA 
2020-04-14T22:39:23Z /opt/python3.7/lib/python3.7/tracemalloc.py:113: size=1152 B (-98.7 KiB), count=12 (-1053), average=96 BA 
2020-04-14T22:39:23Z [ Top 20 differences ] 

Here is a screenshot of the memory usage on the instance

At 11am I changed it to an instance with more memory. This is where it's sitting right now.

I've tried gc.collect() after the handler is done, del taskqueue after the task is created, and taskqueue.client.transport.channel.close(), but nothing keeps memory usage in check. I'm not sure what else I can do here or what other logs I can provide to help nail this down. Any help would be greatly appreciated.

Can't assign dispatch deadline

The tasks I create are killed with "(504) DEADLINE_EXCEEDED" if they run longer than 60 seconds. So I try to increase the dispatch deadline to the maximum like this:

def send_task(payload, queue, uri, *args):
    url = f'https://www.mywebsite.com/{uri}'
    payload = json.dumps(payload)
    payload = payload.encode()
    parent = client.queue_path(project=project, location=location, queue=queue)
    service_account_email = 'myaccount.com'
    # Construct the request body.
    td = timedelta(minutes=30)
    duration = duration_pb2.Duration()
    time = duration.FromTimedelta(td)
    task = {
        'http_request': {  # Specify the type of request.
            'http_method': tasks.HttpMethod.POST,
            'url': url,
            'body': payload, # Convert dictionary to string
            'headers': { # Add custom header
                'Content-Type': 'application/json'
            },
            'oidc_token': {'service_account_email': service_account_email}
        }
    }
    task['dispatch_deadline'] = time
    response = client.create_task(request={"parent": parent, "task": task})

I tried to do it with timestamp_pb2 as well, no luck. Is this a bug or am I missing something?
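One detail that may be relevant here (an observation, not a confirmed diagnosis): the protobuf helper `Duration.FromTimedelta()` populates the message in place and returns `None`, so `time = duration.FromTimedelta(td)` leaves `time` set to `None`. A minimal sketch that assigns the populated `Duration` message itself, reusing the names from the snippet above:

from datetime import timedelta

from google.protobuf import duration_pb2

# Build a Duration for the 30-minute dispatch deadline.
deadline = duration_pb2.Duration()
deadline.FromTimedelta(timedelta(minutes=30))  # fills the message in place; returns None

task['dispatch_deadline'] = deadline  # assign the message, not the return value
response = client.create_task(request={"parent": parent, "task": task})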

Creation of tasks in 2.0 fails

After upgrading to 2.0.0 I am getting the following whenever I am trying to create tasks:

    tasks_client = await sync_to_async(tasks_v2.CloudTasksClient)()
    parent = tasks_client.queue_path(project, loc, name)

    # Construct the request body.
    task = {
            'http_request': {  # Specify the type of request.
                'http_method': 'POST',
                'url': url
            }
    }

    # The API expects a payload of type bytes.
    converted_payload = payload.encode()

    # Add the payload to the request.
    task['http_request']['body'] = converted_payload

    # create the task
    response = await sync_to_async(tasks_client.create_task)(parent, task)

Reproducible in Cloud Run running Python 3.8.2, as well as locally under Windows.
pip 20.1.1 locally.

create_task() takes from 1 to 2 positional arguments but 3 were given

Exact same code works in 1.5.0.

I am guessing something changed in 2.0.0, but at the time of writing the release notes are not published, and I didn't have time to debug 2.0.0 to figure it out on my own.
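For what it's worth, the 2.x clients generated by the microgenerator no longer accept multiple positional arguments; a standalone sketch of the new calling conventions (project, location, queue and URL below are placeholders):

from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path("my-project", "us-central1", "my-queue")
task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": "https://example.com/handler",  # placeholder URL
    }
}

# 2.x clients accept a single request object/dict ...
response = client.create_task(request={"parent": parent, "task": task})

# ... or the individual request fields as keyword arguments.
response = client.create_task(parent=parent, task=task)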

Clear example of usage for `create_task`

Here's the documentation example for creating a task:
image

Is it just me, or is it missing the important parts?
Specifically, I would like to know how to provide the URL of the handler I want the task to call, and second, I'd like to send a payload.
I'm going to research the docs now. Thanks
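For reference, a minimal sketch of the two missing pieces — the handler URL goes in http_request.url and the payload in http_request.body, which must be bytes. This assumes the 2.x client; project, location, queue and URL are placeholders:

import json

from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path("my-project", "us-central1", "my-queue")

payload = json.dumps({"data": 1}).encode()  # the body must be bytes

task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": "https://example.com/task_handler",  # the handler the task should call
        "headers": {"Content-Type": "application/json"},
        "body": payload,
    }
}

response = client.create_task(request={"parent": parent, "task": task})
print(f"Created task {response.name}")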

samples.snippets.list_queues_test: test_list_queues_not_present failed

This test failed!

To configure my behavior, see the Flaky Bot documentation.

If I'm commenting on this issue too often, add the flakybot: quiet label and
I will stop commenting.


commit: 47b9c1c
buildURL: Build Status, Sponge
status: failed

Test output
args = (parent: "projects/python-docs-samples-tests/locations/us-central1"
page_token: "CmQKK3F1ZXVlLWQ1NjNjODA2LTZmMjQtNGUxNi1hNTdiLWFjZmYwYjFlNzVjMgASCwij57WHBhCFj5djGigqGXB5dGhvbi1kb2NzLXNhbXBsZXMtdGVzdHMyC3VzLWNlbnRyYWwx"
,)
kwargs = {'metadata': [('x-goog-request-params', 'parent=projects/python-docs-samples-tests/locations/us-central1'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')], 'timeout': 10.0}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:67:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7ff6c68630a0>
request = parent: "projects/python-docs-samples-tests/locations/us-central1"
page_token: "CmQKK3F1ZXVlLWQ1NjNjODA2LTZmMjQtNGUxNi1hNTdiLWFjZmYwYjFlNzVjMgASCwij57WHBhCFj5djGigqGXB5dGhvbi1kb2NzLXNhbXBsZXMtdGVzdHMyC3VzLWNlbnRyYWwx"

timeout = 10.0
metadata = [('x-goog-request-params', 'parent=projects/python-docs-samples-tests/locations/us-central1'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:946:


state = <grpc._channel._RPCState object at 0x7ff6c8b04a00>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7ff6c8b3c340>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626174371.410723544","description":"Error received from peer ipv4:74.125.195.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

capsys = <_pytest.capture.CaptureFixture object at 0x7ff6c69dd580>

def test_list_queues_not_present(capsys):
  list_queues.list_queues(TEST_PROJECT_ID, TEST_LOCATION)

list_queues_test.py:44:


list_queues.py:32: in list_queues
for queue in response:
../../google/cloud/tasks_v2/services/cloud_tasks/pagers.py:87: in __iter__
for page in self.pages:
../../google/cloud/tasks_v2/services/cloud_tasks/pagers.py:83: in pages
self._response = self._method(self._request, metadata=self._metadata)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:285: in retry_wrapped_func
return retry_target(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:188: in retry_target
return target()
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/timeout.py:102: in func_with_timeout
return func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAUTHENTICATED
details = "Request had invalid a...entication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"

???
E google.api_core.exceptions.Unauthenticated: 401 Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.

:3: Unauthenticated

Synthesis failed for python-tasks

Hello! Autosynth couldn't regenerate python-tasks. 💔

Here's the output from running synth.py:

Cloning into 'working_repo'...
Switched to branch 'autosynth'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/synth.py.
On branch autosynth
nothing to commit, working tree clean
HEAD detached at FETCH_HEAD
nothing to commit, working tree clean
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:6aec9c34db0e4be221cdaf6faba27bdc07cfea846808b3d3b964dfce3a9a0f9b
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/cloud/tasks/artman_cloudtasks_v2beta2.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta2.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta2/cloudtasks.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta2/google/cloud/tasks_v2beta2/proto/cloudtasks.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta2/queue.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta2/google/cloud/tasks_v2beta2/proto/queue.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta2/target.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta2/google/cloud/tasks_v2beta2/proto/target.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta2/task.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta2/google/cloud/tasks_v2beta2/proto/task.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta2/google/cloud/tasks_v2beta2/proto.
synthtool > Replaced '(Google IAM .*?_) ' in google/cloud/tasks_v2beta2/gapic/cloud_tasks_client.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta2/proto/queue_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta2/proto/task_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta2/proto/target_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta2/proto/cloudtasks_pb2.py.
synthtool > Running generator for google/cloud/tasks/artman_cloudtasks_v2beta3.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta3.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta3/cloudtasks.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta3/google/cloud/tasks_v2beta3/proto/cloudtasks.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta3/queue.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta3/google/cloud/tasks_v2beta3/proto/queue.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta3/target.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta3/google/cloud/tasks_v2beta3/proto/target.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2beta3/task.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta3/google/cloud/tasks_v2beta3/proto/task.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2beta3/google/cloud/tasks_v2beta3/proto.
synthtool > Replaced '(Google IAM .*?_) ' in google/cloud/tasks_v2beta3/gapic/cloud_tasks_client.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta3/proto/queue_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta3/proto/task_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta3/proto/target_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2beta3/proto/cloudtasks_pb2.py.
synthtool > Running generator for google/cloud/tasks/artman_cloudtasks_v2.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2.
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2/cloudtasks.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2/google/cloud/tasks_v2/proto/cloudtasks.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2/queue.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2/google/cloud/tasks_v2/proto/queue.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2/target.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2/google/cloud/tasks_v2/proto/target.proto
synthtool > Copy: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/tasks/v2/task.proto to /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2/google/cloud/tasks_v2/proto/task.proto
synthtool > Placed proto files into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/python/tasks-v2/google/cloud/tasks_v2/proto.
synthtool > Replaced '(Google IAM .*?_) ' in google/cloud/tasks_v2/gapic/cloud_tasks_client.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2/proto/queue_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2/proto/task_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2/proto/target_pb2.py.
synthtool > Replaced '>`__' in google/cloud/tasks_v2/proto/cloudtasks_pb2.py.
synthtool > Replaced '(in queue.yaml/xml) <\n\\s+' in google/cloud/tasks_v2beta2/proto/queue_pb2.py.
synthtool > Replaced '#retry_parameters>\n          `__\\.' in google/cloud/tasks_v2/proto/queue_pb2.py.
synthtool > Replaced '>>> # TODO: Initialize `queue`:' in google/cloud/tasks_v2beta3/gapic/cloud_tasks_client.py.
synthtool > Replaced '^(\\s+)>>> queue = {}\n' in google/cloud/tasks_v2beta3/gapic/cloud_tasks_client.py.
synthtool > Replaced 'types\\.View' in google/cloud/tasks_v2beta2/gapic/cloud_tasks_client.py.
synthtool > Replaced 'types\\.View' in google/cloud/tasks_v2/gapic/cloud_tasks_client.py.
synthtool > Replaced 'types\\.View' in google/cloud/tasks_v2beta3/gapic/cloud_tasks_client.py.
synthtool > Replaced '            retry \\(Optional\\[google\\.api_core\\.retry\\.Retry\\]\\):  A retry object used\n                to retry requests\\. If ``None`` is specified, requests will\n                be retried using a default configuration\\.\n            timeout \\(Optional\\[float\\]\\): The amount of time, in seconds, to wait\n                for the request to complete\\. Note that if ``retry`` is\n                specified, the timeout applies to each individual attempt\\.\n            metadata \\(Optional\\[Sequence\\[Tuple\\[str, str\\]\\]\\]\\): Additional metadata\n                that is provided to the method\\.\n\n' in google/cloud/tasks_v2beta2/gapic/cloud_tasks_client.py.
synthtool > Replaced '            retry \\(Optional\\[google\\.api_core\\.retry\\.Retry\\]\\):  A retry object used\n                to retry requests\\. If ``None`` is specified, requests will\n                be retried using a default configuration\\.\n            timeout \\(Optional\\[float\\]\\): The amount of time, in seconds, to wait\n                for the request to complete\\. Note that if ``retry`` is\n                specified, the timeout applies to each individual attempt\\.\n            metadata \\(Optional\\[Sequence\\[Tuple\\[str, str\\]\\]\\]\\): Additional metadata\n                that is provided to the method\\.\n\n' in google/cloud/tasks_v2/gapic/cloud_tasks_client.py.
synthtool > Replaced '            retry \\(Optional\\[google\\.api_core\\.retry\\.Retry\\]\\):  A retry object used\n                to retry requests\\. If ``None`` is specified, requests will\n                be retried using a default configuration\\.\n            timeout \\(Optional\\[float\\]\\): The amount of time, in seconds, to wait\n                for the request to complete\\. Note that if ``retry`` is\n                specified, the timeout applies to each individual attempt\\.\n            metadata \\(Optional\\[Sequence\\[Tuple\\[str, str\\]\\]\\]\\): Additional metadata\n                that is provided to the method\\.\n\n' in google/cloud/tasks_v2beta3/gapic/cloud_tasks_client.py.
.coveragerc
.flake8
.github/CONTRIBUTING.md
.github/ISSUE_TEMPLATE/bug_report.md
.github/ISSUE_TEMPLATE/feature_request.md
.github/ISSUE_TEMPLATE/support_request.md
.github/PULL_REQUEST_TEMPLATE.md
.github/release-please.yml
.gitignore
.kokoro/build.sh
.kokoro/continuous/common.cfg
.kokoro/continuous/continuous.cfg
.kokoro/docs/common.cfg
.kokoro/docs/docs.cfg
.kokoro/presubmit/common.cfg
.kokoro/presubmit/presubmit.cfg
.kokoro/publish-docs.sh
.kokoro/release.sh
.kokoro/release/common.cfg
.kokoro/release/release.cfg
.kokoro/trampoline.sh
CODE_OF_CONDUCT.md
CONTRIBUTING.rst
LICENSE
MANIFEST.in
docs/_static/custom.css
docs/_templates/layout.html
docs/conf.py.j2
noxfile.py.j2
renovate.json
setup.cfg
Running session blacken
Creating virtual environment (virtualenv) using python3.6 in .nox/blacken
pip install black==19.3b0
Error: pip is not installed into the virtualenv, it is located at /tmpfs/src/git/autosynth/env/bin/pip. Pass external=True into run() to explicitly allow this.
Session blacken failed.
synthtool > Failed executing nox -s blacken:

None
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
  File "/tmpfs/src/git/autosynth/working_repo/synth.py", line 121, in <module>
    s.shell.run(["nox", "-s", "blacken"], hide_output=False)
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
    encoding="utf-8",
  File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 418, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['nox', '-s', 'blacken']' returned non-zero exit status 1.

Synthesis failed

Google internal developers can see the full log here.

gunicorn update from 19.x.x to 20.x.x results in 503 Deadline Exceeded

Environment details

  • OS type and version:
  • Python version: 3.6
  • pip version: 10.0.1
  • google-cloud-tasks version: 1.5.0
  • Django: 3.0.5
  • Gunicorn: 20.0.0

Steps to reproduce

  1. Deploy Django server to GAE Flex environment with gunicorn version 20.0.0
  2. Attempt to queue a cloud task
  3. Receive an error google.api_core.exceptions.ServiceUnavailable: 503 Deadline Exceeded and the task is not created.
  4. If we revert the gunicorn version to 19.9.0 and redeploy, we are able to successfully queue a task.

Code example

    from google.cloud import tasks_v2
    project = getattr(settings, 'GCLOUD_PROJECT')
    location = getattr(settings, 'LOCATION')
    client = tasks_v2.CloudTasksClient()
    parent = client.queue_path(project, location, queue)
    task = {
        'app_engine_http_request': {  # Specify the type of request.
            'http_method': 'POST',
            'relative_uri': <>,
            'headers': {
                'Content-Type': 'application/json'
            }
        }
    }
    response = client.create_task(parent, task)

Stack trace

Traceback (most recent call last):
  File "/home/vmagent/app/cache/views/tasks/cron_task_viewset.py", line 647, in cache
    object_processing_uri=PATH_QUEUE_TASKS + "cache/",
  File "/home/vmagent/app/utils/tasks/__init__.py", line 148, in chunk
    task_generator.queue_task()
  File "/home/vmagent/app/utils/tasks/__init__.py", line 90, in queue_task
    response = self.client.create_task(self.parent, task)
  File "/env/lib/python3.6/site-packages/google/cloud/tasks_v2/gapic/cloud_tasks_client.py", line 1508, in create_task
    request, retry=retry, timeout=timeout, metadata=metadata
  File "/env/lib/python3.6/site-packages/google/api_core/gapic_v1/method.py", line 143, in __call__
    return wrapped_func(*args, **kwargs)
  File "/env/lib/python3.6/site-packages/google/api_core/retry.py", line 286, in retry_wrapped_func
    on_error=on_error,
  File "/env/lib/python3.6/site-packages/google/api_core/retry.py", line 184, in retry_target
    return target()
  File "/env/lib/python3.6/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
    return func(*args, **kwargs)
  File "/env/lib/python3.6/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.ServiceUnavailable: 503 Deadline Exceeded

There appear to be breaking changes in gunicorn 20.x.x and google-cloud-tasks needs to be updated accordingly.

I hope that the above is helpful.

Thanks!

Synthesis failed for python-tasks

Hello! Autosynth couldn't regenerate python-tasks. 💔

Here's the output from running synth.py:

zel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:209:1
Analyzing: target //google/cloud/tasks/v2beta2:tasks-v2beta2-py (1 packages loaded, 0 targets configured)
INFO: Call stack for the definition of repository 'go_sdk' which is a _go_download_sdk (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/io_bazel_rules_go/go/private/sdk.bzl:79:20):
 - <builtin>
 - /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/io_bazel_rules_go/go/private/sdk.bzl:92:5
 - /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/io_bazel_rules_go/go/private/sdk.bzl:260:13
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:81:1
ERROR: While resolving toolchains for target @pypi_black//:black: invalid registered toolchain '@gapic_generator_python//:pyenv3_toolchain': no such package '@gapic_generator_python//': The repository '@gapic_generator_python' could not be resolved
ERROR: Analysis of target '//google/cloud/tasks/v2beta2:tasks-v2beta2-py' failed; build aborted: invalid registered toolchain '@gapic_generator_python//:pyenv3_toolchain': no such package '@gapic_generator_python//': The repository '@gapic_generator_python' could not be resolved
INFO: Elapsed time: 4.019s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (20 packages loaded, 14 targets configured)
FAILED: Build did NOT complete successfully (20 packages loaded, 14 targets configured)

ERROR:synthtool:Failed executing bazel --max_idle_secs=60 build //google/cloud/tasks/v2beta2:tasks-v2beta2-py:

Loading: 
Loading: 0 packages loaded
Loading: 0 packages loaded
Loading: 0 packages loaded
INFO: SHA256 (https://github.com/googleapis/gapic-generator/archive/4cb5d58f258afdb8abc0b99706370b4a59252b22.zip) = 3cb59685c8a4ae3db1dec60b286f22d8d3aa3d3b36bb08bf003b5e088fac83cb
DEBUG: Rule 'com_google_api_codegen' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "3cb59685c8a4ae3db1dec60b286f22d8d3aa3d3b36bb08bf003b5e088fac83cb"
DEBUG: Call stack for the definition of repository 'com_google_api_codegen' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:62:1
Loading: 0 packages loaded
INFO: SHA256 (https://github.com/googleapis/protoc-java-resource-names-plugin/archive/5bd90a1f67c1c128291702cc320d667060f40f95.zip) = c3c0661b6c30fce5c63b1d5f473b1c6c4d59e19853ce3b9e8f5a447f953af906
DEBUG: Rule 'com_google_protoc_java_resource_names_plugin' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "c3c0661b6c30fce5c63b1d5f473b1c6c4d59e19853ce3b9e8f5a447f953af906"
DEBUG: Call stack for the definition of repository 'com_google_protoc_java_resource_names_plugin' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:155:1
INFO: SHA256 (https://github.com/googleapis/gapic-generator-go/archive/v0.13.2.tar.gz) = ab7a2ffd74e6a6dac6da38027d4acadb84d0075c055289e3335d86a46f9f3b22
DEBUG: Rule 'com_googleapis_gapic_generator_go' indicated that a canonical reproducible form can be obtained by modifying arguments sha256 = "ab7a2ffd74e6a6dac6da38027d4acadb84d0075c055289e3335d86a46f9f3b22"
DEBUG: Call stack for the definition of repository 'com_googleapis_gapic_generator_go' which is a http_archive (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/bazel_tools/tools/build_defs/repo/http.bzl:296:16):
 - <builtin>
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:209:1
Analyzing: target //google/cloud/tasks/v2beta2:tasks-v2beta2-py (1 packages loaded, 0 targets configured)
INFO: Call stack for the definition of repository 'go_sdk' which is a _go_download_sdk (rule definition at /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/io_bazel_rules_go/go/private/sdk.bzl:79:20):
 - <builtin>
 - /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/io_bazel_rules_go/go/private/sdk.bzl:92:5
 - /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/io_bazel_rules_go/go/private/sdk.bzl:260:13
 - /home/kbuilder/.cache/synthtool/googleapis/WORKSPACE:81:1
ERROR: While resolving toolchains for target @pypi_black//:black: invalid registered toolchain '@gapic_generator_python//:pyenv3_toolchain': no such package '@gapic_generator_python//': The repository '@gapic_generator_python' could not be resolved
ERROR: Analysis of target '//google/cloud/tasks/v2beta2:tasks-v2beta2-py' failed; build aborted: invalid registered toolchain '@gapic_generator_python//:pyenv3_toolchain': no such package '@gapic_generator_python//': The repository '@gapic_generator_python' could not be resolved
INFO: Elapsed time: 4.019s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (20 packages loaded, 14 targets configured)
FAILED: Build did NOT complete successfully (20 packages loaded, 14 targets configured)

2020-06-20 08:40:45,152 synthtool [DEBUG] > Wrote metadata to synth.metadata.
DEBUG:synthtool:Wrote metadata to synth.metadata.
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/kbuilder/.cache/synthtool/python-tasks/synth.py", line 35, in <module>
    include_protos=True,
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 46, in py_library
    return self._generate_code(service, version, "python", **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 180, in _generate_code
    shell.run(bazel_run_args)
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
    encoding="utf-8",
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=60', 'build', '//google/cloud/tasks/v2beta2:tasks-v2beta2-py']' returned non-zero exit status 1.
2020-06-20 08:40:45,201 autosynth [ERROR] > Synthesis failed
2020-06-20 08:40:45,201 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 01304e7 chore: Migrate python-tasks synth.py from artman to bazel (#21)
2020-06-20 08:40:45,206 autosynth [DEBUG] > Running: git checkout autosynth-self
Switched to branch 'autosynth-self'
2020-06-20 08:40:45,211 autosynth [ERROR] > Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.
2020-06-20 08:40:45,493 autosynth [INFO] > PR already exists: https://github.com/googleapis/python-tasks/pull/24
2020-06-20 08:40:45,493 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Removing google/__pycache__/
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 649, in <module>
    main()
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 506, in main
    return _inner_main(temp_dir)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 629, in _inner_main
    commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 367, in synthesize_loop
    synthesize_inner_loop(fork, synthesizer)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 411, in synthesize_inner_loop
    synthesizer, len(toolbox.versions) - 1
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 266, in synthesize_version_in_new_branch
    synthesizer.synthesize(synth_log_path, self.environ)
  File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
    synth_proc.check_returncode()  # Raise an exception.
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
    self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.

Google internal developers can see the full log here.

Dependency Dashboard

This issue provides visibility into Renovate updates and their statuses. Learn more

This repository currently has no open or pending branches.


  • Check this box to trigger a request for Renovate to run again on this repository

CPU grows under load

Hello,

I am using the Python Cloud Tasks client for communication between microservices.
For the last few days I have been running load tests to check for bottlenecks in my application and have noticed that CPU usage always grows during a long load test. This happens when I call the create_task function to submit a task to the queue.

# app.py

@root.route('/webhook', methods=['POST'])
def webhook():
   send_task(...) 
# utils.py

cloud_task_client = tasks_v2.CloudTasksClient()
queue_path = cloud_task_client.queue_path(config.PROJECT_ID, config.LOCATION, config.QUEUE_NAME)

def send_task(body: dict) -> None:
    task = {
        'http_request': {
            'http_method': 'POST',
            'url': config.URL,
            'headers': {},
            'body': json.dumps(body).encode()
        }
    }
    cloud_task_client.create_task(request={'parent': queue_path, 'task': task})
CMD ["gunicorn", "-w", "4", "-b", ":5000", "app.wsgi:app"]
google-cloud-tasks==2.0.0
Flask==1.1.1
gunicorn==20.0.4
dialogflow==1.0.0
mysql-connector-python==8.0.20
python-json-logger==0.1.11

When I call the /webhook endpoint at 2 requests per second (I use Locust for load tests), my CPU usage always goes up every minute. After a roughly 10-minute load test my Kubernetes limits/resources are exhausted and my app crashes.

Have any ideas why this is happening?

image
image

Transactional Enqueuing

Is your feature request related to a problem? Please describe.
We wish to migrate our app from Python 2 to Python 3, and we rely heavily on transactional enqueuing. The migration doc does not explain how this might work with Cloud NDB and Cloud Tasks.

Describe the solution you'd like
There should be some way of attaching actions to the transaction, with or without a limit on the number of allowed actions.

Describe alternatives you've considered
We considered storing the tasks in Datastore during the transaction and dequeuing them to run once the transaction completes, but that seems to consume an entity group (the 25-entity-group limit), which is not feasible in our case either.

Additional context
N/A
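For what it's worth, a rough sketch of a non-transactional workaround (assuming, on my part, that enqueuing only after commit is acceptable for the use case): collect the task definitions inside the transaction and create them only once ndb.transaction() returns successfully. Project, location, queue and URI below are placeholders.

from google.cloud import ndb, tasks_v2

ndb_client = ndb.Client()
tasks_client = tasks_v2.CloudTasksClient()
parent = tasks_client.queue_path("my-project", "us-central1", "my-queue")

def do_work_transactionally():
    pending_tasks = []

    def txn():
        # ... Datastore mutations go here ...
        # Instead of enqueuing inside the transaction, remember what to enqueue.
        pending_tasks.append({
            "app_engine_http_request": {
                "http_method": tasks_v2.HttpMethod.POST,
                "relative_uri": "/after_commit",
            }
        })

    with ndb_client.context():
        ndb.transaction(txn)

    # Only runs if the transaction committed without raising.
    for task in pending_tasks:
        tasks_client.create_task(request={"parent": parent, "task": task})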

samples.snippets.create_http_task_with_token_test: test_create_http_task_with_token failed

This test failed!

To configure my behavior, see the Flaky Bot documentation.

If I'm commenting on this issue too often, add the flakybot: quiet label and
I will stop commenting.


commit: 47b9c1c
buildURL: Build Status, Sponge
status: failed

Test output
args = (name: "projects/python-docs-samples-tests/locations/us-central1/queues/my-queue-52e76217567c4303a226231a10cce1c7"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/python-docs-samples-tests/locations/us-central1/queues/my-queue-52e76217567c4303a226231a10cce1c7'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]}
@six.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
    try:
      return callable_(*args, **kwargs)

.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:67:


self = <grpc._channel._UnaryUnaryMultiCallable object at 0x7ff6c8aaf640>
request = name: "projects/python-docs-samples-tests/locations/us-central1/queues/my-queue-52e76217567c4303a226231a10cce1c7"

timeout = None
metadata = [('x-goog-request-params', 'name=projects/python-docs-samples-tests/locations/us-central1/queues/my-queue-52e76217567c4303a226231a10cce1c7'), ('x-goog-api-client', 'gl-python/3.8.8 grpc/1.38.1 gax/1.31.0 gapic/2.4.0')]
credentials = None, wait_for_ready = None, compression = None

def __call__(self,
             request,
             timeout=None,
             metadata=None,
             credentials=None,
             wait_for_ready=None,
             compression=None):
    state, call, = self._blocking(request, timeout, metadata, credentials,
                                  wait_for_ready, compression)
  return _end_unary_response_blocking(state, call, False, None)

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:946:


state = <grpc._channel._RPCState object at 0x7ff6c8ae7760>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7ff6c8b00b80>
with_call = False, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
    if state.code is grpc.StatusCode.OK:
        if with_call:
            rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
            return state.response, rendezvous
        else:
            return state.response
    else:
      raise _InactiveRpcError(state)

E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAUTHENTICATED
E details = "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."
E debug_error_string = "{"created":"@1626174369.983732944","description":"Error received from peer ipv4:173.194.203.95:443","file":"src/core/lib/surface/call.cc","file_line":1066,"grpc_message":"Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"
E >

.nox/py-3-8/lib/python3.8/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

@pytest.fixture()
def test_queue():
    client = tasks_v2.CloudTasksClient()
    parent = f"projects/{TEST_PROJECT_ID}/locations/{TEST_LOCATION}"
    queue = {
        # The fully qualified path to the queue
        "name": client.queue_path(TEST_PROJECT_ID, TEST_LOCATION, TEST_QUEUE_NAME),
    }
    q = client.create_queue(request={"parent": parent, "queue": queue})

    yield q
  client.delete_queue(request={"name": q.name})

create_http_task_with_token_test.py:43:


../../google/cloud/tasks_v2/services/cloud_tasks/client.py:814: in delete_queue
rpc(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
return wrapped_func(*args, **kwargs)
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:285: in retry_wrapped_func
return retry_target(
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:188: in retry_target
return target()
.nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)


value = None
from_value = <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAUTHENTICATED
details = "Request had invalid a...entication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","grpc_status":16}"

???
E google.api_core.exceptions.Unauthenticated: 401 Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.

:3: Unauthenticated

possible memory leak while creating tasks

We are using Google Cloud Tasks to offload asynchronous work. We have an API endpoint that publishes notifications to subscribers; it fetches subscribers from the database and schedules multiple tasks to invoke other endpoints such as /send-sms or /send-email.

It seems that the process of creating a task isn't releasing memory.

Environment details

  • OS type and version: python:3.7-slim
  • Python version: 3.7
  • pip version: 20.2.4
  • google-cloud-tasks==1.3.0

Steps to reproduce

  1. Create Cloud Task

Code example

        LOGGER.info(f"Task create request")
        payload = copy.deepcopy(payload)
        client = tasks_v2beta3.CloudTasksClient()
        # Construct the fully qualified queue name.
        parent = client.queue_path(
            constants.PROJECT_ID,
            constants.LOCATION_ID, tasks_queue
        )

        if payload is not None:
            # The API expects a payload of type bytes.
            payload = json.dumps(payload).encode()

        # Construct the request body.
        task = {
            "http_request": {
                "headers": self.headers,
                "http_method": "POST",
                "url": url,
                "body": payload,
            }
        }

        # Default delay of 2 seconds
        in_seconds = 5

        if delay_hours is not None:
            # Convert into seconds
            in_seconds = delay_hours * 3600

        # Convert "seconds from now" into an rfc3339 datetime string.
        d = datetime.datetime.utcnow() + datetime.timedelta(seconds=in_seconds)

        # Create Timestamp protobuf.
        timestamp = timestamp_pb2.Timestamp()
        timestamp.FromDatetime(d)
        # Add the timestamp to the tasks.
        task["schedule_time"] = timestamp

        try:
            LOGGER.info(f"Creating task on queue {parent} with following config {task}")
            # Use the client to build and send the task.
            response = client.create_task(parent, task, retry=self.getRetryParams())
            LOGGER.info(f"Created task {response.name}")
        except AlreadyExists as e:
            LOGGER.info(f"Task {task} already exist. Ignoring exception {str(e)}", exc_info=True)

Stack trace

 Line #    Mem usage    Increment  Occurences   Line Contents
 ============================================================
     54     68.8 MiB     68.8 MiB           1       @profile
     55                                             def create_task(self, delay_hours, payload, url, tasks_queue):
     56                                                 """
     57                                                 Create Task in respective Queue
     58                                                 """
     59     68.8 MiB      0.0 MiB           1           LOGGER.info(f"Task create request")
     60     68.8 MiB      0.0 MiB           1           payload = copy.deepcopy(payload)
     61     68.8 MiB      0.0 MiB           1           client = tasks_v2beta3.CloudTasksClient()
     62                                                 # Construct the fully qualified queue name.
     63     68.8 MiB      0.0 MiB           1           parent = client.queue_path(
     64     68.8 MiB      0.0 MiB           1               constants.PROJECT_ID,
     65     68.8 MiB      0.0 MiB           1               constants.LOCATION_ID, tasks_queue
     66                                                 )
     67
     68     68.8 MiB      0.0 MiB           1           if payload is not None:
     69                                                     # The API expects a payload of type bytes.
     70     68.8 MiB      0.0 MiB           1               payload = json.dumps(payload).encode()
     71
     72                                                 # Construct the request body.
     73                                                 task = {
     74     68.8 MiB      0.0 MiB           1               "http_request": {
     75     68.8 MiB      0.0 MiB           1                   "headers": self.headers,
     76     68.8 MiB      0.0 MiB           1                   "http_method": "POST",
     77     68.8 MiB      0.0 MiB           1                   "url": url,
     78     68.8 MiB      0.0 MiB           1                   "body": payload,
     79                                                     }
     80                                                 }
     81
     82                                                 # Default delay of 2 seconds
     83     68.8 MiB      0.0 MiB           1           in_seconds = 5
     84
     85     68.8 MiB      0.0 MiB           1           if delay_hours is not None:
     86                                                     # Convert into seconds
     87                                                     in_seconds = delay_hours * 3600
     88
     89                                                 # Convert "seconds from now" into an rfc3339 datetime string.
     90     68.8 MiB      0.0 MiB           1           d = datetime.datetime.utcnow() + datetime.timedelta(seconds=in_seconds)
     91
     92                                                 # Create Timestamp protobuf.
     93     68.8 MiB      0.0 MiB           1           timestamp = timestamp_pb2.Timestamp()
     94     68.8 MiB      0.0 MiB           1           timestamp.FromDatetime(d)
     95                                                 # Add the timestamp to the tasks.
     96     68.8 MiB      0.0 MiB           1           task["schedule_time"] = timestamp
     97
     98     68.8 MiB      0.0 MiB           1           try:
     99     68.8 MiB      0.0 MiB           1               LOGGER.info(f"Creating task on queue {parent} with following config {task}")
    100                                                     # Use the client to build and send the task.
    101     69.8 MiB      1.0 MiB           1               response = client.create_task(parent, task, retry=self.getRetryParams())
    102     69.8 MiB      0.0 MiB           1               LOGGER.info(f"Created task {response.name}")
    103                                                 except AlreadyExists as e:
    104                                                     LOGGER.info(f"Task {task} already exist. Ignoring exception {str(e)}", exc_info=True)

Following are the memory footprints of the instance.

Screenshot 2021-02-24 at 11 13 27 PM

Screenshot 2021-02-24 at 11 13 50 PM

I have tried deleting the client object manually and calling gc.collect(), but nothing keeps memory usage in check. I am not sure what else we can do or what other information is required to nail down the issue.

Your help would be appreciated.
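One thing that may be worth trying (an assumption on my part, not a confirmed fix): the snippet above constructs a new CloudTasksClient, and therefore a new gRPC channel, on every call. A minimal sketch that reuses a single module-level client instead, using the 1.3.0 positional calling style from the snippet:

from google.cloud import tasks_v2beta3

# Create the client (and its underlying gRPC channel) once and reuse it
# across requests instead of constructing it inside every create_task call.
CLIENT = tasks_v2beta3.CloudTasksClient()

def enqueue(parent, task):
    # Reuses the module-level client; no new channel per task.
    return CLIENT.create_task(parent, task)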

Task level retries documentation wrong.

Hello.

In the documentation for create_task, retry is documented as
(Optional[google.api_core.retry.Retry]) – A retry object used to retry client library requests. If None is specified, requests will be retried using a default configuration.
However, according to this feature request, they are currently not supported by Cloud Tasks.
I tried it, and as of now the retry option appears to be simply ignored. Considering the linked issue, I'm guessing this is a problem with the documentation rather than the code.
If it is an issue with the code, the steps to reproduce amount to creating a task with a short deadline in a queue that has no deadline (or a longer one); the task-level deadline will be ignored.

_InactiveRpcError

Hi, team!

We constantly get ServiceUnavailable 503 "Connection reset by peer" errors.

Environment details

  • OS type and version: docker python3.9-slim
  • Python version: 3.9
  • pip version: 21.0.1
  • google-cloud-tasks version: 2.2.0

Code example

client = tasks_v2.CloudTasksClient()
queue_path = client.queue_path(
    settings.GCLOUD_PROJECT_ID,
    settings.GCLOUD_QUEUE_REGION,
    settings.GCLOUD_NEW_ORDER_QUEUE_NAME,
)
task = {
    'http_request': {
        'http_method': 'POST',
        'url': settings.ORDERS_URL + '/orders/',
        'body': json.dumps(payload).encode(),
    }
}

client.create_task(parent=queue_path, task=task)

Stack trace

_InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Connection reset by peer"
	debug_error_string = "{"created":"@1618704900.216123229","description":"Error received from peer ipv4:108.177.121.95:443","file":"src/core/lib/surface/call.cc","file_line":1067,"grpc_message":"Connection reset by peer","grpc_status":14}"
>

How to fix this problem?

Thanks!
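For reference, one way to make the call more tolerant of transient failures while the root cause is investigated (a sketch, assuming it is acceptable to retry UNAVAILABLE errors; tune the backoff values as needed):

from google.api_core import exceptions, retry

# Retry transient UNAVAILABLE errors with exponential backoff.
transient_retry = retry.Retry(
    predicate=retry.if_exception_type(exceptions.ServiceUnavailable),
    initial=1.0,      # first delay in seconds
    maximum=10.0,     # cap on the delay between attempts
    multiplier=2.0,
    deadline=60.0,    # give up after 60 seconds overall
)

client.create_task(parent=queue_path, task=task, retry=transient_retry)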

Cloud Tasks: 504 Deadline Exceeded (or 503 Deadline Exceeded) when trying to create a Cloud Task via Python client

Problem

When attempting to create a Cloud Task via any official Python client (e.g. tasks_v2, tasks_v2beta2, or tasks_v2beta3), a 504 Deadline Exceeded error is consistently returned. This occurs regardless of the authentication technique used.

Note: if the same thing is attempted (using the same Cloud Task queue) with a Golang client, e.g. the create_task application and example from here, the task is created successfully, i.e. the problem only seems to affect Python clients.

Environment details

OS type and version: Ubuntu 18.04
Python version: Python 3.6.9 and Python 3.8.0
Output of pip freeze:

google-api-core==1.22.1
google-auth==1.20.1
google-cloud-core==1.4.1
google-cloud-datastore==1.15.0
google-cloud-tasks==1.5.0
googleapis-common-protos==1.52.0
grpc-google-iam-v1==0.12.3
grpcio==1.31.0

Steps to reproduce

  1. Follow the Run the Sample Using the Command Line instructions.
  2. Alternatively, the issue can be reliably reproduced by following any Python example from the documentation which attempts to create a task, such as this one.

Code example

python create_app_engine_queue_task.py --project=$PROJECT_ID --queue=$QUEUE_ID --location=$LOCATION_ID --payload=hello

Stack trace

Traceback (most recent call last):
  File "/path/to/python3.6/site-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
    return callable_(*args, **kwargs)
  File "/path/to/python3.6/site-packages/grpc/_channel.py", line 826, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "/path/to/python3.6/site-packages/grpc/_channel.py", line 729, in _end_unary_response_blocking
    raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.DEADLINE_EXCEEDED
	details = "Deadline Exceeded"
	debug_error_string = "{"created":"@1598472279.156412856","description":"Deadline Exceeded","file":"src/core/ext/filters/deadline/deadline_filter.cc","file_line":69,"grpc_status":4}"
>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "create_app_engine_queue_task.py", line 110, in <module>
    args.payload, args.in_seconds)
  File "create_app_engine_queue_task.py", line 66, in create_task
    response = client.create_task(parent, task)
  File "/path/to/python3.6/site-packages/google/cloud/tasks_v2/gapic/cloud_tasks_client.py", line 1508, in create_task
    request, retry=retry, timeout=timeout, metadata=metadata
  File "/path/to/python3.6/site-packages/google/api_core/gapic_v1/method.py", line 145, in __call__
    return wrapped_func(*args, **kwargs)
  File "/path/to/python3.6/site-packages/google/api_core/retry.py", line 286, in retry_wrapped_func
    on_error=on_error,
  File "/path/to/python3.6/site-packages/google/api_core/retry.py", line 184, in retry_target
    return target()
  File "/path/to/python3.6/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
    return func(*args, **kwargs)
  File "/path/to/python3.6/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.DeadlineExceeded: 504 Deadline Exceeded

Note: if the client.create_task() call is made with a timeout parameter specified, an almost-identical error and stack-trace is returned, with the only differences being the status code (status = StatusCode.UNAVAILABLE) and exception thrown (google.api_core.exceptions.ServiceUnavailable: 503 Deadline Exceeded). It's possible, especially in light of the API error code documentation, that there is a discrepancy worth investigating here.

A similar (if not identical) problem seems to be mentioned here with regard to Pub/Sub Python clients, and other possibly related issue reports are this one from the python-docs-samples repository and #6060, as well as this one from the python-firestore repo.

Finally, it is probably worth mentioning this issue report which mentions that the gunicorn version used may be relevant. For what it's worth, I'm seeing the same error(s) regardless of the gunicorn version the task handler uses.
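For clarity, this is the shape of the call referenced in the note above about specifying a timeout (a sketch against the 1.5.0-style client; the 30-second value is arbitrary):

# Per-attempt timeout in seconds; with this set, the failure surfaces as
# 503 ServiceUnavailable instead of 504 DeadlineExceeded, as noted above.
response = client.create_task(parent, task, timeout=30.0)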

Synthesis failed for python-tasks

Hello! Autosynth couldn't regenerate python-tasks. 💔

Here's the output from running synth.py:

uirements.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/30/9e/f663a2aa66a09d838042ae1a2c5659828bb9b41ea3a6efa20a20fd92b121/Jinja2-2.11.2-py2.py3-none-any.whl
  Saved ./Jinja2-2.11.2-py2.py3-none-any.whl
Collecting MarkupSafe==1.1.1 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 5))
  Using cached https://files.pythonhosted.org/packages/b2/5f/23e0023be6bb885d00ffbefad2942bc51a620328ee910f64abe5a8d18dd1/MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl
  Saved ./MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl
Collecting protobuf==3.13.0 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/30/79/510974552cebff2ba04038544799450defe75e96ea5f1675dbf72cc8744f/protobuf-3.13.0-cp36-cp36m-manylinux1_x86_64.whl
  Saved ./protobuf-3.13.0-cp36-cp36m-manylinux1_x86_64.whl
Collecting pypandoc==1.5 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 7))
  Using cached https://files.pythonhosted.org/packages/d6/b7/5050dc1769c8a93d3ec7c4bd55be161991c94b8b235f88bf7c764449e708/pypandoc-1.5.tar.gz
    Complete output from command python setup.py egg_info:
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmpfs/tmp/tmp85fvma7i/setuptools-tmp/setuptools/__init__.py", line 6, in <module>
        import distutils.core
      File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/_distutils_hack/__init__.py", line 82, in create_module
        return importlib.import_module('._distutils', 'setuptools')
      File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/importlib/__init__.py", line 126, in import_module
        return _bootstrap._gcd_import(name[level:], package, level)
    ModuleNotFoundError: No module named 'setuptools._distutils'
    
    ----------------------------------------
 (Command "python setup.py egg_info" failed with error code 1 in /tmpfs/tmp/pip-build-sw1feq6e/pypandoc/
)
ERROR: no such package '@gapic_generator_python_pip_deps//': pip_import failed: Collecting click==7.1.2 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 1))
  Using cached https://files.pythonhosted.org/packages/d2/3d/fa76db83bf75c4f8d338c2fd15c8d33fdd7ad23a9b5e57eb6c5de26b430e/click-7.1.2-py2.py3-none-any.whl
  Saved ./click-7.1.2-py2.py3-none-any.whl
Collecting google-api-core==1.22.1 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 2))
  Using cached https://files.pythonhosted.org/packages/e0/2d/7c6c75013105e1d2b6eaa1bf18a56995be1dbc673c38885aea31136e9918/google_api_core-1.22.1-py2.py3-none-any.whl
  Saved ./google_api_core-1.22.1-py2.py3-none-any.whl
Collecting googleapis-common-protos==1.52.0 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 3))
  Using cached https://files.pythonhosted.org/packages/03/74/3956721ea1eb4bcf7502a311fdaa60b85bd751de4e57d1943afe9b334141/googleapis_common_protos-1.52.0-py2.py3-none-any.whl
  Saved ./googleapis_common_protos-1.52.0-py2.py3-none-any.whl
Collecting jinja2==2.11.2 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 4))
  Using cached https://files.pythonhosted.org/packages/30/9e/f663a2aa66a09d838042ae1a2c5659828bb9b41ea3a6efa20a20fd92b121/Jinja2-2.11.2-py2.py3-none-any.whl
  Saved ./Jinja2-2.11.2-py2.py3-none-any.whl
Collecting MarkupSafe==1.1.1 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 5))
  Using cached https://files.pythonhosted.org/packages/b2/5f/23e0023be6bb885d00ffbefad2942bc51a620328ee910f64abe5a8d18dd1/MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl
  Saved ./MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl
Collecting protobuf==3.13.0 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 6))
  Using cached https://files.pythonhosted.org/packages/30/79/510974552cebff2ba04038544799450defe75e96ea5f1675dbf72cc8744f/protobuf-3.13.0-cp36-cp36m-manylinux1_x86_64.whl
  Saved ./protobuf-3.13.0-cp36-cp36m-manylinux1_x86_64.whl
Collecting pypandoc==1.5 (from -r /home/kbuilder/.cache/bazel/_bazel_kbuilder/a732f932c2cbeb7e37e1543f189a2a73/external/gapic_generator_python/requirements.txt (line 7))
  Using cached https://files.pythonhosted.org/packages/d6/b7/5050dc1769c8a93d3ec7c4bd55be161991c94b8b235f88bf7c764449e708/pypandoc-1.5.tar.gz
    Complete output from command python setup.py egg_info:
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmpfs/tmp/tmp85fvma7i/setuptools-tmp/setuptools/__init__.py", line 6, in <module>
        import distutils.core
      File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/_distutils_hack/__init__.py", line 82, in create_module
        return importlib.import_module('._distutils', 'setuptools')
      File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/importlib/__init__.py", line 126, in import_module
        return _bootstrap._gcd_import(name[level:], package, level)
    ModuleNotFoundError: No module named 'setuptools._distutils'
    
    ----------------------------------------
 (Command "python setup.py egg_info" failed with error code 1 in /tmpfs/tmp/pip-build-sw1feq6e/pypandoc/
)
INFO: Elapsed time: 2.154s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
FAILED: Build did NOT complete successfully (0 packages loaded)

Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 102, in <module>
    main()
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/tmpfs/src/github/synthtool/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/__main__.py", line 94, in main
    spec.loader.exec_module(synth_module)  # type: ignore
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/kbuilder/.cache/synthtool/python-tasks/synth.py", line 35, in <module>
    include_protos=True,
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 46, in py_library
    return self._generate_code(service, version, "python", **kwargs)
  File "/tmpfs/src/github/synthtool/synthtool/gcp/gapic_bazel.py", line 183, in _generate_code
    shell.run(bazel_run_args)
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 39, in run
    raise exc
  File "/tmpfs/src/github/synthtool/synthtool/shell.py", line 33, in run
    encoding="utf-8",
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['bazel', '--max_idle_secs=240', 'build', '//google/cloud/tasks/v2beta2:tasks-v2beta2-py']' returned non-zero exit status 1.
2020-08-31 05:25:29,065 autosynth [ERROR] > Synthesis failed
2020-08-31 05:25:29,065 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at a88e42b chore: update Py2 support msg to reflect passage of time (#31)
2020-08-31 05:25:29,070 autosynth [DEBUG] > Running: git checkout autosynth
Switched to branch 'autosynth'
2020-08-31 05:25:29,075 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Removing google/__pycache__/
Traceback (most recent call last):
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 690, in <module>
    main()
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 539, in main
    return _inner_main(temp_dir)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 670, in _inner_main
    commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 375, in synthesize_loop
    has_changes = toolbox.synthesize_version_in_new_branch(synthesizer, youngest)
  File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 273, in synthesize_version_in_new_branch
    synthesizer.synthesize(synth_log_path, self.environ)
  File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 120, in synthesize
    synth_proc.check_returncode()  # Raise an exception.
  File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
    self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'synth.metadata', 'synth.py', '--']' returned non-zero exit status 1.

Google internal developers can see the full log here.

Inconsistent documentation on creating a task

This client library's documentation for the create_task function includes optional parameters for retry, timeout, and metadata, as shown here:
https://googleapis.dev/python/cloudtasks/latest/gapic/v2/api.html?highlight=create_task#google.cloud.tasks_v2.CloudTasksClient.create_task

However, the Cloud Tasks REST API does not include such parameters, as shown here:
https://cloud.google.com/tasks/docs/reference/rest/v2/projects.locations.queues.tasks/create

In fact, another REST API reference page explicitly states:

RetryConfig
Settings that determine the retry behavior.

For tasks created using Cloud Tasks: the queue-level retry settings apply to all tasks in the queue that were created using Cloud Tasks. Retry settings cannot be set on individual tasks.

For tasks created using the App Engine SDK: the queue-level retry settings apply to all tasks in the queue which do not have retry settings explicitly set on the task and were created by the App Engine SDK. See App Engine documentation.

https://googleapis.github.io/google-cloud-dotnet/docs/Google.Cloud.Tasks.V2/api/Google.Cloud.Tasks.V2.Queue.html#Google_Cloud_Tasks_V2_Queue_RetryConfig

It's not clear what to rely on here. Can you please clarify? Thanks.
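
For what it's worth, my current reading is that the two settings live at different layers, and a sketch may make the question more concrete. The following is illustrative only, not something I can confirm from the docs; the project, location, queue name and backoff values are placeholders, and the dict forms are the ones the pre-2.0 client accepts.

# Hypothetical sketch: queue-level RetryConfig vs. the create_task kwargs.
from google.cloud import tasks_v2
from google.protobuf import duration_pb2

client = tasks_v2.CloudTasksClient()
queue_name = client.queue_path("my-project", "us-central1", "my-queue")

# Queue-level RetryConfig: governs how Cloud Tasks re-dispatches a task to
# its handler after a failed attempt. Per the passage quoted above, this
# cannot be set on an individual task, only on the queue.
client.update_queue(
    {
        "name": queue_name,
        "retry_config": {
            "max_attempts": 5,
            "min_backoff": duration_pb2.Duration(seconds=1),
            "max_backoff": duration_pb2.Duration(seconds=60),
        },
    }
)

# create_task's retry/timeout/metadata parameters: these shape only the RPC
# from this process to the Cloud Tasks API (how long to wait for CreateTask
# to respond, whether to re-issue it, extra request metadata). They do not
# affect how the task is retried once it has been enqueued.
task = {"app_engine_http_request": {"relative_uri": "/example_task_handler"}}
response = client.create_task(queue_name, task, timeout=20.0)
print(response.name)

If that reading is right, the two documents describe different things rather than contradicting each other, but confirmation in the client library docs would still be helpful.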
