apitools's Introduction

DEPRECATED - Please see alternatives below

google-apitools

google-apitools is a collection of utilities to make it easier to build client-side tools, especially those that talk to Google APIs.

NOTE: This library is deprecated and unsupported. Please read below for suggested alternatives.

Alternatives to apitools

For the official Cloud client libraries for communicating with Google Cloud APIs, see https://cloud.google.com/apis/docs/cloud-client-libraries.

To generate Python API client libraries for APIs specified by protos, such as those inside Google, see https://github.com/googleapis/gapic-generator-python. API client library generators for other languages can be found in https://github.com/googleapis.

Installing as a library

To install the library into the current virtual environment:

$ pip install google-apitools

Installing the command-line tools

To install the command-line scripts into the current virtual environment:

$ pip install google-apitools[cli]

Running the tests

First, install the testing dependencies:

$ pip install google-apitools[testing]

and the nose testrunner:

$ pip install nose

Then run the tests:

$ nosetests


apitools's Issues

Upload.__StreamMedia breaks on non-seekable streams

I'd like to be able to use non-seekable streams, such as the werkzeug.wsgi.LimitedStream that Flask uses for streaming request and response data. This is how we do streaming to/from Google Cloud Storage. However, Upload.__StreamMedia calls stream.seek() to verify completion:

  File "site-packages/gcloud/storage/blob.py", line 350, in upload_from_file
    finish_callback=lambda *args: None)
  File "site-packages/apitools/base/py/transfer.py", line 807, in StreamInChunks
    additional_headers=additional_headers)
  File "site-packages/apitools/base/py/transfer.py", line 773, in __StreamMedia
    self.stream.seek(0, os.SEEK_END)
AttributeError: 'LimitedStream' object has no attribute 'seek'

There's even a TODO:

# TODO(craigcitro): Decide how to handle errors in the
      # non-seekable case.

Add more integration tests.

apitools is largely tested by a set of integration tests, which currently aren't included in this repo. I need to:

  • add them, and
  • get Travis to run them.

Helpstring in gen_client is incorrect for discovery_url

The gen_client help says

  --discovery_url: URL (or "name/version") of the ...

but the code rejects values that don't come in the form name.version. (Note in the traceback below that the ValueError message is also missing its % interpolation, so it prints a literal "%s".)


Example stacktrace:

$ gen_client --discovery_url="storage/v1" --outdir=foo client
Traceback (most recent call last):
  File "/usr/local/bin/gen_client", line 9, in <module>
    load_entry_point('google-apitools==0.4.1', 'console_scripts', 'gen_client')()
  File "/usr/local/lib/python2.7/dist-packages/apitools/gen/gen_client.py", line 240, in run_main
    appcommands.Run()
  File "/usr/local/lib/python2.7/dist-packages/google/apputils/appcommands.py", line 791, in Run
    return app.run()
  File "/usr/local/lib/python2.7/dist-packages/google/apputils/app.py", line 238, in run
    return _actual_start()
  File "/usr/local/lib/python2.7/dist-packages/google/apputils/app.py", line 267, in _actual_start
    really_start()
  File "/usr/local/lib/python2.7/dist-packages/google/apputils/appcommands.py", line 788, in InterceptReallyStart
    original_really_start(main=_CommandsStart)
  File "/usr/local/lib/python2.7/dist-packages/google/apputils/app.py", line 220, in really_start
    sys.exit(main(argv))
  File "/usr/local/lib/python2.7/dist-packages/google/apputils/appcommands.py", line 773, in _CommandsStart
    sys.exit(command.CommandRun(GetCommandArgv()))
  File "/usr/local/lib/python2.7/dist-packages/google/apputils/appcommands.py", line 293, in CommandRun
    ret = self.Run(argv)
  File "/usr/local/lib/python2.7/dist-packages/apitools/gen/gen_client.py", line 203, in Run
    codegen = _GetCodegenFromFlags()
  File "/usr/local/lib/python2.7/dist-packages/apitools/gen/gen_client.py", line 109, in _GetCodegenFromFlags
    discovery_doc = util.FetchDiscoveryDoc(FLAGS.discovery_url)
  File "/usr/local/lib/python2.7/dist-packages/apitools/gen/util.py", line 289, in FetchDiscoveryDoc
    discovery_url = NormalizeDiscoveryUrl(discovery_url)
  File "/usr/local/lib/python2.7/dist-packages/apitools/gen/util.py", line 281, in NormalizeDiscoveryUrl
    raise ValueError('Unrecognized value "%s" for discovery url')
ValueError: Unrecognized value "%s" for discovery url

protorpc.messages import statement in auto-generated code might collide with proto field names

Consider this scenario:

message SomeMessage {
   required int32 x = 1;
}

message SomeOtherMessage {
  repeated SomeMessage messages = 1;
  required string uid = 2;
}

The generated code will be something like this:

[...]
from protorpc import messages
[...]

class SomeOtherMessage(messages.Message):
  messages = messages.MessageField('SomeMessage',...)
  uid = messages.StringField(2)

When using the auto-generated code above, an error will be raised, e.g. 'MessageField' object has no attribute 'StringField', since the SomeOtherMessage.messages field shadows the protorpc.messages module.

The messages module from protorpc should be imported under a qualified name to avoid these sorts of name collisions.
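
The collision is ordinary Python class-body scoping, reproducible without protorpc at all. In this sketch, `messages` is a stdlib stand-in for the protorpc module, and the `_messages` alias mirrors the style apitools' own generated code uses:

```python
import types

# Stand-in for the protorpc "messages" module, so the example is stdlib-only.
messages = types.SimpleNamespace(
    StringField=lambda number: ('string', number),
    MessageField=lambda name, number: ('message', name, number),
)

try:
    class Broken:
        # This assignment rebinds "messages" inside the class body...
        messages = messages.MessageField('SomeMessage', 1)
        # ...so the module is no longer visible on the next line.
        uid = messages.StringField(2)
except AttributeError:
    pass  # the field value has no attribute 'StringField'

# The fix: import under a private alias (e.g. "from protorpc import
# messages as _messages"), so field names can never shadow the module.
_messages = messages

class Fixed:
    messages = _messages.MessageField('SomeMessage', 1)
    uid = _messages.StringField(2)
```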

First tox run fails

With a brand-new checkout, running tox -e py27 fails with

======================================================================
ERROR: testGeneration (apitools.gen.client_generation_test.ClientGenerationTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File ".../apitools/apitools/gen/client_generation_test.py", line 61, in testGeneration
    retcode = subprocess.call(args)
  File "/usr/lib/python2.7/subprocess.py", line 522, in call
    return Popen(*popenargs, **kwargs).wait()
  File "/usr/lib/python2.7/subprocess.py", line 710, in __init__
    errread, errwrite)
  File "/usr/lib/python2.7/subprocess.py", line 1327, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory
-------------------- >> begin captured logging << --------------------
root: INFO: Testing API drive.v2 with command line: gen_client --client_id=12345 --client_secret=67890 --discovery_url=drive.v2 --outdir=generated --overwrite client
--------------------- >> end captured logging << ---------------------

Instantiate known HTTP status codes as subclasses of HttpError

This is an FR to use separate classes for known status codes, which all may derive from HttpError, for instance:

class ConflictError(HttpError):
   """This is an HTTP Conflict 409 error."""

Why? The reason is to greatly simplify control flow for the caller. It is very common to want to switch on a specific status code. For instance, let's say you want to display a resource, but if it doesn't yet exist you want to prompt for its creation. The current recommended flow is:

try:
  my_resource = MyMessageClass.Get()  # calls API
except apitools_exceptions.HttpError as e:
  if e.status_code == 409:
    my_resource = OfferCreateInteractively()
  elif e.status_code == 403:
    raise PermissionError('Please go to your console and grant access to the user account')
  else:
    raise UserFacingCatchAllError(e)

With real classes, the control flow could look like this:

try:
  my_resource = MyMessageClass.Get()  # calls API
except apitools_exceptions.HttpConflictError:
  my_resource = OfferCreateInteractively()
except apitools_exceptions.HttpForbiddenError:
  raise MyPermissionError('Please go to your console and grant access to the user account')
except apitools_exceptions.HttpError as e:
  raise UserFacingCatchAllError(e)

This way, we:

  • avoid boilerplate for re-raising
  • utilize Python-idiomatic control flow for exceptions
  • allow for easier mocking of errors in unit tests

while still being fully backwards compatible.
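
A minimal sketch of such a hierarchy (stand-in classes and a hypothetical `error_for_status` factory, not apitools' actual implementation):

```python
class HttpError(Exception):
    """Stand-in for apitools' HttpError, reduced to a status_code."""
    def __init__(self, status_code, message=''):
        super().__init__(message or 'HTTP %d' % status_code)
        self.status_code = status_code

class HttpForbiddenError(HttpError):
    """HTTP 403 Forbidden."""

class HttpConflictError(HttpError):
    """HTTP 409 Conflict."""

_ERROR_CLASSES = {403: HttpForbiddenError, 409: HttpConflictError}

def error_for_status(status_code, message=''):
    """Return the most specific error class for a status code, falling
    back to plain HttpError for codes without a dedicated subclass."""
    return _ERROR_CLASSES.get(status_code, HttpError)(status_code, message)
```

Because every subclass still derives from HttpError, existing `except HttpError` handlers keep catching everything, which is what preserves backwards compatibility.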

AdditionalProperties not able to decode int values encoded as strings.

A discovery doc describing the following type (conceptually a map<string, int64>)

    "MyMapOfValues": {
      "description": "",
      "type": "object",
      "additionalProperties": {
        "type": "string",
        "format": "int64"
      }
    }

gets converted to

@encoding.MapUnrecognizedFields('additionalProperties')
class MyMapOfValues(_messages.Message):
  class AdditionalProperty(_messages.Message):
    key = _messages.StringField(1)
    value = _messages.IntegerField(2)
  additionalProperties = _messages.MessageField('AdditionalProperty', 1, repeated=True)

Unfortunately decoding json payloads like:

message = encoding.JsonToMessage(MyMapOfValues, '{"myValue": "5"}')

results in

ValidationError: Expected type (<type 'int'>, <type 'long'>) for field value, found 5 (type <type 'unicode'>)

Note that

message = encoding.JsonToMessage(MyMapOfValues, '{"myValue": 5}')

works as expected.
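
Until the decoder handles the "format": "int64" annotation itself, one client-side workaround is to coerce the string-encoded values before decoding. A sketch (`coerce_int64_strings` is a hypothetical helper, not part of the apitools API):

```python
import json

def coerce_int64_strings(payload):
    """Convert top-level values that are int64s encoded as JSON strings
    (per the discovery doc's "format": "int64") into real integers."""
    def fix(value):
        if isinstance(value, str) and value.lstrip('-').isdigit():
            return int(value)
        return value
    return {key: fix(value) for key, value in json.loads(payload).items()}
```

The resulting dict can then be handed to the normal message decoding path.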

Better error messages on failing to parse a server response

Currently, failing to parse a response as the appropriate type from the server leads to somewhat incomprehensible error messages (usually related to type mismatches between a dict and a str).

Instead, we should provide something clearer, along the lines of Could not parse response "{...}" as object of type <typename>.
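
A sketch of the proposed wording, using a hypothetical `parse_response` helper around a plain JSON decode:

```python
import json

def parse_response(body, typename):
    """Decode a server response, failing with a message that names both
    the payload and the expected type instead of a raw type mismatch."""
    try:
        return json.loads(body)
    except ValueError as exc:
        raise ValueError(
            'Could not parse response %r as object of type %s (%s)'
            % (body[:200], typename, exc))
```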

Version 0.5.18 reads entire upload streams into memory, which will fail for files larger than available memory

7ccade5 added code that calls read() on a StreamSlice, effectively reading the entire contents of a stream into memory. This takes away one of the main advantages of streams: being able to read them gradually, buffering only part of their contents in memory, which is what makes transferring large files possible. Attempting to upload a file larger than the available memory will fail.

AttributeError in gsutil _UploadObject cases

This occurred when copying a 1MiB object with a low parallel composite upload threshold. This in turn split the request into two 512KiB calls to UploadObject made simultaneously across two threads.

It's a rare occurrence (can't reproduce it easily), but it seems like it should never happen.

Encountered exception while copying:
Traceback (most recent call last):
  File "/usr/local/google/home/thobrla/gsutil/gslib/command.py", line 1680, in PerformTask
    results = task.func(cls, task.args, thread_state=self.thread_gsutil_api)
  File "/usr/local/google/home/thobrla/gsutil/gslib/copy_helper.py", line 279, in _PerformParallelUploadFileToObject
    gzip_exts=None, allow_splitting=False)
  File "/usr/local/google/home/thobrla/gsutil/gslib/copy_helper.py", line 1649, in _UploadFileToObject
    dst_obj_metadata, preconditions, gsutil_api, logger)
  File "/usr/local/google/home/thobrla/gsutil/gslib/copy_helper.py", line 1402, in _UploadFileToObjectNonResumable
    provider=dst_url.scheme, fields=UPLOAD_RETURN_FIELDS)
  File "/usr/local/google/home/thobrla/gsutil/gslib/cloud_api_delegator.py", line 233, in UploadObject
    fields=fields)
  File "/usr/local/google/home/thobrla/gsutil/gslib/gcs_json_api.py", line 1024, in UploadObject
    apitools_strategy=apitools_transfer.SIMPLE_UPLOAD)
  File "/usr/local/google/home/thobrla/gsutil/gslib/gcs_json_api.py", line 887, in _UploadObject
    global_params=global_params)
  File "/usr/local/google/home/thobrla/gsutil/gslib/third_party/storage_apitools/storage_v1_client.py", line 975, in Insert
    download=download)
  File "/usr/local/google/home/thobrla/gsutil/third_party/apitools/apitools/base/py/base_api.py", line 616, in _RunMethod
    download)
  File "/usr/local/google/home/thobrla/gsutil/third_party/apitools/apitools/base/py/base_api.py", line 598, in PrepareHttpRequest
    self.__FinalizeRequest(http_request, url_builder)
  File "/usr/local/google/home/thobrla/gsutil/third_party/apitools/apitools/base/py/base_api.py", line 517, in __FinalizeRequest
    http_request.url = url_builder.url
  File "/usr/local/google/home/thobrla/gsutil/third_party/apitools/apitools/base/py/base_api.py", line 184, in url
    self.__scheme, self.__netloc, self.relative_path, self.query, ''))
  File "/usr/local/google/home/thobrla/gsutil/third_party/apitools/apitools/base/py/base_api.py", line 176, in query
    return urllib.parse.urlencode(self.query_params, doseq=True)
  File "/usr/local/google/home/thobrla/gsutil/third_party/six/six.py", line 92, in __get__
    delattr(obj.__class__, self.name)
AttributeError: urlencode

some files still use Python 2-specific print statements

These files use the print statement instead of the print() function, which fails under Python 3.

git grep 'print ' (and ignore some of the comment/docstring hits) shows the bad files:

  • ez_setup.py
  • run_pylint.py
  • samples/bigquery_sample/bigquery_v2/bigquery_v2.py
  • samples/bigquery_sample/bigquery_v2/bigquery_v2_messages.py
  • samples/dns_sample/dns_v1/dns_v1.py
  • samples/fusiontables_sample/fusiontables_v1/fusiontables_v1.py
  • samples/fusiontables_sample/fusiontables_v1/fusiontables_v1_messages.py
  • samples/iam_sample/iam_v1/iam_v1.py
  • samples/iam_sample/iam_v1/iam_v1_messages.py
  • samples/servicemanagement_sample/servicemanagement_v1/servicemanagement_v1.py
  • samples/servicemanagement_sample/servicemanagement_v1/servicemanagement_v1_messages.py
  • samples/storage_sample/downloads_test.py
  • samples/storage_sample/storage_v1/storage_v1.py

Please add per-file licenses

The Chromium project (www.chromium.org) pulls in apitools indirectly through the catapult (https://github.com/catapult-project/catapult/) and gsutil (https://github.com/GoogleCloudPlatform/gsutil) source repositories. In order for Chromium to be pulled into various Linux source distributions there's a requirement that all of the third party files pass the Linux licensecheck utility. Currently there are many files in the apitools repository missing per-file licenses. From a current run of licensecheck, they are:

$ licensecheck -r . | grep "No copyright"
./run_pylint.py: *No copyright* UNKNOWN
./samples/storage_sample/storage/__init__.py: *No copyright* UNKNOWN
./samples/storage_sample/storage/storage_v1_client.py: *No copyright* UNKNOWN
./samples/storage_sample/storage/storage_v1.py: *No copyright* UNKNOWN
./samples/storage_sample/storage/storage_v1_messages.py: *No copyright* UNKNOWN
./samples/storage_sample/downloads_test.py: *No copyright* UNKNOWN
./samples/storage_sample/uploads_test.py: *No copyright* UNKNOWN
./ez_setup.py: *No copyright* UNKNOWN
./apitools/__init__.py: *No copyright* UNKNOWN
./apitools/gen/service_registry.py: *No copyright* UNKNOWN
./apitools/gen/util.py: *No copyright* UNKNOWN
./apitools/gen/__init__.py: *No copyright* UNKNOWN
./apitools/gen/extended_descriptor.py: *No copyright* UNKNOWN
./apitools/gen/util_test.py: *No copyright* UNKNOWN
./apitools/gen/command_registry.py: *No copyright* UNKNOWN
./apitools/gen/client_generation_test.py: *No copyright* UNKNOWN
./apitools/gen/message_registry.py: *No copyright* UNKNOWN
./apitools/gen/gen_client.py: *No copyright* UNKNOWN
./apitools/gen/gen_client_lib.py: *No copyright* UNKNOWN
./apitools/base/py/exceptions.py: *No copyright* UNKNOWN
./apitools/base/py/util.py: *No copyright* UNKNOWN
./apitools/base/py/testing/mock.py: *No copyright* UNKNOWN
./apitools/base/py/testing/__init__.py: *No copyright* UNKNOWN
./apitools/base/py/testing/testclient/fusiontables_v1_messages.py: *No copyright* GENERATED FILE
./apitools/base/py/testing/testclient/__init__.py: *No copyright* UNKNOWN
./apitools/base/py/testing/testclient/fusiontables_v1_client.py: *No copyright* UNKNOWN
./apitools/base/py/testing/mock_test.py: *No copyright* UNKNOWN
./apitools/base/py/credentials_lib.py: *No copyright* GENERATED FILE
./apitools/base/py/base_cli.py: *No copyright* GENERATED FILE
./apitools/base/py/stream_slice_test.py: *No copyright* UNKNOWN
./apitools/base/py/encoding_test.py: *No copyright* UNKNOWN
./apitools/base/py/cli.py: *No copyright* UNKNOWN
./apitools/base/py/stream_slice.py: *No copyright* UNKNOWN
./apitools/base/py/http_wrapper.py: *No copyright* UNKNOWN
./apitools/base/py/extra_types_test.py: *No copyright* UNKNOWN
./apitools/base/py/base_api_test.py: *No copyright* UNKNOWN
./apitools/base/py/transfer_test.py: *No copyright* UNKNOWN
./apitools/base/py/batch_test.py: *No copyright* UNKNOWN
./apitools/base/py/__init__.py: *No copyright* UNKNOWN
./apitools/base/py/http_wrapper_test.py: *No copyright* UNKNOWN
./apitools/base/py/list_pager_test.py: *No copyright* UNKNOWN
./apitools/base/py/encoding.py: *No copyright* UNKNOWN
./apitools/base/py/list_pager.py: *No copyright* UNKNOWN
./apitools/base/py/credentials_lib_test.py: *No copyright* UNKNOWN
./apitools/base/py/buffered_stream.py: *No copyright* UNKNOWN
./apitools/base/py/base_api.py: *No copyright* UNKNOWN
./apitools/base/py/util_test.py: *No copyright* UNKNOWN
./apitools/base/py/extra_types.py: *No copyright* UNKNOWN
./apitools/base/py/buffered_stream_test.py: *No copyright* UNKNOWN
./apitools/base/py/batch.py: *No copyright* UNKNOWN
./apitools/base/py/transfer.py: *No copyright* UNKNOWN
./apitools/base/py/app2.py: *No copyright* UNKNOWN
./apitools/base/__init__.py: *No copyright* UNKNOWN

We'd like to ask that per-file licenses be added to these files to make it easier to integrate apitools not only into Chromium, but also Linux distributions in general. Thanks.

Support Python 3

What is the status of Python 3 support for google-apitools?

We do not list any language classifiers here:

classifiers=[

Do we know if any extra work is required to make the package Python3-compatible?

Request message eliding doesn't respect `@OutputOnly` fields

A method that takes a path parameter called "foo" and a request body with a field called "foo" has the request message and path parameter elided, assuming that the value of "foo" specified in the request body will be the same one that should be specified in the path.

This ignores the case where the "foo" field in the request body is marked as @OutputOnly, where it's not valid to pass the value in the request body.

--unelidable_request_methods can specify that the message should not be elided, but ideally the eliding logic would skip eliding messages when @OutputOnly is involved.

I believe the necessary change is somewhere in _NeedRequestType here.

list_pager.YieldFromList should prevent batchSize > limit

If a limit is provided, YieldFromList should enforce that the requested batch size is less than the remaining limit. Otherwise, more data will be requested from the service than requested by the user.

This should apply to the limit as it decreases, for example:
Request 1: limit: 90, batch_size: 50 --> request with batch_size: 50
Request 2: limit: 40, batch_size: 50 --> request with batch_size: 40
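
The fix amounts to clamping each page request to the remaining limit. As a sketch (a hypothetical helper, not the YieldFromList signature):

```python
def page_request_sizes(limit, batch_size):
    """Yield the batch size to request for each page, never exceeding
    what the user's remaining limit allows."""
    remaining = limit
    while remaining > 0:
        request = min(batch_size, remaining)
        yield request
        remaining -= request
```

With limit=90 and batch_size=50 this yields 50 then 40, matching the example above.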

Drop apputils and gflags.

We need to drop these two dependencies.

These two libraries get used in two places:

  • gen_client uses them for processing command-line args
  • generated CLIs use them.

I'll probably take care of this in two steps.

Handle transient oauth2client refresh failures in default handler

Presently, if an OAuth2 token URI returns a transient error code (like a 503), it's up to the application to handle it. But apitools should be able to retry now that HttpAccessTokenRefreshError in oauth2client v1.5.2 allows inspecting the status code.

Something like this in the default retry function HandleExceptionsAndRebuildHttpConnections should work:

elif (isinstance(retry_args.exc,
                 oauth2client.client.HttpAccessTokenRefreshError)
      and (retry_args.exc.status == TOO_MANY_REQUESTS or
           retry_args.exc.status >= 500)):
    logging.debug(
        'Caught transient credential refresh error (%s), retrying',
        retry_args.exc)       

auto_transfer=False uploads double-POST for resumable uploads

The upload is initialized and an initial POST is made here, but because auto_transfer is False, the HTTP response to the POST is not returned. Then we make a second POST, this time with bytes_http, throwing away the first upload ID and using the second one.

I'll get a pull request out to fix this once I've had an opportunity to do some testing.

Make apitools-generated classes pickleable

Presently, apitools can generate nested classes, which, because they aren't defined at the top-level of the module, are un-pickleable. Pickling is necessary for passing apitools objects to other processes, which in turn is useful when optimizing performance by spreading work out over multiple processes and threads.

This is a feature request to add __reduce__ and other appropriate logic so that apitools-generated objects can be pickled without any intervention by library consumers. This could be added either to the generated classes or to the base message classes in protorpclite.

As a workaround, consumers of a generated library can manually convert with encoding.MessageToJson before pickling and JsonToMessage after unpickling, but that is extra work for each consumer.
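
A sketch of the `__reduce__` approach on a toy nested class (illustrative only; under Python 2, nested classes could not be pickled by name, which is exactly what routing through a module-level reconstructor sidesteps):

```python
import pickle

def _make_inner(value):
    """Top-level reconstructor that pickle can locate by module name."""
    return Outer.Inner(value)

class Outer:
    class Inner:
        def __init__(self, value):
            self.value = value
        def __reduce__(self):
            # Pickle the object as "call _make_inner(value)" instead of
            # relying on the nested class being resolvable by name.
            return (_make_inner, (self.value,))

# Round-trip through pickle using the __reduce__ hook above.
roundtripped = pickle.loads(pickle.dumps(Outer.Inner(42)))
```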

six 1.11.0 causes TypeError: Error when calling the metaclass bases

Version 1.11.0 of the six package was released yesterday and causes:

  File "/Users/florenthemmi/dev/env/lib/python2.7/site-packages/apache_beam/internal/gcp/json_value.py", line 23, in <module>
    from apitools.base.py import extra_types
  File "/Users/florenthemmi/dev/env/lib/python2.7/site-packages/apitools/base/py/__init__.py", line 21, in <module>
    from apitools.base.py.base_api import *
  File "/Users/florenthemmi/dev/env/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 31, in <module>
    from apitools.base.protorpclite import message_types
  File "/Users/florenthemmi/dev/env/lib/python2.7/site-packages/apitools/base/protorpclite/message_types.py", line 25, in <module>
    from apitools.base.protorpclite import messages
  File "/Users/florenthemmi/dev/env/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py", line 1165, in <module>
    class Field(six.with_metaclass(_FieldMeta, object)):
TypeError: Error when calling the metaclass bases
    metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases

This issue has been encountered by several more folks and has been reported as issue benjaminp/six#210.

This issue is to track its resolution, or to make whatever change in apitools could fix this.

Bad Assumption in AcceptableMimeType, Leading to "Invalid MIME type"

The function AcceptableMimeType in base/py/util.py incorrectly assumes that all mimetypes will have a forward slash within them:

def AcceptableMimeType(accept_patterns, mime_type):
    """Return True iff mime_type is acceptable for one of accept_patterns.

    Note that this function assumes that all patterns in accept_patterns
    will be simple types of the form "type/subtype", where one or both
    of these can be "*". We do not support parameters (i.e. "; q=") in
    patterns.

    Args:
      accept_patterns: list of acceptable MIME types.
      mime_type: the mime type we would like to match.

    Returns:
      Whether or not mime_type matches (at least) one of these patterns.
    """
    if '/' not in mime_type:
        raise exceptions.InvalidUserInputError(
            'Invalid MIME type: "%s"' % mime_type)

This assumption fails for .p12 files, which are used for server-to-server authentication. The function mimetypes.guess_type returns x-pkcs12 for .p12 files instead of application/x-pkcs12. This may be a bug with mimetypes, but it should nevertheless be handled properly by the apitools.
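
One possible hardening (a sketch of a caller-side guard, not apitools' actual fix; `safe_mime_type` is hypothetical) is to normalize slash-less guesses before matching:

```python
import mimetypes

def safe_mime_type(filename, default='application/octet-stream'):
    """Guess a MIME type and normalize results lacking a slash, such as
    the bare "x-pkcs12" sometimes reported for .p12 files."""
    guessed, _ = mimetypes.guess_type(filename)
    if not guessed:
        return default
    if '/' not in guessed:
        return 'application/' + guessed
    return guessed
```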

extra_types not always imported when required for generated messages

When generating messages with AdditionalProperties whose value type is "any", the line
from apitools.base.py import extra_types
is missing from the generated messages file.

The extra_types module defines messages such as JsonValue and JsonObject which are referenced by the generated message.

One would get following error when trying to instantiate a message using these extra types:

File "apitools\base\protorpclite\messages.py", line 1992, in find_definition
    'Could not find definition for %s' % name)
apitools.base.protorpclite.messages.DefinitionNotFoundError:
    Could not find definition for extra_types.JsonValue

Until now, what saved this code from failing was a wildcard import in the generated __init__.py file, which pulled in extra_types at the package level; the type lookup routine is designed to search for types up the hierarchy.

Uncaught AttributeError during decode

When a decode encounters a MessageField which has a non-dict value (e.g. "None" or "1"), an uncaught AttributeError is raised. The following is an associated stack trace:

File "apitools/base/py/encoding.py", line 110, in DictToMessage
  return JsonToMessage(message_type, json.dumps(d))
File "apitools/base/py/encoding.py", line 104, in JsonToMessage
  return _ProtoJsonApiTools.Get().decode_message(message_type, message)
File "apitools/base/py/encoding.py", line 290, in decode_message
  message_type, result)
File "apitools/base/protorpclite/protojson.py", line 211, in decode_message
  message = self.__decode_dictionary(message_type, dictionary)
File "apitools/base/protorpclite/protojson.py", line 284, in __decode_dictionary
  for item in value]
File "apitools/base/py/encoding.py", line 312, in decode_field
  field.message_type, json.dumps(value))
File "apitools/base/py/encoding.py", line 290, in decode_message
  message_type, result)
File "apitools/base/protorpclite/protojson.py", line 211, in decode_message
  message = self.__decode_dictionary(message_type, dictionary)
File "apitools/base/protorpclite/protojson.py", line 262, in __decode_dictionary
  for key, value in six.iteritems(dictionary):
File "six/__init__.py", line 605, in iteritems
  return d.iteritems(**kw)

Enum in Python 2.7

Enum is referenced before definition.

Reproduce:

  1. Create virtualenv
  2. Activate
  3. Clone repo
  4. python setup.py install
  5. In interpreter from apitools.base.protorpclite import messages
In [1]: from apitools.base.protorpclite import messages
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
<ipython-input-1-8b7eda70bd5b> in <module>()
----> 1 from apitools.base.protorpclite import messages

/Users/sergioisidoro/apitools/apitools/base/protorpclite/messages.py in <module>()
    402 
    403 
--> 404 class Enum(six.with_metaclass(_EnumClass, object)):
    405     """Base class for all enumerated types."""
    406 

/Users/sergioisidoro/apitools/apitools/base/protorpclite/messages.py in __init__(cls, name, bases, dct)
    301     def __init__(cls, name, bases, dct):
    302         # Can only define one level of sub-classes below Enum.
--> 303         if not (bases == (object,) or bases == (Enum,)):
    304             raise EnumDefinitionError(
    305                 'Enum type %s may only inherit from Enum' % name)

NameError: global name 'Enum' is not defined

https://github.com/google/apitools/blob/master/apitools/base/protorpclite/messages.py#L303

Releases could be improved

Currently, google-apitools releases do not contain a changelog, and it is hard to see what the actual changes between two versions are. It would be nice if one could be added when a release is tagged.

Also, if you click on "Releases" in the GitHub UI, it claims that 0.5.16 (which it calls "HttpError Class") is the latest release.

Fixing this would make it easier to maintain OS packages of google-apitools, like for example http://pkgsrc.se/www/py-google-apitools :)

Generated CLI is not compatible with Windows.

apitools/base/py/base_cli.py uses readline, which is not supported on Windows. Perhaps there is a way to use pyreadline.

As a side note, it would be nice to separate the CLI code into its own package: because of some wildcard imports, the generated CLI is pulled in whenever it is present, even if it is not used.

decode_message duplicates keys from response

At a high level, when we get a response with custom object metadata such as "456": "def" in gsutil, set_unrecognized_field is called twice: once from protojson.py's __decode_dictionary() with the non-unicode version of the string ('456'), and once from encoding.py's _ProcessUnknownMessages() with the unicode version (u'456').

This results in a message with duplicate entries:
metadata: <MetadataValue
additionalProperties: [<AdditionalProperty
key: '456'
value: u'def'>, <AdditionalProperty
key: '456'
value: u'def'>]>
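
Until the double call is fixed, the duplicates can be collapsed after decoding. A sketch over plain (key, value) pairs (hypothetical helper, not the apitools message API):

```python
def dedupe_additional_properties(pairs):
    """Remove duplicate (key, value) entries, treating byte and unicode
    spellings of the same key as equal."""
    seen = set()
    result = []
    for key, value in pairs:
        marker = (str(key), str(value))
        if marker not in seen:
            seen.add(marker)
            result.append((key, value))
    return result
```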

client generation fails on a schema of type "string"

The BigQuery API had a discovery doc which contained (more or less) the following:

{
  "schemas": {
    "JobReference": {
      "id": "JobReference",
      "type": "object",
      "properties": {
        "location": {
          "$ref": "Location"
        }
      }
    },
    "Location": {
      "id": "Location",
      "type": "string"
    }
  }
}

and it seems that apitools doesn't like a schema that's just a string:

Traceback (most recent call last):
  File "apitools/gen/gen_client.py", line 347, in main
    return args.func(args) or 0
  File "apitools/gen/gen_client.py", line 160, in GenerateClient
    codegen = _GetCodegenFromFlags(args)
  File "apitools/gen/gen_client.py", line 105, in _GetCodegenFromFlags
    apitools_version=args.apitools_version)
  File "apitools/gen/gen_client_lib.py", line 95, in __init__
    schema_name, schema)
  File "apitools/gen/message_registry.py", line 266, in AddDescriptorFromSchema
    schema.get('type'))
ValueError: ('Cannot create message descriptors for type %s', u'string')
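
One plausible handling (a sketch only, not the actual message_registry fix) is to treat a top-level primitive schema as an alias rather than trying to generate a message class for it:

```python
def resolve_schema(schemas, name):
    """Return a Python type for primitive top-level schemas, and the raw
    schema dict for object schemas, instead of raising on type "string"."""
    aliases = {'string': str, 'integer': int, 'number': float, 'boolean': bool}
    schema = schemas[name]
    if schema.get('type') in aliases:
        return aliases[schema['type']]
    return schema

# The (abridged) BigQuery schemas from the report above.
SCHEMAS = {
    'JobReference': {'id': 'JobReference', 'type': 'object',
                     'properties': {'location': {'$ref': 'Location'}}},
    'Location': {'id': 'Location', 'type': 'string'},
}
```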
