protobuf-javascript's Issues

Both the google-protobuf npm and NuGet packages do not have a LICENSE file

Hi,

I installed the google-protobuf npm package via "npm install google-protobuf", but there is no LICENSE file inside.

The NuGet package is also missing the LICENSE file.

How can I request that the LICENSE file be added to the npm package? Without it, the package is hard to consume in practice :)

Thank you very much.

getJsPbMessageId() returns ""

Here is what I have done:

  1. Define a .proto file like this:
syntax = "proto2";
message Awsome
{   
    required string  some_field   = 1;
}

  2. Compile the proto with protoc.exe (version 3.5.1) to get message_pb.js:
var jspb = require('google-protobuf');
var goog = jspb;
var global = Function('return this')();

goog.exportSymbol('proto.Awsome', null, global);

/**
 * Generated by JsPbCodeGenerator.
 * @param {Array=} opt_data Optional initial data array, typically from a
 * server response, or constructed directly in Javascript. The array is used
 * in place and becomes part of the constructed object. It is not cloned.
 * If no data is provided, the constructed object will be empty, but still
 * valid.
 * @extends {jspb.Message}
 * @constructor
 */
proto.Awsome = function(opt_data) {
  jspb.Message.initialize(this, opt_data, 0, -1, null, null);
};
goog.inherits(proto.Awsome, jspb.Message);
if (goog.DEBUG && !COMPILED) {
  proto.Awsome.displayName = 'proto.Awsome';
}
……
  3. Create a new message and try to determine its type in another method:
function doSomeWork(message) {
     // code 1
     if(message.getJsPbMessageId()=="Awsome "){
           // do something
     }

     // code 2
     if(message.displayName =="proto.Awsome "){
           // do something
     }
}

Unfortunately, neither check works: message.getJsPbMessageId() returns "" while message.displayName returns undefined.
  4. If I change message_pb.js to this, code 1 works:

jspb.Message.initialize(this, opt_data, 'Awsome', -1, null, null);

but I think this could be done automatically when generating message_pb.js. Thank you.

JS - toObject() inconsistent behavior with empty fields

When a message contains an inner message field that is left empty, toObject() returns undefined for that field, which implies the field was empty in the original message.
But when the empty field is a singular scalar field, toObject() returns the default value for its type, so it is impossible to know whether this was the original value or whether the field was originally empty.

The difference probably derives from the same difference in getField(), but why not use hasField() to determine whether the field is actually empty and set the value to undefined in all cases? Otherwise the object representation cannot be trusted and used.
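For reference, a minimal sketch of the kind of presence check being suggested, assuming a field with explicit presence (proto2, for example) for which jspb emits a has-accessor; the message and field names are illustrative, not from the reporter's proto:

var msg = new proto.Example();   // hypothetical generated class

// toObject() would report the type default (0) here, but an explicit
// presence check can distinguish "unset" from "set to the default":
var count = msg.hasCount() ? msg.getCount() : undefined;
console.log(count); // undefined, because the field was never set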

Failure: Type not convertible to Uint8Array.

Using protocol buffers v3.6.1 with nwjs 68.0.3440.106.
I have a very simple example. The issue is that when the script creates a Uint8Array and passes it to deserializeBinary, the function throws "Failure: Type not convertible to Uint8Array." However, deserializeBinary works if the byte array was produced by serializeBinary. I guess the Uint8Array type from Node is not the same as the one google-protobuf expects? Is there some way to convert to the correct buffer type?

Here is a simple example:

var messages = require("./js/test_pb.js");
var testMsg = new messages.Test()
testMsg.setNum(10.1);
testMsg.setPayload("asfsfsd");
var bytes = testMsg.serializeBinary();
console.log("type: " + typeof(bytes));
var dupTestMsg = messages.Test.deserializeBinary(bytes);
console.log("dumpTestMsg.num:" + dupTestMsg.getNum());
console.log("dumpTestMsg.payload:" + dupTestMsg.getPayload());

var simpleArray = new Uint8Array(bytes);
/* the above code works fine, until a copy is made of bytes using Uint8Array */
try {
  var dupTestMsg = messages.Test.deserializeBinary(simpleArray);
} catch (err) {
  console.log("Error:" + err);
}

JS - How to get and set fields of inner message field

Given the following message:
message A {
  message B {
    int32 x = 1;
    int32 y = 2;
  }
  int32 foo = 1;
  B bar = 2;
}

msg = A.deserializeBinary(buffer)

  1. Is msg.getBar().getX() the only way to get the value of x, or is there a nicer option such as msg.bar.getX() (without using toObject())?

  2. How can I set the value of x without creating a new B message? (See the sketch below.)
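A minimal sketch of the usual jspb accessor pattern, assuming the generated code follows the getX()/setX() conventions seen in the other issues here and that the nested message class is exposed as A.B; none of this is verified against the reporter's generated file:

var msg = A.deserializeBinary(buffer);

// Reading x: getBar() may return undefined when bar was never set.
var bar = msg.getBar();
var x = bar ? bar.getX() : undefined;

// Setting x without rebuilding B from scratch: attach a sub-message once,
// then mutate it in place through its own setters.
if (!msg.getBar()) {
  msg.setBar(new A.B());
}
msg.getBar().setX(42);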

[Enhancement] pre-built protoc binary via npm package

What language does this apply to?
Javascript / TypeScript (via gRPC-web)
Describe the problem you are trying to solve.
We have Windows/macOS/Linux developer laptops and want a consistent flow for downloading and using protoc.
Building protoc from source or downloading binaries from this repo does not feel like a good distribution story in 2018.
Describe the solution you'd like
It would be nice to have pre-built binaries installed via npm. For example, when I npm install the google-protobuf package, I would also get a pre-built protoc binary for my platform (Linux/Windows/macOS) that I can use from the terminal.
Describe alternatives you've considered
choco for Windows, protoc via make install, Homebrew for macOS.

Need to better document that toObject() is *not* proto3 JSON

The proto3 JSON mapping specifies special output for some well-known types (e.g. Timestamp, Duration). However, the generated _pb.js files for these types treat them as regular protobufs.

Repro code:

var tspb = require('google-protobuf/google/protobuf/timestamp_pb.js');
var t = new tspb.Timestamp();
console.log(t.toObject());

Expected output:
0001-01-01T00:00:00Z

Observed output:
{ seconds: 0, nanos: 0 }
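As a stopgap until this is better documented, a sketch of producing the JSON-style form by hand, assuming the runtime's Timestamp exposes toDate() (present in recent google-protobuf releases, but worth verifying for your version):

var tspb = require('google-protobuf/google/protobuf/timestamp_pb.js');

var t = new tspb.Timestamp();
t.setSeconds(1546300800); // 2019-01-01T00:00:00Z
// toObject() keeps the raw {seconds, nanos} shape; toDate() bridges to a JS
// Date, from which an RFC 3339-style string can be produced manually.
console.log(t.toObject());              // { seconds: 1546300800, nanos: 0 }
console.log(t.toDate().toISOString());  // 2019-01-01T00:00:00.000Z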

[js] Messages containing map fields are not properly cloned

Summary

Cloned messages containing map fields cannot be properly serialized if no other operation triggers a lazy initialization of wrappers_.

What version of protobuf and what language are you using?

$ protoc --version
libprotoc 3.6.1

JavaScript

What operating system (Linux, Windows, ...) and version?

$ uname -a
Darwin 17.7.0 Darwin Kernel Version 17.7.0: Thu Jun 21 22:53:14 PDT 2018; root:xnu-4570.71.2~1/RELEASE_X86_64 x86_64
$ sw_vers
ProductName:    Mac OS X
ProductVersion: 10.13.6
BuildVersion:   17G65

What runtime / compiler are you using (e.g., python version or gcc version)

$ node --version
v10.11.0

What did you do?
Given the following proto and test snippet:

// test.proto
syntax = "proto3";

package test;

message Foo {
	map<uint32, string> bar = 1;
}
// test.js

var test = require('./test_pb');

var foo1 = new test.Foo();
foo1.getBarMap().set(0, 'foo');
foo1.getBarMap().set(1, 'bar');

var foo2 = foo1.clone();

console.log(foo1.serializeBinary());
//foo2.getBarMap();
console.log(foo2.serializeBinary());
$ protoc --js_out=import_style=commonjs,binary:. test.proto
$ node test.js
Uint8Array [ 10, 7, 8, 0, 18, 3, 102, 111, 111, 10, 7, 8, 1, 18, 3, 98, 97, 114 ]
Uint8Array []

What did you expect to see
foo1 and foo2 should serialize to the same binary blob; namely, foo2 should contain the map.

What did you see instead?
foo2 is serialized as an empty message.

I think this is related to https://github.com/protocolbuffers/protobuf/blob/81c3e0cf52e63e6e4f8e5c6e39c4d63c41096068/js/message.js#L905 and the fact that the generated protobuf JS sets opt_noLazyCreate to true:

proto.test.Foo.serializeBinaryToWriter = function(message, writer) {
  var f = undefined;
  f = message.getBarMap(true);
  if (f && f.getLength() > 0) {
    f.serializeBinary(1, writer, jspb.BinaryWriter.prototype.writeUint32, jspb.BinaryWriter.prototype.writeString);
  }
};

proto.test.Foo.prototype.getBarMap = function(opt_noLazyCreate) {
  return /** @type {!jspb.Map<number,string>} */ (
      jspb.Message.getMapField(this, 1, opt_noLazyCreate,
      null));
};

If we uncomment the line //foo2.getBarMap();, the wrappers_ field is forced to be generated and subsequent serialization of foo2 returns the expected result.
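Concretely, the workaround described above looks like this (a sketch only; it relies on the lazy-initialization detail noted above rather than on documented behavior):

var foo2 = foo1.clone();
foo2.getBarMap();                    // touching the getter rebuilds wrappers_
console.log(foo2.serializeBinary()); // now matches foo1's serialization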

Anything else we should know about your project / environment

Is it possible to use 'import_style' for TypeScript?

I want to have access to types from the statically generated protobuf files.

Running code:
protoc -I=Protos --js_out=import_style=es6,binary:Javascript Protos/MessageEvnelope.proto Protos/Messages.proto

This generates .js files that are not easily imported in TypeScript. When I write:
import { SomeMessage } from './Messages_pb';
I get the error:

Could not find a declaration file for module './Messages_pb' ... implicitly has an 'any' type.

To work around this, I have to use require('./Messages_pb'), which gives me no types and does not satisfy me.

Is there an automatic way to create types for the generated .js files, or some other way to get types in TypeScript?

Public import is not included in generated CommonJS file

System

Version: v3.6.1
Language: Javascript
macOS 10.14

Steps to reproduce

  1. Create a src directory with 3 files:
    // id.proto
    syntax = "proto3";
    
    message Id {
      string value = 1;
    }
    // task.proto adding a transitive import for `Id`
    syntax = "proto3";
    import public "id.proto";
    
    message Task {
      Id id = 1;
    }
    // project.proto importing both `Id` and `Task` from Project
    syntax = "proto3";
    import "task.proto";
    
    message Project {
      Id id = 1;
      repeated Task task = 2;
    }
  2. Compile to js using
    mkdir build
    protoc --proto_path=src --js_out=import_style=commonjs,binary:build id.proto task.proto project.proto 
  3. npm install google-protobuf under the root directory.
  4. node test.js
    // test.js
    var id_pb = require('./build/id_pb');
    var project_pb = require('./build/project_pb');
    
    var id = new id_pb.Id();
    id.setValue("Everything is fine");
    
    var project = new project_pb.Project()
    project.setId(id);
    
    process.stdout.write(project.getId().getValue());
    process.stdout.write("\n");

Expected behaviour

Everything is fine is printed to the console.

Actual behaviour

.../test-proto/build/project_pb.js:173
    jspb.Message.getWrapperField(this, id_pb.Id, 1));
                                       ^

ReferenceError: id_pb is not defined
    at proto.Project.getId (.../test-proto/build/project_pb.js:173:40)
    at Object.<anonymous> (.../test-proto/test.js:10:30)
    at Module._compile (internal/modules/cjs/loader.js:688:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:699:10)
    at Module.load (internal/modules/cjs/loader.js:598:32)
    at tryModuleLoad (internal/modules/cjs/loader.js:537:12)
    at Function.Module._load (internal/modules/cjs/loader.js:529:3)
    at Function.Module.runMain (internal/modules/cjs/loader.js:741:12)
    at startup (internal/bootstrap/node.js:285:19)
    at bootstrapNodeJSCore (internal/bootstrap/node.js:739:3)

The id_pb import is missing from project_pb. Alternatively, it could be referenced via task_pb, since it is declared as a public import in task.proto.

The error does not show up if id.proto is explicitly imported in project.proto.

JS: Type not convertible to Uint8Array

Hi!

I've been trying to get protobuf deserialization working in JavaScript but I'm hitting a wall. I have a small sample message just to test that everything is working:

message Sample {
    string text = 1;
    uint32 number = 2;
    bool truthiness = 3;
}

When I create a new Sample message (without deserializing it from anything), the message works fine.
Serializing it to bytes yields a 24-byte Uint8Array.

However, if I try to create a Sample message from existing bytes, I get an assertion error: Type not convertible to Uint8Array. This is what I've tried:

var Sample = // loaded in from require;
var dataBytes = // some existing uint8array;
var sample = new Sample(dataBytes);
var sample2 = Sample.deserializeBytes(dataBytes);

The first one (sample) fails silently, but the message is broken from then on (sample.getText() returns 10 instead of the expected string; other fields return garbage as well).

The second one throws the assertion error. I've verified that the Uint8Array I'm receiving and testing with is the same one I get from manually building one via serializeBytes in the console.

Here's my data (screenshot from 2016-03-13 omitted).

Am I perhaps just using this wrong?
Thanks in advance!
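For comparison, a sketch of the usage the generated API normally expects, assuming the standard jspb method names (deserializeBinary/serializeBinary); the constructor's argument is a field-data array, not serialized bytes, which would explain the garbage values from new Sample(dataBytes). Getter names below follow the usual jspb pattern and are otherwise an assumption:

var sample = Sample.deserializeBinary(dataBytes);   // not `new Sample(dataBytes)`
console.log(sample.getText(), sample.getNumber(), sample.getTruthiness());

var roundTripped = sample.serializeBinary();        // Uint8Array of wire bytes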

toObject method of my message class uses an undeclared variable

What version of protobuf and what language are you using?
Version: 3.7.0
Language: Javascript

What operating system (Linux, Windows, ...) and version?

Ubuntu 16.04.5 LTS (Bitnami LAMP)
MacOS Mojave 10.14.2

What runtime / compiler are you using (e.g., python version or gcc version)
node js v11.10.1

What did you do?
Steps to reproduce the behavior:
file minimal.proto:

syntax = "proto3";

message M1 {
  uint64 x = 1;
}

message M2 {
  M1 m = 1;
}

run protoc --js_out="import_style=commonjs,binary:." minimal.proto
What did you expect to see

proto.M2.toObject = function(includeInstance, msg) {
  var f, obj = {
    m: (f = msg.getM()) && proto.M1.toObject(includeInstance, f)
  };

  if (includeInstance) {
    obj.$jspbMessageInstance = msg;
  }
  return obj;
};

(or at least that's what it was doing when it worked with version 3.5)

What did you see instead?

proto.M2.toObject = function(includeInstance, msg) {
  var obj = {
    m: (f = msg.getM()) && proto.M1.toObject(includeInstance, f)
  };

  if (includeInstance) {
    obj.$jspbMessageInstance = msg;
  }
  return obj;
};

note that the declaration of f is missing in this code

Make sure you include information that can help us debug (full error message, exception listing, stack trace, logs).

Anything else we should know about your project / environment

Generate Descriptor For Proto Messages in Javascript.

I'm trying to generate the descriptor for proto messages in JavaScript so that I can use the field names to create dynamic forms.

Java has a .getDescriptor() method. Is there an equivalent in JavaScript? Any pointers would be helpful.

Thanks.

Assertion failed

I have a problem with JavaScript protobuf (version 3.4.0).

in the following code

var message = MyMessage.deserializeBinary(bytesFromServer);

When I have a repeated uint64 field I get an "Assertion failed" error; in other cases I have no problem.

I know bytesFromServer is correct; other languages can deserialize the message.

JavaScript: Closure compilation at the advanced level (WIP)

Hey @haberman!

I've recently found your TODO and started adding those proper @export etc. annotations. My current work can be found here (WIP - I'll surely squash the commits in the future).

I'm actually nearly done now: all required annotations are added, and the gulp file as well as the tests have been modified and fixed. The problem I have now is that you are using goog.asserts, which gets stripped out at the advanced Closure level, and there is no CLI switch to disable that (there is only setRemoveClosureAsserts(boolean) in the API).

So... What should be the way forward now? Should I simply copy/paste the required parts from goog.asserts into this project, write a modified closure-compiler CLI or simply drop all asserts in the dist release? 🙂

cc: @xfxyjwf

P.S.: Without the asserts at the advanced level, the gzipped dist release is about 60% smaller (28 kB down to 11 kB) and consistently about 30-50% faster. So if we continue dropping all non-essential asserts we could gain a non-trivial speed improvement (or alternatively release a "fast" version, since incoming data in the browser almost always comes from trusted entities anyway).

JS generator missing jspb.generate_from_object option

The comment for jspb.Message.GENERATE_FROM_OBJECT states:

 *     NOTE: By default no protos actually have a fromObject method. You need to
 *     add the jspb.generate_from_object options to the proto definition to
 *     activate the feature.

However, no such option is currently implemented in the open-source version. I see the code that generates output for this option (GenerateClassFromObject), but it is never invoked.

~/protobuf$ grep -r GenerateClassFromObject .
./src/google/protobuf/compiler/js/js_generator.cc:void Generator::GenerateClassFromObject(const GeneratorOptions& options,
./src/google/protobuf/compiler/js/js_generator.h:  void GenerateClassFromObject(const GeneratorOptions& options,

Can the jspb.generate_from_object option be open sourced?

JavaScript documentation missing methods

Hi!

I noticed the JS generated code documentation (https://developers.google.com/protocol-buffers/docs/reference/javascript-generated#repeated) was missing information about the following methods that are being generated by the latest protoc:

addFooList(): Adds a value at the specified index in the list

And for repeated bytes:

getFooList_asU8(): Gets the array of bytes, with each value converted to Uint8Array
getFooList_asB64(): Gets the array of bytes, with each value converted to Base64 strings

I looked around in this repo and couldn't find anywhere to contribute this; could someone point me in the right direction?

JavaScript: Package and type metadata in generated files

I need to be able to get the package name and type name for any given Protobuf object, but I can't find this metadata anywhere in the code. In e.g. the C# plugin, this is available in the ProtosReflection.FileDescriptor property.

Am I missing something or is this functionality just not available?

js/{closure,commonjs}: create a new js_module option

What language does this apply to?

If it's a proto syntax change, is it for proto2 or proto3? proto3.
If it's about a generated code change, what programming language? JavaScript/CommonJS, but it could apply to ES6 or Closure as well.

Describe the problem you are trying to solve.

We are generating code for multiple packages that we want to publish independently but with references between them. Currently the generator creates tightly coupled relative paths between the folders.

Describe the solution you'd like

An option like Go's go_package or C#'s csharp_namespace where I can put the import path, in this case the JS module.

I think an option called js_module would do the trick.

Describe alternatives you've considered

Overriding the generated code.

GCC 8: js_generator.cc:492:1: warning: control reaches end of non-void function [-Wreturn-type]

New warning:

../../3rdparty/chromium/third_party/protobuf/src/google/protobuf/compiler/js/js_generator.cc: In function 'std::__cxx11::string google::protobuf::compiler::js::{anonymous}::JSByteGetterSuffix(google::protobuf::compiler::js::{anonymous}::BytesMode)':
../../3rdparty/chromium/third_party/protobuf/src/google/protobuf/compiler/js/js_generator.cc:492:1: warning: control reaches end of non-void function [-Wreturn-type]
 }

Pre-processed version of the source file:

string JSByteGetterSuffix(BytesMode bytes_mode) {
  switch (bytes_mode) {
    case BYTES_DEFAULT:
      return "";
    case BYTES_B64:
      return "B64";
    case BYTES_U8:
      return "U8";
    default:
      
# 490 "../../3rdparty/chromium/third_party/protobuf/src/google/protobuf/compiler/js/js_generator.cc" 3 4
     (static_cast<void> (0))
# 490 "../../3rdparty/chromium/third_party/protobuf/src/google/protobuf/compiler/js/js_generator.cc"
                  ;
  }
}

I believe the proper fix is to replace https://github.com/google/protobuf/blob/master/src/google/protobuf/compiler/js/js_generator.cc#L523 with __builtin_unreachable.

node.js and typescript

Great to see JS support for browsers and Node.js. However, for those of us writing TypeScript (for either the browser or Node.js) it would be very helpful to have a TypeScript definition file (a .d.ts file).

Possible?

JavaScript bug - deserializeBinary turns a uint64 field into an unsafe, incorrect number

I use protoc-3.6.0-osx-x86_64 to convert xx.proto to xx_pb.js, and npm install google-protobuf (^3.6.0).

InitConnect.proto

syntax = "proto2";
package InitConnect;

import "Common.proto";

message C2S
{
	required int32 clientVer = 1; // Client version number: clientVer = (number before ".") * 100 + (number after "."). E.g. version 1.1 gives 1 * 100 + 1 = 101, version 2.21 gives 2 * 100 + 21 = 221
	required string clientID = 2; // Unique client identifier; there is no prescribed generation rule, the client just has to guarantee uniqueness
	optional bool recvNotify = 3; // Whether this connection receives event notifications (market status, trading needs re-unlock, etc.); true means FutuOpenD pushes these notifications to this connection, false means it does not
}

message S2C
{
	required int32 serverVer = 1; // FutuOpenD version number
	required uint64 loginUserID = 2; // Niuniu user ID logged in to FutuOpenD
	required uint64 connID = 3; // Connection ID of this connection, its unique identifier
	required string connAESKey = 4; // Key for subsequent AES-encrypted communication on this connection; a fixed 16-byte string
	required int32 keepAliveInterval = 5; // Heartbeat keep-alive interval
}

message Request
{
	required C2S c2s = 1;
}

message Response
{
	required int32 retType = 1 [default = -400]; // Return result; see the Common.RetType enum definition
	optional string retMsg = 2; // Description of the return result
	optional int32 errCode = 3; // Error code; clients normally use retType and retMsg to determine the result and details, errCode is only logged and used for reconciliation when certain protocols fail
	
	optional S2C s2c = 4;
}

I convert InitConnect.proto to InitConnect_pb.js and import it into my Node.js code:

const InitConnectMessage = require("InitConnect_pb.js")

// I get PackageBody_Buffer from the network, then call deserializeBinary

let ResponseBody_Object = InitConnectMessage.Response.deserializeBinary(PackageBody_Buffer);
ResponseBody_Object = ResponseBody_Object.toObject();
console.log(ResponseBody_Object)

The log is :

{ rettype: 0,
  retmsg: '',
  errcode: 0,
  s2c:
   { serverver: 100,
     loginuserid: 2131552,
     connid: 167141872653707230,
     connaeskey: 'D7279ECAA5CF51E8',
     keepaliveinterval: 10 } } 

As a result, the deserialized connid is 167141872653707230, which is greater than the JS Number.MAX_SAFE_INTEGER (9_007_199_254_740_991).
Moreover, the deserialized connid is not equal to the original connid!
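A small illustration of the underlying limitation (this is standard JavaScript number behavior, not specific to protobuf); a commonly suggested mitigation, mentioned in another issue on this page, is the [jstype = JS_STRING] field option so the value is surfaced as a string:

// JS numbers are IEEE-754 doubles, so integers above MAX_SAFE_INTEGER lose
// precision; the connid above is already past that limit.
const connId = 167141872653707230;          // value from the log above
console.log(Number.MAX_SAFE_INTEGER);       // 9007199254740991
console.log(Number.isSafeInteger(connId));  // false -> the exact value cannot survive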

Global scope pollution in google-protobuf npm library cause conflict between sub dependencies

The google-protobuf npm library pollutes the global scope through the global.proto variable.
This pollution causes conflicts between different versions of google-protobuf in the same Node process, as they overwrite each other.

Reproduce:

If dependency B is required after A, the older version of google-protobuf overwrites the newer version's global.proto in the process.

JS: Possible to deserialize an object from JSON?

Part of the proto3 spec is being able to serialize proto models to and from JSON. I see a way to serialize a proto object to JSON, but not to parse it back. Is this possible? And if so, how should I go about it?

AssertionError: Assertion failed

When I was trying to use the deserializeBinary method, I got this error: AssertionError: Assertion failed.

let dataEncoded = JSON.stringify(request.toObject());
var options = {
  hostname: host,
  port: port,
  path: '/test/request',
  method: 'POST',
  headers: {
    'Content-Length': Buffer.from(dataEncoded).length,
    'Content-Type': 'application/json'
  }
};

let req1 = http.request(
  options,
  res => {
    let data1 = '';
    res.on('error', reject);
    res.on('data', chunk => {
      data1 += chunk.toString();
    });
    res.on('end', () => {
      if (res.statusCode === 200) {
        var u8_2 = new Uint8Array(atob(data1).split('').map(function (c) {
          return c.charCodeAt(0);
        }));
        var convertedUser = interop.TestResponse.deserializeBinary(u8_2);
        console.log(convertedUser.toObject());
      }
    });
  }
);

Error:

AssertionError: Assertion failed

dependencies:

"atob": "^2.1.2",
"google-protobuf": "^3.6.1",
"http": "*",
"protocol-buffers": "^4.1.0"

[JS] Getter function for singular message fields

The getter for singular message fields in JavaScript returns undefined when the field is not set. This is inconsistent with the behavior of the getters for such fields in all other languages.

This is very inconvenient and a source of bugs, because the programmer always has to check for undefined values on nested messages (imagine multiple levels of nested sub-messages). It could be fixed by introducing a new getter such as msg.getFooOrDefault().
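Until something like that exists, the defensive pattern ends up looking roughly like this (a sketch; proto.Child and the accessor names are illustrative):

// Fall back to a default instance when the sub-message is unset...
var child = msg.getChild() || new proto.Child();
var value = child.getValue();

// ...or guard each level explicitly when chaining through nested messages.
var inner = msg.getOuter() && msg.getOuter().getInner();
var leaf = inner ? inner.getLeaf() : undefined;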

Include standalone comments at top of file in generated code

What language does this apply to?

Go, Node.js.

Describe the problem you are trying to solve.

I'm trying to pass information (in the form of a standalone comment block at the top of the proto file) to our build tooling, which only has access to the generated files but not the source proto files.

Describe the solution you'd like

Include standalone leading comments at the top of the proto file in the generated pb and grpc_pb files.

Describe alternatives you've considered

I worked around this limitation by attaching comments to various definitions (message, service, etc) to get them included in the generated files.

JavaScript: repeated enum AssertionError when calling deserializeBinary

What version of protobuf and what language are you using?
Version: v3.6.1
Language: Javascript with Typescript

What operating system (Linux, Windows, ...) and version?
Windows 10 x64

What runtime / compiler are you using (e.g., python version or gcc version)
I'm using tsc (the TypeScript compiler).
To generate the .ts files from the .proto I'm using protoc's --plugin option with the protoc-gen-ts plugin.

What did you do?
Steps to reproduce the behavior:

  1. Create a protobuf message that contains a repeated enum field
  2. Send a Protobuf binary from my Android App to my Nodejs Express Server
  3. Attempt to deserialize the buffer in my Express Server

What did you expect to see
a Protobuf Object

What did you see instead?
AssertionError on the repeated enum field.

Make sure you include information that can help us debug (full error message, exception listing, stack trace, logs).
My custom repeated enum field is repeated Amenities amenities = 22; and there is another one at field number 23.

My protobuf buffer, base64 encoded:
ChoyMDE4LTEwLTA2VDIwOjM5OjI0LjI2NDAyORDcCxgCIAEq8gEKA0dhbBIIU2h0ZW5nZWwangFodHRwczovL3Njb250ZW50LmZzZHYzLTEuZm5hLmZiY2RuLm5ldC92L3QzMS4wLTgvMjYyMjEwMTZfMjA5NzUwMzIwNjk0NTg0Nl82NDE3NzA1NTMyMzI5NDYzMDcyX28uanBnP19uY19jYXQ9MCZvaD1iMjE4Y2ZiNDE2MTFlMWY4YTJjZDA0ODM1MzZjZGI0OCZvZT01QzA5MEFCNCAYggE9SGkgaW0gR2FsLCAyNCBsaWtlcyBhIGxvdCBvZiBzdHVmZiBsaWtlIEZsdXV0dGVyIGFuZCBNb3JlZWVlLjIkMDg3ZWQwM2EtMzJmMS00ZDljLWE1YWEtODQ5Y2U3NDliNWQ1QkkKFteq15wg15DXkdeZ15EgLSDXmdek15USGdep15PXqNeV16og16jXldeY16nXmdec15MaAjU2MVbozfKGCEBAOTaVoBqKY0FAUANYAYIBGjIwMTgtMTAtMDZUMjA6Mzk6MjQuMjY0MjMziAEykAEBmAEEoAEAqgElIEdvb2QgbmV3IGFwYXJ0bWVudCB0byBqb2luIHRvbyAhISEgIbABA7ABArgBAbgBArgBAMABEsgBJNABAA==

my error:
[TypeScript] AssertionError: Assertion failed
[TypeScript]     at new goog.asserts.AssertionError (C:\Users\Gal\Desktop\roommiesusersmicroservice\node_modules\google-protobuf\google-protobuf.js:98:603)
[TypeScript]     at Object.goog.asserts.doAssertFailure_ (C:\Users\Gal\Desktop\roommiesusersmicroservice\node_modules\google-protobuf\google-protobuf.js:99:126)
[TypeScript]     at Object.goog.asserts.assert (C:\Users\Gal\Desktop\roommiesusersmicroservice\node_modules\google-protobuf\google-protobuf.js:99:385)
[TypeScript]     at jspb.BinaryReader.readPackedField_ (C:\Users\Gal\Desktop\roommiesusersmicroservice\node_modules\google-protobuf\google-protobuf.js:359:71)
[TypeScript]     at jspb.BinaryReader.readPackedEnum (C:\Users\Gal\Desktop\roommiesusersmicroservice\node_modules\google-protobuf\google-protobuf.js:364:287)
[TypeScript]     at Function.proto.ProtobufApartment.deserializeBinaryFromReader (C:\Users\Gal\Desktop\roommiesusersmicroservice\dist\protobuf\generated\apartment_pb.js:210:67)
[TypeScript]     at Function.proto.ProtobufApartment.deserializeBinary (C:\Users\Gal\Desktop\roommiesusersmicroservice\dist\protobuf\generated\apartment_pb.js:116:34)

Anything else we should know about your project / environment
It crashes when it tries to read my repeated enum field.

Please help me, thanks!

[JavaScript] Add insertion points to the compiler

What language does this apply to?
This feature request is for proto3 and JavaScript language generator.

Describe the problem you are trying to solve.

The problem I am trying to solve is that when using gRPC (and the generated proto files) in combination with create-react-app, ESLint complains about all the generated proto JS code, with the following error reported all over the place:
Line 29: 'proto' is not defined no-undef

Describe the solution you'd like

Since I already have my own plugin to help generate some other things along with JS, it'd be nice to have access to insertion points for the top and bottom of the file, so I could add the following myself:

/* eslint-disable */ and /* eslint-enable */ respectively

Describe alternatives you've considered

An alternative would be to have an option in the compiler that would add those for me, for instance:

    protoc 
      --js_out="import_style=commonjs,disable_eslint=true,binary:$OUTPUT_PATH"
      ${currentProto}

Unable to access extensions from JavaScript

We use code like this to encode a version of the gRPC protocol:

import "google/protobuf/descriptor.proto";

extend google.protobuf.FileOptions {
    string protocol_version = 51000;
}

option (protocol_version) = "0.1.0";

And in Python, one can access it using:

import test_service_pb2
version = test_service_pb2.DESCRIPTOR.GetOptions().Extensions[test_service_pb2.protocol_version]

But it seems there is no way to access extensions from JavaScript code?

JS compiler generates invalid code

protoc 3.4.0 generated this:

/** @param {string} value */
proto.PastebinMixin.prototype.setText = function(value) {
  jspb.Message.setProto3StringField(this, 1, value);
};

But I have to manually change it to this for it to work:

/** @param {string} value */
proto.PastebinMixin.prototype.setText = function(value) {
  jspb.Message.setField(this, 1, value);
};

js codegen does not produce helper functions for Any when using closure imports

What version of protobuf and what language are you using?
Version: v3.6.1 (binary downloaded from official releases)
Language: JavaScript, with Closure-style imports

What operating system (Linux, Windows, ...) and version?

Linux x86_64

What runtime / compiler are you using (e.g., python version or gcc version)
NA

What did you do?
Steps to reproduce the behavior:

root@ec834298ad98:/github/grpc-web-base/third_party/grpc/third_party/protobuf# mkdir -p /tmp/proto_out/closure /tmp/proto_out/commonjs
root@ec834298ad98:/github/grpc-web-base/third_party/grpc/third_party/protobuf# /tmp/bin/protoc -I /github/grpc-web-base/third_party/grpc/third_party/protobuf --js_out=import_style=closure,binary:/tmp/proto_out/closure google/protobuf/any.proto
root@ec834298ad98:/github/grpc-web-base/third_party/grpc/third_party/protobuf# /tmp/bin/protoc -I /github/grpc-web-base/third_party/grpc/third_party/protobuf --js_out=import_style=commonjs,binary:/tmp/proto_out/commonjs/ google/protobuf/any.proto

root@ec834298ad98:/github/grpc-web-base/third_party/grpc/third_party/protobuf# find /tmp/proto_out/ -type f
/tmp/proto_out/commonjs/google/protobuf/any_pb.js
/tmp/proto_out/closure/any.js

root@ec834298ad98:/github/grpc-web-base/third_party/grpc/third_party/protobuf# grep -rHin -C 5  unpack /tmp/proto_out/
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-240-};
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-241-
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-242-
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-243-/**
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-244- * @template T
/tmp/proto_out/commonjs/google/protobuf/any_pb.js:245: * Unpacks this Any into the given message object.
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-246- * @param {function(Uint8Array):T} deserialize Function that will deserialize
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-247- *     the binary data properly.
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-248- * @param {string} name The expected type name of this message object.
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-249- * @return {?T} If the name matched the expected name, returns the deserialized
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-250- *     object, otherwise returns null.
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-251- */
/tmp/proto_out/commonjs/google/protobuf/any_pb.js:252:proto.google.protobuf.Any.prototype.unpack = function(deserialize, name) {
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-253-  if (this.getTypeName() == name) {
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-254-    return deserialize(this.getValue_asU8());
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-255-  } else {
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-256-    return null;
/tmp/proto_out/commonjs/google/protobuf/any_pb.js-257-  }

What did you expect to see
I expect to see pack/unpack functions in the Closure-style import output.

What did you see instead?
I do not see pack/unpack. However, when I use commonjs outputs, I see pack/unpack as demonstrated in the repro steps.

Make sure you include information that can help us debug (full error message, exception listing, stack trace, logs).

Anything else we should know about your project / environment

@TeBoring had some investigation into the root cause:

https://github.com/protocolbuffers/protobuf/blob/master/src/google/protobuf/compiler/js/js_generator.cc#L3682
This line is only called in GenerateFile.
https://github.com/protocolbuffers/protobuf/blob/master/src/google/protobuf/compiler/js/js_generator.cc#L3515
When you specify closure style, options.output_mode() == GeneratorOptions::kOneOutputFilePerType, and GenerateFile is not called.

Proto3 integration with JS

Hello Team,

I have used proto on java side successfully.

However, now I need to use it on the JS side, or more specifically in the browser. I am able to generate the proto_pb.js file for a proto3 source file, but the steps mentioned on the site below do not work (I tried the commonjs style): require is not supported in browsers, so you have to use browserify/webpack etc. When I tried browserify, it complained that the module (proto_pb.js) was not found. No luck yet.

https://github.com/google/protobuf/tree/master/js

Could you please share the complete steps for JS/browser-side proto3 integration?

Thanks in advance
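For what it's worth, the commonjs flow the linked page describes amounts to writing a small CommonJS entry point and bundling it; a sketch (file names, the message type, and the exported global are illustrative, and the bundler invocation may differ for your setup, e.g. browserify exports.js -o bundle.js, then loading bundle.js via a script tag):

// exports.js -- entry point to be bundled for the browser
var messages = require('./proto_pb.js');   // the protoc-generated commonjs module

var msg = new messages.MyMessage();        // hypothetical message type
window.MyProto = {
  messages: messages,
  encode: function (m) { return m.serializeBinary(); }
};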

Failed to parse server response when using jstype = JS_STRING

What version of protobuf and what language are you using?
Version: v3.6.1
Language: Javascript

What operating system (Linux, Windows, ...) and version?
Mac/Windows/Linux

What runtime / compiler are you using (e.g., python version or gcc version)
Node v10.15.1 (as well as older versions)

What did you do?
Steps to reproduce the behavior:
Set jstype = JS_STRING on a uint64 field within a response message definition. Make a grpc call that includes this field in the response.

What did you expect to see
A successful response.

What did you see instead?
An INTERNAL error: "failed to parse server response".

Anything else we should know about your project / environment
This issue does not seem to be present in v3.4.0; downgrading makes the problem go away.

Related issue: grpc/grpc-node#789

Use Google Protocol Buffer library in Javascript

I have buffer data like CgAaygUKAm1kEAItAAAAAC0AACBCLQAANEItAABIQi0AAFxCLQ. Second step: decode it from base64. Third step: convert the base64-decoded data into a Uint8Array to get the data as bytes. Fourth step: pass the Uint8Array bytes to the deserializeBinary(bytes) function. The final result is string bytes, which is not the expected result; the expected result is float data. Result is: " md-- B-4B-HB-\B-pB-B-B-B-B-B-B-B-Ȃ-҂-
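For reference, a sketch of the decode path as it is usually written in the browser (the class name MyMessage and the base64Data variable are placeholders; the buffer pasted above looks truncated, so it is not reused here):

// base64 string -> binary string -> Uint8Array -> generated message
var raw = atob(base64Data);                 // base64Data: the full encoded buffer
var bytes = new Uint8Array(raw.length);
for (var i = 0; i < raw.length; i++) {
  bytes[i] = raw.charCodeAt(i);
}
var msg = MyMessage.deserializeBinary(bytes);
// Read the float fields through the generated getters rather than treating
// the payload as a string:
console.log(msg.toObject());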

Using JS typed arrays in JoinFloat

Hi,

Has anyone thought of using JS typed arrays in the splitFloat and joinFloat methods for the JavaScript library (https://github.com/google/protobuf/blob/master/js/binary/utils.js#L388)?

One could imagine adding something like:

jspb.utils.joinFloat64 = function(bitsLow, bitsHigh) {
    if (window.Uint32Array) {
        // Pack the two 32-bit halves (low word first on little-endian) and
        // reinterpret the backing buffer as a 64-bit float.
        var words = Uint32Array.of(bitsLow, bitsHigh);
        var data = new Float64Array(words.buffer);
        return data[0];
    }
    // ... fall back to the existing bit-manipulation implementation
}

And vice versa for the splitFloat method. This might increase readability (and, depending on whether it taps into the JS engine's native implementation, maybe even speed).

Happy to write the PR, just curious if it would be useful / appreciated!

Let me know,

ReferenceError: goog is not defined

I followed the steps in the docs and converted the .proto messages to a myproto_libs.js file. While importing the file, the following error occurred:

ReferenceError: goog is not defined

[Javascript] Incorrectly generated import paths when using `google/api` or `google/type`

/cc @zchee

What version of protobuf and what language are you using?
Version: v3.6.0
Language: Javascript

What operating system (Linux, Windows, ...) and version?
Docker Image: node:9

What runtime / compiler are you using (e.g., python version or gcc version)

What did you do?
Steps to reproduce the behavior:
When using imports for google/api/... or google/type/..., the imports generated for JavaScript resolve to incorrect paths.
Example:

import "google/api/annotations.proto";
import "google/type/date.proto";

What did you expect to see
Correctly resolved import paths

What did you see instead?
Incorrectly resolved import paths.
Generated output (these directories don't exist):

var google_api_annotations_pb = require('../../../../../../google/api/annotations_pb.js');
var google_type_date_pb = require('../../../../../../google/type/date_pb.js');

We resolved these imports by generating the files ourselves (from the https://github.com/googleapis/googleapis repository) and copying them to the correct folder, but inside annotations_pb.js another incorrect import is generated, so the problem carries through to any imports further down:

// inside of annotations_pb.js 
var google_api_http_pb = require('../../../../../../google/api/http_pb.js');

How can we resolve this? Generating the files ourselves and copying them is fine, but the incorrectly generated imports inside those files are something we unfortunately cannot fix on our end, so any help is appreciated.

Anything else we should know about your project / environment
We use grpc_tools_node_protoc and grpc_tools_node_protoc_ts plugins to generate typescript and grpc code.

Command we execute:

# Variables:
GRPC_GATEWAY_REPO=github.com/grpc-ecosystem/grpc-gateway/third_party/googleapis
GOOGLE_API_REPO=github.com/googleapis/googleapis
VENDOR_DIR=vendor
PROTOC_OPTION=-I. -I$(VENDOR_DIR) -I$(VENDOR_DIR)/$(GRPC_GATEWAY_REPO) -I$(VENDOR_DIR)/$(GOOGLE_API_REPO)
TYPESCRIPT_OUTPUT=gen/typescript

# Command:
grpc_tools_node_protoc $(PROTOC_OPTION) --js_out=import_style=commonjs,binary:$(TYPESCRIPT_OUTPUT) --grpc_out=$(TYPESCRIPT_OUTPUT) --plugin=protoc-gen-grpc=$(shell which grpc_tools_node_protoc_plugin)

# followed by this command for typescript generation:
protoc $(PROTOC_OPTION) --plugin=protoc-gen-ts=`which protoc-gen-ts` --ts_out=$(TYPESCRIPT_OUTPUT)

JS Unexpected behavior when serializing/deserializing

Hello, I'm opening this issue because I've seen unexpected behavior and want to discuss it and see what the best pattern is.

Given the following message:

message MyMessage {
    string pkid = 1;
}

If I set a number into pkid, I'm able to retrieve it correctly. But once I serialize and then deserialize the message, the value gets coerced to an empty string:

> protoConfig.setPkid(123);
> protoConfig.getPkid();
123
> MyMessage.deserializeBinary((protoConfig.serializeBinary())).getPkid()
""

I wasn't expecting the field to be transformed silently when the message is serialized. What I would expect, in order of preference:

  1. Setting the field with setPkid to crash (or warn) because the type is not what was expected
  2. Serialization to crash (or warn) because the type is not expected
  3. The type to be coerced using toString, which would set it to '123'

I understand the suggested behaviors may have performance implications, but I'm not sure what the rationale for the current behavior is, because it still forces the user to do type checks before setting fields on a protobuf message in JavaScript. IMO silently changing the value of a field when serializing a message is dangerous, and I would aim for data correctness first.
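In the meantime, those type checks end up living in application code; a minimal sketch of such a guard (the helper name is illustrative, and it assumes the setPkid setter from the snippet above):

// Wrap the setter so a non-string pkid fails loudly instead of being
// silently coerced during serialization.
function setPkidChecked(msg, value) {
  if (typeof value !== 'string') {
    throw new TypeError('pkid must be a string, got ' + typeof value);
  }
  msg.setPkid(value);
}

setPkidChecked(protoConfig, String(123)); // ok
// setPkidChecked(protoConfig, 123);      // throws TypeError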
