rules_proto (v3)


Bazel starlark rules for building protocol buffers +/- gRPC ✨.


@build_stack_rules_proto provides:

  1. Rules for driving the protoc tool within a bazel workspace.
  2. A gazelle extension that generates rules based on the content of your .proto files.
  3. A repository rule that runs gazelle in an external workspace.
  4. Example setups for a variety of languages.

Table of Contents

Getting Started

WORKSPACE Boilerplate

load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

# Release: v3.1.0
# TargetCommitish: master
# Date: 2024-02-13 05:53:39 +0000 UTC
# URL: https://github.com/stackb/rules_proto/releases/tag/v3.1.0
# Size: 1995581 (2.0 MB)
http_archive(
    name = "build_stack_rules_proto",
    sha256 = "ee7a11d66e7bbc5b0f7a35ca3e960cb9a5f8a314b22252e19912dfbc6e22782d",
    strip_prefix = "rules_proto-3.1.0",
    urls = ["https://github.com/stackb/rules_proto/archive/v3.1.0.tar.gz"],
)
register_toolchains("@build_stack_rules_proto//toolchain:standard")

This prepares protoc for the proto_compile rule. For simple setups, consider @build_stack_rules_proto//toolchain:prebuilt to skip compilation of the tool.
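For example, to opt into the prebuilt tool, register that toolchain instead (a drop-in replacement for the `register_toolchains` line above; the label is taken from the paragraph above):

```starlark
# WORKSPACE: use the prebuilt protoc toolchain rather than compiling protoc from source
register_toolchains("@build_stack_rules_proto//toolchain:prebuilt")
```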

NOTE: if you plan to hand-write your BUILD.bazel rules (rather than using the gazelle build file generator), STOP HERE. You'll need to provide typical proto dependencies such as @rules_proto and @com_google_protobuf (use the macros below if desired), but no additional core dependencies are needed at this point.


load("@build_stack_rules_proto//deps:core_deps.bzl", "core_deps")

core_deps()

This brings in @io_bazel_rules_go, @bazel_gazelle, and @rules_proto if you don't already have them.

The gazelle extension and associated golang dependencies are optional; you can write proto_compile and other derived rules by hand. For gazelle support, carry on.


load(
    "@io_bazel_rules_go//go:deps.bzl",
    "go_register_toolchains",
    "go_rules_dependencies",
)

go_rules_dependencies()

go_register_toolchains(version = "1.18.2")

Standard boilerplate for @io_bazel_rules_go.


load("@bazel_gazelle//:deps.bzl", "gazelle_dependencies")

gazelle_dependencies()

Standard boilerplate for @bazel_gazelle.


load("@build_stack_rules_proto//:go_deps.bzl", "gazelle_protobuf_extension_go_deps")

gazelle_protobuf_extension_go_deps()

This brings in @com_github_emicklei_proto. github.com/emicklei/proto is used by the gazelle extension to parse proto files.


load("@build_stack_rules_proto//deps:protobuf_core_deps.bzl", "protobuf_core_deps")

protobuf_core_deps()

This brings in @com_google_protobuf and friends if you don't already have them.

Gazelle Setup

load("@bazel_gazelle//:def.bzl", "gazelle", "gazelle_binary")

gazelle_binary(
    name = "gazelle-protobuf",
    languages = [
        "@bazel_gazelle//language/go",
        "@bazel_gazelle//language/proto",
        # must be after the proto extension (order matters)
        "@build_stack_rules_proto//language/protobuf",
    ],
)

gazelle(
    name = "gazelle",
    gazelle = ":gazelle-protobuf",
)

The gazelle setup is typically placed in the root BUILD.bazel file.


Gazelle Configuration

The gazelle extension can be configured using "build directives" and/or a YAML file.

Build Directives

Gazelle is configured by special comments in BUILD files called directives.

Gazelle works by visiting each package in your workspace; configuration is done "on the way in" whereas actual rule generation is done "on the way out". Gazelle configuration of a subdirectory inherits that from its parents. As such, directives placed in the root BUILD.bazel file apply to the entire workspace.
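For example, a directive placed at the root is inherited by every package, while a subdirectory can override it for its subtree (hypothetical package path; both directive forms appear later in this document):

```starlark
# BUILD.bazel (workspace root): applies to the whole workspace
# gazelle:proto_language cpp rule proto_cc_library

# proto/javaapi/BUILD.bazel: overrides the inherited config for this subtree
# gazelle:proto_language cpp enabled false
```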

# gazelle:proto_rule proto_compile implementation stackb:rules_proto:proto_compile

# gazelle:proto_plugin cpp implementation builtin:cpp
# gazelle:proto_plugin protoc-gen-grpc-cpp implementation grpc:grpc:cpp

# gazelle:proto_rule proto_cc_library implementation stackb:rules_proto:proto_cc_library
# gazelle:proto_rule proto_cc_library deps @com_google_protobuf//:protobuf
# gazelle:proto_rule proto_cc_library visibility //visibility:public
# gazelle:proto_rule grpc_cc_library implementation stackb:rules_proto:grpc_cc_library
# gazelle:proto_rule grpc_cc_library deps @com_github_grpc_grpc//:grpc++
# gazelle:proto_rule grpc_cc_library deps @com_github_grpc_grpc//:grpc++_reflection
# gazelle:proto_rule grpc_cc_library visibility //visibility:public

# gazelle:proto_language cpp plugin cpp
# gazelle:proto_language cpp plugin protoc-gen-grpc-cpp
# gazelle:proto_language cpp rule proto_compile
# gazelle:proto_language cpp rule proto_cc_library
# gazelle:proto_language cpp rule grpc_cc_library

Let's unpack this a bit:

  • gazelle:proto_plugin cpp implementation builtin:cpp associates the name cpp with a piece of golang code that implements the protoc.Plugin interface. The extension maintains a registry of these actors (the gazelle extension ships with a number of them out of the box, but you can also write your own). The core responsibility of a protoc.Plugin implementation is to predict the files that a protoc plugin tool will generate for an individual proto_library rule. The implementation has full read access to the protoc.Files in the proto_library so it can predict whether a file will be generated and where it will appear in the filesystem (specifically, relative to the bazel execution root during a proto_compile action).
  • gazelle:proto_rule proto_compile implementation stackb:rules_proto:proto_compile associates the name proto_compile with a piece of golang code that implements the protoc.LanguageRule interface. The extension maintains a registry of rule implementations. Similarly, the extension ships with a bunch of them out of the box, but you can write your own custom rules as well. The core responsibility of a protoc.LanguageRule implementation is to construct a gazelle rule.Rule based upon a proto_library rule and the set of plugins configured with it.
  • gazelle:proto_language cpp plugin cpp instantiates a protoc.LanguageConfig having the name cpp and adds the cpp plugin to it. The language configuration bundles plugins and rules together.
  • gazelle:proto_rule grpc_cc_library deps @com_github_grpc_grpc//:grpc++ configures the rule such that all generated rules will have that dependency.

+/- intent modifiers. Although not pictured in this example, many of the directives take an intent modifier to turn configuration on or off. For example, to suppress the gRPC C++ rule in the package //proto/javaapi, put a directive like gazelle:proto_language cpp rule -grpc_cc_library in proto/javaapi/BUILD.bazel (note the - symbol preceding the name). To suppress the language entirely, use gazelle:proto_language cpp enabled false.
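The intent-modifier case described above can be written out as (hypothetical package path proto/javaapi, taken from the example in the text):

```starlark
# proto/javaapi/BUILD.bazel
# the leading '-' removes the grpc_cc_library rule from the inherited 'cpp' language
# gazelle:proto_language cpp rule -grpc_cc_library
```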

YAML Configuration

You can also configure the extension using a YAML file. This is semantically similar to adding gazelle directives to the root BUILD file; the YAML configuration applies to all downstream packages. The equivalent YAML config for the above directives looks like:

plugins:
  - name: cpp
    implementation: builtin:cpp
  - name: protoc-gen-grpc-cpp
    implementation: grpc:grpc:cpp
rules:
  - name: proto_compile
    implementation: stackb:rules_proto:proto_compile
    visibility:
      -  //visibility:public
  - name: proto_cc_library
    implementation: stackb:rules_proto:proto_cc_library
    visibility:
      -  //visibility:public
    deps:
      - "@com_google_protobuf//:protobuf"
  - name: grpc_cc_library
    implementation: stackb:rules_proto:grpc_cc_library
    visibility:
      -  //visibility:public
    deps:
      - "@com_github_grpc_grpc//:grpc++"
      - "@com_github_grpc_grpc//:grpc++_reflection"
languages:
  - name: "cpp"
    plugins:
      - cpp
      - protoc-gen-grpc-cpp
    rules:
      - proto_compile
      - proto_cc_library
      - grpc_cc_library

A YAML config is particularly useful in conjunction with the proto_repository rule, for example to apply a set of custom plugins over the googleapis/googleapis repo.

To use this in a gazelle rule, specify -proto_configs in args (comma-separated list):

gazelle(
    name = "gazelle",
    gazelle = ":gazelle-protobuf",
    args = [
        "-proto_configs=example/config.yaml",
    ],
)

Running Gazelle

Now that we have the WORKSPACE setup and gazelle configured, we can run gazelle:

$ bazel run //:gazelle -- proto/

To restrict gazelle to only a particular subdirectory example/routeguide/:

$ bazel run //:gazelle -- example/routeguide/

Gazelle should now have generated something like the following:

load("@rules_proto//proto:defs.bzl", "proto_library")
load("@build_stack_rules_proto//rules/cc:grpc_cc_library.bzl", "grpc_cc_library")
load("@build_stack_rules_proto//rules/cc:proto_cc_library.bzl", "proto_cc_library")
load("@build_stack_rules_proto//rules:proto_compile.bzl", "proto_compile")

proto_library(
    name = "routeguide_proto",
    srcs = ["routeguide.proto"],
    visibility = ["//visibility:public"],
)

proto_compile(
    name = "routeguide_cpp_compile",
    outputs = [
        "routeguide.grpc.pb.cc",
        "routeguide.grpc.pb.h",
        "routeguide.pb.cc",
        "routeguide.pb.h",
    ],
    plugins = [
        "@build_stack_rules_proto//plugin/builtin:cpp",
        "@build_stack_rules_proto//plugin/grpc/grpc:protoc-gen-grpc-cpp",
    ],
    proto = "routeguide_proto",
)

proto_cc_library(
    name = "routeguide_cc_library",
    srcs = ["routeguide.pb.cc"],
    hdrs = ["routeguide.pb.h"],
    visibility = ["//visibility:public"],
    deps = ["@com_google_protobuf//:protobuf"],
)

grpc_cc_library(
    name = "routeguide_grpc_cc_library",
    srcs = ["routeguide.grpc.pb.cc"],
    hdrs = ["routeguide.grpc.pb.h"],
    visibility = ["//visibility:public"],
    deps = [
        ":routeguide_cc_library",
        "@com_github_grpc_grpc//:grpc++",
        "@com_github_grpc_grpc//:grpc++_reflection",
    ],
)

Rules like @build_stack_rules_proto//rules/cc:proto_cc_library.bzl%proto_cc_library are nearly always very thin wrappers for the "real" rule. For example, here is the implementation in proto_cc_library.bzl:

load("@rules_cc//cc:defs.bzl", "cc_library")

def proto_cc_library(**kwargs):
    cc_library(**kwargs)

An implementation detail of gazelle itself is that two different language extensions should not claim the same load namespace; to prevent potential conflicts with other gazelle extensions, reusing the name @rules_cc//cc:defs.bzl%cc_library is undesirable.

Build Rules

The core of stackb/rules_proto contains two build rules:

Rule Description
proto_compile Executes the protoc tool.
proto_plugin Provides static protoc plugin-specific configuration.

proto_compile

Example:

load("@rules_proto//proto:defs.bzl", "proto_library")
load("@build_stack_rules_proto//rules:proto_compile.bzl", "proto_compile")

proto_library(
    name = "thing_proto",
    srcs = ["thing.proto"],
    deps = ["@com_google_protobuf//:timestamp_proto"],
)

proto_plugin(name = "cpp")

proto_compile(
    name = "thing_cpp_compile",
    outputs = [
        "thing.pb.cc",
        "thing.pb.h",
    ],
    plugins = [":cpp"],
    proto = "thing_proto",
)

Takeaways:

  • A proto_library rule forms the basis for other language-specific derived rules.
  • proto_library is provided by bazelbuild/rules_proto.
  • A proto_compile rule references a single proto_library target.
  • The plugins attribute is a list of labels to proto_plugin targets.
  • The outputs attribute names the files that will be generated by the protoc invocation.
  • The proto extension provided by bazel-gazelle is responsible for generating proto_library.

proto_plugin

proto_plugin primarily provides the plugin tool executable. The example above is the simplest case, where the plugin is built into protoc itself; no separate plugin tool is required, and the proto_plugin rule degenerates into just a name.

It is possible to pass additional plugin-specific options via the options attribute on the proto_plugin rule (e.g. name = "foo", options = ["bar"]), but the use-case for this is narrow. Generally it is preferred to say # gazelle:proto_plugin foo option bar so that the option can be interpreted during a gazelle run.
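A minimal sketch of the static form (the load statement is omitted; the name and options values are the illustrative placeholders from the paragraph above, not real plugin names):

```starlark
# equivalent of passing "bar" statically rather than via a gazelle directive
proto_plugin(
    name = "foo",
    options = ["bar"],
)
```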

proto_compiled_sources

proto_compiled_sources is used when you prefer to check the generated files into source control. This may be necessary for legacy reasons, during an initial Bazel migration, or to support better IDE integration.

The shape of a proto_compiled_sources rule is essentially identical to proto_compile, with one exception: generated sources are named in the srcs attribute rather than outputs.

For example, a proto_compiled_sources named //example/thing:proto_go_sources is a macro that generates three rules:

  1. bazel build //example/thing:proto_go_sources emits the generated files.
  2. bazel run //example/thing:proto_go_sources.update copies the generated files back into the source package.
  3. bazel test //example/thing:proto_go_sources_test asserts the source files are identical to generated files.

In this scenario, 2. is used to build the generated files (in the bazel-bin/ output tree) and copy the example/thing/thing.pb.go back into place, where it will be committed under source control. 3. is used to prevent drift: if a developer modifies thing.proto and neglects to run the .update target, the test will fail in CI.
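Based on the description above (same shape as proto_compile, but with srcs instead of outputs), such a rule might be sketched as follows; the load path is an assumption, and the plugin label and target names mirror the proto_compile examples elsewhere in this document:

```starlark
# assumed load path, following the pattern of the other rules in this ruleset
load("@build_stack_rules_proto//rules:proto_compiled_sources.bzl", "proto_compiled_sources")

proto_compiled_sources(
    name = "proto_go_sources",
    srcs = ["thing.pb.go"],  # checked-in generated files are named in srcs, not outputs
    plugins = ["@build_stack_rules_proto//plugin/golang/protobuf:protoc-gen-go"],
    proto = "thing_proto",
)
```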

proto_compile_assets

The macro proto_compile_assets aggregates a list of dependencies (which provide ProtoCompileInfo) into a single runnable target that copies files in bulk.

For example, bazel run //proto:assets will copy all the generated .pb.go files back into the source tree:

load("@build_stack_rules_proto//rules:proto_compile_assets.bzl", "proto_compile_assets")

proto_compile_assets(
    name = "assets",
    deps = [
        "//proto/api/v1:proto_go_compile",
        "//proto/api/v2:proto_go_compile",
        "//proto/api/v3:proto_go_compile",
    ],
)

The output_mappings attribute

Consider the following rule within the package example/thing:

proto_compile(
    name = "thing_go_compile",
    output_mappings = ["thing.pb.go=github.com/stackb/rules_proto/example/thing/thing.pb.go"],
    outputs = ["thing.pb.go"],
    plugins = ["@build_stack_rules_proto//plugin/golang/protobuf:protoc-gen-go"],
    proto = "thing_proto",
)

This rule is declaring that a file bazel-bin/example/thing/thing.pb.go will be output when the action is run. When we bazel build //example/thing:thing_go_compile, the file is indeed created.

Let's temporarily comment out the output_mappings attribute and rebuild:

proto_compile(
    name = "thing_go_compile",
    # output_mappings = ["thing.pb.go=github.com/stackb/rules_proto/example/thing/thing.pb.go"],
    outputs = ["thing.pb.go"],
    plugins = ["@build_stack_rules_proto//plugin/golang/protobuf:protoc-gen-go"],
    proto = "thing_proto",
)
$ bazel build //example/thing:thing_go_compile
ERROR: /github.com/stackb/rules_proto/example/thing/BUILD.bazel:54:14: output 'example/thing/thing.pb.go' was not created

What happened? Let's add a debugging attribute verbose = True on the rule: this will print debugging information and show the bazel sandbox before and after the protoc tool is invoked:

proto_compile(
    name = "thing_go_compile",
    # output_mappings = ["thing.pb.go=github.com/stackb/rules_proto/example/thing/thing.pb.go"],
    outputs = ["thing.pb.go"],
    plugins = ["@build_stack_rules_proto//plugin/golang/protobuf:protoc-gen-go"],
    proto = "thing_proto",
    verbose = True,
)
$ bazel build //example/thing:thing_go_compile
##### SANDBOX BEFORE RUNNING PROTOC
./bazel-out/host/bin/external/com_google_protobuf/protoc
./bazel-out/darwin-opt-exec-2B5CBBC6/bin/external/com_github_golang_protobuf/protoc-gen-go/protoc-gen-go_/protoc-gen-go
./bazel-out/darwin-fastbuild/bin/example/thing/thing_proto-descriptor-set.proto.bin
./bazel-out/darwin-fastbuild/bin/external/com_google_protobuf/timestamp_proto-descriptor-set.proto.bin

##### SANDBOX AFTER RUNNING PROTOC
./bazel-out/darwin-fastbuild/bin/github.com/stackb/rules_proto/example/thing/thing.pb.go

So, the file was created, but not in the location we wanted. In this case the protoc-gen-go plugin is not "playing nice" with Bazel. Because this thing.proto has option go_package = "github.com/stackb/rules_proto/example/thing;thing";, the output location is no longer based on the package. This is a problem, because Bazel semantics disallow declaring a File outside its package boundary. As a result, we need to do a mv ./bazel-out/darwin-fastbuild/bin/github.com/stackb/rules_proto/example/thing/thing.pb.go ./bazel-out/darwin-fastbuild/bin/example/thing/thing.pb.go to relocate the file into its expected location before the action terminates.

Therefore, the output_mappings attribute is a list of entries that map file locations want=got relative to the action execution root. It is required when the actual output location does not match the desired location. This can occur if the proto package statement does not match the Bazel package path, or in special circumstances specific to the plugin itself (like go_package).

Repository Rules

proto_repository

From an implementation standpoint, this is very similar to the go_repository rule. Both can download external files and then run the gazelle generator over the downloaded files. Example:

proto_repository(
    name = "googleapis",
    build_directives = [
        "gazelle:resolve proto google/api/http.proto //google/api:http_proto",
    ],
    build_file_generation = "clean",
    build_file_proto_mode = "file",
    reresolve_known_proto_imports = True,
    proto_language_config_file = "@//:rules_proto_config.yaml",
    strip_prefix = "googleapis-02710fa0ea5312d79d7fb986c9c9823fb41049a9",
    type = "zip",
    urls = ["https://codeload.github.com/googleapis/googleapis/zip/02710fa0ea5312d79d7fb986c9c9823fb41049a9"],
)

Takeaways:

  • The urls, strip_prefix and type behave similarly to http_archive.
  • build_file_proto_mode is the same as the go_repository attribute of the same name; additionally the value file is permitted, which generates a proto_library for each individual proto file.
  • build_file_generation is the same as the go_repository attribute of the same name; additionally the value clean is supported. For example, googleapis already has a set of BUILD files; the clean mode removes all the existing build files before generating new ones.
  • build_directives behaves the same as in go_repository. Resolve directives are needed here because the gazelle language/proto extension hardcodes a proto import like google/api/http.proto to resolve to the @go_googleapis workspace; here we want to make sure that http.proto resolves to this external workspace instead.
  • proto_language_config_file is an optional label pointing to a valid config.yaml file to configure this extension.
  • reresolve_known_proto_imports is a boolean attribute that has special meaning for the googleapis repository. Due to the fact that the builtin gazelle "proto" extension has hardcoded logic for what googleapis deps look like, additional work is needed to override that. With this sample configuration, the following build command succeeds:
$ bazel build @googleapis//google/api:annotations_cc_library
Target @googleapis//google/api:annotations_cc_library up-to-date:
  bazel-bin/external/googleapis/google/api/libannotations_cc_library.a
  bazel-bin/external/googleapis/google/api/libannotations_cc_library.so

Another example using the Bazel repository:

load("@build_stack_rules_proto//rules/proto:proto_repository.bzl", "proto_repository")

proto_repository(
    name = "bazelapis",
    build_directives = [
        "gazelle:exclude third_party",
        "gazelle:proto_language go enabled true",
        "gazelle:proto_language closure enabled true",
        "gazelle:prefix github.com/bazelbuild/bazelapis",
    ],
    build_file_expunge = True,
    build_file_proto_mode = "file",
    cfgs = ["//proto:config.yaml"],
    imports = [
        "@googleapis//:imports.csv",
        "@protoapis//:imports.csv",
        "@remoteapis//:imports.csv",
    ],
    strip_prefix = "bazel-02ad3e3bc6970db11fe80f966da5707a6c389fdd",
    type = "zip",
    urls = ["https://codeload.github.com/bazelbuild/bazel/zip/02ad3e3bc6970db11fe80f966da5707a6c389fdd"],
)

Takeaways:

  • The build_directives are setting the gazelle:prefix for the language/go plugin; two proto_language configs named in the proto/config.yaml are being enabled.
  • build_file_expunge means remove all existing BUILD files before generating new ones.
  • build_file_proto_mode = "file" means make a separate proto_library rule for every .proto file.
  • cfgs = ["//proto:config.yaml"] means use the configuration in this YAML file as a base set of gazelle directives. It is easier/more consistent to share the same config.yaml file across multiple proto_repository rules.

The last attribute, imports = ["@googleapis//:imports.csv", ...], requires extra explanation. When the proto_repository gazelle process finishes, it writes a file named imports.csv in the root of the external workspace. This file records the association between import statements and bazel labels, much like a gazelle:resolve directive:

# GENERATED FILE, DO NOT EDIT (created by gazelle)
# lang,imp.lang,imp,label
go,go,github.com/googleapis/gapic-showcase/server/genproto,@googleapis//google/example/showcase/v1:compliance_go_proto
go,go,google.golang.org/genproto/firestore/bundle,@googleapis//google/firestore/bundle:bundle_go_proto
go,go,google.golang.org/genproto/googleapis/actions/sdk/v2,@googleapis//google/actions/sdk/v2:account_linking_go_proto

The imports attribute thus assists gazelle in resolving imports: when gazelle is preparing a go_library rule and finds a main.go file with an import of google.golang.org/genproto/firestore/bundle, it knows to put @googleapis//google/firestore/bundle:bundle_go_proto in the rule deps.

To take advantage of this mechanism in the default workspace, use the proto_gazelle rule.

proto_gazelle

proto_gazelle is not a repository rule: it is just like the typical gazelle rule, but with extra deps resolution superpowers. It is discussed here because it works in conjunction with proto_repository:

load("@build_stack_rules_proto//rules:proto_gazelle.bzl", "DEFAULT_LANGUAGES", "proto_gazelle")

proto_gazelle(
    name = "gazelle",
    cfgs = ["//proto:config.yaml"],
    command = "update",
    gazelle = ":gazelle-protobuf",
    imports = [
        "@bazelapis//:imports.csv",
        "@googleapis//:imports.csv",
        "@protoapis//:imports.csv",
        "@remoteapis//:imports.csv",
    ],
)

In this example, we are again setting the base gazelle config using the YAML file (the same one used for the proto_repository rules). We are also importing resolve information from four external sources.

With this setup, we can simply place an import statement like import "src/main/java/com/google/devtools/build/lib/buildeventstream/proto/build_event_stream.proto"; in a foo.proto file in the default workspace, and gazelle will automagically figure out the import dependency tree spanning @bazelapis, @remoteapis, @googleapis, and the well-known types from @protoapis.

This works for any proto_language, with any set of custom protoc plugins.

golden_filegroup

golden_filegroup is a utility macro for golden file testing. It works like a native filegroup, but adds .update and .test targets. Example:

load("@build_stack_rules_proto//rules:golden_filegroup.bzl", "golden_filegroup")

# golden_filegroup asserts that generated files named in 'srcs' are
# identical to the ones checked into source control.
#
# Usage:
#
# $ bazel build :golden        # not particularly useful, just a regular filegroup
#
# $ bazel test  :golden.test   # checks that generated files are identical to
# ones in git (for CI)
#
# $ bazel run   :golden.update # copies the generated files into source tree
# (then 'git add' to your PR if it looks good)
golden_filegroup(
    name = "golden",
    srcs = [
        ":some_generated_file1.json",
        ":some_generated_file2.json",
    ],
)

Plugin Implementations

The plugin name is an opaque string, but by convention plugins use maven-esque artifact identifiers that follow a GitHub-style org:repo:plugin_name pattern.

Plugin
builtin:cpp
builtin:csharp
builtin:java
builtin:js:closure
builtin:js:common
builtin:objc
builtin:php
builtin:python
builtin:pyi
builtin:ruby
grpc:grpc:cpp
grpc:grpc:protoc-gen-grpc-python
golang:protobuf:protoc-gen-go
grpc:grpc-go:protoc-gen-go-grpc
grpc:grpc-java:protoc-gen-grpc-java
grpc:grpc-node:protoc-gen-grpc-node
grpc:grpc-web:protoc-gen-grpc-web
gogo:protobuf:protoc-gen-combo
gogo:protobuf:protoc-gen-gogo
gogo:protobuf:protoc-gen-gogofast
gogo:protobuf:protoc-gen-gogofaster
gogo:protobuf:protoc-gen-gogoslick
gogo:protobuf:protoc-gen-gogotypes
gogo:protobuf:protoc-gen-gostring
grpc-ecosystem:grpc-gateway:protoc-gen-grpc-gateway
scalapb:scalapb:protoc-gen-scala
stackb:grpc.js:protoc-gen-grpc-js
stephenh:ts-proto:protoc-gen-ts-proto
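Any of these implementation identifiers can be wired up via the proto_plugin directive described earlier; for example (the short name ts-proto is an arbitrary choice):

```starlark
# associate a short name with a shipped plugin implementation
# gazelle:proto_plugin ts-proto implementation stephenh:ts-proto:protoc-gen-ts-proto
```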

Rule Implementations

The rule name is an opaque string, but by convention rules use maven-esque artifact identifiers that follow a GitHub-style org:repo:rule_name pattern.

Rule
stackb:rules_proto:grpc_cc_library
stackb:rules_proto:grpc_closure_js_library
stackb:rules_proto:grpc_java_library
stackb:rules_proto:grpc_nodejs_library
stackb:rules_proto:grpc_web_js_library
stackb:rules_proto:grpc_py_library
stackb:rules_proto:proto_cc_library
stackb:rules_proto:proto_closure_js_library
stackb:rules_proto:proto_compile
stackb:rules_proto:proto_compiled_sources
stackb:rules_proto:proto_descriptor_set
stackb:rules_proto:proto_go_library
stackb:rules_proto:proto_java_library
stackb:rules_proto:proto_nodejs_library
stackb:rules_proto:proto_py_library
bazelbuild:rules_scala:scala_proto_library

Please consult the example/ directory and unit tests for additional detail.

Writing Custom Plugins and Rules

Custom plugin implementations and rule implementations can be written in golang or starlark. Golang implementations are statically compiled into the final gazelle_binary whereas starlark plugins are evaluated at gazelle runtime.

+/- of golang implementations

  • + Full power of a statically compiled language, the golang stdlib, and external dependencies.
  • + Easier to test.
  • + API not experimental.
  • - Cannot be used in a proto_repository rule without forking stackb/rules_proto.
  • - Initial setup harder, often housed within your own custom gazelle extension.

Until a dedicated tutorial is available, please consult the source code for examples.

+/- of starlark implementations

  • + More familiar to developers with starlark experience but not golang.
  • + Easier setup (*.star files in your gazelle repository)
  • + Possible to use in conjunction with the proto_repository rule.
  • - Limited API: can only reference state that has been already configured via gazelle directives.
  • - Not possible to implement stateful design.
  • - No standard library.

Until a dedicated tutorial is available, please consult the reference example in example/testdata/starlark_java.

History

The original rules_proto was https://github.com/pubref/rules_proto. This was redesigned around the proto_library rule and moved to https://github.com/stackb/rules_proto.

Following earlier experiments with aspects, this ruleset was forked to https://github.com/rules-proto-grpc/rules_proto_grpc. Aspect-based compilation was featured for quite a while there but has recently been deprecated.

Maintaining stackb/rules_proto and its polyglot set of languages in its original v0-v1 form became a full-time (unpaid) job. The main issue is that the {LANG}_{PROTO|GRPC}_library rules are tightly bound to a specific set of dependencies, so rules_proto users are tightly bound to the specific labels named by those rules. This is a problem for the maintainer, who must keep the dependencies current. It is also a problem for rule consumers: upgrading becomes increasingly difficult as assumptions and dependencies drift.

With stackb/rules_proto in its v2 gazelle-based form, it is largely dependency-free: other than gazelle and the protoc toolchain, there are no dependencies that you cannot fully control in your own workspace via the gazelle configuration.

The gazelle-based design also makes things simpler and more powerful, because the content of the proto files is the source of truth. Since Bazel does not permit reading or interpreting a file during the scope of an action, the build cannot make decisions based on a file's content. A prime example of this is the go_package option: if it is present, the output location for protoc-gen-go is completely different, so the go_package metadata ultimately needs to be duplicated where the build system can see it.

The gazelle-based approach moves all that messy interpretation and evaluation into a precompiled state; as a result, the work that needs to be done in the action itself is dramatically simplified.

Furthermore, by parsing the proto files it is easy to support complex custom plugins that do things like:

  • Emit no files (only assert/lint).
  • Emit a file only if a specific enum constant is found. With the previous design, this was near impossible. With the v2 design, the protoc.Plugin implementation can trivially perform that evaluation because it is handed the complete proto AST during gazelle evaluation.


rules_proto's Issues

Inconsistent documentation vs implementation of transitive argument

For a number of languages, the readme states that transitive has a default of false: https://github.com/stackb/rules_proto/blame/master/python/README.md#L329

Whilst in the bzl file the argument is ignored and is hardcoded as true:

transitive = True,

To correct for this, two things need doing:

  • The documentation should be updated to state the default is True for affected languages
  • The rules of affected languages should not use hardcoded true; the value should be pulled from kwargs with a default of true: transitive = kwargs.get("transitive", True)

If this sounds correct, I can send this as a PR?
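The fix suggested in the second bullet can be sketched as a wrapper macro (hypothetical body; the macro and rule names and the **kwargs forwarding are assumptions based on the traceback below):

```starlark
def python_proto_library(**kwargs):
    # honor a caller-supplied value, defaulting to True as documented,
    # instead of hardcoding transitive = True inside the macro
    transitive = kwargs.pop("transitive", True)
    python_proto_compile(transitive = transitive, **kwargs)
```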

http_archive must be in the WORKSPACE file

Issue

I'm trying to build a java_proto_library target. It fails with the following error

ERROR: /home/basav/dir/BUILD:6:1: Traceback (most recent call last):
	File "/home/basav/dir/BUILD", line 6
		java_proto_library(name = "target", ..."])
	File "/root/.cache/bazel/_bazel_root/<>/external/build_stack_rules_proto/java/deps.bzl", line 62, in java_proto_library
		java_proto_compile(**kwargs)
	File "/root/.cache/bazel/_bazel_root/<>/external/build_stack_rules_proto/java/deps.bzl", line 55, in java_proto_compile
		protobuf(**kwargs)
	File "/root/.cache/bazel/_bazel_root/<>/external/build_stack_rules_proto/protobuf/deps.bzl", line 12, in protobuf
		com_github_madler_zlib(**kwargs)
	File "/root/.cache/bazel/_bazel_root/<>/external/build_stack_rules_proto/deps.bzl", line 123, in com_github_madler_zlib
		http_archive(name = "com_github_madler_zlib", b..., <3 more arguments>)
http_archive must be in the WORKSPACE file (used by //dir:com_github_madler_zlib)

bazel version

0.23.0

WORKSPACE

load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
http_archive(
    name = "build_stack_rules_proto",
    urls = ["https://github.com/stackb/rules_proto/archive/b93b544f851fdcd3fc5c3d47aee3b7ca158a8841.tar.gz"],
    sha256 = "c62f0b442e82a6152fcd5b1c0b7c4028233a9e314078952b6b04253421d56d61",
    strip_prefix = "rules_proto-b93b544f851fdcd3fc5c3d47aee3b7ca158a8841",
)

load("@build_stack_rules_proto//java:deps.bzl", "java_proto_compile")
java_proto_compile()

load("@build_stack_rules_proto//java:deps.bzl", "java_grpc_compile")
java_grpc_compile()

load("@build_stack_rules_proto//java:deps.bzl", "java_proto_library")
java_proto_library()

load("@build_stack_rules_proto//python:deps.bzl", "python_proto_library")
python_proto_library()

load("@io_bazel_rules_python//python:pip.bzl", "pip_import", "pip_repositories")
pip_repositories()

pip_import(
    name = "protobuf_py_deps",
    requirements = "@build_stack_rules_proto//python/requirements:protobuf.txt",
)

load("@protobuf_py_deps//:requirements.bzl", protobuf_pip_install = "pip_install")
protobuf_pip_install()

load("@build_stack_rules_proto//:deps.bzl", "io_grpc_grpc_java")
io_grpc_grpc_java()

load("@io_grpc_grpc_java//:repositories.bzl", "grpc_java_repositories")
grpc_java_repositories(omit_com_google_protobuf = True)

load("@build_stack_rules_proto//java:deps.bzl", "java_grpc_library")
java_grpc_library()

BUILD file

package(default_visibility = ["//visibility:public"])
load("@build_stack_rules_proto//java:deps.bzl", "java_proto_library")

java_proto_library(
    name = "target",
    deps = [
        "test.proto",
    ],
)

Checksum mismatch - Dart on MacOS

Kudos to @pcj for jumping through GH support hoops to enable the tracker.

This looks like a trivial one, though checksum/hash errors always pique my attention as to why. I'll take it if nobody else chimes in.

Cloned at 8087f91

Ran make

<<SNIP>>
Starting local Bazel server and connecting to it...
fixed /private/var/tmp/_bazel_colinw/1739b7ac93d75b5cb919f64ec72e8af5/external/io_bazel_rules_closure/java/io/bazel/rules/closure/worker/BUILD
fixed /private/var/tmp/_bazel_colinw/1739b7ac93d75b5cb919f64ec72e8af5/external/io_bazel_rules_closure/java/io/bazel/rules/closure/webfiles/server/BUILD
fixed /private/var/tmp/_bazel_colinw/1739b7ac93d75b5cb919f64ec72e8af5/external/io_bazel_rules_closure/java/io/bazel/rules/closure/http/BUILD
fixed /private/var/tmp/_bazel_colinw/1739b7ac93d75b5cb919f64ec72e8af5/external/io_bazel_rules_closure/java/io/bazel/rules/closure/webfiles/BUILD
fixed /private/var/tmp/_bazel_colinw/1739b7ac93d75b5cb919f64ec72e8af5/external/io_bazel_rules_closure/java/io/bazel/rules/closure/BUILD
ERROR: error loading package '': Encountered error while reading extension file 'deps.bzl': no such package '@dart_pub_deps_protoc_plugin//': no such package '@dart_sdk//': java.io.IOException: Error downloading [https://storage.googleapis.com/dart-archive/channels/dev/release/2.1.0-dev.1.0/sdk/dartsdk-macos-x64-release.zip] to /private/var/tmp/_bazel_colinw/1739b7ac93d75b5cb919f64ec72e8af5/external/dart_sdk/dartsdk-macos-x64-release.zip: Checksum was ec1e2f948886962abb36e6bc729a8fd9fbf50e2a7803191a53c6487af00d4046 but wanted b376957f9cd4069443c5880066ff5b5d117f3450393de7ad57f06148e7a6b92c
ERROR: error loading package '': Encountered error while reading extension file 'deps.bzl': no such package '@dart_pub_deps_protoc_plugin//': no such package '@dart_sdk//': java.io.IOException: Error downloading [https://storage.googleapis.com/dart-archive/channels/dev/release/2.1.0-dev.1.0/sdk/dartsdk-macos-x64-release.zip] to /private/var/tmp/_bazel_colinw/1739b7ac93d75b5cb919f64ec72e8af5/external/dart_sdk/dartsdk-macos-x64-release.zip: Checksum was ec1e2f948886962abb36e6bc729a8fd9fbf50e2a7803191a53c6487af00d4046 but wanted b376957f9cd4069443c5880066ff5b5d117f3450393de7ad57f06148e7a6b92c
INFO: Elapsed time: 103.362s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)

Issue with latest version of grpc

Hi,
I'm having problems updating my pin in rules_k8s to com_github_grpc_grpc due to an issue with this repo, on which rules_k8s also depends.
After updating to the latest version of grpc (bazelbuild/rules_k8s#384), my builds are failing with the following error:
(19:18:24) ERROR: /var/lib/buildkite-agent/.cache/bazel/_bazel_buildkite-agent/ec321eb2cc2d0f8f91b676b6d4c66c29/external/build_stack_rules_proto/cpp/BUILD.bazel:12:1: no such target '@com_github_grpc_grpc//:grpc_cpp_plugin': target 'grpc_cpp_plugin' not declared in package '' defined by /var/lib/buildkite-agent/.cache/bazel/_bazel_buildkite-agent/ec321eb2cc2d0f8f91b676b6d4c66c29/external/com_github_grpc_grpc/BUILD and referenced by '@build_stack_rules_proto//cpp:grpc_cpp'
Full error trace: https://buildkite.com/bazel/rules-k8s-k8s/builds/1879#5aaacfdf-f2f5-4c8e-a749-a2efe82bad4c
thanks!

Update bazel-gazelle

The bazel-gazelle version currently specified here can break with --shallow-since:

 git clone '--shallow-since=1551386336 +0000' https://go.googlesource.com/tools /private/var/tmp/_bazel_i868039/7b08591af2b3d71f45f2e4029050db37/external/org_golang_x_tools
Cloning into '/private/var/tmp/_bazel_i868039/7b08591af2b3d71f45f2e4029050db37/external/org_golang_x_tools'...
fatal: Server does not support --shallow-since
fatal: The remote end hung up unexpectedly
+ git clone https://go.googlesource.com/tools /private/var/tmp/_bazel_i868039/7b08591af2b3d71f45f2e4029050db37/external/org_golang_x_tools
Cloning into '/private/var/tmp/_bazel_i868039/7b08591af2b3d71f45f2e4029050db37/external/org_golang_x_tools'...
fatal: unable to access 'https://go.googlesource.com/tools/': The requested URL returned error: 502

Dart: vendor_front_end/lib/src/base/source.dart:1: Error: No such file or directory

I can't generate protobufs for Dart. This is probably a mistake on my part but I'd appreciate some help.

I have a test message and the build in the same folder:

syntax = "proto3";

package vehicle;

message Hello {
  string hello = 1;
}

load("@build_stack_rules_proto//dart:dart_proto_library.bzl", "dart_proto_library")
load("@io_bazel_rules_dart//dart/build_rules:core.bzl", "dart_library")

proto_library(
    name = "vehicle_proto",
    srcs = [
        "messages.proto"
    ]
)

dart_proto_library(
    name = "vechicle_grpc",
    deps = [":vehicle_proto"],
)

My workspace looks like this:

load("@bazel_tools//tools/build_defs/repo:git.bzl", "git_repository")

# Toolchains
git_repository(
    name = "bazel_toolchains",
    remote = "https://github.com/bazelbuild/bazel-toolchains.git",
    commit = "999a4d472d5bf98437ed0a108c874efcfd014f93"
)

# Protocol Buffers
git_repository(
    name = "build_stack_rules_proto",
    remote = "https://github.com/stackb/rules_proto.git",
    commit = "3ef9c0d847f9f23e8086ea1bb8c02231671beb31"
)

# Dart
git_repository(
    name = "io_bazel_rules_dart",
    remote = "https://github.com/dart-lang/rules_dart.git",
    commit = "78a4e1ba257bbe9a9d7a064c8cde8c5317059e17"
)

# Setup
load("@build_stack_rules_proto//dart:deps.bzl", "dart_grpc_compile")
dart_grpc_compile()
load("@build_stack_rules_proto//dart:deps.bzl", "dart_proto_library")
dart_proto_library()
load("@dart_pub_deps_protoc_plugin//:deps.bzl", dart_protoc_plugin_deps = "pub_deps")
dart_protoc_plugin_deps()
load("@io_bazel_rules_dart//dart/build_rules:repositories.bzl", "dart_repositories")
dart_repositories()
load("@io_bazel_rules_dart//dart/build_rules/internal:pub.bzl", "pub_repository")
pub_repository(
  name = "vendor_isolate",
  output = ".",
  package = "isolate",
  version = "2.0.2"
)

However attempting a build results in an error:

INFO: Invocation ID: d1ead863-3c54-4358-9542-b2c7bf2845c6
INFO: Analysed 3 targets (1 packages loaded, 5 targets configured).
INFO: Found 3 targets...
ERROR: /private/var/tmp/_bazel_marcguilera/10b327fd9d5413a300b42a0aa4f3edb4/external/build_stack_rules_proto/dart/BUILD.bazel:32:1: Building Dart VM snapshot <rule context for @build_stack_rules_proto//dart:dart_protoc_plugin> failed (Exit 254) dart failed: error executing command external/dart_sdk/bin/dart '--packages=bazel-out/host/bin/external/build_stack_rules_proto/dart/dart_protoc_plugin.build/dart/dart_protoc_plugin.packages' ... (remaining 2 argument(s) skipped)

Use --sandbox_debug to see verbose messages from the sandbox
bazel-out/host/bin/external/build_stack_rules_proto/dart/dart_protoc_plugin.build/vendor_front_end/lib/src/base/source.dart:1: Error: No such file or directory
INFO: Elapsed time: 1.218s, Critical Path: 0.95s
INFO: 0 processes.
FAILED: Build did NOT complete successfully

Conflicting - or at least confusing - requirements.txt files

python/requirements.txt specifies:

  • grpcio==1.20.0
  • protobuf==3.7.1

python/requirements/grpc.txt specifies:

  • grpcio==1.15.0

python/requirements/protobuf.txt specifies:

  • protobuf==3.6.1

So:

  1. What are these different files used for?
  2. Is it a problem that they don't name the same versions?
  3. What is the relationship between the grpcio version named here and the one named in my own repo's WORKSPACE?
  4. What is the relationship between the protobuf version named here and the one named in my own repo's WORKSPACE? (I.e., which runtime is used, and which protoc is used?)
    • I actually don't have any protobuf repository named in my WORKSPACE though I do have grpc and grpcio repositories, so:
      1. Where is my protobuf coming from (grpc? rules_proto? bazel itself?)
      2. How do I find the versions of the protobuf (as well as grpc, grpcio) used by my code?

(I was unable to find answers to these questions in the docs here or in Bazel's docs, so a possible resolution of this issue might include updating the docs somewhere. I'm sure I'm not the only one with questions like this, or at least I hope it's not just me.)

`proto_source_root` of `proto_library` dependencies is ignored

proto_library dependencies accept a proto_source_root argument, which allows protobuf definitions to be stored anywhere in the tree (there are some other attributes that can also change the behaviour).

When using proto_compile with protos defined this way, an error is thrown that suggests that the proto_source_root is being ignored.
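A minimal repro sketch of the setup that triggers the error (names and layout are illustrative, not taken from a real report):

```starlark
# protos/BUILD.bazel (hypothetical): sources live under protos/, and
# proto_source_root tells protoc to strip that prefix from import paths.
proto_library(
    name = "api_proto",
    srcs = ["api/v1/api.proto"],
    proto_source_root = "protos",
)

# A derived rule such as proto_compile then fails, which suggests the
# stripped root is not being forwarded to the protoc invocation.
```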

Undefined variable for generated closure proto js

Generated code with latest versions of grpc.js and rules_closure yields:

bazel-out/darwin-fastbuild/genfiles/foo/foo_closure_proto.js:774: ERROR - variable f is undeclared
    expiry: (f = msg.getExpiry()) && proto.google.protobuf.Timestamp.toObject(includeInstance, f)
             ^
  ProTip: "JSC_UNDEFINED_VARIABLE" or "checkVars" or "missingSourcesWarnings" or "undefinedVars" can be added to the `suppress` attribute. Alternatively /** @suppress {undefinedVars} */ can be added to the source file.

closure_grpc_library doesn't generate js files for transitive dependency

bazel version: 0.27.0
system: ubuntu 16.04

closure_grpc_library doesn't generate the js files for transitively dependent protos. You can reproduce the errors I got through the following steps:

  1. clone my repository:
git clone [email protected]:manazhao/experimental.git
  2. build the closure grpc targets:
# This service doesn't depend on other protos and the build should succeed.
bazel build protobuf:say_goodbye_service_closure
# Expected output:
# INFO: Analyzed target //protobuf:say_goodbye_service_closure (0 packages loaded, 0 targets configured).
# INFO: Found 1 target...
# Target //protobuf:say_goodbye_service_closure up-to-date (nothing to build)
# INFO: Elapsed time: 0.157s, Critical Path: 0.01s
# INFO: 0 processes.
# INFO: Build completed successfully, 1 total action
# This service depends on "a.proto" and the build will fail.
bazel build protobuf:say_hello_service_closure

# expected output:
# ERROR: /home/manazhao/git/experimental/protobuf/BUILD:28:1: output 'protobuf/say_hello_service_closure_pb_grpc/protobuf/a_grpc_web_pb.js' was not created
# ERROR: /home/manazhao/git/experimental/protobuf/BUILD:28:1: not all outputs were created or valid
# Target //protobuf:say_hello_service_closure failed to build
# Use --verbose_failures to see the command lines of failed build steps.
# INFO: Elapsed time: 0.252s, Critical Path: 0.09s
# INFO: 1 process: 1 linux-sandbox.
# FAILED: Build did NOT complete successfully

Though "transitive" is set to true for "say_hello_service_closure", no js files are generated for the transitive dependency. Any ideas?

python_grpc_library

I have a build file to generate grpc stubs for Java, but I am not sure what changes I should make to get it working for Python. For the Java version, I am using load("@io_grpc_grpc_java//:java_grpc_library.bzl", "java_grpc_library").

It looks like we cannot pass the output of python_proto_library as a dependency to python_grpc_library, correct?

What should I expect to see in bazel-bin when I generate grpc stubs using python_grpc_library function? Here is the java version.

https://github.com/dnosproject/dnos-core-grpc/blob/master/protobuf/proto/BUILD

Thanks,
Adib
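If the stackb rules are in use, an untested sketch of the expected wiring (load path and target names are assumptions based on the other language examples) is to point python_grpc_library at the proto_library itself, not at the output of python_proto_library:

```starlark
load("@build_stack_rules_proto//python:python_grpc_library.bzl", "python_grpc_library")

proto_library(
    name = "events_proto",
    srcs = ["events.proto"],
)

# python_grpc_library generates both the message modules and the grpc
# servicer stubs, so a separate python_proto_library is not needed as
# a dependency here.
python_grpc_library(
    name = "events_py_grpc",
    deps = [":events_proto"],
)
```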

Problematic package path for generated python files

I am in the process of transitioning a project from rules_protobuf to this new library (as suggested) and I seem to have hit a roadblock along the way. I think there is a problem in the way the python packages are consumed and, more importantly, generated. Looking at the Python example, this is how the generated protobuf & grpc libraries are consumed:

from routeguide_pb.example.proto import routeguide_pb2
from routeguide_pb.example.proto import routeguide_pb2_grpc

Which seems a little weird to do. Looking at the rules_protobuf way of doing it, it should have been something that looks like

from example.proto import routeguide_pb2
from example.proto import routeguide_pb2_grpc

which seems more natural to me (personally) because you can import the python library as if it was written inside of example/proto without being aware of how/whether it was generated.

To make things worse, I think this leads to a problem with importing other python packages/modules. Listing everything in sys.path, this is how it looks:

.../bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/pypi__setuptools_40_8_0
.../bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/pypi__six_1_12_0
.../bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/pypi__protobuf_3_6_1
.../bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/pypi__enum34_1_1_6
.../bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/pypi__grpcio_1_15_0
.../bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/pypi__futures_3_2_0
.../bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/build_stack_rules_proto/python/example/routeguide/routeguide_pb
.../bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/build_stack_rules_proto/python/example/routeguide
.../bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/build_stack_rules_proto

Looking inside the routeguide_pb path above, that is where the generated modules are. The path for these generated modules looks like:

.../bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/build_stack_rules_proto/python/example/routeguide/routeguide_pb/example/python/routeguide/routeguide_pb2.py

I do not understand why the files are within example/python/routeguide within this path. When I looked at how rules_protobuf did this, all the pb2.py and pb2_grpc.py files would be directly at .../bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/build_stack_rules_proto/python/example/routeguide/routeguide_pb2.py

This also leads to a problem that is not highlighted by this simple python example. I have a basic example that recreates the problem.

I simply have a proto library under python/example/routeguide/bar/ and I add it as a dependency to python/example/routeguide:server. As per the observation above, this leads to a module generated at the following path

.../bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/build_stack_rules_proto/python/example/routeguide/routeguide_pb2/python/example/routeguide/bar/

Along with .../bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/build_stack_rules_proto/python/example/routeguide/routeguide_pb2 in sys.path.

Now, whenever you would like to import some other module from, say, the python/example package, it collides with the package of the same name in the path mentioned above. In my example, adding another python library python/example/routeguide/foo highlights the problem.

When you run bazel run python/example/routeguide:server, everything runs okay. But, when you run bazel run python/example/routeguide:server_with_bar, it throws an error saying

Traceback (most recent call last): File "/root/.cache/bazel/_bazel_root/7fb78493d4016bfee9e4b3025cb7c4f3/execroot/build_stack_rules_proto/bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/build_stack_rules_proto/python/example/routeguide/server.py", line 22, in <module> from python.example.routeguide.foo import foo ImportError: No module named foo

This is because it cannot find foo on the path ../bazel-out/k8-fastbuild/bin/python/example/routeguide/server.runfiles/build_stack_rules_proto/python/example/routeguide/routeguide_pb2/python/example/routeguide

go rules are incompatible with bazel 0.23

deps.bzl uses an old bazel_gazelle, which is incompatible with Bazel 0.23:

ERROR: /home/zaspire/.cache/bazel/_bazel_zaspire/1010357ee0e0c546a5a0c1dee8f8184b/external/bazel_gazelle/def.bzl:80:22: Traceback (most recent call last):
	File "/home/zaspire/.cache/bazel/_bazel_zaspire/1010357ee0e0c546a5a0c1dee8f8184b/external/bazel_gazelle/def.bzl", line 57
		rule(implementation = _gazelle_runner..., <2 more arguments>)
	File "/home/zaspire/.cache/bazel/_bazel_zaspire/1010357ee0e0c546a5a0c1dee8f8184b/external/bazel_gazelle/def.bzl", line 80, in rule
		attr.label(mandatory = True, allow_single_fil..., ...")
Using cfg = "data" on an attribute is a noop and no longer supported. Please remove it. You can use --incompatible_disallow_data_transition=false to temporarily disable this check.
ERROR: /home/zaspire/.cache/bazel/_bazel_zaspire/1010357ee0e0c546a5a0c1dee8f8184b/external/grpc_ecosystem_grpc_gateway/codegenerator/BUILD.bazel:5:1: error loading package '@grpc_ecosystem_grpc_gateway//': Extension file 'def.bzl' has errors and referenced by '@grpc_ecosystem_grpc_gateway//codegenerator:go_default_library'
ERROR: /home/zaspire/.cache/bazel/_bazel_zaspire/1010357ee0e0c546a5a0c1dee8f8184b/external/grpc_ecosystem_grpc_gateway/codegenerator/BUILD.bazel:5:1: label '@grpc_ecosystem_grpc_gateway//:generators' does not refer to a package group
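Until the pin is updated, one possible workaround (a sketch, relying on the fact that WORKSPACE repository definitions are first-one-wins; fill in a real release URL and checksum) is to declare a newer bazel_gazelle before loading this repo's deps:

```starlark
# WORKSPACE: an http_archive declared here takes precedence over the
# stale bazel_gazelle pin declared later inside rules_proto's deps.bzl.
http_archive(
    name = "bazel_gazelle",
    urls = ["<newer bazel-gazelle release archive URL>"],  # placeholder
    sha256 = "<sha256 of that archive>",  # placeholder
)
```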

Can't depend on a proto in the same Bazel package but with a different proto package name

I'm having trouble trying to depend on a proto in a subdirectory of my bazel package that has a different proto package name.

I've put a minimal broken example here:

https://github.com/jf647/stackb_example

I can build everything in subdirectory foo without issue, as it depends on no other protos:

elocie ➜  stackb_example git:(master) bazel build foo/...
INFO: Invocation ID: 8baafcac-4132-421b-b493-9b1967bcfa71
INFO: Analysed 3 targets (0 packages loaded, 0 targets configured).
INFO: Found 3 targets...
INFO: Elapsed time: 0.324s, Critical Path: 0.00s
INFO: 0 processes.
INFO: Build completed successfully, 1 total action
elocie ➜  stackb_example git:(master) ls -la bazel-stackb_example//bazel-out/darwin-fastbuild/genfiles/foo/foo_go_proto_pb/foo/foo.pb.go
-r-xr-xr-x  1 james  wheel  2377 Jan 28 08:57 bazel-stackb_example//bazel-out/darwin-fastbuild/genfiles/foo/foo_go_proto_pb/foo/foo.pb.go

But if I try to build everything in bar (which depends on foo), I get errors from protoc about inconsistent import paths:

elocie ➜  stackb_example git:(master) bazel build bar/...
INFO: Invocation ID: 42247b9b-02cb-4ce2-b7d5-dbb5a87963ce
INFO: Analysed 3 targets (1 packages loaded, 4 targets configured).
INFO: Found 3 targets...
ERROR: /Users/james/projects/stackb_example/bar/BUILD.bazel:23:1: error executing shell command: '/bin/bash -c bazel-out/host/bin/external/com_google_protobuf/protoc --descriptor_set_out=bazel-out/darwin-fastbuild/genfiles/bar/bar_go_proto_pb/descriptor.source.bin --proto_path=bazel-out/darwin-...' failed (Exit 1) bash failed: error executing command /bin/bash -c ... (remaining 1 argument(s) skipped)

Use --sandbox_debug to see verbose messages from the sandbox
2019/01/28 08:58:51 protoc-gen-gogo: error:inconsistent package import paths: "bar", "foo"
--gogo_out: protoc-gen-gogo: Plugin failed with status code 1.
INFO: Elapsed time: 0.475s, Critical Path: 0.15s
INFO: 3 processes: 3 darwin-sandbox.
FAILED: Build did NOT complete successfully

Using --sandbox_debug, it appears that protoc is being invoked on all of the protos simultaneously:

elocie ➜  stackb_example git:(master) bazel build bar/... --sandbox_debug
INFO: Invocation ID: f5921fcd-eaf3-4d33-9ae8-85fe73836d7a
INFO: Analysed 3 targets (0 packages loaded, 0 targets configured).
INFO: Found 3 targets...
ERROR: /Users/james/projects/stackb_example/bar/BUILD.bazel:23:1: error executing shell command: '/bin/bash -c bazel-out/host/bin/external/com_google_protobuf/protoc --descriptor_set_out=bazel-out/darwin-fastbuild/genfiles/bar/bar_go_proto_pb/descriptor.source.bin --proto_path=bazel-out/darwin-...' failed (Exit 1) sandbox-exec failed: error executing command
  (cd /private/var/tmp/_bazel_james/adc497d6faec319414998cd6b215e16f/execroot/__main__ && \
  exec env - \
    TMPDIR=/var/folders/gm/kk0_399j7_7cdd6qh1j_xsbr0000gn/T/ \
  /usr/bin/sandbox-exec -f /private/var/tmp/_bazel_james/adc497d6faec319414998cd6b215e16f/sandbox/darwin-sandbox/1/sandbox.sb /var/tmp/_bazel_james/install/b9adaa5c006b96c0300b56da1ec2ef08/_embedded_binaries/process-wrapper '--timeout=0' '--kill_delay=15' /bin/bash -c 'bazel-out/host/bin/external/com_google_protobuf/protoc --descriptor_set_out=bazel-out/darwin-fastbuild/genfiles/bar/bar_go_proto_pb/descriptor.source.bin --proto_path=bazel-out/darwin-fastbuild/genfiles/bar/bar_go_proto_pb --include_imports --include_source_info --gogo_out=Mgoogle/protobuf/any.proto=github.com/gogo/protobuf/types,Mgoogle/protobuf/duration.proto=github.com/gogo/protobuf/types,Mgoogle/protobuf/struct.proto=github.com/gogo/protobuf/types,Mgoogle/protobuf/timestamp.proto=github.com/gogo/protobuf/types,Mgoogle/protobuf/wrappers.proto=github.com/gogo/protobuf/types:bazel-out/darwin-fastbuild/genfiles/bar/bar_go_proto_pb --plugin=protoc-gen-gogo=bazel-out/host/bin/external/com_github_gogo_protobuf/protoc-gen-gogo/darwin_amd64_stripped/protoc-gen-gogo bazel-out/darwin-fastbuild/genfiles/bar/bar_go_proto_pb/bar/bar.proto bazel-out/darwin-fastbuild/genfiles/bar/bar_go_proto_pb/foo/foo.proto')
2019/01/28 08:59:11 protoc-gen-gogo: error:inconsistent package import paths: "bar", "foo"
--gogo_out: protoc-gen-gogo: Plugin failed with status code 1.
INFO: Elapsed time: 0.339s, Critical Path: 0.08s
INFO: 0 processes.
FAILED: Build did NOT complete successfully

(a little hard to read, but you can see we have one protoc-gen-gogo invocation with two args: bar.proto and foo.proto)

This appears not to be supported (see golang/protobuf#39 for a rather drawn-out discussion). It looks like a rewrite of protoc-gen-go might be in the works, but for now I'm wondering if it might be easier to just process each proto_library on its own.

Why `go_deps`?

For a protobuf foo that depends on google.protobuf.Any, the go_proto_library has to be generated like this:

go_proto_library(
    name = "go_foo_proto",
    go_deps = [
        "@com_github_golang_protobuf//ptypes/any:go_default_library",
    ],
    deps = [":foo_proto"],
    importpath = "...",
)

I checked the examples for C++ / Python really quickly; this explicit dependency is not the case for those languages. Why is this required for Go? Is there a plan to remove it?
As soon as I have a bar that depends on foo, I have to specify the go_deps for Any again in go_bar_proto. It's very contagious...

python output directories don't match proto directories with dashes

I run into the following error when trying to use a python_proto_library with protos that are in a directory with dashes:

ERROR: /Projects/thing/protos/api/thing/v1/BUILD:66:1: output 'protos/api/thing/v1/thingservice_proto_py_pb/protoc-gen-swagger/options/annotations_pb2.py' was not created
ERROR: /Projects/thing/protos/api/thing/v1/BUILD:66:1: output 'protos/api/thing/v1/thingservice_proto_py_pb/protoc-gen-swagger/options/openapiv2_pb2.py' was not created
ERROR: /Projects/thing/protos/api/thing/v1/BUILD:66:1: not all outputs were created or valid

The expected output should be set to use underscores instead of dashes:

##### SANDBOX AFTER RUNNING PROTOC

./bazel-out/k8-fastbuild/genfiles/protos/api/thing/v1/thingervice_proto_py_pb/protoc-gen-swagger/options/openapiv2.proto
./bazel-out/k8-fastbuild/genfiles/protos/api/thing/v1/thingervice_proto_py_pb/protoc-gen-swagger/options/annotations.proto

./bazel-out/k8-fastbuild/genfiles/protos/api/thing/v1/thingervice_proto_py_pb/protoc_gen_swagger/options/openapiv2_pb2.py
./bazel-out/k8-fastbuild/genfiles/protos/api/thing/v1/thingervice_proto_py_pb/protoc_gen_swagger/options/annotations_pb2.py
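A sketch of the likely fix (the helper name is hypothetical): the rule's predicted Python outputs should mimic protoc's mangling, which replaces '-' with '_' in the directories of generated modules, since dashes are not legal in Python package names.

```starlark
# Hypothetical helper for predicting python_proto_library outputs:
# "protoc-gen-swagger/options/annotations.proto"
#   -> "protoc_gen_swagger/options/annotations_pb2.py"
def _py_output_path(proto_src):
    base = proto_src[:-len(".proto")]
    return base.replace("-", "_") + "_pb2.py"
```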

Potential name conflicts when depending on external protos

Hi,

I have a use case that references proto_library targets defined in external repositories, e.g. @io_bazel.

Here is how to reproduce the issue.

  1. Pull in the dependencies of rules_proto for python.

  2. Create tools/cpp/analysis/BUILD with the following content:

py_proto_library(
    name = 'extra_actions_base_py_proto',
    deps = ['@io_bazel//src/main/protobuf:extra_actions_base_proto'],
)

The py_proto_library is the one imported from rules_proto/python.

The extra_actions_base_py_proto target, when built, yields the file

build/genfiles/tools/cpp/analysis/extra_actions_base_py_proto_pb/src/main/protobuf/extra_actions_base_pb2.py

I see two issues:

  • This path is probably too long. This could be argued. I won't touch this in this issue.
  • There is a potential name conflict issue. I will explain more on this.

Looking through the code base of rules_proto, I figured that you really wanted me to import the generated python proto library like the following:

from src.main.protobuf import extra_actions_base_pb2

The assumption is that the native.py_library() target generated by py_proto_library automatically adds name_pb to the imports attribute so that the above import works.

However, here is the issue I ran into:

By chance, my repository also has a src directory:

src/
├── algorithm
├── base
├── __configure_deploy__.py
├── deploy_config.yaml
├── protos
├── __pycache__
└── ui

The python target I want to create depends on both the aforementioned py_proto_library target (imported as extra_actions_base_pb2) and a py_library target in the src/base directory of my own repository:

py_binary(
    name = 'gen_cmd_py',
    srcs = ['gen_cmd.py'],
    main = 'gen_cmd.py',
    deps = [
        ':extra_actions_base_py_proto',
        '//src/base/python:arg_parse',
        '//src/base/python:util',
    ],
)

The gen_cmd.py script looks like the following:

from src.main.protobuf import extra_actions_base_pb2
from src.base.python import arg_parse
from src.base.python import util

Well, this target builds correctly. But at runtime, it fails with the following error:

    from src.base.python import arg_parse
ModuleNotFoundError: No module named 'src.base'

After digging around, I found that the PYTHONPATH (i.e., sys.path) has the following directories:

['/home/zhongming/git/LogiOcean/tools/cpp/analysis',
 '/home/zhongming/.cache/bazel/_bazel_zhongming/2aea414f4d422ad78870bb37da8563ec/execroot/logi/bazel-out/k8-fastbuild/bin/tools/cpp/analysis/gen_cmd_py.runfiles',
 '/home/zhongming/.cache/bazel/_bazel_zhongming/2aea414f4d422ad78870bb37da8563ec/execroot/logi/bazel-out/k8-fastbuild/bin/tools/cpp/analysis/gen_cmd_py.runfiles/com_google_protobuf/python',
 '/home/zhongming/.cache/bazel/_bazel_zhongming/2aea414f4d422ad78870bb37da8563ec/execroot/logi/bazel-out/k8-fastbuild/bin/tools/cpp/analysis/gen_cmd_py.runfiles/logi/tools/cpp/analysis/extra_actions_base_py_proto_pb',
 '/home/zhongming/.cache/bazel/_bazel_zhongming/2aea414f4d422ad78870bb37da8563ec/execroot/logi/bazel-out/k8-fastbuild/bin/tools/cpp/analysis/gen_cmd_py.runfiles/com_google_protobuf',
 '/home/zhongming/.cache/bazel/_bazel_zhongming/2aea414f4d422ad78870bb37da8563ec/execroot/logi/bazel-out/k8-fastbuild/bin/tools/cpp/analysis/gen_cmd_py.runfiles/six',
 '/home/zhongming/.cache/bazel/_bazel_zhongming/2aea414f4d422ad78870bb37da8563ec/execroot/logi/bazel-out/k8-fastbuild/bin/tools/cpp/analysis/gen_cmd_py.runfiles/logi',
 '/usr/lib/python36.zip',
 '/usr/lib/python3.6',
 '/usr/lib/python3.6/lib-dynload',
 '/usr/local/lib/python3.6/dist-packages',
 '/usr/local/lib/python3.6/dist-packages/python_dateutil-2.7.3-py3.6.egg',
 '/usr/local/lib/python3.6/dist-packages/swagger_client-1.0.0-py3.6.egg',
 '/usr/lib/python3/dist-packages']

In this list, /home/zhongming/.cache/bazel/_bazel_zhongming/2aea414f4d422ad78870bb37da8563ec/execroot/logi/bazel-out/k8-fastbuild/bin/tools/cpp/analysis/gen_cmd_py.runfiles/logi/tools/cpp/analysis/extra_actions_base_py_proto_pb is what leads to the previously mentioned ModuleNotFoundError because it also has a src sub-directory:

├── __init__.py
└── src
    ├── __init__.py
    ├── main
    └── __pycache__

I tried removing imports = [name_pb] from python_proto_library.bzl, but that wouldn't do me any good: then my own src/base directory would cause the import of the proto library to fail.

Contemplating this, I believe the root problem lies in the fact that py_proto_library copies the .proto files over and generates the _pb2.py files (and .pb.cc and .pb.h files for cc_proto_library) in a different directory than the one containing the original .proto file (not the copied one).

As a comparison, the Bazel built-in cc_proto_library would generate the pb.cc and pb.h files in the same directory as the original .proto file. For example, if the proto_library is @io_bazel//src/main/protobuf:extra_actions_base_proto, the generated files would be

# From $(bazel info bazel-genfiles)

# For py_proto_library()
external/io_bazel/src/main/protobuf/extra_actions_base_pb2.py

# For cc_proto_library()
external/io_bazel/src/main/protobuf/extra_actions_base.pb.cc
external/io_bazel/src/main/protobuf/extra_actions_base.pb.h

That would allow consumers of the {lang}_proto_library() targets to import them in a safe and sane way:

from external.io_bazel.src.main.protobuf import extra_actions_base_pb2
#include <external/io_bazel/src/main/protobuf/extra_actions_base.pb.h>

I'd propose to change the structure of rules_proto to be consistent with this paradigm so that referencing proto_library targets defined in external repositories becomes unambiguous.

breakage when upgrading to bazel 0.22

I am upgrading from bazel 0.19.2 to 0.22 and encountered the following error:

proto_library(
    name = 'attr_def',
    srcs = ['attr_def.proto'],
    deps = ['@com_google_protobuf//:timestamp_proto',]
)

py_proto_library(
    name = 'attr_def.pb.py',
    deps = ['attr_def'],
)

Building the attr_def.pb.py target leads to the following error snippet:

in proto attribute of go_proto_library rule @io_bazel_rules_go//proto/wkt:any_go_proto: '@com_google_protobuf//:any_proto' does not have mandatory providers: 'proto'. 

Digging it out a bit I found that in //:compile.bzl, the proto_compile.deps attribute has a mandatory provider proto, as specified below:

        "deps": attr.label_list(
            doc = "proto_library dependencies",
            mandatory = True,
            providers = ["proto"],
        ),

As I understand, the intended usage of the deps attribute is for collecting all the .proto files.

However, this design is broken by the latest changes in Bazel's proto_library (here).

The protobuf related providers, including proto are moved to the ProtoInfo provider.

Working on a PR to fix this issue. But it is probably going to be incompatible with Bazel versions below 0.22 (or the exact commit that changed the provider structure of the native proto_library rule).
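
A sketch of what the fixed attribute might look like, assuming the rest of compile.bzl stays the same: the legacy string provider "proto" is replaced by the ProtoInfo provider symbol introduced by that Bazel change.

```python
# compile.bzl sketch: require the modern ProtoInfo provider instead of
# the removed legacy "proto" string provider.
"deps": attr.label_list(
    doc = "proto_library dependencies",
    mandatory = True,
    providers = [ProtoInfo],
),
```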

Do python rules work with python3?

I spent a couple of hours trying to get the python rules to work with python3; due to a dependency on the enum34 library, it won't work.

My environment:

~/projects/rules_proto/example/python/python_grpc_library$ python3 --version
Python 3.6.8

~/projects/rules_proto/example/python/python_grpc_library$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 18.04.2 LTS
Release:        18.04
Codename:       bionic

Reproducing the issue

  • Fresh clone of master

New file server.py:

`/rules_proto/example/python/python_grpc_library/server.py:`

```python
import platform
print(platform.python_version())
```

Edit BUILD.bazel:

/rules_proto/example/python/python_grpc_library/BUILD.bazel:

load("@build_stack_rules_proto//python:python_grpc_library.bzl", "python_grpc_library")

python_grpc_library(
    name = "greeter_python_library",
    deps = ["@build_stack_rules_proto//example/proto:greeter_grpc"],
)

py_binary(
    name = "server",
    srcs = ["server.py"],
    deps = [":greeter_python_library"],
)

Then run:

~/projects/rules_proto/example/python/python_grpc_library$ bazel run :server
Loading: 
Loading: 0 packages loaded
Analyzing: target //:server (1 packages loaded, 0 targets configured)
INFO: Analyzed target //:server (1 packages loaded, 4 targets configured).
INFO: Found 1 target...
[0 / 1] [Prepa] BazelWorkspaceStatusAction stable-status.txt
Target //:server up-to-date:
  bazel-bin/server
INFO: Elapsed time: 0.667s, Critical Path: 0.08s
INFO: 0 processes.
INFO: Build completed successfully, 4 total actions
INFO: Running command line: bazel-bin/server
INFO: Build completed successfully, 4 total actions
Traceback (most recent call last):
  File "/home/realz/.cache/bazel/_bazel_realz/3809334db000da46ec72d728a0e8e403/execroot/__main__/bazel-out/k8-fastbuild/bin/server.runfiles/__main__/server.py", line 3, in <module>
    import platform
  File "/usr/lib/python3.6/platform.py", line 116, in <module>
    import sys, os, re, subprocess
  File "/usr/lib/python3.6/re.py", line 142, in <module>
    class RegexFlag(enum.IntFlag):
AttributeError: module 'enum' has no attribute 'IntFlag'
Error in sys.excepthook:
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/apport_python_hook.py", line 53, in apport_excepthook
    if not enabled():
  File "/usr/lib/python3/dist-packages/apport_python_hook.py", line 24, in enabled
    import re
  File "/usr/lib/python3.6/re.py", line 142, in <module>
    class RegexFlag(enum.IntFlag):
AttributeError: module 'enum' has no attribute 'IntFlag'

Original exception was:
Traceback (most recent call last):
  File "/home/realz/.cache/bazel/_bazel_realz/3809334db000da46ec72d728a0e8e403/execroot/__main__/bazel-out/k8-fastbuild/bin/server.runfiles/__main__/server.py", line 3, in <module>
    import platform
  File "/usr/lib/python3.6/platform.py", line 116, in <module>
    import sys, os, re, subprocess
  File "/usr/lib/python3.6/re.py", line 142, in <module>
    class RegexFlag(enum.IntFlag):
AttributeError: module 'enum' has no attribute 'IntFlag'

If I remove the dependency on :greeter_python_library, then I get this:

~/projects/rules_proto/example/python/python_grpc_library$ bazel run :server
Loading: 
Loading: 0 packages loaded
Analyzing: target //:server (1 packages loaded, 0 targets configured)
INFO: Analyzed target //:server (1 packages loaded, 2 targets configured).
INFO: Found 1 target...
[0 / 1] [Prepa] BazelWorkspaceStatusAction stable-status.txt
Target //:server up-to-date:
  bazel-bin/server
INFO: Elapsed time: 0.461s, Critical Path: 0.03s
INFO: 0 processes.
INFO: Build completed successfully, 4 total actions
INFO: Running command line: bazel-bin/server
INFO: Build completed successfully, 4 total actions
3.6.8

From googling around, it appears that simply depending on enum34 under Python 3.6 or newer breaks everything, and the grpc Python package used by rules_proto seems to have this dependency.

Other things I tried

Attempted using python 3.5

Added to /rules_proto/example/python/python_grpc_library/BUILD.bazel:

py_runtime(
    name = "python-3.5",
    files = [],
    interpreter_path = "/usr/bin/python3.5",
    python_version = 'PY3',
)

However, when actually trying to import things in server.py it doesn't work.

/rules_proto/example/python/python_grpc_library/server.py:

from example.proto import greeter_pb2
from example.proto import greeter_pb2_grpc

import platform
print(platform.python_version())
~/projects/rules_proto/example/python/python_grpc_library$ bazel clean --expunge
~/projects/rules_proto/example/python/python_grpc_library$ bazel run :server --python_top=:python-3.4 --incompatible_use_python_toolchains=false

Traceback (most recent call last):
  File "/home/realz/.cache/bazel/_bazel_realz/3809334db000da46ec72d728a0e8e403/execroot/__main__/bazel-out/k8-fastbuild/bin/server.runfiles/__main__/server.py", line 3, in <module>
    from example.proto import greeter_pb2_grpc
  File "/home/realz/.cache/bazel/_bazel_realz/3809334db000da46ec72d728a0e8e403/execroot/__main__/bazel-out/k8-fastbuild/bin/server.runfiles/build_stack_rules_proto/example/proto/greeter_pb2_grpc.py", line 2, in <module>
    import grpc
  File "/home/realz/.cache/bazel/_bazel_realz/3809334db000da46ec72d728a0e8e403/execroot/__main__/bazel-out/k8-fastbuild/bin/server.runfiles/pypi__grpcio_1_20_0/grpc/__init__.py", line 23, in <module>
    from grpc._cython import cygrpc as _cygrpc
ImportError: /home/realz/.cache/bazel/_bazel_realz/3809334db000da46ec72d728a0e8e403/execroot/__main__/bazel-out/k8-fastbuild/bin/server.runfiles/pypi__grpcio_1_20_0/grpc/_cython/cygrpc.so: undefined symbol: _Py_ZeroStruct

I couldn't find much about this error online w.r.t bazel. All indications are that something is mixing between python 2.x and python 3.x.

Attempted using python 3.8

I have also tried using python 3.8, and that gets the enum34 issue again.

Attempted using virtualenv

With python 3.5:

Traceback (most recent call last):
  File "/home/realz/.cache/bazel/_bazel_realz/3809334db000da46ec72d728a0e8e403/execroot/__main__/bazel-out/k8-fastbuild/bin/server.runfiles/__main__/server.py", line 3, in <module>
    from example.proto import greeter_pb2_grpc
  File "/home/realz/.cache/bazel/_bazel_realz/3809334db000da46ec72d728a0e8e403/execroot/__main__/bazel-out/k8-fastbuild/bin/server.runfiles/build_stack_rules_proto/example/proto/greeter_pb2_grpc.py", line 2, in <module>
    import grpc
  File "/home/realz/.cache/bazel/_bazel_realz/3809334db000da46ec72d728a0e8e403/execroot/__main__/bazel-out/k8-fastbuild/bin/server.runfiles/pypi__grpcio_1_20_0/grpc/__init__.py", line 23, in <module>
    from grpc._cython import cygrpc as _cygrpc
ImportError: cannot import name 'cygrpc'

With python 3.6: it works.

Conclusion

Is this expected behavior? Do I have to run bazel in virtualenv? I wish there was something in the documentation that could have saved me some time here.

Missing Assembly Reference error with csharp_grpc_library

Hello,

When building a  csharp_grpc_library, I get the error:

error CS0234: The type or namespace name 'BindServiceMethodAttribute'
does not exist in the namespace 'Grpc.Core' (are you missing an assembly reference?)

BindServiceMethodAttribute seems to be part of the Grpc.Core.Api assembly which is not included in csharp/nuget/nuget.bzl. It might be an issue with the script generating this file.

Here is a repository to reproduce the issue:
Repository
bazel build //:missing_assembly_reference

Python rules fail since 3ca0bf4

Some of the changes made in 3ca0bf4 appear to have broken the Python rules (or at least changed their behaviour significantly) for previously working rules that were written following the documentation.

The main error seen is:

ERROR: <snip>/BUILD:18:1: output '<snip>/blah_pb2.py' was not created
... etc for every proto file

Running with verbose = 4 shows these files are created in the sandbox next to their proto file, but the expected path is missing part of the prefix; e.g., consider this file tree:

BUILD
protos/
   something.proto
   something_pb2.py <- File is generated here correctly
something_pb2.py <- Expected output is here, which is incorrect

I'm trying to put together a smallest failing case but there seems to be a huge amount of code duplication that is making debugging difficult (compilation occurring in both compile and aspect). What is the aspect based compilation doing here compared to the compile rule? Is this not what the transitive argument is for?

Also, two other things to note:

  • Output paths appear to no longer be prefixed by the name of the rule, meaning if multiple separate rules depend on and build the same proto file, the build will fail due to both rules touching the same file.
  • The transitive argument is still mis-documented as being default False (see #56).

no such package 'closure/example/proto'

So here's the next Mac build blocker. It doesn't look as trivial as the last one, and I suspect it might tread on toes. This one I won't take on without discussion.

DEBUG: /Volumes/Projects/src/github.com/cwoodcock/rules_proto/dart/dart_pub_deps.bzl:21:9: front_end 0.1.6+4 override 0.1.6
INFO: Repository rule 'bazel_skylib' returned: {"remote": "https://github.com/bazelbuild/bazel-skylib.git", "commit": "3fea8cb680f4a53a129f7ebace1a5a4d1e035914", "shallow_since": "2018-06-13", "init_submodules": False, "verbose": False, "strip_prefix": "", "patches": [], "patch_tool": "patch", "patch_args": ["-p0"], "patch_cmds": [], "name": "bazel_skylib"}
ERROR: Skipping '//closure/example/proto:routeguide': no such package 'closure/example/proto': BUILD file not found on package path
ERROR: no such package 'closure/example/proto': BUILD file not found on package path
INFO: Elapsed time: 96.274s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
make: *** [compile] Error 1

java_grpc_library does not seem to work with protos that import other protos.

Say we have:

  1. foo.proto
  2. bar.proto (imports foo.proto)
  3. baz.proto (imports bar.proto)

And we create proto_library for them respectively.
I expect java_grpc_library works as follows:

java_grpc_library(
    name = "bar_grpc_library",
    deps = [
        ":foo_proto",
        ":bar_proto",
        ":baz_proto",
    ],
    verbose = 1,
)

but it reports errors like: "foo.proto" was not found.

dart make_dart_context not found

Bazel version: 0.28.0

Error:

io_bazel_rules_dart/dart/build_rules/internal/dart_vm_snapshot.bzl:1:1: file ':common.bzl' does not contain symbol 'make_dart_context'

Incompatible with Bazel 0.25

When running with Bazel 0.25, analysis fails with the following error:

ERROR: <snip>/external/com_github_grpc_grpc/bazel/grpc_build_system.bzl:114:1: load() statements must be called before any other statement. First non-load() statement appears at <snip>/external/com_github_grpc_grpc/bazel/grpc_build_system.bzl:27:1. Use --incompatible_bzl_disallow_load_after_statement=false to temporarily disable this check.

If incompatible_bzl_disallow_load_after_statement is disabled, there is a second error that cannot be avoided:

ERROR: <snip>/external/build_stack_rules_proto/python/BUILD.bazel:9:1: error loading package '@com_github_grpc_grpc//': in <snip>/external/com_github_grpc_grpc/bazel/grpc_build_system.bzl: Label '@com_github_grpc_grpc//:bazel/cc_grpc_library.bzl' crosses boundary of subpackage '@com_github_grpc_grpc//bazel' (perhaps you meant to put the colon here: '@com_github_grpc_grpc//bazel:cc_grpc_library.bzl'?) and referenced by '@build_stack_rules_proto//python:grpc_python'

Both these issues appear to have been fixed in the upstream gRPC repo a while back, so the fix for rules_proto would just be to move to a newer dependency version. The fix appears to be in everything after 1.19.0.

Python: transitive proto dependencies are missing

The py_proto_library rule seems to make available only the _pb2.py files for protos that are the target's direct dependencies.
This is problematic when a proto file depends on another proto: the final _pb2 file is unusable because of missing imports.

I created simple example to reproduce this problem:
https://github.com/aszady/stackb_examples/tree/master/exampleA

bazel test //... against quite recent version (06dd540) results in:

==================== Test output for //python:foo_test:
Traceback (most recent call last):
  File "(…)/foo_test.runfiles/__main__/python/foo_test.py", line 1, in <module>
    import protos.something_pb2 as something_proto
  File "(…)/foo_test.runfiles/__main__/protos/something_pb2.py", line 16, in <module>
    from protos import common_pb2 as protos_dot_common__pb2
ImportError: cannot import name 'common_pb2' from 'protos' ((…)/foo_test.runfiles/__main__/protos/__init__.py)

And indeed, file common_pb2.py is nowhere present.
ls -R bazel-bin/python/foo_test.runfiles/__main__ -I 'pypi__*' says:

bazel-bin/python/foo_test.runfiles/__main__:
external  protos  python

bazel-bin/python/foo_test.runfiles/__main__/external:
__init__.py

bazel-bin/python/foo_test.runfiles/__main__/protos:
__init__.py  something_pb2.py

bazel-bin/python/foo_test.runfiles/__main__/python:
foo_test  foo_test.py  __init__.py
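
Until the rules handle transitive outputs, one workaround is to list the transitive proto_library explicitly so its _pb2.py file is also generated. A BUILD sketch (target names are illustrative, not taken from the linked repository):

```python
# BUILD sketch: declare the transitive proto as a direct dep so that
# common_pb2.py is generated alongside something_pb2.py.
python_proto_library(
    name = "something_py_library",
    deps = [
        "//protos:something_proto",
        "//protos:common_proto",  # transitive dep, listed explicitly
    ],
)
```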

Swift build not working with Linux + Clang8

Bazel version 0.24.1

Command:

cd swift/example/swift_grpc_library
bazel test -c opt //...

Error message:

clang-8: error: unsupported option '-pass-exit-codes'

Detailed command (with -s)

/usr/bin/clang @bazel-out/host/bin/external/com_github_apple_swift_swift_protobuf/ProtoCompilerPlugin-0.params -o bazel-out/host/bin/external/com_github_apple_swift_swift_protobuf/ProtoCompilerPlugin @bazel-out/host/bin/external/com_github_apple_swift_swift_protobuf/ProtoCompilerPlugin.autolink '-fuse-ld=/usr/bin/ld' -L/usr/lib/swift/linux -Wl,-rpath,/usr/lib/swift/linux -lm -lstdc++ -lrt -ldl -static-libgcc '-fuse-ld=gold' -Wl,-no-as-needed -Wl,-z,relro,-z,now -B/usr/bin -pass-exit-codes -lstdc++ -lm -Wl,--gc-sections @bazel-out/host/bin/external/com_github_apple_swift_swift_protobuf/SwiftProtobuf.autolink @bazel-out/host/bin/external/com_github_apple_swift_swift_protobuf/SwiftProtobufPluginLibrary.autolink

I tried swapping the command with gcc, but got "gcc: error: unrecognized command line option ‘-fuse-ld=/usr/bin/ld’" error, so the command above works with neither clang nor gcc.

rules_proto overrides some external dependencies with specific versions

Hello,

In deps.bzl, when importing external dependencies, some functions check whether the dependency is already registered before importing it and some do not. For example here, we check for an existing reference to boringssl, but here we download a specific version of gRPC right away.

Could we generalize the `if "xxxxx" not in native.existing_rules():` pattern to every import? That would let users update dependencies sooner and more easily.
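
The guard in question is the standard Starlark idiom; a sketch of how deps.bzl could apply it uniformly (the function body is illustrative):

```python
# deps.bzl sketch: skip the fetch when the user has already registered a
# repository with the same name, letting them pin their own version.
def com_github_grpc_grpc():
    if "com_github_grpc_grpc" not in native.existing_rules():
        http_archive(
            name = "com_github_grpc_grpc",
            # urls / sha256 / strip_prefix of the default pinned version...
        )
```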

There is a similar issue with the nuget packages but I do not know how we could improve this behavior.

csharp_grpc_library: Missing output when building a GRPC .proto importing a message-only .proto

Hello,

Using the csharp_* rules, if I have a proto file with a service importing a proto file without a service, then I cannot build a GRPC library out of the first one. The error is:

ERROR: ...: output 'missing_output_pb/WithoutServiceGrpc.cs' was not created
ERROR: ...: not all outputs were created or valid

csharp_grpc_library declares all its dependencies with csharp_grpc_compile, leading to this issue.

The only workaround I found is to create empty services in the .proto files. What would be the proper way to do that?

Repository to reproduce the issue
bazel build //:missing_output

Thanks!

Documentation needs to clarify relationship between various rules.

The documentation does not sufficiently clarify the difference between 'artifact' and 'library'. It also does not sufficiently clarify the relationship between the 'compile' and 'library' rules.

The documentation should, for example, clarify whether the 'deps' and/or 'srcs' of the '_library' rules are intended to be the output of the '_compile' rules or whether the 'deps'/'srcs' of the '_library' rules are intended to be the same as that of the '_compile' rules. That is, it is unclear from the documentation whether the 'compile' and 'library' rules represent separate build steps that are intended to be chained together, or whether one is a superset of the other.

Likewise, the documentation fails to clarify the relationship between these rules and that of the native 'proto_library' rule; for example, can the output of a 'proto_library' be specified as an input to either of these rules? Or are these rules always going to duplicate the 'proto_library' step?

failed to build github.com/stackb/grpc.js/example/routeguide/client:bundle

I got the following error when running the following command:
bazel build github.com/stackb/grpc.js/example/routeguide/client:bundle

ERROR: /home/manazhao/.cache/bazel/_bazel_manazhao/cebe4156e94c57479031319b57bea141/external/io_bazel_rules_closure/java/io/bazel/rules/closure/BUILD:37:1: Building external/io_bazel_rules_closure/java/io/bazel/rules/closure/libtarjan.jar (1 source file) and running annotation processors (AutoAnnotationProcessor, AutoOneOfProcessor, AutoValueProcessor) failed: Worker process did not return a WorkResponse:

---8<---8<--- Start of log, file at /home/manazhao/.cache/bazel/_bazel_manazhao/cebe4156e94c57479031319b57bea141/bazel-workers/worker-4-Javac.log ---8<---8<---
Exception in thread "main" java.lang.UnsupportedClassVersionError: javax/lang/model/element/TypeElement has been compiled by a more recent version of the Java Runtime (class file version 55.0), this version of the Java Runtime only recognizes class file versions up to 54.0
at java.base/java.lang.ClassLoader.defineClass1(Native Method)
at java.base/java.lang.ClassLoader.defineClass(Unknown Source)
at java.base/java.lang.ClassLoader.defineClass(Unknown Source)
at java.base/java.security.SecureClassLoader.defineClass(Unknown Source)
at java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(Unknown Source)
at java.base/jdk.internal.loader.BuiltinClassLoader.findClassInModuleOrNull(Unknown Source)
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(Unknown Source)
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(Unknown Source)
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(Unknown Source)
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(Unknown Source)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(Unknown Source)
at java.base/java.lang.ClassLoader.loadClass(Unknown Source)
at com.google.devtools.build.buildjar.BazelJavaBuilder.parse(BazelJavaBuilder.java:137)
at com.google.devtools.build.buildjar.BazelJavaBuilder.processRequest(BazelJavaBuilder.java:91)
at com.google.devtools.build.buildjar.BazelJavaBuilder.runPersistentWorker(BazelJavaBuilder.java:68)
at com.google.devtools.build.buildjar.BazelJavaBuilder.main(BazelJavaBuilder.java:46)
---8<---8<--- End of log ---8<---8<---
Target //github.com/stackb/grpc.js/example/routeguide/client:bundle failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 2.602s, Critical Path: 1.82s
INFO: 0 processes.

bazel version: 0.27.0
jdk version: OpenJDK Runtime Environment (build 11.0.2+7-LTS) by Azul Systems, Inc.

inconsistent workspace name for grpc-gateway

The grpc-gateway repo officially uses grpc_ecosystem_grpc_gateway as the Bazel workspace name, as demonstrated here.

In github.com/grpc-ecosystem/grpc-gateway, the same repo is referenced via a different name com_github_grpc_ecosystem_grpc_gateway.

This inconsistency causes build errors for me (and likely other people too).

Admittedly, the name prefixed with com_github (i.e., the name rules_proto chose) sounds more correct to me. But I don't think it is easy to convince grpc-gateway to change the workspace name just because of this.

What's an acceptable solution here?
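
One possible workaround until the names converge: register the same archive under both workspace names in your own WORKSPACE, so labels using either prefix resolve (the version, URL, and strip_prefix below are illustrative placeholders):

```python
# WORKSPACE sketch: the same grpc-gateway archive under both names.
GRPC_GATEWAY_URLS = ["https://github.com/grpc-ecosystem/grpc-gateway/archive/v1.9.0.tar.gz"]

http_archive(
    name = "grpc_ecosystem_grpc_gateway",
    strip_prefix = "grpc-gateway-1.9.0",
    urls = GRPC_GATEWAY_URLS,
)

http_archive(
    name = "com_github_grpc_ecosystem_grpc_gateway",
    strip_prefix = "grpc-gateway-1.9.0",
    urls = GRPC_GATEWAY_URLS,
)
```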

Incompatible with io_grpc_grpc_java 0.21.0 or newer

This commit shipped in v0.21.0 made @io_grpc_grpc_java//compiler:grpc_java_plugin private, causing breakage in rules_proto.

Bazel will error-out with the following trace:

ERROR: /private/var/tmp/_bazel_xyz/2b76f1f79d664c3181ce1190fe27bebe/external/build_stack_rules_proto/java/BUILD.bazel:10:1:
in proto_plugin rule @build_stack_rules_proto//java:grpc_java:
target '@io_grpc_grpc_java//compiler:grpc_java_plugin' is not visible from target '@build_stack_rules_proto//java:grpc_java'.
Check the visibility declaration of the former target if you think the dependency is legitimate

Does not work with remote action execution

Here is my repository. The software is pretty complicated, but the part relevant to this example is simple. I have a proto_library with some protos, a proto_plugin with a Java binary and a proto_compile rule that wants to use this plugin to generate some files from the mentioned library. If one was to clone this repository, checkout to branch bazel-1 and run bazel test //... with three last bazel releases it would succeed. If one was to use remote bazel execution and use the standard bazelrc for it, when running bazel test //... --config=remote -s one would get the following error:

SUBCOMMAND: # //:test_proto_compile [action 'ProtoCompile test_proto_compile/test_proto_compile.jar']
(cd /home/monnoroch/.cache/bazel/_bazel_monnoroch/feda265a792323e37770534c35ab788f/execroot/blerpc && \
  exec env - \
  /bin/bash -c 'bazel-out/host/bin/external/com_google_protobuf/protoc --descriptor_set_out=bazel-out/k8-fastbuild/bin/test_proto_compile/descriptor.source.bin --proto_path=bazel-out/k8-fastbuild/bin/test_proto_compile --include_imports --include_source_info --java_out=bazel-out/k8-fastbuild/bin/test_proto_compile/test_proto_compile.jar --reactive-blerpc-generator_out=bazel-out/k8-fastbuild/bin/test_proto_compile --plugin=protoc-gen-reactive-blerpc-generator=bazel-out/host/bin/reactive-blerpc bazel-out/k8-fastbuild/bin/test_proto_compile/reactive-blerpc-test/src/main/proto/test_service.proto')
ERROR: /home/monnoroch/dev/blerpc-android/BUILD.bazel:74:1: error executing shell command: '/bin/bash -c bazel-out/host/bin/external/com_google_protobuf/protoc --descriptor_set_out=bazel-out/k8-fastbuild/bin/test_proto_compile/descriptor.source.bin --proto_path=bazel-out/k8-fastbuild/bin/...' failed (Exit 1)
bazel-out/host/bin/reactive-blerpc: Cannot locate runfiles directory. (Set $JAVA_RUNFILES to inhibit searching.)
--reactive-blerpc-generator_out: protoc-gen-reactive-blerpc-generator: Plugin failed with status code 1.

Building googleapis

For my project, I need a few protos defined in googleapis. How can I use them with stackb rules_proto? The repository works, of course, with the official proto rules, but other parts of my project depend on rules_proto, so I wonder how to deal with situations like this.

$ bazel build @com_google_googleapis//google/type:quaternion_proto
ERROR: Skipping '@com_google_googleapis//google/type:quaternion_proto': error loading package '@com_google_googleapis//google/type': Unable to load package for '@io_grpc_grpc_java//:java_grpc_library.bzl': The repository could not be resolved
WARNING: Target pattern parsing failed.
ERROR: error loading package '@com_google_googleapis//google/type': Unable to load package for '@io_grpc_grpc_java//:java_grpc_library.bzl': The repository could not be resolved
INFO: Elapsed time: 0.084s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
    currently loading: @com_google_googleapis//google/type

In WORKSPACE, put:

http_archive(
    name = "com_google_googleapis",
    sha256 = "31f6686df2ca54f6d258f6448cc53c2b483fbf08a91e2885f4d2520f7c60daeb",
    strip_prefix = "googleapis-285b7fb430c19b2939313d79ae4ee5170d01cf35",
    urls = [
        "https://github.com/googleapis/googleapis/archive/285b7fb430c19b2939313d79ae4ee5170d01cf35.tar.gz",
    ],
)

import error with python_proto_library

Basically I get the following import error when running a py_test rule that indirectly depends on python_proto_library. The code can be found at https://github.com/manazhao/py_test_fail

Command: bazel test playground/... --incompatible_disallow_dict_plus=false

=== error log ===
from playground.lib import print_message
ImportError: No module named 'playground.lib'

After removing ":python_test_proto" from the dependency list of "lib", the error is gone. It seems that somehow the use of python_proto_library messes up the import path? I didn't have this issue when using the legacy "rules_protobuf". Meanwhile, I found the following issues when setting up the stackb rules:

  1. I need to add --incompatible_disallow_dict_plus=false to every command, which is very inconvenient. This is probably because the archive referenced in the documentation is not up to date; I tried to find a newer version but couldn't.
  2. Following the documentation for setting up the WORKSPACE file is insufficient; I also had to add "protobuf_py_deps" to the WORKSPACE file.

Remove trailing whitespaces with a linter?

Hi,

I am trying to hack around this repository.

One frustrating fact is the proliferation of trailing whitespace.

The trailing whitespaces are so ubiquitous that I find it impossible to modify the existing code without incurring meaningless whitespace-only changes.

I am motivated (out of personal hacking interest) and would like to send a PR that trims the trailing whitespace and adds a linter for it, if the owners are willing to commit to the no-trailing-whitespace style.

Thanks,
Zhongming

Replace `bind()` with `alias()`

In Bazel, bind() has long been deprecated in favor of alias():
bazelbuild/bazel#1952

This repo still uses quite a few of them, for example:

//external:protobuf_clib
//external:protobuf_headers
//external:protocol_compiler

Indeed, a git grep -n 'native\.bind' would yield a lot more.

I'd like to contribute to this part if @pcj thinks this is a good idea.
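
For reference, the mechanical replacement is small. A sketch using one of the names above (the `actual` label is illustrative):

```python
# Before (deprecated): bind() creates a target under //external.
native.bind(
    name = "protobuf_clib",
    actual = "@com_google_protobuf//:protoc_lib",
)

# After: a plain alias() target declared in a normal BUILD file,
# referenced as //<package>:protobuf_clib instead of //external:protobuf_clib.
alias(
    name = "protobuf_clib",
    actual = "@com_google_protobuf//:protoc_lib",
    visibility = ["//visibility:public"],
)
```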

Building with python_grpc_library leads to conflicting actions: Same python wrapper generated twice

MCVE is in this other repo over here: https://github.com/dave-xnor/bazel-py-rule-issue

I have three proto definition files - one has a grpc service defined, it depends on the other two. Each of the three has a proto_library rule; two also have a python_proto_library rule and the third (with the service) has a python_grpc_library rule.

Building the targets independently works: bazel query //... | xargs -n 1 bazel build.

Building them together fails with a conflicting action report, apparently due to one of the python protobuf wrappers of the non-service protobufs being generated twice: bazel build //src/proto:all.

(In the real repository I boiled down to get this MCVE the file reported as the conflicting action varied on each run - though it was always one of the non-service-defining protos.)

This could be a problem with rules_proto, or something underlying in Bazel, or in grpc, or, very possibly, in my understanding of how these rules are supposed to be used to build something. (In which case the "fix" isn't in code, it's in documentation. IMO. Because I do read the docs, and even look at examples.)

ERROR: file 'src/proto/b_pb2.py' is generated by these conflicting actions:
Label: //src/proto:b_proto
RuleClass: proto_library rule
Configuration: de2b1250788dfeca9c7ec0dbeba85f06
Mnemonic: ProtoCompile
Action key: 7e9a1878eff3415c62f50f7e698db12d, 53d25eb3e1a7c9d9533bc6d2d132d176
Progress message: ProtoCompile src/proto/b_pb2.py
PrimaryInput: File:[/home/dbakin/src/bazel-py-rule-issue[source]]src/proto/a.proto
PrimaryOutput: File:[[<execution_root>]bazel-out/k8-fastbuild/bin]src/proto/b_pb2.py
Primary outputs are different: 1971704056, 2050816446
Owner information: []#@build_stack_rules_proto//python:python_grpc_compile.bzl%python_grpc_compile_aspect[verbose_string="1"] BuildConfigurationValue.Key[de2b1250788dfeca9c7ec0dbeba85f06] //src/proto:b_proto BuildConfigurationValue.Key[de2b1250788dfeca9c7ec0dbeba85f06] false {verbose_string=[1]}, []#@build_stack_rules_proto//python:python_proto_compile.bzl%python_proto_compile_aspect[verbose_string="1"] BuildConfigurationValue.Key[de2b1250788dfeca9c7ec0dbeba85f06] //src/proto:b_proto BuildConfigurationValue.Key[de2b1250788dfeca9c7ec0dbeba85f06] false {verbose_string=[1]}
MandatoryInputs: Attempted action contains artifacts not in previous action (first 5): 
	external/com_github_grpc_grpc/grpc_python_plugin
Outputs: Attempted action contains artifacts not in previous action (first 5): 
	src/proto/b_pb2.py
	src/proto/b_pb2_grpc.py
Previous action contains artifacts not in attempted action (first 5): 
	src/proto/b_pb2.py
ERROR: com.google.devtools.build.lib.actions.MutableActionGraph$ActionConflictException: for src/proto/b_pb2.py, previous action: action 'ProtoCompile src/proto/b_pb2.py', attempted action: action 'ProtoCompile src/proto/b_pb2.py'
INFO: Elapsed time: 6.556s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (13 packages loaded, 1848 targets configured)

ruby_grpc_compile tries to generate services for well known proto

Hi,
I have a proto file containing gRPC service definition which references the google.protobuf.Timestamp proto.

I have following bazel target:

ruby_grpc_compile(
    name = "x_ruby_grpc_compile",
    visibility = ["//visibility:private"],
    deps = [":x_proto"],
    verbose = 4,
)

Target x_proto is generated like so

proto_library(
    name = "x_proto",
    srcs = ["x.proto"],
    visibility = ["//visibility:public"],
    deps = ["@com_google_protobuf//:timestamp_proto"],
)

From the verbose output I see that it is expecting bazel-out/darwin-fastbuild/bin/external/com_google_protobuf/google/protobuf/timestamp_services_pb.rb to be created as well and fails saying output 'external/com_google_protobuf/google/protobuf/timestamp_services_pb.rb' was not created. How to prevent it from expecting this timestamp_services_pb.rb?

Link to repo with reproducible code https://github.com/rsinghumasscs/stackb_ruby_grpc_compile_repro

Thanks a lot!

android_proto_library fails with latest Bazel daily build

Attempting to use android_proto_library fails with the following error, related to the version of protobuf defined in deps.bzl

ERROR: /home/tcj/.cache/bazel/_bazel_tcj/86df34957ac88767728a08679bac58bf/external/com_google_protobuf_lite/BUILD:538:1: Traceback (most recent call last):
        File "/home/tcj/.cache/bazel/_bazel_tcj/86df34957ac88767728a08679bac58bf/external/com_google_protobuf_lite/BUILD", line 538
                internal_gen_well_known_protos_java(srcs = WELL_KNOWN_PROTOS)
        File "/home/tcj/.cache/bazel/_bazel_tcj/86df34957ac88767728a08679bac58bf/external/com_google_protobuf_lite/protobuf.bzl", line 266, in internal_gen_well_known_protos_java
                Label(("%s//protobuf_java" % REPOSITOR...))
        File "/home/tcj/.cache/bazel/_bazel_tcj/86df34957ac88767728a08679bac58bf/external/com_google_protobuf_lite/protobuf.bzl", line 266, in Label
                REPOSITORY_NAME
builtin variable 'REPOSITORY_NAME' is referenced before assignment.
ERROR: /home/tcj/.cache/bazel/_bazel_tcj/86df34957ac88767728a08679bac58bf/external/com_google_protobuf_lite/BUILD:593:1: Target '@com_google_protobuf_lite//:android' contains an error and its package is in error and referenced by '@com_google_protobuf_lite//:protoc_gen_javalite'
ERROR: /home/tcj/.cache/bazel/_bazel_tcj/86df34957ac88767728a08679bac58bf/external/build_stack_rules_proto/android/BUILD.bazel:3:1: Target '@com_google_protobuf_lite//:protoc_gen_javalite' contains an error and its package is in error and referenced by '@build_stack_rules_proto//android:javalite'
ERROR: Analysis of target '//euchre:android_euchre_proto' failed; build aborted: Analysis failed
INFO: Elapsed time: 0.048s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (2 packages loaded, 2 targets configured)

It appears the issue has already been fixed in the protobuf git repo, so updating to the latest commit should, in theory, resolve it: protocolbuffers/protobuf@3849895

I can see how to update the commit hash in deps.bzl, but I'm not sure how to update the sha256.
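To update the sha256 alongside the commit hash, you can download the same tarball that Bazel would fetch and hash it yourself. A sketch, assuming the commit mentioned above (replace the commit with whichever one you pin in deps.bzl):

```shell
#!/bin/sh
# Compute the sha256 for an http_archive from its URL.
# The commit below is the protobuf fix referenced above; swap in your own.
COMMIT=3849895
URL="https://github.com/protocolbuffers/protobuf/archive/${COMMIT}.tar.gz"

# Download quietly and print only the hex digest, which is the value
# that goes into the sha256 attribute in deps.bzl.
curl -sL "$URL" | sha256sum | awk '{print $1}'
```

Leaving sha256 unset also works temporarily: Bazel prints the hash it computed as a DEBUG message, which you can then paste back into deps.bzl.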

Fix Bazel incompatible changes

I'd like to be able to build this repository with --all_incompatible_changes:
bazel test --all_incompatible_changes //...

It may not be possible to fix every issue now due to transitive dependencies, but we should try to fix the issues from this repository.
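For repeated runs, the flag can be put in a .bazelrc so every invocation picks it up. A sketch, assuming a Bazel version that still accepts --all_incompatible_changes (newer releases removed the umbrella flag, on those you would enable each --incompatible_* flag individually):

```
# .bazelrc sketch: opt all builds (and therefore tests) into incompatible changes.
build --all_incompatible_changes
```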
