
protobuf-gradle-plugin's Introduction

❗ Please read release notes before upgrading the plugin, as usage or compatibility requirements may change.

Protobuf Plugin for Gradle

The Gradle plugin that compiles Protocol Buffer (aka Protobuf) definition files (*.proto) in your project. Its job has two parts:

  1. It assembles the Protobuf Compiler (protoc) command line and uses it to generate Java source files out of your proto files.
  2. It adds the generated Java source files to the input of the corresponding Java compilation unit (sourceSet in a Java project; variant in an Android project), so that they can be compiled along with your Java sources.
    • Note: if you are generating non-Java/Kotlin source files, they will not be included for compilation automatically; you will need to add them to the sources of the language-specific compilations yourself. See the Default outputs section for details.

For more information about the Protobuf Compiler, please refer to Google Developers Site.

Latest Version

The latest version is 0.9.4. It requires at least Gradle 5.6 and Java 8. To use it with Groovy DSL:

plugins {
  id "com.google.protobuf" version "0.9.4"
}

Development Version

To try out the head version, you can download the source and build it with ./gradlew publishToMavenLocal -x test (tests are skipped here because they require the Android SDK), then in settings.gradle:

pluginManagement {
  repositories {
    gradlePluginPortal()
    mavenLocal()
  }
}

And in build.gradle:

plugins {
  id "com.google.protobuf" version "0.9.5-SNAPSHOT"
}

Examples

Stand-alone examples are available for each of Gradle's supported build-script languages.

  • Groovy (Default)
    • Run ../../gradlew build under the example directory to test it out.
  • Kotlin (Experimental)
    • Run ./gradlew build under the Kotlin example directory to test it out. This example is set up with Gradle 4.10, the minimum required version.

Directories that start with testProject can also serve as usage examples for advanced options, although they cannot be compiled as individual projects.

Adding the plugin to your project

This plugin must be used together with either the Java plugin or the Android plugin.
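For example, a minimal build.gradle for a plain Java project applies both plugins (a sketch using the version from the Latest Version section above):

plugins {
  id "java"
  id "com.google.protobuf" version "0.9.4"
}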

Configuring Protobuf compilation

The Protobuf plugin assumes Protobuf files (*.proto) are organized in the same way as Java source files, in sourceSets. The Protobuf files of a sourceSet (or variant in an Android project) are compiled in a single protoc run, and the generated files are added to the input of the Java compilation run of that sourceSet (or variant).

Customizing source directories

The plugin adds a new sources block named proto alongside java to every sourceSet. By default, it includes all *.proto files under src/$sourceSetName/proto. You can customize it in the same way as you would customize the java sources.

Java projects: use the top-level sourceSet:

sourceSets {
  main {
    proto {
      // In addition to the default 'src/main/proto'
      srcDir 'src/main/protobuf'
      srcDir 'src/main/protocolbuffers'
      // In addition to the default '**/*.proto' (use with caution).
      // Using an extension other than 'proto' is NOT recommended,
      // because when proto files are published along with class files, we can
      // only tell the type of a file from its extension.
      include '**/*.protodevel'
    }
    java {
      ...
    }
  }
  test {
    proto {
      // In addition to the default 'src/test/proto'
      srcDir 'src/test/protocolbuffers'
    }
  }
}

Android projects: use android.sourceSets:

android {
  sourceSets {
    main {
      proto {
        ...
      }
      java {
        ...
      }
    }
  }
}

Customizing Protobuf compilation

The plugin adds a protobuf block to the project. It provides all the configuration knobs.
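A minimal sketch of the block's overall shape; each nested block is described in the sections that follow:

protobuf {
  // Locate the protoc executable and codegen plugins
  // (see "Locate external executables")
  protoc { }
  plugins { }

  // Customize the per-sourceSet/per-variant code generation tasks
  // (see "Customize code generation tasks")
  generateProtoTasks { }
}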

Locate external executables

By default the plugin searches for the protoc executable in the system search path. We recommend taking advantage of the pre-compiled protoc artifacts that we have published on Maven Central:

protobuf {
  ...
  // Configure the protoc executable
  protoc {
    // Download from repositories
    artifact = 'com.google.protobuf:protoc:3.0.0'
  }
  ...
}

You may also specify a local path.

protobuf {
  ...
  protoc {
    path = '/usr/local/bin/protoc'
  }
  ...
}

Multiple assignments are allowed in the protoc block. The last one wins.
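For example, in the following sketch the later path assignment overrides the artifact assignment, so the local protoc is used:

protobuf {
  protoc {
    artifact = 'com.google.protobuf:protoc:3.0.0'
    // The last assignment wins: this local path takes precedence over the artifact above.
    path = '/usr/local/bin/protoc'
  }
}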

You may also run protoc with codegen plugins. For a codegen plugin named "foo", protoc will by default use protoc-gen-foo from the system search path. You can also specify a downloadable artifact or a local path for it in the plugins block, using the same syntax as in the protoc block above. Defining a plugin here does not apply it; you need to configure the tasks in the generateProtoTasks block introduced below to apply the plugins defined here.

protobuf {
  ...
  // Locate the codegen plugins
  plugins {
    // Locate a plugin with name 'grpc'. This step is optional.
    // If you don't specify a location, protoc will try to use "protoc-gen-grpc"
    // from the system search path.
    grpc {
      artifact = 'io.grpc:protoc-gen-grpc-java:1.0.0-pre2'
      // or
      // path = 'tools/protoc-gen-grpc-java'
    }
    // Any other plugins
    ...
  }
  ...
}

The syntax for artifact follows Artifact Classifiers, where the default classifier is project.osdetector.classifier (i.e., "${project.osdetector.os}-${project.osdetector.arch}") and the default extension is "exe". Non-C++ implementations of codegen plugins can be used if a constant classifier is specified (e.g., "com.example:example-generator:1.0.0:jvm8_32").
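For instance, a JVM-based codegen plugin could be located with a constant classifier; the coordinates below are hypothetical:

protobuf {
  plugins {
    example {
      // 'jvm8_32' is a constant classifier, replacing the default
      // platform-dependent "${project.osdetector.os}-${project.osdetector.arch}".
      artifact = 'com.example:example-generator:1.0.0:jvm8_32'
    }
  }
}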

Customize code generation tasks

The Protobuf plugin generates a task for each protoc run, which is for a sourceSet in a Java project, or a variant in an Android project. The task has configuration interfaces that allow you to control the type of outputs, the codegen plugins to use, and parameters.

You must configure these tasks in the generateProtoTasks block, which provides helper functions to conveniently access tasks that are tied to a certain build element, and also ensures your configuration will be picked up correctly by the plugin.

DO NOTs:

  • DO NOT assume the names of the tasks, as they may change.
  • DO NOT configure the tasks outside of the generateProtoTasks block, because there are subtle timing constraints on when the tasks should be configured.
protobuf {
  ...
  generateProtoTasks {
    // all() returns the collection of all protoc tasks
    all().configureEach { task ->
      // Here you can configure the task
    }

    // In addition to all(), you may select tasks by various criteria:

    // (Java-only) returns tasks for a sourceSet
    ofSourceSet('main')

    // (Android-only selectors)
    // Returns tasks for a flavor
    ofFlavor('demo')
    // Returns tasks for a buildType
    ofBuildType('release')
    // Returns tasks for a variant
    ofVariant('demoRelease')
    // Returns non-androidTest tasks
    ofNonTest()
    // Return androidTest tasks
    ofTest()
  }
}

Each code generation task has two collections:

  • builtins: code generators built into protoc, e.g., java, cpp, python.
  • plugins: code generation plugins that work with protoc, e.g., grpc. They must be defined in the protobuf.plugins block in order to be added to a task.

Configure what to generate

Code generation is done by protoc builtins and plugins. Each builtin/plugin generates a certain type of code. To add or configure a builtin/plugin on a task, list its name followed by a braces block. Put options in the braces if wanted. For example:

task.builtins {
  // This yields
  // "--java_out=example_option1=true,example_option2:/path/to/output"
  // on the protoc commandline, which is equivalent to
  // "--java_out=/path/to/output --java_opt=example_option1=true,example_option2"
  // with the latest version of protoc.
  java {
    option 'example_option1=true'
    option 'example_option2'
  }
  // Add cpp output without any option.
  // DO NOT omit the braces if you want this builtin to be added.
  // This yields
  // "--cpp_out=/path/to/output" on the protoc commandline.
  cpp { }
}

task.plugins {
  // Add grpc output without any option.  grpc must have been defined in the
  // protobuf.plugins block.
  // This yields
  // "--grpc_out=/path/to/output" on the protoc commandline.
  grpc { }
}

Default outputs

Java projects: the java builtin is added by default; without any further specification, Java classes will be generated during the build.

Python output can be generated by adding the python builtin:

protobuf {
  generateProtoTasks {
    all().configureEach { task ->
      task.builtins {
        // Generates Python code
        python { }

        // If you wish to avoid generating Java files:
        remove java
      }
    }
  }
}

Note the generated Python code will not be included for compilation; you will need to add it as sources to Python's compilation tasks manually. See the Change where files are generated section below for details about where the code will be generated.
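As a minimal sketch (assuming a plain Java project, the main source set, and the default output location described under Change where files are generated), the generated Python sources could be packaged so that another toolchain can pick them up:

task packagePythonProtos(type: Zip) {
  // 'generateProto' is the main-source-set task name in a plain Java project;
  // note the caution above about not assuming task names.
  dependsOn 'generateProto'
  // Default location: $buildDir/generated/source/proto/$sourceSet/$builtinPluginName
  from "$buildDir/generated/source/proto/main/python"
  archiveBaseName = 'python-protos'
}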

Android projects: no default output will be added. Since Protobuf 3.0.0, the lite runtime is the recommended Protobuf library for Android.

For Protobuf versions from 3.0.x through 3.7.x, lite code generation is provided as a protoc plugin (protobuf-lite). Example:

dependencies {
  // You need to depend on the lite runtime library, not protobuf-java
  implementation 'com.google.protobuf:protobuf-lite:3.0.0'
}

protobuf {
  protoc {
    // You still need protoc like in the non-Android case
    artifact = 'com.google.protobuf:protoc:3.7.0'
  }
  plugins {
    javalite {
      // The codegen for lite comes as a separate artifact
      artifact = 'com.google.protobuf:protoc-gen-javalite:3.0.0'
    }
  }
  generateProtoTasks {
    all().configureEach { task ->
      task.builtins {
        // In most cases you don't need the full Java output
        // if you use the lite output.
        remove java
      }
      task.plugins {
        javalite { }
      }
    }
  }
}

Starting from Protobuf 3.8.0, lite code generation is built into protoc's "java" output. Example:

dependencies {
  // You need to depend on the lite runtime library, not protobuf-java
  implementation 'com.google.protobuf:protobuf-javalite:3.8.0'
}

protobuf {
  protoc {
    artifact = 'com.google.protobuf:protoc:3.8.0'
  }
  generateProtoTasks {
    all().configureEach { task ->
      task.builtins {
        java {
          option "lite"
        }
      }
    }
  }
}

Generate descriptor set files

Within the generateProtoTasks block, configure each task's descriptor set options as follows:

{ task ->
  // If true, will generate a descriptor_set.desc file under
  // task.outputBaseDir. Default is false.
  // See --descriptor_set_out in protoc documentation about what it is.
  task.generateDescriptorSet = true

  // Allows to override the default for the descriptor set location
  task.descriptorSetOptions.path =
    "${projectDir}/build/descriptors/${task.sourceSet.name}.dsc"

  // If true, the descriptor set will contain line number information
  // and comments. Default is false.
  task.descriptorSetOptions.includeSourceInfo = true

  // If true, the descriptor set will contain all transitive imports and
  // is therefore self-contained. Default is false.
  task.descriptorSetOptions.includeImports = true
}

Change where files are generated

Generated files are under task.outputBaseDir with a subdirectory per builtin and plugin. This produces a folder structure of $buildDir/generated/source/proto/$sourceSet/$builtinPluginName.

The subdirectory name, which is by default $builtinPluginName, can be changed by setting the outputSubDir property in the builtins or plugins block of a task configuration within the generateProtoTasks block (see the previous section). E.g.,

{ task ->
  task.plugins {
    grpc {
      // Use subdirectory 'grpcjava' instead of the default 'grpc'
      outputSubDir = 'grpcjava'
    }
  }
}

Protos in dependencies

If a Java project contains proto files, they will be packaged in the jar files along with the compiled classes.

Protos in dependencies (e.g. upstream jars) can be put either in the compile configuration or in the protobuf configuration.

If the dependency is put in the compile configuration, the proto files are extracted to an extracted-include-protos directory and added to the --proto_path flag of the protoc command line, so that they can be imported by the proto files of the current project. The imported proto files will not be compiled since they have already been compiled in their own projects. Example:

dependencies {
  implementation project(':someProjectWithProtos')
  testImplementation files("lib/some-testlib-with-protos.jar")
}

If the dependency is put in the protobuf configuration, the proto files are extracted to an extracted-protos directory and added to the protoc command line as files to compile, in the same protoc invocation as the current project's proto files (if any). Example:

dependencies {
  // protos can be from a local package,
  protobuf files('lib/protos.tar.gz')
  // ... a local directory,
  protobuf files('ext/')   // NEVER use fileTree(). See issue #248.
  // ... or an artifact from a repository
  testProtobuf 'com.example:published-protos:1.0.0'
}

Pre-compiled protoc artifacts

This Maven Central directory lists pre-compiled protoc artifacts that can be used by this plugin.

Tips for IDEs

IntelliJ IDEA

Be sure to enable "Delegate IDE build/run actions to Gradle" so that IntelliJ does not use its internal build mechanism to compile source code. This plugin ensures that code generation happens before Gradle's build step; if the setting is off, IntelliJ's own build system will be used instead of Gradle.

Enable the setting with:

Settings -> Build, Execution, Deployment
  -> Build Tools -> Gradle -> Runner
  -> Delegate IDE build/run actions to gradle.

This plugin integrates with the idea plugin and automatically registers the proto files and generated Java code as sources.
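For example, applying the core idea plugin alongside this plugin is sufficient (a minimal sketch for a plain Java project); the registration described above then happens automatically:

plugins {
  id "java"
  id "idea"
  id "com.google.protobuf" version "0.9.4"
}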

Testing the plugin

The testProject* directories are test projects that use this plugin to compile .proto files. Because the tests include an Android project, you need to install the Android SDK Tools.

After you make any change to the plugin, be sure to run these tests:

$ ./gradlew test

protobuf-gradle-plugin's People

Contributors

aantono, bigdaz, clayburn, ejona86, eskatos, fml2, gavra0, jaredsburrows, marcoferrer, mkobit, mleinart, mrylander, ngyukman, noel-yap, oehme, pcostell, rougsig, rpalcolea, runningcode, steffenyount, stephenh, stkent, the-alchemist, tinsolo123, valkolovos, voidzcy, wolfs, yifeizhuang, zhangkun83, zpencer

protobuf-gradle-plugin's Issues

Support earlier version of glibc

When trying to use a pre-compiled Linux protoc binary, I get an error:
/lib64/libc.so.6: version 'GLIBC_2.14' not found
If possible, recompile against a lower glibc version (2.10 or 2.12) to support a wider range of Linux systems.

Multiple protobufs inside a folder

Suppose you have multiple protobuf directories inside the proto folder, for example proto/proto1 and proto/proto2. Inside proto1 you have a proto file with an import that references a file inside proto1, inside proto2 you have a proto file with an import that references a file inside proto2, and both imported files have the same file name.

This causes an issue because the import has already been resolved from the previous folder. Is there a way to run the protobuf Gradle task multiple times while pointing to different folders?

Build python grpc?

By using io.grpc:protoc-gen-grpc-java as an artifact for grpc plugin, I can generate grpc libraries in Java.

But I could not figure out how to generate python grpc libraries with protobuf-gradle-plugin.

Is it not supported yet?

GenerateProtoTask should support incremental build and more thorough clean

The current implementation of GenerateProtoTask does not support the scenario where a proto file is renamed or deleted. In that scenario, the old generated file is left behind and may cause an unwanted build success or failure (by duplication). The task should take an IncrementalTaskInputs parameter so it can remove stale output. It would then also be able to rebuild only the proto files that have changed, which is very useful for a large proto source set.

Proto sourceset not generated

After adding proto to the sourceSets block, I don't see any proto directory.

Here is the definition as added in my build.gradle file

android {
    sourceSets {
        main {
            proto {
                srcDir 'src/main/proto'
                include '**/*.protodevel'
            }
        }
    }
}

Breaking change of DefaultSourceDirectorySet constructor interface

The current protobuf-gradle-plugin cannot be used with gradle 2.12 due to the breaking change of the DefaultSourceDirectorySet constructor interface.

super(name, String.format("%s Proto source", name), fileResolver)

https://github.com/gradle/gradle/blob/REL_2.12/subprojects/core/src/main/groovy/org/gradle/api/internal/file/DefaultSourceDirectorySet.java#L44

A way to specify --descriptor_set_out on protoc command

Hi,

Our use case requires us to generate a descriptor file. Is there a way to specify --descriptor_set_out flag on protoc command via the plugin?

We've tried:

protobuf {
    generateProtoTasks {
        all().each { task ->
            task.builtins {
                descriptor_set {}
            }
        }
    }
}

But, it looks like this generates
protoc --descriptor_set_out=src/main/descriptor_set
where src/main/descriptor_set is a directory.

This doesn't work because protoc requires that --descriptor_set_out should point to a file instead of a directory.

Your help in this matter would be greatly appreciated.

Unable to compile with errors

Hi team.

I am attaching a set of files representing a simple proto setup. There is a gradle build (with wrapper) and a single proto file. I have followed the instructions on using the protobuf-gradle-plugin. However, I get the following errors when I run gradlew build.

proto.zip

:extractIncludeProto
:extractProto UP-TO-DATE
:generateProto FAILED

FAILURE: Build failed with an exception.

  • What went wrong:
    Execution failed for task ':generateProto'.

    protoc: stdout: . stderr: /media/Data/projects/new/protos/mcq/build/extracted-protos/main: warning: directory does not exist.
    google/protobuf/any.proto:95:10: "google.protobuf.Any.type_url" is already defined in file "build/extracted-include-protos/main/google/protobuf/any.proto".
    google/protobuf/any.proto:98:9: "google.protobuf.Any.value" is already defined in file "build/extracted-include-protos/main/google/protobuf/any.proto".
    google/protobuf/any.proto:74:9: "google.protobuf.Any" is already defined in file "build/extracted-include-protos/main/google/protobuf/any.proto".
    google/protobuf/type.proto: Import "google/protobuf/any.proto" was not found or had errors.
    google/protobuf/type.proto:171:3: "google.protobuf.Any" seems to be defined in "build/extracted-include-protos/main/google/protobuf/any.proto", which is not imported by "google/protobuf/type.proto". To use it here, please add the necessary import.
    build/extracted-include-protos/main/google/protobuf/api.proto: Import "google/protobuf/type.proto" was not found or had errors.
    build/extracted-include-protos/main/google/protobuf/api.proto:56:12: "Option" is not defined.
    build/extracted-include-protos/main/google/protobuf/api.proto:90:3: "Syntax" is not defined.
    build/extracted-include-protos/main/google/protobuf/api.proto:112:12: "Option" is not defined.
    build/extracted-include-protos/main/google/protobuf/api.proto:115:3: "Syntax" is not defined.

Another issue is that I am having to create an empty src/main/proto directory just because the plugin requires that. Is there a way to avoid this?

Ultimately, I want to be able to define a GRPC service in this proto file, and make it build with the grpc plugin to protoc.

Can someone kindly help?

Feature Request: support Java protoc plugins and allow specifying main class

We have an existing codegen plugin for protobuf that is invoked in a Maven build like this:

          <protocPlugins>
            <protocPlugin>
              <id>foo-protobuf</id>
              <groupId>com.foo.protobuf</groupId>
              <artifactId>foo-protobuf-plugin</artifactId>
              <version>HEAD-SNAPSHOT</version>
              <mainClass>com.foo.protobuf.compiler.plugin.PluginMain</mainClass>
            </protocPlugin>
          </protocPlugins>

If I try to add this same plugin the way this project specifies that plugins should be added:

plugins {
    foo {
      artifact = 'com.foo.protobuf:foo-protobuf-plugin:HEAD-SNAPSHOT'
    }
}

The build fails (target protobufToolsLocator_foo) because it's trying to download an executable artifact foo-protobuf-plugin-osx-x86_64.exe which doesn't exist (the artifact is actually foo-protobuf-plugin-HEAD-SNAPSHOT.jar).

I think I can work around this by creating a local executable that downloads the jar and runs it with the classname specified, but it would be rad if this library included this functionality.

Java keywords in proto files not handled correctly.

One of my protobufs has a field called "interface". When invoking protoc manually, the generated Java file has a field called "interface_". However when using the Gradle plugin to build, the field is named "interface" and thus the code doesn't compile.

Here is my build.gradle for reference:

apply plugin: 'java'
apply plugin: 'com.google.protobuf'

allprojects {
    repositories {
        mavenCentral()
    }
}

buildscript {
  repositories {
    mavenCentral()
  }
  dependencies {
    classpath 'com.google.protobuf:protobuf-gradle-plugin:0.7.0'
  }
}

protobuf {
  protoc {
    // The version of protoc must match protobuf-java. If you don't depend on
    // protobuf-java directly, you will be transitively depending on the
    // protobuf-java version that grpc depends on.
    artifact = "com.google.protobuf:protoc:3.0.0-beta-1"
  }
  plugins {
    grpc {
      artifact = 'io.grpc:protoc-gen-grpc-java:0.9.0'
    }
  }
  generateProtoTasks {
    all()*.plugins {
      grpc {}
    }
  }
}

sourceSets {
  main {
    proto {
      srcDir 'proto'
    }
  }
}

dependencies {
  compile 'io.grpc:grpc-all:0.9.0'
}

The proto:

syntax = "proto3";
package foo;

message Foo {
  string interface = 1;
}

generatedFileDir does not follow Gradle Java Plugin conventions (nor can you)

In the plugin, the generated source directory is always set to ${generatedFileDir}/${sourceSet.name}.

Given generatedFileDir = "${projectDir}/src", this dumps the compiled sources into subproject/src/main/com/test vs. a more canonical subproject/src/main/java/com/test. Since the sourceSet.name is always appended to the generatedFileDir, there's no good way to get the latter behavior.

The default behavior of the Java Plugin would be ${project.projectDir}/src/${sourceSet.name}/java

I'm afraid I'm not familiar enough with Gradle internals to suggest a better solution, but I'd almost rather just be able to say generatedFileDir = "${projectDir}/src/main/java" and not worry about the sourceSet name.

InvalidUserDataException thrown without any details

InvalidUserDataException message should say that protoc failed.

In my particular case, the dynamic linker failed because of libc issues.

It seems like the missing message could be caused by not calling waitForProcessOutput() [as required](http://docs.groovy-lang.org/latest/html/groovy-jdk/java/lang/Process.html#consumeProcessOutput%28java.io.OutputStream, java.io.OutputStream%29). It could also be because we are passing a non-threadsafe output stream to two different threads. There don't seem to be any pre-made solutions for solving the threading issue.

Premature file validation in ToolsLocator causes slowdown during configuration phase

I'm trying to use the protobuf-gradle-plugin in an environment with hundreds of gradle projects. In this environment it is vital that we keep the Configuration phase time spent on each of the individual projects down to a minimum since they accumulate to affect every subsequent invocation of gradle. For the most part we have achieved acceptable Configuration phase times by making sure that all our projects' dependency resolution and related file I/O is deferred to the Execution phase.

One simple test we use for detecting premature dependency resolution and related file I/O has been to run 'gradlew help --profile' from the root project directory, and then to look at the results under the 'Dependency Resolution' tab. The only entries we should find there when running the 'gradlew help' target are the unavoidable project classpath entries ending in '**:classpath'. Any other entries found there indicate premature dependency resolution and related file I/O being done during the Configuration phase for the listed dependency.

The protobuf-gradle-plugin is failing this test. We're seeing numerous entries ending in '**:protobufToolsLocator_protoc' under the 'Dependency Resolution' tab, and these are cumulatively adding more than a second to our total Configuration phase time on every invocation of gradle.

Looking at the source code it seems that the cause of this slowdown are lines 88-91 of ToolsLocator.groovy:

    File file = config.fileCollection(dep).singleFile
    if (!file.canExecute() && !file.setExecutable(true)) {
      throw new GradleException("Cannot set ${file} as executable")
    }

Where the config.fileCollection(dep).singleFile is forcing dependency resolution to happen during the Configuration phase and then the !file.canExecute() && !file.setExecutable(true) is also forcing file I/O to happen during the Configuration phase.

Can this be refactored to either 1) remove this dependency check here, 2) allow this dependency check to be disabled here, or even better yet 3) defer this dependency check until the Execution phase when the item being verified is actually going to be used?

proto import of main proto from test fails by default

See pcostell#1 for an example test of where this fails.

For normal java imports, the test directory gets the main directory as a dependency. However, this doesn't happen for proto imports.

I was able to resolve this by updating my generateProtoTasks with:

  generateProtoTasks {
    ofSourceSet("test").each {task ->
      sourceSets.main.proto.srcDirs.each { srcDir ->
        task.include srcDir
      }
    }
  }

I would expect this to happen automatically for all test source sets to match the java rules.

Support protoc --proto_path

When you define protos importing definitions from another project, protoc --proto_path=blah allows you to depend on these without re-compiling them.

In your plugin, adding another srcDir will compile the dependent protos, resulting in duplicates in a multi-module project. The Maven protobuf plugin correctly adds dependent protos to the include path without compiling them, as does Thomas Lee's gradle-protoc-plugin.

It would be great if you could include the ability to modify only the include path, without adding the included proto files as arguments requiring compilation. Whether this is done using simple file sets or dependency archives isn't an issue.

Thanks

Incremental build seems not working

I encountered an issue where I built a project once and it failed because of an error in a proto file. I then fixed the proto file and tried to build again (without clean), and the proto file was not re-compiled.

Error:The 'java' plugin has been applied, but it is not compatible with the Android plugins

I simply added protobuf setting in build.gradle, and got the error "Error:The 'java' plugin has been applied, but it is not compatible with the Android plugins".

(Android Studio version 1.2.2).

build.gradle:

apply plugin: 'com.android.application'
apply plugin: 'com.google.protobuf'

buildscript {
    repositories {
        jcenter()
        mavenCentral()
    }
    dependencies {
        classpath "com.google.protobuf:protobuf-gradle-plugin:0.4.1"
    }
}

repositories {
    jcenter()
    mavenCentral()
    maven {
        url 'https://oss.sonatype.org/content/repositories/snapshots/'
    }
}

android {
...

"./gradlew clean" should remove files from generated source dirs

I'm setting generatedFilesBaseDir = "$projectDir/src/generated" so IntelliJ can more easily recognize my generated proto .java files as sources. When I re-generate these files, I want a clean to remove them (in case I renamed a class with java_outer_classname, for instance)

Protobuf nano and J2Objc translated classes in Swift

I have a J2ObjC project using protobuf nano. The issue I'm having is that the instance variables in the generated Objective-C code are ivars, and Swift cannot access them directly. I have to use the valueForKey("mResponseData_") syntax and cast the result to the correct type, which is not ideal. If I could annotate the generated protobuf Java classes with the J2ObjC @Property annotation, J2ObjC would create properties, but I have not found a way of annotating my .proto file to add @Property in the generated Java classes.

As a workaround I'm using option 'optional_field_style=accessors' to generate accessors for primitive data types. However, reference types do not have generated accessors, so for those I'll have to use the valueForKey syntax.

Is there a better way of doing this?

Thanks,

Gabriel

Infer protoc version from protobuf version automatically

Because protoc-generated code uses the Protobuf internal API, which doesn't have any stability guarantee, the version of protoc should always exactly match that of the protobuf runtime. We should make the protobuf plugin decide what version of protoc to use based on the version of protobuf/protobuf-nano in the Java compile dependencies, unless the user has explicitly specified a particular protoc version or binary.

gradle 2.12 issue

> Failed to apply plugin [id 'com.google.protobuf']
   > Could not create an instance of type com.google.protobuf.gradle.ProtobufSourceDirectorySet_Decorated.

generated files are always named Protos.java

Am I missing something or does the generateProto task always generate a file named Protos.java?
Ideally, I would like it to generate multiple files (one per message). I see there is a way to specify option 'multiple_files=true' for javanano. I was hoping for something similar.

Justify the protobuf/testProtobuf dependency feature or remove it

The Protobuf dependencies, i.e., protobuf, testProtobuf, etc., define packages of proto files that will be compiled along with those under the src dir. This feature is inherited from @aantono's plugin. However, I don't feel it should be called a dependency; it's just another form of sources. The scenarios described in #15 and #8 sound more like dependencies.

Maybe we should move this feature into sourceSets.

@aantono @ejona86 WDYT?

Expose a way to query builtins/plugins at Gradle configuration time

A bit of background first... I am trying to generate both Java and C++ bindings and then publish the Java classes as well as the C++ source files to a Maven repository.

I currently have:

protobuf {
    protoc {
        artifact = 'com.google.protobuf:protoc:3.0.0-beta-1'
    }
    generateProtoTasks {
        all().each { t ->
            t.builtins {
                cpp {}
            }
        }
    }
}

task zipCpp(type: Zip, dependsOn: 'generateProto') {
    from "${buildDir}/generated/source/proto/main/cpp"
}
assemble.dependsOn zipCpp

It would be nice if the zip task could discover the outputs of a builtin with a given name. Something like:

task zipCpp(type: Zip, dependsOn: 'generateProto') {
    from tasks.getByName('generateProto') { t ->
        t.builtins.findAll { it.name == 'cpp' }.each { b->
            t.outputDir(b)
        }
    }
}

Or even better, move some of that complexity into the actual plugin. At the moment the task is not available at configuration time, and moving the creation of the zip task inside the generateProtoTasks closure is even more complicated. Suggestions/workarounds welcome.

Support for the native project (C++)

Gradle now supports native compilation (C++), and it would be great if this plugin could support it natively. The big problem is how a native project differs from a Java project in Gradle. However, I'm confident that we can find a way to support both at the same time with minimal code duplication. I'm willing to help with this contribution, but some guidance on the roadmap of this plugin would be required. The code is extremely Java-project centric, and some big changes would be required in order to support both Java and C++.

As for some background about me, I have been working with protobuf in a C++ environment compiled with Gradle. I had to hack together a native plugin for protobuf with what was available at the time. I cannot contribute that code directly as it was really a one-off solution, but I do have ideas on how we can make this possible for the longer term.

Ideas? Thoughts? Complaints?

google/protobuf/any.proto: File not found

I am trying to use protobuf-gradle-plugin with proto3. However, it seems that the plugin is unable to import runtime proto definitions, such as google/protobuf/any.proto.

According to the documentation for the Any type, I added the following to the beginning of my proto file:

syntax = "proto3";

import "google/protobuf/any.proto";

And following the documentation for the protobuf-gradle-plugin:

buildscript {
  repositories {
    mavenCentral()
  }
  dependencies {
    ...
    classpath 'com.google.protobuf:protobuf-gradle-plugin:0.7.0'
  }
}

...

project(':azkaban-common') {
  apply plugin: 'com.google.protobuf'

  protobuf {
    protoc {
      artifact = 'com.google.protobuf:protoc:3.0.0-beta-1'
    }
  }

  ...
}

However, when I tried to build, I got an error saying that google/protobuf/any.proto was not found:

❯❯❯ gradle build -x test
Picked up JAVA_TOOL_OPTIONS: -Djava.awt.headless=true
:azkaban-common:extractIncludeProto
:azkaban-common:extractProto
:azkaban-common:generateProto FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':azkaban-common:generateProto'.
> protoc: stdout: . stderr: /Volumes/Ocean/Projects/azkaban/azkaban/azkaban-common/build/extracted-protos/main: warning: directory does not exist.
  /Volumes/Ocean/Projects/azkaban/azkaban/azkaban-common/build/extracted-include-protos/main: warning: directory does not exist.
  google/protobuf/any.proto: File not found.
  workflow.proto: Import "google/protobuf/any.proto" was not found or had errors.
  workflow.proto:27:3: "Any" is not defined.


* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.

BUILD FAILED

Total time: 0.922 secs

Best way to create proto only packages?

In an effort to eliminate copy/pasting *.proto files into many projects across languages, I'd like to have a Gradle project that contains only the proto files and publishes them in the released artifact. What is the best way to do so? I'd like to use a standard dependency like compile "com.mypackage:my-service-definition:1.0" and have it be available and used somehow within the project. I saw ways to reference protobuf file(...) and others, but it wasn't clear how to adapt this for what I'm after. Thanks for any help you can provide!

Support protoc plugin options

Plugins sometimes support parameters by passing semicolon-delimited options before the output file name of the out parameter passed to protoc.

Nano for Java uses this heavily. To use Nano with gRPC you also need to add nano=true.

We need a way to provide such options.

google/api/annotations.proto: File not found.

Hey,

I'm starting using protobuf and this might be a newbie question so be patient :)

On my service.proto definition I have

syntax = "proto3";
import "google/api/annotations.proto";

and on build.gradle

buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath 'com.google.protobuf:protobuf-gradle-plugin:0.7.1'
    }
}

repositories {
    mavenCentral()
}

dependencies {
    compile 'com.google.protobuf:protoc:3.0.0-beta-1'
    compile 'io.grpc:grpc-core:0.9.0'
    compile 'io.grpc:grpc-stub:0.9.0'
    compile 'io.grpc:grpc-netty:0.9.0'
    compile 'io.grpc:grpc-protobuf:0.9.0'
}

protobuf {
    protoc {
        // The version of protoc must match protobuf-java. If you don't depend on
        // protobuf-java directly, you will be transitively depending on the
        // protobuf-java version that grpc depends on.
        artifact = "com.google.protobuf:protoc:3.0.0-beta-1"
    }
    plugins {
        grpc {
            artifact = 'io.grpc:protoc-gen-grpc-java:0.9.0'
        }
    }
    generateProtoTasks {
        all()*.plugins {
            grpc {}
        }
    }
    generatedFilesBaseDir = "$projectDir/src/generated"
}

Custom source sets are not supported

Hi,
I have defined a custom source set for a Java application:
sourceSets { generated }, but this source set is untouched when compilation is run:

06:44:08.182 [DEBUG] [org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter] Starting to execute task ':generateProto'
06:44:08.183 [INFO] [org.gradle.api.internal.file.collections.DirectoryFileTree] file or directory '/home/admin/workspace/travel-vendor-service-sncf/src/main/proto', not found
06:44:08.183 [INFO] [org.gradle.api.internal.file.collections.DirectoryFileTree] file or directory '/home/admin/workspace/travel-vendor-service-sncf/build/extracted-protos/main', not found
06:44:08.183 [INFO] [org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter] Skipping task ':generateProto' as it has no source files.
06:44:08.183 [DEBUG] [org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter] Finished executing task ':generateProto'

Is it possible to enhance plugin to support custom source sets?

Please apply the Java plugin or the Android plugin first

A complete minimal example build.gradle file that uses the new plugins block would be appreciated; https://plugins.gradle.org/plugin/com.google.protobuf suggests the new "plugin mechanism introduced in Gradle 2.1" build script snippet should be:

plugins {
    id "com.google.protobuf" version "0.7.0"
}

I tried setting up a very minimal project with only two proto files in src/main/proto (renamed to nyct-subway.proto and gtfs-realtime.proto), and I can't seem to get the protobuf plugin to stop complaining "Please apply the Java plugin or the Android plugin first", though that message is not very specific, since http://plugins.gradle.org/plugin/java isn't a plugin.

A broken build.gradle using new plugins mechanism

plugins {
//    id "java" //Having this line or not changes nothing.
    id "com.google.protobuf" version "0.7.0"
}
group 'foo'
version '0.1-SNAPSHOT'

apply plugin: 'java'
apply plugin: "com.google.protobuf"

sourceCompatibility = 1.5

repositories {
    mavenCentral()
}

protobuf {
    // Configure the protoc executable
    protoc {
        // Download from repositories
        artifact = 'com.google.protobuf:protoc:3.0.0-alpha-3'
    }
}

The message occurs no matter the task, including unrelated tasks like 'wrapper'. For now I'm using the generateProto task, because I haven't seen a clear explanation of what tasks this plugin adds by default. That said, it does seem to correctly get the protoc dependency on the first invocation.

The generateProto task will work correctly when given a build.gradle file like:

A working build.gradle file using the older buildscript block

buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath 'com.google.protobuf:protobuf-gradle-plugin:0.7.0'
    }
}

group 'foo'
version '0.1-SNAPSHOT'

apply plugin: 'java'
apply plugin: "com.google.protobuf"

sourceCompatibility = 1.5

repositories {
    mavenCentral()
}

protobuf {
    // Configure the protoc executable
    protoc {
        // Download from repositories
        artifact = 'com.google.protobuf:protoc:3.0.0-alpha-3'
    }
}

You might consider fixing support for the new plugin mechanism, making the error message much more detailed, or adding detection specific to the new plugin mechanism that produces an error message about the lack of support or compatibility, with the suggestion to use the buildscript block style.

protoc binaries are not resolved using Nexus private repo + inconsistent debug messages

Environment:

  • The issue reproduces on v0.7.1
  • The Gradle build fragment specifies:
protobuf {
  protoc {
    artifact = 'com.google.protobuf:protoc:2.6.1'
  }
}

But getting errors:

A problem occurred configuring root project 'my-project-name'
Could not find protoc-windows-x86_64.exe (com.google.protobuf:protoc:2.6.1)
Searched in the following locations:
http://my-private-nexus-proxy:3000/artifacts/content/groups/PR_Group/com/google/protobuf/protoc/2.6.1/protoc-2.6.1-windows-x86_64.exe

The strange thing is that the path 'http://my-private-nexus-proxy:3000/artifacts/content/groups/PR_Group/com/google/protobuf/protoc/2.6.1/protoc-2.6.1-windows-x86_64.exe' is CORRECT.

Notes:

  • The reference to 'protoc-windows-x86_64.exe' is not correct and should be 'protoc-2.6.1-windows-x86_64.exe'
  • The path to my private Nexus repository is valid and it is not clear why it cannot resolve the existing artifact

proto import statement from Android project to Android project broken?

I accidentally opened this in the old repository.

I have two Android libraries, both of which have protos. ProjectB depends on ProjectA -- compile project(':projectA'). The only non-standard thing I've done is to disable nano protos and use the normal style protos (reverting that doesn't solve my problem).

Both protos can compile just fine when they're independent. However, if I add an import statement within protoB.proto to protoA.proto, I get an error:
protoA.proto: File not found.
protoB.proto: Import "protoA.proto" was not found or had errors.

I think this issue is similar to #22 but in my case, both projects are Android instead of plain Java. Should this work?

https://github.com/google/protobuf-gradle-plugin/blob/master/testProjectAndroid/build.gradle shows an Android app depending on protos defined in another Java project. If that works, shouldn't it still work when the other project is an Android project?

Should I move all protos into their own Java projects and have the Android projects depend on that instead?

Details:
classpath 'com.google.protobuf:protobuf-gradle-plugin:0.7.1'
proto_library_version = '3.0.0-beta-1'
google_proto_library = "com.google.protobuf:protobuf-java:${proto_library_version}"
google_protoc_artifact = "com.google.protobuf:protoc:${proto_library_version}"


I just confirmed my theory by converting ProjectA to a Java project (ProjectB is still an Android library). All of a sudden, protoB.proto is allowed to import a proto from protoA.proto.

In my case, I think I can just convert both of them to Java projects and be done with it. I made them Android library projects even though they only contain regular Java (there was some limitation with Android and eclipse long ago). But it still seems like this shouldn't be a restriction.

Duplicate Entry Errors with Android and MultiDex

I'm following the Android example project here: https://github.com/google/protobuf-gradle-plugin/tree/master/testProjectAndroid

My project uses multidex and when I integrate this into my project I start getting the following error:

Error:Execution failed for task ':MyApp:packageAllDebugClassesForMultiDex'.
> java.util.zip.ZipException: duplicate entry: io/grpc/protobuf/DeferredProtoInputStream.class

Normally this is fixed by applying an exclusion because you have a duplicate entry somewhere. I confirmed that I do not have a duplicate entry by inspecting the gradle dependency graph. The dependency related to protobuf is this one:

compile 'io.grpc:grpc-all:0.7.0'

I made some changes to it, recompiled (rinse/wash/repeat) and eventually got to the point where I was having to exclude everything the dependency relies on:

    compile('io.grpc:grpc-all:0.7.0') {
        exclude group: 'io.grpc', module: 'grpc-protobuf'
        exclude group: 'io.grpc', module: 'grpc-protobuf-nano'
        exclude group: 'io.grpc', module: 'grpc-stub'
    }

At this point the app breaks because I'm using the nano messages.

I have not referenced anything proto related anywhere else in my project and this is the first time I've introduced it to my project.

What do I need to do in order to get this working? What needs to change? Is there a bug with multidex?

break connection to parent repository (aantono)

@aantono, @zhangkun83 - can you send a request to GitHub asking to reparent repositories from aantono/gradle-plugin-protobuf to google/protobuf-gradle-plugin. Everything should be forked from the google repo. This also overcomes a limitation of GitHub which disallows people from searching forks. As google/protobuf-gradle-plugin is currently a fork of aantono/gradle-plugin-protobuf, this means you can't search. I've gone through the reparent process before but GitHub will naturally only accept a request from you both.

Error:(4, 0) Cause: com/google/protobuf/gradle/ProtobufPlugin : Unsupported major.minor version 52.0

Environment:

Android Studio: 1.2.2
Gradle version: 2.2.1
Android plugin version: 1.1.0
jdk version: java-7-openjdk-amd64

build.gradle:

apply plugin: 'com.android.application'
apply plugin: 'com.google.protobuf'

buildscript {
    repositories {
        maven {
            url 'https://oss.sonatype.org/content/repositories/snapshots/'
        }
        jcenter()
        mavenLocal()
        mavenCentral()
    }
    dependencies {
        classpath 'com.google.protobuf:protobuf-gradle-plugin:0.5.0-SNAPSHOT'
    }
}

Setting an output directory

"python" below sets the python_out flag; but how do you assign a value?
e.g., python_out=$buildDir/generated-sources/main/python

sourceSets {
  main {
    proto {
      builtins {
        python {
        }
      }
    }
  }
}

google/protobuf-gradle-plugin is forked from personal repo and unsearchable

@aantono

It's currently impossible to search the protobuf-gradle-plugin repository as it's a fork of @aantono's public repository; this is a limitation of GitHub (see screenshot below). By following the help system, you can make a request to GitHub to remove this parent relationship. I've done this before successfully by making the request, so it does work. This will be helpful for two reasons:

  1. A Google project should not be a fork from a personal project. It's bizarre to see that.
  2. In the request, please ask GitHub to enable searching for this repository. Hopefully that should be automatic or easy for them to do.

An alternative is for @aantono to delete his public repository... but that may result in another repository that was a fork of @aantono's becoming the new "parent". It depends on when the accounts were created.

Gradle build failing with gradle 2.12

The gradle build is failing with this release.
i am using gradle 2.12.

Error Message : Error:(2, 0) Cause: org/gradle/api/internal/file/collections/DefaultDirectoryFileTreeFactory

When i switch back to the 0.7.5 release, it builds fine

Compiler generates invalid Java code when processing packageless proto

The following protobufs cause the plugin to generate invalid Java:

all.proto

syntax = "proto3";

// This file is meant to be used for convenient importing of frequently used protobufs
// Note that no package is defined

import public "demo/one.proto";
import public "demo/two.proto";
import public "demo/three.proto";

demo/one.proto

syntax = "proto3";

package demo;

message One {
    string one = 1;
}

demo/two.proto and demo/three.proto follow the same pattern.

demo.proto

syntax = "proto3";

package demo;

import "all.proto";

message Numbers {
    demo.One one = 1;
    demo.Two two = 2;
    demo.Three three = 3;
}

Below is a snippet from the generated file Demo.java:

// Generated by the protocol buffer compiler.  DO NOT EDIT!
// source: demo.proto

package demo;

public final class Demo {

  /* omitted */

  private static com.google.protobuf.Descriptors.FileDescriptor descriptor;

  static {
    java.lang.String[] descriptorData = { /* omitted */ };
    com.google.protobuf.Descriptors.FileDescriptor.InternalDescriptorAssigner assigner = /* omitted */;
    com.google.protobuf.Descriptors.FileDescriptor
      .internalBuildGeneratedFileFrom(descriptorData,
        new com.google.protobuf.Descriptors.FileDescriptor[] {
          .All.getDescriptor(), // <-- Note the leading dot, which prevents compilation
        }, assigner);
    /* omitted */
  }

  // @@protoc_insertion_point(outer_class_scope)
}

The plugin adds an extra dot before "All", resulting in invalid Java. The workaround is to specify a package in all.proto.

Organize the options better

Most of the options are exposed as top-level conventions, which pollutes the project namespace and doesn't read well. We'd like them to be better organized, possibly put under a big top-level protobuf {} block, and to use a more Gradle-like style, e.g., change

protocDep = 'com.google.protobuf:protoc:3.0.0-alpha-2'
protobufNativeCodeGenPluginDeps = ["grpc:io.grpc:protoc-gen-grpc-java:0.1.0-SNAPSHOT"]

to

protobuf {
  protocExecutable {
    artifact 'com.google.protobuf:protoc:3.0.0-alpha-2'
  }
  codeGenPlugins {
    grpc {
      artifact 'io.grpc:protoc-gen-grpc-java:0.1.0-SNAPSHOT'
    }
  }
}
