
sparkjni's Introduction

SparkJNI

SparkJNI is a framework that enables the integration of native kernels (FPGA, OpenCL, C++) with Apache Spark, targeting faster execution of Big Data applications. It is meant to reduce the development effort of native-accelerated Spark applications by relieving developers of the JNI programming involved.

Setup

In order to run the simplest example, first set a system-wide JAVA_HOME environment variable pointing to your JDK installation (JDK 7 or newer): create a .sh file containing export JAVA_HOME=<where-java-is-installed> in /etc/profile.d/ and make sure it is loaded by logging out and back in.

If you don't have Maven installed, please install it with sudo apt-get install maven.

Go to the root of the repository and run sudo mvn clean install. This performs a full build and test of the project and its modules. A build without the unit and integration tests can be done with sudo mvn clean install -DskipTests. A full build should pass all tests. If the build is run in a resource-constrained environment, provide an extra parameter to the Maven command, mvn clean install -DtestMode=constrained, to ensure tests do not fail due to memory requirements.

Annotating a sample SparkJNI-enabled application

SparkJNI generates JNI native code based on annotations found on user-defined types and functions. These annotations are demonstrated next based on an example application: we define a list of vectors on which we apply a map transformation, then a reduce. The main program is given in the VectorOpsMain.java source file, in the vectorOps package of the sparkjni-examples module.

SparkJNI containers

SparkJNI containers are user-defined types that can be passed in and retrieved from SparkJNI-enabled transformations. These containers hold data of interest that can be processed in the native code.

Going back to the sample VectorOps application, the container which holds the vector contents, VectorBean, is defined below:

@JNI_class public class VectorBean extends JavaBean {
    @JNI_field int[] data;

    @JNI_method public VectorBean(@JNI_param(target = "data") int[] data) {
        this.data = data;
    }
}

This container, like any other user-defined container, needs to be annotated with @JNI_class and must extend the SparkJNI-defined JavaBean. Next, each container field that we want exposed on the native side should be annotated with @JNI_field. Last, a setter-constructor (one that initializes only the @JNI_field-annotated fields) should be implemented and annotated with @JNI_method.
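
These rules apply to containers with any number of exposed fields. As a minimal sketch (MatrixBean and its fields are hypothetical and not part of the example application), a container with two annotated fields would follow the same pattern:

@JNI_class public class MatrixBean extends JavaBean {
    // Fields exposed to the native side are annotated with @JNI_field.
    @JNI_field int[] data;
    @JNI_field int rowLength;

    // The setter-constructor initializes only the @JNI_field-annotated fields,
    // mapping each parameter to its target field through @JNI_param.
    @JNI_method public MatrixBean(@JNI_param(target = "data") int[] data,
                                  @JNI_param(target = "rowLength") int rowLength) {
        this.data = data;
        this.rowLength = rowLength;
    }
}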

JNI functions

JNI functions are user-defined classes which are used to transfer control to native code. These classes need to satisfy the following requirements:

  1. Extend one of the JniReduceFunction or JniMapFunction SparkJNI types.

  2. Declare a public default constructor.

  3. Override the two-argument constructor from their supertype.

  4. Contain at least one declaration of a native Java method (with the native modifier), with the required number of arguments.

In the case of the example application, we define the two classes used for the reduce and the map transformations:

@JniFunction
public class VectorAddJni extends JniReduceFunction {
    public VectorAddJni() {
    }

    public VectorAddJni(String nativePath, String nativeFunctionName) {
        super(nativePath, nativeFunctionName);
    }

    public native VectorBean reduceVectorAdd(VectorBean v1, VectorBean v2);
}

and

@JniFunction
public class VectorMulJni extends JniMapFunction {
    public VectorMulJni() {
    }

    public VectorMulJni(String nativePath, String nativeFunctionName) {
        super(nativePath, nativeFunctionName);
    }

    public native VectorBean mapVectorMul(VectorBean inputVector);
}

Deployment modes

There are two ways of building Spark applications with SparkJNI:

  1. By using the SparkJNI Maven generator plugin (recommended).

  2. By programmatically configuring SparkJNI before the actual application execution flow.

SparkJNI generator plugin

The generator plugin works by retrieving user-defined classes annotated with SparkJNI annotations. The native code is automatically generated at build time (e.g. during mvn install). In order to use SparkJNI in this manner, include the following build configuration in the application's pom:

<project>
...
    <build>
        <plugins>
            <plugin>
                <groupId>tudelft.ceng.tvoicu</groupId>
                <artifactId>sparkjni-generator-plugin</artifactId>
                <version>0.2</version>
                <executions>
                    <execution>
                        <phase>install</phase>
                        <goals>
                            <goal>sparkjni-generator</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
...
</project>

The build will succeed only if SparkJNI has been previously built locally (i.e. the generator artifact exists in the local Maven repository).

Deploying SparkJNI programmatically

For more experienced users (or in cases where Maven is not used), SparkJNI can be configured before application deployment. The following example showcases constructing a sample application, including the configuration of SparkJNI.

The containers and JNI functions defined above are used in the main class (VectorOpsMain). The following code snippet showcases the programmatic configuration. A SparkJni instance can be created using the built-in builder by setting two mandatory parameters:

  1. The native path points to the directory where the native code will be generated, including the template kernel file.

  2. The app name indicates the kernel file and native library name (native/path/appName.cpp and native/path/appName.so).

private static void initSparkJNI(){
    // nativePath, appName, sparkjniClasspath and examplesClasspath are application-specific values defined elsewhere.
    SparkJni sparkJniProvider = new SparkJniSingletonBuilder()
            .nativePath(nativePath)
            .appName(appName)
            .build();
    sparkJniProvider.setDeployMode(new DeployMode(JUST_BUILD))
            .addToClasspath(sparkjniClasspath, examplesClasspath);
    // Register control and data transfer links (classes).
    sparkJniProvider.registerContainer(VectorBean.class)
            .registerJniFunction(VectorMulJni.class)
            .registerJniFunction(VectorAddJni.class);
    // Trigger code generation and/or the native build, according to the deploy mode.
    sparkJniProvider.deploy();
}

The native code deploy mode is set by default to FULL_GENERATE_AND_BUILD. However, in advanced use cases, it can be set to a different deploy mode (here it is JUST_BUILD, which does not overwrite the native source files).
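
As a minimal sketch (assuming FULL_GENERATE_AND_BUILD is available in the same way as the JUST_BUILD constant used above), reverting to the default behavior only requires changing the deploy mode before deploy() is called:

// Regenerate the native sources and rebuild the native library (the default behavior).
sparkJniProvider.setDeployMode(new DeployMode(FULL_GENERATE_AND_BUILD));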

SparkJNI works based on type mappings, so the containers and JniFunctions that will be used in the target application need to be indexed. In the above example, they are registered with the previously built SparkJni instance.

Finally, and most importantly, code generation needs to be triggered after the SparkJNI configuration by calling deploy().

public static void main(String[] args){
    initSparkJNI();
    JavaRDD<VectorBean> vectorsRdd = getSparkContext().parallelize(generateVectors(2, 4));
    JavaRDD<VectorBean> mulResults = vectorsRdd.map(new VectorMulJni(libPath, "mapVectorMul"));
    VectorBean results = mulResults.reduce(new VectorAddJni(libPath, "reduceVectorAdd"));
}

After successful deployment of a SparkJNI configuration, a Spark application can use the populated native functionality. Instead of using anonymous classes in transformations, you can use the previously defined JniFunctions. These functions need to be instantiated with the native library path and the target native function name as arguments.

Native code

After the Java application has been constructed, annotated with SparkJNI annotations and built (depending on the SparkJNI deployment mode chosen), native code functionality can be inserted in the generated native functions.

In our VectorOps example, we have to populate the native functions with the desired behavior in the vectorOps.cpp kernel file:

...
// Reduce kernel: element-wise addition, accumulating the second vector into the first.
std::shared_ptr<CPPVectorBean> reduceVectorAdd(std::shared_ptr<CPPVectorBean>& cppvectorbean0, std::shared_ptr<CPPVectorBean>& cppvectorbean1,  jclass cppvectorbean_jClass, JNIEnv* jniEnv) {
	for(int idx = 0; idx < cppvectorbean0->getdata_length(); idx++)
		cppvectorbean0->getdata()[idx] += cppvectorbean1->getdata()[idx];
	return cppvectorbean0;
}

// Map kernel: element-wise multiplication of the input vector by 2.
std::shared_ptr<CPPVectorBean> mapVectorMul(std::shared_ptr<CPPVectorBean>& cppvectorbean0,  jclass cppvectorbean_jClass, JNIEnv* jniEnv) {
	for(int idx = 0; idx < cppvectorbean0->getdata_length(); idx++)
		cppvectorbean0->getdata()[idx] *= 2;
	return cppvectorbean0;
}

Now, if the application has been deployed using the generator plugin, the project needs to be built again (mvn clean install). Otherwise, if SparkJNI has been configured programmatically, the application can be run directly (provided the native code build was enabled). The program can then be run either from the IDE or by packaging the jar and submitting it with ./spark-submit.

Virtualized deployment with Vagrant

For automated deployment and vanilla testing of the SparkJNI framework, one can use the Vagrant build script deployVagrantTestingVM.sh. This launches an Ubuntu 16.04 virtual machine, installs all dependencies, and builds and tests the latest SparkJNI version from GitHub.

sparkjni's People

Contributors

shanshanren, tudorv91


sparkjni's Issues

Test testReduceJniFunctionPrototype(sparkjni.jniLink.FunctionSignatureMapperProviderTest) fails during building

Hi! I tried building the project using the command mvn clean install but the following error occurs:

java.io.IOException: Cannot run program "javah": error=2, No such file or directory
	at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1128)
	at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1071)
	at java.base/java.lang.Runtime.exec(Runtime.java:592)
	at java.base/java.lang.Runtime.exec(Runtime.java:416)
	at java.base/java.lang.Runtime.exec(Runtime.java:313)
	at sparkjni.utils.JniUtils.runProcess(JniUtils.java:267)
	at sparkjni.utils.JniLinkHandler.javah(JniLinkHandler.java:104)
	at sparkjni.jniLink.FunctionSignatureMapperProviderTest.setUp(FunctionSignatureMapperProviderTest.java:39)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:252)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:141)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:112)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
	at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
	at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:115)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:75)
Caused by: java.io.IOException: error=2, No such file or directory
	at java.base/java.lang.ProcessImpl.forkAndExec(Native Method)
	at java.base/java.lang.ProcessImpl.<init>(ProcessImpl.java:340)
	at java.base/java.lang.ProcessImpl.start(ProcessImpl.java:271)
	at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1107)
	... 28 more
java.io.IOException: Cannot run program "javah": error=2, No such file or directory
	at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1128)
	at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1071)
	at java.base/java.lang.Runtime.exec(Runtime.java:592)
	at java.base/java.lang.Runtime.exec(Runtime.java:416)
	at java.base/java.lang.Runtime.exec(Runtime.java:313)
	at sparkjni.utils.JniUtils.runProcess(JniUtils.java:267)
	at sparkjni.utils.JniLinkHandler.javah(JniLinkHandler.java:104)
	at sparkjni.jniLink.FunctionSignatureMapperProviderTest.setUp(FunctionSignatureMapperProviderTest.java:39)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:252)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:141)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:112)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
	at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
	at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:115)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:75)
Caused by: java.io.IOException: error=2, No such file or directory
	at java.base/java.lang.ProcessImpl.forkAndExec(Native Method)
	at java.base/java.lang.ProcessImpl.<init>(ProcessImpl.java:340)
	at java.base/java.lang.ProcessImpl.start(ProcessImpl.java:271)
	at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1107)
	... 28 more
Tests run: 2, Failures: 1, Errors: 1, Skipped: 0, Time elapsed: 0.181 sec <<< FAILURE!
testMapperJniFunctionPrototype(sparkjni.jniLink.FunctionSignatureMapperProviderTest)  Time elapsed: 0.004 sec  <<< ERROR!
java.lang.NullPointerException
	at sparkjni.jniLink.FunctionSignatureMapperProviderTest.testMapperJniFunctionPrototype(FunctionSignatureMapperProviderTest.java:61)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:252)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:141)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:112)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
	at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
	at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:115)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:75)

testReduceJniFunctionPrototype(sparkjni.jniLink.FunctionSignatureMapperProviderTest)  Time elapsed: 0.001 sec  <<< FAILURE!
junit.framework.AssertionFailedError
	at junit.framework.Assert.fail(Assert.java:55)
	at junit.framework.Assert.assertTrue(Assert.java:22)
	at junit.framework.Assert.assertNotNull(Assert.java:256)
	at junit.framework.Assert.assertNotNull(Assert.java:248)
	at junit.framework.TestCase.assertNotNull(TestCase.java:417)
	at sparkjni.jniLink.FunctionSignatureMapperProviderTest.testReduceJniFunctionPrototype(FunctionSignatureMapperProviderTest.java:74)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:252)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:141)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:112)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
	at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
	at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:115)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:75)

Running sparkjni.jniLink.linkContainers.JniRootContainerTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.001 sec
Running sparkjni.utils.cpp.fields.CppFieldTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.008 sec
Running sparkjni.utils.CppBeanTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 sec
Running sparkjni.utils.SparkJniTest
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 sec
Running sparkjni.dataLink.CppBeanTest
Tests run: 10, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.008 sec

Results :

Failed tests:   testReduceJniFunctionPrototype(sparkjni.jniLink.FunctionSignatureMapperProviderTest)

Tests in error: 
  testMapperJniFunctionPrototype(sparkjni.jniLink.FunctionSignatureMapperProviderTest)

Tests run: 37, Failures: 1, Errors: 1, Skipped: 5

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for sparkjni 0.2:
[INFO] 
[INFO] sparkjni ........................................... SUCCESS [  0.160 s]
[INFO] core ............................................... FAILURE [  3.059 s]
[INFO] sparkjni-examples .................................. SKIPPED
[INFO] integration-test ................................... SKIPPED
[INFO] core-test .......................................... SKIPPED
[INFO] sparkjni-generator-plugin .......................... SKIPPED
[INFO] generator-test ..................................... SKIPPED
[INFO] sparkjni-benchmarks ................................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  3.309 s
[INFO] Finished at: 2022-05-16T15:38:06+03:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.4:test (default-test) on project core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/mary/Projects/SparkJNI/core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.4:test (default-test) on project core: There are test failures.

Please refer to /home/mary/Projects/SparkJNI/core/target/surefire-reports for the individual test results.
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:215)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:957)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:289)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:193)
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:566)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
Caused by: org.apache.maven.plugin.MojoFailureException: There are test failures.

Please refer to /home/mary/Projects/SparkJNI/core/target/surefire-reports for the individual test results.
    at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution (SurefireHelper.java:83)
    at org.apache.maven.plugin.surefire.SurefirePlugin.writeSummary (SurefirePlugin.java:176)
    at org.apache.maven.plugin.surefire.SurefirePlugin.handleSummary (SurefirePlugin.java:150)
    at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked (AbstractSurefireMojo.java:650)
    at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute (AbstractSurefireMojo.java:586)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:957)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:289)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:193)
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:566)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :core

This error is generated both for JDK 8 (openjdk1.8.0_302-jvmci-21.2-b08) and JDK 7 (JDK 1.7.0_80). The Maven version used is 3.6.3.
Are there any steps I need to follow before building, apart from setting the JAVA_HOME?

Thank you!
