
kunpengcompute / hadoop


This project forked from apache/hadoop


Apache Hadoop

Home Page: https://hadoop.apache.org/

License: Apache License 2.0


hadoop's Issues

Prepare a Hadoop development environment, then compile and test on openEuler

  • Development environment preparation
    • Set up a development IDE of your choice (IntelliJ IDEA, Vim, etc.)
    • Create a development branch rel-2.7.7-dev based on rel/release-2.7.7 in the kunpengcompute/hadoop repo
    • Work in groups on openEuler (ARM) virtual machines provided by Huawei Cloud
    • Read the Hadoop documentation, primarily the build guide
  • Compile Hadoop 2.7.7 on the openEuler ARM VM and write up the problems hit during compilation
  • Run the Hadoop test cases module by module and catalogue the problems encountered
  • Work in groups to resolve the compilation and test problems above
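The environment-preparation steps above can be sanity-checked before attempting the native build. A minimal sketch, assuming the toolchain listed in Hadoop 2.7.7's BUILDING.txt (JDK, Maven, CMake, gcc/g++, protobuf's protoc); the tool list is an assumption, adjust as needed:

```shell
# Sketch: verify the native-build toolchain is present before running
#   mvn package -Pdist,native -DskipTests -Dtar
# from the Hadoop source root. Tool names follow BUILDING.txt.
missing=0
for tool in java mvn cmake gcc g++ protoc; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "found: $tool"
    else
        echo "missing: $tool"
        missing=1
    fi
done
echo "toolchain check done (missing=$missing)"
```

Running this first makes the later linker and protobuf failures easier to attribute: they occur even with a complete toolchain, because they are library/architecture problems rather than missing tools.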

Compiling Hadoop fails with undefined references to dlopen and related functions

[exec] /usr/bin/c++ -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -rdynamic CMakeFiles/wordcount-simple.dir/main/native/examples/impl/wordcount-simple.cc.o -o examples/wordcount-simple libhadooppipes.a libhadooputils.a /usr/local/lib/libssl.a /usr/local/lib/libcrypto.a -lpthread
[exec] /usr/bin/ld: /usr/local/lib/libcrypto.a(dso_dlfcn.o): in function `dlfcn_globallookup':
[exec] dso_dlfcn.c:(.text+0x14): undefined reference to `dlopen'
[exec] /usr/bin/ld: dso_dlfcn.c:(.text+0x24): undefined reference to `dlsym'
[exec] /usr/bin/ld: dso_dlfcn.c:(.text+0x30): undefined reference to `dlclose'
[exec] /usr/bin/ld: /usr/local/lib/libcrypto.a(dso_dlfcn.o): in function `dlfcn_bind_func':
[exec] dso_dlfcn.c:(.text+0x3dc): undefined reference to `dlsym'
[exec] /usr/bin/ld: dso_dlfcn.c:(.text+0x48c): undefined reference to `dlerror'
[exec] /usr/bin/ld: /usr/local/lib/libcrypto.a(dso_dlfcn.o): in function `dlfcn_bind_var':
[exec] dso_dlfcn.c:(.text+0x508): undefined reference to `dlsym'
[exec] /usr/bin/ld: dso_dlfcn.c:(.text+0x5b4): undefined reference to `dlerror'
[exec] /usr/bin/ld: /usr/local/lib/libcrypto.a(dso_dlfcn.o): in function `dlfcn_load':
[exec] dso_dlfcn.c:(.text+0x618): undefined reference to `dlopen'
[exec] /usr/bin/ld: dso_dlfcn.c:(.text+0x680): undefined reference to `dlclose'
[exec] /usr/bin/ld: dso_dlfcn.c:(.text+0x6c0): undefined reference to `dlerror'
[exec] /usr/bin/ld: /usr/local/lib/libcrypto.a(dso_dlfcn.o): in function `dlfcn_pathbyaddr':
[exec] dso_dlfcn.c:(.text+0x764): undefined reference to `dladdr'
[exec] /usr/bin/ld: dso_dlfcn.c:(.text+0x7cc): undefined reference to `dlerror'
[exec] /usr/bin/ld: /usr/local/lib/libcrypto.a(dso_dlfcn.o): in function `dlfcn_unload':
[exec] dso_dlfcn.c:(.text+0x820): undefined reference to `dlclose'
[exec] collect2: error: ld returned 1 exit status
[exec] make[2]: *** [CMakeFiles/wordcount-simple.dir/build.make:88: examples/wordcount-simple] Error 1
[exec] make[1]: *** [CMakeFiles/Makefile2:74: CMakeFiles/wordcount-simple.dir/all] Error 2
[exec] make: *** [Makefile:84: all] Error 2
[exec] make[2]: Leaving directory '/root/lixuezhen/hadoop/hadoop-tools/hadoop-pipes/target/native'
[exec] make[1]: Leaving directory '/root/lixuezhen/hadoop/hadoop-tools/hadoop-pipes/target/native'

Compiling Hadoop fails with an undefined reference to HMAC_CTX_cleanup

Error:

    [exec] /usr/bin/c++   -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -ldl -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64  -rdynamic CMakeFiles/wordcount-simple.dir/main/native/examples/impl/wordcount-simple.cc.o  -o examples/wordcount-simple libhadooppipes.a libhadooputils.a -lssl -lcrypto -lpthread
 [exec] /usr/bin/ld: libhadooppipes.a(HadoopPipes.cc.o): in function `HadoopPipes::BinaryProtocol::createDigest(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >&)':
 [exec] /root/lixuezhen/hadoop/hadoop-tools/hadoop-pipes/src/main/native/pipes/impl/HadoopPipes.cc:430: undefined reference to `HMAC_CTX_cleanup'
 [exec] /usr/bin/ld: /root/lixuezhen/hadoop/hadoop-tools/hadoop-pipes/src/main/native/pipes/impl/HadoopPipes.cc:430: undefined reference to `HMAC_CTX_cleanup'
 [exec] collect2: error: ld returned 1 exit status
 [exec] make[2]: *** [CMakeFiles/wordcount-simple.dir/build.make:88: examples/wordcount-simple] Error 1
 [exec] make[2]: Leaving directory '/root/lixuezhen/hadoop/hadoop-tools/hadoop-pipes/target/native'
 [exec] make[1]: Leaving directory '/root/lixuezhen/hadoop/hadoop-tools/hadoop-pipes/target/native'
 [exec] make[1]: *** [CMakeFiles/Makefile2:74: CMakeFiles/wordcount-simple.dir/all] Error 2
 [exec] make: *** [Makefile:84: all] Error 2

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 0.672 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [ 0.665 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 0.595 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 1.939 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.127 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 1.232 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 1.980 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 3.082 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 3.519 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 2.087 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:04 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 3.113 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 9.678 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.030 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [01:31 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 14.307 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 3.415 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 2.763 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.033 s]
[INFO] hadoop-yarn ........................................ SUCCESS [ 0.036 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [ 24.300 s]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 17.716 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [ 0.030 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 6.370 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 12.284 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 1.921 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 3.946 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 10.974 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 2.897 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 3.457 s]
[INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [ 2.098 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.027 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 1.438 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 1.140 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [ 0.027 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [ 2.802 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 3.258 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.106 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 11.688 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 9.628 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 2.258 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 4.976 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 3.137 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 3.808 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 1.181 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 3.061 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 2.335 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 2.229 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 5.484 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 1.331 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 2.905 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 2.249 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 1.325 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [ 1.102 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 1.545 s]
[INFO] Apache Hadoop Pipes ................................ FAILURE [ 7.860 s]
[INFO] Apache Hadoop OpenStack support .................... SKIPPED
[INFO] Apache Hadoop Amazon Web Services support .......... SKIPPED
[INFO] Apache Hadoop Azure support ........................ SKIPPED
[INFO] Apache Hadoop Client ............................... SKIPPED
[INFO] Apache Hadoop Mini-Cluster ......................... SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............. SKIPPED
[INFO] Apache Hadoop Tools Dist ........................... SKIPPED
[INFO] Apache Hadoop Tools ................................ SKIPPED
[INFO] Apache Hadoop Distribution ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 06:08 min
[INFO] Finished at: 2020-10-23T14:58:33+08:00
[INFO] Final Memory: 237M/1575M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 2
[ERROR] around Ant part ...... @ 8:115 in /root/lixuezhen/hadoop/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 2
around Ant part ...... @ 8:115 in /root/lixuezhen/hadoop/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:216)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:120)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:355)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:155)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:216)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:160)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: exec returned: 2
around Ant part ...... @ 8:115 in /root/lixuezhen/hadoop/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:355)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:132)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 19 more
Caused by: /root/lixuezhen/hadoop/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml:8: exec returned: 2
at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:646)
at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:672)
at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:498)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
at sun.reflect.GeneratedMethodAccessor28.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:390)
at org.apache.tools.ant.Target.performTasks(Target.java:411)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1399)
at org.apache.tools.ant.Project.executeTarget(Project.java:1368)
at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:327)
... 21 more
[ERROR]
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hadoop-pipes
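`HMAC_CTX_cleanup` was removed in OpenSSL 1.1.0, where `HMAC_CTX` became opaque and the lifecycle functions changed to `HMAC_CTX_new`/`HMAC_CTX_free`/`HMAC_CTX_reset`; `HadoopPipes.cc` in 2.7.7 predates that change. A hedged way to confirm which API generation the build is picking up (the header path is the common default and may differ):

```shell
# If the installed headers still declare HMAC_CTX_cleanup, the OpenSSL
# 1.0.x API is available and HadoopPipes.cc links unchanged; otherwise
# the code needs patching for the 1.1.x names, or the build must point
# at an OpenSSL 1.0.2 installation.
hdr=/usr/include/openssl/hmac.h
if [ -f "$hdr" ] && grep -q HMAC_CTX_cleanup "$hdr"; then
    msg="OpenSSL 1.0.x API detected: pipes should link as-is"
else
    msg="no HMAC_CTX_cleanup in headers: patch HadoopPipes.cc or build against OpenSSL 1.0.2"
fi
echo "$msg"
```

Note that this is a different failure from the dlopen issue above: the link line in this log already carries `-ldl` and shared `-lssl -lcrypto`, so the remaining problem is purely the OpenSSL API version.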

leveldbjni has no arm64 support

java.lang.UnsatisfiedLinkError: Could not load library. Reasons: [no leveldbjni64-1.8 in java.library.path, no leveldbjni-1.8 in java.library.path, no leveldbjni in java.library.path, /tmp/libleveldbjni-64-1-8368487336325240191.8: /tmp/libleveldbjni-64-1-8368487336325240191.8: cannot open shared object file: No such file or directory (Possible cause: can't load AMD 64-bit .so on a AARCH64-bit platform)]
at org.fusesource.hawtjni.runtime.Library.doLoad(Library.java:182)
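The stock `leveldbjni-all-1.8` jar bundles only x86 native libraries, so on an AArch64 host the extracted `.so` is an AMD64 binary that the dynamic loader rejects, exactly as the message says. A hedged check plus the usual fix direction (the Maven repository path is the standard default and is illustrative):

```shell
# 1) Confirm the host architecture the JVM will see as os.arch.
arch=$(uname -m)
echo "host arch: $arch"

# 2) Inspect which natives the jar actually ships (path illustrative):
#    unzip -l ~/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar | grep '\.so'
#    -> only x86 linux32/linux64 entries, nothing for aarch64.

# 3) Fix direction: build leveldb and leveldbjni from source on the
#    aarch64 host and `mvn install` the resulting jar so YARN's
#    timeline/state stores load a native library that matches the CPU.
if [ "$arch" = "aarch64" ]; then
    echo "need a natively built leveldbjni for aarch64"
else
    echo "the bundled x86 leveldbjni may load on this host"
fi
```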

protobuf 2.5.0 does not build for arm64

Problem:
The host is 64-bit ARM, which protobuf 2.5 does not support.

Error message:
/bin/sh ../libtool --tag=CXX --mode=compile g++ -DHAVE_CONFIG_H -I. -I.. -pthread -Wall -Wwrite-strings -Woverloaded-virtual -Wno-sign-compare -O2 -g -DNDEBUG -MT atomicops_internals_x86_gcc.lo -MD -MP -MF .deps/atomicops_internals_x86_gcc.Tpo -c -o atomicops_internals_x86_gcc.lo `test -f 'google/protobuf/stubs/atomicops_internals_x86_gcc.cc' || echo './'`google/protobuf/stubs/atomicops_internals_x86_gcc.cc
libtool: compile: g++ -DHAVE_CONFIG_H -I. -I.. -pthread -Wall -Wwrite-strings -Woverloaded-virtual -Wno-sign-compare -O2 -g -DNDEBUG -MT atomicops_internals_x86_gcc.lo -MD -MP -MF .deps/atomicops_internals_x86_gcc.Tpo -c google/protobuf/stubs/atomicops_internals_x86_gcc.cc -fPIC -DPIC -o .libs/atomicops_internals_x86_gcc.o
In file included from ./google/protobuf/stubs/atomicops.h:59:0,
from google/protobuf/stubs/atomicops_internals_x86_gcc.cc:36:
./google/protobuf/stubs/platform_macros.h:61:2: error: #error Host architecture was not detected as supported by protobuf
#error Host architecture was not detected as supported by protobuf
^~~~~
In file included from google/protobuf/stubs/atomicops_internals_x86_gcc.cc:36:0:
./google/protobuf/stubs/atomicops.h:161:1: error: stray '#' in program
#error "Atomic operations are not supported on your platform"
^
./google/protobuf/stubs/atomicops.h:188:1: note: in expansion of macro 'GOOGLE_PROTOBUF_ATOMICOPS_ERROR'
GOOGLE_PROTOBUF_ATOMICOPS_ERROR
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
./google/protobuf/stubs/atomicops.h:161:2: error: 'error' does not name a type; did you mean 'perror'?
#error "Atomic operations are not supported on your platform"
^
./google/protobuf/stubs/atomicops.h:188:1: note: in expansion of macro 'GOOGLE_PROTOBUF_ATOMICOPS_ERROR'
GOOGLE_PROTOBUF_ATOMICOPS_ERROR
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
make[2]: *** [Makefile:1170: atomicops_internals_x86_gcc.lo] Error 1
make[2]: Leaving directory '/opt/protobuf-2.5.0/src'
make[1]: *** [Makefile:568: all-recursive] Error 1
make[1]: Leaving directory '/opt/protobuf-2.5.0'
make: *** [Makefile:477: all] Error 2

Version:
[root@hackathon-hadoop-0003 protobuf-2.5.0]# uname -a
Linux hackathon-hadoop-0003 4.19.90-2003.4.0.0036.oe1.aarch64 #1 SMP Mon Mar 23 19:06:43 UTC 2020 aarch64 aarch64 aarch64 GNU/Linux
[root@hackathon-hadoop-0003 protobuf-2.5.0]#
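`platform_macros.h` in protobuf 2.5.0 recognizes architectures via compiler-predefined macros and has no branch for `__aarch64__`, so it hits the `#error` above, which then cascades into the stray-`#` diagnostics in `atomicops.h`. The probe below shows what the compiler actually defines; the usual fix (hedged, carried as an equivalent patch by several distributions) is to add an `__aarch64__` case that routes to the generic GCC atomics implementation introduced upstream after 2.5.0:

```shell
# Dump the predefined macros that platform_macros.h keys off. On the
# openEuler VM this prints __aarch64__, which protobuf 2.5.0 does not
# recognize, hence "Host architecture was not detected".
macros=$(cc -dM -E - </dev/null | grep -E '__aarch64__|__x86_64__|__arm__' \
    || echo "architecture macro not in the probed set")
echo "$macros"
# Fix sketch (the patch file name is illustrative, not a real artifact):
#   cd /opt/protobuf-2.5.0
#   patch -p1 < protobuf-2.5.0-aarch64.patch  # add an __aarch64__ branch
#   ./configure && make && make install
```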
