ambari-hdp-docker's Issues

Cannot submit blueprint via curl

Hi, after building the images and starting the compose stack, I run:

cd ./blueprints
curl -H "X-Requested-By: ambari" -X POST -u admin:admin http://localhost:8080/api/v1/blueprints/ambari -d @blueprint.json
==> OK

curl -H "X-Requested-By: ambari" -X POST -u admin:admin http://localhost:8080/api/v1/clusters/ambari -d @hosts.json
==> {
"status" : 400,
"message" : "Invalid Cluster Creation Template: org.apache.ambari.server.topology.InvalidTopologyTemplateException: The specified blueprint doesn't exist: org.apache.ambari.server.topology.NoSuchBlueprintException: No blueprint exists with the name 'jarvis'"
}

Please help.
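A likely cause, judging from the 400 response: the first POST registers the blueprint under the URL name 'ambari', but hosts.json evidently references a blueprint named 'jarvis', which Ambari then cannot find. The two names must match. A minimal sketch of a consistent sequence (the name 'jarvis' is taken from the error message; check what the "blueprint" field in hosts.json actually contains):

# Register the blueprint under the name that hosts.json expects:
curl -H "X-Requested-By: ambari" -X POST -u admin:admin http://localhost:8080/api/v1/blueprints/jarvis -d @blueprint.json

# ...or keep the name 'ambari', edit the "blueprint" field in hosts.json
# to read "ambari", and then create the cluster as before:
curl -H "X-Requested-By: ambari" -X POST -u admin:admin http://localhost:8080/api/v1/clusters/ambari -d @hosts.json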

Hive doesn't start

Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 203, in
HiveMetastore().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 56, in start
create_metastore_schema()
File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 413, in create_metastore_schema
user = params.hive_user
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in init
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/conf.server ; /usr/hdp/current/hive-server2-hive2/bin/schematool -initSchema -dbType mysql -userName hive -passWord [PROTECTED] -verbose' returned 1. SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.3.0-235/hive2/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.3.0-235/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL: jdbc:mysql://namenode/hive?createDatabaseIfNotExist=true
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: hive
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
Underlying cause: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException : Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
SQL Error code: 0
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
at org.apache.hive.beeline.HiveSchemaHelper.getConnectionToMetastore(HiveSchemaHelper.java:80)
at org.apache.hive.beeline.HiveSchemaTool.getConnectionToMetastore(HiveSchemaTool.java:133)
at org.apache.hive.beeline.HiveSchemaTool.testConnectionToMetastore(HiveSchemaTool.java:187)
at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:291)
at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:277)
at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:526)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:404)
at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:983)
at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:339)
at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2252)
at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2285)
at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2084)
at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:795)
at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:404)
at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:400)
at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:327)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.apache.hive.beeline.HiveSchemaHelper.getConnectionToMetastore(HiveSchemaHelper.java:76)
... 11 more
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:214)
at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:298)
... 26 more
*** schemaTool failed ***

stdout: /var/lib/ambari-agent/data/output-84.txt

2020-10-17 07:14:17,877 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.3.0-235 -> 2.6.3.0-235
2020-10-17 07:14:17,900 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2020-10-17 07:14:18,390 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.3.0-235 -> 2.6.3.0-235
2020-10-17 07:14:18,452 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2020-10-17 07:14:18,455 - Group['livy'] {}
2020-10-17 07:14:18,456 - Group['spark'] {}
2020-10-17 07:14:18,457 - Group['hdfs'] {}
2020-10-17 07:14:18,457 - Group['zeppelin'] {}
2020-10-17 07:14:18,458 - Group['hadoop'] {}
2020-10-17 07:14:18,458 - Group['users'] {}
2020-10-17 07:14:18,459 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-10-17 07:14:18,480 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-10-17 07:14:18,486 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-10-17 07:14:18,488 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2020-10-17 07:14:18,490 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-10-17 07:14:18,491 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2020-10-17 07:14:18,504 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}
2020-10-17 07:14:18,506 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-10-17 07:14:18,513 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-10-17 07:14:18,520 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2020-10-17 07:14:18,527 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-10-17 07:14:18,529 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2020-10-17 07:14:18,543 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-10-17 07:14:18,545 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-10-17 07:14:18,558 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-10-17 07:14:18,570 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2020-10-17 07:14:18,572 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-10-17 07:14:18,580 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2020-10-17 07:14:18,674 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2020-10-17 07:14:18,675 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2020-10-17 07:14:18,683 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-10-17 07:14:18,696 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2020-10-17 07:14:18,697 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2020-10-17 07:14:18,798 - call returned (0, '1014')
2020-10-17 07:14:18,806 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2020-10-17 07:14:18,897 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] due to not_if
2020-10-17 07:14:18,898 - Group['hdfs'] {}
2020-10-17 07:14:18,899 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2020-10-17 07:14:18,900 - FS Type:
2020-10-17 07:14:18,901 - Directory['/etc/hadoop'] {'mode': 0755}
2020-10-17 07:14:18,988 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2020-10-17 07:14:18,997 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2020-10-17 07:14:19,099 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2020-10-17 07:14:19,155 - Skipping Execute[('setenforce', '0')] due to not_if
2020-10-17 07:14:19,157 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2020-10-17 07:14:19,162 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2020-10-17 07:14:19,164 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2020-10-17 07:14:19,192 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2020-10-17 07:14:19,205 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2020-10-17 07:14:19,276 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2020-10-17 07:14:19,316 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2020-10-17 07:14:19,317 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2020-10-17 07:14:19,319 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2020-10-17 07:14:19,330 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2020-10-17 07:14:19,357 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2020-10-17 07:14:20,969 - MariaDB RedHat Support: false
2020-10-17 07:14:20,977 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf
2020-10-17 07:14:21,019 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
2020-10-17 07:14:21,090 - call returned (0, 'hive-server2 - 2.6.3.0-235')
2020-10-17 07:14:21,092 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.3.0-235 -> 2.6.3.0-235
2020-10-17 07:14:21,238 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://ambari:8080/resources/CredentialUtil.jar'), 'mode': 0755}
2020-10-17 07:14:21,240 - Not downloading the file from http://ambari:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
2020-10-17 07:14:21,241 - checked_call[('/usr/jdk64/jdk1.8.0_112/bin/java', '-cp', u'/var/lib/ambari-agent/cred/lib/*', 'org.apache.ambari.server.credentialapi.CredentialUtil', 'get', 'javax.jdo.option.ConnectionPassword', '-provider', u'jceks://file/var/lib/ambari-agent/cred/conf/hive_metastore/hive-site.jceks')] {}
2020-10-17 07:14:23,603 - checked_call returned (0, 'SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".\nSLF4J: Defaulting to no-operation (NOP) logger implementation\nSLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.\nOct 17, 2020 7:14:22 AM org.apache.hadoop.util.NativeCodeLoader <clinit>\nWARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable\nhadoop')
2020-10-17 07:14:23,672 - Directory['/etc/hive'] {'mode': 0755}
2020-10-17 07:14:23,673 - Directories to fill with configs: [u'/usr/hdp/current/hive-metastore/conf', u'/usr/hdp/current/hive-metastore/conf/conf.server']
2020-10-17 07:14:23,673 - Directory['/etc/hive/2.6.3.0-235/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}
2020-10-17 07:14:23,675 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/2.6.3.0-235/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2020-10-17 07:14:23,710 - Generating config: /etc/hive/2.6.3.0-235/0/mapred-site.xml
2020-10-17 07:14:23,710 - File['/etc/hive/2.6.3.0-235/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2020-10-17 07:14:23,799 - File['/etc/hive/2.6.3.0-235/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2020-10-17 07:14:23,799 - File['/etc/hive/2.6.3.0-235/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2020-10-17 07:14:23,803 - File['/etc/hive/2.6.3.0-235/0/hive-exec-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2020-10-17 07:14:23,808 - File['/etc/hive/2.6.3.0-235/0/hive-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2020-10-17 07:14:23,812 - File['/etc/hive/2.6.3.0-235/0/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2020-10-17 07:14:23,812 - Directory['/etc/hive/2.6.3.0-235/0/conf.server'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0700}
2020-10-17 07:14:23,813 - Changing permission for /etc/hive/2.6.3.0-235/0/conf.server from 755 to 700
2020-10-17 07:14:23,813 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/2.6.3.0-235/0/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
2020-10-17 07:14:23,838 - Generating config: /etc/hive/2.6.3.0-235/0/conf.server/mapred-site.xml
2020-10-17 07:14:23,838 - File['/etc/hive/2.6.3.0-235/0/conf.server/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2020-10-17 07:14:23,905 - Writing File['/etc/hive/2.6.3.0-235/0/conf.server/mapred-site.xml'] because it doesn't exist
2020-10-17 07:14:23,905 - Changing owner for /etc/hive/2.6.3.0-235/0/conf.server/mapred-site.xml from 0 to hive
2020-10-17 07:14:23,906 - Changing group for /etc/hive/2.6.3.0-235/0/conf.server/mapred-site.xml from 0 to hadoop
2020-10-17 07:14:23,906 - Changing permission for /etc/hive/2.6.3.0-235/0/conf.server/mapred-site.xml from 644 to 600
2020-10-17 07:14:23,906 - File['/etc/hive/2.6.3.0-235/0/conf.server/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2020-10-17 07:14:23,906 - Writing File['/etc/hive/2.6.3.0-235/0/conf.server/hive-default.xml.template'] because it doesn't exist
2020-10-17 07:14:23,906 - Changing owner for /etc/hive/2.6.3.0-235/0/conf.server/hive-default.xml.template from 0 to hive
2020-10-17 07:14:23,907 - Changing group for /etc/hive/2.6.3.0-235/0/conf.server/hive-default.xml.template from 0 to hadoop
2020-10-17 07:14:23,907 - Changing permission for /etc/hive/2.6.3.0-235/0/conf.server/hive-default.xml.template from 644 to 600
2020-10-17 07:14:23,907 - File['/etc/hive/2.6.3.0-235/0/conf.server/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2020-10-17 07:14:23,907 - Writing File['/etc/hive/2.6.3.0-235/0/conf.server/hive-env.sh.template'] because it doesn't exist
2020-10-17 07:14:23,907 - Changing owner for /etc/hive/2.6.3.0-235/0/conf.server/hive-env.sh.template from 0 to hive
2020-10-17 07:14:23,907 - Changing group for /etc/hive/2.6.3.0-235/0/conf.server/hive-env.sh.template from 0 to hadoop
2020-10-17 07:14:23,908 - Changing permission for /etc/hive/2.6.3.0-235/0/conf.server/hive-env.sh.template from 644 to 600
2020-10-17 07:14:23,910 - File['/etc/hive/2.6.3.0-235/0/conf.server/hive-exec-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2020-10-17 07:14:23,911 - Writing File['/etc/hive/2.6.3.0-235/0/conf.server/hive-exec-log4j.properties'] because it doesn't exist
2020-10-17 07:14:23,912 - Changing owner for /etc/hive/2.6.3.0-235/0/conf.server/hive-exec-log4j.properties from 0 to hive
2020-10-17 07:14:23,912 - Changing group for /etc/hive/2.6.3.0-235/0/conf.server/hive-exec-log4j.properties from 0 to hadoop
2020-10-17 07:14:23,912 - Changing permission for /etc/hive/2.6.3.0-235/0/conf.server/hive-exec-log4j.properties from 644 to 600
2020-10-17 07:14:23,916 - File['/etc/hive/2.6.3.0-235/0/conf.server/hive-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2020-10-17 07:14:23,916 - Writing File['/etc/hive/2.6.3.0-235/0/conf.server/hive-log4j.properties'] because it doesn't exist
2020-10-17 07:14:23,917 - Changing owner for /etc/hive/2.6.3.0-235/0/conf.server/hive-log4j.properties from 0 to hive
2020-10-17 07:14:23,917 - Changing group for /etc/hive/2.6.3.0-235/0/conf.server/hive-log4j.properties from 0 to hadoop
2020-10-17 07:14:23,917 - Changing permission for /etc/hive/2.6.3.0-235/0/conf.server/hive-log4j.properties from 644 to 600
2020-10-17 07:14:23,917 - File['/etc/hive/2.6.3.0-235/0/conf.server/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2020-10-17 07:14:23,917 - Writing File['/etc/hive/2.6.3.0-235/0/conf.server/parquet-logging.properties'] because it doesn't exist
2020-10-17 07:14:23,918 - Changing owner for /etc/hive/2.6.3.0-235/0/conf.server/parquet-logging.properties from 0 to hive
2020-10-17 07:14:23,918 - Changing group for /etc/hive/2.6.3.0-235/0/conf.server/parquet-logging.properties from 0 to hadoop
2020-10-17 07:14:23,918 - Changing permission for /etc/hive/2.6.3.0-235/0/conf.server/parquet-logging.properties from 644 to 600
2020-10-17 07:14:23,919 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/hive_metastore/hive-site.jceks'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0640}
2020-10-17 07:14:23,919 - Writing File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-site.jceks'] because it doesn't exist
2020-10-17 07:14:23,920 - Changing owner for /usr/hdp/current/hive-metastore/conf/conf.server/hive-site.jceks from 0 to hive
2020-10-17 07:14:23,920 - Changing group for /usr/hdp/current/hive-metastore/conf/conf.server/hive-site.jceks from 0 to hadoop
2020-10-17 07:14:23,920 - Changing permission for /usr/hdp/current/hive-metastore/conf/conf.server/hive-site.jceks from 644 to 640
2020-10-17 07:14:23,921 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-metastore/conf/conf.server', 'mode': 0600, 'configuration_attributes': {u'hidden': {u'javax.jdo.option.ConnectionPassword': u'HIVE_CLIENT,WEBHCAT_SERVER,HCAT,CONFIG_DOWNLOAD'}}, 'owner': 'hive', 'configurations': ...}
2020-10-17 07:14:23,931 - Generating config: /usr/hdp/current/hive-metastore/conf/conf.server/hive-site.xml
2020-10-17 07:14:23,931 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2020-10-17 07:14:24,123 - Writing File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-site.xml'] because it doesn't exist
2020-10-17 07:14:24,123 - Changing owner for /usr/hdp/current/hive-metastore/conf/conf.server/hive-site.xml from 0 to hive
2020-10-17 07:14:24,123 - Changing group for /usr/hdp/current/hive-metastore/conf/conf.server/hive-site.xml from 0 to hadoop
2020-10-17 07:14:24,123 - Changing permission for /usr/hdp/current/hive-metastore/conf/conf.server/hive-site.xml from 644 to 600
2020-10-17 07:14:24,124 - XmlConfig['hivemetastore-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-metastore/conf/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': {u'hive.service.metrics.hadoop2.component': u'hivemetastore', u'hive.metastore.metrics.enabled': u'true', u'hive.service.metrics.reporter': u'HADOOP2'}}
2020-10-17 07:14:24,141 - Generating config: /usr/hdp/current/hive-metastore/conf/conf.server/hivemetastore-site.xml
2020-10-17 07:14:24,141 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hivemetastore-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
2020-10-17 07:14:24,147 - Writing File['/usr/hdp/current/hive-metastore/conf/conf.server/hivemetastore-site.xml'] because it doesn't exist
2020-10-17 07:14:24,148 - Changing owner for /usr/hdp/current/hive-metastore/conf/conf.server/hivemetastore-site.xml from 0 to hive
2020-10-17 07:14:24,148 - Changing group for /usr/hdp/current/hive-metastore/conf/conf.server/hivemetastore-site.xml from 0 to hadoop
2020-10-17 07:14:24,148 - Changing permission for /usr/hdp/current/hive-metastore/conf/conf.server/hivemetastore-site.xml from 644 to 600
2020-10-17 07:14:24,154 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2020-10-17 07:14:24,155 - Writing File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-env.sh'] because it doesn't exist
2020-10-17 07:14:24,155 - Changing owner for /usr/hdp/current/hive-metastore/conf/conf.server/hive-env.sh from 0 to hive
2020-10-17 07:14:24,155 - Changing group for /usr/hdp/current/hive-metastore/conf/conf.server/hive-env.sh from 0 to hadoop
2020-10-17 07:14:24,155 - Changing permission for /usr/hdp/current/hive-metastore/conf/conf.server/hive-env.sh from 644 to 600
2020-10-17 07:14:24,156 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2020-10-17 07:14:24,160 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2020-10-17 07:14:24,162 - Execute[('cp', '--remove-destination', '/usr/share/java/mysql-connector-java.jar', u'/usr/hdp/current/hive-metastore/lib/mysql-connector-java.jar')] {'path': ['/bin', '/usr/bin/'], 'sudo': True}
2020-10-17 07:14:24,186 - File['/usr/hdp/current/hive-metastore/lib/mysql-connector-java.jar'] {'mode': 0644}
2020-10-17 07:14:24,187 - Execute[('cp', '--remove-destination', '/usr/share/java/mysql-connector-java.jar', u'/usr/hdp/current/hive-server2-hive2/lib/mysql-connector-java.jar')] {'path': ['/bin', '/usr/bin/'], 'sudo': True}
2020-10-17 07:14:24,208 - File['/usr/hdp/current/hive-server2-hive2/lib/mysql-connector-java.jar'] {'mode': 0644}
2020-10-17 07:14:24,209 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://ambari:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
2020-10-17 07:14:24,210 - Not downloading the file from http://ambari:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
2020-10-17 07:14:24,220 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hadoop-metrics2-hivemetastore.properties'] {'content': Template('hadoop-metrics2-hivemetastore.properties.j2'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
2020-10-17 07:14:24,221 - Writing File['/usr/hdp/current/hive-metastore/conf/conf.server/hadoop-metrics2-hivemetastore.properties'] because it doesn't exist
2020-10-17 07:14:24,222 - Changing owner for /usr/hdp/current/hive-metastore/conf/conf.server/hadoop-metrics2-hivemetastore.properties from 0 to hive
2020-10-17 07:14:24,222 - Changing group for /usr/hdp/current/hive-metastore/conf/conf.server/hadoop-metrics2-hivemetastore.properties from 0 to hadoop
2020-10-17 07:14:24,222 - Changing permission for /usr/hdp/current/hive-metastore/conf/conf.server/hadoop-metrics2-hivemetastore.properties from 644 to 600
2020-10-17 07:14:24,222 - File['/var/lib/ambari-agent/tmp/start_metastore_script'] {'content': StaticFile('startMetastore.sh'), 'mode': 0755}
2020-10-17 07:14:24,223 - Writing File['/var/lib/ambari-agent/tmp/start_metastore_script'] because it doesn't exist
2020-10-17 07:14:24,223 - Changing permission for /var/lib/ambari-agent/tmp/start_metastore_script from 644 to 755
2020-10-17 07:14:24,224 - Directory['/tmp/hive'] {'owner': 'hive', 'create_parents': True, 'mode': 0777}
2020-10-17 07:14:24,224 - Creating directory Directory['/tmp/hive'] since it doesn't exist.
2020-10-17 07:14:24,224 - Changing owner for /tmp/hive from 0 to hive
2020-10-17 07:14:24,224 - Changing permission for /tmp/hive from 755 to 777
2020-10-17 07:14:24,225 - HdfsResource[''] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://namenode:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 01777}
2020-10-17 07:14:24,233 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://namenode:50070/webhdfs/v1?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmptBSbCB 2>/tmp/tmpDwlSsI''] {'logoutput': None, 'quiet': False}
2020-10-17 07:14:24,267 - call returned (0, '')
2020-10-17 07:14:24,268 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://namenode:50070/webhdfs/v1?op=MKDIRS&user.name=hdfs'"'"' 1>/tmp/tmpzcSsZ7 2>/tmp/tmpUkxp_A''] {'logoutput': None, 'quiet': False}
2020-10-17 07:14:24,311 - call returned (0, '')
2020-10-17 07:14:24,314 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://namenode:50070/webhdfs/v1?op=SETPERMISSION&user.name=hdfs&permission=1777'"'"' 1>/tmp/tmpWl9vCp 2>/tmp/tmp6z_Mz9''] {'logoutput': None, 'quiet': False}
2020-10-17 07:14:24,388 - call returned (0, '')
2020-10-17 07:14:24,396 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://namenode:50070/webhdfs/v1?op=SETOWNER&owner=hive&group=hadoop&user.name=hdfs'"'"' 1>/tmp/tmpwhNtte 2>/tmp/tmpi75oIt''] {'logoutput': None, 'quiet': False}
2020-10-17 07:14:24,490 - call returned (0, '')
2020-10-17 07:14:24,493 - HdfsResource[''] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://namenode:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hive', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0700}
2020-10-17 07:14:24,495 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://namenode:50070/webhdfs/v1?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpK6U73u 2>/tmp/tmpg5a6Ea''] {'logoutput': None, 'quiet': False}
2020-10-17 07:14:24,541 - call returned (0, '')
2020-10-17 07:14:24,544 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT '"'"'http://namenode:50070/webhdfs/v1?op=SETPERMISSION&user.name=hdfs&permission=700'"'"' 1>/tmp/tmp6oekZW 2>/tmp/tmpTkFC7V''] {'logoutput': None, 'quiet': False}
2020-10-17 07:14:24,587 - call returned (0, '')
2020-10-17 07:14:24,588 - HdfsResource[None] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/2.6.3.0-235/hadoop/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://namenode:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/2.6.3.0-235/hadoop/conf', 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp']}
2020-10-17 07:14:24,589 - Directory['/var/run/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2020-10-17 07:14:24,592 - Changing group for /var/run/hive from 113 to hadoop
2020-10-17 07:14:24,593 - Directory['/var/log/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2020-10-17 07:14:24,593 - Changing group for /var/log/hive from 113 to hadoop
2020-10-17 07:14:24,594 - Directory['/var/lib/hive'] {'owner': 'hive', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2020-10-17 07:14:24,594 - Changing owner for /var/lib/hive from 0 to hive
2020-10-17 07:14:24,594 - Changing group for /var/lib/hive from 0 to hadoop
2020-10-17 07:14:24,596 - Execute['export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/conf.server ; /usr/hdp/current/hive-server2-hive2/bin/schematool -initSchema -dbType mysql -userName hive -passWord [PROTECTED] -verbose'] {'not_if': u"ambari-sudo.sh su hive -l -s /bin/bash -c 'export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/conf.server ; /usr/hdp/current/hive-server2-hive2/bin/schematool -info -dbType mysql -userName hive -passWord [PROTECTED] -verbose'", 'user': 'hive'}

Command failed after 1 tries
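
The root cause is at the very bottom of the stack trace: java.net.ConnectException: Connection refused when schematool tries to reach MySQL at jdbc:mysql://namenode/hive. The Metastore schema cannot be initialized because nothing is listening on the MySQL port of the namenode container at that moment. A quick check, sketched under the assumption that the compose service is named namenode as in the JDBC URL (the MySQL service name mysqld is a guess and may be mariadb depending on the base image):

# Is anything listening on MySQL's port inside the container?
docker-compose exec namenode netstat -tlnp | grep 3306

# If not, start the database, then retry the Hive Metastore start in Ambari:
docker-compose exec namenode service mysqld start

If MySQL is simply slow to come up, retrying the failed component from the Ambari UI is often enough.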

HDP 2.6.5

Hello, I'm trying to port the blueprint to HDP 2.6.5 and I'm having a lot of trouble with the procedure. I cannot find where the repository is configured in order to change the version.

Can you help me? I am using Ambari 2.6.2 and would like to install HDP 2.6.5, but I am not able to figure out where the repo URL is hidden.

Thank you.
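
In case it helps while this is open: I'm not sure where this repo pins the stack version, but Ambari 2.6 exposes the stack repository URLs through its REST API, so they can be inspected and overridden without rebuilding the images. A sketch, assuming a redhat7 OS family and the public Hortonworks mirror (substitute a local repo URL if the images use one):

# Show the repo URLs Ambari currently knows for the HDP 2.6 stack:
curl -u admin:admin -H "X-Requested-By: ambari" http://localhost:8080/api/v1/stacks/HDP/versions/2.6/operating_systems/redhat7/repositories

# Override the HDP base URL to point at 2.6.5:
curl -u admin:admin -H "X-Requested-By: ambari" -X PUT -d '{"Repositories": {"base_url": "http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.5.0", "verify_base_url": true}}' http://localhost:8080/api/v1/stacks/HDP/versions/2.6/operating_systems/redhat7/repositories/HDP-2.6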
