Smoke test fails after deploying hdp-hadoop and hdp-hive

Bug #1669721 reported by 258189379@qq.com
This bug affects 1 person
Affects                          Status   Importance  Assigned to  Milestone
Canonical Juju                   Invalid  Undecided   Unassigned
hadoop (Juju Charms Collection)  New      Undecided   Unassigned

Bug Description

I have deployed hdp-hadoop and hdp-hive and run the smoke test as described in the guide, but it fails with the error below. Does anybody know the reason? Thanks.

ubuntu@juju-bed050-default-12:~$ hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-*.jar teragen 10000 /user/ubuntu/teragenout
17/03/03 09:18:33 INFO client.RMProxy: Connecting to ResourceManager at /192.168.0.21:8050
17/03/03 09:18:33 WARN hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/ubuntu/.staging/job_1488527818721_0006/job.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.
 at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1434)
 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2702)
 at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:590)
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:440)
 at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
 at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:415)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
 at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)

 at org.apache.hadoop.ipc.Client.call(Client.java:1410)
 at org.apache.hadoop.ipc.Client.call(Client.java:1363)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
 at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:606)
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
 at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:361)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1439)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1261)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:525)
17/03/03 09:18:33 INFO mapreduce.JobSubmitter: Cleaning up the staging area /user/ubuntu/.staging/job_1488527818721_0006
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/ubuntu/.staging/job_1488527818721_0006/job.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.
 at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1434)
 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2702)
 at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:590)
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:440)
 at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
 at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:415)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
 at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)

 at org.apache.hadoop.ipc.Client.call(Client.java:1410)
 at org.apache.hadoop.ipc.Client.call(Client.java:1363)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
 at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:606)
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
 at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:361)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1439)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1261)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:525)
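
The key line in the trace is "There are 0 datanode(s) running": the NameNode has no registered DataNodes, so it cannot place even a single block replica and every write fails. A quick way to confirm this from the Juju client (a sketch only; the commands assume stock HDP packaging and use the unit names from the juju status below):

# Ask the NameNode how many live DataNodes it can see (there should be 3 in this deployment).
juju ssh yarn-hdfs-master/2
sudo -u hdfs hdfs dfsadmin -report

# On a compute node, check whether the DataNode process is running at all,
# and look at its log if it is not (log path assumed from HDP defaults).
juju ssh compute-node/6
ps -ef | grep -i '[d]atanode'
sudo sh -c 'tail -n 50 /var/log/hadoop/hdfs/*datanode*.log'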

My juju status result:

root@juju-client:~# juju status
Model    Controller    Cloud/Region  Version
default  mycontroller  otc/eu-de     2.1.0.1

App               Version  Status   Scale  Charm       Store       Rev  OS      Notes
compute-node               unknown  3      hdp-hadoop  jujucharms  9    ubuntu
hdphive                    unknown  1      hdp-hive    jujucharms  5    ubuntu
mysql             5.7.17   active   1      mysql       jujucharms  56   ubuntu
yarn-hdfs-master           unknown  1      hdp-hadoop  jujucharms  9    ubuntu

Unit Workload Agent Machine Public address Ports Message
compute-node/6* unknown idle 13 192.168.0.22 8010/tcp,8025/tcp,8030/tcp,8050/tcp,8088/tcp,8141/tcp,8480/tcp,10020/tcp,19888/tcp,50010/tcp,50075/tcp
compute-node/7 unknown idle 14 192.168.0.23 8010/tcp,8025/tcp,8030/tcp,8050/tcp,8088/tcp,8141/tcp,8480/tcp,10020/tcp,19888/tcp,50010/tcp,50075/tcp
compute-node/8 unknown idle 15 192.168.0.24 8010/tcp,8025/tcp,8030/tcp,8050/tcp,8088/tcp,8141/tcp,8480/tcp,10020/tcp,19888/tcp,50010/tcp,50075/tcp
hdphive/0* unknown idle 17 192.168.0.26 10000/tcp
mysql/2* active idle 16 192.168.0.25 3306/tcp Ready
yarn-hdfs-master/2* unknown idle 12 192.168.0.21 8010/tcp,8020/tcp,8025/tcp,8030/tcp,8050/tcp,8088/tcp,8141/tcp,8480/tcp,10020/tcp,19888/tcp,50070/tcp,50075/tcp,50470/tcp

Machine State DNS Inst id Series AZ
12 started 192.168.0.21 f85c544a-54cb-4db1-abb6-fea1b430321a trusty eu-de-01
13 started 192.168.0.22 55dccc3f-c19a-456e-8307-de91495d3836 trusty eu-de-02
14 started 192.168.0.23 aa85fa11-6b8e-4a2b-bb60-5ad37a535e60 trusty eu-de-01
15 started 192.168.0.24 c07d267a-6417-4149-918c-b2a702a95a82 trusty eu-de-01
16 started 192.168.0.25 bd430d6c-bb0c-49ac-8521-9d06c7ebf7e9 xenial eu-de-02
17 started 192.168.0.26 2bff0d3e-9f1d-421e-952d-6268ad9e8810 trusty eu-de-02

Relation Provides Consumes Type
compute-nodes compute-node compute-node peer
resourcemanager compute-node yarn-hdfs-master regular
db hdphive mysql regular
resourcemanager hdphive yarn-hdfs-master regular
cluster mysql mysql peer
compute-nodes yarn-hdfs-master yarn-hdfs-master peer
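
All three compute-node units list the DataNode ports (50010/tcp and 50075/tcp) as open, which only means the charm opened them, not that the daemons are still running. The same process check can be run across the whole application at once (a sketch using Juju 2.1's juju run syntax):

juju run --application compute-node 'ps -ef | grep -i [d]atanode'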

information type: Public → Public Security
Anastasia (anastasia-macmood) wrote:

This is not a Juju bug; I have added the hadoop charm project to it instead.

Changed in juju:
status: New → Invalid