Hadoop runtime error notes: DataXceiver error

Source: 2019-03-01 10:34:04

1. Open-file limit

This error shows up when the DataNode runs out of available file descriptors (or transfer threads) while serving a WRITE_BLOCK operation, so the incoming stream is cut off before a full packet arrives:
 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: hadoop03:50010:DataXceiver error processing WRITE_BLOCK operation  src: /10.162.197.47:55322 dst: /10.162.197.47:50010
java.io.IOException: Premature EOF from inputStream
        at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:194)
        at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:213)
        at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
        at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
        at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:467)
        at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:781)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:730)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:137)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:74)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:235)
        at java.lang.Thread.run(Thread.java:745)

Solution:
[root@hadoop02 hadoop]# vim /etc/security/limits.conf       -- raise the open-file limit
* soft nofile 1000000
* hard nofile 1000000
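The limits.conf change only applies to new login sessions, not to processes that are already running. A quick verification sketch (the pgrep pattern is an assumption about how the DataNode process appears; adjust it to your startup method):

```shell
# Check the limits in a fresh session (limits.conf does not affect
# already-running processes -- log out and back in first):
ulimit -Sn    # soft open-file limit, should now report 1000000
ulimit -Hn    # hard open-file limit

# Optionally count the file descriptors the DataNode currently holds;
# the pgrep pattern below is an assumption about the JVM's command line:
DN_PID=$(pgrep -f 'hdfs.server.datanode.DataNode' | head -n1)
if [ -n "$DN_PID" ]; then
    ls "/proc/$DN_PID/fd" | wc -l
fi
```

If the descriptor count is close to the old limit, restart the DataNode from a session that has the new limit so the daemon inherits it.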
[root@hadoop02 hadoop]# vim /usr/local/hadoop-2.6.0/etc/hadoop/hdfs-site.xml       -- raise the DataNode's transfer-thread limit
<property>
    <name>dfs.datanode.max.transfer.threads</name>
    <value>8192</value>
</property>
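The DataNode only reads hdfs-site.xml at startup, so the new thread limit needs a restart to take effect. A minimal sanity-check and restart sketch, assuming the Hadoop 2.6.0 install path used above:

```shell
HADOOP_HOME=/usr/local/hadoop-2.6.0    # path from this article

# Confirm the property actually landed in the config file:
grep -A1 'dfs.datanode.max.transfer.threads' \
    "$HADOOP_HOME/etc/hadoop/hdfs-site.xml"

# Restart the DataNode so the running process picks up the new value
# (hadoop-daemon.sh ships in Hadoop 2.x's sbin directory):
"$HADOOP_HOME/sbin/hadoop-daemon.sh" stop datanode
"$HADOOP_HOME/sbin/hadoop-daemon.sh" start datanode
```

Repeat on each affected DataNode host; rolling through them one at a time avoids taking multiple replicas of a block offline at once.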
