This article explains how to fix the "ERROR: Attempting to operate on hdfs ... as root" errors reported when running start-dfs.sh in Hadoop, for your reference.
Error message:
Starting namenodes on [master]
ERROR: Attempting to operate on hdfs namenode as root
ERROR: but there is no HDFS_NAMENODE_USER defined. Aborting operation.
Starting datanodes
ERROR: Attempting to operate on hdfs datanode as root
ERROR: but there is no HDFS_DATANODE_USER defined. Aborting operation.
Starting secondary namenodes
ERROR: Attempting to operate on hdfs secondarynamenode as root
ERROR: but there is no HDFS_SECONDARYNAMENODE_USER defined. Aborting operation.
Starting journal nodes
ERROR: Attempting to operate on hdfs journalnode as root
ERROR: but there is no HDFS_JOURNALNODE_USER defined. Aborting operation.
Starting ZK Failover Controllers on NN hosts
ERROR: Attempting to operate on hdfs zkfc as root
ERROR: but there is no HDFS_ZKFC_USER defined. Aborting operation.
Cause:
The daemons are being started as the root account, but the corresponding user variables have not been defined in advance. In Hadoop 3.x the start/stop scripts refuse to operate on an HDFS or YARN daemon as root unless the matching *_USER environment variable (for example HDFS_NAMENODE_USER) is set.
Solution:
* These steps need to be performed on every node; alternatively, make the changes on one node first and then sync them to the other nodes with scp (see step 3).
1. Edit start-dfs.sh and stop-dfs.sh
cd /home/hadoop/sbin
vim start-dfs.sh
vim stop-dfs.sh
Add the following at the top of both files:
HDFS_ZKFC_USER=root
HDFS_JOURNALNODE_USER=root
HDFS_NAMENODE_USER=root
HDFS_SECONDARYNAMENODE_USER=root
HDFS_DATANODE_USER=root
HDFS_DATANODE_SECURE_USER=root
#HADOOP_SECURE_DN_USER=root
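If you prefer not to edit each file in vim, the same lines can be inserted non-interactively. This is only a sketch, assuming GNU sed and the /home/hadoop/sbin path used above; the .bak backup is an extra safety step, not part of the original procedure:
cd /home/hadoop/sbin
for f in start-dfs.sh stop-dfs.sh; do
  cp "$f" "$f.bak"                       # keep a backup before editing
  sed -i '2i\
HDFS_ZKFC_USER=root\
HDFS_JOURNALNODE_USER=root\
HDFS_NAMENODE_USER=root\
HDFS_SECONDARYNAMENODE_USER=root\
HDFS_DATANODE_USER=root\
HDFS_DATANODE_SECURE_USER=root' "$f"     # insert right after the shebang line
done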
2. Edit start-yarn.sh and stop-yarn.sh
cd /home/hadoop/sbin
vim start-yarn.sh
vim stop-yarn.sh
Add the following at the top of both files:
#HADOOP_SECURE_DN_USER=root
HDFS_DATANODE_SECURE_USER=root
YARN_NODEMANAGER_USER=root
YARN_RESOURCEMANAGER_USER=root
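Instead of editing the four start/stop scripts, some installations define these users once in hadoop-env.sh so that both the HDFS and YARN scripts pick them up. A minimal sketch, assuming HADOOP_HOME is /home/hadoop (adjust the path to your installation):
vim /home/hadoop/etc/hadoop/hadoop-env.sh
# append at the end of the file:
export HDFS_NAMENODE_USER=root
export HDFS_SECONDARYNAMENODE_USER=root
export HDFS_DATANODE_USER=root
export HDFS_JOURNALNODE_USER=root
export HDFS_ZKFC_USER=root
export YARN_RESOURCEMANAGER_USER=root
export YARN_NODEMANAGER_USER=root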
3. Sync to the other nodes
cd /home/hadoop/sbin
scp * c2:/home/hadoop/sbin
scp * c3:/home/hadoop/sbin
scp * c4:/home/hadoop/sbin
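A small loop avoids repeating the scp command per host (assuming passwordless SSH from this node to c2, c3 and c4). After syncing, rerun start-dfs.sh to confirm the errors are gone:
cd /home/hadoop/sbin
for host in c2 c3 c4; do
  scp start-dfs.sh stop-dfs.sh start-yarn.sh stop-yarn.sh "$host":/home/hadoop/sbin/
done
./start-dfs.sh    # should now start without the *_USER errors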