I set up two CentOS VMs. After getting SSH between them working, I configured Hadoop, but starting it prints the following:
[hadoop@master ~]$ start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [master]
master: starting namenode, logging to /usr/hadoop/logs/hadoop-hadoop-namenode-master.out
192.168.1.186: starting datanode, logging to /usr/hadoop/logs/hadoop-hadoop-datanode-slave.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: secondarynamenode running as process 2771. Stop it first.
starting yarn daemons
resourcemanager running as process 2914. Stop it first.
192.168.1.186: starting nodemanager, logging to /usr/hadoop/logs/yarn-hadoop-nodemanager-slave.out
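The two "running as process ... Stop it first." lines mean a secondarynamenode and a resourcemanager from an earlier start attempt are still running. A minimal sketch of a clean restart, assuming the stock Hadoop sbin scripts are on the PATH:

stop-yarn.sh
stop-dfs.sh
# (stop-all.sh also works, but is deprecated like start-all.sh)

jps
# should now list only Jps; anything Hadoop-related still shown is a leftover

start-dfs.sh
start-yarn.sh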
Checking the cluster:
[hadoop@master ~]$ hdfs dfsadmin -report
report: Call From master/192.168.1.187 to master:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused

[hadoop@slave hadoop]$ hdfs dfsadmin -report
report: No Route to Host from slave/192.168.1.186 to master:9000 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see: http://wiki.apache.org/hadoop/NoRouteToHost
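These are two different failures. On master itself the call is refused, meaning nothing is listening on port 9000, i.e. the NameNode never actually came up; from slave there is no route to host, which on CentOS very often means the firewall. A rough way to check both, assuming the port 9000 from the error and the log directory shown above:

# on master: did the NameNode start, and is it listening on 9000?
jps
netstat -tlnp | grep 9000      # or: ss -tlnp | grep 9000

# if NameNode is missing from jps, its log has the real error
tail -n 50 /usr/hadoop/logs/hadoop-hadoop-namenode-master.log

# from slave: is master reachable, and is 9000 open through the firewall?
ping -c 3 master
telnet master 9000             # if telnet is installed

# NoRouteToHostException usually means the firewall is dropping packets:
sudo service iptables stop     # CentOS 6
sudo systemctl stop firewalld  # CentOS 7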
Check whether your hosts file is configured correctly, and whether the slaves file matches the hostnames.
I'm a beginner and am testing in VMs; passwordless SSH login works fine.
The hosts file is identical on both machines, as follows:
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.1.187 master
192.168.1.186 slave
Both hostnames are set to the two names above as well, so the two machines should be fine, right?
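One quick way to confirm that the hostname and the hosts file actually agree on each machine; a sketch, noting the commands differ between CentOS 6 and 7:

hostname                       # should print master (or slave)
ping -c 1 master               # should resolve to 192.168.1.187, not 127.0.0.1
getent hosts master slave      # shows what /etc/hosts resolves each name to

# to set the hostname persistently:
sudo hostnamectl set-hostname master    # CentOS 7
# CentOS 6: set HOSTNAME=master in /etc/sysconfig/network, then reboot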
@水狐:
Try it like this:
127.0.0.1 localhost
192.168.1.187 master
192.168.1.186 slave
The error shows that the hostname is not being resolved.
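Also worth checking: the master:9000 in both errors is whatever fs.defaultFS (fs.default.name in older releases) is set to in core-site.xml, so make sure it uses the hostname master rather than localhost or an IP. A sketch, assuming Hadoop is installed under /usr/hadoop as the log paths suggest:

cat /usr/hadoop/etc/hadoop/core-site.xml
# expect an entry like:
#   <property>
#     <name>fs.defaultFS</name>
#     <value>hdfs://master:9000</value>
#   </property>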