jdk 1.8 (version info displays correctly via java -version)
hadoop 3.2 (processes and web UIs are all accessible)
spark 3.4
Environment variables, spark-env.sh (JAVA_HOME, SCALA_HOME, SPARK_MASTER_IP, SPARK_WORKER_MEMORY, and HADOOP_CONF_DIR are all configured), and the slaves file have all been edited.
Starting Spark fails with the following error:
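For reference, the spark-env.sh settings described above would look roughly like this (all paths and values below are placeholders, not the actual ones from this cluster):

```shell
# conf/spark-env.sh -- example values only; adjust to your installation
export JAVA_HOME=/usr/local/jdk1.8.0_211
export SCALA_HOME=/usr/local/scala
export SPARK_MASTER_IP=spark1
export SPARK_WORKER_MEMORY=2g
export HADOOP_CONF_DIR=/usr/local/hadoop-3.2/etc/hadoop
```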
[root@spark1 sbin]# ./start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark-3.4/logs/spark-root-org.apache.spark.deploy.master.Master-1-spark1.out
failed to launch:
nice -n 0 /usr/local/spark-3.4/bin/spark-class org.apache.spark.deploy.master.Master --host spark1 --port 7077 --webui-port 8080
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:669)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:651)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 7 more
full log in /usr/local/spark-3.4/logs/spark-root-org.apache.spark.deploy.master.Master-1-spark1.out
spark2: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark-3.4/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-spark2.out
spark3: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark-3.4/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-spark3.out
spark2: failed to launch: nice -n 0 /usr/local/spark-3.4/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://spark1:7077
spark2: at java.lang.Class.getMethod0(Class.java:3018)
spark2: at java.lang.Class.getMethod(Class.java:1784)
spark2: at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:669)
spark2: at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:651)
spark2: Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
spark2: at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
spark2: at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
spark2: at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
spark2: at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
spark2: ... 7 more
spark2: full log in /usr/local/spark-3.4/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-spark2.out
spark3: failed to launch: nice -n 0 /usr/local/spark-3.4/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://spark1:7077
spark3: at java.lang.Class.getMethod0(Class.java:3018)
spark3: at java.lang.Class.getMethod(Class.java:1784)
spark3: at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:669)
spark3: at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:651)
spark3: Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
spark3: at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
spark3: at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
spark3: at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
spark3: at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
spark3: ... 7 more
spark3: full log in /usr/local/spark-3.4/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-spark3.out
[root@spark1 sbin]#
Could anyone help me figure out the cause of this error? Thanks!
Spark startup error: localhost: failed to launch: nice -n 0 /home/chan/spark/spark-2.4.3-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://Hadoop:7077
https://www.cnblogs.com/litstar/p/12519696.html
Solution from that article:
In the Spark installation directory, add JAVA_HOME=<jdk path> to spark-config.sh under sbin.
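A sketch of that fix, assuming the Spark and JDK paths from this post (the JDK path is an example; substitute your actual one):

```shell
# Add to /usr/local/spark-3.4/sbin/spark-config.sh
# so the launcher scripts see JAVA_HOME even in non-interactive shells
# (e.g. when sbin/start-all.sh launches workers over ssh):
export JAVA_HOME=/usr/local/jdk1.8.0_211
```

This matters because start-all.sh starts remote workers over a non-login ssh session that may not source the profile where JAVA_HOME is normally exported.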