Running the Java word count code from IDEA against a remote Spark cluster: the log shows no errors, but the job just keeps being submitted over and over (executors are repeatedly granted and then exit).
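For reference, a minimal sketch of the kind of JavaWordCount driver being submitted to a remote standalone master (the master URL, jar path, and input path below are placeholders, not the actual values from this setup):

import java.util.Arrays;
import scala.Tuple2;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class JavaWordCount {
    public static void main(String[] args) {
        // NOTE: master URL, jar path and input path are placeholders.
        SparkConf conf = new SparkConf()
                .setAppName("JavaWordCount")
                .setMaster("spark://<master-host>:7077")
                // When the driver runs inside IDEA (not via spark-submit),
                // the application jar has to be shipped to the executors.
                .setJars(new String[]{"target/wordcount-1.0.jar"});

        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<String> lines = sc.textFile("hdfs://<namenode>:9000/input/words.txt");
        JavaPairRDD<String, Integer> counts = lines
                .flatMap(line -> Arrays.asList(line.split(" ")).iterator())
                .mapToPair(word -> new Tuple2<>(word, 1))
                .reduceByKey(Integer::sum);

        counts.collect().forEach(t -> System.out.println(t._1() + ": " + t._2()));
        sc.stop();
    }
}

Driver log from the IDEA console: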
10:09:41.164 [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerMaster - Removal of executor 84 requested
10:09:41.164 [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend$DriverEndpoint - Asked to remove non-existent executor 84
10:09:41.164 [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.cluster.StandaloneSchedulerBackend - Granted executor ID app-20200305100612-0039/85 on hostPort 172.17.0.14:39162 with 2 core(s), 1024.0 MB RAM
10:09:41.165 [dispatcher-event-loop-7] INFO org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint - Executor updated: app-20200305100612-0039/85 is now RUNNING
10:09:41.901 [dispatcher-event-loop-3] DEBUG org.apache.spark.scheduler.TaskSchedulerImpl - parentName: , name: TaskSet_0.0, runningTasks: 0
10:09:42.315 [Timer-0] WARN org.apache.spark.scheduler.TaskSchedulerImpl - Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
10:09:42.902 [dispatcher-event-loop-5] DEBUG org.apache.spark.scheduler.TaskSchedulerImpl - parentName: , name: TaskSet_0.0, runningTasks: 0
10:09:43.573 [dispatcher-event-loop-6] INFO org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint - Executor updated: app-20200305100612-0039/85 is now EXITED (Command exited with code 1)
10:09:43.573 [dispatcher-event-loop-6] INFO org.apache.spark.scheduler.cluster.StandaloneSchedulerBackend - Executor app-20200305100612-0039/85 removed: Command exited with code 1
10:09:43.573 [dispatcher-event-loop-6] DEBUG org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend$DriverEndpoint - Asked to remove executor 85 with reason Command exited with code 1
10:09:43.573 [dispatcher-event-loop-6] INFO org.apache.spark.storage.BlockManagerMaster - Removal of executor 85 requested
10:09:43.573 [dispatcher-event-loop-6] INFO org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend$DriverEndpoint - Asked to remove non-existent executor 85
10:09:43.573 [dispatcher-event-loop-6] INFO org.apache.spark.storage.BlockManagerMasterEndpoint - Trying to remove executor 85 from BlockManagerMaster.
10:09:43.575 [dispatcher-event-loop-1] INFO org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint - Executor added: app-20200305100612-0039/86 on worker-20200304161428-172.17.0.14-39162 (172.17.0.14:39162) with 2 core(s)
10:09:43.575 [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.cluster.StandaloneSchedulerBackend - Granted executor ID app-20200305100612-0039/86 on hostPort 172.17.0.14:39162 with 2 core(s), 1024.0 MB RAM
10:09:43.583 [dispatcher-event-loop-2] INFO org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint - Executor updated: app-20200305100612-0039/86 is now RUNNING
10:09:43.902 [dispatcher-event-loop-7] DEBUG org.apache.spark.scheduler.TaskSchedulerImpl - parentName: , name: TaskSet_0.0, runningTasks: 0
10:09:44.901 [dispatcher-event-loop-3] DEBUG org.apache.spark.scheduler.TaskSchedulerImpl - parentName: , name: TaskSet_0.0, runningTasks: 0
Spark worker log:
20/03/05 10:15:21 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
20/03/05 10:15:21 INFO ExecutorRunner: Launch command: "/opt/jdk1.8.0_221/bin/java" "-cp" "/opt/spark-2.4.5-bin-hadoop2.7/conf/:/opt/spark-2.4.5-bin-hadoop2.7/jars/" "-Xmx1024M" "-Dspark.driver.port=60035" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "spark://CoarseGrainedScheduler@PC-20190118XCKM:60035" "--executor-id" "15" "--hostname" "172.17.0.14" "--cores" "2" "--app-id" "app-20200305101444-0045" "--worker-url" "spark://Worker@172.17.0.14:39162"
20/03/05 10:15:24 INFO Worker: Executor app-20200305101444-0045/15 finished with state EXITED message Command exited with code 1 exitStatus 1
20/03/05 10:15:24 INFO ExternalShuffleBlockResolver: Clean up non-shuffle files associated with the finished executor 15
20/03/05 10:15:24 INFO ExternalShuffleBlockResolver: Executor is not registered (appId=app-20200305101444-0045, execId=15)
20/03/05 10:15:24 INFO Worker: Asked to launch executor app-20200305101444-0045/16 for JavaWordCount
20/03/05 10:15:24 INFO SecurityManager: Changing view acls to: root
20/03/05 10:15:24 INFO SecurityManager: Changing modify acls to: root
20/03/05 10:15:24 INFO SecurityManager: Changing view acls groups to:
20/03/05 10:15:24 INFO SecurityManager: Changing modify acls groups to:
20/03/05 10:15:24 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
20/03/05 10:15:24 INFO ExecutorRunner: Launch command: "/opt/jdk1.8.0_221/bin/java" "-cp" "/opt/spark-2.4.5-bin-hadoop2.7/conf/:/opt/spark-2.4.5-bin-hadoop2.7/jars/" "-Xmx1024M" "-Dspark.driver.port=60035" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "spark://CoarseGrainedScheduler@PC-20190118XCKM:60035" "--executor-id" "16" "--hostname" "172.17.0.14" "--cores" "2" "--app-id" "app-20200305101444-0045" "--worker-url" "spark://Worker@172.17.0.14:39162"
20/03/05 10:15:26 INFO Worker: Executor app-20200305101444-0045/16 finished with state EXITED message Command exited with code 1 exitStatus 1
20/03/05 10:15:26 INFO ExternalShuffleBlockResolver: Clean up non-shuffle files associated with the finished executor 16
20/03/05 10:15:26 INFO ExternalShuffleBlockResolver: Executor is not registered (appId=app-20200305101444-0045, execId=16)
20/03/05 10:15:26 INFO Worker: Asked to launch executor app-20200305101444-0045/17 for JavaWordCount
20/03/05 10:15:26 INFO SecurityManager: Changing view acls to: root
20/03/05 10:15:26 INFO SecurityManager: Changing modify acls to: root
20/03/05 10:15:26 INFO SecurityManager: Changing view acls groups to:
20/03/05 10:15:26 INFO SecurityManager: Changing modify acls groups to:
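The launch command in the worker log points each executor back at the driver via spark://CoarseGrainedScheduler@PC-20190118XCKM:60035, i.e. the hostname and spark.driver.port of the machine running IDEA. For reference only, a hypothetical helper showing the driver-side properties that shape that callback URL (the class name and the values here are placeholders, not a confirmed fix):

import org.apache.spark.SparkConf;

// Hypothetical helper: builds the SparkConf network settings that determine
// the --driver-url printed in the worker's executor launch command above.
public class DriverNetworkConf {
    public static SparkConf build() {
        return new SparkConf()
                .setAppName("JavaWordCount")
                .setMaster("spark://<master-host>:7077")
                // spark.driver.host: the address executors use to connect back to the
                // driver (the IDEA machine); it must be resolvable and reachable from the workers.
                .set("spark.driver.host", "<driver-ip-reachable-from-workers>")
                // spark.driver.port: pin the callback port (60035 in the log above) so it
                // can be opened in a firewall if one sits between the cluster and the driver.
                .set("spark.driver.port", "60035");
    }
}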
I've run into the same problem and haven't solved it yet.