
Error when connecting to Spark from Scala code and calling MLlib

Bounty: 10 园豆 [unresolved question]

When connecting to Spark from Scala code and calling MLlib on the test data, the following error occurs:

14/10/24 11:30:25 INFO DAGScheduler: Failed to run reduce at MLUtils.scala:95
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Master removed our application: FAILED
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1049)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1033)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1031)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1031)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:635)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:635)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:635)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1234)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
    at akka.actor.ActorCell.invoke(ActorCell.scala:456)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
    at akka.dispatch.Mailbox.run(Mailbox.scala:219)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
14/10/24 11:30:25 INFO SparkUI: Stopped Spark web UI at 
14/10/24 11:30:25 INFO DAGScheduler: Stopping DAGScheduler
14/10/24 11:30:25 INFO SparkDeploySchedulerBackend: Shutting down all executors
14/10/24 11:30:25 INFO SparkDeploySchedulerBackend: Asking each executor to shut down

The Scala code is as follows:

import org.apache.spark.SparkContext
import org.apache.spark.mllib.classification.SVMWithSGD
import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics
import org.apache.spark.mllib.util.MLUtils

object Test {
  def main(args: Array[String]) {

    val sc = new SparkContext("spark://masterIp:7077", "WordCount",
      System.getenv("/home/spark-1.0.2"), Seq(System.getenv("lib/spark-assembly-1.0.2-hadoop2.2.0.jar")))

    // Load and parse the data file
    // Load training data in LIBSVM format.
    val data = MLUtils.loadLibSVMFile(sc, "hdfs://masterIp:9000/sample_libsvm_data.txt")

    // Split data into training (60%) and test (40%).
    val splits = data.randomSplit(Array(0.6, 0.4), seed = 11L)
    val training = splits(0).cache()
    val test = splits(1)

    // Run training algorithm to build the model
    val numIterations = 100
    val model = SVMWithSGD.train(training, numIterations)

    // Clear the default threshold.
    model.clearThreshold()

    // Compute raw scores on the test set.
    val scoreAndLabels = test.map { point =>
      val score = model.predict(point.features)
      (score, point.label)
    }

    // Get evaluation metrics.
    val metrics = new BinaryClassificationMetrics(scoreAndLabels)
    val auROC = metrics.areaUnderROC()

    println("Area under ROC = " + auROC)
  }
}
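One likely culprit (an assumption, since the log alone does not confirm it): `System.getenv` takes the *name* of an environment variable, so `System.getenv("/home/spark-1.0.2")` looks up a variable literally named `/home/spark-1.0.2` and returns `null`. With a null Spark home and a null application jar, the executors cannot start, the master removes the application, and the driver sees "Master removed our application: FAILED". A minimal sketch of the corrected `SparkContext` construction, assuming `/home/spark-1.0.2` really is the Spark install directory on the workers and that the assembly jar path below is correct for this cluster:

```scala
import org.apache.spark.SparkContext

object Test {
  def main(args: Array[String]) {
    val sc = new SparkContext(
      "spark://masterIp:7077",   // master URL, as in the question
      "WordCount",               // application name
      "/home/spark-1.0.2",       // Spark home passed as a literal path, not via System.getenv
      // Application/assembly jar as a literal, absolute path (assumed location):
      Seq("/home/spark-1.0.2/lib/spark-assembly-1.0.2-hadoop2.2.0.jar"))

    // ... the MLlib code from the question runs unchanged here ...

    sc.stop()
  }
}
```

If the failure persists after this change, check the worker logs for the executors' actual exit reason; a version mismatch between the Spark jars on the driver's classpath and the cluster (both should be 1.0.2 here) is another common cause of this same "Master removed our application: FAILED" error.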

Any help would be greatly appreciated!

不务正业的程序员 | Beginner (Level 1) | 园豆: 192
Asked: 2014-10-24 11:40
All answers (1)

After all this time, I assume you have solved it by now.

Chaoa | 园豆: 643 (Level 3) | 2014-10-30 13:07

Actually, this one was never solved. I worked around it with an alternative approach and set the problem aside. If you do know the solution, please share it. Thanks!

不务正业的程序员 | 园豆: 192 (Level 1) | 2014-10-30 14:40