
IDEA + Spark Streaming + Kafka error

Bounty: 50 reputation points [Unsolved question]

Could someone more experienced please advise? Many thanks.

1. Source code

package com.trs.operations

import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingKafkaTest {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("stream test").setMaster("local[2]")
    // val ss = SparkSession.builder().config(conf).getOrCreate()
    val streamContext = new StreamingContext(conf, Seconds(3))
    val topics = Array("DCWeiboArticle").toSet
    val kafkaParams = Map[String, String](
      "bootstrap.servers" -> "192.168.200.7:9092",
      "group.id" -> "example",
      "auto.offset.reset" -> "smallest",
      "enable.auto.commit" -> "true")

    // Direct stream against Kafka (0.8 API); map(_._2) keeps only the message value
    val records = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      streamContext, kafkaParams, topics).map(_._2)
    records.count().foreachRDD(rdd => rdd.foreach(println(_)))
    streamContext.start()
    streamContext.awaitTermination()
  }
}
2. Dependencies
<properties>
    <scala.version>2.10.4</scala.version>
</properties>

<repositories>
    <repository>
        <id>scala-tools.org</id>
        <name>Scala-Tools Maven2 Repository</name>
        <url>http://scala-tools.org/repo-releases</url>
    </repository>
</repositories>

<pluginRepositories>
    <pluginRepository>
        <id>scala-tools.org</id>
        <name>Scala-Tools Maven2 Repository</name>
        <url>http://scala-tools.org/repo-releases</url>
    </pluginRepository>
</pluginRepositories>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.6.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka_2.10</artifactId>
        <version>1.6.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>1.6.1</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.4</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.specs</groupId>
        <artifactId>specs</artifactId>
        <version>1.2.5</version>
        <scope>test</scope>
    </dependency>
</dependencies>
3. Error message
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/StreamingContext
    at com.trs.operations.StreamingKafkaTest$.main(StreamingKafkaTest.scala:15)
    at com.trs.operations.StreamingKafkaTest.main(StreamingKafkaTest.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.StreamingContext
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
    ... 7 more
4. The local Scala version is 2.10.4 and the JDK is 1.7.

青狼_兴 | Beginner Level 1 | Reputation: 152
Asked: 2017-06-22 12:20
All answers (2)

It is probably a problem with the classpath environment variable.
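One quick way to check that (a minimal sketch, assuming it is launched with the same run configuration as the failing job; the object name is just for illustration) is to print the runtime classpath and see whether a spark-streaming jar is actually on it:

// Minimal classpath check: prints every entry the JVM was started with
// and reports whether any of them looks like a spark-streaming jar.
object ClasspathCheck {
  def main(args: Array[String]): Unit = {
    val entries = System.getProperty("java.class.path")
      .split(java.io.File.pathSeparator)
    entries.foreach(println)
    println("spark-streaming jar on classpath: " + entries.exists(_.contains("spark-streaming")))
  }
}

If the jar is declared in the pom but does not show up here, the problem is in how the dependency is scoped or how the run configuration builds the classpath, not in the installed environment.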

风行天下12 | Reputation: 3867 (Veteran Level 4) | 2017-06-26 16:40

I don't think it's a classpath problem. This environment has been in use for over a year, and both Java projects and Spark projects work fine. It's only recently, after I started using Spark Streaming, that this project began to fail. I checked the dependencies in the Maven repository and the versions all match...

青狼_兴 | Reputation: 152 (Beginner Level 1) | 2017-06-26 16:45

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.6.1</version>
    <scope>provided</scope>
</dependency>

Remove the provided scope. With provided, the spark-streaming dependency is only used for compiling and testing; it is not on the classpath when you run, and it will not be included when you package...
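A minimal sketch of the adjusted dependency (same version as in the question, just without the provided scope):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.6.1</version>
    <!-- no provided scope, so the jar is on the runtime classpath when run from IDEA -->
</dependency>

Newer IDEA versions also offer an "Include dependencies with 'Provided' scope" option in the Application run configuration, which achieves the same thing for local runs without editing the pom.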

 

That said, this runs fine in Eclipse but not in IDEA... presumably because Eclipse's Maven integration puts provided-scope dependencies on the run classpath, while IDEA honours the scope.

yelon | Reputation: 202 (Novice Level 2) | 2017-10-09 10:35