
Flume fails to write files to Hadoop?

Bounty: 5 beans [Unresolved question]

a1.sources=r1
a1.channels=c1
a1.sinks=k1

# source r1 config

a1.sources.r1.type=exec
a1.sources.r1.command=ping 192.168.1.125

a1.sinks.k1.type=hdfs
a1.sinks.k1.hdfs.path=hdfs://192.168.1.125:9000/flume/events/%Y/%m/%d
a1.sinks.k1.hdfs.filePrefix=cmcc
a1.sinks.k1.hdfs.minBlockReplicas=1
a1.sinks.k1.hdfs.writeFormat=Text
a1.sinks.k1.hdfs.rollInterval=60
a1.sinks.k1.hdfs.rollSize=0
a1.sinks.k1.hdfs.rollCount=0
a1.sinks.k1.hdfs.idleTimeout=0
a1.sinks.k1.hdfs.fileType=DataStream

a1.channels.c1.type=memory
a1.channels.c1.capacity=1000
a1.channels.c1.transactionCapacity=100

a1.sources.r1.channels=c1
a1.sinks.k1.channel=c1
------------------------ flume config -------
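One thing worth double-checking in the sink section above: the exec source does not put a timestamp header on events, and the %Y/%m/%d escapes in hdfs.path require one, so writes can fail even once any JAR problems are solved. A minimal sketch of the relevant sink lines, assuming the agent's local clock is acceptable for the date partitioning:

a1.sinks.k1.type=hdfs
a1.sinks.k1.hdfs.path=hdfs://192.168.1.125:9000/flume/events/%Y/%m/%d
# the %Y/%m/%d escapes need a timestamp header; exec sources do not set one,
# so fall back to the agent's local clock (or add a timestamp interceptor)
a1.sinks.k1.hdfs.useLocalTimeStamp=true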

---cmd ----
[root@bogon apache-flume-1.8.0-bin]# bin/flume-ng agent -c conf -f conf/avroa.conf -n a1
Info: Including Hive libraries found via () for Hive access

+ exec /opt/java/jdk1.8.0_65/bin/java -Xmx20m -cp '/opt/flume/apache-flume-1.8.0-bin/conf:/opt/flume/apache-flume-1.8.0-bin/lib/*:/lib/*' -Djava.library.path= org.apache.flume.node.Application -f conf/avroa.conf -n a1
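Note that the startup output above only mentions Hive libraries; when the hadoop binary is resolvable on the PATH, flume-ng normally also prints an "Including Hadoop libraries found via (...)" line and appends the Hadoop classpath itself. A quick check, assuming the Hadoop install lives under /opt/hadoop (adjust to the real path):

export HADOOP_HOME=/opt/hadoop          # assumption: substitute the actual install directory
export PATH=$HADOOP_HOME/bin:$PATH
which hadoop                            # flume-ng locates the Hadoop classpath via this lookup
bin/flume-ng agent -c conf -f conf/avroa.conf -n a1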

What is the problem here? I have been struggling with this for several days. Any pointers would be much appreciated!

Follow-up:

I followed a tutorial, but no data is written into Hadoop successfully.
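One quick way to confirm whether anything reached HDFS is to list the target directory from the config recursively (the path is partitioned by date):

hdfs dfs -ls -R hdfs://192.168.1.125:9000/flume/events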

Company | Beginner, Level 1 | Beans: 32 | Asked: 2018-12-03 23:17
All answers (1)

Checked the logs: the failure was caused by missing Hadoop JARs. Copying the required JARs into flume/lib fixed it.
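The exact JAR set depends on the Hadoop version. For a typical Hadoop 2.x layout the copy might look like the sketch below; the paths are assumptions, so adjust them to the actual install:

# copy the HDFS client JARs the sink needs into Flume's lib directory
# (assuming a Hadoop 2.x tree under $HADOOP_HOME)
cp $HADOOP_HOME/share/hadoop/common/hadoop-common-*.jar \
   $HADOOP_HOME/share/hadoop/common/lib/hadoop-auth-*.jar \
   $HADOOP_HOME/share/hadoop/common/lib/commons-configuration-*.jar \
   $HADOOP_HOME/share/hadoop/common/lib/htrace-core*.jar \
   $HADOOP_HOME/share/hadoop/hdfs/hadoop-hdfs-*.jar \
   /opt/flume/apache-flume-1.8.0-bin/lib/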

Company | Beans: 32 (Beginner, Level 1) | 2018-12-04 22:34