Java - Error while starting Flume
I am trying to use Dynatrace as a source for Flume with a Hadoop sink, where Dynatrace and Flume run on the same server and Hadoop runs on another server.
However, when starting Flume, I get the error below:
ERROR [conf-file-poller-0] (org.apache.flume.conf.file.AbstractFileConfigurationProvider$FileWatcherRunnable.run:211) - Unhandled error
java.lang.NoSuchMethodError: org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;Ljava/lang/Throwable;)V
    at org.apache.commons.logging.impl.SLF4JLocationAwareLog.debug(SLF4JLocationAwareLog.java:120)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:220)
    at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:54)
    at org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:106)
    at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:208)
    at org.apache.flume.sink.hdfs.HDFSEventSink.authenticate(HDFSEventSink.java:529)
    at org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:247)
    at org.apache.flume.conf.Configurables.configure(Configurables.java:41)
    at org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.loadSinks(PropertiesFileConfigurationProvider.java:373)
    at org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.load(PropertiesFileConfigurationProvider.java:223)
    at org.apache.flume.conf.file.AbstractFileConfigurationProvider.doLoad(AbstractFileConfigurationProvider.java:123)
    at org.apache.flume.conf.file.AbstractFileConfigurationProvider.access$300(AbstractFileConfigurationProvider.java:38)
    at org.apache.flume.conf.file.AbstractFileConfigurationProvider$FileWatcherRunnable.run(AbstractFileConfigurationProvider.java:202)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
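Editorial note (not part of the original question): a NoSuchMethodError on org.slf4j.spi.LocationAwareLogger.log like the one above typically points to mixed SLF4J versions on the agent's classpath, for example an older slf4j-api next to the jcl-over-slf4j and Hadoop client jars that expect a newer API. A minimal diagnostic sketch, assuming it is run with the same classpath as the Flume agent (the class name Slf4jClasspathCheck is purely illustrative), prints which jars the logging classes are actually loaded from:

    import org.slf4j.LoggerFactory;

    public class Slf4jClasspathCheck {
        public static void main(String[] args) {
            // Jar that provides the slf4j-api classes (LoggerFactory, LocationAwareLogger, ...)
            System.out.println("slf4j-api loaded from: "
                    + LoggerFactory.class.getProtectionDomain().getCodeSource().getLocation());
            // Jar that provides the concrete SLF4J binding actually in use
            System.out.println("slf4j binding loaded from: "
                    + LoggerFactory.getILoggerFactory().getClass()
                            .getProtectionDomain().getCodeSource().getLocation());
        }
    }

If the two locations come from lib directories with different SLF4J versions (e.g. one from the Flume lib dir and one from the Hadoop jars), that mismatch is the likely trigger for the error above.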
This is my Flume configuration file:
# Name the components on this agent
qkdsk305.sources = httpsource
qkdsk305.sinks = purepathsink useractionsink visitsink nullsink
qkdsk305.channels = purepathchannel useractionchannel visitchannel nullchannel

# Describe/configure the HTTP source
qkdsk305.sources.httpsource.type = org.apache.flume.source.http.HTTPSource
qkdsk305.sources.httpsource.port = 4321
qkdsk305.sources.httpsource.handler = com.dynatrace.diagnostics.btexport.flume.BtExportHandler

# Describe the sinks
qkdsk305.sinks.purepathsink.type = hdfs
qkdsk305.sinks.purepathsink.hdfs.path = hdfs://qkdsk303:9000/user/bts/pp
qkdsk305.sinks.purepathsink.hdfs.fileType = DataStream
qkdsk305.sinks.purepathsink.hdfs.filePrefix = export
qkdsk305.sinks.purepathsink.hdfs.fileSuffix = .txt
qkdsk305.sinks.purepathsink.hdfs.rollInterval = 120
qkdsk305.sinks.purepathsink.hdfs.rollSize = 131072
qkdsk305.sinks.purepathsink.serializer = com.dynatrace.diagnostics.btexport.flume.BtPurePathSerializerBuilder

qkdsk305.sinks.useractionsink.type = hdfs
qkdsk305.sinks.useractionsink.hdfs.path = hdfs://qkdsk303:9000/user/bts/pa
qkdsk305.sinks.useractionsink.hdfs.fileType = DataStream
qkdsk305.sinks.useractionsink.hdfs.filePrefix = export
qkdsk305.sinks.useractionsink.hdfs.fileSuffix = .txt
qkdsk305.sinks.useractionsink.hdfs.rollInterval = 120
qkdsk305.sinks.useractionsink.hdfs.rollSize = 131072
qkdsk305.sinks.useractionsink.serializer = com.dynatrace.diagnostics.btexport.flume.BtPageActionSerializerBuilder

qkdsk305.sinks.visitsink.type = hdfs
qkdsk305.sinks.visitsink.hdfs.path = hdfs://qkdsk303:9000/user/bts/visit
qkdsk305.sinks.visitsink.hdfs.fileType = DataStream
qkdsk305.sinks.visitsink.hdfs.filePrefix = export
qkdsk305.sinks.visitsink.hdfs.fileSuffix = .txt
qkdsk305.sinks.visitsink.hdfs.rollInterval = 120
qkdsk305.sinks.visitsink.hdfs.rollSize = 131072
qkdsk305.sinks.visitsink.serializer = com.dynatrace.diagnostics.btexport.flume.BtVisitSerializerBuilder
qkdsk305.sinks.visitsink.serializer.charset = ISO-8859-2

qkdsk305.sinks.nullsink.type = null
qkdsk305.channels.nullchannel.type = memory

# Use channels which buffer events in memory
qkdsk305.channels.purepathchannel.type = memory
qkdsk305.channels.purepathchannel.capacity = 1000
qkdsk305.channels.purepathchannel.transactionCapacity = 100

qkdsk305.channels.useractionchannel.type = memory
qkdsk305.channels.useractionchannel.capacity = 1000
qkdsk305.channels.useractionchannel.transactionCapacity = 100

qkdsk305.channels.visitchannel.type = memory
qkdsk305.channels.visitchannel.capacity = 1000
qkdsk305.channels.visitchannel.transactionCapacity = 100

# Bind the source and sinks to the channels
qkdsk305.sources.httpsource.channels = purepathchannel useractionchannel visitchannel nullchannel
qkdsk305.sinks.purepathsink.channel = purepathchannel
qkdsk305.sinks.useractionsink.channel = useractionchannel
qkdsk305.sinks.visitsink.channel = visitchannel
qkdsk305.sinks.nullsink.channel = nullchannel

qkdsk305.sources.httpsource.selector.type = multiplexing
qkdsk305.sources.httpsource.selector.header = btType
qkdsk305.sources.httpsource.selector.mapping.PUREPATH = purepathchannel
qkdsk305.sources.httpsource.selector.mapping.PAGE_ACTION = useractionchannel
qkdsk305.sources.httpsource.selector.mapping.VISIT = visitchannel
qkdsk305.sources.httpsource.selector.default = nullchannel
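Editorial illustration (not from the original post): the multiplexing selector in the config above routes each incoming event by its btType header, so an event whose header value is PUREPATH lands in purepathchannel, while anything without a mapped value falls through to nullchannel. In practice that header is set by Dynatrace's BtExportHandler; the sketch below only shows, using Flume's own EventBuilder API, what such an event looks like to the selector:

    import java.nio.charset.StandardCharsets;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.flume.Event;
    import org.apache.flume.event.EventBuilder;

    public class BtTypeRoutingExample {
        public static void main(String[] args) {
            // Header the multiplexing channel selector inspects; its value picks the channel:
            // PUREPATH -> purepathchannel, PAGE_ACTION -> useractionchannel,
            // VISIT -> visitchannel, anything else -> nullchannel (the default).
            Map<String, String> headers = new HashMap<>();
            headers.put("btType", "PUREPATH");

            Event event = EventBuilder.withBody(
                    "example business transaction payload".getBytes(StandardCharsets.UTF_8),
                    headers);

            System.out.println(event.getHeaders());
        }
    }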
Kindly help.

Thanks,
Pranil
It seems one of our other users ran into a similar issue. He fixed the problem by recompiling Flume with protobuf 2.5. Instead of copying what he posted on our Dynatrace community forum, feel free to check out his posting here: https://community.compuwareapm.com/community/display/dl/big+data+business+transaction+bridge?focusedcommentid=192087283#comment-192087283

Andi
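Editorial aside (not part of Andi's answer): after rebuilding Flume against protobuf 2.5, one way to confirm which protobuf-java jar the agent actually loads is a small check in the same spirit as the SLF4J sketch above, run with the agent's classpath (the class name ProtobufClasspathCheck is purely illustrative):

    import com.google.protobuf.Message;

    public class ProtobufClasspathCheck {
        public static void main(String[] args) {
            // Jar that provides the protobuf runtime classes on this classpath
            System.out.println("protobuf-java loaded from: "
                    + Message.class.getProtectionDomain().getCodeSource().getLocation());
            // Implementation-Version from the jar manifest, if the jar declares one
            System.out.println("manifest version: "
                    + Message.class.getPackage().getImplementationVersion());
        }
    }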