Question

I have a requirement in my project. I have to collect log data using Flume, and that data has to be fed into a Hive table.

My requirement is to collect files placed in a folder into HDFS, which I am doing using a spooldir source. After this, I need to process these files and place the output in the Hive warehouse folder so the data can be queried immediately.

Can I process the source files using the sink in such a way that the data placed in HDFS is already processed into the required format?

Thanks, Sathish

Was it helpful?

Solution 2

Using the configuration below has served my purpose.

source.type = spooldir
source.spoolDir = ${location}
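
For context, a fuller agent definition along these lines would wire that source to an HDFS sink; the agent and component names (agent1, src1, ch1, sink1) and the warehouse path are illustrative assumptions, only the property keys are standard Flume 1.x:

# Illustrative component names; adapt to your agent
agent1.sources = src1
agent1.channels = ch1
agent1.sinks = sink1

# Spooling-directory source watching the landing folder
agent1.sources.src1.type = spooldir
agent1.sources.src1.spoolDir = ${location}
agent1.sources.src1.channels = ch1

agent1.channels.ch1.type = memory

# HDFS sink writing plain text under the Hive warehouse (path is an assumption)
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.channel = ch1
agent1.sinks.sink1.hdfs.path = /user/hive/warehouse/logs
agent1.sinks.sink1.hdfs.fileType = DataStream
agent1.sinks.sink1.hdfs.writeFormat = Text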

Other tips

Yes, you need to use a serializer (implement this interface: http://flume.apache.org/releases/content/1.2.0/apidocs/org/apache/flume/serialization/EventSerializer.html), drop it into plugin.d/, and then add it to the configuration of the HDFS sink.
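
As a minimal sketch of that approach, the serializer below implements the EventSerializer interface linked above; the package, class name, and the space-to-tab transform are illustrative assumptions, not part of the original answer. It rewrites each event body into tab-delimited text that Hive's default text SerDe can split into columns:

package com.example;

import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.serialization.EventSerializer;

// Hypothetical example: turns each raw log line into a tab-delimited row
// so the files the HDFS sink writes are immediately queryable from Hive.
public class HiveDelimitedSerializer implements EventSerializer {

    private final OutputStream out;

    private HiveDelimitedSerializer(Context context, OutputStream out) {
        this.out = out;
    }

    @Override
    public void afterCreate() throws IOException {
        // Nothing to write at file creation; Hive text files need no header.
    }

    @Override
    public void afterReopen() throws IOException {
        // Nothing special to do when an existing file is reopened.
    }

    @Override
    public void write(Event event) throws IOException {
        // Illustrative transform: assume fields are space-separated in the
        // raw log and re-emit them tab-separated, one event per line.
        String line = new String(event.getBody(), StandardCharsets.UTF_8);
        out.write(line.replace(' ', '\t').getBytes(StandardCharsets.UTF_8));
        out.write('\n');
    }

    @Override
    public void flush() throws IOException {
        out.flush();
    }

    @Override
    public void beforeClose() throws IOException {
        // No trailer to write before the sink closes the file.
    }

    @Override
    public boolean supportsReopen() {
        return false;
    }

    // Flume instantiates serializers through a nested Builder referenced
    // from the sink configuration.
    public static class Builder implements EventSerializer.Builder {
        @Override
        public EventSerializer build(Context context, OutputStream out) {
            return new HiveDelimitedSerializer(context, out);
        }
    }
}

After dropping the jar into plugin.d/, the sink would reference the builder, e.g. (sink name again assumed):

agent1.sinks.sink1.serializer = com.example.HiveDelimitedSerializer$Builder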
