Question

Our Hadoop cluster uses Snappy as the default codec. The reduce output files of a Hadoop job are named like part-r-00000.snappy. JSnappy fails to decompress the file because JSnappy expects the file to start with the "SNZ" magic bytes, but the reduce output file starts with some zero bytes instead.

How could I decompress the file?


Solution

Use "Hadoop fs -text" to read this file and pipe it to txt file. ex:

hadoop fs -text part-r-00001.snappy > /tmp/mydatafile.txt
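
If you need to do the same thing programmatically rather than from the shell, Hadoop's codec classes can decompress the file the same way "hadoop fs -text" does. Below is a minimal sketch (the class name SnappyDecompress and the argument handling are illustrative, not part of the original answer) that uses CompressionCodecFactory to pick the codec from the .snappy extension and streams the decompressed bytes to a local file:

import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Paths;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionCodecFactory;

// Minimal sketch: decompress an HDFS .snappy part file to a local file.
// Assumed usage: java SnappyDecompress hdfs:///path/part-r-00000.snappy /tmp/mydatafile.txt
public class SnappyDecompress {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path input = new Path(args[0]);
        FileSystem fs = FileSystem.get(input.toUri(), conf);

        // Let Hadoop pick the codec from the file extension (.snappy -> SnappyCodec)
        CompressionCodecFactory factory = new CompressionCodecFactory(conf);
        CompressionCodec codec = factory.getCodec(input);
        if (codec == null) {
            System.err.println("No compression codec found for " + input);
            return;
        }

        try (InputStream in = codec.createInputStream(fs.open(input));
             OutputStream out = Files.newOutputStream(Paths.get(args[1]))) {
            // Stream the decompressed bytes to the local output file
            IOUtils.copyBytes(in, out, 4096, false);
        }
    }
}

Note that this requires the native Snappy libraries to be available on the machine running the code, just as they are on the cluster nodes.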
