Question

Hadoop sequence files are really strange. I pack images into a sequence file and can't recover them. I ran a simple test and found that the bytes aren't even the same before and after going through the sequence file.

Configuration confHadoop = new Configuration();
FileSystem fs = FileSystem.get(confHadoop);

String fileName = args[0];
Path file = new Path(fs.getUri().toString() + "/" + fileName);
Path seqFile = new Path("/temp.seq");
SequenceFile.Writer writer = null;
FSDataInputStream in = null;
try {
    writer = SequenceFile.createWriter(confHadoop, Writer.file(seqFile),
            Writer.keyClass(Text.class), Writer.valueClass(BytesWritable.class));

    in = fs.open(file);
    byte[] buffer = IOUtils.toByteArray(in);

    System.out.println("original size ---->  " + buffer.length);
    writer.append(new Text(fileName), new BytesWritable(buffer));
    System.out.println(calculateMd5(buffer));
} finally {
    IOUtils.closeQuietly(in);
    IOUtils.closeQuietly(writer);
}

        SequenceFile.Reader reader = new SequenceFile.Reader(confHadoop, Reader.file(seqFile));

        Text key = new Text();
        BytesWritable val = new BytesWritable();

        while (reader.next(key, val)) {
            System.out.println("size get from sequence file --->" + String.valueOf(val.getLength()));
            String md5 = calculateMd5(val.getBytes());
            Path readSeq=new Path("/write back.png");  
            FSDataOutputStream out = null;
            out = fs.create(readSeq);
            out.write(val.getBytes());  // YES! GOT THE ORIGINAL IMAGE
            out.close();
            System.out.println(md5);
            .............
}

The output shows I got the same number of bytes, and after writing the image back to the local disk, I am sure I got the original image. But why is the MD5 value not the same?

What did I do wrong here?

14/04/22 16:21:35 INFO compress.CodecPool: Got brand-new compressor [.deflate]
original size ---->  485709
c413e36fd864b27d4c8927956298edbb
14/04/22 16:21:35 INFO compress.CodecPool: Got brand-new decompressor [.deflate]
size get from sequence file --->485709
322cce20b732126bcb8876c4fcd925cb

Solution

I finally solved this strange problem, and I have to share it. First, I will show you the wrong way to get the bytes from the sequence file.

Configuration conf = new Configuration();
FileSystem fs = FileSystem.get(conf);
Path input = new Path(inPath);
Reader reader = new SequenceFile.Reader(conf, Reader.file(input));
Text key = new Text();
BytesWritable val = new BytesWritable();

while (reader.next(key, val)) {
    fileName = key.toString();
    byte[] data = val.getBytes(); // don't think you have got the data!
}

The reason is that getBytes() does not return exactly the size of your original data; it returns the whole backing buffer, which may be longer. I put the data in using:

FSDataInputStream in = fs.open(input);
byte[] buffer = IOUtils.toByteArray(in);

Writer writer = SequenceFile.createWriter(conf, Writer.file(output),
        Writer.keyClass(Text.class), Writer.valueClass(BytesWritable.class));

writer.append(new Text(inPath), new BytesWritable(buffer));
writer.close();

I checked the size of the output sequence file: it is the original size plus a header. The reason getBytes() gives more bytes than the original is that BytesWritable keeps a growable backing buffer whose capacity can exceed the valid data length, so getBytes() returns trailing padding along with the real data; getLength() tells you how much of it is valid. But let's see how to get the data correctly.
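The MD5 mismatch can be reproduced without Hadoop at all. Here is a minimal plain-Java sketch simulating a padded backing buffer like the one BytesWritable hands back from getBytes(); the class name and the growth factor are illustrative assumptions, not Hadoop's actual internals:

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Arrays;

public class PaddedBufferDemo {
    // Hex-encode an MD5 digest for printing.
    static String md5Hex(byte[] data) throws NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("MD5");
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest(data)) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        byte[] original = "some image bytes".getBytes();

        // Simulate a BytesWritable-style backing array: capacity grows
        // beyond the valid length, leaving zero padding at the end.
        // (The 1.5x factor here is just for illustration.)
        int length = original.length;                                   // what getLength() reports
        byte[] backing = Arrays.copyOf(original, length + (length >> 1)); // padded, like getBytes()

        // Hashing the whole backing array includes the padding,
        // so its digest differs from the original's ...
        System.out.println(md5Hex(backing));
        // ... while hashing only the first getLength() bytes matches.
        byte[] trimmed = Arrays.copyOfRange(backing, 0, length);
        System.out.println(md5Hex(trimmed).equals(md5Hex(original))); // prints true
    }
}
```

This is exactly why the question's two MD5 values disagree even though the lengths printed match: the length came from getLength(), but the hash was computed over the padded getBytes() array.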

Option #1: copy out exactly the number of bytes you need.

byte[] rawdata = val.getBytes();
int length = val.getLength(); // the exact size of the original data
byte[] data = Arrays.copyOfRange(rawdata, 0, length); // this is correct

Option #2

byte[] data = val.copyBytes();

This is cleaner. :) Finally got it right.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow