In hindsight, a better title might be "ZipInputStream limits read performance" or something similar, because other kinds of streams don't limit the read size: when you request 4096 bytes, you get 4096 bytes. I tested this with a plain text file, for example. I still don't know why ZipInputStream limits the read size.
I'm not sure there is a consistent performance boost (sometimes there is, sometimes not), but I am now using IOUtils from Apache's Commons IO package (http://commons.apache.org/proper/commons-io/). This also simplifies the whole operation.
I have also seen some solutions based on NIO channels, but they seem to work only with file streams, so I cannot use them here (or cannot figure out how to apply them to this situation). See also: Faster way of copying data in Java? (see the accepted answer).
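For what it's worth, `java.nio.channels.Channels` can wrap any InputStream/OutputStream in a channel, not just file streams, so a channel-based copy is at least possible in principle. A minimal sketch (I haven't measured whether it is any faster than IOUtils for the zip case; the class and buffer size are my own choices):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.ByteBuffer;
import java.nio.channels.Channels;
import java.nio.channels.ReadableByteChannel;
import java.nio.channels.WritableByteChannel;

public class ChannelCopy
{
    // Copies all bytes from in to out using NIO channels.
    // Channels.newChannel accepts any stream, not only file streams.
    public static long copy( InputStream in, OutputStream out ) throws IOException
    {
        ReadableByteChannel oSrc = Channels.newChannel( in );
        WritableByteChannel oDst = Channels.newChannel( out );
        ByteBuffer oBuffer = ByteBuffer.allocateDirect( 16 * 1024 );
        long iTotal = 0;
        while( oSrc.read( oBuffer ) != -1 )
        {
            oBuffer.flip();
            while( oBuffer.hasRemaining() )
            {
                iTotal += oDst.write( oBuffer );
            }
            oBuffer.clear();
        }
        return iTotal;
    }

    public static void main( String[] args ) throws IOException
    {
        ByteArrayOutputStream oOut = new ByteArrayOutputStream();
        long iCopied = copy( new ByteArrayInputStream( "hello channels".getBytes( "UTF-8" ) ), oOut );
        System.out.println( iCopied + " " + oOut.toString( "UTF-8" ) ); // prints: 14 hello channels
    }
}
```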
This is the new version I have made (I also renamed the objects to something meaningful):
public String unzipStream( String sFileName )
{
    ByteArrayOutputStream oBaosBuffer = new ByteArrayOutputStream();
    // try-with-resources so the stream is also closed when an exception is thrown
    try( ZipInputStream oZipStream = new ZipInputStream( this.activity.getAssets().open( sFileName ) ) )
    {
        ZipEntry oZipEntry;
        long iSize = 0;
        while( (iSize == 0) && ((oZipEntry = oZipStream.getNextEntry()) != null) && !oZipEntry.isDirectory() )
        {
            iSize = IOUtils.copyLarge( oZipStream, oBaosBuffer );
            oZipStream.closeEntry();
        }
        if( iSize > 0 )
        {
            return oBaosBuffer.toString( "UTF-8" );
            //sResult = new String( Base64.decode(sb.toString("UTF-8"), Base64.DEFAULT), Charset.forName("UTF-8") );
        }
    }
    catch( Exception e )
    {
        System.out.println( "Error unzip: " + e.getMessage() );
    }
    return null;
}
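The short reads can also be observed in a self-contained test that zips a buffer in memory and reads it back with 4096-byte requests. My guess (not verified) is that it is related to ZipInputStream's internal inflater input buffer, since read() seems to return as soon as the inflater produces any output; the class and payload below are made up for the demo:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Random;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class ShortReadDemo
{
    // Builds an in-memory zip archive containing one entry with the given payload.
    public static byte[] zip( byte[] aPayload ) throws IOException
    {
        ByteArrayOutputStream oBytes = new ByteArrayOutputStream();
        ZipOutputStream oZipOut = new ZipOutputStream( oBytes );
        oZipOut.putNextEntry( new ZipEntry( "data.bin" ) );
        oZipOut.write( aPayload );
        oZipOut.closeEntry();
        oZipOut.close();
        return oBytes.toByteArray();
    }

    // Reads the first entry with 4096-byte requests, logging each read() result,
    // and returns the total number of bytes read.
    public static long drain( byte[] aZipBytes ) throws IOException
    {
        ZipInputStream oZipIn = new ZipInputStream( new ByteArrayInputStream( aZipBytes ) );
        oZipIn.getNextEntry();
        byte[] aBuffer = new byte[4096];
        long iTotal = 0;
        int iRead;
        while( (iRead = oZipIn.read( aBuffer, 0, aBuffer.length )) != -1 )
        {
            // With incompressible data this often prints values well below 4096.
            System.out.println( "read() returned " + iRead );
            iTotal += iRead;
        }
        oZipIn.close();
        return iTotal;
    }

    public static void main( String[] args ) throws IOException
    {
        byte[] aPayload = new byte[65536];
        new Random( 42 ).nextBytes( aPayload ); // random data, so it barely compresses
        System.out.println( "total = " + drain( zip( aPayload ) ) );
    }
}
```

Regardless of how many calls it takes, the total always matches the original payload size, so the behavior is correct, just slower than a stream that fills the whole buffer per call.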