Basically, Crawler4j loads the existing statistics by reading every key/value pair from the "Statistics" database. The code is actually somewhat misleading: a transaction is begun but no modification is ever made to the DB, so the lines dealing with tnx could be removed (a read-only cursor can be opened with a null transaction).
Commented line by line:
//Create a database configuration object
DatabaseConfig dbConfig = new DatabaseConfig();
//Set some parameters: allow database creation, make the database transactional, and disable deferred writes
dbConfig.setAllowCreate(true);
dbConfig.setTransactional(true);
dbConfig.setDeferredWrite(false);
//Open the database named "Statistics" with the configuration created above
statisticsDB = env.openDatabase(null, "Statistics", dbConfig);
OperationStatus result;
//Create new database entries key and values
DatabaseEntry key = new DatabaseEntry();
DatabaseEntry value = new DatabaseEntry();
//Start a transaction
Transaction tnx = env.beginTransaction(null, null);
//Get the cursor on the DB
Cursor cursor = statisticsDB.openCursor(tnx, null);
//Position the cursor to the first occurrence of key/value
result = cursor.getFirst(key, value, null);
//While result is success
while (result == OperationStatus.SUCCESS) {
    //If the value at the current cursor position is not empty, read the counter's name and value and put them into the HashMap counterValues
    if (value.getData().length > 0) {
        String name = new String(key.getData());
        long counterValue = Util.byteArray2Long(value.getData());
        counterValues.put(name, counterValue);
    }
    //Move the cursor to the next key/value pair
    result = cursor.getNext(key, value, null);
}
cursor.close();
//Commit the transaction; since nothing was modified here, this is effectively a no-op
tnx.commit();
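As for the conversion inside the loop, Util.byteArray2Long is Crawler4j's own helper. A minimal equivalent (an illustrative sketch using java.nio.ByteBuffer, not Crawler4j's actual implementation) would look like this:

```java
import java.nio.ByteBuffer;

public class ByteArrayLong {
    //Interpret an 8-byte array as a big-endian long, as a counter value stored in the DB would be
    static long byteArray2Long(byte[] b) {
        return ByteBuffer.wrap(b).getLong();
    }

    //The inverse operation, for writing a counter back as bytes
    static byte[] long2ByteArray(long l) {
        return ByteBuffer.allocate(Long.BYTES).putLong(l).array();
    }

    public static void main(String[] args) {
        long counter = 42L;
        byte[] raw = long2ByteArray(counter);
        System.out.println(byteArray2Long(raw)); // prints 42
    }
}
```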
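To illustrate the remark at the top, here is what the same loading logic looks like with the transaction removed entirely. This is only a sketch against the Berkeley DB JE API used by Crawler4j; the byteArray2Long helper stands in for Crawler4j's Util.byteArray2Long and may differ from the real one:

```java
import com.sleepycat.je.Cursor;
import com.sleepycat.je.Database;
import com.sleepycat.je.DatabaseEntry;
import com.sleepycat.je.OperationStatus;
import java.util.HashMap;
import java.util.Map;

public class StatisticsLoader {
    //Read every counter from an already opened "Statistics" database
    public static Map<String, Long> loadCounters(Database statisticsDB) {
        Map<String, Long> counterValues = new HashMap<>();
        DatabaseEntry key = new DatabaseEntry();
        DatabaseEntry value = new DatabaseEntry();
        //A null transaction is enough for a read-only cursor, even on a transactional database
        Cursor cursor = statisticsDB.openCursor(null, null);
        try {
            OperationStatus result = cursor.getFirst(key, value, null);
            while (result == OperationStatus.SUCCESS) {
                if (value.getData().length > 0) {
                    counterValues.put(new String(key.getData()),
                            byteArray2Long(value.getData()));
                }
                result = cursor.getNext(key, value, null);
            }
        } finally {
            cursor.close();
        }
        return counterValues;
    }

    //Hypothetical stand-in for Crawler4j's Util.byteArray2Long
    private static long byteArray2Long(byte[] b) {
        return java.nio.ByteBuffer.wrap(b).getLong();
    }
}
```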
I also answered a similar question here. About SleepyCat, are you speaking about this?