Question

I have a 1.7G file with the following format:

String Long String Long String Long String Long ... etc

Essentially, String is a key and Long is a value in a HashMap I'm interested in initialising before running anything else in my application.

My current code is:

RandomAccessFile raf = new RandomAccessFile("/home/map.dat", "r");
raf.seek(0);
while (raf.getFilePointer() != raf.length()) {
    String name = raf.readUTF();
    long offset = raf.readLong();
    map.put(name, offset);
}

This takes about 12 minutes to complete, and I'm sure there are better ways of doing it, so I would appreciate any help or pointers.

thanks


Update: as per EJP's suggestion

EJP, thank you for your suggestion. I hope this is what you meant; correct me if it's wrong.

DataInputStream dis = null;
try {
    dis = new DataInputStream(new BufferedInputStream(new FileInputStream("/home/map.dat")));
    while (true) {
        String name = dis.readUTF();
        long offset = dis.readLong();
        map.put(name, offset);
    }
} catch (EOFException eofe) {
    // readUTF()/readLong() throw EOFException once the whole file has been read
    try {
        dis.close();
    } catch (IOException ioe) {
        ioe.printStackTrace();
    }
}

Solution 2

I would construct the file so it can be used in place, i.e. without loading it this way. Since you have variable-length records, you can build an array of the location of each record, then store the keys in sorted order so you can perform a binary search for the data. (Or you can use a custom hash table.) You can then wrap this in methods which hide the fact that the data is actually stored in a file rather than turned into data objects.

If you do all this, the "load" phase becomes redundant and you won't need to create so many objects.
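
As a rough illustration of that idea in plain Java (no library), here is a minimal sketch, assuming the records were written with writeUTF()/writeLong() and are already sorted by key; the class name and details are hypothetical, not a finished implementation.

import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.ArrayList;
import java.util.List;

// Sketch: look values up directly in the file instead of loading a HashMap.
// Assumes records were written with writeUTF()/writeLong() and are sorted by key.
class FileBackedLookup implements AutoCloseable {
    private final RandomAccessFile raf;
    private final long[] offsets; // file offset of each record

    FileBackedLookup(String path) throws IOException {
        raf = new RandomAccessFile(path, "r");
        // One pass to index where each record starts; only the 2-byte length prefix
        // of each key is read, so no String objects are created here.
        // (A buffered stream would make this pass faster, and for very large files
        // a primitive growable array would avoid boxing; kept simple for clarity.)
        List<Long> index = new ArrayList<>();
        long pos = 0, length = raf.length();
        while (pos < length) {
            index.add(pos);
            raf.seek(pos);
            int keyBytes = raf.readUnsignedShort(); // length prefix written by writeUTF()
            pos += 2 + keyBytes + 8;                // key bytes + 8-byte long value
        }
        offsets = new long[index.size()];
        for (int i = 0; i < offsets.length; i++)
            offsets[i] = index.get(i);
    }

    // Binary search over the sorted records; returns the value, or null if absent.
    Long lookup(String key) throws IOException {
        int lo = 0, hi = offsets.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;
            raf.seek(offsets[mid]);
            int cmp = raf.readUTF().compareTo(key);
            if (cmp == 0) return raf.readLong();
            if (cmp < 0) lo = mid + 1;
            else hi = mid - 1;
        }
        return null;
    }

    @Override
    public void close() throws IOException {
        raf.close();
    }
}

Only the offsets array (8 bytes per record) lives on the heap; each lookup reads a handful of records directly from the file.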


Below is a longer, complete example (using the Chronicle library), but it hopefully shows what is possible.

import vanilla.java.chronicle.Chronicle;
import vanilla.java.chronicle.Excerpt;
import vanilla.java.chronicle.impl.IndexedChronicle;
import vanilla.java.chronicle.tools.ChronicleTest;

import java.io.IOException;
import java.util.*;

public class Main {
    static final String TMP = System.getProperty("java.io.tmpdir");

    public static void main(String... args) throws IOException {
        String baseName = TMP + "/test";
        String[] keys = generateAndSave(baseName, 100 * 1000 * 1000);

        long start = System.nanoTime();
        SavedSortedMap map = new SavedSortedMap(baseName);
        for (int i = 0; i < keys.length / 100; i++) {
            long l = map.lookup(keys[i]);
//            System.out.println(keys[i] + ": " + l);
        }
        map.close();
        long time = System.nanoTime() - start;

        System.out.printf("Load of %,d records and lookup of %,d keys took %.3f seconds%n",
                keys.length, keys.length / 100, time / 1e9);
    }

    static SortedMap<String, Long> generateMap(int keys) {
        SortedMap<String, Long> ret = new TreeMap<>();
        while (ret.size() < keys) {
            long n = ret.size();
            String key = Long.toString(n);
            while (key.length() < 9)
                key = '0' + key;
            ret.put(key, n);
        }
        return ret;
    }

    static void saveData(SortedMap<String, Long> map, String baseName) throws IOException {
        Chronicle chronicle = new IndexedChronicle(baseName);
        Excerpt excerpt = chronicle.createExcerpt();
        for (Map.Entry<String, Long> entry : map.entrySet()) {
            excerpt.startExcerpt(2 + entry.getKey().length() + 8);
            excerpt.writeUTF(entry.getKey());
            excerpt.writeLong(entry.getValue());
            excerpt.finish();
        }
        chronicle.close();
    }

    static class SavedSortedMap {
        final Chronicle chronicle;
        final Excerpt excerpt;
        final String midKey;
        final long size;

        SavedSortedMap(String baseName) throws IOException {
            chronicle = new IndexedChronicle(baseName);
            excerpt = chronicle.createExcerpt();
            size = chronicle.size();
            excerpt.index(size / 2);
            midKey = excerpt.readUTF();
        }

        // find exact match or take the value after.
        public long lookup(CharSequence key) {
            if (compareTo(key, midKey) < 0)
                return lookup0(0, size / 2, key);
            return lookup0(size / 2, size, key);
        }

        private final StringBuilder tmp = new StringBuilder();

        private long lookup0(long from, long to, CharSequence key) {
            long mid = (from + to) >>> 1;
            excerpt.index(mid);
            tmp.setLength(0);
            excerpt.readUTF(tmp);
            if (to - from <= 1)
                return excerpt.readLong();
            int cmp = compareTo(key, tmp);
            if (cmp < 0)
                return lookup0(from, mid, key);
            if (cmp > 0)
                return lookup0(mid, to, key);
            return excerpt.readLong();
        }

        public static int compareTo(CharSequence a, CharSequence b) {
            int lim = Math.min(a.length(), b.length());
            for (int k = 0; k < lim; k++) {
                char c1 = a.charAt(k);
                char c2 = b.charAt(k);
                if (c1 != c2)
                    return c1 - c2;
            }
            return a.length() - b.length();
        }

        public void close() {
            chronicle.close();
        }
    }

    private static String[] generateAndSave(String baseName, int keyCount) throws IOException {
        SortedMap<String, Long> map = generateMap(keyCount);
        saveData(map, baseName);
        ChronicleTest.deleteOnExit(baseName);

        String[] keys = map.keySet().toArray(new String[map.size()]);
        Collections.shuffle(Arrays.asList(keys));
        return keys;
    }
}

This generates 2 GB of raw data and performs a million lookups. It is written in such a way that the loading and lookup use very little heap (<< 1 MB).

ls -l /tmp/test*
-rw-rw---- 1 peter peter 2013265920 Dec 11 13:23 /tmp/test.data
-rw-rw---- 1 peter peter  805306368 Dec 11 13:23 /tmp/test.index

/tmp/test created.
/tmp/test, size=100000000
Load of 100,000,000 records and lookup of 1,000,000 keys took 10.945 seconds

Using a hash table lookup would be faster per lookup, as it is O(1) instead of O(log N), but it is more complex to implement.
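
For illustration only, here is a minimal sketch of that hash-table variant, reusing the idea of an offsets array over the records (as in the earlier sketch). Building the index still reads every key once; in practice you would persist it alongside the data file. Names are hypothetical.

import java.io.IOException;
import java.io.RandomAccessFile;

// Sketch: open-addressing hash index over the record offsets, giving O(1) lookups.
// Assumes the same writeUTF()/writeLong() record format as above.
class HashedFileIndex {
    private final RandomAccessFile raf;
    private final long[] offsets; // offset of each record in the data file
    private final int[] table;    // slot -> record index + 1 (0 means empty)

    HashedFileIndex(RandomAccessFile raf, long[] offsets) throws IOException {
        this.raf = raf;
        this.offsets = offsets;
        int capacity = 1;
        while (capacity < offsets.length * 2) // power-of-two size, ~50% load factor
            capacity <<= 1;
        table = new int[capacity];
        for (int i = 0; i < offsets.length; i++) {
            raf.seek(offsets[i]);
            insert(raf.readUTF().hashCode(), i);
        }
    }

    private void insert(int hash, int recordIndex) {
        int mask = table.length - 1;
        int slot = hash & mask;
        while (table[slot] != 0) // linear probing
            slot = (slot + 1) & mask;
        table[slot] = recordIndex + 1;
    }

    Long lookup(String key) throws IOException {
        int mask = table.length - 1;
        int slot = key.hashCode() & mask;
        while (table[slot] != 0) {
            raf.seek(offsets[table[slot] - 1]);
            if (raf.readUTF().equals(key)) // verify: different keys can share a slot
                return raf.readLong();
            slot = (slot + 1) & mask;
        }
        return null; // not present
    }
}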

OTHER TIPS

  1. Use a DataInputStream wrapped around a BufferedInputStream wrapped around a FileInputStream.

  2. Instead of making at least four system calls per iteration (checking the length and the current file pointer, plus who knows how many reads to get the string and the long), just call readUTF() and readLong() until you get an EOFException; see the sketch after this list.
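
For reference, a compact sketch of both tips together, using try-with-resources so the stream is closed even if an unexpected error occurs; the path and map are placeholders taken from the question, and the usual java.io/java.util imports are assumed.

Map<String, Long> map = new HashMap<>();
try (DataInputStream in = new DataInputStream(
        new BufferedInputStream(new FileInputStream("/home/map.dat")))) {
    while (true) { // no length or file-pointer checks per iteration
        String name = in.readUTF();
        long offset = in.readLong();
        map.put(name, offset);
    }
} catch (EOFException expected) {
    // normal termination: the whole file has been read
}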
