Question

I have an application that writes information to file. This information is used post-execution to determine pass/failure/correctness of the application. I'd like to be able to read the file as it is being written so that I can do these pass/failure/correctness checks in real time.

I assume it is possible to do this, but what are the gotchas involved when using Java? If the reading catches up to the writing, will it just wait for more writes until the file is closed, or will the read throw an exception at that point? If the latter, what do I do then?

My intuition is currently pushing me towards BufferedStreams. Is this the way to go?


Solution

I could not get the example to work using FileChannel.read(ByteBuffer), because it isn't a blocking read. I did, however, get the code below to work:

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class PollingReader extends Thread {

    private volatile boolean running = true;
    private final BufferedInputStream reader;

    public PollingReader() throws IOException {
        reader = new BufferedInputStream(new FileInputStream("out.txt"));
    }

    public void run() {
        while (running) {
            try {
                if (reader.available() > 0) {
                    // New bytes have arrived; echo them as they appear.
                    System.out.print((char) reader.read());
                } else {
                    // Caught up with the writer; back off before polling again.
                    sleep(500);
                }
            } catch (IOException ex) {
                running = false;
            } catch (InterruptedException ex) {
                running = false;
            }
        }
    }
}

Of course the same thing would work as a timer instead of a thread, but I leave that up to the programmer. I'm still looking for a better way, but this works for me for now.

Oh, and I'll caveat this with: I'm using 1.4.2. Yes I know I'm in the stone ages still.
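
For what it's worth, here is a minimal sketch of the timer-based variant mentioned above, assuming the same out.txt and a 500 ms polling interval; the class name is just illustrative (java.util.Timer has been around since 1.3, so it fits the 1.4.2 constraint).

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Timer;
import java.util.TimerTask;

public class TimerTail {
    public static void main(String[] args) throws IOException {
        final BufferedInputStream reader =
                new BufferedInputStream(new FileInputStream("out.txt"));
        Timer timer = new Timer();
        timer.schedule(new TimerTask() {
            public void run() {
                try {
                    // Drain whatever has been appended since the last tick.
                    while (reader.available() > 0) {
                        System.out.print((char) reader.read());
                    }
                } catch (IOException ex) {
                    cancel(); // stop polling if the file goes away
                }
            }
        }, 0, 500); // poll every 500 ms
    }
}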

OTHER TIPS

If you want to read a file while it is being written, and only read the new content, the following will help you achieve that.

To run this program, launch it from a command prompt/terminal window and pass the name of the file to read. It will keep reading the file until you kill the program.

java FileReader c:\myfile.txt

As you type a line of text and save the file from Notepad, you will see the text printed in the console.

import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;

public class FileReader {

    public static void main(String args[]) throws Exception {
        if (args.length > 0) {
            File file = new File(args[0]);
            System.out.println(file.getAbsolutePath());
            if (file.exists() && file.canRead()) {
                long fileLength = file.length();
                readFile(file, 0L);
                while (true) {
                    if (fileLength < file.length()) {
                        // The file has grown: print only the newly appended part.
                        readFile(file, fileLength);
                        fileLength = file.length();
                    }
                    // Sleep briefly so the loop doesn't busy-wait and peg the CPU.
                    Thread.sleep(500);
                }
            }
        } else {
            System.out.println("no file to read");
        }
    }

    public static void readFile(File file, long fileLength) throws IOException {
        String line = null;

        // java.io.FileReader is fully qualified to avoid clashing with this class's name.
        BufferedReader in = new BufferedReader(new java.io.FileReader(file));
        in.skip(fileLength);
        while ((line = in.readLine()) != null) {
            System.out.println(line);
        }
        in.close();
    }
}

You might also take a look at FileChannel for locking part of a file.

http://java.sun.com/javase/6/docs/api/java/nio/channels/FileChannel.html

This method of FileChannel might be a start:

lock(long position, long size, boolean shared)

An invocation of this method will block until the region can be locked.
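
As a rough, untested sketch (the file name and region size are just placeholders), acquiring a shared lock before reading might look like this:

import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;

public class LockExample {
    public static void main(String[] args) throws Exception {
        RandomAccessFile raf = new RandomAccessFile("out.txt", "r");
        FileChannel channel = raf.getChannel();
        // Blocks until the requested region can be locked;
        // shared = true asks for a read lock.
        FileLock lock = channel.lock(0, 1024, true);
        try {
            // ... read the locked region here ...
        } finally {
            lock.release();
            raf.close();
        }
    }
}

Whether a lock actually prevents another program from touching the region is system-dependent, per the FileChannel documentation, so the writing process generally has to cooperate for this to be useful.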

I totally agree with Joshua's response; Tailer (from Apache Commons IO) is fit for the job in this situation. Here is an example:

It writes a line to a file every 150 ms, while tailing that same file every 2500 ms:

import java.io.File;
import java.io.FileOutputStream;

import org.apache.commons.io.input.Tailer;
import org.apache.commons.io.input.TailerListenerAdapter;

public class TailerTest
{
    public static void main(String[] args)
    {
        File f = new File("/tmp/test.txt");
        MyListener listener = new MyListener();
        // Poll the file for new lines every 2500 ms.
        Tailer.create(f, listener, 2500);

        try
        {
            FileOutputStream fos = new FileOutputStream(f);
            int i = 0;
            while (i < 200)
            {
                fos.write(("test" + ++i + "\n").getBytes());
                Thread.sleep(150);
            }
            fos.close();
        }
        catch (Exception e)
        {
            e.printStackTrace();
        }
    }

    // Receives each new line the Tailer finds.
    private static class MyListener extends TailerListenerAdapter
    {
        @Override
        public void handle(String line)
        {
            System.out.println(line);
        }
    }
}

The answer seems to be "no" ... and "yes". There seems to be no real way to know if a file is open for writing by another application. So, reading from such a file will just progress until content is exhausted. I took Mike's advice and wrote some test code:

Writer.java writes a string to a file and then waits for the user to hit Enter before writing another line. The idea is that it can be started up, and then a reader can be started to see how it copes with the "partial" file. The reader I wrote is in Reader.java.

Writer.java

public class Writer extends Object
{
    Writer () {

    }

    public static String[] strings = 
        {
            "Hello World", 
            "Goodbye World"
        };

    public static void main(String[] args) 
        throws java.io.IOException {

        java.io.PrintWriter pw =
            new java.io.PrintWriter(new java.io.FileOutputStream("out.txt"), true);

        for(String s : strings) {
            pw.println(s);
            System.in.read();
        }

        pw.close();
    }
}

Reader.java

public class Reader extends Object
{
    Reader () {

    }

    public static void main(String[] args) 
        throws Exception {

        java.io.FileInputStream in = new java.io.FileInputStream("out.txt");

        java.nio.channels.FileChannel fc = in.getChannel();
        java.nio.ByteBuffer bb = java.nio.ByteBuffer.allocate(10);

        while(fc.read(bb) >= 0) {
            bb.flip();
            while(bb.hasRemaining()) {
                System.out.println((char)bb.get());
            }
            bb.clear();
        }

        System.exit(0);
    }
}

No guarantees that this code is best practice.

This leaves the option suggested by Mike of periodically checking whether there is new data to be read from the file. This then requires user intervention to close the file reader once reading is determined to be complete. Alternatively, the reader needs to be made aware of the file's content and be able to detect an end-of-write condition. If the content were XML, the end of the document could be used to signal this.
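
For instance, here is a minimal sketch of that idea, assuming the writer terminates the file with a known closing marker such as </log>; the marker, file name, and polling interval are all illustrative:

import java.io.IOException;
import java.io.RandomAccessFile;

public class SentinelReader {
    public static void main(String[] args) throws IOException, InterruptedException {
        RandomAccessFile raf = new RandomAccessFile("out.xml", "r");
        while (true) {
            String line = raf.readLine();
            if (line == null) {
                // Caught up with the writer; wait for more data to appear.
                Thread.sleep(500);
            } else if ("</log>".equals(line.trim())) {
                // The writer's end-of-document marker: we are done.
                break;
            } else {
                System.out.println(line);
            }
        }
        raf.close();
    }
}

Note that RandomAccessFile.readLine() can hand back a partially written line if it catches the writer mid-write, so a real implementation would need to handle that case.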

Not Java per se, but you may run into issues where you have written something to a file, but it hasn't actually been written yet - it might be sitting in a cache somewhere, and reading from the same file may not actually give you the new information.

Short version - use flush() or whatever the relevant system call is to ensure that your data is actually written to the file.

Note I am not talking about the OS-level disk cache - if your data gets in there, it should appear in a read() after that point. It may be that the language itself caches writes, waiting until a buffer fills up or the file is flushed/closed.
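
A minimal sketch of what that looks like on the writing side (file name and content are illustrative):

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class FlushingWriter {
    public static void main(String[] args) throws IOException {
        BufferedWriter out = new BufferedWriter(new FileWriter("out.txt"));
        out.write("result: PASS");
        out.newLine();
        // Without this flush the line may sit in the writer's buffer,
        // invisible to any process reading the file.
        out.flush();
        out.close();
    }
}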

There is an open-source Java graphical tail that does this. The run() method below is quoted from the answer linked here; _running, _updateInterval, _file, and _filePointer are fields of the surrounding tailer class, and appendMessage/appendLine update its display.

https://stackoverflow.com/a/559146/1255493

public void run() {
    try {
        while (_running) {
            Thread.sleep(_updateInterval);
            long len = _file.length();
            if (len < _filePointer) {
                // Log must have been jibbled or deleted.
                this.appendMessage("Log file was reset. Restarting logging from start of file.");
                _filePointer = len;
            }
            else if (len > _filePointer) {
                // File must have had something added to it!
                RandomAccessFile raf = new RandomAccessFile(_file, "r");
                raf.seek(_filePointer);
                String line = null;
                while ((line = raf.readLine()) != null) {
                    this.appendLine(line);
                }
                _filePointer = raf.getFilePointer();
                raf.close();
            }
        }
    }
    catch (Exception e) {
        this.appendMessage("Fatal error reading log file, log tailing has stopped.");
    }
    // dispose();
}

I've never tried it, but you should write a test case to see whether reading from a stream after you have hit the end will work, regardless of whether more data is written to the file.

Is there a reason you can't use a piped input/output stream? Is the data being written and read from the same application (if so, you have the data, why do you need to read from the file)?

Otherwise, maybe read till end of file, then monitor for changes and seek to where you left off and continue... though watch out for race conditions.
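
If both ends really are in the same JVM, the piped-stream suggestion above could look something like this (all names here are illustrative):

import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

public class PipeExample {
    public static void main(String[] args) throws IOException, InterruptedException {
        final PipedOutputStream out = new PipedOutputStream();
        final PipedInputStream in = new PipedInputStream(out);

        Thread writer = new Thread(new Runnable() {
            public void run() {
                try {
                    out.write("Hello World\n".getBytes());
                    out.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        });
        writer.start();

        // read() blocks until the writer produces data or closes the pipe.
        int c;
        while ((c = in.read()) != -1) {
            System.out.print((char) c);
        }
        writer.join();
    }
}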

You can't read a file which is opened by another process using FileInputStream, FileReader or RandomAccessFile.

But using FileChannel directly will work:

private static byte[] readSharedFile(File file) throws IOException {
    byte[] buffer = new byte[(int) file.length()];
    final FileChannel fc = FileChannel.open(file.toPath(), EnumSet.of(StandardOpenOption.READ));
    final ByteBuffer dst = ByteBuffer.wrap(buffer);
    // Loop: a single read() is not guaranteed to fill the whole buffer.
    while (dst.hasRemaining() && fc.read(dst) >= 0) {
        // keep reading until the buffer is full or EOF is reached
    }
    fc.close();
    return buffer;
}
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow