Question

I have written this very classic piece of code to read the content of a URL.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;
import java.nio.charset.StandardCharsets;

// Prepares the connection
URL url = new URL(urlString);
URLConnection uc = url.openConnection();

// Reads the data line by line (readLine() strips the line terminators)
StringBuilder data = new StringBuilder();
String inputLine;
try (InputStreamReader isr = new InputStreamReader(uc.getInputStream(), StandardCharsets.UTF_8);
     BufferedReader in = new BufferedReader(isr)) {
    while ((inputLine = in.readLine()) != null) {
        data.append(inputLine);
    }
}

// Returns the read data
return data.toString();

But sometimes the URL I'm reading contains too much data, or the client connection is too slow, or whatever... and so the reading takes too long.

Is there a way to specify a "max-reading-time" to the BufferedReader (or to the InputStreamReader, or maybe to the URLConnection)? Ideally it would throw a TimeoutException once the max-reading-time is reached.

I did some research, but all I could find were limitations on the size of the data received, not on the execution time.


Solution

Just call URLConnection.setReadTimeout() before starting to read. If the timeout expires, a SocketTimeoutException will be thrown.
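
For example, here is a minimal sketch of the code from the question with the timeout applied; the 10-second value and the readUrl wrapper are just for illustration:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;
import java.nio.charset.StandardCharsets;

public static String readUrl(String urlString) throws IOException {
    // Prepares the connection
    URL url = new URL(urlString);
    URLConnection uc = url.openConnection();

    // Bound connection setup and each blocking read to 10 seconds;
    // on expiry a java.net.SocketTimeoutException (an IOException) is thrown
    uc.setConnectTimeout(10_000);
    uc.setReadTimeout(10_000);

    // Reads the data
    StringBuilder data = new StringBuilder();
    try (BufferedReader in = new BufferedReader(
            new InputStreamReader(uc.getInputStream(), StandardCharsets.UTF_8))) {
        String inputLine;
        while ((inputLine = in.readLine()) != null) {
            data.append(inputLine);
        }
    }
    return data.toString();
}

Note that setReadTimeout() bounds how long each individual read may block waiting for data, not the total transfer time: a response that keeps trickling in slowly can still take arbitrarily long overall, but a connection that goes silent will fail fast.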

Licensed under: CC-BY-SA with attribution