Question

I'm searching for a reliable (hang-proof) way to fetch an HTTP page in Java with these features:

  • stop fetching the page if the Content-Type header is not text/...
  • stop fetching the page at any point once the downloaded data exceeds a byte limit (even when Content-Length is not set in the response)
  • stop fetching the page after a given time limit

Does Apache HttpClient have these options? I know that after many connections, a small fraction of them hangs for a long time and doesn't respond even to process signals; that problem can be worked around by running the page fetch in a separate thread, which you can abandon after a time limit. But I still haven't found a solution to the problems above.

My goal is also to avoid accumulating forgotten zombie threads downloading huge files, so stopping a bad download is the priority here.


Solution

  1. Override HttpRequestExecutor.doReceiveResponse(...) to filter based on content type.

  2. Override org.apache.http.message.BasicLineParser#parseRequestLine(...), setting a limit on the maximum cursor position.

  3. You could override HttpRequestExecutor.preProcess(...) to set up a timer expiration.

All of these require some effort on your side.
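Overriding the Apache internals above needs access to the httpclient classes, but the first check (refusing non-text responses before downloading the body) can also be sketched with the plain JDK. Below is a minimal, hypothetical helper — the class and method names are illustrative, not part of any library — that inspects Content-Type first and never reads the body of a non-text response:

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Hypothetical helper: inspects the Content-Type response header first
// and refuses to read the body unless it is text/*.
public final class TextOnlyFetcher {

    /** Returns true if a Content-Type value (possibly null) starts with "text/". */
    public static boolean isText(String contentType) {
        return contentType != null && contentType.startsWith("text/");
    }

    /** Fetches the body as a string, or returns null without reading it
        when the Content-Type is not text/*. */
    public static String fetchIfText(URL url, int timeoutMillis) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setConnectTimeout(timeoutMillis); // guards the TCP handshake
        conn.setReadTimeout(timeoutMillis);    // guards each blocking read
        try {
            // getContentType() reads only the response headers, not the body.
            if (!isText(conn.getContentType())) {
                return null; // wrong type: the body is never downloaded
            }
            try (InputStream in = conn.getInputStream()) {
                return new String(in.readAllBytes(), StandardCharsets.UTF_8);
            }
        } finally {
            conn.disconnect();
        }
    }
}
```

Note that `setReadTimeout` only bounds each individual blocking read, not the total download time; the overall time and size limits still need a read loop like the one described in the "Other tips" below.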

Other tips

Using HttpClient or HttpURLConnection:

  1. check the Content-Type via response.getHeaders or connection.getHeaderField
  2. stop the read loop after n bytes
  3. stop the read loop after t milliseconds (just check System.currentTimeMillis())
Licensed under: CC-BY-SA with attribution