Question

I'm working on an application that needs to get the source of a web page from a link, and then parse the html from that page.

Could you give me some examples, or some starting points on where to look, to get started writing such an app?

Solution

You can use HttpClient to perform an HTTP GET and retrieve the HTML response, something like this:

// Execute the GET request (DefaultHttpClient and HttpGet come from the Apache HttpClient API)
HttpClient client = new DefaultHttpClient();
HttpGet request = new HttpGet(url);
HttpResponse response = client.execute(request);

// Read the response body into a single string
InputStream in = response.getEntity().getContent();
BufferedReader reader = new BufferedReader(new InputStreamReader(in));
StringBuilder str = new StringBuilder();
String line;
while ((line = reader.readLine()) != null) {
    str.append(line);
}
reader.close();
String html = str.toString();
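
Note that on Android 3.0 and above this request must not run on the main thread, or it will throw a NetworkOnMainThreadException. A minimal sketch of pushing it onto a background thread, assuming the snippet above is wrapped in a hypothetical fetchHtml(String url) helper that throws IOException, and that this code lives inside an Activity:

new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            // fetchHtml is a hypothetical helper wrapping the HttpClient code above
            final String html = fetchHtml("http://example.com");
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    // hand "html" to your parser or your views here
                }
            });
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}).start();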

OTHER TIPS

I would suggest jsoup.

According to their website:

Fetch the Wikipedia homepage, parse it to a DOM, and select the headlines from the In the news section into a list of Elements (online sample):

Document doc = Jsoup.connect("http://en.wikipedia.org/").get();
Elements newsHeadlines = doc.select("#mp-itn b a");

Getting started:

  1. Download the jsoup jar core library
  2. Read the cookbook introduction
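
jsoup can also parse HTML you have already downloaded (for example the string built by the HttpClient code above) instead of fetching the page itself. A minimal sketch, assuming html holds the page source; the a[href] selector is just an example:

// Parse an already-downloaded HTML string rather than letting jsoup fetch it
Document doc = Jsoup.parse(html);
System.out.println(doc.title());
Elements links = doc.select("a[href]");   // every anchor with an href attribute
for (Element link : links) {
    System.out.println(link.attr("href") + " -> " + link.text());
}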

This question is a bit old, but I figured I should post my answer now that DefaultHttpClient, HttpGet, etc. are deprecated. This function should get and return HTML, given a URL.

public static String getHtml(String url) throws IOException {
    // Build and set timeout values for the request.
    URLConnection connection = (new URL(url)).openConnection();
    connection.setConnectTimeout(5000);
    connection.setReadTimeout(5000);
    connection.connect();

    // Read and store the result line by line then return the entire string.
    InputStream in = connection.getInputStream();
    BufferedReader reader = new BufferedReader(new InputStreamReader(in));
    StringBuilder html = new StringBuilder();
    for (String line; (line = reader.readLine()) != null; ) {
        html.append(line);
    }
    in.close();

    return html.toString();
}
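
Like the earlier snippets, this still has to be called off the main thread on Android. A minimal usage sketch, assuming getHtml above is in scope and using a placeholder URL:

// getHtml must not be called on the main thread; push it onto a background executor
ExecutorService executor = Executors.newSingleThreadExecutor();
executor.execute(new Runnable() {
    @Override
    public void run() {
        try {
            String html = getHtml("http://example.com");
            // parse the HTML here, or post it back to the UI thread
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
});
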
public class RetrieveSiteData extends AsyncTask<String, Void, String> {
    @Override
    protected String doInBackground(String... urls) {
        StringBuilder builder = new StringBuilder(100000);

        // Fetch each URL in turn and concatenate the responses into one string
        for (String url : urls) {
            DefaultHttpClient client = new DefaultHttpClient();
            HttpGet httpGet = new HttpGet(url);
            try {
                HttpResponse execute = client.execute(httpGet);
                InputStream content = execute.getEntity().getContent();

                BufferedReader buffer = new BufferedReader(new InputStreamReader(content));
                String s;
                while ((s = buffer.readLine()) != null) {
                    builder.append(s);
                }

            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        return builder.toString();
    }

    @Override
    protected void onPostExecute(String result) {

    }
}
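
Since onPostExecute above is left empty, one way to get at the downloaded HTML is to override it when creating the task. A minimal usage sketch with a placeholder URL:

new RetrieveSiteData() {
    @Override
    protected void onPostExecute(String result) {
        // "result" holds the concatenated HTML of all requested URLs; parse it here
    }
}.execute("http://example.com");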

If you have a look here or here, you will see that you can't do that directly with the Android API; you need an external library...

You can choose between the two libraries mentioned above if you need an external one.

Call it like this:

new RetrieveFeedTask(new OnTaskFinished()
{
    @Override
    public void onFeedRetrieved(String feeds)
    {
        // do whatever you want to do with the feeds
    }
}).execute("http://enterurlhere.com");

RetrieveFeedTask.java

class RetrieveFeedTask extends AsyncTask<String, Void, String>
{
    String HTML_response = "";

    OnTaskFinished onOurTaskFinished;

    public RetrieveFeedTask(OnTaskFinished onTaskFinished)
    {
        onOurTaskFinished = onTaskFinished;
    }

    @Override
    protected void onPreExecute()
    {
        super.onPreExecute();
    }

    @Override
    protected String doInBackground(String... urls)
    {
        try
        {
            URL url = new URL(urls[0]); // the URL to download, passed in via execute()

            URLConnection conn = url.openConnection();

            // open the stream and put it into BufferedReader
            BufferedReader br = new BufferedReader(new InputStreamReader(conn.getInputStream()));

            String inputLine;

            while ((inputLine = br.readLine()) != null)
            {
                // System.out.println(inputLine);
                HTML_response += inputLine;
            }
            br.close();

            System.out.println("Done");

        }
        catch (MalformedURLException e)
        {
            e.printStackTrace();
        }
        catch (IOException e)
        {
            e.printStackTrace();
        }
        return HTML_response;
    }

    @Override
    protected void onPostExecute(String feed)
    {
        onOurTaskFinished.onFeedRetrieved(feed);
    }
}

OnTaskFinished.java

public interface OnTaskFinished
{
    public void onFeedRetrieved(String feeds);
}

One of the other SO answers helped me here. This approach doesn't read line by line, which helps if the HTML file supposedly has a null line somewhere in the middle. As a prerequisite, add the dependency "com.koushikdutta.ion:ion:2.2.1" in your project settings and implement this code in an AsyncTask. If you want the returned result on the UI thread, pass it through a shared interface.

Ion.with(getApplicationContext())
        .load("https://google.com/hashbrowns")
        .asString()
        .setCallback(new FutureCallback<String>() {
            @Override
            public void onCompleted(Exception e, String result) {
                // int s = result.lastIndexOf("user_id") + 9;
                // String st = result.substring(s, s + 5);
                // Log.e("USERID", st); // something
            }
        });
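
Since the question also asks about parsing, the commented-out substring lookups in onCompleted could be swapped for a real parser. A minimal sketch of the callback body, assuming jsoup (from the earlier answer) is also on the classpath:

@Override
public void onCompleted(Exception e, String result) {
    if (e != null) {
        e.printStackTrace();
        return;
    }
    // Hand the downloaded HTML to jsoup instead of substring lookups
    Document doc = Jsoup.parse(result);
    Log.d("TITLE", doc.title());
}
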
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow