Question

I am looking for a way to retrieve the URLs of the GET requests made by a specific website / link that I do not have any affiliation with. I've been using PHP and it's really not working out. I am pretty sure the code below is getting the information of the index page itself, not the GET requests, because the page needs to load to even initiate those requests, and I don't know of a way to "load" a page without actually going to it in a browser. If you can give me any lead, in any programming language, it would be a great help.

$url = 'http://apple.com';
echo "<pre>";
// get_headers() returns the HTTP response headers for $url itself,
// not the sub-requests the page would trigger in a browser.
print_r(get_headers($url, 1));
echo "</pre>";

This is what I want an array of (just the URLs / filenames):

Image of FireBug Net tab / HTTP Request Headers

With tools like Simple HTML DOM Parser and cURL, I was thinking there might be a way. If another language can do this, I would love to know.


Solution

I don't think this is possible, as it is the browser that makes those requests.

The PHP code runs on the server side and does not load images, JavaScript, CSS, etc.
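That said, you can approximate part of what the Firebug Net tab shows by parsing the page's HTML and collecting the resource URLs it references statically. A minimal sketch using PHP's built-in DOMDocument (no extra libraries) is below; note this only finds resources referenced in the markup — requests triggered by JavaScript at runtime will not appear, and the function name `extractResourceUrls` is just an illustrative choice.

```php
<?php
// Sketch: collect the URLs a page would request for images, scripts,
// and stylesheets by parsing its HTML with DOMDocument.
function extractResourceUrls(string $html): array
{
    $doc = new DOMDocument();
    // Suppress warnings from imperfect real-world markup.
    @$doc->loadHTML($html);

    $urls = [];
    foreach ($doc->getElementsByTagName('img') as $img) {
        $urls[] = $img->getAttribute('src');
    }
    foreach ($doc->getElementsByTagName('script') as $script) {
        if ($script->getAttribute('src') !== '') {
            $urls[] = $script->getAttribute('src');
        }
    }
    foreach ($doc->getElementsByTagName('link') as $link) {
        if ($link->getAttribute('rel') === 'stylesheet') {
            $urls[] = $link->getAttribute('href');
        }
    }
    return $urls;
}

// Example with inline HTML; in practice you would fetch the page first,
// e.g. $html = file_get_contents('http://apple.com');
$html = '<html><head><link rel="stylesheet" href="style.css">'
      . '<script src="app.js"></script></head>'
      . '<body><img src="logo.png"></body></html>';
print_r(extractResourceUrls($html));
```

For requests made dynamically by JavaScript, you would need something that actually executes the page, i.e. a real or headless browser, rather than server-side PHP.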

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow