Question

I want to make a program that will simulate a user browsing a site and clicking on links. Cookies and JavaScript have to be enabled. I've successfully done this in Python, but I want to write it in a compilable language (Python IDEs don't cut it). The links on the site are generated with JavaScript and are dynamic. With Python I used PAMIE (a third-party module that uses win32com) to launch an instance of Internet Explorer, scrape the generated HTML for the links, and then navigate to one of them. The point is for the whole process to be transparent to the server. What's the best (compilable) language and method for doing this? I was thinking C# with the WebBrowser control, but I don't want to spend a lot of time learning something if it isn't going to work. Any help is appreciated!
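
For reference, here's a rough sketch of the C# WebBrowser approach I have in mind, in case it helps frame the question. The URL and the pick-the-first-link logic are just placeholders, and I haven't verified this covers everything I need:

```csharp
using System;
using System.Windows.Forms;

static class Program
{
    [STAThread]
    static void Main()
    {
        var wb = new WebBrowser { ScriptErrorsSuppressed = true };
        wb.DocumentCompleted += (sender, e) =>
        {
            // Fires once a navigation finishes; the DOM seen here is the
            // rendered, post-JavaScript document (and it fires again after
            // each click-driven navigation, so this keeps "browsing").
            foreach (HtmlElement a in wb.Document.GetElementsByTagName("a"))
            {
                string href = a.GetAttribute("href");
                if (!string.IsNullOrEmpty(href))
                {
                    Console.WriteLine("Clicking: " + href);
                    // InvokeMember("click") raises the element's own click
                    // handlers, so the request looks like a user click.
                    a.InvokeMember("click");
                    break;
                }
            }
        };
        wb.Navigate("http://example.com"); // placeholder URL
        Application.Run();                 // WebBrowser needs a message loop
    }
}
```

Since the control hosts the IE engine, JavaScript runs and cookies are kept automatically, which is the part I care about.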


OTHER TIPS

I wrote a blog post on this a while back: Web scraping in .NET. It discusses cookies but not JavaScript; I don't know whether handling that would require additional coding.
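
I haven't reproduced the post here, but the cookie side looks roughly like the sketch below (the URL is a placeholder). Note that this only fetches the raw HTML, so it won't see JavaScript-generated links, per the caveat above:

```csharp
using System;
using System.IO;
using System.Net;

class Scraper
{
    static void Main()
    {
        // One CookieContainer shared across requests makes the session
        // look continuous to the server, like a browser would.
        var cookies = new CookieContainer();

        var request = (HttpWebRequest)WebRequest.Create("http://example.com");
        request.CookieContainer = cookies;
        request.UserAgent = "Mozilla/5.0"; // plausible browser UA string

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            string html = reader.ReadToEnd();
            Console.WriteLine(html.Length); // parse links out of html here
        }
    }
}
```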

Might be worth having a look at Selenium.

We use it for web testing in a C# ASP.NET environment.

The documentation isn't too bad.
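
For a concrete idea, here's a minimal sketch using the Selenium WebDriver C# bindings; it assumes the Selenium.WebDriver NuGet package plus a Chrome driver, and the URL and first-link choice are placeholders. Because a real browser is being driven, JavaScript-generated links and cookies work with no extra effort:

```csharp
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class Clicker
{
    static void Main()
    {
        using (IWebDriver driver = new ChromeDriver())
        {
            driver.Navigate().GoToUrl("http://example.com"); // placeholder

            // The driver sees the live DOM, so links generated by the
            // page's scripts show up here like any other element.
            var links = driver.FindElements(By.TagName("a"));
            if (links.Count > 0)
                links[0].Click(); // navigates exactly as a user click would

            Console.WriteLine(driver.Title);
        }
    }
}
```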
