First and foremost, read ?Rprof. It's R's sampling profiler: run it around your function and it will give you a table showing how much time was spent in each function call, making it easy to see where the speed issues are.
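As a minimal sketch of that workflow (the function slow_fun below is just a placeholder standing in for your own code):

```r
## Placeholder function to profile -- substitute your own.
slow_fun <- function(n) {
  out <- character(n)
  for (i in seq_len(n)) out[i] <- paste(letters, collapse = "")
  out
}

Rprof(tmp <- tempfile())   # start the sampling profiler, writing to a temp file
invisible(slow_fun(1e5))   # run the code you want to profile
Rprof(NULL)                # stop profiling
summaryRprof(tmp)$by.self  # table of time spent in each function
```

The $by.self table is usually the most useful part: it attributes time to the function actually doing the work rather than to everything above it on the call stack.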
Here's one (my first) suggestion.
tabs <- getURL(paste("http://google.de/",x,sep=""))
Google is a massive search engine, and depending on what you're searching for, "Googling" on every iteration may eat up a lot of time. Additionally, you're nesting paste
inside a function that downloads information. While it may not matter much in your function, nesting function calls can slow things down, so consider building the URL with paste before calling getURL
. Also, I would use paste0
instead of paste
in this situation:
system.time(replicate(1e6, paste('a', 'b', sep = '')))
## user system elapsed
## 5.864 0.000 5.679
system.time(replicate(1e6, paste0('a', 'b')))
## user system elapsed
## 3.98 0.00 3.82
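Putting both suggestions together, here's a hedged sketch. The vector x stands in for whatever query strings your function loops over (I don't know their actual form); paste0 is vectorized, so you can build all the URLs up front and keep the download step free of nested calls:

```r
library(RCurl)

x    <- c("foo", "bar")                 # placeholder for your query strings
urls <- paste0("http://google.de/", x)  # build every URL once, outside the download call
tabs <- sapply(urls, getURL)            # then fetch each page
```

This won't change how long each download takes, but it separates the cheap string work from the expensive network work, which also makes the function easier to profile.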
I can't go much further than that without knowing what the xmltable
objects look like. Can you please provide a few sample URLs so we can test your function?