Question

I'm writing code to query an online API, which limits the number of requests I can make per 10 seconds. I would like to make my code as fast as possible, which means querying very close to the limit.

I'm wondering if there is any way to guarantee that an iteration of a for loop takes a minimum of t seconds. So, for example, if the code inside the loop takes n < t seconds, then the program will wait t-n seconds before iterating again.

Although I'm using Julia currently, I'm open to solutions in C++, Python or Java. Also, if there are other languages in which this is easier, I'm always willing to learn.


Solution

Many languages have getTickCount(), getFrequency(), and sleep(ms) functions; you can string them together pretty easily as:

while (doMoreQueries)
{
   startTick = getTickCount();

   // send query, do other things

   elapsedMs = (getTickCount() - startTick) * 1000 / getFrequency();
   remainingMs = 10000 - elapsedMs;
   if (remainingMs > 0)       // only sleep if the iteration finished early;
       sleep(remainingMs);    // otherwise a negative argument could misbehave
}
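Since the question mentions Python as an option, here is a concrete sketch of that loop in Python. It uses time.monotonic (a steady clock unaffected by wall-clock changes) and clamps the sleep to zero; the rate_limited helper and its queries parameter are illustrative names, not part of any existing API, and the demo uses a short window instead of the question's 10 seconds so it finishes quickly.

```python
import time

def rate_limited(queries, min_period=10.0):
    """Run each query, making every loop iteration last at least min_period seconds."""
    results = []
    for query in queries:
        start = time.monotonic()          # steady clock, immune to system-time changes
        results.append(query())           # send query, do other work
        elapsed = time.monotonic() - start
        # Clamp to zero so a slow iteration never produces a negative sleep
        time.sleep(max(0.0, min_period - elapsed))
    return results

# Demo: two dummy "queries" with a 0.1-second window for illustration
print(rate_limited([lambda: 1, lambda: 2], min_period=0.1))
```

Note that this guarantees a *minimum* spacing between iteration starts; if a query itself takes longer than min_period, the loop simply doesn't sleep at all.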

Although I'm not familiar with Julia, in C++ you could use the timing facilities in &lt;chrono&gt; (e.g. std::chrono::steady_clock) together with a sleep function such as std::this_thread::sleep_for from &lt;thread&gt;.

OTHER TIPS

Or in Julia ...

while some_condition
  start_time = time()
  # do your stuff
  sleep(max(0, 10 - (time() - start_time)))
end

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow