Question

I have a program that takes a text file, reads each line, and inserts each line into a list. I used a Stopwatch to measure the execution time, but I'm getting odd results: the elapsed time is different every time I run the program (the difference is about 1 or 2 seconds).

The text file contains 3 million URLs.

Any ideas?
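(For reference, the measurement described above might look roughly like the following minimal sketch, assuming C#/.NET since a Stopwatch is mentioned; the file name urls.txt is hypothetical.)

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;

class Program
{
    static void Main()
    {
        var stopwatch = Stopwatch.StartNew();

        // Read every line of the file into a list (path is hypothetical).
        var urls = new List<string>();
        foreach (var line in File.ReadLines("urls.txt"))
        {
            urls.Add(line);
        }

        stopwatch.Stop();
        Console.WriteLine($"Read {urls.Count} lines in {stopwatch.ElapsedMilliseconds} ms");
    }
}
```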


Solution 2

Some fluctuation is perfectly normal, especially when the code involves I/O such as reading files.

Other tips

The processing time for a set of commands depends on (but is not limited to):

  • the CPU speed
  • the efficiency of your code
  • async vs. sync methods
  • network speed
  • the networked computer
  • hard drive speed
  • RAM
  • my mood
  • ...

I could go on for days here. The point is (as TheifMaster said) that there are so many variables at play that it would be a miracle if your code took the same time on any two runs. It's the "every snowflake is different" principle: with so many environmental variables, no two runs will be the same.

Don't fret over the time differences; just focus on getting the average time as low as you need it to be.
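If an average is what you're after, one simple approach is to repeat the measurement several times and report the mean. This is only a sketch under the same assumptions as above (C#/.NET, hypothetical urls.txt):

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;

class Benchmark
{
    static void Main()
    {
        const int runs = 5;
        var timings = new List<long>();

        for (int i = 0; i < runs; i++)
        {
            var stopwatch = Stopwatch.StartNew();
            var urls = File.ReadLines("urls.txt").ToList();  // hypothetical path
            stopwatch.Stop();

            timings.Add(stopwatch.ElapsedMilliseconds);
            Console.WriteLine($"Run {i + 1}: {stopwatch.ElapsedMilliseconds} ms ({urls.Count} lines)");
        }

        // Individual runs will still vary (disk cache, other processes, GC),
        // so report the average rather than any single run.
        Console.WriteLine($"Average: {timings.Average():F1} ms over {runs} runs");
    }
}
```

Note that the first run will often be slower than the rest because the OS file cache is cold; later runs read largely from memory.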

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow