Question

I have to measure the performance of a decoder in terms of number of frames decoded per second. i.e. calculating the FPS.

Below is what my code snippet looks like:

#include <stdio.h>
#include <time.h>

//global variables
clock_t start, stop;
double totTime, FPS;

int main(void)
{
   while (End_of_file)   // loop over the input stream
   {
      start = clock();

      //Decode function is called here

      stop = clock();

      totTime = stop - start;

      FPS = 1 / (totTime / CLOCKS_PER_SEC);

      printf("\n %lf fps\n", FPS);
   }
}

The printf statement sometimes prints a proper value, but sometimes it prints 1.#INF00, which from what I have found is how MSVC formats positive infinity, produced when a positive number is divided by zero. So my first question is: why is totTime = stop - start; evaluating to 0? Second, if clock() is not suitable, how do I get the time taken by the decode function?

Any suggestions regarding the same will be really helpful. Thanks in advance.


Solution

The proper macro for this is CLOCKS_PER_SEC, not CLOCK_PER_SEC.

Also, instead of clock() (which counts coarse processor-time ticks, so a fast decode call can round down to zero ticks), you can use clock_gettime(), which reports time at nanosecond resolution.

License: CC-BY-SA with attribution
Not affiliated with Stack Overflow