Question

I have a program that calculates the latency of an object in a pub-sub model. I've used the following function to take a timestamp:

#include <stdint.h>
#include <sys/time.h>

uint64_t GetTimeStamp() {
    struct timeval tv;
    gettimeofday(&tv, NULL);
    /* whole seconds scaled to microseconds, plus the microsecond remainder */
    return tv.tv_sec * (uint64_t)1000000 + tv.tv_usec;
}

The latency is measured as the difference between the timestamps taken at the publisher and at the subscriber, so I'm concerned about the unit of the measured latency. Is it in seconds or microseconds?

Solution

The timeval structure has tv_sec, which holds the whole seconds since the Epoch, and tv_usec, which holds the remaining fraction of a second in microseconds.

Because GetTimeStamp() returns tv_sec * 1,000,000 + tv_usec, its result is a total count of microseconds, so the difference between two such timestamps is also in microseconds.
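
Here is a minimal sketch of that computation; t_pub and t_sub are hypothetical stand-ins for timestamps captured at the publisher and at the subscriber:

#include <stdint.h>
#include <stdio.h>
#include <sys/time.h>

/* Same helper as in the question. */
static uint64_t GetTimeStamp(void) {
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return tv.tv_sec * (uint64_t)1000000 + tv.tv_usec;
}

int main(void) {
    uint64_t t_pub = GetTimeStamp();  /* would be taken at the publisher */
    /* ... message travels through the pub-sub system ... */
    uint64_t t_sub = GetTimeStamp();  /* would be taken at the subscriber */

    uint64_t latency_us = t_sub - t_pub;      /* microseconds */
    double latency_ms = latency_us / 1000.0;  /* milliseconds */
    printf("latency: %llu us (%.3f ms)\n",
           (unsigned long long)latency_us, latency_ms);
    return 0;
}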

For more information, see http://www.gnu.org/software/libc/manual/html_node/Elapsed-Time.html

Other tips

tv.tv_sec gives the count of whole seconds; tv.tv_usec gives the remaining microsecond count.
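
So a combined microsecond timestamp, like the one GetTimeStamp() returns, can be split back into those two fields with a division and a remainder. A quick sketch, using a hypothetical value:

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint64_t ts = 1696502400123456ULL;  /* hypothetical combined timestamp */
    uint64_t sec  = ts / 1000000;       /* corresponds to tv_sec  */
    uint64_t usec = ts % 1000000;       /* corresponds to tv_usec */
    printf("%llu s + %llu us\n",
           (unsigned long long)sec, (unsigned long long)usec);
    return 0;
}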

For the principle behind gettimeofday() and its accuracy, see:

How is the microsecond time of linux gettimeofday() obtained and what is its accuracy?

Is gettimeofday() guaranteed to be of microsecond resolution?
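
On the resolution point raised by those links, the smallest step gettimeofday() actually reports can be probed empirically. A small sketch (results depend on the kernel and hardware, and an observed step is not a guarantee of accuracy):

#include <stdint.h>
#include <stdio.h>
#include <sys/time.h>

static uint64_t now_us(void) {
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return tv.tv_sec * (uint64_t)1000000 + tv.tv_usec;
}

int main(void) {
    uint64_t min_step = UINT64_MAX;
    uint64_t prev = now_us();
    for (int i = 0; i < 1000000; i++) {
        uint64_t cur = now_us();
        if (cur > prev && cur - prev < min_step)
            min_step = cur - prev;  /* smallest forward jump seen so far */
        prev = cur;
    }
    printf("smallest observed step: %llu us\n",
           (unsigned long long)min_step);
    return 0;
}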

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow