Question

I need to model the latency of an LTE simulator under several system configurations (single core, multi-core, multiple nodes on the same server, multiple servers). Does anyone have an idea how to calculate the amount of computation in a piece of source code (or in just a part of the whole code, if I want to)? I think the possible approaches are:

  1. Take the difference between timestamps at the start and end of the execution using clock() (see the MATLAB sketch below)
  2. Total number of operations / instructions per second (machine dependent)
  3. Total number of instructions / instructions per second

The 3rd is a more general version of the 2nd; a rough worked example follows.
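As a rough illustration with purely hypothetical numbers: a code section that executes 2×10^9 instructions on hardware sustaining 10^9 instructions per second gives an estimated latency of roughly 2 s.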

The simulator is in MATLAB, and I am free to use C (through MEX files).
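For approach 1, here is a minimal MATLAB sketch using clock() and etime; the matrix multiply is only a stand-in for the simulator section you actually want to measure:

t_start = clock;                      % wall-clock timestamp as a 6-element date vector
A = randn(2000); B = A * A;           % placeholder workload; replace with the simulator code to measure
latency_s = etime(clock, t_start);    % elapsed wall-clock time in seconds
fprintf('Measured latency: %.3f s\n', latency_s);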


Solution

MATLAB has a very easy way to do this built in. Use the following code in a script file:

tic;
% ... operations to be timed ...
toc;

This automatically prints the time elapsed for the set of commands between tic and toc. Hope this helps.
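If you need the elapsed time as a value (e.g., to feed your latency model) rather than as a printout, toc can return it directly, and timeit (available in recent MATLAB releases) gives a more stable estimate by running the code several times. A short sketch, again with a placeholder workload:

t = tic;                              % start a stopwatch and keep its handle
A = randn(2000); B = A * A;           % placeholder workload; replace with the simulator section
latency_s = toc(t);                   % elapsed time in seconds for this timer
fprintf('Elapsed: %.3f s\n', latency_s);

f = @() randn(2000) * randn(2000);    % wrap the section in a zero-argument function handle
latency_avg = timeit(f);              % runs f several times and returns a representative time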

Licensed under: CC-BY-SA with attribution