log(a*b) should always be faster, because computing the logarithm is expensive: in log(a*b) you evaluate it only once, while in log(a)+log(b) you evaluate it twice.
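The two expressions agree mathematically, of course; the difference is only in how many log evaluations you pay for. A quick check (in Python rather than MATLAB, values chosen arbitrarily):

```python
import math

a, b = 3.7, 12.5

one_log = math.log(a * b)             # one multiply, one log
two_logs = math.log(a) + math.log(b)  # two logs, one add

# Identical up to floating-point rounding: log(a*b) = log(a) + log(b)
assert math.isclose(one_log, two_logs)
```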
Computing products and sums is trivial compared to logarithms, exponentials, etc. In terms of processor cycles, an addition or multiplication generally costs fewer than 5 cycles, whereas a logarithm or exponential can cost anywhere from 50 up to 200 cycles on some architectures.
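A rough illustration of that cost gap (in Python rather than MATLAB, so interpreter overhead blurs the raw cycle counts and shrinks the ratio, but the ordering still shows):

```python
import math
import timeit

xs = [1.0 + i / 1000.0 for i in range(1000)]
log = math.log  # bind once so both loops pay the same name-lookup cost

# One pass of multiplications vs one pass of logarithms over the same data.
t_mul = timeit.timeit(lambda: [x * 1.0000001 for x in xs], number=1000)
t_log = timeit.timeit(lambda: [log(x) for x in xs], number=1000)

print(f"multiply: {t_mul:.4f}s  log: {t_log:.4f}s  ratio: {t_log / t_mul:.1f}x")
```

In a compiled language, where there is no per-call interpreter overhead, the measured ratio gets much closer to the raw cycle counts.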
Is log(a*b*...*z) always faster than log(a) + log(b) + ... + log(z)?
Yes. Definitely. Avoid computing logarithms whenever possible.
Here's a small experiment:
a = rand(5000000, 1);

% log(a(1)*a(2)*...)
tic
for ii = 1:100
    res = log(prod(a));
end
toc
% Elapsed time is 0.649393 seconds.

% log(a(1))+log(a(2))+...
tic
for ii = 1:100
    res = sum(log(a));
end
toc
% Elapsed time is 6.894769 seconds.
At some point the ratio between the two timings saturates. Where it saturates depends on your processor architecture, but the difference will be at least an order of magnitude.
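One caveat before replacing every sum of logs with a log of a product: the product itself can overflow or underflow. In the experiment above, prod(a) over five million values drawn from (0,1) underflows to 0, so res comes out as -Inf, whereas the slower sum(log(a)) stays finite. That numerical stability is exactly why the sum-of-logs form is used in practice. A Python sketch of the same hazard (values chosen deliberately to force underflow):

```python
import math

# 2000 copies of 0.5: the true log-product is 2000 * log(0.5) ≈ -1386.29
xs = [0.5] * 2000

# Fast route: one multiplication pass, one log -- but the product
# underflows to exactly 0.0 long before we take its logarithm.
p = math.prod(xs)
assert p == 0.0  # 0.5**2000 is smaller than the smallest positive double

# Stable route: the sum of logs never leaves a representable range.
s = sum(math.log(x) for x in xs)
assert math.isclose(s, 2000 * math.log(0.5))
```

So the speed answer is "yes", but only when the product is guaranteed to stay within floating-point range.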