The answer provided by meagar is correct, strictly from a JavaScript / Unix time perspective. However, if you just multiply by 1000, you will lose any sub-second precision that might have existed in your data.
Moment.js offers two different methods, as described in the docs. `.unix()` returns the value in seconds, effectively dividing by 1000 and truncating any decimals. You want the `.valueOf()` method, which returns the milliseconds without modification.