Is there a difference between checking for strict inequality (`!==`) against both `undefined` and `null`, and just checking for coercive inequality (`!=`) against `null`?
26-10-2019
Question
If I pass a variable to the existential operator in CoffeeScript, it is converted to a pair of `!==` comparisons:

```coffeescript
a?
```

compiles to

```javascript
typeof a !== "undefined" && a !== null;
```
But if I use a literal or expression, it instead uses a `!=` comparison:

```coffeescript
17?
```

compiles to

```javascript
17 != null;
// same thing for regexps, strings, function calls and other expressions,
// as far as I know
```
Is there any reason for preferring the double `!==` comparisons over the shorter `!= null`, other than perhaps making JSLint happy?
Solution
Short answer: they're behaviorally equivalent, and the `!= null` compilation is an optimization. Either way, `x?` means that `x` is neither `null` nor `undefined`.
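The equivalence rests on JavaScript's loose-equality rules: `null` and `undefined` are `==` to each other and to nothing else, so a single `!= null` check rules out exactly those two values. A minimal sketch in plain JavaScript (the `exists` helper is illustrative, not part of CoffeeScript's output):

```javascript
// x != null is false exactly when x is null or undefined;
// it behaves the same as x !== null && x !== undefined.
function exists(x) {
  return x != null;
}

console.log(exists(undefined)); // false
console.log(exists(null));      // false
console.log(exists(0));         // true -- falsy values are not coerced to null
console.log(exists(""));        // true
console.log(exists(false));     // true
```

Note that, unlike a plain truthiness test, `0`, `""`, and `false` all count as existing.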
People ask about this a lot on the CoffeeScript issue tracker. The reason `x != null` isn't used everywhere as the compiled output of `x?` is that `x != null` (or any other comparison against `x`) causes a runtime error if `x` doesn't exist. Try it in the Node REPL:
```
> x != null
ReferenceError: x is not defined
```
By "doesn't exist," I mean no `var x`, no `window.x = ...`, and you're not in a function where `x` is the name of an argument. (The CoffeeScript compiler can't identify the `window.x` case because it doesn't make any assumptions about the environment you're in.) So unless there's a `var x` declaration or an argument named `x` in the current scope, the compiler has to use `typeof x !== "undefined"` to prevent your process from potentially crashing.
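The two guards behave differently on undeclared identifiers, which is easy to verify: `typeof` never throws, while a direct comparison does. A quick sketch (run under Node; `neverDeclared` is a hypothetical name that is deliberately never declared):

```javascript
// typeof is safe on identifiers that were never declared:
console.log(typeof neverDeclared); // "undefined" -- no error thrown

// A direct comparison against the same identifier throws instead:
try {
  void (neverDeclared != null);
} catch (e) {
  console.log(e.name); // "ReferenceError"
}
```

This is why the compiler only emits the shorter `x != null` form when it can prove `x` is declared in scope, or when the operand is a literal or expression that cannot possibly be an undeclared identifier.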
OTHER TIPS
I can understand why people find that confusing. In ECMAScript:

```coffeescript
a?
```

is equivalent to:

```javascript
typeof a !== 'undefined' && a !== undefined && a !== null && a !== 0 && a !== false && a !== '';
```

The CoffeeScript refactoring to:

```javascript
typeof a !== "undefined" && a !== null;
```

means that:

```
var a = false;
a?; // evaluates to true?
```

Is that correct?
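Per the accepted answer above, `a?` checks only for `null` and `undefined`, not for general falsiness, so for a declared `a` the compiled check is just `a != null`. That is straightforward to confirm in plain JavaScript (a sketch of the compiled output, not CoffeeScript itself):

```javascript
// What `a?` compiles to for a declared variable: a != null.
// Loose inequality against null rules out only null and undefined,
// so falsy values like false and 0 still "exist".
var a = false;
console.log(a != null); // true -- so `a?` would be true here

var b = 0;
console.log(b != null); // true

var c; // declared but undefined
console.log(c != null); // false
```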