Does System.Decimal use more memory than 'decimal'?
Question
I heard someone say that in C#, the capitalized Decimal uses more memory than the lowercase decimal, because Decimal is resolved to the lowercase decimal and that resolution requires memory.
Is that true?
Solution
No.
decimal is simply an alias for System.Decimal. They're exactly the same, and the alias is resolved at compile time.
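A quick way to convince yourself is to compare the two names at runtime. This is a minimal sketch; any C# compiler will do, and the class name `AliasDemo` is just for illustration:

```csharp
using System;

class AliasDemo
{
    static void Main()
    {
        // The keyword and the type name resolve to the very same Type object.
        Console.WriteLine(typeof(decimal) == typeof(Decimal)); // True
        Console.WriteLine(typeof(decimal).FullName);           // System.Decimal

        // Both occupy 16 bytes; the alias adds no memory overhead.
        // (sizeof on the predefined decimal type is allowed in safe code.)
        Console.WriteLine(sizeof(decimal)); // 16
    }
}
```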
OTHER TIPS
No, that is not true.
The decimal keyword is an alias for the type System.Decimal. They are the exact same type, so there is no memory difference and no performance difference. If you use reflection to inspect the compiled code, it is not even possible to tell whether the alias or the system type was used in the source code.
There are two differences in where you can use the alias and the system type, though:
The decimal alias always refers to the system type and cannot be changed in any way. The use of the Decimal identifier relies on importing the System namespace; the unambiguous name for the system type is global::System.Decimal.

Some language constructs only accept the alias, not the type. I can't think of an example for decimal, but when specifying the underlying type of an enum you can only use language aliases like int, not the corresponding system type like System.Int32.
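The enum restriction can be sketched like this. The enum names are made up for illustration, and whether the qualified type name is rejected can depend on compiler version; classic C# compilers report error CS1008 for it:

```csharp
using System;

// Fine: the language alias is accepted as the underlying type.
enum StatusByAlias : int
{
    Ok,
    Failed
}

// With classic C# compilers this does NOT compile (error CS1008):
// enum StatusByType : System.Int32 { Ok, Failed }

class EnumDemo
{
    static void Main()
    {
        // The underlying type is System.Int32 either way.
        Console.WriteLine(Enum.GetUnderlyingType(typeof(StatusByAlias)));
    }
}
```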
No. That's just silly.
In C#, decimal is just a synonym for Decimal. The compiler treats decimal declarations as Decimal, and the compiled code is the same as if Decimal had been used.
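To illustrate that the compiled output carries no trace of which spelling was used, one can compare the reflected types of two fields declared each way. A minimal sketch, with hypothetical field names `a` and `b`:

```csharp
using System;
using System.Reflection;

class SynonymDemo
{
    public decimal a; // declared with the keyword
    public Decimal b; // declared with the type name

    static void Main()
    {
        FieldInfo fa = typeof(SynonymDemo).GetField("a");
        FieldInfo fb = typeof(SynonymDemo).GetField("b");

        // Reflection sees the same type for both; the source spelling is gone.
        Console.WriteLine(fa.FieldType == fb.FieldType); // True
        Console.WriteLine(fa.FieldType.FullName);        // System.Decimal
    }
}
```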