Question

This seems like a bug to me...

I accept that automatic properties, defined as such:

public decimal? Total { get; set; }

Will be null when they are first accessed. They haven't been initialized, so of course they are null.

But even after setting its value through +=, the decimal? still remains null. So after:

Total += 8;

Total is still null. How can this be correct? I understand that it's doing (null + 8), but it seems strange that it doesn't recognize that this means it should just be set to 8...

Addendums:

I made the "null + 8" point in my question - but notice that it works with strings. So, it does null + "hello" just fine, and returns "hello". Therefore, behind the scenes, it is initializing the string to a string object with the value of "hello". The behavior should be the same for the other types, IMO. It might be because a string can accept a null as a value, but still, a null string is not an initialized object, correct?

Perhaps it's just because a string isn't a nullable...


Solution

public decimal? Total { get; set; }

Think of null as "unknown value". If you have an unknown quantity of something and you add 8 more, how many do you have now?

Answer: unknown.
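For example, here is a minimal, self-contained sketch of that behaviour (the class and variable names are just for illustration):

    using System;

    class NullPropagationDemo
    {
        static void Main()
        {
            decimal? total = null;    // "unknown value"

            total += 8;               // lifted addition: unknown + 8 is still unknown

            Console.WriteLine(total.HasValue);   // False - total is still null
        }
    }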

Operations on Nullable Variables

There are cases where operations on unknown values give you knowable results.

public bool? State { get; set; }

The following statements have knowable solutions even though they contain unknown values:

State = null;
bool? nextState;

nextState = State & false;         // always equals false
nextState = State & true;          // still unknown (null)

nextState = State | true;          // always true
nextState = State | false;         // still unknown (null)

See the pattern?
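If you want to check the pattern yourself, here is a small self-contained sketch (note that only & and | are defined for bool?; the short-circuiting && and || operators are not):

    using System;

    class ThreeValuedLogicDemo
    {
        static void Main()
        {
            bool? state = null;

            Console.WriteLine(state & false);   // False - false wins regardless of the unknown
            Console.WriteLine(state & true);    // prints an empty line - still null
            Console.WriteLine(state | true);    // True - true wins regardless of the unknown
            Console.WriteLine(state | false);   // prints an empty line - still null
        }
    }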

Of course, if you want Total to be equivalent (equal) to 0 when it is null, you can use the null coalescing operator and write something like this:

Total = (Total ?? 0) + 8;

That will use the value of Total in your equation unless it is null, in which case it will use the value 0.
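As a quick, self-contained illustration of that pattern (the names are again just for the example):

    using System;

    class CoalesceDemo
    {
        static void Main()
        {
            decimal? total = null;

            total = (total ?? 0) + 8;   // null is treated as 0, so total becomes 8
            total = (total ?? 0) + 8;   // total now has a value, so this is a normal add

            Console.WriteLine(total);   // 16
        }
    }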

OTHER TIPS

Null + 8 = Null

You'll need to set it to zero first.

null means unknown value,

unknown value + known value = still unknown value
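In other words, give it a known starting point before you start adding to it; a minimal sketch:

    using System;

    class InitializeFirstDemo
    {
        static void Main()
        {
            decimal? total = null;

            total = 0;      // known starting point
            total += 8;     // known + known = known

            Console.WriteLine(total);   // 8
        }
    }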

Here's a one-liner to initialize it on the first call and increment it afterwards:

    public void InitializeOrIncrement(decimal value)
    {
        // if Total is null then initialize, otherwise increment
        Total = (Total == null) ? value : Total + value;
    }

    public decimal? Total { get; set; }
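A usage sketch, assuming the property and method above live on some class (the OrderLine name below is purely hypothetical):

    using System;

    class OrderLine
    {
        public decimal? Total { get; set; }

        public void InitializeOrIncrement(decimal value)
        {
            // if Total is null then initialize, otherwise increment
            Total = (Total == null) ? value : Total + value;
        }
    }

    class Program
    {
        static void Main()
        {
            var line = new OrderLine();

            line.InitializeOrIncrement(8);   // Total was null, so it becomes 8
            line.InitializeOrIncrement(8);   // Total has a value, so it becomes 16

            Console.WriteLine(line.Total);   // 16
        }
    }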

From MSDN:

When you perform comparisons with nullable types, if the value of one of the nullable types is null and the other is not, all comparisons evaluate to false except for != (not equal). It is important not to assume that because a particular comparison returns false, the opposite case returns true.

So, it works as intended.

I know that it makes sense to do

public decimal? Total { get; set; }

Total = (Total ?? 0) + 8;

but wouldn't it just be easier to do:

public decimal Total { get; set; }

The initial value of Total is then 0.
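That is, something like this sketch, where the non-nullable decimal defaults to 0 and += works straight away (the class name is just for the example):

    using System;

    class NonNullableDemo
    {
        public decimal Total { get; set; }   // defaults to 0m, nothing can be null here

        static void Main()
        {
            var demo = new NonNullableDemo();

            demo.Total += 8;
            Console.WriteLine(demo.Total);   // 8
        }
    }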

As other people have pointed out, null is not equal to zero. Although it may seem more convenient for a null value to default to zero, in the long run it is likely to produce weird results that you may not spot until it is too late.

As a quick example, say one of your data feeds fails and populates your result set with nulls. Your calculations will treat the nulls as zeros and continue to produce results. As numbers are still coming out, even though they are likely wrong, you might never notice that something has gone critically wrong.

To set the value of Total, just write:

Total = 8;

I would recommend reading up on Nullable Types to understand how they work. You can check to see if the property has a value by using HasValue.
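For instance, a small sketch of guarding on HasValue (GetValueOrDefault is another option when 0 is an acceptable fallback):

    using System;

    class HasValueDemo
    {
        static void Main()
        {
            decimal? total = null;

            if (total.HasValue)
                Console.WriteLine("Total is " + total.Value);
            else
                Console.WriteLine("Total has not been set yet");

            Console.WriteLine(total.GetValueOrDefault());   // falls back to 0
        }
    }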

From MSDN:

Operators

The predefined unary and binary operators and any user-defined operators that exist for value types may also be used by nullable types. These operators produce a null value if the operands are null; otherwise, the operator uses the contained value to calculate the result. For example:

int? a = 10;
int? b = null;

a++;         // Increment by 1, now a is 11.
a = a * 10;  // Multiply by 10, now a is 110.
a = a + b;   // Add b, now a is null.

Null isn't the same as zero. Zero plus eight is eight... but null plus eight? Always null. Just like infinity plus anything is still infinity - it's undefined.

You'll find that this is universally true of null. Every database (at least that I've ever worked with) will give you the same result.

public decimal? Total { get; set; }

Would something like this work? Some sort of automatic initialization if the value isn't yet set.

public decimal? Total
{
  get { return this.totalValue ?? 0; }   // a value that was never set counts as 0
  set { this.totalValue = value; }
}

private decimal? totalValue;
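A quick usage sketch of that version, wrapped in a hypothetical Invoice class so it can be run on its own:

    using System;

    class Invoice
    {
        private decimal? totalValue;

        public decimal? Total
        {
            get { return this.totalValue ?? 0; }   // a value that was never set counts as 0
            set { this.totalValue = value; }
        }
    }

    class Program
    {
        static void Main()
        {
            var invoice = new Invoice();

            invoice.Total += 8;                 // getter supplies 0, so 0 + 8 = 8 is stored
            Console.WriteLine(invoice.Total);   // 8
        }
    }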
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow