Question

I have a case which I've never seen before.

In one case, we assign a decimal value in code, e.g. foo.DecimalField = 200M;. When I call ToString() on this with no format string or culture, I get the value "200".

In another case, the same field is populated by Entity Framework. The debugger still says the exact value of the field is 200M, but calling ToString() yields "200.00".

I would understand this if it were a floating-point problem, or if the debugger were hiding decimal places.

But since decimal is a fixed-point type, how can this be? There is no web server or anything else involved that could affect the culture.

What am I missing? Can the decimal type behave differently depending on how it was initialized? (I assume EF uses some reader internally that initializes the decimal when fetching it.)


Solution

Unlike Double, Decimal preserves trailing zeros, e.g.

  decimal d0 = 200M;
  decimal d1 = 200.0M;
  decimal d2 = 200.00M;

  // "200"
  string s0 = d0.ToString();
  // "200.0" (or "200,0", depending on CultureInfo.CurrentCulture)
  string s1 = d1.ToString();
  // "200.00"
  string s2 = d2.ToString();

So somewhere along the way, through computations, database reads, etc., one value ended up as 200M and the other as 200.00M. A decimal carries a scale (a power-of-ten exponent) alongside its digits, and ToString() preserves that scale. Most likely the database column is declared with scale 2 (e.g. decimal(18,2)), so the data reader materializes the value with two decimal places, which is why the EF-loaded value prints as "200.00".
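The same scale-preserving behavior can be reproduced with Python's decimal module, which is a convenient sandbox for the idea; note these are Python standard-library calls, not .NET APIs, used here only as an analog to System.Decimal:

```python
from decimal import Decimal

# Like .NET's System.Decimal, Python's Decimal keeps a per-value scale
# (exponent), and str() preserves it.
print(str(Decimal("200")))     # "200"
print(str(Decimal("200.00")))  # "200.00"

# The two values still compare equal; only the representation differs.
assert Decimal("200") == Decimal("200.00")

# quantize() rescales explicitly, mimicking what happens when a value is
# read back from a database column declared with scale 2:
d = Decimal("200").quantize(Decimal("0.01"))
print(str(d))                  # "200.00"
```

If a canonical string is needed regardless of how the value was produced, format it explicitly (e.g. d.ToString("0.##") in C#, or quantize in Python) instead of relying on the value's stored scale.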

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow