Question

I tried to write the following C# program to approximate pi by summing an infinite series, but I keep getting confused about integer/double/decimal division.

I really have no clue why this isn't working, so pardon my shaky understanding of strongly typed languages, as I'm still learning C#.

Thanks in advance!

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace ConsoleApplication1
{
    class Program
    {
        public static int Main(string[] args)
        {
            int numeratornext = 2;    // next factor to multiply into the numerator
            int denominatornext = 5;  // next factor to multiply into the denominator

            decimal findto = 100.0M;  // how many terms to sum
            decimal pi = 0.0M;
            decimal halfpi = 1.0M;    // running sum; the series should converge to pi/2
            int seriesnum = 1;        // current term's numerator
            int seriesden = 3;        // current term's denominator

            for (int i = 0; i < findto; i++)
            {
                halfpi += Decimal.Divide((decimal)seriesnum, (decimal)seriesden); // add the next term of the series
                //System.Console.WriteLine(Decimal.Divide((decimal)seriesnum, (decimal)seriesden).ToString());
                seriesnum *= numeratornext;
                seriesden *= denominatornext;
                numeratornext++;
                denominatornext += 2;
            }

            pi = halfpi * 2;

            System.Console.WriteLine(pi.ToString());
            System.Console.ReadLine();
            return 0;
        }
    }
}

Solution

Your confusion arises because you're mixing int and decimal: int is for whole numbers, and decimal is designed for fixed-precision work such as financial calculations. Neither is a natural fit for approximating an irrational number. My suggestion would be simply to declare all of your numeric variables as double here, which removes any ambiguity about the type of each intermediate result, since any arithmetic operation on two doubles yields another double.
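
For concreteness, here's a minimal sketch of that suggestion, keeping the variable names from the question and assuming 100 terms as in the original (this is my illustration, not code from the answer):

using System;

class Program
{
    static void Main()
    {
        double numeratornext = 2.0;
        double denominatornext = 5.0;

        double halfpi = 1.0;     // running sum; the series converges to pi/2
        double seriesnum = 1.0;  // current term's numerator
        double seriesden = 3.0;  // current term's denominator

        for (int i = 0; i < 100; i++)
        {
            halfpi += seriesnum / seriesden;  // double / double, no surprises
            seriesnum *= numeratornext;
            seriesden *= denominatornext;
            numeratornext += 1.0;
            denominatornext += 2.0;
        }

        Console.WriteLine(halfpi * 2);  // prints roughly 3.14159...
    }
}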

(Aside from this, there is a second bug: seriesnum and seriesden are ints, and their running products overflow int after about ten iterations, silently corrupting every term from then on. That overflow is a separate issue from which type you use for the division itself, though switching those two variables to double happens to cure it as well.)
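
If you'd rather avoid those enormous running products altogether, one alternative sketch (my reformulation, not part of the original answer) carries each term directly and updates it by the ratio between consecutive terms:

using System;

class Program
{
    static void Main()
    {
        double halfpi = 1.0;      // the n = 0 term of the series
        double term = 1.0 / 3.0;  // term n = n! / (3 * 5 * ... * (2n + 1)), starting at n = 1

        for (int n = 1; n <= 100; n++)
        {
            halfpi += term;
            // consecutive terms differ by a factor of (n + 1) / (2n + 3),
            // so no large intermediate products are ever formed
            term *= (n + 1) / (2.0 * n + 3.0);
        }

        Console.WriteLine(halfpi * 2);  // roughly 3.14159...
    }
}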
