I'm not entirely sure I understand how the confidence interval is supposed to be applied to each sample of the signal, but we can compute the confidence interval of the whole sample set as follows:
using System;
using MathNet.Numerics.Distributions;
using MathNet.Numerics.Statistics;

public static Tuple<double, double> A(double[] samples, double interval)
{
    // Two-sided interval: split the remaining probability mass evenly between both tails
    double theta = (interval + 1.0) / 2;
    double mean = samples.Mean();
    double sd = samples.StandardDeviation(); // sample standard deviation (n-1)
    // Quantile of Student's t-distribution with n-1 degrees of freedom
    double T = StudentT.InvCDF(0, 1, samples.Length - 1, theta);
    // Margin of error: t-quantile times the standard error of the mean
    double t = T * (sd / Math.Sqrt(samples.Length));
    return Tuple.Create(mean - t, mean + t);
}
Except that the line where we compute T does not compile, because unfortunately there is no StudentT.InvCDF in current Math.NET Numerics yet. In the meantime we can still evaluate it numerically as a workaround:
var student = new StudentT(0, 1, samples.Length - 1);
// Invert the CDF numerically by root finding; ±800 is just a generous search bracket
double T = FindRoots.OfFunction(x => student.CumulativeDistribution(x) - theta, -800, 800);
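We can sanity-check this workaround against the classic t-table entry; a quick sketch (the variable names here are just for illustration):

var student16 = new StudentT(0, 1, 15); // 16 samples -> 15 degrees of freedom
double theta95 = (0.95 + 1.0) / 2; // alpha = 0.05, two-sided
double T16 = FindRoots.OfFunction(x => student16.CumulativeDistribution(x) - theta95, -800, 800);
// T16 ≈ 2.131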
Indeed, with 16 samples and alpha 0.05 we get 2.131, as expected. If there are more than roughly 60 to 100 samples, the t-quantile can also be approximated with the normal distribution:
// Normal quantile; a good approximation of the t-quantile for large sample counts
double T = Normal.InvCDF(0, 1, theta);
So all in all:
public static Tuple<double, double> B(double[] samples, double interval)
{
    // Two-sided interval: split the remaining probability mass evenly between both tails
    double theta = (interval + 1.0) / 2;
    // Numerically invert the Student's t CDF (n-1 degrees of freedom)
    double T = FindRoots.OfFunction(x => StudentT.CDF(0, 1, samples.Length - 1, x) - theta, -800, 800);
    double mean = samples.Mean();
    double sd = samples.StandardDeviation(); // sample standard deviation (n-1)
    // Margin of error: t-quantile times the standard error of the mean
    double t = T * (sd / Math.Sqrt(samples.Length));
    return Tuple.Create(mean - t, mean + t);
}
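For example, calling it with a 95% confidence level (the sample values below are made-up illustration data):

double[] samples = { 9.8, 10.1, 10.3, 9.9, 10.0, 10.2, 9.7, 10.1, 10.0, 9.9, 10.2, 10.0, 9.8, 10.1, 9.9, 10.3 };
Tuple<double, double> ci = B(samples, 0.95);
Console.WriteLine("95% confidence interval: [{0:F3}, {1:F3}]", ci.Item1, ci.Item2);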
This is not the full answer yet, as I understand you wanted to somehow apply the confidence interval to each sample, but hopefully it helps you on the way there.
PS: Using Math.NET Numerics v3.0.0-alpha7