Multivariate normal distribution with Math.NET
Question
I am using Math.NET to do some sampling and I just don't get the way it is supposed to work... What am I missing here?
open MathNet.Numerics.LinearAlgebra
open MathNet.Numerics.Distributions

let mm = Double.DenseMatrix.Identity(2)
let ida = Double.DenseMatrix.Identity(2)
let idb = Double.DenseMatrix.Identity(2)
let generator = MatrixNormal(mm, ida, idb)
generator.Density(mm)
I get
System.ArgumentOutOfRangeException: Matrix dimensions must agree.
Parameter name: x
at MathNet.Numerics.Distributions.MatrixNormal.Density(Matrix`1 x) in c:\TeamCity\buildAgent\work\392bcd0e1411b00f\src\Numerics\Distributions\Multivariate\MatrixNormal.cs:line 241
at <StartupCode$FSI_0079>.$FSI_0079.main@()
Stopped due to error
which, oddly enough, is thrown here when I look at the GitHub source:
public double Density(Matrix<double> x)
{
    if (x.RowCount != _m.RowCount || x.ColumnCount != _m.ColumnCount)
    {
        throw Matrix.DimensionsDontMatch<ArgumentOutOfRangeException>(x, _m, "x");
    }
    // ...
}
Edit
After restarting everything, it worked for a 2×2 matrix mean, but not for a 2×1 mean (after adjusting the column covariance). What is very strange is that there are some dimension checks at construction time, yet the error is only raised at invocation. Maybe the construction checks mix up the row covariance and the column covariance, while the correct one is used at invocation. All this underlines the strong advantage of rich type checking.
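For reference, the dimension rule the library is (or should be) enforcing: for an n×p mean matrix M, the row covariance must be n×n and the column covariance p×p. A minimal Python sketch of that check (the helper name is hypothetical, not a Math.NET API):

```python
def check_matrix_normal_dims(m_shape, v_shape, k_shape):
    """Hypothetical helper: dimension rule for the matrix normal distribution.
    For an n-by-p mean M, the row covariance V must be n-by-n and the
    column covariance K must be p-by-p."""
    n, p = m_shape
    if v_shape != (n, n):
        raise ValueError("row covariance must be %d-by-%d" % (n, n))
    if k_shape != (p, p):
        raise ValueError("column covariance must be %d-by-%d" % (p, p))
    return True

# A 2-by-1 mean needs a 2-by-2 row covariance and a 1-by-1 column covariance.
ok = check_matrix_normal_dims((2, 1), (2, 2), (1, 1))
```

If the constructor validated one pair of dimensions and `Density` the other, exactly the behaviour above (construction succeeds, invocation throws) would result.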
For the curious, here is a multivariate implementation. I have not fully checked it, though...
let sampleNormal =
    let rnd = new MersenneTwister()
    fun () ->
        let rec randomNormal () =
            let u1, u2 = rnd.NextDouble(), rnd.NextDouble()
            let r = sqrt (-2. * (log u1))
            let theta = 2. * System.Math.PI * u2
            seq { yield r * sin theta
                  yield r * cos theta
                  yield! randomNormal() }
        randomNormal ()
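The generator above is the Box–Muller transform: two independent uniforms give two independent standard normals. A self-contained Python sketch of the same idea (stdlib only, no Math.NET; `1 - u1` is used instead of `u1` since `random()` can return 0, which the F# version does not guard against either):

```python
import math
import random

def sample_normal(rng):
    """Box-Muller: turn pairs of uniforms into a stream of standard normals."""
    while True:
        u1, u2 = rng.random(), rng.random()
        r = math.sqrt(-2.0 * math.log(1.0 - u1))  # 1 - u1 lies in (0, 1], avoiding log(0)
        theta = 2.0 * math.pi * u2
        yield r * math.sin(theta)
        yield r * math.cos(theta)

rng = random.Random(42)
gen = sample_normal(rng)
xs = [next(gen) for _ in range(100_000)]
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
# with 100k draws, mean should be near 0 and variance near 1
```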
let generate covar =
    let chol = Double.Factorization.DenseCholesky(covar)
    let a = chol.Factor
    fun () ->
        let v = vector (sampleNormal() |> Seq.take covar.ColumnCount |> List.ofSeq)
        a * v

let generatecovar = generate covar
let generaten n = Seq.init n (fun _ -> generatecovar ())
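The trick being used: if L is the Cholesky factor of the covariance (L·Lᵀ = Σ) and z is a vector of independent standard normals, then L·z has covariance Σ. A Python sketch with a hand-rolled Cholesky, assuming a symmetric positive-definite input (illustration only, not the Math.NET implementation):

```python
import math
import random

def cholesky(a):
    """Lower-triangular L with L @ L.T == a, for symmetric positive-definite a."""
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(a[i][i] - s)
            else:
                L[i][j] = (a[i][j] - s) / L[j][j]
    return L

def generate(covar, rng):
    """Draw one sample from N(0, covar): multiply L by standard normals."""
    L = cholesky(covar)
    z = [rng.gauss(0.0, 1.0) for _ in covar]
    return [sum(L[i][k] * z[k] for k in range(len(z))) for i in range(len(covar))]

covar = [[2.0, 0.6], [0.6, 1.0]]
L = cholesky(covar)
sample = generate(covar, random.Random(7))
```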
Edit
Cholesky fails for perfectly correlated inputs; this is expected, since the covariance matrix is then singular. Fall back to an SVD-based factorization in that case:
let mapply (m : Generic.Matrix<float>) f =
    m.IndexedEnumerator() |> Seq.iter (fun (i, j, v) -> m.[i, j] <- f v)
    m

let generate (covar : Generic.Matrix<float>) =
    let R =
        if covar.Determinant() = 0. then
            // we want covar = R * R^T
            // C = U.D^(1/2).D^(1/2).U' = (R^T.Q^T)(Q.R) = R^T.R, with A = D^(1/2).U' = Q.R
            let u, d, vt = let t = covar.Svd(true) in t.U(), t.W(), t.VT()
            let A = (mapply d sqrt) * u.Transpose()
            let qr = A.QR() in qr.R.Transpose()
        else
            let chol = covar.Cholesky()
            chol.Factor
    fun () ->
        let v = vector (sampleNormal() |> Seq.take covar.ColumnCount |> List.ofSeq)
        R * v
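To see why a rank-deficient factor is still fine, take the extreme case of perfect correlation, C = [[1, 1], [1, 1]] = r·rᵀ with r = [1, 1]ᵀ: a valid sample is simply r·z for a single standard normal z, i.e. both coordinates are identical. A Python sketch of this degenerate case (my own illustration, not taken from the post's code):

```python
import random

def generate_degenerate(rng):
    # Perfectly correlated 2x2 covariance C = r r^T with r = [1, 1]:
    # a valid sample is x = r * z for one standard normal z,
    # so both coordinates are exactly equal.
    z = rng.gauss(0.0, 1.0)
    return [z, z]

rng = random.Random(3)
samples = [generate_degenerate(rng) for _ in range(20_000)]
m0 = sum(s[0] for s in samples) / len(samples)
var0 = sum((s[0] - m0) ** 2 for s in samples) / len(samples)
cov01 = sum((s[0] - m0) * (s[1] - m0) for s in samples) / len(samples)
# empirically, var0 and cov01 both approach 1, matching C = [[1, 1], [1, 1]]
```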
Answer
- Identity constructor: DenseMatrix.cs#L321
- MatrixNormal constructor: MatrixNormal.cs#L83
- Density method: MatrixNormal.cs#L237
I can't find anything wrong with your code. Are you sure you are running the code you think you are? Maybe a wrong version of the library, or even of your own code? Try restarting FSI, or even recompiling the library.