Problem

I am attempting to use optim() in R to solve for lambda in the following equation:

lambda/sigma^2 - ln(lambda/sigma^2) = 1 + 1/Q

subject to the constraint:

lambda > sigma^2.

I am not sure how one goes about setting this up in R.

I am open to alternative optimization routines as well, although the equation seems convex, so optim should be a fine choice.

Thank you!


Solution

You are trying to solve an equation. Whether or not the constraint is met can only be decided ex post. You can use uniroot as follows:

# f(x) = 0 at the solution; sigma and Q default to 1 here
f <- function(x, sigma = 1, Q = 1) {x/sigma^2 - log(x/sigma^2) - 1 - 1/Q}
uniroot(f, c(1, 5))

giving

$root
[1] 3.146198

$f.root
[1] 3.552369e-06

$iter
[1] 5

$estim.prec
[1] 6.103516e-05
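
Note that the equation also has a second root below sigma^2 (the left-hand side tends to infinity as lambda tends to 0 and equals 1, which is below 1 + 1/Q, at lambda = sigma^2), so an interval that straddles sigma^2 could pick up the root that violates the constraint. A minimal sketch of one way to bake the constraint into the search, with a hypothetical helper name solve_lambda and an arbitrary upper endpoint:

# Restrict the search interval to (sigma^2, upper) so that any
# root uniroot finds automatically satisfies lambda > sigma^2.
solve_lambda <- function(sigma = 1, Q = 1, upper = 100) {
  f <- function(x) x/sigma^2 - log(x/sigma^2) - 1 - 1/Q
  uniroot(f, lower = sigma^2 * (1 + 1e-8), upper = upper)$root
}
solve_lambda()  # ~3.146, matching the result above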

Other tips

Decided this is more of an answer than a comment.

Both optim and optimize minimize functions, so what you want to do is write an error function that returns, say, the squared error for a given lambda, e.g. se(lambda, sigma, Q) (make sure lambda is the first argument). Then call optim with fn = se, a starting value for lambda, method = "L-BFGS-B" (required for the box constraint), and lower = sigma^2, passing sigma and Q through the ... argument, and it will return the value of lambda that minimizes your error function. If you have multiple data points (Q, sigma^2 pairs), make your function a sum of squared errors or try using nls().
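
A minimal sketch of that approach, assuming illustrative values sigma = 1 and Q = 1 and an arbitrary upper bound of 100 for the search:

# Squared-error objective: zero exactly at the solution of the equation;
# lambda is the first argument so the optimizers can vary it.
se <- function(lambda, sigma = 1, Q = 1) {
  (lambda/sigma^2 - log(lambda/sigma^2) - (1 + 1/Q))^2
}

sigma <- 1; Q <- 1

# One-dimensional problem, so optimize() is the natural choice; the
# lower endpoint of the interval enforces lambda > sigma^2.
optimize(se, interval = c(sigma^2, 100), sigma = sigma, Q = Q)

# The optim() equivalent: box constraints need method = "L-BFGS-B",
# and par supplies a starting value for lambda.
optim(par = 2 * sigma^2, fn = se, method = "L-BFGS-B",
      lower = sigma^2 + 1e-8, upper = 100, sigma = sigma, Q = Q)

Both calls should recover the same root, lambda ≈ 3.146, as the uniroot solution above.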

License: CC-BY-SA with attribution
Not affiliated with StackOverflow