Question

I would like to solve two first-order ODEs on a microcontroller. They have to be evaluated every 100 ms.

x'=-k_{1}\cdot (x-x_{ref})\cdot e^{-b\cdot ((x-x_{obs})^{2}+(y-y_{obs})^{2})}
y'=-k_{1}\cdot (y-y_{ref})\cdot e^{-b\cdot ((x-x_{obs})^{2}+(y-y_{obs})^{2})}

Basically I thought of using Euler integration (first-order Runge-Kutta):

y(k+1)=y(k)+f(k,y(k))*dT
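
For concreteness, this is roughly what I have in mind (a sketch in C; k1, b and the reference/obstacle coordinates are placeholder values):

#include <math.h>

/* Placeholder constants for the sketch. */
static const double k1 = 1.0, b = 1.0;
static const double x_ref = 0.0, y_ref = 0.0;
static const double x_obs = 0.5, y_obs = 0.5;

/* Right-hand sides of the two ODEs. */
static double fx(double x, double y) {
    double d2 = (x - x_obs) * (x - x_obs) + (y - y_obs) * (y - y_obs);
    return -k1 * (x - x_ref) * exp(-b * d2);
}

static double fy(double x, double y) {
    double d2 = (x - x_obs) * (x - x_obs) + (y - y_obs) * (y - y_obs);
    return -k1 * (y - y_ref) * exp(-b * d2);
}

/* One forward Euler step; dT = 0.1 for the 100 ms update rate. */
static void euler_step(double *x, double *y, double dT) {
    double dx = fx(*x, *y);
    double dy = fy(*x, *y);
    *x += dx * dT;
    *y += dy * dT;
}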

I expect the error to be < 0.001. How do I determine how many iterations I should run until I reach that error bound?


Solution

I guess that x and y, as well as x_{ref}, y_{ref}, x_{obs} and y_{obs}, are time dependent. This limits the number of ODE solvers you can use: essentially only the Euler method and a second-order Runge-Kutta method (Heun's method), which evaluate the r.h.s. of your ODE only at the sample times t, t+dT, t+2dT, ...
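
For example, a Heun step for your system could look roughly like this (a sketch in C, reusing the hypothetical fx/fy right-hand-side functions from the question's sketch and assuming the ref/obs points are constant within one step):

/* One Heun (RK-II / explicit trapezoidal) step: Euler predictor,
   then correction with the averaged slope. Only the samples at
   t and t+dT are needed. */
static void heun_step(double *x, double *y, double dT) {
    double k1x = fx(*x, *y);
    double k1y = fy(*x, *y);
    /* Euler predictor */
    double xp = *x + k1x * dT;
    double yp = *y + k1y * dT;
    /* Slope at the predicted point (time t + dT) */
    double k2x = fx(xp, yp);
    double k2y = fy(xp, yp);
    /* Trapezoidal corrector */
    *x += 0.5 * (k1x + k2x) * dT;
    *y += 0.5 * (k1y + k2y) * dT;
}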

You can use classical step size control with these two methods. That is, you take one step with the Euler method and one step with the RK-II method from the same starting point. The difference between the two results is an indicator of the local error and can be used for classical step size control. Have a look at Numerical Recipes for more details.
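
A rough sketch of that error indicator, building on the euler_step and heun_step sketches above (sqrt comes from <math.h>):

/* Local error estimate: difference between the Euler and Heun results
   after one step from the same state. */
static double error_estimate(double x, double y, double dT) {
    double xe = x, ye = y;
    double xh = x, yh = y;
    euler_step(&xe, &ye, dT);
    heun_step(&xh, &yh, dT);
    double ex = xh - xe;
    double ey = yh - ye;
    return sqrt(ex * ex + ey * ey);
}

If this estimate exceeds your 0.001 tolerance, reduce the step size, e.g. split the 100 ms interval into N substeps of dT/N and repeat until the per-step error is acceptable.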

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow