Question

I apologize for the poor title, but I found it hard to describe the problem in a comprehensible way.

What I want to do is solve an ODE, but I don't want to start integrating at time = 0. I want the initial value, i.e. the starting point of the integration, to remain open to changes until the integration actually starts. I'll try to illustrate this with a piece of code:

model testModel "A test"
  parameter Real startTime = 10 "Starting time of integration";
  parameter Real a = 0.1 "Some constant";
  Real x;
  input Real x_init = 3;
initial equation
  x = x_init;
equation
  if time <= startTime then
    x = x_init;
  else
    der(x) = -a*x;
  end if;
end testModel;

Notice that x_init is declared as an input and can be changed continuously. This code yields an error message, and as far as I can tell, that is because x is defined both by an equation of the form der(x) = ... and one of the form x = .... The error message is:

Error: Singular inconsistent scalar system for der(x) = ( -(if time <= 10 then x-x_init else a*x))/((if time <= 10 then 0.0 else 1.0)) = -1e-011/0

I thought about writing

der(x) = 0

instead of

x = x_init

in the if-statement, which avoids the error message. The problem with such an approach, however, is that I lose the ability to modify x_init, i.e. the starting point of the integration, before the integration starts. Let's say, for instance, that x_init changes from 3 to 4 at time = 7.

Is there a work-around to perform what I want? Thanks.

(I'm going to use this to simulate several submodels as part of a network, but the submodels are not going to be initialized at the same time, hence the startTime variable and the ability to change the initial condition before integration starts.)

Suggested solution: I've tried out the following:

when time >= startTime then
  reinit(x, x_init);
end when;

in combination with the der(x) = 0 alternative. This seems to work. Other suggestions are welcome.
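
For reference, here's a complete sketch of what that combination could look like (the model name testModelHold is just for illustration):

model testModelHold "Hold x constant, then re-initialize at startTime"
  parameter Real startTime = 10 "Starting time of integration";
  parameter Real a = 0.1 "Some constant";
  Real x;
  input Real x_init = 3;
initial equation
  x = x_init;
equation
  // Before startTime, x is simply held constant; the when-clause
  // resets it to the current value of x_init at startTime.
  if time <= startTime then
    der(x) = 0;
  else
    der(x) = -a*x;
  end if;
  when time >= startTime then
    reinit(x, x_init);
  end when;
end testModelHold;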


Solution

If your input is differentiable, this should work:

model testModel "A test"
  parameter Real startTime = 10 "Starting time of integration";
  parameter Real a = 0.1 "Some constant";
  Real x;
  input Real x_init = 3;
initial equation
  x = x_init;
equation
  if time <= startTime then
    der(x) = der(x_init);
  else
    der(x) = -a*x;
  end if;
end testModel;
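
As a usage sketch, the input could then be supplied with a differentiable expression from an enclosing model via a modifier (the model name testModelDriver and the sine signal are just illustrative assumptions):

model testModelDriver "Drive x_init with a smooth, differentiable signal"
  // The sine expression is only an example of a differentiable input;
  // any signal the tool can differentiate would do.
  testModel m(startTime = 10, x_init = 3 + sin(0.5*time));
end testModelDriver;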

Otherwise, I suspect the best you could do would be to have your x variable be a very fast first-order tracker before startTime.
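
For the non-differentiable case, a minimal sketch of that tracker idea (the time constant tau is an assumption; pick it much faster than the dynamics of x_init):

model testModelTracker "Track x_init quickly before startTime"
  parameter Real startTime = 10 "Starting time of integration";
  parameter Real a = 0.1 "Some constant";
  parameter Real tau = 1e-3 "Tracker time constant (assumed; much faster than x_init)";
  Real x;
  input Real x_init = 3;
initial equation
  x = x_init;
equation
  if time <= startTime then
    // Fast first-order lag: x follows x_init closely without needing der(x_init)
    der(x) = (x_init - x)/tau;
  else
    der(x) = -a*x;
  end if;
end testModelTracker;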

The fundamental issue here is that you are trying to model a variable-index DAE. None of the Modelica tools I'm aware of support variable-index systems like this.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow