Question

Eric Evans talks a lot about evolving models in DDD, so refactoring seems to be essential to DDD. When you have a relationally persisted state of the world, you can handle model changes with migrations that alter the database schema.

How can I cope with model changes when using event sourcing? If there are incompatible changes to an aggregate that would prevent replay of events, is there some sort of best practice? Or is it simply a "just don't"?


Solution 3

Events are just DTOs. As long as the event itself doesn't change, it doesn't matter how the model changes, provided you still have one object. If you do need to change the event, you can 'upgrade' it with the required properties; the Apply method will know what to do with it. I can't come up with something specific without knowing the details.
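As a minimal sketch of that idea (all names here are hypothetical, not from the question): if a new field is added to an event, old stored events can be 'upgraded' on deserialization with a default value, and the Apply method handles both versions uniformly.

```python
from dataclasses import dataclass

# Hypothetical event: v1 carried only 'amount'; v2 added 'currency'.
# The default value lets old v1 events be 'upgraded' transparently on load.
@dataclass
class MoneyDeposited:
    amount: int
    currency: str = "USD"

class Account:
    """Toy aggregate that rebuilds its state by applying events."""

    def __init__(self):
        self.balance = 0
        self.currency = None

    def apply(self, event):
        # The Apply method doesn't care whether the event was stored
        # before or after the schema change.
        if isinstance(event, MoneyDeposited):
            self.balance += event.amount
            self.currency = event.currency

acct = Account()
acct.apply(MoneyDeposited(amount=100))                  # replayed old v1 event
acct.apply(MoneyDeposited(amount=50, currency="EUR"))   # new v2 event
print(acct.balance)  # 150
```

The key design choice is that the upgrade lives at the event boundary (defaults or a converter), so the aggregate's Apply logic only ever sees the latest shape.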

If the model changes so much that you now have two Aggregate Roots (ARs) where previously there was one, you have new, different aggregates which won't use the old events. Basically, you start from the old AR, create the new ones, and generate the corresponding events, which will be specific to those ARs. So you don't really have a compatibility problem in this case.

Working with events is not as straightforward as 'classic' OOP and an RDBMS schema, but events are more flexible if you think in business terms and treat your objects as domain concepts. Changing the model means the business concept's definition or usage has changed as well, so you are now dealing with a different concept (a new one, as far as persistence is concerned).

Other tips

If there are incompatible changes to an aggregate that would prevent replay of events

You have essentially two options in this scenario:

  • Patch the older events in such a way that they are made compatible and events can be replayed from the beginning. The benefit here is that you don't lose the history but the downside is that you have to expend some effort to patch the old events.
  • Take a snapshot/memento of the aggregate at the point of the schema change and "re-base" the event stream from this point onward. The benefit here is that you don't have to spend any effort (with event sourcing you most likely have a snapshot mechanism in place). The downside being that you lose the ability to replay events from before the snapshot.

As a general rule of thumb, I'd say default to the second option unless you know for sure that you need to be able to go back and edit history from before the schema change.
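The second option can be sketched like this (a toy illustration with made-up names, not a specific framework's API): fold the pre-change events into a snapshot, then start a fresh, empty stream from that point onward.

```python
# Hypothetical snapshot/"re-base" helper: capture the aggregate's state at
# the schema change and begin a new event stream from that point.
def rebase_stream(old_events, fold, initial_state):
    """Fold old events into a snapshot; events before it are no longer replayed."""
    state = initial_state
    for event in old_events:
        state = fold(state, event)
    snapshot = {"version": len(old_events), "state": state}
    new_stream = []  # events written after the schema change go here
    return snapshot, new_stream

# Toy aggregate: a balance folded from deposit events.
old = [{"type": "Deposited", "amount": 100},
       {"type": "Deposited", "amount": 50}]
snap, stream = rebase_stream(old, lambda s, e: s + e["amount"], 0)
print(snap)  # {'version': 2, 'state': 150}
```

Replay now starts from `snap` instead of event zero, which is exactly why history before the snapshot is no longer reachable.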

I don't have much experience myself, but I came across a concept called upcasting.

Originally a concept from object-oriented programming, where "a subclass gets cast to its superclass automatically when needed", upcasting can also be applied to event sourcing. To upcast an event means to transform it from its original structure to its new structure. Unlike OOP upcasting, event upcasting cannot be fully automated, because the structure of the new event is unknown to the old event. Manually written upcasters have to be provided to specify how to upcast the old structure to the new one.
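A manually written upcaster might look like the following sketch (the event name, payloads, and registry are invented for illustration; Axon's actual API is Java-based and differs): each upcaster transforms one stored version into the next, and they are chained until the payload reaches the latest version.

```python
# Hypothetical hand-written upcaster: v1 stored a single 'name' field,
# v2 splits it into 'first_name' and 'last_name'.
def upcast_name_changed_v1_to_v2(payload):
    first, _, last = payload["name"].partition(" ")
    return {"first_name": first, "last_name": last}

# Registry mapping (event type, stored version) -> upcaster function.
UPCASTERS = {("NameChanged", 1): upcast_name_changed_v1_to_v2}

def load_event(event_type, version, payload, latest=2):
    """Chain upcasters until the payload matches the latest structure."""
    while version < latest:
        payload = UPCASTERS[(event_type, version)](payload)
        version += 1
    return payload

print(load_event("NameChanged", 1, {"name": "Ada Lovelace"}))
# {'first_name': 'Ada', 'last_name': 'Lovelace'}
```

Because upcasting happens on read, the stored history stays untouched; only the in-memory representation is migrated, one version step at a time.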

You can refer to Axon's documentation for more detail.

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow