Is the LSP restriction on strengthening of preconditions in conflict with the suggestion that the need to downcast indicates bad design?

StackOverflow https://stackoverflow.com/questions/23222819

Question

I've recently started reading about the Liskov substitution principle (LSP) and I'm struggling to fully comprehend the implications of the restriction that "Preconditions cannot be strengthened in a subtype". It would seem to me that this restriction is in conflict with the design principle that suggests that one should minimize or avoid entirely the need to downcast from a base to a derived class.

That is, I start with an Animal class, and derive the animals Dog, Bird, and Human. The LSP restriction on preconditions clearly fits with nature, in so far as no dog, bird, or human should be more constrained than the general class of animal. Sticking to LSP, the derived classes would then add special features, such as Bird.fly() or Human.makeTool() that are not common to Animal.

It feels a bit absurd for the base class Animal to have virtual methods for every possible feature of every possible animal subtype, but if it doesn't then I would need to downcast an Animal reference to its underlying subtype to access those unique features. This need to downcast, however, is generally considered to be a red flag for bad design. Wikipedia even goes so far as to suggest that it's because of LSP that downcasting is considered bad practice.
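To make the tension concrete, here is a minimal C# sketch (all class and method names are hypothetical, following the question's own example): the Bird-specific feature is only reachable from an Animal reference through the kind of downcast the question is worried about.

    abstract class Animal { }

    class Dog : Animal { }

    class Bird : Animal
    {
        // A feature that makes no sense on Animal itself.
        public void Fly() { }
    }

    static class Zoo
    {
        public static void Release(Animal animal)
        {
            // The only way to reach the Bird-specific feature from an
            // Animal reference is a downcast, the very thing usually
            // flagged as a design smell.
            if (animal is Bird bird)
            {
                bird.Fly();
            }
        }
    }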

So what am I missing?

Bonus question: Consider again the class hierarchy of Animals described above. Clearly it would be an LSP violation if Animal.setWeight(weight) required only a non-negative number, but Human.setWeight(weight) strengthened this precondition and required a non-negative number less than 1000. But what about the constructor for Human, which might look like Human(weight, height, gender)? Would it be an LSP violation if the constructor imposed the limit on weight? If so, how should this hierarchy be redesigned to respect clear boundaries on physical properties of derived animals?
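In code, the two cases in the bonus question look roughly like this (a hypothetical sketch, not a recommended design); the override is the part the question identifies as a clear LSP violation, while the constructor check is the part being asked about.

    using System;

    class Animal
    {
        public virtual void SetWeight(double weight)
        {
            // Base precondition: any non-negative weight is accepted.
            if (weight < 0) throw new ArgumentOutOfRangeException(nameof(weight));
        }
    }

    class Human : Animal
    {
        // The part the question asks about: the constructor imposes
        // the tighter limit on weight.
        public Human(double weight, double height, string gender)
        {
            if (weight < 0 || weight >= 1000)
                throw new ArgumentOutOfRangeException(nameof(weight));
            SetWeight(weight);
        }

        // The part the question calls a clear LSP violation: the
        // override strengthens the precondition seen by clients of Animal.
        public override void SetWeight(double weight)
        {
            if (weight >= 1000) throw new ArgumentOutOfRangeException(nameof(weight));
            base.SetWeight(weight);
        }
    }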


Solution

LSP is all about behavioral subtyping. Roughly speaking, B is a subtype of A if it can always be used where A is expected. Moreover, such usage shouldn't change the expected behavior.

So, when applying the LSP, the main point is what the "expected behavior of A" actually is. In your example, A is Animal, and it is not that simple to design a useful Animal interface that is common to all animals.

Sticking to LSP, the derived classes would then add special features, such as Bird.fly() or Human.makeTool() that are not common to Animal.

Not quite. The LSP assumes that you deal only with Animals, as if it were not possible to downcast at all. So your Human, Bird, and other animals can have any methods, constructors, or whatever else; that is not related to the LSP at all. They just have to behave as expected when used as Animals.
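In other words, substitutability is judged from the point of view of code that only knows about Animal. A hypothetical client like the one below (reusing the Animal/Human sketch from the question) is what the LSP protects: an override of SetWeight that rejects weights Animal promises to accept would break it, while the Human constructor from the bonus question never enters the picture.

    static class Caretaker
    {
        // Written purely against the Animal contract: any non-negative
        // weight must be accepted.
        public static void Fatten(Animal animal)
        {
            animal.SetWeight(5000);
        }
    }

    // Caretaker.Fatten(new Human(70, 1.8, "F")) fails only if Human's
    // SetWeight override strengthens the precondition; the constructor's
    // own checks ran before the object was ever used as an Animal, so
    // they cannot affect this call.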

The problem is that such interfaces are very limited. In practice, we often have to use type switching to let birds fly and people make useful tools.

Two common approaches in mainstream OOP languages are:

  1. Downcasting
  2. Visitor pattern

There is nothing wrong with downcasting in this context, because this is how you usually do type switching in languages that do not support native variant types. You can spend a lot of time introducing hierarchies of interfaces to avoid explicit downcasting, but usually it just makes code less readable and harder to maintain.
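As a rough sketch of both approaches with the question's hypothetical types: option 1 switches on the concrete type directly, while option 2 moves the same dispatch behind a visitor interface so the set of cases is checked by the compiler.

    // Option 1: explicit type switching via downcasts.
    static class SpecialFeatures
    {
        public static void Use(Animal animal)
        {
            switch (animal)
            {
                case Bird bird: bird.Fly(); break;
                case Human human: human.MakeTool(); break;
                default: break; // plain Animals have no special feature
            }
        }
    }

    // Option 2: the visitor pattern; each subtype dispatches to the
    // matching Visit overload (double dispatch), so no casts appear
    // in client code.
    interface IAnimalVisitor
    {
        void Visit(Bird bird);
        void Visit(Human human);
    }

    abstract class Animal
    {
        public abstract void Accept(IAnimalVisitor visitor);
    }

    class Bird : Animal
    {
        public void Fly() { }
        public override void Accept(IAnimalVisitor visitor) => visitor.Visit(this);
    }

    class Human : Animal
    {
        public void MakeTool() { }
        public override void Accept(IAnimalVisitor visitor) => visitor.Visit(this);
    }

    class SpecialFeatureVisitor : IAnimalVisitor
    {
        public void Visit(Bird bird) => bird.Fly();
        public void Visit(Human human) => human.MakeTool();
    }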

Other tips

Many aspects of programming involve tradeoffs, and the SOLID principles are among them. If there is some kind of action which can be done in the same way for nearly all derivatives of a class or implementations of an interface, and which isn't really part of the main purpose of the interface, but which a few particular derivatives or implementations may have a better way of doing, the Interface Segregation Principle would suggest that such actions not be included in the common interface (*).

In such cases, it may be helpful for code which receives a reference of non-specific type to check whether the actual object has certain "special" features and use them if so. For example, code which receives an IEnumerable<Animal> and wants to know how many items it contains may check whether it implements ICollection<Animal> or the non-generic ICollection [note that List<Cat> implements the latter but not the former] and, if so, downcast and use the Count property. There is nothing wrong with downcasting in such cases, since the method doesn't require that passed-in instances implement those interfaces; it merely works better when they do.

(*) IMHO, IEnumerable should have included a method to describe properties of the sequence, such as whether the count was known, whether it would forevermore contain the same items, etc., but it doesn't.
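A sketch of that IEnumerable<Animal> example in C# (the Animal and Cat types are the question's hypothetical ones; IEnumerable<T>, ICollection<T>, and ICollection are the real .NET interfaces):

    using System.Collections;
    using System.Collections.Generic;
    using System.Linq;

    class Animal { }
    class Cat : Animal { }

    static class AnimalCounting
    {
        public static int CountAnimals(IEnumerable<Animal> animals)
        {
            // Fast paths: downcast when the sequence happens to carry a count.
            // A List<Cat> passed in via covariance implements the non-generic
            // ICollection but not ICollection<Animal>.
            if (animals is ICollection<Animal> generic) return generic.Count;
            if (animals is ICollection nonGeneric) return nonGeneric.Count;

            // The contract only promises IEnumerable<Animal>, so fall back to
            // enumerating; the fast paths are an optimization, not a requirement.
            return animals.Count();
        }
    }

    // CountAnimals(new List<Animal>()) uses ICollection<Animal>.Count,
    // CountAnimals(new List<Cat>()) uses the non-generic ICollection.Count,
    // and a lazily generated sequence falls back to enumeration.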

Another use of downcasting occurs in cases where one will have a collection of groups of objects, and know that the particular object instances within each group are "compatible" with each other, even though objects in one group may not be compatible with objects in another. For example, the MaleCat.MateWith() method may only accept an instance of FemaleCat, and FemaleKangaroo.MateWith() may only accept an instance of MaleKangaroo, but the most practical way for Noah to have a collection of mating pairs of animals would be for each type of animal to have a MateWith() method that accepts an Animal and downcasts to the proper type (and probably also to have a CanMateWith() property). If a MatingPair is constructed containing a FemaleHamster and a MaleWolf, an attempt to invoke the Breed() method on that pair would fail at runtime, but if code avoids construction of incompatible mating pairs, such failures should never occur. Note that generics can substantially reduce the need for this sort of downcasting, but not entirely eliminate it.
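A rough C# sketch of that MatingPair idea (all types hypothetical): each MateWith accepts an Animal and downcasts, CanMateWith guards construction, and an incompatible pair would only fail at runtime inside Breed().

    using System;

    abstract class Animal
    {
        public abstract bool CanMateWith(Animal other);
        public abstract void MateWith(Animal other);
    }

    class MaleCat : Animal
    {
        public override bool CanMateWith(Animal other) => other is FemaleCat;
        public override void MateWith(Animal other)
        {
            var mate = (FemaleCat)other; // downcast; throws for any other subtype
            // ... cat-specific breeding ...
        }
    }

    class FemaleCat : Animal
    {
        public override bool CanMateWith(Animal other) => other is MaleCat;
        public override void MateWith(Animal other)
        {
            var mate = (MaleCat)other;
            // ...
        }
    }

    class MatingPair
    {
        private readonly Animal _first;
        private readonly Animal _second;

        public MatingPair(Animal first, Animal second)
        {
            // Code that only builds compatible pairs never triggers the
            // runtime failure inside MateWith.
            if (!first.CanMateWith(second))
                throw new ArgumentException("Incompatible mating pair");
            _first = first;
            _second = second;
        }

        public void Breed() => _first.MateWith(_second);
    }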

In determining whether downcasting violates the LSP, the $50,000 question is whether a method will uphold its contract for anything that might get passed in. If the MateWith() method's contract specifies that it is only guaranteed to behave usefully on particular instances of Animal for which CanMateWith() has returned true, the fact that it will fail when given some subtypes of Animal would not be an LSP violation. In general, it's useful to have methods reject at compile time objects whose type isn't guaranteed to be usable, but in some cases code may have knowledge about relationships among the types of certain object instances which cannot be expressed syntactically [e.g. the fact that a MatingPair will hold two Animal instances that can be successfully bred]. While downcasting is often a code smell, there's nothing wrong with it when it's used in ways consistent with an object's contract.

License: CC-BY-SA with attribution