Java .equals() instanceof subclass? Why not call superclass equals instead of making it final?

StackOverflow https://stackoverflow.com/questions/18575294

27-06-2022

Question

It is stated in Object's .equals(Object) javadoc:

It is symmetric: for any non-null reference values x and y, x.equals(y) should return true if and only if y.equals(x) returns true.

Almost everywhere in example code I see an overridden .equals(Object) method which uses instanceof as one of the first tests, for example here: What issues / pitfalls must be considered when overriding equals and hashCode?

public class Person {
    private String name;
    private int age;

    public boolean equals(Object obj) {
        if (obj == null)
            return false;
        if (obj == this)
            return true;
        if (!(obj instanceof Person))
            return false;
        ...
    }

}

Now, with a class SpecialPerson extends Person whose equals contains:

        if (!(obj instanceof SpecialPerson))
            return false;

we cannot guarantee that .equals() is symmetric. This has been discussed, for example, here: any-reason-to-prefer-getclass-over-instanceof-when-generating-equals

Person a = new Person(), b = new SpecialPerson();

a.equals(b);    //sometimes true, since b instanceof Person
b.equals(a);    //always false

Maybe I should add, at the beginning of SpecialPerson's equals, a direct call to super?

    public boolean equals(Object obj) {
        if (!(obj instanceof SpecialPerson))
            return super.equals(obj);
        ...
        /* more equality tests here */
    }

Solution

A lot of the examples use instanceof for two reasons: a) it folds the null check and type check into one, or b) the example is for Hibernate or some other code-rewriting framework.

The "correct" (as per the JavaDoc) solution is to use this.getClass() == obj.getClass(). This works for Java because classes are singletons and the VM guarantees this. If you're paranoid, you can use this.getClass().equals(obj.getClass()) but the two are really equivalent.

This works most of the time. But sometimes, Java frameworks need to do "clever" things with the byte code. This usually means they create a subtype automatically. Since the subtype should be considered equal to the original type, equals() must be implemented in the "wrong" way but this doesn't matter since at runtime, the subtypes will all follow certain patterns. For example, they will do additional stuff before a setter is being called. This has no effect on the "equalness".

As you noticed, things start to get ugly when you have both cases: You really extend the base types and you mix that with automatic subtype generation. If you do that, you must make sure that you never use non-leaf types.

Other tips

You are missing something here. I will try to highlight this:

Suppose you have Person person = new Person() and Person personSpecial = new SpecialPerson(); then I am sure you would not like these two objects to be equal. So it is really working as required: equals must return false.

Moreover, symmetry requires that the equals() methods in both classes obey it at the same time. If one equals returns true and the other returns false, then I would say the flaw is in the equals override.

Your attempt at solving the problem is not correct. Suppose you have two subclasses, SpecialPerson and BizarrePerson. With this implementation, BizarrePerson instances could be equal to SpecialPerson instances. You generally don't want that.
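
To make that failure concrete, here is a self-contained sketch of the fallback-to-super approach from the question; the (name, age) constructor and field set are hypothetical additions of mine:

import java.util.Objects;

class Person {
    final String name;
    final int age;

    Person(String name, int age) { this.name = name; this.age = age; }

    @Override
    public boolean equals(Object obj) {
        if (obj == this) return true;
        if (!(obj instanceof Person)) return false;   // instanceof-based, as in the question
        Person other = (Person) obj;
        return age == other.age && Objects.equals(name, other.name);
    }

    @Override
    public int hashCode() { return Objects.hash(name, age); }
}

class SpecialPerson extends Person {
    SpecialPerson(String name, int age) { super(name, age); }

    @Override
    public boolean equals(Object obj) {
        if (!(obj instanceof SpecialPerson))
            return super.equals(obj);   // fall back to Person's instanceof-based check
        // in real code, SpecialPerson-specific fields would be compared here
        return super.equals(obj);
    }
}

class BizarrePerson extends Person {
    BizarrePerson(String name, int age) { super(name, age); }

    @Override
    public boolean equals(Object obj) {
        if (!(obj instanceof BizarrePerson))
            return super.equals(obj);
        return super.equals(obj);
    }
}

class Demo {
    public static void main(String[] args) {
        // Two unrelated subclasses compare equal because both fall back to Person.equals():
        System.out.println(new SpecialPerson("Ann", 30).equals(new BizarrePerson("Ann", 30))); // true
    }
}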

Don't use instanceof; use this.getClass() == obj.getClass() instead. Then you are checking for the exact class.

When working with equals you should always override hashCode too!

The hashCode method for Person could look like this:

@Override
public int hashCode()
{
    final int prime = 31;
    int result = 1;
    result = prime * result + age;
    result = prime * result + ((name == null) ? 0 : name.hashCode());
    return result;
}

and use it like this in your equals method:

if (this.hashCode() != obj.hashCode())
{
    return false;
}
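
Putting this answer's advice together, a sketch of the resulting equals for Person might look as follows; the field comparisons are assumed from the class shown in the question:

@Override
public boolean equals(Object obj)
{
    if (this == obj)
        return true;
    // exact-class check instead of instanceof, as recommended above
    if (obj == null || this.getClass() != obj.getClass())
        return false;
    // cheap shortcut: different hash codes mean the objects cannot be equal
    if (this.hashCode() != obj.hashCode())
        return false;
    Person other = (Person) obj;
    return age == other.age
            && (name == null ? other.name == null : name.equals(other.name));
}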

A type should not consider itself equal to an object of any other type--even a subtype--unless both objects derive from a common class whose contract specifies how descendants of different types should check for equality.

For example, an abstract class StringyThing could encapsulate strings, and provide methods to do things like convert to a string or extract substrings, but not impose any requirements on the backing format. One possible subtype of StringyThing might contain an array of StringyThing and encapsulate the value of the concatenation of all those strings.

Two instances of StringyThing would be defined as equal if conversion to strings would yield identical results. Comparison between two otherwise-indistinguishable StringyThing instances whose types know nothing about each other may have to fall back on that, but StringyThing-derived types could include code to optimize various cases.

For example, if one StringyThing represents "M repetitions of character ch" and another represents "N repetitions of the string St", and the latter type knows about the former, it could check whether St contains nothing but M/N repetitions of the character ch. Such a check would indicate whether or not the strings are equal, without having to "expand out" either one of them.
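
A rough sketch of that idea follows; every name and method below is invented here for illustration, since the answer gives no code. The point is that instanceof is safe when the base class itself pins down what equality means for all descendants:

// Hypothetical base class: equality is defined once, for all subtypes, by the
// contract "two StringyThings are equal iff they convert to the same string".
abstract class StringyThing {
    public abstract String asString();

    @Override
    public final boolean equals(Object obj) {
        if (this == obj)
            return true;
        if (!(obj instanceof StringyThing))
            return false;
        StringyThing other = (StringyThing) obj;
        // Subtypes may add cheaper shortcuts before reaching this fallback,
        // but the fallback defines what "equal" means for all of them.
        return this.asString().equals(other.asString());
    }

    @Override
    public final int hashCode() {
        return asString().hashCode();
    }
}

// One possible subtype: the concatenation of other StringyThings.
final class Concatenation extends StringyThing {
    private final StringyThing[] parts;

    Concatenation(StringyThing... parts) { this.parts = parts; }

    @Override
    public String asString() {
        StringBuilder sb = new StringBuilder();
        for (StringyThing part : parts)
            sb.append(part.asString());
        return sb.toString();
    }
}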

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow