I'm defining a Python object as being "immutable at any depth" iff

  1. it is (nominally) immutable; and
  2. if it is a "container" object, then it contains only objects that are "immutable at any depth";

For example ((1, 2), (3, 4)) is immutable at any depth, whereas ((1, 2), [3, 4]) isn't (even though the latter, by virtue of being a tuple, is "nominally" immutable).

Is there a reasonable way to test whether a Python object is "immutable at any depth"?

It is relatively easy to test for the first condition (e.g. using the collections.abc.Hashable ABC, and neglecting the possibility of an improperly implemented __hash__ method), but the second condition is harder to test for, because of the heterogeneity of "container" objects and the varied means of iterating over their "contents"...

Thanks!


Solution

There are no general tests for immutability. An object is immutable only if none of its methods can mutate the underlying data.

More likely, you're interested in hashability, which usually depends on immutability. Hashable containers (e.g. tuples and frozensets) recursively hash their contents, so your test amounts to running hash(obj): if it succeeds, the object was deeply hashable.

In other words, your code already uses the best test available:

>>> a = ((1, 2), (3, 4))
>>> b = ((1, 2), [3, 4])
>>> hash(a)
5879964472677921951
>>> hash(b)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: unhashable type: 'list'
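The hash-based test above can be wrapped in a small reusable predicate (the function name is mine):

```python
def is_deeply_hashable(obj):
    """Return True iff hash(obj) succeeds. Built-in immutable containers
    such as tuples and frozensets hash their contents recursively, so a
    nested unhashable object (e.g. a list) makes the whole call fail."""
    try:
        hash(obj)
        return True
    except TypeError:
        return False

print(is_deeply_hashable(((1, 2), (3, 4))))  # True
print(is_deeply_hashable(((1, 2), [3, 4])))  # False: the nested list is unhashable
```

Note the caveat discussed below still applies: this only tracks hashability, which is a proxy for immutability, not a guarantee of it.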

Other tips

I imagine you're looking for something like this:

def deeply_hashable(obj):
    # Strings are hashable but iterate into one-character strings,
    # which would recurse forever, so treat them as atomic.
    if isinstance(obj, (str, bytes)):
        return True
    try:
        hash(obj)
    except TypeError:
        return False
    try:
        iter(obj)
    except TypeError:
        return True
    return all(deeply_hashable(o) for o in obj)

One obvious problem here is that iterating over a dict iterates over its keys, which are always immutable, rather than its values, which is what you're interested in. There is no easy way around this, aside of course from special-casing dict -- which doesn't help with other classes that might behave similarly but are not derived from dict. In the end, I agree with delnan: there's no simple, elegant, general way to do this.
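For what it's worth, the special-casing can be widened from dict to any Mapping by recursing into values() instead of iterating directly. This is only a sketch (names are mine), and the mapping branch only matters for hypothetical hashable mapping types such as a frozendict, since a plain dict already fails the hash() check:

```python
from collections.abc import Mapping

def deeply_hashable2(obj):
    # Strings are treated as atomic to avoid infinite recursion.
    if isinstance(obj, (str, bytes)):
        return True
    try:
        hash(obj)
    except TypeError:
        return False
    if isinstance(obj, Mapping):
        # Keys are hashable by construction; the values are what matter.
        contents = obj.values()
    else:
        try:
            contents = iter(obj)
        except TypeError:
            return True
    return all(deeply_hashable2(o) for o in contents)
```

Classes that iterate like a dict but don't register as a Mapping still slip through, which is exactly the "no general way" caveat above.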

I'm not sure about what you are looking for exactly. But using your example data:

>>> a = ((1, 2), (3, 4))
>>> b = ((1, 2), [3, 4])
>>> isinstance(a, collections.abc.Hashable)
True
>>> isinstance(b, collections.abc.Hashable)
True

Hence, an isinstance check against collections.abc.Hashable isn't the way to go. However,

>>> hash(a)
5879964472677921951
>>> hash(b)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: unhashable type: 'list'

So, at least for the example data, calling hash is enough to verify whether an object is deeply hashable. Of course, as you already pointed out in your question, if __hash__ is incorrectly implemented for a subclass of, say, list, then this check won't work.

It absolutely makes sense to have such a test!

Consider the cost of deepcopy()-ing (or manually clone()-ing) an object versus a simple reference assignment!

Imagine two entities that need to own one and the same object, but rely on it not being changed (dict keys are a good example).

Then a reference assignment is safe if and only if deep immutability can be verified.
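To make that trade-off concrete, here is a small illustration (the data values are made up):

```python
import copy

# Deeply immutable: both owners can safely share one reference.
config = (("host", "example.com"), ("retries", 3))
owner_a = config
owner_b = config
assert owner_a is owner_b            # no copy was needed
lookup = {config: "shared"}          # and it can serve as a dict key

# With a mutable payload, safety requires a (potentially expensive) deep copy.
mutable_config = (("host", "example.com"), ["retries", 3])
owner_c = copy.deepcopy(mutable_config)
assert owner_c == mutable_config and owner_c is not mutable_config
assert owner_c[1] is not mutable_config[1]   # the nested list was cloned
```

The reference assignment is O(1) regardless of size; the deep copy walks the whole structure.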

I would consider recursively testing for something like

def check(candidate):
    # Atomic immutable built-ins (in Python 3, long is merged into int).
    if isinstance(candidate, (str, bytes, int, float, complex)):
        return True
    # A tuple is deeply immutable iff every member is.
    elif isinstance(candidate, tuple):
        return all(check(x) for x in candidate)
    else:
        return False
Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow