Question

Given a metaclass, or more simply type(), the last argument is the class dict, the one that holds the class variables. I was wondering: is there a way to set instance variables from within a metaclass?

The reason I am using a metaclass is that I need to collect some variables defined at class creation and process them. The result of this processing then needs to be attached to each instance of the class, not to the class itself, since the result will be a list that differs from one instance of the class to another.

I will provide an example so it's easier to follow.


I have a Model class, defined as follows:

class Model(object):
    __metaclass__ = ModelType

    has_many = None

    def __init__(self, **kwargs):
        self._fields = kwargs

The has_many class variable will be filled in at class definition by whichever model needs it. In __init__(), I simply assign the keyword arguments provided at model instantiation to the instance variable _fields.

I then have the ModelType metaclass:

class ModelType(type):
    def __new__(cls, name, bases, dct):
        if 'has_many' in dct:
            # Record the has_many declaration as a restricted field.
            dct.setdefault('_restricted_fields', []).append(dct['has_many'])
        return super(ModelType, cls).__new__(cls, name, bases, dct)

What I need to do here is pretty straightforward: I check whether someone defined the has_many class variable on their custom model class, and if so, add the variable's content to a list of restricted fields.

And now comes the important part. When instantiating a model, I need _fields to be composed of both the keyword arguments used to instantiate the model and the restricted fields, as processed in the metaclass. This would be fairly easy if _fields were a class variable on Model, but since it is an instance variable, I don't know how I can also add the restricted fields to it.


That said, is there any way of achieving this? Or is there a better way of handling it? (I thought of using only the metaclass to set a _restricted_fields class variable on the model, and then using that variable within the class's __init__() to add the restricted fields to the normal fields, but the model class would very quickly become cluttered with code which, in my opinion, should live in another part of the codebase.)
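For reference, here is a minimal, self-contained sketch of that class-variable alternative; the Article class and the way the restricted fields are merged into _fields are made up for illustration:

class ModelType(type):
    def __new__(cls, name, bases, dct):
        # Collect the has_many declaration (if any) into a class-level list.
        dct['_restricted_fields'] = [dct['has_many']] if dct.get('has_many') else []
        return super(ModelType, cls).__new__(cls, name, bases, dct)

class Model(object):
    __metaclass__ = ModelType
    has_many = None

    def __init__(self, **kwargs):
        # Merge the class-level restricted fields into the per-instance fields.
        self._fields = dict.fromkeys(self._restricted_fields)
        self._fields.update(kwargs)

class Article(Model):
    has_many = 'comments'

a = Article(title='Hello')
print a._fields   # e.g. {'comments': None, 'title': 'Hello'}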


Solution

Using a metaclass is not the correct approach here. A metaclass modifies class-creation behavior, not instance-creation behavior. To modify instance creation, override __init__ or __new__. Wanting to use a metaclass for such things is using a hammer instead of a screwdriver to put a screw in a wall. ;-)

I'd suggest you use __new__ to achieve what you want. From the Python docs:

__new__() is intended mainly to allow subclasses of immutable types (like int, str, or tuple) to customize instance creation. It is also commonly overridden in custom metaclasses in order to customize class creation.

class MetaModel(type):
    def __new__(cls, name, bases, attrs):
        attrs['_restricted_fields'] = [attrs.get('has_many')]
        return type.__new__(cls, name, bases, attrs)

class Model(object):
    __metaclass__ = MetaModel
    has_many = None
    def __new__(cls, *args, **kwargs):
        instance = object.__new__(cls, *args, **kwargs)
        instance.instance_var = ['spam']
        return instance

class SubModel(Model):
    has_many = True
    def __init__(self):
        # No super call here.
        self.attr = 'eggs'

s = SubModel()
assert s._restricted_fields == [True]  # Added by MetaModel
assert s.instance_var == ['spam']  # Added by Model.__new__
assert s.attr == 'eggs'  # Added by __init__

# instance_var is added per instance.
assert SubModel().instance_var is not SubModel().instance_var

The MetaModel is responsible for creating Model classes. It adds a _restricted_fields class variable to any Model class created by it (the value is a list containing the has_many class variable).

The Model class defines a default has_many class variable. It also modifies instance-creation behavior: it adds an instance_var attribute to each created instance.

The SubModel class is created by a user of your code. It defines an __init__ method to modify instance creation. Note that it does not call any superclass method; that is not required. The __init__ adds an attr attribute to each SubModel instance.

Other tips

The place to set up instance variables is in the class's __init__ method. But if you don't want to make each class that uses your metaclass include the same code, why not have the metaclass provide its own __init__ method for the class, wrapping around whatever existing __init__ method is there:

class MyMetaclass(type):
    def __new__(mcs, name, bases, dct):
        if 'has_many' in dct:
            dct.setdefault('_restricted_fields', []).append("something")

        orig_init = dct.get("__init__") # will be None if there was no __init__

        def init_wrapper(self, *args, **kwargs):
            if orig_init:
                orig_init(self, *args, **kwargs)          # call original __init__
            self._fields = getattr(self, "_fields", [])   # make sure _fields exists
            self._fields.extend(["whatever"])             # make our own additions

        dct["__init__"] = init_wrapper   # replace original __init__ with our wrapper

        return type.__new__(mcs, name, bases, dct)

I don't entirely understand what the actual logic you want to implement is, but you can replace self._fields.extend(["whatever"]) in the wrapper function with whatever it is you actually want to do. Note that self in that context is the instance, not the class or metaclass. The wrapper function is a closure, so you also have access to the metaclass via mcs and the class dictionary via dct if you need them.
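For instance, combined with the MyMetaclass definition above, a hypothetical Person class (the class name and its 'friends' value are made up for illustration) would behave roughly like this:

class Person(object):
    __metaclass__ = MyMetaclass
    has_many = 'friends'

    def __init__(self, name):
        self.name = name

p = Person('Alice')
print p.name                # 'Alice', set by the original __init__
print p._fields             # ['whatever'], added by the wrapper
print p._restricted_fields  # ['something'], a class variable reachable via the instance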

If it needs to be set per instance, then __init__ is the place to put the code that calculates it, whatever it is.

However, be aware that class variables are available through instances, unless shadowed by an instance variable.

You could do this all with a factory instead of a metaclass.

The factory can create types and then also create instances of those types, doing the latter at initialization time rather than at class-creation time. That keeps the related knowledge in one place (the factory) without trying to shoehorn two different kinds of work into one operation.
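A minimal sketch of that factory idea; ModelFactory, create_class, create_instance and the field-merging rule are illustrative assumptions, not part of the original suggestion:

class ModelFactory(object):
    """Keeps class-creation and instance-creation knowledge in one place."""

    def create_class(self, name, has_many=None):
        # Class-creation work: remember the restricted fields on the new type.
        restricted = [has_many] if has_many else []
        return type(name, (object,), {'_restricted_fields': restricted})

    def create_instance(self, model_class, **kwargs):
        # Instance-creation work: merge restricted fields with the keyword arguments.
        instance = model_class()
        instance._fields = dict.fromkeys(model_class._restricted_fields)
        instance._fields.update(kwargs)
        return instance

factory = ModelFactory()
Article = factory.create_class('Article', has_many='comments')
a = factory.create_instance(Article, title='Hello')
print a._fields   # e.g. {'comments': None, 'title': 'Hello'}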

On the other hand, you could also dynamically compose a function to be run at init time in each instance, using information you had at class-creation time, via a closure:

class ExMeta(type):
    def __new__(cls, name, parents, kwargs):
        kwargs['ADDED_DYNAMICALLY'] = True

        secret_knowledge = 42
        def fake_init(*args, **kwargs):
            print "hello world... my args are:", args, kwargs
            print "the meaning of the universe is ", secret_knowledge
        kwargs['_post_init'] = fake_init
        return super(ExMeta, cls).__new__(cls, name, parents, kwargs)

class Example(object):
    SOME_ATTR = 1
    SOME_OTHER = 2
    __metaclass__ = ExMeta       

    def __init__(self):
        self._post_init()


bob = Example()
> hello world... my args are: (<__main__.Example object at 0x0000000002425BA8>,) {}
> the meaning of the universe is  42

Update, which might show the kind of thing I was thinking of:

class MetaKlass(object):
    def __new__(cls, *args, **kwargs):
        class_methods = {'foo'}
        _ins = super(MetaKlass, cls).__new__(cls)
        for k, v in kwargs.iteritems():
            if k in class_methods:
                _do = _ins.__getattribute__(k)
                # Self monkey-patching: the classmethod is replaced by its output.
                setattr(_ins, k, _do(v))
            else:
                setattr(_ins, k, v)
        return _ins

    def __init__(self, arg1, arg2, *args, **kwargs):
        self.a = arg1
        self.b = arg2

    @classmethod
    def foo(cls, x):
        return 42

So we get:

>>> from overflow_responses import MetaKlass
>>> klass_dict = {'foo': 1, 'angel': 2}
>>> k = MetaKlass(3, 7, **klass_dict)
>>> k.foo, k.angel, k.a, k.b
(42, 2, 3, 7)

Are you looking for something like this?

>>> kwd = {'a': 2}  # the 'class variables dict'
>>> class MetaKlass(object):
...     def __new__(cls, kwd):
...             _ins = super(MetaKlass, cls).__new__(cls)
...             for k, v in kwd.iteritems():
...                     setattr(_ins, k, v)
...             return _ins
...
>>> mk = MetaKlass(kwd)
>>> mk.a
2
>>>

The code should be fairly straightforward. When I organize a collection of methods into a class, many of those methods rely on both a set of class values and specific instance values.

Licensed under: CC-BY-SA with attribution