Tracking class instantiations
One of the experiments I have been doing based on code reloading is adding the ability to react to instantiations of arbitrary classes. The way I go about this is to react to the definition of a class by giving it a new __init__ method.
If the class came with a custom __init__ method, then this is the easiest case to handle. When my replacement method is called, I can just call the original custom method before I notify whatever is interested.
    def init_wrapper(self, *args, **kwargs):
        class_.__real_init__(self, *args, **kwargs)
        events.Register(self)

What gets a little more complex is the case where the class does not have a custom __init__ method defined. In this situation, I still need to notify whatever is interested, but in order to preserve existing behaviour I need to pass the call down to the base classes.

    def init_standin(self, *args, **kwargs):
        if type(self) is types.InstanceType:
            # Old-style..
            for baseClass in class_.__bases__:
                if hasattr(baseClass, "__init__"):
                    baseClass.__init__(self, *args, **kwargs)
        else:
            # New-style..
            super(class_, self).__init__(*args, **kwargs)
        events.Register(self)

The thing that really complicates this is the class_ variable used in both snippets above. It needs to refer to the class on which the executing method is defined. I cannot infer that class from any of the existing variables, so it needs to be provided in some other way. Because I know it at the time I install the __init__ method, I define these functions inline so that it is captured in their closures.

    def MonitorClassInstantiation(self, class_):
        def init_wrapper(self, *args, **kwargs):
            class_.__real_init__(self, *args, **kwargs)
            events.Register(self)
        def init_standin(self, *args, **kwargs):
            if type(self) is types.InstanceType:
                for baseClass in class_.__bases__:
                    if hasattr(baseClass, "__init__"):
                        baseClass.__init__(self, *args, **kwargs)
            else:
                super(class_, self).__init__(*args, **kwargs)
            events.Register(self)
        init_real = class_.__dict__.get("__init__", None)
        if init_real is None:
            class_.__init__ = init_standin
        else:
            class_.__init__ = init_wrapper
            class_.__real_init__ = init_real

With this working, the remaining concern is the effect of a hierarchy of classes modified in this way. Each has one of these two new __init__ methods, and the subclasses call down through them to the base classes. For now, I have a WeakKeyDictionary of instances in the events handler that ignores calls from instances that have already called.

Source code: events.py
In the course of normal Python usage, metaclasses may be the only way to achieve some of the more complicated behaviours like these. But I am working in a framework which takes care of loading code, and which can transform it in the process. This gives me the ability to take more straightforward approaches.
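For comparison, here is a minimal sketch of the metaclass route alluded to above (my own illustration, not part of the framework): a metaclass can intercept instantiation in its __call__ method, without touching __init__ at all.

```python
class InstantiationTracker(type):
    """Metaclass that observes every instantiation of its classes."""
    def __call__(cls, *args, **kwargs):
        # type.__call__ runs the normal __new__/__init__ construction.
        instance = super().__call__(*args, **kwargs)
        print("instantiated:", cls.__name__)  # notify interested parties
        return instance

class Tracked(metaclass=InstantiationTracker):
    def __init__(self, value):
        self.value = value

t = Tracked(42)  # prints "instantiated: Tracked"
```

Because subclasses inherit the metaclass, an entire hierarchy is tracked with exactly one notification per instantiation, which incidentally sidesteps the duplicate-call problem. The cost is the relearning overhead the comment below describes.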
I have always found metaclasses to complicate code. They are the kind of thing that you come back to and have to spend time trying to relearn exactly what they are doing. On the other hand, patching __init__, as I have the ability to do here, is straightforward to do and to comprehend as a newcomer.