Issue
I have a number of Python classes, say Class1, Class2, Class3, etc., from a library/package. I want to extend all of these classes with some common functionality. If I extend each class individually, I introduce a lot of redundancy and break the "Don't Repeat Yourself" principle. So my thought is to have a Base class and use it to extend the other classes. For example:
class Base:
    def __init__(self):
        # I want self.base_attr_1, self.base_attr_2 and so on...
        pass

    def base_method_1(self, *args, **kwargs):
        pass

    def base_method_2(self, *args, **kwargs):
        pass

    # and so on...
Then we could extend Class1, Class2, and so on, maybe using multiple inheritance. Say:

class Class1(Class1, Base):
    pass

class Class2(Class2, Base):
    pass

# and so on...
So that in the end, when I create an object of Class1, Class2, etc., I can use the Base class attributes and methods. Like:

class_1 = Class1(*args, **kwargs)
class_2 = Class2(*args, **kwargs)
print(class_1.base_attr_1)
print(class_1.base_attr_2)
class_1.base_method_1(*args, **kwargs)
class_2.base_method_2(*args, **kwargs)
# and so on...
Please explain how to implement Class1, Class2, etc., so that they extend the Base class. Any help is highly appreciated. Thank you.
Solution
Following your description, there are two possibilities to handle your issue:
- a metaclass
- a decorator
If I were you, I would try something like this (decorator solution):
from functools import wraps

def deco(cls):
    def test(x):
        return x ** 2

    # Collect everything defined locally so far, except the class itself.
    d = {k: v for k, v in locals().items() if k != "cls"}

    @wraps(cls)
    def wrapper(*args, **kwargs):
        o = cls(*args, **kwargs)
        # o.test = test  # setattr(o, "test", test) will be the better solution
        #                # if you have more elements you'd like to add.
        # Generalized:
        # ============
        for k, v in d.items():
            setattr(o, k, v)
        return o

    # Fast replacement for @wraps, but not exhaustive!
    # wrapper.__doc__ = cls.__doc__
    # wrapper.__name__ = cls.__name__
    return wrapper
@deco
class A(object):
    pass

a = A()
print(a.__dict__)
print(a.test(10))
Result:
{'test': <function deco.<locals>.test at 0x...>}
100
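Applied to the scenario in the question, the same decorator wraps each library class in turn. A self-contained sketch (the `deco` here is a trimmed-down copy of the one above, and Class1/Class2 are hypothetical stand-ins for the library classes):

```python
def deco(cls):
    """Return a factory that builds cls instances with extra attributes."""
    def test(x):
        return x ** 2

    def wrapper(*args, **kwargs):
        o = cls(*args, **kwargs)
        setattr(o, "test", test)  # attach the shared functionality
        return o
    return wrapper

# Hypothetical stand-ins for the library classes.
class Class1:
    def __init__(self, n):
        self.n = n

class Class2:
    pass

Class1 = deco(Class1)  # same effect as stacking @deco on the definition
Class2 = deco(Class2)

c1 = Class1(5)
c2 = Class2()
print(c1.n, c1.test(10))  # 5 100
print(c2.test(3))         # 9
```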
As mentioned above, here is my solution with a metaclass (Python 3 syntax):
class MyMeta(type):
    def __new__(cls, clsname, bases, clsdict):
        def test1(self, x):
            return x ** 2

        def test2(self, x):
            return x * 10

        # Filter out the elements that are in the __new__ signature.
        tmp = {k: v for k, v in locals().items()
               if k not in ("cls", "clsname", "bases", "clsdict")}
        for k, v in tmp.items():
            clsdict[k] = v
        return type.__new__(cls, clsname, bases, clsdict)

class A(metaclass=MyMeta):  # in Python 2: __metaclass__ = MyMeta in the body
    pass
a = A()
print(a.__dict__)
print(A.__dict__)
print(a.test1(10))
print(a.test2("ok_?"))
Result:
{}
mappingproxy({'__module__': '__main__', '__qualname__': 'A', 'test1': <function MyMeta.__new__.<locals>.test1 at 0x...>, 'test2': <function MyMeta.__new__.<locals>.test2 at 0x...>, '__dict__': <attribute '__dict__' of 'A' objects>, '__weakref__': <attribute '__weakref__' of 'A' objects>, '__doc__': None})
100
ok_?ok_?ok_?ok_?ok_?ok_?ok_?ok_?ok_?ok_?
The difference here is that test1 and test2 don't live on the instance but on the class itself, which means that if you subclass these classes, the methods are part of the structure of the new classes as well.
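That inheritance claim can be checked with a small self-contained sketch (a trimmed-down version of the metaclass above):

```python
class MyMeta(type):
    def __new__(cls, clsname, bases, clsdict):
        # Inject a shared method into every class built by this metaclass.
        clsdict["test1"] = lambda self, x: x ** 2
        return type.__new__(cls, clsname, bases, clsdict)

class A(metaclass=MyMeta):
    pass

class B(A):  # a plain subclass; no extra wiring needed
    pass

print(B().test1(4))  # 16
print(type(B))       # the metaclass itself is inherited by subclasses too
```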
Which solution is better depends on your use case :)
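For completeness: the plain multiple-inheritance approach from the question also works and needs no decorator or metaclass at all. A minimal sketch, using hypothetical stand-ins for the library classes (rebinding the original name, as in the question, is legal because the base classes are looked up before the name is reassigned, but a fresh name is clearer):

```python
# Hypothetical stand-ins for the library classes.
class Class1:
    def __init__(self, value):
        self.value = value

class Base:
    base_attr_1 = "shared"

    def base_method_1(self):
        return self.value ** 2

class ExtendedClass1(Class1, Base):
    pass

obj = ExtendedClass1(10)
print(obj.base_attr_1)      # shared
print(obj.base_method_1())  # 100
```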
Answered By - baskettaz