I started learning Python today. I'm trying to write decoupled middleware, i.e. a chain-of-responsibility pattern.
In my app I have two classes:
- an abstract class (interface) named Processable
- a class ProcessorChain, which must locate implementations of Processable, instantiate them, and call methods on those instances
In client code, the user can:
- create their own class based on Processable
- write the full path to this class into a global setting
- and be sure that it will be processed in a queue by my app's processor chain :-)
This is the same thing we get from a service container when we call services by tag. In Symfony, for example, we use service tags and a CompilerPass to process a group of decoupled objects that implement one interface.
The question is: how do I achieve the same in Python?
I've already learned about the `__subclasses__` method, but it only works if the subclasses have been imported. So I learned about `importlib.import_module`, and then I got what I wanted.
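To show the dynamic-import step in isolation, here is a minimal helper for resolving a dotted path to an attribute. The name `import_string` is my own (Django ships a similar utility), and the demo uses a stdlib class so the sketch is self-contained:

```python
import importlib


def import_string(dotted_path):
    """Resolve a dotted path like 'json.JSONEncoder' to the named attribute.

    Hypothetical helper for illustration; split the path into a module
    part and an attribute part, import the module, then look the
    attribute up on it.
    """
    module_path, attr_name = dotted_path.rsplit('.', 1)
    module = importlib.import_module(module_path)
    return getattr(module, attr_name)


# demonstrated with a stdlib class
cls = import_string('json.JSONEncoder')
print(cls.__name__)  # JSONEncoder
```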
The folder structure is:
- patterns
  - main.py
  - settings.py
  - Chain
    - chain.py
    - processor.py
patterns/Chain/chain.py:

```python
import importlib
from abc import ABCMeta, abstractmethod

from patterns import settings


class Processable(metaclass=ABCMeta):
    """Abstract base class defining the required properties and methods."""

    @abstractmethod
    def execute(self) -> str:
        return 'Base class processing...'


class ProcessorChain:
    @staticmethod
    def process_directly():
        # load processor definitions from the settings list
        for definition in settings.CHAIN_PROCESSORS:
            # split the dotted path into module path and class name
            module_path, class_name = definition.rsplit('.', 1)
            module = importlib.import_module(module_path)
            # get the class from the imported module
            cls = getattr(module, class_name)
            # check that the class is a subclass of Processable
            if issubclass(cls, Processable):
                # create an instance of the class
                processor = cls()
                # and do something
                print(processor.execute())

    @staticmethod
    def process_subclass():
        # still requires dynamic imports
        for definition in settings.CHAIN_PROCESSORS:
            # import the module so its classes register as subclasses
            module_path, _ = definition.rsplit('.', 1)
            importlib.import_module(module_path)
        # load all (now imported) subclasses of Processable
        for processor in Processable.__subclasses__():
            # and do something
            print(processor().execute())
```
@staticmethod
def process_subclass():
# still requires make dynamic imports
for definition in settings.CHAIN_PROCESSORS:
# parse module and package
m, p = definition.rsplit('.', 1)
# import a module
importlib.import_module(m, p)
# load all subclasses of Processable
for processor in Processable.__subclasses__():
# and do something
print(processor().execute())
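As a standalone illustration of why `__subclasses__` only sees imported classes: it simply returns the subclasses whose class statements have actually been executed, which happens when their module is imported. The class names below are invented for the demo:

```python
from abc import ABC, abstractmethod


class Base(ABC):
    @abstractmethod
    def execute(self) -> str:
        ...


# Only subclasses whose defining module has been executed (imported)
# show up in Base.__subclasses__().
class First(Base):
    def execute(self) -> str:
        return 'first'


class Second(Base):
    def execute(self) -> str:
        return 'second'


names = [cls.__name__ for cls in Base.__subclasses__()]
print(names)  # ['First', 'Second']
```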
patterns/Chain/processor.py:

```python
from .chain import Processable


class FirstProcessor(Processable):
    """Provides the required properties and methods."""
    context = 1

    def execute(self):
        return super().execute() + ' (implementation from meta class)'


class SecondProcessor(Processable):
    """Provides the required properties and methods."""
    context = 2

    def execute(self):
        return str(self.context) + ' processing...' + ' (its own implementation)'
```
patterns/settings.py:

```python
CHAIN_PROCESSORS = [
    'patterns.Chain.processor.FirstProcessor',
    'patterns.Chain.processor.SecondProcessor',
]
```
patterns/main.py:

```python
from patterns.Chain.chain import ProcessorChain

ProcessorChain.process_directly()
```
Running main.py prints, as expected:

```
Base class processing... (implementation from meta class)
2 processing... (its own implementation)
```
https://github.com/xcono/py3-trials/tree/master/patterns
But, honestly, I don't know whether this is the Pythonic way. Maybe there is a better way to achieve a decoupled chain? Maybe it would be better to use event listeners/dispatchers?
Dear Pythonistas, please tell me the truth: how do you cook this?
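For comparison, one registry-style alternative I've been experimenting with (all names below are my own invention, not part of my app) uses `__init_subclass__`, so every subclass registers itself at class-definition time and no explicit lookup by dotted path is needed; modules containing the subclasses must still be imported for the hook to fire:

```python
from abc import ABC, abstractmethod


class Processable(ABC):
    # shared registry, filled in automatically when subclasses are defined
    registry = []

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        Processable.registry.append(cls)

    @abstractmethod
    def execute(self) -> str:
        ...


class FirstProcessor(Processable):
    def execute(self) -> str:
        return 'first processing...'


class SecondProcessor(Processable):
    def execute(self) -> str:
        return 'second processing...'


# the chain just walks the registry
for cls in Processable.registry:
    print(cls().execute())
```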