
Dataflow / Apache Beam: Manage Custom Module Dependencies

I have a .py pipeline using Apache Beam that imports another module (.py), which is my custom module. I have a structure like this:

├── mymain.py
└── myothermodule.py

Solution 1:

When you run your pipeline remotely, you need to make any dependencies available on the remote workers too. To do that, turn your module into a Python package by putting it in a directory with an __init__.py file and creating a setup.py. The layout would look like this:

├── mymain.py
├── setup.py
└── othermodules
    ├── __init__.py
    └── myothermodule.py

And import it like this:

from othermodules import myothermodule
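As a quick sanity check of why the __init__.py matters, this sketch rebuilds the same layout in a temporary directory and imports it (the greet function is a hypothetical stand-in for whatever myothermodule contains):

```python
import os
import sys
import tempfile
import textwrap

# Recreate the package layout from the answer in a temp dir.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "othermodules")
os.makedirs(pkg)

# The empty __init__.py is what makes "othermodules" an importable package.
open(os.path.join(pkg, "__init__.py"), "w").close()

with open(os.path.join(pkg, "myothermodule.py"), "w") as f:
    f.write(textwrap.dedent("""\
        def greet():
            return "hello from myothermodule"
    """))

# Put the parent directory on the path, then import exactly as in the answer.
sys.path.insert(0, root)
from othermodules import myothermodule

print(myothermodule.greet())  # prints "hello from myothermodule"
```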

Then you can run your pipeline with the command line option --setup_file ./setup.py
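A full invocation might look like the following; the project, region, and bucket names are placeholders you would replace with your own:

```shell
# Hypothetical Dataflow run; only --setup_file comes from the answer above,
# the other flags are the usual Dataflow options with placeholder values.
python mymain.py \
  --runner DataflowRunner \
  --project my-gcp-project \
  --region us-central1 \
  --temp_location gs://my-bucket/tmp \
  --setup_file ./setup.py
```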

A minimal setup.py file would look like this:

import setuptools

setuptools.setup(packages=setuptools.find_packages())
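In practice it can help to also give the package a name and version, and to list any extra PyPI dependencies the workers need so they are installed alongside your module. A slightly fuller sketch (the name, version, and dependency list are placeholders):

```python
import setuptools

setuptools.setup(
    name="othermodules-pipeline",  # placeholder package name
    version="0.0.1",               # placeholder version
    install_requires=[],           # extra PyPI deps the workers need, if any
    packages=setuptools.find_packages(),
)
```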

The whole setup is documented here.

And a whole example using this can be found here.
