Question

We have multiple Python projects that have dependencies on each other. Hierarchically, these are organized like this:

P1
P2
...
Pn

Each of these is a PyDev project within Eclipse, and they co-exist just fine in that environment. We are in the process of structuring our build process so we can deploy and distribute these projects in a more systematic fashion. Currently, we just zip up the projects and copy them over for deployment.

I need some advice on how to go about this task using distutils. Our objective is a script that uses distutils to build a zip file (or tar file) containing all the necessary code and data/properties from projects P1 through Pn. We should then be able to deploy this with setup.py and have our Django-based web layer access it.

My first attempt is to create a project, called PBuild, whose sole purpose is to build the deployment artifacts. It will sit parallel to the projects P1 through Pn.

Does this seem reasonable? I'm having some issues with this approach. Does anybody have any other ideas of how to do this?

Solution

There are different philosophies on how apps should be packaged, but most Python developers adhere to a very minimalistic approach. In other words, you package up the smallest units of logic you can.

So, your goal here shouldn't be to cram everything together, but to package each discrete application separately. By "application" I don't necessarily mean each Django app, although breaking some of those apps out into their own packages may be worthwhile as well.

This is really all about reusability. Any piece that could serve a purpose in some other scenario should get its own package. Then, you can set them up to have dependencies on whatever other packages they require.
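As a rough sketch of that approach, each project gets its own setup.py that declares its dependencies on the other projects. The project name "p1", the version numbers, and the dependency on "p2" below are all hypothetical placeholders, not names from the question:

```python
# setup.py for one hypothetical project, "p1" (names are illustrative).
# setuptools is the de facto replacement for plain distutils and adds
# dependency declarations via install_requires.
from setuptools import setup, find_packages

setup(
    name="p1",
    version="0.1.0",
    packages=find_packages(),          # pick up all packages under this project
    include_package_data=True,         # ship data/properties files (see MANIFEST.in)
    install_requires=[
        "p2>=0.1",                     # declare the dependency on the sibling project
    ],
)
```

Running `python setup.py sdist` in each project then produces a tar/zip artifact per project, and installing the top-level package pulls in its dependencies, which replaces the single monolithic PBuild artifact.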

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow