Question

Suppose I have a useful Python function or class (or whatever) called useful_thing which lives in a single file. There are essentially two ways to organize the source tree. The first way uses a single module:

- setup.py
- README.rst
- ...etc...
- foo.py

where useful_thing is defined in foo.py. The second strategy is to make a package:

- setup.py
- README.rst
- ...etc...
- foo/
  |- module.py
  |- __init__.py

where useful_thing is defined in module.py. In the package case, __init__.py would look like this:

from foo.module import useful_thing

so that in both cases you can do from foo import useful_thing.
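The equivalence of the two layouts can be checked with a small sketch that writes each tree to a temporary directory and imports useful_thing from it. Here useful_thing is a hypothetical stand-in that just returns a string:

```python
import importlib
import sys
import tempfile
from pathlib import Path

def try_import(tree: dict) -> str:
    """Write the given file tree to a temp dir, then import useful_thing from foo."""
    with tempfile.TemporaryDirectory() as root:
        for relpath, source in tree.items():
            path = Path(root) / relpath
            path.parent.mkdir(parents=True, exist_ok=True)
            path.write_text(source)
        sys.path.insert(0, root)
        try:
            # Drop any previously imported 'foo' so each layout starts fresh.
            for name in [m for m in sys.modules if m == "foo" or m.startswith("foo.")]:
                del sys.modules[name]
            foo = importlib.import_module("foo")
            return foo.useful_thing()
        finally:
            sys.path.remove(root)

# Layout 1: a single module.
single = {"foo.py": "def useful_thing():\n    return 'hello'\n"}

# Layout 2: a package re-exporting from a submodule.
package = {
    "foo/__init__.py": "from foo.module import useful_thing\n",
    "foo/module.py": "def useful_thing():\n    return 'hello'\n",
}

print(try_import(single))   # hello
print(try_import(package))  # hello
```

Client code performing from foo import useful_thing cannot tell the two layouts apart.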

Question: Which way is preferred, and why?

EDIT: Since user gnat says this question is poorly formed, I'll add that the official python packaging tutorial does not seem to comment on which of the methods described above is the preferred one. I am explicitly not giving my personal list of pros and cons because I'm interested in whether there is a community preferred method, not generating a discussion of pros/cons :)


Solution

You do the simplest thing that works for you.

For a one-function module, there is absolutely no point in creating a package. Packages are useful for creating an additional namespace and/or for organising your code across multiple modules.

The json and unittest modules in the Python standard library, for example, are really packages, used for code-organisation purposes. But it is perfectly fine to leave code that currently lives in just one Python file as a single module.
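For completeness, the choice also shows up in setup.py: a single module is declared via py_modules, a package via packages. A setuptools sketch, with an illustrative name and version:

```python
from setuptools import setup

# Single-module layout: distribute foo.py itself.
setup(
    name="foo",
    version="0.1",
    py_modules=["foo"],
)

# Package layout: the same call, but with
#     packages=["foo"],
# in place of the py_modules line.
```

Switching layouts later only requires moving the file and updating this one line.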


There is no 'official' recommendation; both options are entirely valid.

Licensed under: CC-BY-SA with attribution