Question

Some context: I have some C code that, when compiled, I can call from the terminal like this: ./my_excec -params. It generates some files that I then use in Python to generate charts, among other things.

I want to pack everything, both the C code and the Python code, into a Python library. The C code is not a Python extension (it will probably become one in the future, but right now it is not).

I have a Makefile to compile the C code, and I know I can call it from setup.py like this: subprocess.call(['make', '-C', 'word2vec-src'])

What I want to be able to do is: pip install my_module. That should call the Makefile and compile the C code, so that the user can call the binaries (my_excec -params) and also import the Python code around them.

The problem I am having is with packaging the Python package. I am using the data_files option in setup() like this: data_files=[('bin', ['bin/binary_file'])]. This moves the files from bin to the installation folder (in a virtualenv) and I can call them. But packaging also puts the compiled files into the tarball, so when I run pip install my_module it installs the compiled files from my machine.
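
For reference, my setup.py currently looks roughly like this (the names are only illustrative):

import subprocess
from setuptools import setup, find_packages

# Compile the C code; this runs every time setup.py is executed
subprocess.call(['make', '-C', 'word2vec-src'])

setup(
    name='my_module',
    version='0.1',
    packages=find_packages(),
    # Copy the compiled binary into the environment's bin directory
    data_files=[('bin', ['bin/binary_file'])],
)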

Thanks.


Solution

I was able to find a really easy solution.

As I said, my main problem was that I was packaging the compiled files. To exclude those files from the tarball/zip, I just had to put this in MANIFEST.in: prune bin.
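
The relevant part of MANIFEST.in is just the prune line; the recursive-include line is only an example, assuming the C sources and Makefile live in src/ and need to ship with the source distribution:

# keep the C sources and Makefile in the source distribution
recursive-include src *
# never ship the compiled binaries from my machine
prune bin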

Then I just need to call the Makefile from setup.py:

import os
import subprocess

# Create the output directory for the binaries if it does not exist yet
directory = 'bin'
if not os.path.exists(directory):
    os.makedirs(directory)

# Run the Makefile in src/ to compile the C code
subprocess.call(['make', '-C', 'src'])

With that, when someone runs pip install, the Makefile gets called and the binaries end up in bin (you have to make the Makefile put them there).
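
Note that code at the top level of setup.py runs every time pip invokes the script, not only during the actual build. If you want the compilation to happen only as part of the build step, a minimal sketch is to hook it into a custom command; the class name BuildWithMake and the src/bin paths here are my own assumptions, not part of the code above:

import os
import subprocess
from setuptools import setup
from setuptools.command.build_py import build_py


class BuildWithMake(build_py):
    """Compile the C code with make before the normal Python build."""

    def run(self):
        # Make sure the output directory for the binaries exists
        if not os.path.exists('bin'):
            os.makedirs('bin')
        # Fail loudly if the C build fails
        subprocess.check_call(['make', '-C', 'src'])
        build_py.run(self)


setup(
    # ... name, version, packages, etc.
    cmdclass={'build_py': BuildWithMake},
    data_files=[('bin', ['bin/binaries'])],
)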

Then I just need to tell setup() to copy those files:

setup(
    # ...
    data_files=[('bin', ['bin/binaries'])],
)

Done! Hopefully someone finds this useful :)
