Question

I am re-evaluating different ways to wrap external C libraries in Python. Long ago I chose to use the plain Python C API, which was fast, simple, standalone and, so I thought, future-proof. Then I stumbled upon PyPy, which apparently does not plan to support the CPython API but might become an interesting alternative in the future... I am therefore looking for a higher-level entry point. ctypes was slow, so now I am back at Cython, which appears to make an effort to support PyPy.

My library has lots of functions with the same signature, so I made extensive use of C preprocessor macros to generate the Python module. I thought this would become a lot more comfortable in cython, since I would have access to the whole Python language. However, I am having trouble writing a factory for my function wrappers:

import cython
from numpy cimport ndarray, double_t

cimport my_c_library

cdef my_c_library.data D

ctypedef double_t DTYPE_t

cdef parse_args(ndarray[DTYPE_t] P, ndarray[DTYPE_t] x, ndarray[DTYPE_t] y):
    D.n = P.size
    D.m = x.size
    D.P = <double*> P.data
    D.x = <double*> x.data
    D.y = <double*> y.data

def _fun_factory(name):
    cpdef fun(ndarray[DTYPE_t] P, ndarray[DTYPE_t] x, ndarray[DTYPE_t] y):
        parse_args(P, x, y)
        getattr(my_c_library, name)(&D)
        return y
    return fun

fun1 = _fun_factory('fun1')
fun2 = _fun_factory('fun2')
# ... many more function definitions ...

The Cython compiler complains: "C function definition not allowed here", referring to the cpdef inside _fun_factory. What is the problem here? I thought pyx files were just like regular Python files. Is there a way to get this working, other than the obvious one of generating the pyx file dynamically from a separate Python script, such as setup.py?

I was also surprised that cython wouldn't let me do:

ctypedef ndarray[double_t, ndim=1] p_t

to clean up the code. Why doesn't this work?

I am aware that there are automatic C -> Cython translators out there, but I am reluctant to make myself dependent on such third-party tools. But please feel free to suggest one if you think it is ready for production use.


Solution

pyx files are not quite like Python files: although you can mix C and Python functions in them, there are some constraints on what you can do with a C (cdef or cpdef) function. For one, you can't dynamically generate C code at runtime, which is what your code is trying to do. Since fun really just executes some Python code after type-checking its arguments, you might as well make it a regular Python function:

def fun(P, x, y):
    # still defined inside _fun_factory, so `name` comes from the enclosing scope
    parse_args(P, x, y)
    getattr(my_c_library, name)(&D)
    return y

parse_args will do the same argument checking, so you lose nothing. (I'm not sure whether getattr works on a C library that's cimport'd, though. You might want to import it as well.)
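To make that concrete, here is a minimal sketch (untested; the names _call_fun1, _c_funcs and fun_by_name are made up, the rest reuses the question's my_c_library, parse_args and D) of how the dynamic lookup could be done without getattr, by keeping a Python-level dict of thin wrappers:

def _call_fun1(): my_c_library.fun1(&D)
def _call_fun2(): my_c_library.fun2(&D)

# Plain def functions are Python objects, so they can be stored in a dict;
# cdef functions could not be kept this way.
_c_funcs = {'fun1': _call_fun1, 'fun2': _call_fun2}

def fun_by_name(name, P, x, y):
    parse_args(P, x, y)
    _c_funcs[name]()   # ordinary dict lookup instead of getattr()
    return y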

As for the ctypedef, that is probably a limitation or bug in Cython that no one has got round to fixing yet.
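One possible workaround, sketched here under the assumption of a reasonably recent Cython with typed memoryviews (parse_args_mv is a made-up name), is to typedef a memoryview slice instead of the ndarray buffer syntax:

ctypedef double_t[::1] p_t   # contiguous typed memoryview; ctypedef of
                             # ndarray[double_t, ndim=1] is rejected

cdef parse_args_mv(p_t P, p_t x, p_t y):
    D.n = P.shape[0]
    D.m = x.shape[0]
    D.P = &P[0]   # memoryviews expose the data via &view[0], not .data
    D.x = &x[0]
    D.y = &y[0]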

OTHER TIPS

After playing around some more, the following appears to work:

def _fun_factory(fun_wrap):
    def fun(P, x, y):
        parse_args(P, x, y)
        fun_wrap()
        return y
    return fun

def _fun1(): my_c_library.fun1(&D)
def _fun2(): my_c_library.fun2(&D)
# ... many more ...

fun1 = _fun_factory(_fun1)
fun2 = _fun_factory(_fun2)
# ... many more...

So it seems there is no way to apply Python operations (such as getattr) to expressions like my_c_library.fun1(&D); they apparently have to be written out literally. The factory can only be used in a second pass, once a first set of Python wrappers has been generated. This isn't any more elegant than the obvious:

cpdef fun1(ndarray[DTYPE_t] P, ndarray[DTYPE_t] x, ndarray[DTYPE_t] y):
    parse_args(P, x, y)
    my_c_library.fun1(&D)
    return y

# ... many more ...

Here, cpdef can be used without problems. So I'm going for the copy-paste approach... Is anyone else interested in preprocessor macros for Cython in the future?
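For completeness, the code-generation route mentioned in the question would look roughly like this; a sketch only, with made-up file names (generate_wrappers.py, wrappers.pxi) and function list, run from setup.py before cythonizing:

# generate_wrappers.py -- write the repetitive cpdef wrappers to a .pxi file
TEMPLATE = """
cpdef {name}(ndarray[DTYPE_t] P, ndarray[DTYPE_t] x, ndarray[DTYPE_t] y):
    parse_args(P, x, y)
    my_c_library.{name}(&D)
    return y
"""

FUNCTIONS = ['fun1', 'fun2']  # ... many more ...

with open('wrappers.pxi', 'w') as f:
    for name in FUNCTIONS:
        f.write(TEMPLATE.format(name=name))

The generated wrappers.pxi can then be pulled into the main pyx file with Cython's include statement, so the hand-written part stays small.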

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow