Question

I'm using Python to display a bitmap image on an LCD display (Raspberry Pi). This works quite well, but it seems I've created a huge memory leak. This piece of code seems to be the culprit:

def displayBitmap(self):
    spi.open(0, 0)
    # Read the raw RGB565 bitmap into memory
    f = open("data/565.bmp", "rb")
    imgdata = f.read()
    f.close()

    self.setAddress(0, 0, LCD_WIDTH-1, LCD_HEIGHT-1)
    k = 0
    for i in range(0, (LCD_WIDTH*LCD_HEIGHT)):
        # Combine two bytes into one 16-bit pixel value
        dt = (ord(imgdata[k]) | (ord(imgdata[k+1]) << 8))

        self.spiOutData(dt, 2)
        k += 2
    imgdata = None
    spi.close()

...

def spiOutData(self, data, bytes=1):
    io.digitalWrite(15, io.LOW)
    io.digitalWrite(16, io.HIGH)

    # D/C pin high: send data, not a command
    io.digitalWrite(self.dcPin, io.HIGH)

    if (bytes == 1):
        spi.xfer2([(data)])
    else:
        # Send the high byte, then the low byte
        spi.xfer2([(data >> 8)])
        spi.xfer2([(data)])
It runs fine for some time, but at some point it terminates due to lack of memory. My guess is that the contents of imgdata never get released, but my Python knowledge isn't good enough to find the reason. Could you give me a hint, please? Thank you very much.


Solution

So here is what I have found out:

  • The py-spidev module (or some part of spidev itself) seems to be the problem.
  • Besides the memory leak, py-spidev is incredibly slow.

I've now got rid of py-spidev and write to /dev/spidev0.0 directly via a file handle. There is no more excessive memory usage, and the SPI communication now takes about two seconds, roughly a tenth of the time it needed before. A rough sketch of that approach follows.
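
For reference, here is a minimal sketch of what that direct-write approach can look like. This is an assumption based on the description above, not the exact code used: it assumes the SPI bus has already been configured elsewhere (mode and clock speed cannot be set through a plain file write), that the bitmap file already holds the pixel bytes in the order the display expects, and it reuses the pin numbers (15, 16, dcPin) from the question.

def displayBitmap(self):
    # Read the raw bitmap into memory
    with open("data/565.bmp", "rb") as f:
        imgdata = f.read()

    self.setAddress(0, 0, LCD_WIDTH-1, LCD_HEIGHT-1)

    io.digitalWrite(15, io.LOW)
    io.digitalWrite(16, io.HIGH)
    io.digitalWrite(self.dcPin, io.HIGH)   # data mode

    # Write straight to the SPI device node instead of going through
    # py-spidev. The kernel's spidev driver limits a single transfer
    # (4096 bytes by default), so push the data out in chunks.
    with open("/dev/spidev0.0", "wb", buffering=0) as spidev:
        chunk = 4096
        for offset in range(0, len(imgdata), chunk):
            spidev.write(imgdata[offset:offset + chunk])

The chip-select and data/command pin handling stays the same as before; only the transport changes from per-pixel spi.xfer2 calls to plain file writes of whole chunks.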

License: CC-BY-SA with attribution