Question

I have a 700MB SQLite3 database that I'm reading/writing to with a simple Python program. I'm trying to gauge the memory usage of the program as it operates on the database. I've used these methods:

  • use Python's memory_profiler to measure memory usage for the main loop function which runs all the insert/selects
  • use Python's psutil to measure peak memory usage during execution
  • manually watching the memory usage via top/htop

The first two support the conclusion that it uses no more than 20MB at any given time. I can start with an empty database and fill it up with 700MB of data and it remains under 20MB:
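As a point of comparison, peak resident memory can also be read from the standard library without any third-party tooling. This is a minimal sketch using `resource` (Unix-only); note that `ru_maxrss` is reported in KiB on Linux but in bytes on macOS, so the conversion below assumes Linux:

```python
import resource

def peak_rss_mib():
    """Peak resident set size of this process, in MiB (assumes Linux,
    where ru_maxrss is reported in KiB)."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024

print(f"peak RSS: {peak_rss_mib():.2f} MiB")
```

Calling this at the end of the insert/select loop gives a figure directly comparable to psutil's peak number.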

memory_profiler's figure never went above 15.805 MiB:

Line #    Mem usage    Increment   Line Contents
================================================
   ...
   229   13.227 MiB    0.000 MiB       @profile
   230                                 def loop(self):
   231                                     """Loop to record DB entries"""
   234   15.805 MiB    2.578 MiB           for ev in range(self.numEvents):
   ...

psutil said peak usage was 16.22265625 MB

Now top/htop is a little weirder. Both said that the Python process's memory usage never rose above 20MB, but I could also clearly see free memory steadily decreasing (and the used number growing) as the database filled up:

Mem:   4047636k total,   529600k used,  3518036k free,    83636k buffers

My questions:

  • is there any "hidden" memory usage? Does Python call libsqlite in such a way that it might use memory on its own that isn't reported as belonging to Python either via psutil or top?
  • is the above method sound for determining the memory usage of a program interacting with the database? Especially top: is top reliable for measuring memory usage of a single process?
  • is it more or less true that a process interacting with a SQLite database doesn't need to load any sizeable part of it into memory in order to operate on it?

Regarding the last point, my ultimate objective is to use a rather large SQLite database of unknown size on an embedded system with limited RAM and I would like to know if it's true that that memory usage is more or less constant regardless of the size of the database.


Solution

SQLite's memory usage doesn't depend on the size of the database; SQLite can handle terabyte-sized databases just fine, and it loads only the pages it needs, plus a small cache of configurable size.

SQLite should be fine on embedded systems; that's what it was originally designed for.

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow