Question

If a 64-bit program wants to consume a lot of memory, does it matter whether the memory is allocated on the process heap or from memory-mapped file(s)? I understand the other benefits of memory-mapped files, such as sharing across two or more processes; in my case, however, the data in the memory maps is not shared across processes.

Solution

It's not entirely clear what difference you have in mind, especially given that large allocations (by some definition of "large" that depends on settings in the C library) are typically made using anonymous mmap regions. These are memory-mapped "files" that don't actually have a real file backing them: the OS uses /dev/zero as the "file", so when memory is paged in from the "file" it reads as zero, and it is never written back.

In other words, whilst the "heap" is memory managed by the C library, and with manually memory-mapped files you have to do that management in your own code, underneath it's otherwise the same thing.

Edit:

In response to the comment:

It's really going to depend on:

  1. The amount of memory in the system. If you have 1TB+, then you will probably be able to use either method with approximately the same result.
  2. How large the sections read from the file are - if you are reading small portions (significantly less than 4KB) in many different places, then malloc will probably win. If the sections of the file you are working on are much larger, then either method will have about the same memory-usage profile.

Neither method will give decent performance if you have much less memory than the actual amount of data you are processing, because too much time is spent reading/writing data to/from disk and/or allocating/deallocating memory.

In general, mapping files into memory is the fastest way of loading data into memory (the OS has to copy the data fewer times on its way from the disk to its final location in memory). But for any reasonably large file, the actual speed at which the data comes off the disk will be the main factor, and no matter what you do, that will dominate the time it takes to read 1TB of file(s).

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow