Problem

All I need is a large persistent lookup table in Erlang, and dets seems like just the thing, though I need a definitive answer to:

  • how big the total size of the binaries in the table can be,
  • how big each entry can be, and
  • what to do if the answer to the first question is less than 100 GB.

Solution

One approach, obvious once you think of it, is to hash dets entries over multiple dets files.

A linear hash should make it dynamically growable by splitting buckets into newly created dets files when one file reaches an upper threshold.
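
For illustration, here is a minimal sketch of the simpler fixed-shard version of this idea: a fixed number of dets files is opened and each key is routed to one of them with erlang:phash2/2. The module name, shard file names, and API are invented for this example, and the dynamic bucket splitting described above is left out.

    %% Hypothetical sketch: spread entries over a fixed number of dets
    %% files by hashing the key. Shard names and file names are made up.
    -module(sharded_dets).
    -export([open/1, close/1, put/3, get/2]).

    %% Open N dets files named shard_0.dets .. shard_<N-1>.dets and
    %% return the list of table references.
    open(N) ->
        [begin
             Base = "shard_" ++ integer_to_list(I),
             {ok, Ref} = dets:open_file(list_to_atom(Base),
                                        [{file, Base ++ ".dets"}]),
             Ref
         end || I <- lists:seq(0, N - 1)].

    close(Shards) ->
        lists:foreach(fun dets:close/1, Shards).

    %% Route a key to one shard by hashing it over the shard count.
    shard_for(Key, Shards) ->
        lists:nth(erlang:phash2(Key, length(Shards)) + 1, Shards).

    put(Key, Value, Shards) ->
        dets:insert(shard_for(Key, Shards), {Key, Value}).

    get(Key, Shards) ->
        dets:lookup(shard_for(Key, Shards), Key).

Each shard stays well under the 2 GB limit as long as the keys hash evenly; growing past a fixed shard count requires either the bucket-splitting scheme described above or rehashing into a larger set of files.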

There are also a number of port drivers that enable you to use Sleepycat/Berkeley DB or Tokyo Tyrant. Those databases have file size limits much higher than 2 GB.

Other tips

This is kind of an RTFM question; it is stated directly in the second paragraph of the dets manual:

The size of Dets files cannot exceed 2 GB. If larger tables are needed, Mnesia's table fragmentation can be used.

Using Mnesia fragmented disc_copies tables can overcome these limits, provided you know how many fragments to create ahead of time:
http://www.trapexit.org/Mnesia_Table_Fragmentation
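
A minimal sketch of that approach, assuming Mnesia is already started with a disc schema on the given nodes; the table name, record, and fragment count here are invented for illustration:

    %% Hypothetical example of a fragmented disc_copies table.
    -module(frag_table).
    -export([create/2, put/2, get/1]).

    -record(entry, {key, value}).

    %% Create the table split into NFragments fragments spread over Nodes.
    create(Nodes, NFragments) ->
        mnesia:create_table(entry,
            [{disc_copies, Nodes},
             {attributes, record_info(fields, entry)},
             {frag_properties, [{n_fragments, NFragments},
                                {node_pool, Nodes},
                                {n_disc_copies, 1}]}]).

    %% All reads and writes must go through mnesia:activity/4 with the
    %% mnesia_frag access module so keys are routed to the right fragment.
    put(Key, Value) ->
        mnesia:activity(transaction,
                        fun() -> mnesia:write(#entry{key = Key, value = Value}) end,
                        [], mnesia_frag).

    get(Key) ->
        mnesia:activity(transaction,
                        fun() -> mnesia:read(entry, Key) end,
                        [], mnesia_frag).

Because each fragment is an ordinary table under the hood, every individual fragment is still subject to the 2 GB dets limit for disc_only_copies, so the fragment count has to be chosen with the total data size in mind.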

License: CC-BY-SA with attribution