Question

I have a list of 100 million strings, one per line. The file size is 1.66 GB, and each string is about 20 characters long.

I started inserting the data into a small instance and got a max-memory error after only about 1 million records had been inserted.

Micro
- Micro Cache Node (cache.t1.micro): 213 MB memory, Up to 2 ECU (for short periodic bursts), 64-bit platform, Low I/O Capacity

Standard
- Small Cache Node (cache.m1.small): 1.3 GB memory, 1 ECU (1 virtual core with 1 ECU), 64-bit platform, Moderate I/O Capacity
- Medium Cache Node (cache.m1.medium): 3.35 GB memory, 2 ECU (1 virtual core with 2 ECUs), 64-bit platform, Moderate I/O Capacity
- Large Cache Node (cache.m1.large): 7.1 GB memory, 4 ECUs (2 virtual cores with 2 ECUs each), 64-bit platform, High I/O Capacity
- Extra Large Cache Node (cache.m1.xlarge): 14.6 GB memory, 8 ECUs (4 virtual cores with 2 ECUs each), 64-bit platform, High I/O Capacity

Will a small cache node be able to store the data, or will I run out of space? How do I calculate the number of records that an instance can handle?

Solution

According to this: http://redis.io/topics/faq

When you have a lot of small keys, Redis will use 5-6 times as much memory as the raw size of the data stored.

You will probably need somewhere around 8-10 GB of memory to store your data set (1.66 GB × 5-6 ≈ 8.3-10 GB), which will limit you to cache.m1.xlarge.
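
One way to answer the "how many records can an instance handle" question is to measure rather than estimate: load a sample of keys, read Redis's own used_memory counter, and extrapolate. Here is a minimal sketch using the redis-py client; the endpoint, sample size, and key/value shapes are assumptions for illustration, not part of the original answer:

```python
import redis

# Assumed endpoint; point this at your ElastiCache node instead.
r = redis.Redis(host="localhost", port=6379)

SAMPLE = 100_000        # keys to insert for the measurement
TOTAL = 100_000_000     # full data set size from the question

r.flushdb()  # WARNING: clears the database; only run against a test node
before = r.info("memory")["used_memory"]

pipe = r.pipeline(transaction=False)
for i in range(SAMPLE):
    # ~20-character values, mirroring the strings in the question
    pipe.set(f"key:{i}", "x" * 20)
    if i % 10_000 == 0:
        pipe.execute()  # flush the pipeline in batches
pipe.execute()

after = r.info("memory")["used_memory"]
per_key = (after - before) / SAMPLE
print(f"~{per_key:.0f} bytes per key")
print(f"estimated total: ~{per_key * TOTAL / 2**30:.1f} GiB for {TOTAL:,} keys")
```

In practice Redis's per-key overhead for small string keys is on the order of tens of bytes, which is consistent with the 5-6× rule of thumb above.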

Other Tips

2^32 keys

Which equals 4,294,967,296. The real question is how much RAM is needed to store that much data: at an average of 8 characters per key, the key strings alone would take 32 GB of RAM, and once you add the size of the values you will need much more.

Source: https://redis.io/topics/faq
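
To make the arithmetic in that tip explicit (a back-of-envelope check, counting key payload only):

```python
# 2^32 keys at ~8 bytes of key payload each, ignoring Redis's per-key overhead
max_keys = 2**32                         # 4,294,967,296
bytes_per_key = 8                        # "8 characters for every key"
print(max_keys * bytes_per_key / 2**30)  # -> 32.0 (GiB)
```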
