I read through the Kafka 0.8.0 documentation in its entirety, but could not find an answer to this question. For anyone with Kafka experience: what is the maximum amount of data it can hold, assuming the hardware has not reached its point of failure? We are planning to store our payload in Kafka for DR purposes.


Solution

There is no limit in Kafka itself. As data comes in from producers, it is written to disk in file segments; these segments are rotated based on time (log.roll.hours) or size (log.segment.bytes), according to configuration.

Older segments are deleted based on the retention configuration (log.cleanup.policy, log.retention.minutes, log.retention.bytes), which can be turned off.
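As a rough sketch, the rolling and retention settings mentioned above might look like this in the broker's server.properties. The values here are illustrative, not recommendations:

```properties
# Roll a new segment after 24 hours or 1 GiB, whichever comes first
log.roll.hours=24
log.segment.bytes=1073741824

# Delete old segments once they are older than 7 days,
# or once the partition grows past 100 GiB
log.cleanup.policy=delete
log.retention.minutes=10080
log.retention.bytes=107374182400
```

To keep data effectively forever, you can set the retention thresholds high enough that they are never hit (later Kafka versions also accept -1 to disable time-based retention outright).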

Do note however that the broker will keep each segment file open, so make sure your file descriptor limits are set accordingly.
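Since each open segment consumes a file descriptor, it is worth checking the limit for the user running the broker; a minimal session-level check and adjustment (the value 4096 is just an example) looks like:

```shell
# Show the current soft limit on open file descriptors
ulimit -n

# Set the soft limit for the current shell session; it must not exceed
# the hard limit (shown by `ulimit -Hn`). To make a higher limit
# permanent, configure it in /etc/security/limits.conf instead.
ulimit -n 4096
```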

Broker configuration documentation: http://kafka.apache.org/documentation.html#brokerconfigs

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow