Question

We are building our new FAST farm and plan to put up one column of physical servers for FAST indexing and crawling. We will be crawling around 15 million items, including SharePoint sites, Exchange public folders, custom SQL databases, and file-share documents. I know that if it were just SharePoint sites, the formula would be 20% of your total content DB size. Is there a similar formula with which I can calculate the physical memory required on my index and crawl boxes? Per Microsoft (http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=23576), the recommendation is 1 TB of HDD per 5 million items, but that doesn't seem to hold in practice. Any recommendations?

Solution

Because data varies so much from installation to installation, your best approach is to index a representative sample of your corpus, perhaps 100,000 items, and extrapolate the disk space needed from that.

Also read over the scenario at http://technet.microsoft.com/en-us/library/ff599526.aspx, which has numbers for raw data and index size that may carry over to your installation.
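The extrapolation itself is simple linear scaling. Here is a minimal sketch; the sample figures (12 GB for 100,000 items) and the 1.5x headroom factor are placeholder assumptions, so substitute whatever you actually measure after crawling your sample.

```python
# Linear extrapolation of index disk usage from a sample crawl.
# All numbers below are hypothetical placeholders; replace them
# with the disk usage you measure after indexing ~100,000 items.

def extrapolate_disk_gb(sample_items, sample_disk_gb, total_items, headroom=1.5):
    """Scale measured disk usage linearly to the full corpus,
    with a safety factor for index merges and future growth."""
    per_item_gb = sample_disk_gb / sample_items
    return per_item_gb * total_items * headroom

# Example: suppose a 100,000-item sample consumed 12 GB of index space
# and the full corpus is 15 million items.
estimate = extrapolate_disk_gb(100_000, 12.0, 15_000_000)
print(f"Estimated index disk: {estimate:.0f} GB")  # 2700 GB
```

Keep in mind the mix matters: file-share documents and public-folder items often produce very different per-item index sizes than SharePoint list items, so try to make the sample reflect your real content proportions.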

Licensed under: CC-BY-SA with attribution
Not affiliated with sharepoint.stackexchange