Question

I have content sources containing over 562,000 items, and I have allocated 100 GB to the D drive on the FAST Search Server.

I just got a warning of "Average (4 samples) disk free on D:\ is now 11%, which is below the warning threshold (15%) out of total size 100.0 GB"

I checked the space and found out D:\data_fixml has used 33.6 GB; D:\data_index has used 33.9 GB.

My questions are:

  1. Is 100 GB enough space for over 500,000 items?

  2. If not, what size should I allocate to the D drive?

  3. If it is enough, how can I optimize the data inside data_fixml and data_index?

Thanks.


Solution

Take a look at the hardware recommendations for FAST Search Server here. Note also that other articles point out that physical servers are faster for this workload, so make sure you are not running it on a virtual server.

  1. I wouldn't think so.
  2. This article recommends 1 TB on a RAID array across 6 or more spindles. That is what FAST Search is designed to do: turn hardware investment into speed for your searching needs. Here is another article that goes over a checklist of things to do to set up the FAST Search Server correctly and improve performance:
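As a sanity check on the numbers in the question, a quick back-of-envelope calculation shows why 100 GB is tight: the two data folders already hold 67.5 GB for 562,000 items, and the index engine needs free scratch space for merges and defragmentation. The headroom factor below is an assumption for illustration, not an official Microsoft figure:

```python
# Rough capacity estimate using the figures from the question.
GIB = 2 ** 30

items = 562_000
fixml_gib = 33.6   # reported size of D:\data_fixml
index_gib = 33.9   # reported size of D:\data_index

used_gib = fixml_gib + index_gib
per_item_kib = used_gib * GIB / items / 1024
print(f"~{per_item_kib:.0f} KiB of disk per indexed item")

# Assumption: keep roughly 3x the current data size available so index
# merge/defragmentation operations have room to work. This is a rule of
# thumb for illustration only, not a documented requirement.
headroom_factor = 3
needed_gib = used_gib * headroom_factor
print(f"suggested minimum volume size: ~{needed_gib:.0f} GiB")
```

On these numbers each item costs roughly 125 KiB on disk, and a 3x headroom factor already pushes the volume past 200 GB, which is consistent with the 1 TB recommendation above once growth and RAID layout are considered.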
Licensed under: CC-BY-SA with attribution
Not affiliated with sharepoint.stackexchange