Question

This question is a mix of software and hardware-related issues.

I have a custom app, written in C using VS2010, that requires some high-end hardware (the app needs to bite and chew on about 50 GB of data once a minute). It runs under 64-bit Windows 7. Right now I have a Dell T7500 with dual Xeon X5690 chips (each hex-core), so there are 12 physical and 24 logical cores. The setup has 48 GB of RAM, which it needs because the app has about 30-40 GB of data in use at any one time.

The app is multithreaded (it's a beautiful thing to see all 24 logical cores maxed out at 100%!), so it has gone from being CPU-bound to I/O-bound. (Before I rewrote the app as multithreaded, it took about 36 hours to do the data backloading; it now takes about 1.5 hours, but I still need to get that down to under 15 minutes.)

I am able to modify the software so it can take advantage of a multidrive configuration.

I can split the I/O work between the 24 cores. I need about 3 TB of storage. Would the best scenario, although impractical, be to have 48 separate SSDs, so that each of the 24 running threads would have its own private drives for reads and writes? (The app doesn't do a lot of small file reads/writes; instead, it reads/writes in 1-2 GB chunks.)

Assuming that getting that many drives onto a single system is impossible, what is the practical limit for this sort of setup in terms of number of drives? I have 5 slots on the T7500 motherboard. And what sort of controller/configuration would be fastest here? SATA? SCSI? RAID 0?
Or are there things I can do in software that would make a lot of the hardware issues less important? Thanks.

Solution

I found that you could use multiple detachable 3 TB USB drives.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow