Question

I'm writing a .NET application that engineers will use to graph and report on our scrap and rework database. On launch the application will present pre-canned graphs and reports in a dashboard-type implementation. Users will then be able to create their own graphs and reports (multiple graphs/reports will be open at the same time). A network connection and login are required for the application to run.

My question concerns the application's gathering and usage of the data. Currently the scrap and rework table in question holds roughly 100,000 rows and is growing at about 16,000 rows per month.

I'm looking for a best-practice or experience-based answer; here are some of our ideas:

  1. Query the entire table on application launch in a "mecha-query", immediately converting the rows to objects for the rest of the program to work with. In the future, if the table grows too large, offer a setting for a partial or full load. (My favorite, but it seems like terrible practice.)

  2. Write a local copy of the table to the user's computer using something like SQLite on application launch; data is queried from the on-disk SQLite DB as needed, and the local DB is cleaned up on application close, or on application start if a leftover copy is detected. (A minimal sketch of this appears after the list.)

  3. Use an in-memory SQLite DB from which data is queried as needed.

  4. Query SQL Server as needed.
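
For concreteness, here is a minimal sketch of what option 2 might look like using the Microsoft.Data.Sqlite package; the file name, table, and columns are placeholders, not our real schema:

```csharp
using System.IO;
using Microsoft.Data.Sqlite;

// Hypothetical local cache for option 2: a throwaway SQLite file that is
// (re)created on launch and deleted on exit.
public static class LocalCache
{
    private const string DbPath = "scrap_cache.db";

    public static SqliteConnection Create()
    {
        // Start clean if a stale copy survived a crash.
        if (File.Exists(DbPath)) File.Delete(DbPath);

        var conn = new SqliteConnection($"Data Source={DbPath}");
        conn.Open();
        using (var cmd = conn.CreateCommand())
        {
            cmd.CommandText =
                @"CREATE TABLE ScrapRework (
                      Id INTEGER PRIMARY KEY,
                      Occurred TEXT NOT NULL,
                      PartNumber TEXT NOT NULL,
                      Cost REAL NOT NULL)";
            cmd.ExecuteNonQuery();
        }
        // Rows would then be bulk-copied in from SQL Server, and the file
        // deleted again in application shutdown code.
        return conn;
    }
}
```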

For options 1 and 3 I'm worried about the application's memory footprint 5-6 years into the future. With the dashboard functionality described above, the advantages of options 2 and 4 seem negated, because the application is basically going to need all the data on start-up anyway. I'm also thinking about the application's extensibility; maybe it will be ported to a web app someday.

Thanks!


Solution

Yes, I recommend option 4: query SQL Server as needed.

And rethink why you need to build all the objects at start. Even if you do need to build them all at start, put them in a Dictionary and let SQL Server do what SQL Server does well. Keep in mind that .NET imposes size limits on collections (by default a single object, such as the array backing a collection, cannot exceed 2 GB).
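
As a minimal sketch of option 4 using plain ADO.NET (the ScrapRework table, its columns, and the ScrapRecord type are assumptions for illustration, not your actual schema):

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

// Hypothetical row type; your real columns will differ.
public class ScrapRecord
{
    public int Id { get; set; }
    public DateTime Occurred { get; set; }
    public string PartNumber { get; set; }
    public decimal Cost { get; set; }
}

public static class ScrapQueries
{
    // Option 4: let SQL Server do the filtering, and pull back only
    // what one graph or report actually needs.
    public static List<ScrapRecord> GetScrapSince(string connectionString, DateTime since)
    {
        var results = new List<ScrapRecord>();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT Id, Occurred, PartNumber, Cost " +
            "FROM ScrapRework WHERE Occurred >= @since", conn))
        {
            cmd.Parameters.AddWithValue("@since", since);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    results.Add(new ScrapRecord
                    {
                        Id = reader.GetInt32(0),
                        Occurred = reader.GetDateTime(1),
                        PartNumber = reader.GetString(2),
                        Cost = reader.GetDecimal(3)
                    });
                }
            }
        }
        return results;
    }
}
```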

I see no reason to worry about the load on SQL Server, but if you do cache everything, you can also run LINQ queries against the Dictionary.
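
And if you do materialize everything up front, here is a sketch of the Dictionary-plus-LINQ idea, reusing the hypothetical ScrapRecord and ScrapQueries from the previous sketch:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class Dashboard
{
    // Cache the materialized rows by Id; LINQ-to-Objects then serves the
    // ad hoc graphs without another round trip to SQL Server.
    public static void BuildReport(string connectionString)
    {
        Dictionary<int, ScrapRecord> cache =
            ScrapQueries.GetScrapSince(connectionString, DateTime.MinValue)
                        .ToDictionary(r => r.Id);

        // Example: total scrap cost per part number over the last 30 days.
        var costByPart = cache.Values
            .Where(r => r.Occurred >= DateTime.Today.AddDays(-30))
            .GroupBy(r => r.PartNumber)
            .Select(g => new { Part = g.Key, TotalCost = g.Sum(r => r.Cost) })
            .OrderByDescending(x => x.TotalCost)
            .ToList();
    }
}
```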

100,000 rows is not even close to big. At 100 million rows you are starting to get big.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow