Question

The .dat file contains binary data and is 245 KB in size, so it holds roughly 2 million bits.

I need to read it as binary and store each bit in a database. Here is what I have done:

  byte[] data = File.ReadAllBytes(dat_file);

  BitArray bits = new BitArray(data);

  int id_card = 1;

  for (int i = 0; i < bits.Length; i++)
  {
      if (stop)
      {
          break;
      }

      // Insert into database
      save_to_database(Convert.ToInt16(bits[i]), id_card);

      id_card++;

      // Report progress on the button text
      double perc = (double)i / (double)bits.Length;
      this.btnSubmit.Invoke(new MethodInvoker(delegate { btnSubmit.Text = perc.ToString("P"); }));
  }

But the problem is that it takes a very long time to iterate over the bit array and save each bit to the database. If I'm not mistaken, it took around 1 hour to reach the 600k mark.

Any ideas how I can solve this problem? I have more than 5 files that need to be read as binary and saved to the database.


Solution 2

How do you write to the database? What database engine are you using?

For me, it took about 20 seconds to generate and write exactly the same number of bits to a SQL Server localdb instance on a very slow, low-end PC. It would clearly be even faster to just load the file from the filesystem and write the bits to a fast server.

Unit test:

using Microsoft.VisualStudio.TestTools.UnitTesting;
using System;
using System.Data;
using System.Data.SqlClient;

namespace BitsTest
{
    [TestClass]
    public class BitsTester
    {
        [TestMethod]
        public void BitsTest()
        {
            // random seed for emulating bit-array file
            Random rand = new Random();

            DataTable table = new DataTable();
            table.Columns.Add("bit",typeof(bool));

            string cs = @"Data Source=(localdb)\v11.0;Initial Catalog=bittest;Integrated Security=True";

            // 2007040 records = 245kb of bits
            for (int i = 0; i < 2007040; i++)
                table.Rows.Add(rand.Next() % 2 == 0);

            using (SqlBulkCopy bulk = new SqlBulkCopy(cs))
            {
                bulk.DestinationTableName = "bits";
                bulk.WriteToServer(table);
            }
        }
    }
}

So the problem is not in the code you have shown. Try profiling your code to find the real bottleneck.

Other tips

You are doing a database insert operation on every iteration.

I would move the insert operation outside the loop, store the results in a list (or table) first, and then do a single bulk insert into the database.
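A minimal sketch of that approach, assuming a `bits` table with `id_card` and `bit` columns and a localdb connection string like the one in the unit test above (the file name and table layout here are hypothetical, adjust to your setup): build all rows in memory, then send them in one `SqlBulkCopy` call instead of ~2 million individual inserts.

```csharp
using System;
using System.Collections;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class BulkBitLoader
{
    static void Main()
    {
        // Hypothetical file path -- replace with your actual .dat file.
        byte[] data = File.ReadAllBytes("card.dat");
        BitArray bits = new BitArray(data);

        // Build all rows in memory first instead of one INSERT per bit.
        DataTable table = new DataTable();
        table.Columns.Add("id_card", typeof(int));
        table.Columns.Add("bit", typeof(bool));

        for (int i = 0; i < bits.Length; i++)
            table.Rows.Add(i + 1, bits[i]);

        string cs = @"Data Source=(localdb)\v11.0;Initial Catalog=bittest;Integrated Security=True";

        // One bulk operation instead of millions of round trips.
        using (SqlBulkCopy bulk = new SqlBulkCopy(cs))
        {
            bulk.DestinationTableName = "bits";
            bulk.WriteToServer(table);
        }
    }
}
```

If memory is a concern for very large files, `SqlBulkCopy` also lets you set `BatchSize` so rows are flushed to the server in chunks.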

Another approach is to use a BLOB field in the database. See http://support.microsoft.com/kb/317016 for how to read and write binary data from the database.
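With the BLOB approach the whole file becomes one row, so there is no per-bit loop at all. A sketch, assuming a hypothetical table `cards (id INT, data VARBINARY(MAX))` and the same localdb connection string as above:

```csharp
using System.Data.SqlClient;
using System.IO;

class BlobSaver
{
    static void Main()
    {
        // Hypothetical file path and table -- adjust to your schema.
        byte[] data = File.ReadAllBytes("card.dat");
        string cs = @"Data Source=(localdb)\v11.0;Initial Catalog=bittest;Integrated Security=True";

        using (SqlConnection conn = new SqlConnection(cs))
        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO cards (id, data) VALUES (@id, @data)", conn))
        {
            conn.Open();
            cmd.Parameters.AddWithValue("@id", 1);
            cmd.Parameters.AddWithValue("@data", data); // entire file in one parameter
            cmd.ExecuteNonQuery();
        }
    }
}
```

Reading it back gives you the `byte[]` again, which you can wrap in a `BitArray` whenever you need individual bits.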

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow