Question

I have a MySQL database "DB" containing a table "TABLE" with a single column and about 8 million rows. I am trying to export that table to a text file as follows:

//Open connection.
connection.Open();

//Create command.
MySqlCommand command = connection.CreateCommand();
command.CommandText = "SELECT * FROM `DB`.`TABLE`";

//Execute command.
MySqlDataReader result = command.ExecuteReader();

//If result isn't null
if (result != null)
{
    //Open stream to write the result; using ensures the file is flushed and closed.
    using (System.IO.StreamWriter file = new System.IO.StreamWriter(@"C:\Users\StackOverflow\Desktop\ID_List.txt", true))
    {
        //For each row of the result, write the value as one line.
        while (result.Read())
        {
            file.WriteLine(result.GetValue(0).ToString());
        }
    }
}

//Close connection.
connection.Close();

When run, it starts downloading the data and writing it to the text file, but after only a minute or two it throws:

An unhandled exception of type 'MySql.Data.MySqlClient.MySqlException' occurred in test program.exe

Additional information: Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.

What might I change or do differently to make it work? Thanks for any suggestions and answers :)

[EDIT]: I added command.CommandTimeout = 240; and tested three times; each time it downloaded for a couple of minutes (about 40 MB of data) before giving:

An unhandled exception of type 'MySql.Data.MySqlClient.MySqlException' occurred in test program.exe

Additional information: Fatal error encountered during data read.

And these are the last few lines of the debug output:

   at MySql.Data.MySqlClient.NativeDriver.FetchDataRow(Int32 statementId, Int32 columns)
   at MySql.Data.MySqlClient.Driver.FetchDataRow(Int32 statementId, Int32 columns)
   at MySql.Data.MySqlClient.ResultSet.GetNextRow()
   at MySql.Data.MySqlClient.ResultSet.NextRow(CommandBehavior behavior)
   at MySql.Data.MySqlClient.MySqlDataReader.Read()

Inner exception: System.IO.EndOfStreamException: Attempted to read past the end of the stream.
   at MySql.Data.MySqlClient.MySqlStream.ReadFully(Stream stream, Byte[] buffer, Int32 offset, Int32 count)
   at MySql.Data.MySqlClient.MySqlStream.LoadPacket()
An unhandled exception of type 'MySql.Data.MySqlClient.MySqlException' occurred in test program.exe
Additional information: Fatal error encountered during data read.
The program '[4548] test program.exe: Program Trace' has exited with code 0 (0x0).
The program '[4548] test program.exe: Managed (v4.0.30319)' has exited with code 0 (0x0).

[EDIT2]: I changed the reader to MySqlDataReader result = command.ExecuteReader(System.Data.CommandBehavior.SingleResult); in an effort to remedy the above error, and now I am getting:

An unhandled exception of type 'MySql.Data.MySqlClient.MySqlException' occurred in test program.exe
Additional information: Query execution was interrupted

After some light googling, I figure it is GoDaddy limiting the query execution time. A working solution is therefore to wrap the read in a try/catch, keep a counter of how many records were retrieved before the connection was cut off, and re-open new connections that use the counter as the starting offset in a LIMIT clause until the table is exhausted.
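
A minimal sketch of that resume-with-offset idea (the connection string is a placeholder; the open-ended LIMIT with a huge second value is the form recommended in the MySQL manual):

//Sketch only: resume reading from the last successfully written row after the
//host interrupts the query. Connection string is a placeholder.
long rowsWritten = 0;
bool done = false;

using (System.IO.StreamWriter file = new System.IO.StreamWriter(@"C:\Users\StackOverflow\Desktop\ID_List.txt", true))
{
    while (!done)
    {
        try
        {
            using (MySqlConnection connection = new MySqlConnection("server=...;database=DB;uid=...;pwd=..."))
            {
                connection.Open();
                MySqlCommand command = connection.CreateCommand();

                //Resume at the current offset; the huge second value means
                //"from the offset to the end of the table".
                command.CommandText = "SELECT * FROM `DB`.`TABLE` LIMIT " + rowsWritten + ", 18446744073709551615";

                using (MySqlDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        file.WriteLine(reader.GetValue(0).ToString());
                        rowsWritten++;
                    }
                }

                //Reached the end of the table without being cut off.
                done = true;
            }
        }
        catch (MySqlException)
        {
            //Query was interrupted by the host; loop around, reconnect and
            //continue from rowsWritten.
        }
    }
}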

Thanks for all the help!


Solution

Increase the wait_timeout parameter in the my.ini file.

http://dev.mysql.com/doc/refman/5.0/en/server-system-variables.html#sysvar_wait_timeout

There are other timeout parameters that could be the problem, but I think wait_timeout is the one.
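
To see which timeout-related variables are set low on the server, you could list them from the same connection (a sketch, assuming an open MySqlConnection named connection):

//Sketch: list the server's timeout-related variables to see which one is low.
MySqlCommand check = connection.CreateCommand();
check.CommandText = "SHOW VARIABLES LIKE '%timeout%'";
using (MySqlDataReader vars = check.ExecuteReader())
{
    while (vars.Read())
    {
        //Columns are Variable_name and Value.
        Console.WriteLine(vars.GetString(0) + " = " + vars.GetString(1));
    }
}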

If you don't have access to the ini file, use LIMIT to fetch the rows in smaller result sets.

First, get the total row count with SELECT COUNT(*). Then fetch the rows in chunks of 1000 inside a for loop:

for (int i = 0; i <= numberOfRowsInYourTable; i = i + 1000)
{
    command.CommandText = "SELECT * FROM `DB`.`TABLE` LIMIT " + i + ", 1000";
    // Do what you want with each 1000-row batch
}

OTHER TIPS

Add this to your connection string:

default command timeout=240

and adjust it as necessary.
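
For example (server name and credentials are placeholders; the value is in seconds):

//"default command timeout" is the Connector/NET connection-string option that
//sets the default CommandTimeout for commands created on this connection.
string connStr = "server=myserver;database=DB;uid=myuser;pwd=mypassword;default command timeout=240";
using (MySqlConnection connection = new MySqlConnection(connStr))
{
    connection.Open();
    //... run the SELECT as before ...
}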

You have several options

  1. you can set a longer timeout, which means setting the default command timeout in your connection string (or CommandTimeout on the command object)

  2. you can use SELECT ... INTO OUTFILE, which would generate the file on the server (see the sketch after this list)

  3. you can use the mysqldump utility, which would also generate the file on the server
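
A sketch of option 2, assuming an open connection and an account with the FILE privilege (the output path is on the database server, not your machine, and the target file must not already exist):

//Sketch: SELECT ... INTO OUTFILE writes the file on the server host.
MySqlCommand export = connection.CreateCommand();
export.CommandText = "SELECT * FROM `DB`.`TABLE` INTO OUTFILE '/tmp/ID_List.txt'";
export.CommandTimeout = 600; //Give the server time to write all 8 million rows.
export.ExecuteNonQuery();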

EDIT: Fellow SO users mention other timeout parameters; I'm not sure which one is the key here.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow