Question

I've been asked to add a sort of "exporting" feature to a site in ASP Classic.

I thought it would be really simple to write, but I've run into a problem. I get the right number of rows in the file, but every row contains the same data, as if the recordset weren't updating its values during the For Each loop.

Here is my code:

dim fs,f
set fs = Server.CreateObject("Scripting.FileSystemObject")
set f = fs.OpenTextFile(Server.MapPath("thebackupfile.csv"), 8, true)

set RS = conn.execute("SELECT * FROM sometable")

For each fField in RS.Fields 
    row = row & " " & fField.Name & ";"     
Next

row = left (row, len(row)-1)

f.WriteLine(row)

do while not RS.EOF
    For each fField in RS.Fields 
        csv = csv &" "& RS(fField.Name) & ";"     
    Next
    csv = left (csv, len(csv)-1)

    f.WriteLine(csv)
    RS.movenext
loop

f.Close
set f = Nothing
set fs = Nothing

I can't figure out why I get the first row repeated n times, with n equal to the number of rows in the recordset.

Any advice? Am I going about this the wrong way?


Solution

I do spot a bug, though those results seem strange to me, too. The bug I see is that you never clear your csv variable between iterations, so when writing each new row you also repeat everything that came before it, and the output balloons.


Update: Reading this again, the only time you actually start a new line is at a record boundary, via the f.WriteLine() call. So what I'd expect is that you get the right number of rows, but each row gets longer and longer, with the new record's data tacked onto the end. Scroll the file to the right and I bet you'll find the "missing" rows there.

That means the code below should fix the whole issue.
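
For reference, the smallest possible change is simply resetting the buffer at the top of the loop. A minimal sketch of that one-line fix, keeping your original approach:

do while not RS.EOF
    csv = ""    ' reset the line buffer for every record
    For each fField in RS.Fields 
        csv = csv & " " & RS(fField.Name) & ";"     
    Next
    csv = left(csv, len(csv)-1)

    f.WriteLine(csv)
    RS.movenext
loop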


Rather than just adding code to reset the csv variable, though, I would change things to avoid the buffer string entirely and write everything directly to the stream. I also added code to flush the response every so often (an FSO TextStream has no Flush method of its own). That matters in a web context: if you never flush, the browser may not see anything back from the web server for a long time, which can end in a timeout. And given that this is tagged asp-classic, I wonder why you're writing to a file at all rather than to the response stream, as sketched further below.

dim fs,f,d
set fs = Server.CreateObject("Scripting.FileSystemObject")
set f = fs.OpenTextFile(Server.MapPath("thebackupfile.csv"), 8, true)
d = ""

set RS = conn.execute("SELECT * FROM sometable")

' d holds the delimiter: empty before the first field, ";" from then on
For each fField in RS.Fields 
    f.Write(d)
    f.Write(fField.Name)
    f.Write(" ")
    d = ";"
Next

f.WriteLine("")

Dim i
i = 0

do while not RS.EOF
    d = ""
    For Each fField in RS.Fields 
        f.Write(d)
        f.Write(" ")
        f.Write(RS(fField.Name))
        d = ";"    
    Next

    f.WriteLine("")
    RS.movenext

    i = i + 1
    ' An FSO TextStream has no Flush method, so flush the HTTP response
    ' instead every 50 records to keep the browser connection alive
    ' (assumes Response.Buffer is on, which is the IIS default)
    If i Mod 50 = 0 Then
       Response.Flush
    End If
Loop

f.Close
set f = Nothing
set fs = Nothing
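
And here is roughly what writing straight to the response stream could look like instead of writing a file. This is only a sketch: it assumes conn is the open connection from your code, and the filename in the Content-Disposition header is just an example.

' Send the CSV to the browser as a download instead of writing a file
Response.Buffer = True
Response.ContentType = "text/csv"
Response.AddHeader "Content-Disposition", "attachment; filename=export.csv"

set RS = conn.execute("SELECT * FROM sometable")

' Header row
d = ""
For each fField in RS.Fields
    Response.Write d & " " & fField.Name
    d = ";"
Next
Response.Write vbCrLf

' Data rows
i = 0
do while not RS.EOF
    d = ""
    For each fField in RS.Fields
        Response.Write d & " " & RS(fField.Name)
        d = ";"
    Next
    Response.Write vbCrLf
    RS.movenext

    i = i + 1
    If i Mod 50 = 0 Then
        Response.Flush   ' push what we have so far to the browser
    End If
loop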

OTHER TIPS

You should first load the values of a record into an array, then join that array into a delimiter-separated line, then write that line to the output file:

set RS = conn.Execute("SELECT * FROM sometable")

fields = Array()
Redim fields(RS.Fields.Count-1)

For i = 0 To RS.Fields.Count-1
  fields(i) = RS.Fields(i).Name
Next
f.WriteLine """" & Join(fields, """;""") & """"

Do Until RS.EOF
  For i = 0 To RS.Fields.Count-1
    fields(i) = RS.Fields(i).Value
  Next
  f.WriteLine """" & Join(fields, """;""") & """"

  RS.MoveNext
Loop
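
One small caveat with this approach: if any column can contain NULL, it is safer to coerce the values to strings while filling the array; appending an empty string turns Null into "". A slightly adjusted version of the value loop:

Do Until RS.EOF
  For i = 0 To RS.Fields.Count-1
    ' & "" turns a Null value into an empty string
    fields(i) = RS.Fields(i).Value & ""
  Next
  f.WriteLine """" & Join(fields, """;""") & """"

  RS.MoveNext
Loop

If the values themselves can contain double quotes, they would also need doubling (Replace(RS.Fields(i).Value & "", """", """""")) for the quoting to stay valid.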

However, even better than that would be having MySQL directly generate the output file:

SELECT *
FROM sometable
INTO OUTFILE 'C:\\path\\to\\thebackupfile.csv'
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
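
Note that INTO OUTFILE writes the file on the machine running the MySQL server (which is not necessarily your web server), requires the FILE privilege, and fails if the target file already exists. Fired from the same connection as above, it would look roughly like this (the path is only an example):

sql = "SELECT * FROM sometable " & _
      "INTO OUTFILE 'C:\\path\\to\\thebackupfile.csv' " & _
      "FIELDS TERMINATED BY ';' ENCLOSED BY '""' " & _
      "LINES TERMINATED BY '\n'"
conn.Execute sql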
Licensed under: CC-BY-SA with attribution