I do spot a bug, but those results seem strange to me, too. The bug I see is that you are not clearing your csv variable on each iteration, so when writing each new row you would also repeat everything that went before, and the file would grow quadratically rather than linearly.
Update: Reading this again, the only time you write a new line is at the record boundaries, as part of the f.WriteLine() call. So I expect you get the right number of rows, but each row grows longer and longer, with the repeated content appended to the end. Scroll the file to the right, and I bet you'll see the missing items in each row.
This means the code below should fix your issue entirely.
To fix the bug, rather than add code to reset the csv variable I would avoid the buffer string entirely and write everything directly to your stream. I also added code to flush every so often. That matters in a web context: if nothing is ever flushed, the browser may see no output from the server for a long time, which can end in a timeout error. Note that a FileSystemObject TextStream has no Flush method, so the periodic flush has to go to the Response object instead. Given that this is tagged asp-classic, I wonder why you write to a file at all, rather than to the response stream.
Dim fs, f, RS, fField, d, i

Set fs = Server.CreateObject("Scripting.FileSystemObject")
' 8 = ForAppending; True = create the file if it does not exist
Set f = fs.OpenTextFile(Server.MapPath("thebackupfile.csv"), 8, True)

Set RS = conn.Execute("SELECT * FROM sometable")

' Header row: column names separated by ";"
d = ""
For Each fField In RS.Fields
    f.Write(d)
    f.Write(fField.Name)
    d = ";"
Next
f.WriteLine("")

' Data rows: write each field straight to the stream, no buffer string
i = 0
Do While Not RS.EOF
    d = ""
    For Each fField In RS.Fields
        f.Write(d)
        f.Write(RS(fField.Name))
        d = ";"
    Next
    f.WriteLine("")

    RS.MoveNext
    i = i + 1

    ' A TextStream has no Flush method, so flush the Response instead
    ' to keep the browser connection alive during long exports
    If i Mod 50 = 0 Then
        Response.Flush
    End If
Loop

f.Close
Set f = Nothing
Set fs = Nothing
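If you do want the CSV to go straight to the browser instead of a file, a sketch along these lines should work. The filename in the Content-Disposition header is hypothetical; the query and conn object are the same as above. With buffering turned off, IIS sends each write as it happens, so no periodic flush is needed:

<%
' Hypothetical variant: stream the CSV to the browser instead of a file.
Response.Buffer = False
Response.ContentType = "text/csv"
Response.AddHeader "Content-Disposition", "attachment; filename=""export.csv"""

Dim RS, fField, d
Set RS = conn.Execute("SELECT * FROM sometable")

' Header row
d = ""
For Each fField In RS.Fields
    Response.Write d & fField.Name
    d = ";"
Next
Response.Write vbCrLf

' Data rows
Do While Not RS.EOF
    d = ""
    For Each fField In RS.Fields
        Response.Write d & RS(fField.Name)
        d = ";"
    Next
    Response.Write vbCrLf
    RS.MoveNext
Loop
%>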