If you don’t know how to bulk import data into SQL Server with the
BULK INSERT statement, you can stop reading now.
Rails 1.1.6
SQL Server Adapter (ODBC mode)
I have a situation where I am performing a BULK INSERT followed by a
DELETE statement.
I’m using the same connection from the same model object to do both
actions.
It seems that if I perform a BULK INSERT where the # of rows I’m
inserting is greater than the batch size specified for the BULK INSERT,
then my DELETE fails with the dreaded ODBC error: “Connection is busy
with results for another hstmt”.
However, if the # of rows is smaller than the BULK INSERT batch size,
the DELETE goes through fine.
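In rough code, here is the sequence (this is a sketch, not my actual code: the table name, file path, and terminators are placeholders, and `conn` stands in for the ActiveRecord connection, e.g. `MyModel.connection`):

```ruby
# Sketch of the failing sequence: a BULK INSERT followed immediately
# by a DELETE on the same connection.
def bulk_insert_then_delete(conn, batch_size)
  conn.execute(<<-SQL)
    BULK INSERT widgets_staging
    FROM 'C:\\data\\widgets.txt'
    WITH (FIELDTERMINATOR = '\\t', ROWTERMINATOR = '\\n',
          BATCHSIZE = #{batch_size})
  SQL
  # This DELETE raises "Connection is busy with results for another
  # hstmt" whenever batch_size <= the number of rows in the file.
  conn.execute("DELETE FROM widgets_staging WHERE processed = 1")
end
```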
Does anyone have any insight into this? I don’t expect that anyone will
have any experience with this, but thought I might try for a long shot.
If I set the BATCHSIZE for the BULK INSERT to just (# of rows to be
inserted) + 1, then I can immediately turn around and do the DELETE with
no problem.
But if BATCHSIZE <= (# of rows to be inserted), then the DELETE fails,
saying the connection is busy with the results of another statement.
That statement appears to be: “SELECT @@ROWCOUNT AS AffectedRows”
I’ve read the relevant code in the SQL Server adapter, and that query is
only supposed to run for regular INSERTs and DELETEs. I cannot
figure out how “SELECT @@ROWCOUNT AS AffectedRows” is being triggered
by my BULK INSERT. My best guess is that the BULK INSERT gets
turned into multiple regular INSERT statements, which then behave like
normal INSERTs?
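This is only a guess at the adapter’s logic, but presumably the rowcount query gets appended based on the statement’s leading verb, something like:

```ruby
# Rough sketch (NOT the adapter's actual source) of a leading-verb
# check that would decide when to tack on
# "SELECT @@ROWCOUNT AS AffectedRows".
ROWCOUNT_VERBS = /\A\s*(INSERT|UPDATE|DELETE)\b/i

def appends_rowcount_query?(sql)
  !!(sql =~ ROWCOUNT_VERBS)
end

appends_rowcount_query?("INSERT INTO widgets VALUES (1)")   # matches
appends_rowcount_query?("BULK INSERT widgets FROM 'f.txt'") # no match:
                                                            # leads with BULK
```

If the adapter really does key off the verb like this, a BULK INSERT shouldn’t trigger the rowcount query at all, which is why multiple regular INSERTs is my best guess.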
Anyhow, my workaround is to always dynamically set the BATCHSIZE for the
BULK INSERT to (# of rows to be inserted) + 1, but of course,
this defeats the memory-saving purpose of the BATCHSIZE parameter in the
first place.
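The workaround boils down to a helper like this (a minimal sketch; table and file names are whatever your app uses):

```ruby
# Build the BULK INSERT so the whole file lands in a single batch:
# BATCHSIZE = row count + 1 means the DELETE afterwards succeeds,
# at the cost of buffering the entire file as one batch.
def bulk_insert_sql(table, file, row_count)
  batch_size = row_count + 1
  "BULK INSERT #{table} FROM '#{file}' WITH (BATCHSIZE = #{batch_size})"
end
```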
Maybe another patch for the ol’ SQL Server adapter (I have 3 still to
submit)?
Thanks,
Wes