Populating 'long' data column with Ruby OCI8 (trying again)

Hello,

Is there any way to insert a big chunk of data (say, 100K) into a column
of type long with Ruby OCI8, or by any other Ruby means? I saw that to do
it in C you must be prepared to handle the OCI error code that invites you
to insert the next piece. But how do you do it in Ruby, especially with
OCI8? If I do

require 'oci8'

db = OCI8.new 'system', 'manager'
db.exec("create table sample (id number, data long)")

data = 'A' * 1024 * 100

c = db.parse('insert into sample values (2, :data)')
c.bind_param(':data', data)

c.exec
db.commit

What I end up with in column ‘data’ is host dependent (or db block size
dependent?). I observed 14464 bytes on a 2K-block database on Solaris, and
34652 bytes on an 8K-block database on Linux.

UPDATE: It is not possible to read (with OCI8) columns of type long if
they contain large data chunks (100K). Reported error is:
`fetch': ORA-01406: fetched column value was truncated

Thank you,
Gennady B…

On 1/17/06, Gennady B. [email protected] wrote:

Is there any way to insert a big chunk of data (say, 100K) into
a column of type long with Ruby OCI8 or by any other Ruby means?

You need to do it in chunks. See documentation for OCI8::BLOB#write().
Here’s an example that works for me loading data from a file. You’ll
need to adapt to load from a variable.

require 'oci8'

conn = OCI8.new(user, passwd, sid)

# test.txt is a file that's > 100K
name = "test.txt"

# create the row with an empty blob (id included so the primary key is satisfied)
cursor = conn.parse("INSERT INTO nkrb_test (id, name, data) VALUES (1, :name, EMPTY_BLOB())")
cursor.exec(name)

# now load blob column with file contents
conn.exec("SELECT name, data FROM nkrb_test") do |name, data|
  chunk_size = data.chunk_size
  File.open(name, 'r') do |f|
    until f.eof?
      data.write(f.read(chunk_size))
    end
    data.size = f.pos
  end
end
conn.commit

Schema for nkrb_test is:

create table nkrb_test (
  id   INT,
  name VARCHAR(255),
  data BLOB,
  CONSTRAINT nkrb_test_pk PRIMARY KEY (id)
);
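
To adapt it to load from a variable instead of a file, the same chunked
write should work. Here's an untested sketch, using StringIO in place of
the file; the id value and row name are made up for the example:

require 'oci8'
require 'stringio'

conn = OCI8.new(user, passwd, sid)
payload = 'A' * 1024 * 100   # the in-memory data to store

# create the row with an empty blob (id and name values are arbitrary here)
cursor = conn.parse("INSERT INTO nkrb_test (id, name, data) VALUES (2, :name, EMPTY_BLOB())")
cursor.exec('in_memory')

# fetch the LOB locator back and push the string through it in chunks
conn.exec("SELECT name, data FROM nkrb_test WHERE id = 2") do |name, data|
  chunk_size = data.chunk_size
  io = StringIO.new(payload)
  until io.eof?
    data.write(io.read(chunk_size))
  end
  data.size = payload.length
end
conn.commit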

UPDATE: It is not possible to read (with OCI8) columns of type long if
they contain large data chunks (100K). Reported error is:
`fetch': ORA-01406: fetched column value was truncated

Try calling OCI8::BLOB#truncate() with a length longer than the longest
data you expect to see in a row, or use OCI8::BLOB#read() to read row data
in chunks – again, see the docs for an example.
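
In outline, the chunked read would be something like this (untested,
against the same nkrb_test table as above):

conn.exec("SELECT name, data FROM nkrb_test") do |name, data|
  chunk_size = data.chunk_size
  contents = ''
  # OCI8::BLOB#read returns nil once the end of the LOB is reached
  while chunk = data.read(chunk_size)
    contents << chunk
  end
  puts "#{name}: #{contents.length} bytes"
end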

If you haven’t figured it out already, LOBs are a real pain to deal
with.

“Gennady B.” [email protected] writes:

db = OCI8.new 'system', 'manager'
What I end up with in column ‘data’ is host dependent (or db block size
dependent?). I observed 14464 bytes on 2K database on Solaris, and 34652
bytes on 8K database on Linux.

I don’t know why. It may need piecewise inserts to insert long values.

I’ve not tested the following patch. It may work, but may not…
--- oci8.rb.bak	2005-11-13 16:21:33.000000000 +0900
+++ oci8.rb	2006-01-19 03:07:00.000000000 +0900
@@ -264,6 +264,14 @@
       end
     end
 
+    # get/set LONG
+    LONG = Object.new
+    class << LONG
+      def fix_type(env, val, length, precision, scale)
+        [OCI8::SQLT_LNG, val, length || (val.nil? ? nil : val.length)]
+      end
+    end
+
     # get/set RAW
     RAW = Object.new
     class << RAW
@@ -867,7 +875,7 @@
     #  datatype     type       size prec scale
     #  -------------------------------------------------
     #  LONG         SQLT_LNG      0    0     0
-    BindType::Mapping[OCI8::SQLT_LNG] = BindType::String
+    BindType::Mapping[OCI8::SQLT_LNG] = BindType::LONG
 
     #  datatype     type       size prec scale
     #  -------------------------------------------------

UPDATE: It is not possible to read (with OCI8) columns of type long if
they contain large data chunks (100K). Reported error is:
`fetch': ORA-01406: fetched column value was truncated

How about editing oci8.rb at line 693 as follows:

from:
when SQLT_LNG, SQLT_LBI
# TODO: use OCI_DYNAMIC_FETCH
datasize = 65535
end
to:
when SQLT_LNG, SQLT_LBI
# TODO: use OCI_DYNAMIC_FETCH
datasize = 100 * 1024
end
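
With the bigger fetch buffer, reading the LONG column from the sample
table in the first message should stop raising ORA-01406 for values up to
100K. Untested, but something like:

cursor = db.exec('select data from sample where id = 2')
row = cursor.fetch
puts row[0].length   # up to 100 * 1024 bytes now fit in the fetch buffer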