Re: JDBC + large objects problem

Lists: pgsql-interfaces
From: Patrick Goodwill <goodwill(at)cheese(dot)stanford(dot)edu>
To: pgsql-interfaces(at)postgresql(dot)org
Subject: JDBC + large objects problem
Date: 2000-08-25 21:25:27
Message-ID: Pine.BSF.4.21.0008251424240.1475-100000@cheese.stanford.edu

I'm trying to use JDBC and BLOBs to store large amounts of text in a
database. I get a strange error when I try to use the conventional JDBC
interface... it throws an SQLException of:

"InputStream as parameter not supported"

for the code:

Connection conn = pool.getConnection();
PreparedStatement pstmt = conn.prepareStatement(
    "INSERT INTO t" + book + "_data (author_id, title, text, type)"
    + " VALUES (?, ?, ?, ?)");
pstmt.setInt(1, userId);
pstmt.setString(2, title);
InputStream textStream = stringToStream(text);
pstmt.setBinaryStream(3, textStream, text.length());
pstmt.setInt(4, type);
pstmt.executeUpdate();
pstmt.close();

... with some helper functions....

private InputStream stringToStream(String string) {
    byte[] bytes = string.getBytes();
    return new ByteArrayInputStream(bytes);
}

private String streamToString(InputStream stream) {
    try {
        // available() only reports what can be read without blocking, and a
        // single read() may return fewer bytes than requested, so read in a loop
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        int n;
        while ((n = stream.read(chunk)) != -1) {
            buffer.write(chunk, 0, n);
        }
        return new String(buffer.toByteArray());
    } catch (IOException e) {
        System.out.println("No Stream");
    }
    return null;
}

with an abbreviated schema of....

>> \d t1_data
              Table "t1_data"
 Attribute |  Type   | Modifier
-----------+---------+-------------------------------------------------------
 data_id   | integer | not null default nextval('t1_data_data_id_seq'::text)
 author_id | integer |
 title     | text    |
 text      | oid     |
 type      | integer |
 time      | time    |
Index: t1_data_pkey


... using PostgreSQL 7.0 and the newest JDBC driver from retep.org.uk.

If y'all have any ideas why it does what it does, I just might kiss your
feet. =)

-Patrick.


From: Rob Judd <rjudd(at)mlug(dot)missouri(dot)edu>
To: Patrick Goodwill <goodwill(at)cheese(dot)stanford(dot)edu>
Cc: pgsql-interfaces(at)postgresql(dot)org
Subject: Re: JDBC + large objects problem
Date: 2000-08-25 22:16:10
Message-ID: Pine.LNX.4.21.0008251709380.28163-100000@mlug.missouri.edu


The problem is that the JDBC interface has not yet implemented a lot of
the functionality, so you get these "not supported" Exceptions. On the
bright side, for every problem like this there seems to be enough
functionality to achieve what you need.

In this case it is a little simpler:

Instead of:

InputStream textStream = stringToStream(text);
pstmt.setBinaryStream(3, textStream, text.length());

you can use:

pstmt.setBytes(3, text.getBytes());

and to retrieve you can use ResultSet.getBytes() to return the byte array.
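Put together, the workaround looks roughly like this. This is a sketch, not the driver's documented idiom: the class and method names are illustrative, the table and column names are taken from the original post, and `conn` is assumed to be an open `Connection` to the server.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class BytesWorkaround {

    // The String <-> byte[] conversion that setBytes()/getBytes() relies on,
    // using the platform default encoding as in the original post.
    public static String roundTrip(String s) {
        return new String(s.getBytes());
    }

    // Insert using setBytes() in place of the unsupported setBinaryStream().
    public static void insertText(Connection conn, int userId, String title,
                                  String text, int type) throws SQLException {
        PreparedStatement pstmt = conn.prepareStatement(
            "INSERT INTO t1_data (author_id, title, text, type)"
            + " VALUES (?, ?, ?, ?)");
        pstmt.setInt(1, userId);
        pstmt.setString(2, title);
        pstmt.setBytes(3, text.getBytes());   // whole text as one byte array
        pstmt.setInt(4, type);
        pstmt.executeUpdate();
        pstmt.close();
    }

    // On the way back out, getBytes() returns the stored byte array.
    public static String readText(ResultSet rs) throws SQLException {
        return new String(rs.getBytes("text"));
    }
}
```

Note that this pulls the whole text into memory as a single byte array, which is fine for moderate sizes but loses the incremental behaviour a stream would have given.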

Hope this helps, Rob



From: Joachim Achtzehnter <joachim(at)kraut(dot)bc(dot)ca>
To: Patrick Goodwill <goodwill(at)cheese(dot)stanford(dot)edu>
Cc: pgsql-interfaces(at)postgresql(dot)org
Subject: Re: JDBC + large objects problem
Date: 2000-08-25 22:21:21
Message-ID: Pine.LNX.4.21.0008251514410.14451-100000@penguin.kraut.bc.ca

Today, in a message to pgsql-interfaces, Patrick Goodwill wrote:
>
> "InputStream as parameter not supported"
>
> PreparedStatement pstmt = ...
> ...
> pstmt.setBinaryStream(3, textStream, text.length());

As the error message says, streams are not yet supported for blobs.
Instead, read the contents of the stream into a byte array and use the
setBytes() method of PreparedStatement.
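A minimal sketch of the "read the stream into a byte array" step (`StreamBytes` and `readFully` are illustrative names, not part of the driver):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamBytes {

    // Drain an InputStream into a byte array. available() reports only what
    // can be read without blocking, and read() may return fewer bytes than
    // requested, so a read loop is the reliable approach.
    public static byte[] readFully(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }
}
```

The result then goes straight to the prepared statement, e.g. `pstmt.setBytes(3, StreamBytes.readFully(textStream));`.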

> if ya'll have any ideas why it does what it does, i just might kiss your
> feet. =)

Better yet, you could consult the archives of this mailing list. This
question seems to come up every couple of weeks. You can find an archive
at http://www.geocrawler.com/lists/3/Databases/105/0/.

Joachim

--
work: joachima(at)realtimeint(dot)com (http://www.realtimeint.com)
private: joachim(at)kraut(dot)bc(dot)ca (http://www.kraut.bc.ca)


From: "Peter Mount" <peter(at)retep(dot)org(dot)uk>
To: "Patrick Goodwill" <goodwill(at)cheese(dot)stanford(dot)edu>, <pgsql-interfaces(at)postgresql(dot)org>
Subject: Re: JDBC + large objects problem
Date: 2000-08-26 09:56:16
Message-ID: 00a401c00f46$aa31ed60$2516989e@heather

Streams are not supported yet, hence the message.

There are various ways around this, all of which have been discussed recently
in the list archives.

Peter

--
Peter T Mount peter(at)retep(dot)org(dot)uk, me(at)petermount(dot)com
Homepage: http://www.retep.org.uk Contact details: http://petermount.com
PostgreSQL JDBC: http://www.retep.org.uk/postgres/
Java PDF Generator: http://www.retep.org.uk/pdf/