Hi, Russell.
 Your suggestion is correct and in fact it works very well, but only if the file size is under 10 MB.
 The problem happens right at the moment this line executes:
 InputStream ins = rs.getBinaryStream( 1 );
 I don't know how to overcome this :-(
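As far as I can tell, the JDBC driver reads the whole bytea value into memory before getBinaryStream() even returns, so streaming on the Java side alone may not be enough. One rough, untested idea is to slice the column on the server with substring() so only one chunk is ever in memory at a time. The sketch below assumes the same table and column names, a 256 KB chunk size, and that `outs` is the servlet output stream from Russell's example below:

int CHUNK = 256 * 1024;   // fetch the file 256 KB at a time

// first find out how big the stored file is
PreparedStatement sizePs = con.prepareStatement(
    "SELECT octet_length(contentfile) FROM attachedfiles WHERE filename = ?");
sizePs.setString(1, "UsingJDBC.pdf");
ResultSet sizeRs = sizePs.executeQuery();
int total = sizeRs.next() ? sizeRs.getInt(1) : 0;
sizeRs.close();
sizePs.close();

// then fetch it slice by slice; substring() on bytea is 1-based
PreparedStatement chunkPs = con.prepareStatement(
    "SELECT substring(contentfile from ? for ?) FROM attachedfiles WHERE filename = ?");
for (int offset = 1; offset <= total; offset += CHUNK)
{
    chunkPs.setInt(1, offset);
    chunkPs.setInt(2, CHUNK);
    chunkPs.setString(3, "UsingJDBC.pdf");
    ResultSet chunkRs = chunkPs.executeQuery();
    if (chunkRs.next())
    {
        outs.write(chunkRs.getBytes(1));   // at most CHUNK bytes in memory here
    }
    chunkRs.close();
}
chunkPs.close();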
On 9/29/05, Russell Francis <[email protected]> wrote:
Alfredo Rico wrote:
> Hi friends, greetings :-)
>
> I'm using PostgreSQL v7.4.7 with JDBC driver version 8.0 Build 312 in a Java
> web application.
>
> I have a table named 'attachedfiles' with a column of type bytea used to
> store binary data (PDF, DOC, PNG, JPG files, whatever...).
>
> I have stored binary data as large as 30 MB without trouble.
>
> To retrieve the binary data I'm using the following Java code: 
>
> PreparedStatement ps = this.con.prepareStatement("SELECT contentfile from
> attachedfiles where filename = ? ", ResultSet.TYPE_FORWARD_ONLY,
> ResultSet.CONCUR_READ_ONLY);
> ps.setString (1,"UsingJDBC.pdf");
> byte[] filebinary = null;
> ResultSet rs = ps.executeQuery();
> if (rs != null)
> {
> while (rs.next()) //Only one row
> {
> filebinary = rs.getBytes (1); //Obtain the file...
> }
> rs.close();
> }
> ps.close();
>
>
> Problem:
> If the binary data that I want to retrieve is as large as 12 MB, I obtain a
> java.lang.OutOfMemoryError: Java heap space.
Hi Alfredo,
I am not an expert and am relatively new to this list, but perhaps a
different approach would work rather than using getBytes(), which loads the
whole 12 MB into memory.  Have you tried something like this?
...
int bytes_read = 0;
byte[] buf = new byte[ 8192 ];
// the output stream comes from the servlet response, not the request
ServletOutputStream outs = servletResponse.getOutputStream();
InputStream ins = rs.getBinaryStream( 1 );
// copy the column to the client 8 KB at a time instead of as one big array
while( ( bytes_read = ins.read( buf ) ) != -1 )
{
        outs.write( buf, 0, bytes_read );
}
outs.close();
ins.close();
...
This may be less memory intensive than pulling the whole binary field
into memory at once.
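For context, here is roughly how that loop might sit inside a complete servlet, with the content type set before streaming. This is only a sketch: the class name, the "application/pdf" content type, and the way `con` is obtained are assumptions, and the query is the one from the original post.

import java.io.IOException;
import java.io.InputStream;
import java.sql.*;
import javax.servlet.ServletException;
import javax.servlet.ServletOutputStream;
import javax.servlet.http.*;

public class DownloadServlet extends HttpServlet
{
    private Connection con;   // assume this is opened elsewhere, e.g. in init()

    protected void doGet( HttpServletRequest request, HttpServletResponse response )
            throws ServletException, IOException
    {
        response.setContentType( "application/pdf" );   // adjust per file type
        try
        {
            PreparedStatement ps = con.prepareStatement(
                "SELECT contentfile FROM attachedfiles WHERE filename = ?" );
            ps.setString( 1, "UsingJDBC.pdf" );
            ResultSet rs = ps.executeQuery();
            if( rs.next() )
            {
                InputStream ins = rs.getBinaryStream( 1 );
                ServletOutputStream outs = response.getOutputStream();
                byte[] buf = new byte[ 8192 ];
                int bytes_read;
                // stream the column to the client in 8 KB chunks
                while( ( bytes_read = ins.read( buf ) ) != -1 )
                {
                    outs.write( buf, 0, bytes_read );
                }
                ins.close();
                outs.close();
            }
            rs.close();
            ps.close();
        }
        catch( SQLException e )
        {
            throw new ServletException( e );
        }
    }
}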
Cheers,
Russ