I have a database containing a table loaded with base64-encoded images, and it has proven to be a problem once the database grows too large and we try to export its data.

To reproduce a limit condition, I have reduced the Java heap space to 96 MB while working against a 700 MB database, so that if the export runs fine there, I can be reasonably sure it won't hit the problem again (hopefully, at least).

Also, according to the H2 Advanced documentation:

Storing and Reading Large Objects
If it is possible that the objects don't fit into memory, then the data type CLOB 
(for textual data) or BLOB (for binary data) should be used. 
For these data types, the objects are not fully read into memory, by using streams. 
To store a BLOB, use PreparedStatement.setBinaryStream. 
To store a CLOB, use PreparedStatement.setCharacterStream. 
To read a BLOB, use ResultSet.getBinaryStream, and to read a CLOB, 
use ResultSet.getCharacterStream. When using the client/server mode, 
large BLOB and CLOB data is stored in a temporary file on the client side.
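
As a minimal sketch of what that documentation describes (the table and column names IMAGES, ID and DATA are hypothetical here, not our real schema), storing and reading a CLOB through streams would look roughly like this:

import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.Reader;
import java.io.Writer;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ClobStreamingSketch {

    // Store a CLOB from a Reader so the full value is never held in memory at once.
    static void store(Connection conn, long id, File source) throws Exception {
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO IMAGES (ID, DATA) VALUES (?, ?)");
        Reader in = new FileReader(source);
        try {
            ps.setLong(1, id);
            ps.setCharacterStream(2, in); // streamed by the driver, not read fully into memory
            ps.executeUpdate();
        } finally {
            in.close();
            ps.close();
        }
    }

    // Read a CLOB back through a Reader and copy it to a file in small chunks.
    static void read(Connection conn, long id, File target) throws Exception {
        PreparedStatement ps = conn.prepareStatement(
                "SELECT DATA FROM IMAGES WHERE ID = ?");
        try {
            ps.setLong(1, id);
            ResultSet rs = ps.executeQuery();
            if (rs.next()) {
                Reader in = rs.getCharacterStream(1);
                Writer out = new FileWriter(target);
                char[] buf = new char[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
                out.close();
                in.close();
            }
            rs.close();
        } finally {
            ps.close();
        }
    }
}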

So the images should be accessed as character streams. With that approach, I analyzed the memory report generated once the JVM ran out of memory, and the leak is located in a class named org.h2.result.ResultDiskBuffer$ResultDiskTape.

If, instead of the stream approach, I use JDBC's getString method, I get the memory leak in a class named PageResult.
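
That variant only differs in the inner column loop of the code below; roughly (a reconstruction for illustration, not the exact code that produced the report):

    // getString materializes the whole column value as a single String
    // instead of reading it through a Reader.
    for ( String name : columns ) {
        String res = rs.getString(name);
        if ( res == null ) {
            continue;
        }
        addXMLAttribute(name, res);
    }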

In case anyone is interested in the full code:

/**
 * Exports the data of every table in the database to XML.
 * @param conn the connection to the database being exported
 * @return true if the export completed, false if an exception occurred
 */
private static boolean exportData(Connection conn) {
    PreparedStatement st = null;
    ResultSet rs = null;
    try {
        // Get the table names.
        List<String> tables = getTableNames(conn);

        // For each table...
        for ( String table : tables ) {

            int batch = 0;
            List<String> columns = new ArrayList<String>();
            System.out.println("---------------------------------------------------------");
            System.out.println(" >>>>>>>>>>>>>>> Exporting table " + table);

            // Close previous resources
            if ( st != null ) {
                st.close();
            }

            // Create the prepared statement.
            st = conn.prepareStatement("SELECT * FROM " + table + " LIMIT ? OFFSET ?");

            // Get the columns of the database using its metadata.
            ResultSetMetaData metaData = st.getMetaData();
            for ( int col = 0 ; col < metaData.getColumnCount() ; col++ ) {
                String columnName = metaData.getColumnLabel(col + 1);
                columns.add(columnName);
            }

            // Do while there is data in the table
            do {
                if ( batch % 100 == 0 ) { // Start a new line every 100 progress dots
                    System.out.println();
                }
                System.out.print(".");

                if ( rs != null ) { // Close previous resources
                    rs.close();
                }

                // Clears the batch
                st.clearBatch();

                // Execute query to grab a batch of batch size
                st.setInt(1, BATCH_SIZE);
                st.setInt(2, batch * BATCH_SIZE);
                rs = st.executeQuery();

                // Count will hold the number of rows found in the batch. If 0, no more data in the table, so we can break
                // the do while.
                int count = 0;

                // While data is in the result set...
                while ( rs.next() ) {
                    // Increase counter
                    count ++;

                    startXMLElement(table);
                    // Write to xml the columns of it.
                    for ( String name : columns) {
                        Reader characterStream = rs.getCharacterStream(name);
                        if ( characterStream == null ) {
                            continue;
                        }
                        BufferedReader br = new BufferedReader(characterStream);
                        // Accumulate the column value; readLine() strips the line terminators.
                        StringBuilder res = new StringBuilder();
                        String line = null;
                        while ( (line = br.readLine()) != null ) {
                            res.append(line);
                        }
                        try {
                            br.close();
                            characterStream.close();
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                        addXMLAttribute(name, res.toString());
                    }

                    // Write map to xml
                    endXMLElement(table , columns);
                }

                // No more data in table. Break while.
                if ( count == 0 ) {
                    break;
                }

                // Increase the batch for next iteration if data found.
                batch++;

            } while ( true );

            System.out.println();
            System.out.println(" >>>>>>>>>>>>>>> Done exporting " + table);
            System.out.println("---------------------------------------------------------");
        }
        return true;
    } catch (Exception e) {
        e.printStackTrace();
        return false;
    } finally {
        if ( st != null ) {
            try {
                st.close();
            } catch (SQLException e) {
                e.printStackTrace();
            }
        }

        if ( rs != null ) {
            try {
                rs.close();
            } catch (SQLException e) {
                e.printStackTrace();
            }
        }
    }
}



/**
 * @param conn
 * @return the names of the tables created in the database.
 * @throws SQLException
 */
private static List<String> getTableNames(Connection conn) throws SQLException {
    List<String> tables = new ArrayList<String>();

    tables.add("IMAGECACHE");

    DatabaseMetaData md = conn.getMetaData();
    ResultSet rs = md.getTables(null, null, "%", null);

    while (rs.next()) {
        // Columns 3 and 4 of DatabaseMetaData.getTables are TABLE_NAME and TABLE_TYPE.
        String name = rs.getString("TABLE_NAME");
        String type = rs.getString("TABLE_TYPE");
        if ( "TABLE".equals(type) && ! tables.contains(name) ) {
            tables.add(name);
        }
    }
    rs.close();
    return tables;
}