When doing integration testing or fixing a bug in a piece of Java code that uses Oracle as a database, being able to do quick exports and imports of your schema can be a big help. Sometimes just calling Oracle's imp/exp command-line tools from your code is enough, but I was looking for something a bit more portable and less demanding on my local development machine. I found that Oracle's Data Pump functionality can be called from stored procedures, which in turn can be called from a normal JDBC statement.

Combining Oracle's documentation with questions like this one on StackOverflow, I've come up with a solution that allows me to call Java methods to import and export a database schema to a dump file, like so:

```java
importSchema("myschema", "myschema.dmp");
exportSchema("myschema", "myschema.dmp");
```

These methods use regular JDBC calls and are implemented as follows:

```java
public static final String EXPORT_DIR = "/tmp";

public void exportSchema(String schema, String fileName) throws SQLException, IOException {
    String sql = getFileContents("export.sql");
    try (Connection conn = dataSource.getConnection();
         PreparedStatement pStmt = conn.prepareStatement(sql)) {
        pStmt.setString(1, schema.toUpperCase());
        pStmt.setString(2, EXPORT_DIR);
        pStmt.setString(3, fileName);
        pStmt.execute();
    }
}

public void importSchema(String schema, String fileName) throws IOException, SQLException {
    String sql = getFileContents("import.sql");
    try (Connection conn = dataSource.getConnection();
         PreparedStatement pStmt = conn.prepareStatement(sql)) {
        pStmt.setString(1, schema.toUpperCase());
        pStmt.setString(2, EXPORT_DIR);
        pStmt.setString(3, fileName);
        pStmt.execute();
    }
}

private String getFileContents(String fileName) throws IOException {
    try (InputStream in = this.getClass().getResourceAsStream(fileName)) {
        return IOUtils.toString(in, "UTF-8");
    }
}
```

(Note that `import` is a reserved word in Java, so the method is called `importSchema`; try-with-resources makes sure the connections and statements are closed.)

As you can see in the code, I create prepared statements from two SQL files which are located in the same Java package. "export.sql" contains the following code:

```sql
DECLARE
  schemaname        VARCHAR2(200) := ?;
  directoryname     VARCHAR2(200) := ?;
  dumpfilename      VARCHAR2(200) := ?;
  directoryvariable VARCHAR2(100) := 'EXPORT_DIR_' || schemaname;
  handle            NUMBER;
  status            VARCHAR2(20);
BEGIN
  EXECUTE IMMEDIATE 'CREATE OR REPLACE DIRECTORY ' || directoryvariable
                    || ' AS ''' || directoryname || '''';
  handle := DBMS_DATAPUMP.OPEN(
    operation => 'EXPORT',
    job_mode  => 'SCHEMA',
    job_name  => 'datapump export schema ' || schemaname);
  DBMS_DATAPUMP.ADD_FILE(
    handle    => handle,
    filename  => dumpfilename,
    directory => directoryvariable,
    reusefile => 1);
  DBMS_DATAPUMP.ADD_FILE(
    handle    => handle,
    filename  => dumpfilename || '.export.log',
    directory => directoryvariable,
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE,
    reusefile => 1);
  DBMS_DATAPUMP.METADATA_FILTER(
    handle => handle,
    name   => 'SCHEMA_EXPR',
    value  => 'IN (''' || schemaname || ''')');
  DBMS_DATAPUMP.START_JOB(handle);
  DBMS_DATAPUMP.WAIT_FOR_JOB(handle, status);
  EXECUTE IMMEDIATE 'DROP DIRECTORY ' || directoryvariable;
END;
```

The “import.sql” looks very similar, containing the following code:

```sql
DECLARE
  schemaname        VARCHAR2(200) := ?;
  directoryname     VARCHAR2(200) := ?;
  dumpfilename      VARCHAR2(200) := ?;
  directoryvariable VARCHAR2(100) := 'EXPORT_DIR_' || schemaname;
  handle            NUMBER;
  status            VARCHAR2(20);
BEGIN
  EXECUTE IMMEDIATE 'CREATE OR REPLACE DIRECTORY ' || directoryvariable
                    || ' AS ''' || directoryname || '''';
  handle := DBMS_DATAPUMP.OPEN(
    operation => 'IMPORT',
    job_mode  => 'SCHEMA',
    job_name  => 'datapump import schema ' || schemaname);
  DBMS_DATAPUMP.ADD_FILE(
    handle    => handle,
    filename  => dumpfilename,
    directory => directoryvariable);
  DBMS_DATAPUMP.ADD_FILE(
    handle    => handle,
    filename  => dumpfilename || '.import.log',
    directory => directoryvariable,
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE,
    reusefile => 1);
  DBMS_DATAPUMP.SET_PARAMETER(
    handle => handle,
    name   => 'table_exists_action',
    value  => 'REPLACE');
  DBMS_DATAPUMP.METADATA_FILTER(
    handle => handle,
    name   => 'SCHEMA_EXPR',
    value  => 'IN (''' || schemaname || ''')');
  DBMS_DATAPUMP.START_JOB(handle);
  DBMS_DATAPUMP.WAIT_FOR_JOB(handle, status);
  EXECUTE IMMEDIATE 'DROP DIRECTORY ' || directoryvariable;
END;
```
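One caveat: both scripts splice the schema name straight into EXECUTE IMMEDIATE and METADATA_FILTER, so the value should never come from untrusted input. A small guard on the Java side could reject anything that is not a plain Oracle identifier. The SchemaNameValidator below is a hypothetical addition (not part of the original code), a minimal sketch of such a check:

```java
import java.util.regex.Pattern;

public final class SchemaNameValidator {

    // Unquoted Oracle identifiers start with a letter and may contain
    // letters, digits, _, $ and #; at most 30 characters on pre-12.2
    // databases. Anything else is rejected before it reaches the script.
    private static final Pattern VALID =
            Pattern.compile("[A-Za-z][A-Za-z0-9_$#]{0,29}");

    private SchemaNameValidator() {}

    public static boolean isValidSchemaName(String schema) {
        return schema != null && VALID.matcher(schema).matches();
    }
}
```

Calling this before handing the schema name to the export/import methods turns a would-be PL/SQL injection into a plain Java error.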

I recommend reading the Oracle Datapump documentation (albeit dangerously enterprisey) to see what exactly happens when these scripts run. For instance, dumpfiles are reused (overwritten) without notice, and logfiles are created next to the dumpfile.
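If your database runs locally, or you can otherwise reach its filesystem, picking up that log file from Java is straightforward. The DatapumpLogs helper below is a hypothetical addition, assuming the log sits next to the dump file as described above:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Collections;
import java.util.List;

public final class DatapumpLogs {

    private DatapumpLogs() {}

    // Resolves the path of the Data Pump log written next to the dump
    // file, e.g. /tmp/myschema.dmp.export.log for an export.
    public static Path logPathFor(String exportDir, String dumpFileName, String suffix) {
        return Paths.get(exportDir, dumpFileName + suffix);
    }

    // Returns the log lines, or an empty list if the job has not
    // produced a log (yet).
    public static List<String> readLogIfPresent(Path log) {
        try {
            return Files.exists(log) ? Files.readAllLines(log) : Collections.emptyList();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

Dumping those lines into your test output can save a round trip to the server when a job fails.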

In this example, I write the dump files to the "/tmp" directory on the database server. You can change this to any directory you like, but please make sure that the directory exists, and that the oracle user or orainst group has write access to it. A good way to start is to first make it work with the /tmp directory, as in this example. If Oracle cannot (over)write a file for any reason, you will get incomprehensible error messages about incorrect parameters in the DBMS_DATAPUMP.ADD_FILE() calls.
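Note also that the scripts derive the directory object's name as 'EXPORT_DIR_' || schemaname. On databases before 12.2, identifiers are capped at 30 bytes, so a very long schema name would make the CREATE OR REPLACE DIRECTORY statement fail. A hypothetical helper (not in the original code) could compute and truncate the name up front, so the Java side knows exactly which directory object a job will use:

```java
public final class DirectoryNames {

    // Pre-12.2 Oracle limits identifiers to 30 bytes; the scripts build
    // the directory name as 'EXPORT_DIR_' || schemaname, which can
    // exceed that limit for long schema names.
    private static final int MAX_IDENTIFIER_LENGTH = 30;
    private static final String PREFIX = "EXPORT_DIR_";

    private DirectoryNames() {}

    public static String directoryNameFor(String schema) {
        String name = PREFIX + schema.toUpperCase();
        return name.length() <= MAX_IDENTIFIER_LENGTH
                ? name
                : name.substring(0, MAX_IDENTIFIER_LENGTH);
    }
}
```

If you adopt something like this, the same truncation would of course have to be applied inside the PL/SQL scripts as well, so both sides agree on the name.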

Enjoy your Oracle-free development environment!
