DBEI - Database Import/Export Utility - Version 7.02

Copyright (c) 2014,2015,2016,2017,2018 Adrian Cornwell. All rights reserved.

DBEI - Introduction

The Free Database Export Import utility (DBEI) has been designed to allow data to be exported and imported between databases; its features are described in the sections below.

DBEI - License

THIS SOFTWARE IS PROVIDED "AS IS" AND ANY EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE DEVELOPERS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

This product also includes PHP software, freely available from http://www.php.net/software

DBEI - Installation

To install or upgrade DBEI, simply unzip the distribution zip file into a folder of your choice; no further installation steps are required. Running the DBEI.bat file in that folder will display the menu.

DBEI - ODBC and Windows 64Bit

If you are running DBEI on a 64-bit Windows machine, you may get the following error when connecting to your databases:
architecture mismatch between the Driver and Application
This is because the application needs to use the 32-bit ODBC drivers and not the 64-bit drivers.
Windows includes an ODBC administration tool for 32-bit drivers (although it is not in the Control Panel); you can access it from Start/Run by entering %SystemRoot%\syswow64\odbcad32.exe. DSNs created here should work fine with DBEI.

DBEI - Uninstall

To uninstall DBEI, all you need to do is delete the DBEI folder.

DBEI - Update History



5.00
* Added Query Generator to Meta Data Module

5.01
* Deltas generated for DB2 no longer include DATA CAPTURE IN clause on index creations.

5.02
* You can now enter multiple watch check items per object
* Minor modifications to the PostgreSQL meta data extract
* You can now use @filename syntax when specifying a DSN for connections.
* Added scoping to Compare Databases Report.

5.03
* Added support for Synonyms in meta data (DB2, Oracle and SQLSERVER)

5.04
* Added table reorg commands to DB2 delta scripts if table structure altered
* Added new Run SQL Script option to main menu

6.00
* The Data Export program has had major changes (see help for that option)
* Improvements to the Data Import Program
* The Run Script option now allows you to override the default statement delimiter.

6.01
* Additional improvements and bug fixes to import process
* Import program now supports Full Validation mode that generates SQL for the import.

6.02
* Added CSV Import option

6.03
* Fix to delta generator to handle Foreign Key Cascade option changes

6.04
* Added Collation support to SQLSERVER Meta Data extract and delta generator

6.05
* Removed embedded linefeeds during CSV Import
* Blank fields in CSV Import files are now treated as NULL rather than a zero length string

6.06
* The Compare meta extracts report now uses synonyms to match table names if a direct match is not found

6.07
* If you leave the output filename blank when generating deltas, changes will be listed on screen, but no delta script will be generated. This can be useful for performing a quick differences check between meta extract sets.
* New report added to meta data section allowing you to check a meta data set for missing indexes (Foreign keys and logical constraints defined without supporting indexes)
* When generating deltas that contain ALTER COLUMN, the script will now include the old column definition as a comment in the script.
* Improvements to POSTGRESQL Delta script generation.

6.08
* When specifying the number of records to export when exporting a database, this will also apply to parent tables as a minimum value.
* Import now aborts if error detected and INCL_INSERT_ERROR_SQL=Y
* Improved resolution of datetime/timestamp fields in export/import programs.

6.09
* Modified import so that it does not remove trailing newline characters.
* Modifications to sqlserver VARBINARY data import.

6.10
* Modified SQLSERVER meta extract to handle mixed collations
* Modified Postgres meta extract to include sequences not assigned to a table.
* When running delete data from database in validation mode, the delete code is written to the log file.
* When running delete data from database, you can also specify whether to include dependent tables automatically.

6.11
* Added support for VARCHAR FOR BIT DATA to DB2 meta data
* Modified import/exports to better handle some Binary data
* Modified export to include logical foreign keys.

6.12
* Added Parameterised Deletion script generation to meta data module
* Meta Data Module Delta generator now detects duplicated Foreign Keys

6.13
* Added ability to restrict object scope when generating meta data delta scripts


6.14
* Added Database sizings report to the meta data module
* Added Meta Data Summary Report to the meta data module


6.15
* Added Generate Normalised insert scripts to main menu
* Added Make a Meta Data Extract Script for distribution to the meta data module
* Added Meta Data Detail Report to Meta Data Module


7.00
* Meta data Module Changes
- View Meta Data files option on the meta data menu now also displays database information from the meta data sets.
- Added LARGE_DEF_RESTRICTION to the configuration file allowing you to restrict the size of large meta data objects (for example a massive stored procedure). This can be useful if you experience memory problems.
- Added FILESEARCH_DEFAULT to the configuration allowing default directories to be defined for the meta data detail report and file searches.
- Oracle meta extract now supports database comments and Materialised Views
- When generating an Oracle Delta, it also creates a unique constraint when creating unique indexes
- When entering the restriction clause in the Delta builder, you can now enter a comma separated list of values and also prefix the first value with ! to exclude this type.
- Added Meta Data Search and Meta Data Quick Search options to meta data menu
- Added File Search option to meta data menu allowing you to perform a search on your source code files.
- Added Option to Generate Script to find a string in a Database.
- WATCH_SCOPE file now supports !string to exclude objects whose name contains this string
* Meta Data Detail Report
- Now includes database comments and materialised view details, and includes Foreign key and constraints information.
- Now includes the ability to generate an SQLite database so that you can perform your own SQL Queries on the meta data.
- Added source code search option which allows source code references to a database object to be included in the detail report.
- Report allows you to enter up to 4 meta data files to include in the report
* Meta Data Summary Report
- Now includes a Table Usages Summary Section.
* Query Generator
- If prefixing comments in watch_check.txt with the # character, the item will be excluded from the Query generator
* Extras folder
- This folder contains useful scripts etc that can be used to enhance your DBEI Usage.
* Query Generator
- This has been removed from the meta data module and has been replaced by the _REL_MODEL_REPORT.txt that can be found in the extras folder


7.01
* When producing deltas, indexes are now created before PKs, and Foreign keys are only created if both tables are in the WATCH_SCOPE.txt file (if used).
* The Oracle Meta extract no longer excludes objects whose name contains the $ character
* Oracle Deltas no longer include Commit statements as DDL commands are committed automatically
* Generating Deltas or Comparing meta data options now allow you to use the WC (Watch Check) Option to restrict objects to those that are in the watch check file.
* New "QD Quick Differences Check" report added to Meta Data Module.
* Added Batch Job option to Meta Data Menu



7.02
* Ordering of Meta Data Scripts modified so that filters used during Meta Data reports and Delta production work better


How does it work?

It uses an implementation of the PHP command line and ODBC Drivers to access the databases.
You do not need to install the product; just unzip the zip file and execute the DBEI.bat file, which will display the menu.
To uninstall the product, just delete the created folder.

Known Issues

When importing data into PostgreSQL, Timestamp fields get truncated. I do not have a work-around for this.


File Locations

The export files are saved in the exports folder where you installed this application.

The Log Files for various export/imports that you have executed are in the logs folder.

Need Support ?

You can contact us if you have any questions, suggestions or feedback at support@cornwell-consulting.com or via our website www.cornwell-consulting.com.

The Main Menu

When you run the DBEI.bat file, the main menu will be displayed.

Exporting a Database

When selecting the export option from the menu you will be prompted to enter the following information.

Data Source Name (DSN) of Database:
Enter the DSN Name of the Database that you wish to export from.
If you have created a file in the DBEI/odbc_direct directory containing a DSN connection string, then you can specify it here by prefixing the filename with @. For example, @mydb.txt will load the connection string from the file DBEI/odbc_direct/mydb.txt.
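As an illustration only (the exact string depends on your ODBC driver, and the server and database names below are placeholders), the file DBEI/odbc_direct/mydb.txt might contain a single connection string line such as:
Driver={SQL Server};Server=MYSERVER;Database=MYDATABASE;Trusted_Connection=Yes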

Username:
Enter the username for connecting to the database. If using Windows authentication then this field can remain blank.

Password:
Enter the Password for connecting to the database. (This will not be prompted for if you left the username field blank.)

Enter the filename for the export file.
example: mydata

Enter Schema to export from (or leave blank):
You can enter the database schema from which to export the tables.
Note that if you are connecting to an SQLSERVER Database, then you can also use database.schema syntax that will switch to the specified database and use the schema entered. For example: MYDATABASE.dbo

Tables:
Enter the Tables to export. You can use one of the following syntax options.

Max Number of Rows to Export per table. Leave blank for all:
You can enter one of:

The Export will now begin. Errors and warnings will be displayed on the screen and a log file will also be created in the php/DBEI/logs directory.

Importing into a Database

When selecting the import option from the menu you will be prompted to enter the following information.

Data Source Name (DSN) of Database:
Enter the DSN Name of the Database that you wish to import into.
If you have created a file in the DBEI/odbc_direct directory containing a DSN connection string, then you can specify it here by prefixing the filename with @. For example, @mydb.txt will load the connection string from the file DBEI/odbc_direct/mydb.txt.

Username:
Enter the username for connecting to the database. If using Windows authentication then this field can remain blank.

Password:
Enter the Password for connecting to the database. (This will not be prompted for if you left the username field blank.)

Enter the export file to import:
example: mydata

Enter Schema to import into (or leave blank):
You can enter the database schema to which you wish to import the data.
Note that if you are connecting to an SQLSERVER Database, then you can also use database.schema syntax that will switch to the specified database and use the schema entered. For example: MYDATABASE.dbo

Tables:
Enter the Tables to import. You can use one of the following syntax options.

Run in Validation Mode Y/N/F :
When running in Validation Mode (Y), no changes will be made to the Target Database. Instead, a check of the foreign keys will be performed and the process will just list any warnings (related tables that are not being imported) and will also list all the tables that will be imported. At this point the process will stop.
If you are happy with the report you can rerun the import without validation to actually perform the import.

Using validation option F performs a full validation. The import process executes but nothing is changed in the target database; instead all the SQL is output to the log file. You can then execute this file to import the data. Note that in this mode trigger disabling is not performed.

Disable Triggers during the import Y/N:
This option is not displayed if you are running the import in validation mode.
If you enter Y (which is the default), any database triggers on the tables being imported will be disabled prior to the import being performed. They will be enabled when the import process is completed.
Enter N to leave the triggers active.

Delete data from the database before importing data Y/N/I:
This option is not displayed if you are running the import in validation mode.
If you enter Y (which is the default), the application will attempt to delete all data from the tables being imported prior to starting the import process.
Enter N to leave the current data, Y to delete the current data (the process will abort if an error is detected), or I to delete the data but continue processing even if an error is detected.

The Import will now begin. Errors and warnings will be displayed on the screen and a log file will also be created in the php/DBEI/logs directory.


Import CSV File into a Database

This option allows you to perform a simple import of a text file into your database. Normally, you would load into a work table and process the data from there into your main tables, as this import does not perform any data conversions or validation checks.

When selecting this option from the menu you will be prompted to enter the following information.

Data Source Name (DSN) of Database:
Enter the DSN Name of the Database that you wish to import into.
If you have created a file in the DBEI/odbc_direct directory containing a DSN connection string, then you can specify it here by prefixing the filename with @. For example, @mydb.txt will load the connection string from the file DBEI/odbc_direct/mydb.txt.

Username:
Enter the username for connecting to the database. If using Windows authentication then this field can remain blank.

Password:
Enter the Password for connecting to the database. (This will not be prompted for if you left the username field blank.)

Enter the CSV file to import:
example: c:\temp\mydata.txt

Enter CSV File Delimiter:
Enter the field delimiter used in your file. Leave blank for ; or enter TAB for tab-delimited files.

Enter Import Tablename:
Enter the name of the table in the database to import into.

The import will now start. Note that the program expects the first record to contain the column names for the insert table.
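
As an illustration, assuming a hypothetical import table with columns ID, NAME and AMOUNT and the default ; delimiter, the CSV file might look like this:
ID;NAME;AMOUNT
1;Smith;100.50
2;Jones;200.00
The first line supplies the column names used to build the inserts; each following line supplies one row of data.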

Delete Data from a Database

When selecting the Delete Data option from the menu you will be prompted to enter the following information.

Data Source Name (DSN) of Database:
Enter the DSN Name of the Database that you wish to delete from.
If you have created a file in the DBEI/odbc_direct directory containing a DSN connection string, then you can specify it here by prefixing the filename with @. For example, @mydb.txt will load the connection string from the file DBEI/odbc_direct/mydb.txt.

Username:
Enter the username for connecting to the database. If using Windows authentication then this field can remain blank.

Password:
Enter the Password for connecting to the database. (This will not be prompted for if you left the username field blank.)

Enter Schema to delete from (or leave blank):
You can enter the database schema from which you wish to delete data.
Note that if you are connecting to an SQLSERVER Database, then you can also use database.schema syntax that will switch to the specified database and use the schema entered. For example: MYDATABASE.dbo

Tables:
Enter the Tables to delete from. You can use one of the following syntax options. Note that tables that are dependent upon these tables will automatically be included in the deletion list by this utility, which will also determine the order in which the deletions need to be performed.
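
As a hypothetical example, if a table ORDERS has a foreign key referencing CUSTOMERS and you ask to delete from CUSTOMERS, then ORDERS will be added to the deletion list automatically and the deletions will be ordered so that the child rows are removed first, along the lines of:
DELETE FROM ORDERS
DELETE FROM CUSTOMERS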

Run in Validation Mode Y/N :
When running in Validation Mode, no changes will be made to the Database. The log file will contain the SQL to perform the deletions.

Disable Triggers during the deletion Y/N:
If you enter Y (which is the default), any database triggers on the tables being deleted from will be disabled prior to the delete being performed. They will be enabled when the process is completed.
Enter N to leave the triggers active.

Automatically add Dependant Tables Y/N:
If you enter Y (which is the default), any tables dependent on the tables being deleted from will be automatically added to the deletion list. Enter N to turn this behaviour off.


Run an SQL Script

When selecting the Run an SQL Script option from the menu you will be prompted to enter the following information.

Data Source Name (DSN) of Database:
Enter the DSN Name of the Database that you wish to run the script against.
If you have created a file in the DBEI/odbc_direct directory containing a DSN connection string, then you can specify it here by prefixing the filename with @. For example, @mydb.txt will load the connection string from the file DBEI/odbc_direct/mydb.txt.

Username:
Enter the username for connecting to the database. If using Windows authentication then this field can remain blank.

Password:
Enter the Password for connecting to the database. (This will not be prompted for if you left the username field blank.)

Enter full filename of the SQL File:
Enter the full path\filename of the SQL file you wish to execute.

Stop on Errors Y/N:
Enter Y to stop executing the script if an error is detected, or N to continue.

Override Command Delimiter (Leave blank for Database default):
By default the program uses the default delimiter for each database (/ for Oracle and DB2, ; for Postgres and GO for SQL Server). If you wish to use another delimiter character, then you can enter it here.

A database connection will be made and the script will be executed. When completed, the log file will be opened automatically.
Note that if your script contains SELECT statements, the output will be included in the log file in tab-delimited format.
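
As an illustration, assuming a SQL Server database using the default GO delimiter (the table name below is just a placeholder), a script file might look like this:
UPDATE MYTABLE SET STATUS = 'CLOSED' WHERE STATUS = 'OPEN'
GO
SELECT STATUS, COUNT(*) FROM MYTABLE GROUP BY STATUS
GO
The output of the SELECT statement would appear in the log file in tab-delimited format.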

Generate a Normalised Insert Script

When selecting the Generate a Normalised Insert Script option from the menu you will be prompted to enter the following information.

Data Source Name (DSN) of Database:
Enter the DSN Name of the Database that you wish to generate the inserts from.
If you have created a file in the DBEI/odbc_direct directory containing a DSN connection string, then you can specify it here by prefixing the filename with @. For example, @mydb.txt will load the connection string from the file DBEI/odbc_direct/mydb.txt.

Username:
Enter the username for connecting to the database. If using Windows authentication then this field can remain blank.

Password:
Enter the Password for connecting to the database. (This will not be prompted for if you left the username field blank.)

Enter Schema to use (or leave blank):
Enter the name of the schema containing the table, or leave blank for the default.

Enter Table to provide inserts from:
The insert statements will be produced using data from this table.

Enter full filename of the SQL File:
Enter the full path\filename of the SQL file to be generated.

Enter Database Type for Output:
Enter the database type for the output or leave blank if the same as the source database. The valid types are SQLSERVER, ORACLE, POSTGRESQL, DB2

Max Number of rows to export:
Leave blank, or enter the number of rows to generate insert scripts for.

A database connection will be made and the script will be produced and opened automatically.

Config File

The config.txt file is in the directory in which you installed DBEI. If you delete this file and run the Export or Import options, the file will automatically be created with the default values.

The parameters in this file are listed below.

AUTOTRUNCATE=Y
If this value is Y then, when importing into a column whose type is a string, the utility will check that the length of the string being imported does not exceed that of the field definition in the target database. If it does, the string will be truncated to fit the import field. When this happens a warning will be added to the import log file.
To disable this processing, set the flag to N.

INCL_INSERT_ERROR_SQL=N
When performing an import, error messages that occur when attempting to insert a record are written to the log file.
If you set this flag to Y, then the SQL used for the insert (with data values added) will be included also. This can make the log file very large, but can be useful in identifying inserted data issues.
If an error is detected and this flag is set to Y then the import process will terminate after the table with the errors has been completely processed.

MAX_BUFFERED_ERRORS=500
By default, only the first 500 errors/warnings (per table) are added to the log file during an import. You can change the value here if required. (0 will include all errors).
Note that errors that occur when attempting to insert a record to the target database do not count here if the INCL_INSERT_ERROR_SQL flag is set to Y.

COMMIT_COUNT=5000
This parameter simply specifies how many records can be processed before a database Commit is performed.

MAX_ROWS_PER_CURSOR=5000
This parameter specifies the number of records to be processed in a single cursor during the export process. If the table contains more records then the cursor will be reopened for the next data set. Normally, you will not need to change this parameter.

SAMPLE_ROWS=100
When performing an export and using the -1 parameter for the number of rows to export, the program will look at the foreign keys in the database and use this information to extract record counts from the related tables in order to find the minimum number of rows that need to be extracted for a stable database. For tables without Foreign key dependencies, the number of rows specified here will be exported.

ORACLE_BLANK_TO_SPACE=Y
When inserting a zero length string ("") in Oracle, the database automatically converts this to NULL which can cause errors when inserting into a column defined as NOT NULL. Setting this Flag to Y will result in zero length strings being converted to a single space prior to being inserted into the database.
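
As a minimal illustration, assuming a hypothetical table T with a NOT NULL column C:
INSERT INTO T (C) VALUES ('')
Oracle treats '' as NULL, so this insert would fail against the NOT NULL column. With ORACLE_BLANK_TO_SPACE=Y the utility inserts ' ' (a single space) instead, so the row is accepted.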

SKIP_COLLATION=N
When generating a delta using the Meta Extract module, this option allows you to turn off collation checks. Currently these only have an impact for SQLSERVER databases.

ORACLE_DROP=N
When generating an Oracle Delta script using the Meta Extract module and skipping dependent objects, triggers, procedures, functions, views etc. that are being updated are not dropped, as it is assumed that the change is done with a CREATE OR REPLACE command. If you wish to force these objects to be dropped first then you can do so by setting this flag to Y.

LARGE_DEF_RESTRICTION=0
If you have an extremely large database with a lot of stored procedures, packages etc. and you are getting memory issues when processing meta data, then you can use this parameter to define the maximum length of data to be loaded into the DBEI processor.
For example: LARGE_DEF_RESTRICTION=1000 would restrict these definitions to 1000 characters.

FILESEARCH_DEFAULT=
Here you can enter a default file mask. The value here will be used if you enter DEFAULT in the file search option on the meta data detail and file search reports.
Sub-directories will be included in the search, so a mask of c:\myproject\*.sql will search all .sql files in the myproject folder and all sub-folders.
You can also enter a comma-separated list of masks here. (eg: c:\myproject\*.sql,c:\myproject\*.java)
Additionally, you can add your own defaults by adding a suffix to the end of the parameter name; for example, FILESEARCH_DEFAULT_SQL= could be referenced by entering DEFAULT_SQL when performing source searches.
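
For example, the configuration file might contain entries such as the following (the paths are placeholders):
FILESEARCH_DEFAULT=c:\myproject\*.sql,c:\myproject\*.java
FILESEARCH_DEFAULT_SQL=c:\myproject\*.sql
Entering DEFAULT in a file search prompt would then use the first list of masks, and entering DEFAULT_SQL would use the second.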

Logical Constraints

When performing data exports, imports into a database or deletions from a database, DBEI automatically checks the foreign keys defined in the database to determine the insert/delete order required for the tables and also to identify dependencies.

You can also add logical constraints (Foreign keys that do not actually exist in the database) by creating a file called logical_constraints.dat in the metadata directory.

This file (plain text) can contain logical constraints in the format fromtable;totable;condition. The condition field is only used by the query generator. If not entered, then this relationship will not be used by the query generator.

Example:
TABLE1;TABLE2;TABLE1.CUST_ID = TABLE2.ID
TABLE1;TABLE3

The above example defines 2 logical constraints, TABLE1 to TABLE2 and TABLE1 to TABLE3. The second relationship will not be used by the query generator as no join condition has been specified.

SQLSERVER Identity Columns

The DBEI Utility handles these by creating a table called __DBEI with the same layout as the target table and then creating an INSTEAD OF trigger on it to populate the target table with IDENTITY_INSERT turned on. When the table is loaded, the __DBEI table is dropped.
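
As a rough sketch of the kind of objects involved (assuming a hypothetical target table MYTABLE with an identity column ID and a column NAME; the objects DBEI actually generates may differ in detail):
CREATE TABLE __DBEI (ID INT, NAME VARCHAR(50))
GO
CREATE TRIGGER __DBEI_TRG ON __DBEI INSTEAD OF INSERT
AS
BEGIN
  SET IDENTITY_INSERT MYTABLE ON;
  INSERT INTO MYTABLE (ID, NAME) SELECT ID, NAME FROM inserted;
  SET IDENTITY_INSERT MYTABLE OFF;
END
GO
-- rows are then inserted into __DBEI by the import
DROP TABLE __DBEI
GO
The import inserts into __DBEI, the trigger forwards each row into MYTABLE with IDENTITY_INSERT enabled, and the __DBEI table (and its trigger) is dropped once the load is complete.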

Feel free to drop this table if it remains in your database (which can happen if the import fails).


Setting up Watch Lists

In the folder metadata you can create a text file called WATCH_CHECK.txt.
This file can contain a list of objects (one per line); a comment can also be added using a semi-colon delimiter. When any of the DBEI programs are run, objects in this list will be processed as normal, but a "You need to check object" message will be displayed warning you about the check to be made.
The 2 examples below show a table added to this list with and without a comment.
MYTABLE1
MYTABLE2;this text will be displayed

If you prefix the comment text with the # character, then this item will be excluded when using the query generator.
For example:
MYTABLE1
MYTABLE2;#do not use this table