Inserts data from an external file with a supported file format into a table, hierarchy, or view. A faster alternative is sqluload - Load; however, the load utility does not support loading data at the hierarchy level.
Authorization
Required Connection
Database. If implicit connect is enabled, a connection to the default database is established.
Version
sqlutil.h
C API Syntax
```c
/* File: sqlutil.h */
/* API: Import */
/* ... */
SQL_API_RC SQL_API_FN
  sqluimpr (
        char * pDataFileName,
        sqlu_media_list * pLobPathList,
        struct sqldcol * pDataDescriptor,
        struct sqlchar * pActionString,
        char * pFileType,
        struct sqlchar * pFileTypeMod,
        char * pMsgFileName,
        short CallerAction,
        struct sqluimpt_in * pImportInfoIn,
        struct sqluimpt_out * pImportInfoOut,
        long * pNullIndicators,
        void * pReserved,
        struct sqlca * pSqlca);
/* ... */
```
Generic API Syntax
```c
/* File: sqlutil.h */
/* API: Import */
/* ... */
SQL_API_RC SQL_API_FN
  sqlgimpr (
        unsigned short DataFileNameLen,
        unsigned short FileTypeLen,
        unsigned short MsgFileNameLen,
        char * pDataFileName,
        sqlu_media_list * pLobPathList,
        struct sqldcol * pDataDescriptor,
        struct sqlchar * pActionString,
        char * pFileType,
        struct sqlchar * pFileTypeMod,
        char * pMsgFileName,
        short CallerAction,
        struct sqluimpt_in * pImportInfoIn,
        struct sqluimpt_out * pImportInfoOut,
        long * NullIndicators,
        void * pReserved,
        struct sqlca * pSqlca);
/* ... */
```
API Parameters
For more information, see SQLU-MEDIA-LIST.
A location pair with both locations equal to zero indicates that a nullable column is to be filled with NULLs.
For more information, see SQLDCOL.
The character array is of the form:
```
{INSERT | INSERT_UPDATE | REPLACE | CREATE | REPLACE_CREATE} INTO
   {tname[(tcolumn-list)] |
    [{ALL TABLES | (tname[(tcolumn-list)][, tname[(tcolumn-list)]])}]
    [IN] HIERARCHY {STARTING tname | (tname[, tname])}
    [UNDER sub-table-name | AS ROOT TABLE]}
[DATALINK SPECIFICATION datalink-spec]
```
Attention: If an error occurs after the existing data is deleted, that data is lost.
The tname and the tcolumn-list parameters correspond to the tablename and the colname lists of SQL INSERT statements, and have the same restrictions.
The columns in tcolumn-list and the external columns (either specified or implied) are matched according to their position in the list or the structure (data from the first column specified in the sqldcol structure is inserted into the table or view field corresponding to the first element of the tcolumn-list).
If unequal numbers of columns are specified, the number of columns actually processed is the lesser of the two numbers. This could result in an error (because there are no values to place in some non-nullable table fields) or an informational message (because some external file columns are ignored).
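For illustration, one valid form of the character array is shown below; the table ORG and its columns are hypothetical and stand in for any existing target table or view.

```c
/* Hypothetical example only: ORG and its columns stand in for any existing
   target table. INSERT_UPDATE adds new rows and updates rows whose primary
   key values already exist in the table. */
char action[] = "INSERT_UPDATE INTO org (deptnumb, deptname, manager)";
```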
For more information, see SQLCHAR.
For more information about file formats, see the "Export/Import/Load Utility File Formats" appendix in the Data Movement Utilities Guide and Reference.
Not all options can be used with all of the supported file types.
For more information, see SQLCHAR, and the Command Reference.
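As a hedged sketch (not taken from this reference or the shipped samples), a modifier string from Table 8 can be packed into the sqlchar structure expected by pFileTypeMod in the same length-plus-data form as the action string; the helper name and the particular modifiers chosen here are illustrative only.

```c
#include <stdlib.h>
#include <string.h>
#include <sqlutil.h>

/* Illustrative helper (hypothetical name): pack a modifier string such as
   "compound=5 datesiso" into the length-prefixed sqlchar expected by
   pFileTypeMod. The sqlchar layout (a length field followed by the
   character data) is assumed from the DB2 headers. */
static struct sqlchar *make_filetype_mod(const char *mods)
{
    struct sqlchar *p =
        (struct sqlchar *)malloc(sizeof(struct sqlchar) + strlen(mods));
    if (p != NULL)
    {
        p->length = (short)strlen(mods);
        memcpy(p->data, mods, strlen(mods));
    }
    return p;   /* pass as pFileTypeMod to sqluimpr; free after the call */
}
```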
If the initial call or any subsequent call returns and requires the calling application to perform some action prior to completing the requested import operation, the caller action must be set to one of the following:
REXX API Syntax
```
IMPORT FROM datafile OF filetype
       [MODIFIED BY :filetmod]
       [METHOD {L|N|P} USING :dcoldata]
       [COMMITCOUNT :commitcnt] [RESTARTCOUNT :restartcnt]
       MESSAGES msgfile
       {INSERT|REPLACE|CREATE|INSERT_UPDATE|REPLACE_CREATE}
       INTO tname [(:columns)]
       [OUTPUT INTO :output]

CONTINUE IMPORT

STOP IMPORT
```
REXX API Parameters
Sample Programs
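A minimal sketch of a call to sqluimpr follows; it is not one of the shipped sample programs. It assumes an active database connection, and the structure field names (sizeOfStruct, rowsInserted) and size constants (SQLUIMPT_IN_SIZE, SQLUIMPT_OUT_SIZE) are assumed from typical sqlutil.h definitions; check them against the header shipped with your client.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sqlca.h>
#include <sqlutil.h>

/* Minimal sketch: import staff.del (delimited ASCII) into an existing
   STAFF table. A database connection must already exist; error handling
   is reduced to checking the SQLCODE. Field names and size constants are
   assumptions based on the sqluimpt_in/sqluimpt_out definitions in
   sqlutil.h and may vary between releases. */
int import_staff(void)
{
    struct sqlca sqlca;
    struct sqldcol dataDescriptor;
    struct sqluimpt_in importIn;
    struct sqluimpt_out importOut;
    struct sqlchar *pAction;
    char actionText[] = "INSERT INTO staff";

    /* METHOD D: use all columns of the input file in their natural order. */
    memset(&dataDescriptor, 0, sizeof dataDescriptor);
    dataDescriptor.dcolmeth = SQL_METH_D;

    /* The action string is passed as a length-prefixed sqlchar. */
    pAction = (struct sqlchar *)malloc(sizeof(struct sqlchar) + strlen(actionText));
    if (pAction == NULL)
        return -1;
    pAction->length = (short)strlen(actionText);
    memcpy(pAction->data, actionText, strlen(actionText));

    memset(&importIn, 0, sizeof importIn);
    importIn.sizeOfStruct = SQLUIMPT_IN_SIZE;      /* assumed field name */
    memset(&importOut, 0, sizeof importOut);
    importOut.sizeOfStruct = SQLUIMPT_OUT_SIZE;    /* assumed field name */

    sqluimpr("staff.del",        /* pDataFileName                    */
             NULL,               /* pLobPathList: no separate paths  */
             &dataDescriptor,    /* pDataDescriptor                  */
             pAction,            /* pActionString                    */
             SQL_DEL,            /* pFileType: delimited ASCII       */
             NULL,               /* pFileTypeMod: no modifiers       */
             "import.msg",       /* pMsgFileName                     */
             SQLU_INITIAL,       /* CallerAction                     */
             &importIn,          /* pImportInfoIn                    */
             &importOut,         /* pImportInfoOut                   */
             NULL,               /* pNullIndicators: ASC format only */
             NULL,               /* pReserved                        */
             &sqlca);

    free(pAction);

    if (sqlca.sqlcode < 0)
    {
        fprintf(stderr, "import failed, SQLCODE = %ld\n", (long)sqlca.sqlcode);
        return -1;
    }
    printf("rows inserted: %lu\n", (unsigned long)importOut.rowsInserted);
    return 0;
}
```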
Usage Notes
Be sure to complete all table operations and release all locks before starting an import operation. This can be done by issuing a COMMIT after closing all cursors opened WITH HOLD, or by issuing a ROLLBACK.
The import utility adds rows to the target table using the SQL INSERT statement. The utility issues one INSERT statement for each row of data in the input file. If an INSERT statement fails, one of two actions results:
The utility performs an automatic COMMIT after the old rows are deleted during a REPLACE or a REPLACE_CREATE operation. Therefore, if the system fails, or the application interrupts the database manager after the table object is truncated, all of the old data is lost. Ensure that the old data is no longer needed before using these options.
If the log becomes full during a CREATE, REPLACE, or REPLACE_CREATE operation, the utility performs an automatic COMMIT on inserted records. If the system fails, or the application interrupts the database manager after an automatic COMMIT, a table with partial data remains in the database. Use the REPLACE or the REPLACE_CREATE option to rerun the whole import operation, or use INSERT with the restartcnt parameter set to the number of rows successfully imported.
By default, automatic COMMITs are not performed for the INSERT or the INSERT_UPDATE option. They are, however, performed if the commitcnt parameter is not zero. A full log results in a ROLLBACK.
Whenever the import utility performs a COMMIT, two messages are written to the message file: one indicates the number of records to be committed, and the other is written after a successful COMMIT. When restarting the import operation after a failure, specify the number of records to skip, as determined from the last successful COMMIT.
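Continuing the sketch under Sample Programs, the restart values can be supplied through sqluimpt_in; the field names commitcnt and restartcnt are assumed from the structure definition in sqlutil.h, and the row counts shown are placeholders only.

```c
/* Hypothetical restart values for a failed INSERT import: commit
   periodically on the first attempt, and on the retry skip the rows
   reported as committed in the message file. Field names are assumed
   from sqluimpt_in; 1000 and 52000 are placeholder counts. */
memset(&importIn, 0, sizeof importIn);
importIn.sizeOfStruct = SQLUIMPT_IN_SIZE;
importIn.commitcnt  = 1000;    /* COMMIT after every 1000 inserted rows          */
importIn.restartcnt = 52000;   /* rows already committed by the interrupted run  */
```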
The import utility accepts input data with minor incompatibility problems (for example, character data can be imported using padding or truncation, and numeric data can be imported with a different numeric data type), but data with major incompatibility problems is not accepted.
One cannot REPLACE or REPLACE_CREATE an object table if it has any dependents other than itself, or an object view if its base table has any dependents (including itself). To replace such a table or a view, do the following:
If an error occurs while recreating the foreign keys, modify the data to maintain referential integrity.
Referential constraints and foreign key definitions are not preserved when creating tables from PC/IXF files. (Primary key definitions are preserved if the data was previously exported using SELECT *.)
Importing to a remote database requires enough disk space on the server for a copy of the input data file, the output message file, and potential growth in the size of the database.
If an import operation is run against a remote database, and the output message file is very long (more than 60KB), the message file returned to the user on the client may be missing messages from the middle of the import operation. The first 30KB of message information and the last 30KB of message information are always retained.
Importing PC/IXF files to a remote database is much faster if the PC/IXF file is on a hard drive rather than on diskettes. Specifying non-default values for pDataDescriptor, or an explicit list of table columns in pActionString, makes importing to a remote database slower.
The database table or hierarchy must exist before data in the ASC, DEL, or WSF file formats can be imported; however, if the table does not already exist, IMPORT CREATE or IMPORT REPLACE_CREATE creates the table when it imports data from a PC/IXF file. For typed tables, IMPORT CREATE can create the type hierarchy and the table hierarchy as well.
PC/IXF import should be used to move data (including hierarchical data) between databases. If character data containing row separators is exported to a delimited ASCII (DEL) file and processed by a text transfer program (moving, for example, between OS/2 and AIX systems), fields containing the row separators will shrink or expand. PC/IXF file format specifications permit migration of data between OS/2 (IBM Extended Services for OS/2, OS/2 Extended Edition, and DB2 for OS/2) databases and DB2 for AIX databases via export, binary copying of files between OS/2 and AIX, and import. The file copying step is not necessary if the source and the target databases are both accessible from the same client.
The data in ASC and DEL files is assumed to be in the code page of the client application performing the import. PC/IXF files, which allow for different code pages, are recommended when importing data in different code pages. If the PC/IXF file and the import utility are in the same code page, processing occurs as for a regular application. If the two differ, and the FORCEIN option is specified, the import utility assumes that data in the PC/IXF file has the same code page as the application performing the import. This occurs even if there is a conversion table for the two code pages. If the two differ, the FORCEIN option is not specified, and there is a conversion table, all data in the PC/IXF file will be converted from the file code page to the application code page. If the two differ, the FORCEIN option is not specified, and there is no conversion table, the import operation will fail. This applies only to PC/IXF files on DB2 for AIX clients.
For table objects on an 8KB page that are close to the limit of 1012 columns, import of PC/IXF data files may cause DB2 to return an error, because the maximum size of an SQL statement was exceeded. This situation can occur only if the columns are of type CHAR, VARCHAR, or CLOB. The restriction does not apply to import of DEL or ASC files.
DB2 Connect can be used to import data to DRDA servers such as DB2 for OS/390, DB2 for VM and VSE, and DB2 for OS/400. Only PC/IXF import (INSERT option) is supported. The restartcnt parameter, but not the commitcnt parameter, is also supported.
When using the CREATE option with typed tables, create every sub-table defined in the PC/IXF file; sub-table definitions cannot be altered. When using options other than CREATE with typed tables, the traversal order list enables one to specify the traverse order; therefore, the traversal order list must match the one used during the export operation. For the PC/IXF file format, one need only specify the target sub-table name, and use the traverse order stored in the file.
The import utility can be used to recover a table previously exported to a PC/IXF file. The table returns to the state it was in when exported.
Data cannot be imported to a system table or a summary table.
Views cannot be created through the import utility.
Importing a multiple-part PC/IXF file whose individual parts are copied from an OS/2 system to an AIX system is supported on DB2.
On the Windows NT operating system:
DB2 Data Links Manager Considerations
Before running the DB2 import utility, do the following:
When the import utility is executed on the target system, data related to DATALINK columns is loaded into the underlying DB2 tables using SQL INSERT (as is the case for other columns).
During the insert operation, DATALINK column processing links the files in the appropriate Data Links servers according to the column specifications at the target database.
Representation of DATALINK Information in an Input File
For a description of how DATALINK information is represented in an input file, see ***.
Table 8. Valid File Type Modifiers (Import)
| Modifier | Description |
|---|---|
| All File Formats | |
| compound=x | x is a number between 1 and 100 inclusive (7 on DOS/Windows). Uses nonatomic compound SQL to insert the data, and x statements will be attempted each time. |
| lobsinfile | lob-path specifies the path to the files containing LOB values. |
| no_type_id | Valid only when importing into a single sub-table. Typical usage is to export data from a regular table, and then to invoke an import operation (using this modifier) to convert the data into a single sub-table. |
| nodefaults | If a source column for a target table column is not explicitly specified, and the table column is not nullable, default values are not loaded. Without this option, if a source column for one of the target table columns is not explicitly specified, one of the following occurs: |
| usedefaults | If a source column for a target table column has been specified, but it contains no data for one or more row instances, default values are loaded. Examples of missing data are: |
| ASCII File Formats (ASC/DEL) | |
| implieddecimal | The location of an implied decimal point is determined by the column definition; it is no longer assumed to be at the end of the value. For example, the value 12345 is loaded into a DECIMAL(8,2) column as 123.45, not 12345.00. |
| noeofchar | The optional end-of-file character x'1A' is not recognized as the end of file. Processing continues as if it were a normal character. |
| ASC (Non-delimited ASCII) File Format | |
| nochecklengths | If nochecklengths is specified, an attempt is made to import each row, even if the source data has a column definition that exceeds the size of the target table column. Such rows can be successfully imported if code page conversion causes the source data to shrink; for example, 4-byte EUC data in the source could shrink to 2-byte DBCS data in the target, and require half the space. This option is particularly useful if it is known that the source data will fit in all cases despite mismatched column definitions. |
| nullindchar=x | x is a single character. Changes the character denoting a null value to x. The default value of x is Y. [b] This modifier is case sensitive for EBCDIC data files, except when the character is an English letter. For example, if the null indicator character is specified to be the letter N, then n is also recognized as a null indicator. |
| reclen=x | x is an integer with a maximum value of 32 767. x characters are read for each row, and a new-line character is not used to indicate the end of the row. |
| striptblanks | Truncates any trailing blank spaces when loading data into a variable-length field. If this option is not specified, blank spaces are kept. In the following example, striptblanks causes the import utility to truncate trailing blank spaces: db2 import from myfile.asc of asc modified by striptblanks method l (1 10, 12 15) messages msgs.txt insert into staff. This option cannot be specified together with striptnulls. These are mutually exclusive options. |
| striptnulls | Truncates any trailing NULLs (0x00 characters) when loading data into a variable-length field. If this option is not specified, NULLs are kept. This option cannot be specified together with striptblanks. These are mutually exclusive options. |
| DEL (Delimited ASCII) File Format | |
| chardelx | x is a single character string delimiter. The default value is a double quotation mark ("). The specified character is used in place of double quotation marks to enclose a character string. [a, b] The single quotation mark (') can also be specified as a character string delimiter. In the following example, chardel'' causes the import utility to interpret any single quotation mark (') it encounters as a character string delimiter: db2 "import from myfile.del of del modified by chardel'' method p (1, 4) insert into staff (id, years)" |
| coldelx | x is a single character column delimiter. The default value is a comma (,). The specified character is used in place of a comma to signal the end of a column. [a, b] In the following example, coldel; causes the import utility to interpret any semicolon (;) it encounters as a column delimiter: db2 import from myfile.del of del modified by coldel; messages msgs.txt insert into staff |
| datesiso | Date format. Causes all date data values to be imported in ISO format. |
| decplusblank | Plus sign character. Causes positive decimal values to be prefixed with a blank space instead of a plus sign (+). The default action is to prefix positive decimal values with a plus sign. |
| decptx | x is a single character substitute for the period as a decimal point character. The default value is a period (.). The specified character is used in place of a period as a decimal point character. [a, b] In the following example, decpt; causes the import utility to interpret any semicolon (;) it encounters as a decimal point: db2 "import from myfile.del of del modified by chardel' decpt; messages msgs.txt insert into staff" |
| delprioritychar | The current default priority for delimiters is: record delimiter, character delimiter, column delimiter. This modifier protects existing applications that depend on the older priority by reverting the delimiter priorities to: character delimiter, record delimiter, column delimiter. Syntax: db2 import ... modified by delprioritychar ... For example, given the following DEL data file: "Smith, Joshua",4000,34.98<row delimiter> "Vincent,<row delimiter>, is a manager", ... ... 4005,44.37<row delimiter> With the delprioritychar modifier specified, there will be only two rows in this data file. The second <row delimiter> will be interpreted as part of the first data column of the second row, while the first and the third <row delimiter> are interpreted as actual record delimiters. If this modifier is not specified, there will be three rows in this data file, each delimited by a <row delimiter>. |
| dldelx | x is a single character DATALINK delimiter. The default value is a semicolon (;). The specified character is used in place of a semicolon as the inter-field separator for a DATALINK value. It is needed because a DATALINK value may have more than one sub-value. [a, b] |
| nodoubledel | Suppresses recognition of double character delimiters. |
| IXF File Format | |
| forcein | Directs the utility to accept data despite code page mismatches, and to suppress translation between code pages. Fixed length target fields are checked to verify that they are large enough for the data. If nochecklengths is specified, no checking is done, and an attempt is made to import each row. |
| indexixf | Directs the utility to drop all indexes currently defined on the existing table, and to create new ones from the index definitions in the PC/IXF file. This option can only be used when the contents of a table are being replaced. It cannot be used with a view, or when an insert-column is specified. |
| indexschema=schema | Uses the specified schema for the index name during index creation. If schema is not specified (but the keyword indexschema is specified), uses the connection user ID. If the keyword is not specified, uses the schema in the IXF file. |
| nochecklengths | If nochecklengths is specified, an attempt is made to import each row, even if the source data has a column definition that exceeds the size of the target table column. Such rows can be successfully imported if code page conversion causes the source data to shrink; for example, 4-byte EUC data in the source could shrink to 2-byte DBCS data in the target, and require half the space. This option is particularly useful if it is known that the source data will fit in all cases despite mismatched column definitions. |
Notes:
See Also