IM Relationship Resolution Information Center, Version 4.2

Pipeline log files

Whenever you start a pipeline, the system automatically starts logging, based on the current pipeline logging configuration in the pipeline configuration file. Log files are created for each pipeline and are named by pipeline name, even if you started multiple pipelines from the same configuration file.

Types of pipeline log files

By default, all pipeline log files are written to the directory on the pipeline node where the pipeline was started. There are several different types of pipeline log files. Which message is logged to which file depends upon the mode in which the pipeline was started (debug -d mode or daemon/service -s mode), the type of message being logged, and the current logging configuration.
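
For example, to see which of the log files described in Table 1 a pipeline has produced so far, you can list and follow them by pipeline name from the directory where the pipeline was started. The following is a minimal shell sketch; the pipeline name my_pipeline and the startup directory are placeholders for your own values.

cd /opt/pipelines/my_pipeline   # placeholder for the directory where the pipeline was started
ls -l my_pipeline.*             # per-pipeline log files, for example .err, .SqlErr.log, .MQErr.log, .bad, .msg, .cnt
tail -f my_pipeline.err         # follow critical errors while the pipeline runs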

Table 1. Pipeline Logging Files by Type of Message, Log File Name, and Logging Modes

Type of message: Error messages
Log file name: pipeline_name.err
Action: Logs critical errors that occurred in the pipeline. After reviewing this log file,
Logging mode(s): Service, Debug

Type of message: SQL error messages
Log file name: pipeline_name.SqlErr.log
Action: Logs SQL errors that occurred in the pipeline. This file has a size limit of 1 megabyte. When the file reaches that size limit, the system automatically archives the current log file and creates a new one. After reviewing this log file,
Logging mode(s): Service, Debug

Type of message: Queue errors
Log file name: pipeline_name.MQErr.log
Action: Logs queue errors. After reviewing this log file,

Type of message: Errors and important messages
Log file name: Windows® Event Viewer (Microsoft® Windows platforms only)
Action: If the pipeline has services installed and was started using the service mode (-s pipeline option), the pipeline also sends errors and important messages to the Windows Event Viewer.
Logging mode(s): Service (Microsoft Windows platforms only)

Type of message: Bad or invalid UMF messages that could not be processed
Log file name: pipeline_name.bad
Action: Logs information about records in the incoming data source file that contain malformed or invalid UMF. The pipeline could not process the portion of the record containing the bad or invalid UMF, which sometimes means that the pipeline processes only partial records. After reviewing this log file, fix the records in the incoming data source file that contain bad or invalid UMF, and then send the corrected records back through a pipeline for processing.
Logging mode(s): Service, Debug

Type of message: UMF messages that generated exceptions
Log file name: pipeline_name.msg
Action: Logs information about records in the incoming data source file that generated exceptions during processing. The pipeline did process these records. This type of message can indicate a data quality problem with the data source file. After reviewing this log file, you might still need to fix the records in the incoming data source file that generated the UMF exceptions, and then send the corrected records back through a pipeline for processing. You can also review the Load Summary Report or the Data Source Summary Report for more information.
Logging mode(s): Service, Debug

Type of message: Debug tracing
Log file name: None
Action: Logs debug tracing information when a pipeline was started using the debug mode (-d pipeline option). There is no log file; the pipeline runs in the foreground, with output messages sent directly to the command shell. You can use redirection to capture the pipeline command output in a file:
pipeline -d -f my_umf.xml > my_log_file.log
Logging mode(s): Debug

Type of message: SQL statements and performance statistics
Log file name: pipeline_name.SqlDebug.log
Action: Logs SQL statements and performance statistics that can assist with troubleshooting problems and monitoring performance. This file has a size limit of 48 megabytes. When the file reaches that size limit, the system automatically archives the current log file and creates a new log file.
Logging mode(s): Debug

Type of message: Pipeline shuts down while processing a file
Log file name: pipeline_name.cnt
Action: As the pipeline processes incoming records, it logs the name of the data source file being processed, along with a record count that is updated for every 100 records in the file that are successfully processed. If a pipeline shuts down while processing an incoming data source file, this file can help you determine which records in the data source file need to be reloaded into the pipeline for processing. After reviewing this log file and fixing the problem that shut down the pipeline, reload the unprocessed records into the pipeline for processing (see the sketch following this table).
Logging mode(s): File
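
If a pipeline shuts down while processing a data source file, one way to determine which records still need to be reloaded is to combine the last count written to the pipeline_name.cnt file with the original data source file. The following is a minimal shell sketch, not a documented recovery procedure: it assumes that the last line of the .cnt file ends with the most recent record count and that each UMF record in the data source file occupies a single line (neither layout is specified here), and it uses the placeholder names my_pipeline and my_data_source.xml.

count=$(awk '{ n = $NF } END { print n }' my_pipeline.cnt)                     # last field of the last line, assumed to be the latest count
tail -n +"$((count + 1))" my_data_source.xml > my_data_source.remaining.xml    # keep only the records after the last logged count
pipeline -d -f my_data_source.remaining.xml > reload.log                       # send the remaining records back through a pipeline

Because the count is written only for every 100 records, some records after the last logged count may already have been processed; this sketch simply resubmits everything after that point.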




Last updated: 2009