Whenever you start a pipeline, the system automatically starts logging based on the current pipeline logging configuration in the pipeline configuration file. Log files are created for each pipeline, named by pipeline name, even if you started multiple pipelines from the same configuration file.
Type of message | Log file name | Action | Logging mode(s) |
---|---|---|---|
Error messages | pipeline_name.err Logs critical errors that occurred in the pipeline. | After reviewing this log file, fix the errors or issues indicated in the pipeline. | Service, Debug |
SQL error messages | pipeline_name.SqlErr.log Logs SQL errors that occurred in the pipeline. This file has a size limit of 1 megabyte. When the file reaches that size limit, the system automatically archives the current log file and creates a new one. | After reviewing this log file, fix the SQL errors or issues indicated. | Service, Debug |
Queue errors | pipeline_name.MQErr.log Logs queue errors. | After reviewing this log file, fix the MQ errors or issues indicated. | |
Windows® Event Viewer (Microsoft® Windows platforms only) | If the pipeline has services installed and was started using service mode (the -s pipeline option), the pipeline also sends errors and important messages to the Windows Event Viewer. | Monitor the messages in the Windows Event Viewer and fix any errors or issues indicated. | Service (Microsoft Windows platforms only) |
Bad or invalid UMF messages that could not be processed | pipeline_name.bad Logs information about records in the incoming data source file that contain malformed or invalid UMF. The pipeline could not process the portion of the record containing the bad or invalid UMF, which sometimes means that the pipeline processes partial records. | After reviewing this log file, fix the records in the incoming data source file that contain bad or invalid UMF. Then send the corrected records back through a pipeline for processing. | Service, Debug |
UMF messages that generated exceptions | pipeline_name.msg Logs information about records in the incoming data source file that generated exceptions during processing. The pipeline did process the record. This type of message can indicate a problem with data quality for the data source file. | After reviewing this log file, you might still need to fix the records in the incoming data source file that generated the UMF exceptions. Then send the corrected records back through a pipeline for processing. You can also review the Load Summary Report or the Data Source Summary Report for more information. | Service, Debug |
Debug tracing | There is no log file. Debug tracing information is logged when a pipeline was started using debug mode (the -d pipeline option). The pipeline runs in the foreground with output messages sent directly to the command shell. You can use redirection to create a file from the pipeline command output: pipeline -d -f my_umf.xml > my_log_file.log | | Debug |
SQL statements and performance statistics | pipeline_name.SqlDebug.log Logs SQL statements and performance statistics that can assist with troubleshooting problems and monitoring performance. This file has a size limit of 48 megabytes. When a file reaches the size limit, the system automatically archives the current log file and creates a new one. | | Debug |
Pipeline shuts down while processing a file | pipeline_name.cnt As the pipeline processes incoming records, it logs the name of the data source file being processed, as well as a record count for every 100 records in the file successfully processed. If a pipeline shuts down while processing an incoming data source file, this file can help you determine which records in the data source file need to be reloaded into the pipeline for processing. | After reviewing this log file and fixing the problem that shut down the pipeline, reload the unprocessed records into the pipeline for processing. | File |
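To illustrate how the record counts in the .cnt file can help after an unexpected shutdown, the following shell sketch extracts the records that still need reloading. The file names, the sample .cnt layout, and the record format are assumptions for this example only; check the actual .cnt file that your pipeline produces before relying on its layout.

```shell
#!/bin/sh
# Hypothetical example: the real .cnt layout may differ from this sketch.
# Assume my_pipeline.cnt holds the data source file name and, on its last
# line, the last multiple-of-100 record count processed successfully.

# Create sample inputs so the sketch is self-contained.
seq 1 250 | sed 's/^/record-/' > my_source.umf     # 250 sample records
printf 'my_source.umf\n200\n' > my_pipeline.cnt    # pipeline stopped after 200

# Read the last checkpointed record count from the .cnt file.
last_count=$(tail -n 1 my_pipeline.cnt)

# Records past the last checkpoint (201 onward here) may not have been
# processed; extract them so they can be sent back through a pipeline.
tail -n +"$((last_count + 1))" my_source.umf > reload_these.umf

wc -l < reload_these.umf
```

Because the pipeline only checkpoints every 100 records, some records before the last count may also have completed; this approach errs on the side of reloading everything after the last checkpoint, so be prepared to handle duplicates downstream.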