Bucket project objects

To monitor business activity in which a particular value falls within various ranges, you can define "buckets" that specify the desired ranges of values. The pre-configured objects in the Buckets project count the number of workflows and work items in each bucket based on their processing time.

NOTE To examine this project, import the Buckets project into FileNet Application Workbench. See BAM example - Work items by Amount categories (buckets) for an end-to-end demonstration of setting up bucket objects.

Dashboard objects

The following table lists the Buckets dashboard objects for work items and workflows.

                      Objects related to work items                   Objects related to workflows
  Dashboards:         Work Items                                      Workflows
  Dashboard objects:  Buckets of active work items (processing time)  Buckets of active workflows (processing time)

Workbench objects

In FileNet Business Activity Monitor, the following FileNet Application Workbench objects gather and assemble the business data that the BAM dashboard objects display. The table lists the Buckets workbench objects for work items and workflows.

                             Objects related to work items          Objects related to workflows
  Cubes, views, and events:  Number of Work Items Per Bucket Cube   Number of Workflows in Buckets Cube
                             Number of Work Items Per Bucket View   Number of Workflows Per Bucket View
                             Number of Work Items Per Bucket Event  Number of Workflows Per Bucket Event
  Contexts and dimensions:   Queue_Context/Queue_Dimension
  (see Contexts and          UserName_Context/UserName_Dimension
  dimensions for more        Workflow_Context/Workflow_Dimension
  information)               WorkflowDefinition_Context/WorkflowDefinition_Dimension

Number of Work Items Per Bucket Event

The Number of Work Items Per Bucket Event calculates the number of active work items whose processing time falls into one of two groups, or buckets.

This event uses the following query to retrieve information from the Process Analyzer F_DMWIP table.

select SUM(ProcTimeBucket1Count) as ProcTimeBucket1Count,
  SUM(ProcTimeBucket2Count) as ProcTimeBucket2Count,
  DMUser_key, DMOperation_key, DMStep_key
from
(
  select count(*) as ProcTimeBucket1Count,
    0 as ProcTimeBucket2Count,
    DMUser_key,
    DMOperation_key,
    DMStep_key
  from F_DMWIP
  where (ProcCurrentMinutes + MinutesSinceLastEvent * IsInProcStatus) < 60
  group by DMUser_key, DMOperation_key, DMStep_key
union
  select 0 as ProcTimeBucket1Count,
    count(*) as ProcTimeBucket2Count,
    DMUser_key,
    DMOperation_key,
    DMStep_key
  from F_DMWIP
  where (ProcCurrentMinutes + MinutesSinceLastEvent * IsInProcStatus) >= 60
  group by DMUser_key, DMOperation_key, DMStep_key
)
as UnionResults
group by DMUser_key, DMOperation_key, DMStep_key

Because this is an aggregate query, any field that you add to the select list must either use an aggregate function or appear in the group by clause. Because the query performs a union, any added field must also appear in both union sub-queries as well as in the outer query.
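As a quick sanity check, the bucket query can be exercised against a small in-memory stand-in for the F_DMWIP table. SQLite is used here purely for illustration; it is an assumption, not the actual Process Analyzer database engine, and only the columns the query touches are modeled.

```python
import sqlite3

# Illustrative stand-in for the Process Analyzer F_DMWIP table,
# reduced to the columns the bucket query uses.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE F_DMWIP (
        DMUser_key INTEGER, DMOperation_key INTEGER, DMStep_key INTEGER,
        ProcCurrentMinutes INTEGER, MinutesSinceLastEvent INTEGER,
        IsInProcStatus INTEGER)
""")
rows = [
    (1, 1, 1, 10, 5, 1),   # effective time 10 + 5*1 = 15 -> bucket 1 (< 60)
    (1, 1, 1, 50, 20, 1),  # effective time 50 + 20*1 = 70 -> bucket 2 (>= 60)
    (1, 1, 1, 59, 30, 0),  # not in process: 59 + 30*0 = 59 -> bucket 1
]
conn.executemany("INSERT INTO F_DMWIP VALUES (?, ?, ?, ?, ?, ?)", rows)

query = """
select SUM(ProcTimeBucket1Count) as ProcTimeBucket1Count,
       SUM(ProcTimeBucket2Count) as ProcTimeBucket2Count,
       DMUser_key, DMOperation_key, DMStep_key
from (
    select count(*) as ProcTimeBucket1Count, 0 as ProcTimeBucket2Count,
           DMUser_key, DMOperation_key, DMStep_key
    from F_DMWIP
    where (ProcCurrentMinutes + MinutesSinceLastEvent * IsInProcStatus) < 60
    group by DMUser_key, DMOperation_key, DMStep_key
  union
    select 0 as ProcTimeBucket1Count, count(*) as ProcTimeBucket2Count,
           DMUser_key, DMOperation_key, DMStep_key
    from F_DMWIP
    where (ProcCurrentMinutes + MinutesSinceLastEvent * IsInProcStatus) >= 60
    group by DMUser_key, DMOperation_key, DMStep_key
) as UnionResults
group by DMUser_key, DMOperation_key, DMStep_key
"""
# One row per user/operation/step: two work items in bucket 1, one in bucket 2.
print(conn.execute(query).fetchall())  # -> [(2, 1, 1, 1, 1)]
```

The sample data shows why the IsInProcStatus multiplier matters: a work item that is not currently in process contributes only its accumulated ProcCurrentMinutes to the bucket boundary test.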

Number of Workflows Per Bucket Event

The Number of Workflows Per Bucket Event calculates the number of active workflows whose processing time falls into one of two groups, or buckets.

This event uses the following query to retrieve information from the Process Analyzer F_DMWorkflowWIP table, with additional reference information retrieved from the Process Analyzer D_DMWorkflow table.

select SUM(ProcTimeBucket1Count) as ProcTimeBucket1Count,
  SUM(ProcTimeBucket2Count) as ProcTimeBucket2Count,
  DMWorkClass_key
from
(
  select count(*) as ProcTimeBucket1Count,
    0 as ProcTimeBucket2Count,
    DMWorkClass_key
  from F_DMWorkflowWIP f, D_DMWorkflow d
  where f.Workflow_key = d.Workflow_key and MinutesSinceCreation < 4320
  group by DMWorkClass_key
union
  select 0 as ProcTimeBucket1Count,
    count(*) as ProcTimeBucket2Count,
    DMWorkClass_key
  from F_DMWorkflowWIP f, D_DMWorkflow d
  where f.Workflow_key = d.Workflow_key and MinutesSinceCreation >= 4320
  group by DMWorkClass_key
)
as UnionResults
group by DMWorkClass_key

Because this is an aggregate query, any field that you add to the select list must either use an aggregate function or appear in the group by clause. Because the query performs a union, any added field must also appear in both union sub-queries as well as in the outer query.
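The workflow variant can be exercised the same way, this time including the join between the fact and dimension tables and the 4320-minute (3-day) bucket boundary. Again, the in-memory SQLite tables are an illustrative assumption, reduced to only the columns the query uses.

```python
import sqlite3

# Illustrative stand-ins for the Process Analyzer F_DMWorkflowWIP fact
# table and the D_DMWorkflow dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE F_DMWorkflowWIP (Workflow_key INTEGER,
                                  MinutesSinceCreation INTEGER);
    CREATE TABLE D_DMWorkflow (Workflow_key INTEGER,
                               DMWorkClass_key INTEGER);
""")
# 4320 minutes = 3 days is the boundary between the two buckets.
conn.executemany("INSERT INTO F_DMWorkflowWIP VALUES (?, ?)",
                 [(1, 100), (2, 5000), (3, 4320)])  # 4320 lands in bucket 2
conn.executemany("INSERT INTO D_DMWorkflow VALUES (?, ?)",
                 [(1, 7), (2, 7), (3, 7)])  # all three workflows share a class

query = """
select SUM(ProcTimeBucket1Count) as ProcTimeBucket1Count,
       SUM(ProcTimeBucket2Count) as ProcTimeBucket2Count,
       DMWorkClass_key
from (
    select count(*) as ProcTimeBucket1Count, 0 as ProcTimeBucket2Count,
           DMWorkClass_key
    from F_DMWorkflowWIP f, D_DMWorkflow d
    where f.Workflow_key = d.Workflow_key and MinutesSinceCreation < 4320
    group by DMWorkClass_key
  union
    select 0 as ProcTimeBucket1Count, count(*) as ProcTimeBucket2Count,
           DMWorkClass_key
    from F_DMWorkflowWIP f, D_DMWorkflow d
    where f.Workflow_key = d.Workflow_key and MinutesSinceCreation >= 4320
    group by DMWorkClass_key
) as UnionResults
group by DMWorkClass_key
"""
# One row per work class: one workflow under 3 days, two at or over.
print(conn.execute(query).fetchall())  # -> [(1, 2, 7)]
```

Because the grain of the outer query is DMWorkClass_key alone, any extra field you experiment with here must be added to both union branches and to every group by clause, exactly as described above.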