IBM FileNet P8, Version 5.2.1

Setting the indexing work capacity of index servers

You can set the number of index batches that an IBM® Content Search Services index server can work on concurrently. By doing so, you control the indexing work capacity of the server.

About this task

You set this capacity with the Maximum concurrent index batches property, which is part of the configuration for an index server. This capacity indirectly controls the number of concurrent batches that a Content Platform Engine instance can send to IBM Content Search Services. Content Platform Engine performs the following calculation:

    concurrent batches per CPE instance = (property value / CPE instances with indexing enabled)

In this calculation, CPE instances with indexing enabled is the count of Content Platform Engine instances that are configured to perform indexing. For example, if the property value is 12 and two instances have indexing enabled, then 12 / 2 = 6 maximum concurrent batches per Content Platform Engine instance.

Important: IBM Content Search Services can operate at less than the configured capacity if a Content Platform Engine instance stops running for any reason. In the previous example, if you shut down a Content Platform Engine instance, the remaining instance still sends a maximum of six concurrent batches. The result is that IBM Content Search Services operates at 50% capacity. To keep IBM Content Search Services running at full capacity, disable indexing for any Content Platform Engine instance that might not resume running soon.
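The calculation and the reduced-capacity scenario above can be sketched as follows. This is an illustrative model only, not product code; the function name is invented here, and integer division is an assumption about how Content Platform Engine rounds the result.

```python
def batches_per_instance(max_concurrent_batches: int, indexing_instances: int) -> int:
    """Concurrent index batches that each CPE instance may send.

    Illustrative only: assumes integer division, which matches the
    worked example in the documentation (12 / 2 = 6).
    """
    return max_concurrent_batches // indexing_instances

# Example from the text: property value 12, two instances with indexing enabled.
per_instance = batches_per_instance(12, 2)   # 6 batches per instance

# If one of the two instances stops, the remaining instance still sends
# only its per-instance maximum, so the index server runs below capacity.
running_instances = 1
effective_batches = per_instance * running_instances
utilization = effective_batches / 12          # 0.5, that is, 50% capacity
```

This model makes the warning concrete: the per-instance limit is fixed when the property is read, so surviving instances do not absorb the share of a stopped instance.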

Procedure

To set the work capacity of an index server:

  1. Open the properties of the index server:
    1. Complete the steps in Listing the registered servers.
    2. On the Text Search Servers tab, click the name of the server. A tab for the server opens.
  2. On the server tab, click the General subtab.
  3. Enter a value for the Maximum concurrent index batches property.
  4. Save your changes.


Last updated: October 2015

© Copyright IBM Corporation 2015.