IBM WebSphere Premises Server, Version 6.1.x

Installing a high availability system

High availability provides several benefits, including load balancing and failover. High availability with WebSphere® Premises Server consists of setting up a server cluster and then configuring those servers for load balancing.

About this task

The installer creates the cluster topology and configures load balancing across the node servers.

Procedure
  1. Make sure that you have completed all the prerequisite steps necessary for high availability.
  2. Launch the high availability post-installation script located at the root of the High Availability for IBM® WebSphere Premises Server Central Site Server disk.
    • Windows setupwin32.exe
    • Linux setuplinux.bin
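
    For example, on a Linux system you might run the script from the disk mount point; the mount point shown here is an example only, so use the actual location of the disk in your environment:

      cd /media/cdrom
      ./setuplinux.bin
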
    The Welcome panel displays.
  3. Click Next.
  4. Review the installation directory for the WebSphere Premises Server high availability system that is shown on this panel.

    The directory is:

    • Windows IBM_RFID_HOME\HA
    • Linux IBM_RFID_HOME/HA

    Click Next to continue.

  5. Enter the host name and port for WebSphere Application Server Network Deployment, and click Next.
    Tip: Make sure that WebSphere Application Server Network Deployment is running. Before continuing, the installer verifies that it can connect to WebSphere Application Server Network Deployment using the host name and port that you provided. If it cannot connect, you are prompted to click Back and edit the values on the previous panel, or you can click Cancel to exit the installer.
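    For example, you might enter a host name such as dmgr.example.com and a port such as 8879, a common default for the deployment manager SOAP connector. Both values are examples only; confirm the actual host name and port of your own deployment manager before continuing.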
  6. Create the cluster members. You must create at least one member on this panel to proceed with the installation.

    Use the Add Member button to add cluster members. The created member's name, node, and weight appear in the box at the bottom of the installer panel.
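
    For example, after you add two members, the list might show PremisesCluster_member1 on node Node01 with weight 2 and PremisesCluster_member2 on node Node02 with weight 2. The member names, node names, and weights shown here are examples only; use values that match your own topology.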

    To delete a cluster member, select the member name from the list of created members and click Delete Member.

    For more information on creating cluster members, see Adding members to a cluster.

  7. Click Next.
  8. A summary panel displays your installation selections. Click Install to continue the installation process.

    When the installation is complete, another summary panel displays the installation status and prompts you to check the log file for any errors.

    • Windows IBM_RFID_HOME\HA\logs\install.log
    • Linux IBM_RFID_HOME/HA/logs/install.log

    If you do see errors or exceptions in the installation log file, try uninstalling and reinstalling the high availability topology. Also check the Troubleshooting tips documentation for possible resolutions to the problem. If you are unable to resolve the errors, contact IBM Support.
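
    For example, on a Linux system you could scan the log for problems with a command such as the following, where IBM_RFID_HOME is your installation directory:

      grep -i -E "error|exception" IBM_RFID_HOME/HA/logs/install.log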

  9. If you see exceptions in the WebSphere Application Server SystemOut.log file on the central and node servers, follow the procedure in this technote.
  10. Restart the central server and the cluster.
  11. If you are using WebSphere Application Server security, enable it, and then restart the deployment manager, all node agents, and all servers.
  12. Enable dynamic cache replication for all servers in the cluster.
    1. In the WebSphere Application Server administrative console, go to Servers > Application servers > server name > Container Services > Dynamic cache service and check Enable service at server startup for each server in the cluster.
    2. Define a new replication domain by going to Environment > Replication domains > New. Choose Entire domain when creating the new replication domain.
    3. Navigate to Resources > Cache instances > Object cache instances and add the new replication domain to each object cache instance.
      1. Check Enable cache replication.
      2. Choose your cluster name for Full group replication domain.
      3. Choose Push only for Replication type.
      4. Set Push frequency to 1 second.
  13. Configure the Data Capture and Delivery controllers for high availability.
    1. Make sure you are using Java™ 1.4.2 on your Data Capture and Delivery controllers.
    2. Set the appropriate MQ user name for your operating system in the controller's Equinox script.
      • Windows -Duser.name=MUSR_MQADMIN
      • Linux -Duser.name=mqm

      If you used the sample files provided with the IBM Data Capture and Delivery Toolkit for WebSphere Premises Server to set up your remote Data Capture and Delivery controllers, modify the remoteDC script with the MQ user name.

      For example:

      Windows
      %JAVA_HOME%\bin\java" -Duser.name=MUSR_MQADMIN -Xmx256M -Xms256M
      Linux
      "$JAVA_HOME/bin/java" -Duser.name=mqm -Xmx256M -Xms256M
    3. Edit the config.ini file in the controller's Equinox configuration directory to make sure that the bundle list points to the dc_core4dts.txt file and that the edge controller is set to E4.
      com.ibm.rfid.bundle.list.url=http://IP_address:port_number/bundleadmin/GetBundle?name=http://IBM_HTTP_Server_IP_address/bundles/bundlelists/dc_core4dts.txt
      
      com.ibm.rfid.edge.config.url=http://IP_address:port_number/ibmrfidadmin/premises.sl?action=getconfig&edge=E4&version=6.1

      The values for IP_address and IBM_HTTP_Server_IP_address are the host name of the server that hosts the Bundle Repository Server.

      The second line of code points to the E4 controller, which is installed with WebSphere Premises Server specifically for high availability.
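
      For example, if the server that hosts the Bundle Repository Server is 192.0.2.10 and the application server listens on port 9080, the entries might look like the following. The address and port are examples only; substitute the values from your own environment:

      com.ibm.rfid.bundle.list.url=http://192.0.2.10:9080/bundleadmin/GetBundle?name=http://192.0.2.10/bundles/bundlelists/dc_core4dts.txt
      
      com.ibm.rfid.edge.config.url=http://192.0.2.10:9080/ibmrfidadmin/premises.sl?action=getconfig&edge=E4&version=6.1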

  14. Check to see if Data Transformation is running (started as a service) on your central server, and if so, stop it.
  15. Start the Equinox runtime on the Data Capture and Delivery controllers.
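
    For example, if you set up your remote controllers from the sample files and the remoteDC script mentioned earlier, you might start the Equinox runtime by running that script from the controller's installation directory. The exact script name and location depend on how you set up the controller:

      ./remoteDC.sh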
  16. Start the bundle loader on the Data Capture and Delivery controllers.
    1. Find the ID of the bundle loader bundle by running the OSGi ss command.
    2. Start the bundle loader bundle by entering start bundle_ID at the OSGi prompt.
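
    For example, an OSGi console session might look like the following. The bundle ID of 42 and the bundle name shown are placeholders; use the ID that the ss command reports for the bundle loader on your controller:

      osgi> ss
      ...
      42    RESOLVED    bundle_loader_bundle_name
      osgi> start 42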
  17. Test the clustered configuration using the Simulated Reader in the WebSphere Premises Server Administrative Console. Choose R4 as your simulated test reader.

    Optionally, you can test with a real reader.

  18. Create a new remote Data Capture and Delivery controller based on the E4 sample to use with your real reader.

What to do next
If you need to create additional cluster members, follow the steps in Installing additional cluster members.

(c) Copyright IBM Corporation 2004, 2008. All rights reserved.
U.S. Government Users Restricted Rights - Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp.