If you want to increase the availability of your vRealize Orchestrator 7 deployment, you can deploy it in cluster mode. The cluster consists of at least two vRealize Orchestrator 7 server instances that share one database. vRealize Orchestrator 7 in cluster mode supports Oracle, Microsoft SQL Server, and PostgreSQL databases; refer to the VMware Product Interoperability Matrix (Solution/Database Interoperability) for the full list of supported databases. Unlike previous versions of Orchestrator, where standalone mode and cluster mode were separate options, vRealize Orchestrator 7 no longer distinguishes between the two: Orchestrator works as a single instance until it is configured to work as part of a cluster.

Note: If you plan to use vRealize Automation with vRealize Orchestrator, VMware recommends having vRealize Automation deployed and configured before deploying the vRealize Orchestrator cluster.

Prerequisites

Before you begin configuring vRealize Orchestrator 7 in cluster mode, there are some prerequisites that need to be completed first.

External Database – a supported external database is required for the vRealize Orchestrator 7 cluster. Check the vRealize Orchestrator 7 documentation on how to set up the database.

Certificates – if you plan to use a load balancer for both vRealize Orchestrator cluster nodes, you must create signed or self-signed certificates that contain the vRealize Orchestrator Virtual IP address and the vRealize Orchestrator node hostnames (out of scope for this blog post). This is mandatory if you want to use vRealize Orchestrator 7 in cluster mode with vRealize Automation.

Time – make sure the clocks on the Orchestrator machines are synchronized with your NTP server(s).
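To sanity-check the time prerequisite, the following is a minimal sketch that sends a raw SNTP request to an NTP server and reports the local clock offset. The server name in the example is a placeholder; in practice you would point this (or simply `ntpdate -q`) at the same NTP source your appliances use.

```python
import socket
import struct
import time

# Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01).
NTP_EPOCH_OFFSET = 2208988800

def ntp_to_unix(ntp_seconds):
    """Convert an NTP timestamp (seconds since 1900) to Unix time."""
    return ntp_seconds - NTP_EPOCH_OFFSET

def query_ntp_offset(server, timeout=2.0):
    """Send a minimal SNTP client request and return server_time - local_time."""
    packet = b"\x1b" + 47 * b"\x00"  # LI=0, VN=3, Mode=3 (client)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(packet, (server, 123))
        data, _ = sock.recvfrom(48)
    local_now = time.time()
    # Integer part of the server's transmit timestamp (bytes 40-43).
    transmit = struct.unpack("!I", data[40:44])[0]
    return ntp_to_unix(transmit) - local_now

if __name__ == "__main__":
    # "pool.ntp.org" is a placeholder; use your own NTP server.
    offset = query_ntp_offset("pool.ntp.org")
    print(f"clock offset: {offset:+.2f}s")
```

If the reported offset is more than a second or two, fix time synchronization on the appliances before continuing.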

Deploy vRealize Orchestrator 7 in Cluster Mode

In this case, I deployed two full installations of the vRealize Orchestrator 7 appliance. You can, of course, add more nodes using the same procedure.

Deploy the downloaded vRealize Orchestrator Appliance as you normally would, using the “Deploy OVF Template” wizard (see one of my previous blogs for an example deployment). Then configure the first node:

1. Open the Orchestrator Control Center (https://<IP_or_FQDN>:8283/vco-controlcenter/) on the first node and log in as root.
2. Configure an external database connection by navigating to Configure Database under the Database menu. Select the database type (I used SQL Server 2012 SP2) and enter the server address and port, database name, username and password, server instance name (if applicable), and domain name, and select Use Windows authentication (NTLMv2). Click Save Changes.
3. Next, upgrade the database and follow the steps described in the message.
4. Force the plug-ins reinstallation. You will have to stop the vRO server first from the Startup Options under the Manage menu. Then navigate to Monitor and Control -> Troubleshooting and click Force Plug-ins Reinstall.
5. Create a new Package Signing Certificate from the Manage -> Certificates -> Package Signing Certificate menu by entering the required certificate information and clicking the Generate button.
6. At this point, you can also assign a valid license from Manage -> Licensing by selecting the appropriate license provider (manual or vSphere license).
7. Next, click Orchestrator Node Settings under the Manage menu.

Change the Number of active nodes setting from 1 to 2 and click Save. Start the vRO server and wait until the status displays that the vRO server is running.
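Rather than refreshing the Control Center by hand, you can poll the node until it reports a running state. This is only a sketch: the /vco/api/healthstatus path on port 8281 and the "RUNNING" state value are assumptions about the vRO 7 REST API (check the documentation for your build), and the host name is a placeholder.

```python
import json
import ssl
import time
import urllib.request

# Assumption: vRO 7 reports its state at /vco/api/healthstatus on port 8281.
HEALTH_PATH = "/vco/api/healthstatus"

def backoff_schedule(attempts, base=5, cap=60):
    """Delays (seconds) between polls: 5, 10, 20, ... capped at `cap`."""
    return [min(base * 2 ** i, cap) for i in range(attempts)]

def wait_until_running(host, attempts=8):
    """Poll the node's health endpoint until it reports a RUNNING state."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False           # appliance certs are often self-signed
    ctx.verify_mode = ssl.CERT_NONE
    for delay in backoff_schedule(attempts):
        try:
            url = f"https://{host}:8281{HEALTH_PATH}"
            with urllib.request.urlopen(url, context=ctx, timeout=5) as resp:
                if json.load(resp).get("state") == "RUNNING":
                    return True
        except OSError:
            pass                         # server still starting up
        time.sleep(delay)
    return False

if __name__ == "__main__":
    # vro01.example.com is a placeholder for the first node.
    print("running" if wait_until_running("vro01.example.com") else "timed out")
```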

Add second node to the cluster

Open the Orchestrator Control Center on the second node and log in as root. Click Join Node to Cluster under the Manage menu. Enter the FQDN or IP address, username, and password of the first node in the cluster and click Join. If the cluster join operation completes successfully, the vRO server needs to be restarted. After the restart, go back to the first node in the cluster and verify that both nodes are now displayed in the Orchestrator Node Settings and that the state of the server is running.
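Once both nodes are joined, you can check them in one go from any machine that can reach them. This is a sketch that assumes each node exposes a health document at /vco/api/healthstatus on port 8281 (verify the path against your vRO 7 REST API documentation); the node names are placeholders.

```python
import json
import ssl
import urllib.request

def health_url(host, port=8281):
    """Build the assumed health-check URL for one cluster node."""
    return f"https://{host}:{port}/vco/api/healthstatus"

def check_node(host, verify_tls=True):
    """Fetch and parse the health document for one node."""
    ctx = ssl.create_default_context()
    if not verify_tls:                   # e.g. self-signed appliance certificates
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
    with urllib.request.urlopen(health_url(host), context=ctx, timeout=5) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # vro01/vro02.example.com are placeholder node names.
    for node in ("vro01.example.com", "vro02.example.com"):
        print(node, check_node(node, verify_tls=False).get("state"))
```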

If you want to add additional nodes to the cluster, simply follow the steps above for each new node.

That’s it! So, what’s next? Consider deploying a load balancer such as F5 or NSX to distribute the workload among the vRealize Orchestrator servers.
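Before pointing clients at the load balancer, a quick reachability check of the Virtual IP and both backend nodes can save some head-scratching. A minimal sketch; all host names are placeholders, and 8281 is the vRO HTTPS API port.

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Placeholder names: the Virtual IP plus both cluster nodes.
    targets = ["vro-vip.example.com", "vro01.example.com", "vro02.example.com"]
    for host in targets:
        print(host, "reachable" if port_open(host, 8281) else "unreachable")
```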

Cheers!

– Marek.Z