This blog will show you how to deploy the Pivotal CF Hadoop service using Operations Manager. Once the service is deployed, you can use the Developer Console to launch on-demand Hadoop clusters through the Pivotal CF framework.
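The Developer Console is the point-and-click route, but the same provisioning can also be scripted against the Cloud Foundry CLI. Below is a minimal Python sketch that shells out to `cf`; the service label `p-hd`, plan `cluster`, and instance name `my-hadoop` are assumptions for illustration only, since the real names come from `cf marketplace` once the tile is installed.

```python
import subprocess

# Hypothetical names for illustration only: the real service label and plan
# appear in `cf marketplace` after the Pivotal HD tile has been installed.
SERVICE = "p-hd"        # assumed marketplace label for the Pivotal HD service
PLAN = "cluster"        # assumed service plan name (defined by the tile's service plan)
INSTANCE = "my-hadoop"  # name we choose for our on-demand cluster instance

def cf(*args):
    """Run a cf CLI command, raising if it exits non-zero."""
    subprocess.run(["cf", *args], check=True)

cf("marketplace")                              # list available services and plans
cf("create-service", SERVICE, PLAN, INSTANCE)  # provision an on-demand Hadoop cluster
cf("service", INSTANCE)                        # show the status of the new instance
```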
Log in to the Operations Manager console using a web browser.
On the left side, under Available Products, click Import a Product.
Browse to the directory where you downloaded the PHD CF install file.
The file will be uploaded to Operations Manager. It is a large file and will take a while to import.
After the PHD product has been successfully imported, you can add it to your installed products and configure it for use.
On the left, click the Add button on the Pivotal HD service.
The service tile is added to your installed products. Notice that it is orange, meaning it needs to be configured (green = configured). See these blogs for details on setting up Operations Manager for VMware and the Elastic Runtime. Click on the PHD service tile.
The only thing that has to be configured is the Network. The service plan and resource sizing have defaults set, but you can edit these to your desired settings. Click Network settings.
Enter the information for your network environment. Note that the excluded range determines which IPs this service will use (everything outside the excluded range), and those IPs must not overlap the IPs used by Operations Manager. Example:
OpsManager excluded range x.x.x.1-x.x.x.200 = IPs 201-254 are used by OpsManager
PHD excluded range x.x.x.1-x.x.x.149, x.x.x.160-x.x.x.254 = IPs 150-159 are used by PHD.
Notice that the two ranges do not overlap.
Above we see the warning for CF about IP overlap.
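To make the range arithmetic concrete, here is a minimal Python sketch that checks whether the two excluded ranges leave disjoint deployment pools. The 192.168.1.0/24 subnet is a hypothetical stand-in for the x.x.x.0/24 network used above; substitute your own subnet and ranges.

```python
import ipaddress

SUBNET = ipaddress.ip_network("192.168.1.0/24")  # hypothetical stand-in for x.x.x.0/24

def usable_ips(excluded_ranges):
    """Return the host IPs in SUBNET that fall outside all excluded ranges."""
    excluded = set()
    for start, end in excluded_ranges:
        ip, last = ipaddress.ip_address(start), ipaddress.ip_address(end)
        while ip <= last:
            excluded.add(ip)
            ip += 1
    return {ip for ip in SUBNET.hosts() if ip not in excluded}

# Ops Manager excludes .1-.200, so it deploys onto .201-.254
ops_pool = usable_ips([("192.168.1.1", "192.168.1.200")])

# PHD excludes .1-.149 and .160-.254, so it deploys onto .150-.159
phd_pool = usable_ips([("192.168.1.1", "192.168.1.149"),
                       ("192.168.1.160", "192.168.1.254")])

overlap = ops_pool & phd_pool
print("Ops Manager pool:", min(ops_pool), "-", max(ops_pool))
print("PHD pool:        ", min(phd_pool), "-", max(phd_pool))
print("Overlap:", sorted(overlap) if overlap else "none")  # expect 'none'
```

If the pools do overlap, you can expect the kind of IP conflict warning shown above.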
After creating our network, we click the Apply button in the upper right.
The service is installed and configured.
Return to the dashboard, where we see the configured services.