We recently implemented NSX in an existing vCenter/vSphere environment and ran into an interesting issue while preparing the hosts. SQL Server Reporting Services had been co-installed with vCenter, so to prevent a conflict on TCP/80 the customer specified a custom HTTP port during the vCenter installation. At that point we should have been conflict-free on TCP/80, but as we soon found out, that was not the case.
We noticed the issue when an attempt to prepare the hosts resulted in a status of Not Ready. Adding to the mystery, there was a suspicious lack of activity in the recent tasks pane of the client. After running through the troubleshooting steps in this VMware KB and reviewing the required ports for NSX 6.2, we examined the EAM logs on vCenter and could tell something was not quite right: EAM wasn't healthy because it was attempting to use port 80. Opening the eam.properties file and seeing the first two lines confirmed this:
eam.properties is located in \Program Files\VMware\Infrastructure\tomcat\webapps\eam\WEB-INF\
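For illustration, the top of the file is a pair of key=value lines; the property names below are placeholders (not the literal file contents), and 8080 stands in for the customer's custom HTTP port. The second line is the one carrying the port:

```properties
# Illustrative sketch only -- property names are placeholders, and 8080
# stands in for vCenter's custom HTTP port (the value was originally 80).
eam.server=localhost
eam.port=8080
```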
After modifying the second line of eam.properties to use the same custom HTTP port as vCenter, we restarted the vCenter services and reviewed the EAM logs once more. The logs showed EAM as initialized and listening on its new port, so things were looking promising. We logged back into the vCenter Web Client and noticed the NSX extension (the Networking & Security plug-in) was missing. This is a known issue whenever vCenter starts after NSX Manager, so we restarted NSX Manager to restore access to the extension. A restart of the NSX Management Service alone would probably have sufficed, but we were in "troubleshooting mode" and wanted to eliminate as many variables as possible. When we attempted to prepare the hosts again, we observed the following:
The individual host scans would all eventually time out, leaving us once again with hosts in a Not Ready state. This time we turned to the ESXi patch and update log (/var/log/esxupdate.log) and found errors similar to those described in this VMware KB. It turned out we had only fixed half of the problem: ESXi's restrictive firewall also needed to be opened in the outbound direction on the custom EAM port. Following this VMware KB, we opened the firewall outbound on the EAM port and made the change persistent across reboots. We applied the fix to one of the hosts as a test and observed the following.
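The firewall change follows the usual ESXi pattern: drop a custom ruleset XML into /etc/vmware/firewall/ on the host and refresh the firewall. A minimal sketch is below; the port (8080), the service id, and the file/rule names are assumptions standing in for the customer's actual values:

```shell
# Sketch only: 8080 stands in for the custom EAM/HTTP port, and the
# service id and ruleset name are placeholders. On the ESXi host this
# file belongs in /etc/vmware/firewall/.
cat > eamcustom.xml << 'EOF'
<ConfigRoot>
  <service id='0042'>
    <id>eamCustomPort</id>
    <rule id='0000'>
      <direction>outbound</direction>
      <protocol>tcp</protocol>
      <porttype>dst</porttype>
      <port>8080</port>
    </rule>
    <enabled>true</enabled>
    <required>false</required>
  </service>
</ConfigRoot>
EOF

# On the host, load the ruleset and confirm it shows up:
#   esxcli network firewall refresh
#   esxcli network firewall ruleset list | grep eamCustomPort
```

Because /etc/vmware/firewall/ does not survive a reboot, the change is made persistent by keeping a copy of the XML on a datastore and restoring it at boot from /etc/rc.local.d/local.sh, followed by another firewall refresh.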
The scan completed almost instantly, and the host went on to the install phase.
We repeated the firewall change on the remaining hosts, and all were finally prepared successfully.
This was a vCenter/ESXi 5.5 environment. VMware states here that:
“VMware vSphere 6.0 supports VIB downloads over port 443 (instead of port 80). This port is opened and closed dynamically. The intermediate devices between the ESXi hosts and vCenter Server must allow traffic using this port.”
So this issue may no longer exist in vSphere 6 environments, as long as neither port 80 nor port 443 is in use by another application on the vCenter Server.