Intel NUCs with ESXi are a proven standard for virtualization home labs. I'm currently running a homelab consisting of 3 Intel NUCs and FreeNAS-based all-flash storage. If you are generally interested in running ESXi on Intel NUCs, read this post first. One major drawback is that they only have a single Gigabit network adapter. This might be sufficient for a standalone ESXi host with a few VMs, but when you want to use shared storage or VMware NSX, you definitely want additional NICs.

A few months ago, this problem was solved by an unofficial driver made available by VMware engineer William Lam.

[UPDATE 2017-02: I've updated this post because we have another great driver by Jose Gomes and some changes in ESXi 6.5]

[UPDATE 2020-06: Update to add USB NIC Fling]

Prerequisites

These drivers are intended for systems like the Intel NUC that do not have PCIe slots for additional network adapters. They are not officially supported by VMware. Do not install them in production.

The drivers are made for USB NICs with the AX88179 chipset, which are available for about $25. Adapters with this chipset from Anker and Startech have been verified to work (both are used in the performance tests below).

Make sure that the system supports USB 3.0 and that the network adapters are mapped to the USB 3.0 hub. Legacy BIOS settings might prevent ESXi from correctly mapping devices, as explained here.

Verify the USB configuration with lsusb -tv:

# lsusb -tv
Bus# 2
`-Dev# 1 Vendor 0x1d6b Product 0x0003 Linux Foundation 3.0 root hub
  `-Dev# 2 Vendor 0x0b95 Product 0x1790 ASIX Electronics Corp. AX88179 Gigabit Ethernet

Choose a driver

As of today, we have two different drivers, made by William Lam and Jose Gomes. Both drivers work without issues, so it's up to you which one you choose:

Installation

This is an example based on William Lam's ASIX driver and vSphere 6.0. If you are using another driver, please refer to the guide linked in the "Choose a driver" section.

1. Download the driver VIB from here.

2. Upload the driver to a datastore.

3. Install the driver:

# esxcli software vib install -v /vmfs/volumes/datastore/vghetto-ax88179-esxi60u2.vib -f

4. Verify that the driver has been loaded successfully:

# esxcli network nic list
Name    PCI Device    Driver        Admin Status  Link Status  Speed  Duplex  MAC Address        MTU   Description
------  ------------  ------------  ------------  -----------  -----  ------  -----------------  ----  -------------------------------------------------
vmnic0  0000:00:19.0  e1000e        Up            Up           1000   Full    b8:ae:ed:75:08:68  1500  Intel Corporation Ethernet Connection (3) I218-LM
vusb0   Pseudo        ax88179_178a  Up            Up           1000   Full    00:23:54:8c:43:45  1600  Unknown Unknown

# esxcfg-nics -l
Name    PCI           Driver        Link  Speed     Duplex  MAC Address        MTU   Description
vmnic0  0000:00:19.0  e1000e        Up    1000Mbps  Full    b8:ae:ed:75:08:68  1500  Intel Corporation Ethernet Connection (3) I218-LM
vusb0   Pseudo        ax88179_178a  Up    1000Mbps  Full    00:23:54:8c:43:45  1600  Unknown Unknown

5. Add the USB uplink to a Standard Switch or dvSwitch. You can do that with the vSphere Web Client or the VMware Host Client.



- Command line:

# esxcli network vswitch standard uplink add -u vusb0 -v vSwitch0

Please note that this does not work with the vSphere Client: USB network adapters are not visible when adding adapters to vSwitches with the C# Client.
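To confirm the uplink assignment, you can also list the vSwitch configuration from the ESXi shell. A quick check, assuming the default vSwitch0; vusb0 should appear in the Uplinks line of the output:

# esxcli network vswitch standard list -v vSwitch0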

ESXi Installation with USB NIC

It is possible to create a customized ESXi Image including the AX88179 driver. This might be useful if you want to install ESXi on a system without any compatible network adapter.

Creating a custom ESXi Image that includes the driver is very easy with ESXi-Customizer by Andreas Peetz.



Performance

I have tested the Anker and Startech adapters on two different NUCs, a NUC5i5MYHE and the new NUC6i7KYK. The receiving end was my shared storage, which has an Intel quad-port Gigabit adapter connected to a Cisco C2960G switch. I've compared the performance with the NUC's onboard NIC.

Latency

I've measured the latency in both directions. The performance of the two adapters is nearly identical, and both are slightly slower than the onboard NIC, probably due to USB overhead. The results are not bad at all:

Onboard NIC min/avg/max: 0.168/0.222/0.289 ms

AX88179 NIC min/avg/max: 0.193/0.310/0.483 ms
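For reference, latency tests like this can be reproduced from the ESXi shell with vmkping. A minimal sketch; the target IP is just a placeholder and -I selects the outgoing VMkernel interface:

# vmkping -c 100 -I vmk0 192.168.100.10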

Bandwidth

To measure the bandwidth I've used iPerf, which is available on ESXi by default.
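Here is a minimal sketch of how such a test can be run between two hosts, assuming the iperf binary that ships with ESXi under /usr/lib/vmware/vsan/bin/ (path and binary name vary by ESXi build); the IP address is a placeholder, and opening the firewall is for lab use only. On the receiving host, start the server:

# esxcli network firewall set --enabled false
# /usr/lib/vmware/vsan/bin/iperf -s

On the sending host, run the client and re-enable the firewall afterwards:

# /usr/lib/vmware/vsan/bin/iperf -c 192.168.100.10 -t 30
# esxcli network firewall set --enabled true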

RX Performance Onboard NIC: 938 Mbits/sec

RX Performance Startech AX88179: 829 Mbits/sec

RX Performance Anker AX88179: 839 Mbits/sec



TX Performance Onboard NIC: 927 Mbits/sec

TX Performance Startech AX88179: 511 Mbits/sec

TX Performance Anker AX88179: 527 Mbits/sec

You can use multiple adapters to stack the performance. The NUC has 4 USB 3.0 ports, one of which is used for the flash drive that ESXi boots from. I've tested the performance of 3 Startech adapters (a configuration sketch follows the results below):

RX Performance 3x Startech AX88179: 2565 Mbits/sec

TX Performance 3x Startech AX88179: 1522 Mbits/sec
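Stacking simply means adding each USB NIC as an additional uplink to the same vSwitch; how traffic is spread across them then depends on the load balancing policy and your physical switch configuration. A minimal sketch, assuming vusb0 is already an uplink of vSwitch0 and the additional adapters show up as vusb1 and vusb2:

# esxcli network vswitch standard uplink add -u vusb1 -v vSwitch0
# esxcli network vswitch standard uplink add -u vusb2 -v vSwitch0
# esxcli network vswitch standard policy failover set -v vSwitch0 -a vusb0,vusb1,vusb2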



When 3 network adapters are not enough, where is the limit? USB 3.0 supports up to 5000 Mbit/s. I've connected all 3 network adapters to the same port with a USB 3.0 hub. Here are the results:

RX Performance 3x Startech AX88179: 1951 Mbits/sec

TX Performance 3x Startech AX88179: 1555 Mbits/sec

As you can see, the overall performance, especially the TX performance (that is, sending data out of the NUC), cannot saturate the full bandwidth, but for a homelab it is sufficient and can be extended with multiple adapters. USB adapters can also be used with jumbo frames, so creating an NSX lab is possible.
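For jumbo frames, the MTU has to be raised on both the vSwitch and the VMkernel port. A minimal sketch, assuming a standard vSwitch0 and a VMkernel interface vmk1 used for VXLAN traffic (1600 bytes is the usual minimum for NSX VXLAN; your names and values may differ):

# esxcli network vswitch standard set -v vSwitch0 -m 1600
# esxcli network ip interface set -i vmk1 -m 1600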