viKernel Lab

I tend to use a combination of whitebox bare-metal and nested vSphere environments in my home lab. I prefer a nested solution for the flexibility it provides.

viKernel viLab 2015-2017

In the current design I have deployed two mid-sized whitebox ESXi hosts running vSphere 6, with an additional two nested ESXi 5.5 hosts for dev/test purposes. Infrastructure servers such as AD/DNS, DHCP, firewall appliances, vCenter Server, SRM, vROps, etc. run on the top-level/parent hosts. In addition to the two nested dev/test ESXi hosts, I also run a nested ESXi host that acts as a replication partner for the parent ESXi hosts. Using vSphere Replication is a great way to test VMware SRM if you don't have an SRM-supported storage array.
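Running nested ESXi hosts like this requires hardware-assisted virtualization to be exposed to the guest VM. A minimal sketch of the relevant `.vmx` entries for a nested ESXi 5.5 VM is below; the exact guest OS identifier depends on the parent host's version, so treat these values as an assumption to verify against your environment:

```
# .vmx fragment for a nested ESXi VM (illustrative values)
guestOS = "vmkernel5"           # ESXi 5.x guest OS type
vhv.enable = "TRUE"             # expose Intel VT-x/EPT to the guest
ethernet0.virtualDev = "e1000"  # e1000/e1000e are safe NIC choices for nested ESXi
```

On the parent host's vSwitch, the port group carrying the nested hosts' traffic also needs its security policy set to accept Promiscuous Mode and Forged Transmits, or VMs running inside the nested hosts will have no network connectivity.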

Long term I intend to expand the physical node count, but for now the current design works pretty well.

2x Compute node specifications:

  • CPU – 1x Intel Xeon E5-2630 V3 s2011-3
  • Cooling – 1x Noctua NH-D9DX i4 3U 92mm Intel Xeon
  • Memory – 4x 16GB Samsung M393A2G40DB0-CPB
  • Mainboard – Supermicro MBD-X10SRH-CLN4F-B
  • Disk – 2x WD Red 2TB
  • Disk – 2x Intel SSD 730 480GB 2.5″
  • Network – Intel 10GbE Dual Port E10G42B X520-DA2 cards

Supermicro MBD-X10SRH-CLN4F-B Mainboard Specifications:

  • Single socket R3 (LGA 2011) supports Intel® Xeon® processor E5-2600 v3 and E5-1600 v3 family
    • Intel® C612 chipset
  • Up to 512GB ECC DDR4 2133MHz LRDIMM; 8x DIMM slots
  • 1 PCI-E 3.0 x4 (in x8), 1 PCI-E 3.0 x8 (in x16), 2 PCI-E 3.0 x8, 1 PCI-E 2.0 x2 (in x4), 1 PCI-E 2.0 x4 (in x8)
  • Intel® i350-AM4 Quad port GbE LAN
  • 10x SATA3 (6Gbps) via C612
  • 8x SAS3 (12Gbps) via LSI 3008 SW controller; RAID 0, 1, 10
  • 1x VGA, 2x COM, 1x TPM
  • 4x USB 3.0 ports, 8x USB 2.0 ports
  • 2x SuperDOM with built-in power

Bill of Materials:

Storage:
  • Synology DS1813+ (primary)
    • Presenting NFS to Physical and nested ESXi hosts.
    • 3x 2TB WD Red 3.5″ drives – NL tier
    • 2x Samsung 850 Pro 256GB SSDs – performance tier
    • 2x OCZ Agility 3 120GB SSDs – Synology cache disks
  • Synology DS714+ (secondary)
    • Predominantly used as a backup repository
    • 2x 3TB WD Red 3.5″ drives
  • x86 Solaris 11 host with COMSTAR used as an FC target (retired)
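The DS1813+ presents NFS datastores to both the physical and nested ESXi hosts. Mounting an NFS export as a datastore can be done directly with esxcli; the hostname and export path below are placeholders for illustration:

```
# Mount an NFSv3 export as a datastore on an ESXi host
# (hostname and share path are hypothetical examples)
esxcli storage nfs add --host ds1813.lab.local --share /volume1/nfs-ds01 --volume-name nfs-ds01

# Verify the datastore is mounted
esxcli storage nfs list
```

One nice property of NFS here is that the same export can be mounted by the parent hosts and the nested hosts alike, so test VMs can be moved between layers without copying data.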


Network:

  • Cisco ASA 5505 firewall (Security Plus license)
  • Dell PowerConnect 6224 L3 switch
  • HP V1910 24-port Gigabit switch (retired)

MacBook – VMware Fusion (running a vSphere 5.5 environment)

I don't tend to use the MacBook much for lab exercises anymore, but it is helpful when demoing something where remote connectivity is limited. In this setup I just run two ESXi nodes, a VCSA, and an AD/DNS server.

  • MBP – Retina display: 15.4-inch
  • CPU – 2.3 GHz Intel Core i7
  • RAM – 16 GB 1600 MHz DDR3
  • Disk – Fusion I/O 500GB SSD

Additional Resources:

viLab 2013-2015

2x HP N40L running ESXi 5 (retired)

  • AMD Turion II (dual-core) @ 1.5 GHz
  • 16GB (2x Corsair XMS3 8GB 240-pin DDR3 1333 desktop memory)
  • Onboard NC107i PCI Express Gigabit 10/100/1000
  • Additional Intel PRO/1000 MT dual-port server adapter
  • 16GB USB memory stick for the ESXi OS

1x HP N40L running ESXi 5
Function: OpenIndiana, mail server, DC/DNS server, vCenter

  • AMD Turion II (dual-core) @ 1.5 GHz
  • 16GB (2x Corsair XMS3 8GB 240-pin DDR3 1333 desktop memory)
  • Onboard NC107i PCI Express Gigabit 10/100/1000
  • Additional Intel PRO/1000 MT dual-port server adapter
  • 16GB USB memory stick for the ESXi OS
  • 3ware 9750-4i SAS controller
  • 2x OCZ Agility 3 120GB SSDs
  • 4x Seagate 2TB 7.2k drives
