My journey to stable hosting hardware is one that seems to bite a lot of Home Assistant users in the beginning, partly because the Home Assistant Getting Started guide will tell you that a Raspberry Pi is a good, affordable starting point for your home automation journey. This is far from the truth. First, though, I'll cover how things are currently hosted. I say "currently" because it is entirely possible that this may change in the future. That is one of the strengths of Home Assistant: it is easy to move to a new environment.
Current Hosting Infrastructure
My Home Assistant infrastructure consists of a Lanner FW-7582 1U network appliance computer running the VMware ESXi 6.7.0U2 hypervisor. Home Assistant is installed within a Python venv on top of Ubuntu Server 22.04 LTS configured as a VM.
The FW-7582 was not chosen for any particular reason other than that I happened to have it. A few months earlier I had bought a "box of network equipment" at a local auction for a reasonable bid. I wanted it because it was full of Netgear gigabit PoE-capable switches. Also in the box was this strange fluorescent yellow thing, which upon closer examination turned out to be a rack-mountable computer of some kind with 4GB of RAM and a Kingston 120GB SSD. Score! Inside it is standard PC hardware for the most part: a Core i3-2120 at 3.3 GHz (2 cores, 4 threads) and six Intel 82583 gigabit Ethernet controllers broken out to six front-mounted RJ45 connectors. This particular unit is branded "Vexalor", a commonly used restaurant gift card management system by Gixex. Out of the box it booted to a Linux login prompt. Not having the root password, I gave up after a few attempts, but I did clone the drive because who knows, I might be bored in the future and mount the drive to see if I can harvest all the customer data probably so thoughtfully left on the system.
I upgraded the FW-7582 to 8GB of RAM as the included 4GB seemed a bit light. It was a bit disappointing that while the datasheet indicated the system was equipped with Intel's Rapid Storage Technology software-based RAID, I could not find any indication in the BIOS that it was actually there: no firmware updates, no jumpers, no BIOS switches to enable it. So to add redundancy I installed a Marvell 88SE9128 based PCIe SATA 6 Gb/s RAID controller I had on hand. That controller runs two WD Black 500GB laptop HDDs mounted on a dual 2.5" to 3.5" adapter so that they fit into the single 3.5" bay the chassis provides. VMware does recognize the 88SE9128 chipset, so I get visibility into the array status. The array is just a mirror, RAID1. I am not worried about I/O performance so much as reliability, and with two drives available, a mirror is the best choice.
As mentioned, Home Assistant runs on Ubuntu Server 22.04 within a VM. While bare metal would have eliminated a layer in that setup, there were a number of reasons I chose virtualization. First, it means that in the event of a catastrophic hardware failure (such as a dead motherboard) I could easily move the VM to nearly any other hosting hardware and boot it up in a matter of minutes. On the same theme, it is trivially easy to take a full and complete backup of the virtual hard drive (VMDK files) at any point, for routine backups and for disaster recovery if needed.
I use this functionality when performing upgrades or making any significant system level changes.
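From the ESXi shell, a pre-upgrade checkpoint and a cold backup of the disk look roughly like the following. This is a sketch, not my exact procedure; the VM ID, datastore name, and paths are illustrative and will differ on any given host.

```shell
# List registered VMs to find the numeric VM ID
vim-cmd vmsvc/getallvms

# Take a snapshot before a risky upgrade
# (arguments: vmid, snapshot name, description, include memory, quiesce)
vim-cmd vmsvc/snapshot.create 12 "pre-upgrade" "before HA upgrade" 0 0

# For a full cold backup: power the VM off first, then clone the disk.
# vmkfstools -i writes a consolidated copy of the VMDK.
vmkfstools -i /vmfs/volumes/datastore1/homeassistant/homeassistant.vmdk \
    /vmfs/volumes/datastore1/backups/homeassistant-backup.vmdk
```

If the upgrade goes sideways, reverting the snapshot puts the system back exactly where it was in seconds, which is the whole point of the VM layer.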
Previous Hosting Infrastructure
My first Home Assistant installation was performed on an Orange Pi Zero using a random spare 4GB SD card I had lying around. I installed Home Assistant 0.65 on Armbian 5.38 and began experimenting and playing with it.
Everything seemed to work fine with decent performance, so I promoted this system to "production" and it became my Home Assistant instance.
However, about 6-8 months into use, it became unstable. After adding more than a few entities, performance took a nose dive due to SD card write latency, so I moved the database from the default SQLite instance to MySQL. It made a big difference, though the system was still sometimes I/O bound. And after two burned-out SD cards, I upgraded to a 16GB Kingston industrial-rated card. That seemed to help things a little, until the system, over the period of a week, became increasingly unstable to the point where it wouldn't boot. During that week, performance was completely unacceptable: multiple-second delays when turning on a light and automations not reliably triggering.
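For reference, pointing the recorder at MySQL is a one-stanza change in configuration.yaml. The host, database name, and credentials below are placeholders; the MySQL server and an empty database must already exist, and a venv install also needs a MySQL client library (e.g. `pip install mysqlclient`) before Home Assistant can connect.

```yaml
# configuration.yaml -- send recorder data to an external MySQL server
# instead of the default on-disk SQLite database
recorder:
  db_url: mysql://hauser:hapassword@192.168.1.10/homeassistant?charset=utf8mb4
```

Moving the recorder off the SD card shifts the constant state-history writes, which are by far the heaviest I/O Home Assistant generates, onto hardware that can actually take them.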
It was upon connecting the Orange Pi Zero to a console that I noticed the likely cause of the increasing instability: the CPU had burned. The label was blackened, the case distorted. That Orange Pi Zero was done.
A lesson learned here: never, ever use an SD card based device to run Home Assistant unless it is just for testing. And be very careful in your choice of single board computer, because while the Home Assistant documentation will lead you to believe you can build a reliable and responsive system on a Raspberry Pi, a simple cruise through the forums will very quickly confirm otherwise.
With the Orange Pi Zero dead, I had to scramble to replace the system because half my house wasn't working (incidentally, this is exactly the scenario that is completely unacceptable for home automation). I quickly ordered an Orange Pi Prime. Shipping from China meant I might have a bit of a wait, so in the meantime I grabbed my old HP Compaq TC4200 convertible tablet. With a Pentium M 1.8GHz CPU, it had been upgraded along the way to 3GB RAM and a 64GB Transcend (parallel ATA!) SSD. I loaded the latest version of Ubuntu Server and the necessary support software, then the latest version of Home Assistant, restored my configuration, and made a few minor tweaks due to breaking changes and some hardware sensors. The laptop was configured to run with its lid closed, connected via Ethernet, and took over for the failed Orange Pi Zero.
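Getting an Ubuntu laptop to keep running with the lid closed is a systemd-logind setting rather than anything Home Assistant specific. A minimal sketch (edit as root, then restart systemd-logind or reboot):

```ini
# /etc/systemd/logind.conf -- don't suspend when the lid closes,
# whether or not the machine thinks it is docked
[Login]
HandleLidSwitch=ignore
HandleLidSwitchDocked=ignore
```

Without this, closing the lid suspends the machine and takes the whole house's automation down with it.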
This was an eye-opener. Even though the TC4200 is roughly a 2002-vintage system, the performance compared to the Orange Pi was staggering, which of course could be expected seeing as it used a real SSD instead of an SD card and had 6x the RAM. Over the next few weeks I decided that trying to run Home Assistant on a single board computer is stupid. The compromises for the advantages (low cost, low power use) are too many. Even a 20-year-old Pentium M CPU was much faster, the SSD meant no more I/O-bound operations, and I no longer had to worry about the database burning out SD cards. Not to mention the built-in UPS (battery), keyboard, and mouse.
The extra power allowed me to add some additional integrations I could never have used on the Orange Pi Zero. For example, integrating my security cameras via the Generic Camera platform.
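At the time, the Generic Camera platform was configured in YAML along these lines. The name and URLs are placeholders for whatever still-image and stream endpoints your cameras expose, and newer Home Assistant releases configure this integration through the UI instead.

```yaml
# configuration.yaml -- pull a camera feed in via the generic platform
camera:
  - platform: generic
    name: Driveway
    still_image_url: http://192.168.1.20/snapshot.jpg
    stream_source: rtsp://192.168.1.20:554/stream1
```

Decoding and proxying camera streams is exactly the kind of sustained CPU and I/O load that would have flattened the Orange Pi Zero.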
This would be my hosting infrastructure for over a year before I moved to the VM based setup on the Lanner FW-7582. I never bothered to migrate Home Assistant to the Orange Pi Prime.