Unraid machines often serve more than a single purpose beyond network-attached storage or application hosting. Depending on where you first learnt of Unraid, you could even be forgiven for thinking that its prime purpose is to be the underlying operating system for a Plex Media Server build or a gaming battlestation (take this video from Linus Tech Tips as an example).
But does Unraid even need a graphics card (GPU)? And if not, why do so many users put not just any GPU in their Unraid machine, but a high-end, power-hungry model? Those are just a few of the questions this article will answer.
Unraid does not need a GPU
To answer the question asked in the title of this article: No, Unraid does not need a GPU. Unraid doesn’t even need an integrated GPU (iGPU), such as those present on many Intel processors (Intel Xe, Intel UHD, and Iris Plus Graphics) or on a select number of AMD processors (AMD Accelerated Processing Units). You can get started with Unraid without ever having to attach a monitor to the machine itself. Such a setup, one with no video output, is called a headless Unraid system.
Unraid is managed through a web dashboard, and that dashboard can be accessed from any browser, on any computer (or even a phone or tablet). All you have to do is look up the machine’s IP address in your router’s client list, and nothing more.
Your motherboard or CPU might though
Even if Unraid doesn’t need a GPU, your motherboard or CPU might. Before you hit the purchase button on your dream Unraid rig, check the manuals or search for user reports to confirm that your combination of hardware can boot without a GPU.
I should know, because I ran into this issue myself when I built my Unraid machine using an AMD Ryzen processor and an MSI B450M MORTAR motherboard. Despite not having an iGPU, the AMD Ryzen CPU I chose is perfectly capable of booting in a headless system. The motherboard, on the other hand, wasn’t happy with my setup. I ended up buying a cheap, second-hand Nvidia GPU just to be able to boot and use Unraid. MSI has since allowed headless setups via a BIOS update, but at the time I had no other choice.
Headless debugging isn’t fun
Even though I no longer need the GPU in my Unraid machine, I still have it installed because of an issue I experienced recently. My machine would boot, and I could even access the dashboard, but within minutes it would freeze and become totally unresponsive. While the dashboard was inaccessible, the shell was still in working order. Using a monitor and keyboard, I could save a log file onto a USB drive, analyse it, and finally resolve the issue (I ended up having to reduce the speed of my RAM).
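That rescue procedure can be sketched from the Unraid console roughly as follows. The device node and mount point below are placeholders, not values from my system, so check the `lsblk` output before mounting anything:

```shell
# Identify the USB stick's device node (size and model help spot it)
lsblk -o NAME,SIZE,MODEL

# Mount the stick and copy the system log for offline analysis
# (/dev/sdX1 and /mnt/usb are placeholders -- adjust to your machine)
mkdir -p /mnt/usb
mount /dev/sdX1 /mnt/usb
cp /var/log/syslog /mnt/usb/
umount /mnt/usb
```

With the copy on the stick, you can read through it on any other computer even while the Unraid machine itself is frozen or rebooted.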
Why put a GPU in an Unraid machine?
There are at least five reasons why many Unraid builds contain a GPU. Because Unraid has KVM virtualisation capabilities, many users pass the GPU through to a virtual machine and use the system for gaming. Some cryptocurrency miners use Unraid as their operating system. Plex can transcode video files using a GPU for remote viewing (read my article on why the GTX 1650 SUPER is the best GPU for this use-case here). A video producer might want to archive files in a more efficient codec without taxing their editing rig. Finally, research applications such as Folding@home can use a GPU’s power to help solve real-world problems.
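Before committing to GPU passthrough for a VM, it is worth checking how your motherboard splits PCI devices into IOMMU groups, since a GPU generally needs to sit in its own group (or share it only with its own audio function) to be passed through cleanly. The loop below is a common community snippet rather than anything Unraid-specific; it assumes `lspci` is installed and that IOMMU is enabled in the BIOS and kernel:

```shell
# Print every PCI device, grouped by IOMMU group
for g in /sys/kernel/iommu_groups/*; do
    [ -d "$g" ] || continue        # skip if no IOMMU groups exist
    echo "IOMMU group ${g##*/}:"
    for d in "$g"/devices/*; do
        # The directory name is the PCI address, e.g. 0000:01:00.0
        echo "  $(lspci -nns "${d##*/}")"
    done
done
```

If the GPU shares a group with unrelated devices, passthrough usually requires ACS override workarounds, which come with their own trade-offs.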