Does Unraid need a graphics card (GPU)?

Does Unraid need a graphics card (GPU)? And if not, why do so many users have a high-end and power-hungry model installed?

Disclosure: This post contains affiliate links. If you click through and make a purchase, I’ll earn a commission, at no additional cost to you. Read my full disclosure here.

Unraid machines often serve more than a single purpose beyond network-attached storage or application servers. Depending on where you first learnt of Unraid, you could be forgiven for thinking that its prime purpose is to be the underlying operating system for a Plex Media Server or a gaming battlestation (take this video from Linus Tech Tips as an example).

But does Unraid even need a graphics card (GPU)? And if not, why do so many users have not just any GPU in their Unraid machine, but a high-end, power-hungry model? Those are just a few of the questions this article will answer.

Unraid does not need a GPU

To answer the question asked in the title of this article: No, Unraid does not need a GPU. Unraid doesn’t even need an integrated GPU (iGPU), such as those present on many Intel processors (Intel Xe, Intel UHD, and Iris Plus Graphics) or on a select number of AMD processors (AMD Accelerated Processing Units). You can get started with Unraid without ever having to attach a monitor to the machine itself. Such a setup, one with no video output, is called a headless Unraid system.

Unraid is managed using a web dashboard, and that dashboard can be accessed from any browser, on any computer (or even a phone or tablet). All you have to do is find the machine’s IP address, for example in your router’s list of connected devices.
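As a quick illustration (not an official Unraid tool), the short Python sketch below checks whether a machine at a given address is answering on the web dashboard’s default HTTP port. The IP address is purely a placeholder; substitute whatever your router reports for the Unraid machine.

```python
# Minimal sketch: check whether the Unraid web dashboard answers on port 80.
# The IP address below is a placeholder for the address your router reports.
import urllib.request

UNRAID_IP = "192.168.1.50"  # hypothetical address

try:
    with urllib.request.urlopen(f"http://{UNRAID_IP}/", timeout=5) as response:
        print(f"Dashboard reachable, HTTP status {response.status}")
except OSError as err:
    print(f"Could not reach {UNRAID_IP}: {err}")
```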

Your motherboard or CPU might, though

Even if Unraid doesn’t need a GPU, your motherboard or CPU might. Before you hit the purchase button on your dream Unraid rig, you must check the manuals or search for reports on whether your combination of hardware can boot without a GPU installed.

I should know, because I ran into this issue myself when I built my Unraid machine using an AMD Ryzen processor and an MSI B450M MORTAR motherboard. Despite not having an iGPU, the AMD Ryzen CPU I chose is perfectly capable of booting in a headless system. The motherboard, on the other hand, wasn’t happy with my setup. I ended up buying a cheap, second-hand Nvidia GPU just to be able to boot and use Unraid. MSI has since enabled headless setups via a BIOS update, but at the time I had no other choice.

Headless debugging isn’t fun

Even though I no longer need the GPU in my Unraid machine, I still have it installed because of a recent issue I experienced. My machine would boot, and I could even access the dashboard, but within a matter of minutes it would freeze and become totally unresponsive. While the dashboard was inaccessible, the shell was still in working order. Using a monitor and keyboard, I could save a log file onto a USB drive, analyse it, and finally resolve the issue (I ended up having to reduce the speed of my RAM).
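For anyone in a similar situation, the rescue boils down to copying the system log somewhere you can read it on another computer. The Python sketch below shows roughly what that looks like; the syslog path matches a typical Linux location, and the USB mount point is purely an assumption for illustration.

```python
# Minimal sketch: copy the system log to a mounted USB drive so it can be
# analysed on another machine. Both paths are assumptions for illustration;
# adjust them to wherever your log lives and your drive is mounted.
import shutil
from pathlib import Path

SYSLOG = Path("/var/log/syslog")   # typical Linux syslog location
USB_MOUNT = Path("/mnt/usb")       # hypothetical USB mount point

destination = USB_MOUNT / "unraid-syslog.txt"
shutil.copyfile(SYSLOG, destination)
print(f"Saved {SYSLOG} to {destination}")
```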


Why put a GPU in an Unraid machine?

There are at least four reasons why many Unraid builds contain a GPU:

  • Gaming: because Unraid has KVM capabilities, many users pass the GPU through to a virtual machine and use the system as a gaming rig.
  • Cryptocurrency mining: there are bound to be miners that use Unraid as their operating system.
  • Hardware transcoding: Plex can transcode video files using a GPU for remote viewing (read my article on why the GTX 1650 SUPER is the best GPU for this use case here), and video producers can archive their files in a more efficient codec without taxing their editing rig (see the sketch below for a quick way to check that the GPU is visible).
  • Research: certain applications, such as Folding@home, can use a GPU’s power to help solve real-world problems.
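Whatever the use case, the first step is usually confirming that the host can actually see the card. As a rough sketch (assuming an Nvidia GPU and that the Nvidia drivers, and therefore the nvidia-smi tool, are installed on the host), the Python snippet below lists the GPUs the system reports before you point Plex or a virtual machine at them.

```python
# Rough sketch: ask nvidia-smi which GPUs the host can see before handing
# one to Plex or a virtual machine. Assumes an Nvidia card and that the
# Nvidia drivers (and therefore nvidia-smi) are installed on the host.
import subprocess

result = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
    capture_output=True,
    text=True,
)

if result.returncode == 0 and result.stdout.strip():
    print("GPUs visible to the host:")
    print(result.stdout.strip())
else:
    print("No GPU reported; check drivers or passthrough settings.")
    print(result.stderr.strip())
```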

About Liam Alexander Colman

I first heard of Unraid through the same medium as many of us did: the Linus Tech Tips channel on YouTube. At the time, I was running TrueNAS (or FreeNAS as it was called back then) on my DIY NAS with a dual-core Intel Pentium G4400 at its heart. I was convinced I had chosen the better operating system. After all, it was free and open-source and had a large community behind it. One day, after once again facing the need to buy another three hard drives, I seriously started researching Unraid and its features. I bit the bullet and gave it a go, transferring my data onto external hard drives that I later shucked and added to the Unraid array. Since that day, I have not looked back once, and I am now an enthusiastic and experienced user of Unraid. You can find out more about Unraid Guides right here.
