What’s the best affordable pre-built mini server available in Europe? I’m looking for a reliable and compact option that won’t break the bank
Edit: something that is not ARM-based
Edit 2: I’m looking to set up a system that can run Jellyfin, Ollama, and a few other small services. By ‘pre-built’, I mean I want to buy a device that already has the necessary hardware and installation, so all I need to do is install the operating system and I’m good to go
You need to first explain what you want the server for, because that will give us an idea of your CPU and storage requirements.
I’m looking to set up a system that can run Jellyfin, Ollama, and a few other small services.
Ollama is a big ask. Do you want it to be fast? Then you'll need a GPU. How large is the model you'll be running? A 7/8B model on CPU isn't as fast, but it's no problem; 13B is slow on CPU, but possible.
I'd be satisfied if it can run a 7/8B model relatively fast.
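As a rough sanity check on the "7/8B on CPU" question, you can estimate the RAM a quantized model needs from its parameter count. This is a back-of-the-envelope heuristic (the 1.2× overhead factor is my own assumption for KV cache and runtime overhead, not an official Ollama figure):

```python
def model_ram_gb(params_billions: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Estimate RAM needed to hold a quantized model.

    params_billions: model size in billions of parameters (e.g. 7 for a 7B model)
    bits_per_weight: quantization level (4 for the common Q4 quants)
    overhead: fudge factor for KV cache and runtime overhead (assumed)
    """
    bytes_per_weight = bits_per_weight / 8
    return params_billions * bytes_per_weight * overhead

print(round(model_ram_gb(7, 4), 1))   # a 7B model at Q4: roughly 4 GB
print(round(model_ram_gb(13, 4), 1))  # a 13B model at Q4: roughly 8 GB
```

So a 7/8B model at 4-bit quantization fits comfortably in a 16 GB mini PC with room left for Jellyfin and other services.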
Check out used tiny/mini/micro desktops on eBay. Loads of info here: https://www.servethehome.com/introducing-project-tinyminimicro-home-lab-revolution/
Only downside is going to be no GPU for the AI workload. Maybe some of the later AMD APUs could cut it. If not, all three major manufacturers have SFF variants that are pretty much the same hardware in a slightly bigger case. Those will accept smaller off-the-shelf GPUs.
Edit: something that is not ARM-based
You want pre-built to run ollama, that’s at least gonna cost you an arm, maybe even a leg.
It depends on the model you run. Mistral, Gemma, or Phi are great for a majority of devices, even with CPU or integrated graphics inference.
I'm not sure if they're still affordable, but I ended up getting both a Morefine and a Beelink, one with the Intel N100 CPU and the other with the N305. They handle everything I've thrown at them and come with out-of-the-box Quick Sync transcoding for Jellyfin/Plex; they handle a 4K transcode like a champ. Couple that with 2.5G Ethernet, and they sip power. Though they might have gone up in price since I bought mine.
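To actually use Quick Sync in a containerized Jellyfin, you generally just pass the iGPU's render device into the container. A minimal sketch of a Docker Compose service, assuming Docker and an Intel iGPU exposed at /dev/dri (the image tag and paths are common defaults; verify against the Jellyfin hardware-acceleration docs for your setup):

```yaml
services:
  jellyfin:
    image: jellyfin/jellyfin
    devices:
      - /dev/dri:/dev/dri   # expose the Intel iGPU for Quick Sync (QSV) transcoding
    volumes:
      - ./config:/config    # example paths, adjust to your layout
      - ./media:/media
    ports:
      - "8096:8096"
```

After that, hardware transcoding still has to be enabled inside Jellyfin under Dashboard → Playback.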
I have a Beelink running Jellyfin and it's fine. It has the N100.
A 2-bay NAS with a Ryzen 7 and up to 32 GB of RAM.
Thanks, I will take a look at that.
There is also a cheaper option with an N100.
Define “pre-built”?
What will you be using it for?
I'm still testing the CWWK X86 P5, but so far it seems like a near-perfect solution for personal use, and the price can't be beat.
deleted by creator
I want to run Jellyfin, ollama (local AI), and some other small services.
Define pre-built.
How big is your media library?
Ollama will probably benefit from a more powerful processor like a Ryzen 7000/8000 series.
Can't comment on NPUs and TOPS and all that. Still completely at a loss as to what is or is not marketing nonsense when it comes to AI. But the new Ryzen CPUs are supposed to have more of those, if you can wait.
By ‘pre-built’, I mean I want to buy a device that already has the necessary hardware and installation, so all I need to do is install the operating system and I’m good to go. Storage capacity should be between 1 and 8 terabytes.
I want to buy a device that already has the necessary hardware
Does that include disks? That’s going to limit your options considerably.
The one I recommended does not come with RAM or storage but those are both very easy to install and you can choose what size. It also comes with 2xSATA/power connectors. You could get 2x8TB 2.5" SSDs but I’d probably go with 4x2TB NVMe, that would give you 8TB RAW or 6TB with a parity disk.
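The raw-versus-parity numbers above follow directly from a single-parity layout (RAID-5 or SnapRAID-style), where one disk's worth of capacity goes to parity. A quick sketch of the arithmetic:

```python
def usable_tb(n_disks: int, size_tb: float, parity_disks: int = 1) -> float:
    # Single-parity layout: the capacity of (n - parity) disks is usable;
    # with parity_disks=0 this is just the raw total.
    return (n_disks - parity_disks) * size_tb

print(usable_tb(4, 2, parity_disks=0))  # 4x2TB raw: 8 TB
print(usable_tb(4, 2, parity_disks=1))  # with one parity disk: 6 TB usable
print(usable_tb(2, 8, parity_disks=0))  # 2x8TB SSDs raw: 16 TB, but no parity option beyond mirroring
```

The trade-off the comment is pointing at: four smaller disks cost one disk to parity but survive a single-drive failure, while two big disks only give redundancy via a full mirror (half the capacity).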
Pre-built like a traditional server? I personally use an Orange Pi and it's pretty good. Just make sure to use the open-source ARM OS.
I love my NUCs but haven’t really paid attention to what has happened since Intel sold that line to ASUS.
Thanks, I will take a look at the NUC.
Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:
NUC: Next Unit of Computing (brand of Intel small computers)
NVMe: Non-Volatile Memory Express (interface for mass storage)
Plex: brand of media server package
SSD: Solid State Drive (mass storage)
4 acronyms in this thread; the most compressed thread commented on today has 17 acronyms.
[Thread #785 for this sub, first seen 5th Jun 2024, 13:55]
I like my HPE MicroServer Gen10 Plus.
It does not come with a GPU by default, but you can install a low-power one.
How small? How many drives? I bought several used Lenovo P330s (Xeon E-2276G) for my servers.
The Intel CPU has a great low-power iGPU for video encoding/decoding, ideal for video streaming.
The Xeon's ECC RAM gives long-term reliability, which is important if you leave your PC on 24/7 for years at a time.
I am not a big fan of buying used.
Why not? It would help massively with the ‘affordable’ criterion
Used servers/workstations are likely more reliable than new consumer hardware.
They were very likely kept temperature-controlled, have ECC, and are actually known working, instead of something untested like an Asus consumer board. If I remember correctly, PC mortality is very high in the first 6 months, drops to near zero for about 5 years, then starts going back up.
Replace the SSD/hard drive and you are good. You might not even have to do that. I checked the stats on the SSD that came with my used Lenovo workstation and it had like 20 hours on it.
What are you gonna use it for?
something that is not ARM-based

I’m looking to set up a system that can run Jellyfin, Ollama, and a few other small services. By ‘pre-built’, I mean I want to buy a device that already has the necessary hardware and installation, so all I need to do is install the operating system and I’m good to go
deleted by creator
I've had a good experience so far with two mini PCs: a MeLE Quieter 4C for Kodi and a Morefine M9 (I think this one is branded as Mipowcat in the EU). They're both N100; the M9 can go up to 32 GB of RAM, although it is picky about which modules it will accept. I use the M9 for Jellyfin and about 10 other services. Quick Sync works great as far as I've tested it. For Jellyfin I'm relying mostly on direct streaming, but I tried a few episodes with forced transcoding (by using Firefox for playback) and it worked fine.
If you want to run Ollama and other ML stuff, you're looking at buying an RTX 4090, my friend. "Affordable" and ML are two things you can't put in the same sentence.
While you certainly can run AI models that require such a beefy GPU, there are plenty of models that run fine even on a CPU-only system. So it really depends on what exactly Ollama is going to be used for.
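As a rough rule of thumb for that "depends on the model" point, you can map the RAM a box has free to the largest 4-bit-quantized model class that comfortably fits. The thresholds here are my own heuristic, not anything from the Ollama docs:

```python
def pick_model_class(free_ram_gb: float) -> str:
    """Rough heuristic (assumed thresholds): largest 4-bit-quantized
    model class that comfortably fits in the given free RAM."""
    if free_ram_gb >= 12:
        return "13B"
    if free_ram_gb >= 6:
        return "7B/8B"
    if free_ram_gb >= 3:
        return "3B-class (e.g. Phi-3 mini)"
    return "sub-2B small model"

print(pick_model_class(16))  # 13B
print(pick_model_class(8))   # 7B/8B
```

By this yardstick, a 16 GB N100/N305 mini PC can hold a 7/8B model with headroom for Jellyfin; speed is then the CPU's problem, not the RAM's.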