
This is neat, although I have a word of caution (even if it might be a bit obvious): it's possible to find good deals, but you should be aware of power usage. There are modern mini PCs, such as those with Intel N100 processors, that are very cheap and consume very few watts while being useful for many purposes. I personally bought a brand-new CHUWI LarkBox X, and it's been great; it cost around 100 EUR on a deal. If, however, power usage isn't an issue for you and you don't care about other factors (noise levels, etc.), then you can disregard this.


I wouldn't automatically prefer any random N100 mini PC over a nice second hand enterprise mini PC.

In home server use cases, mini PCs stay idle the vast majority of their runtime, so idle power consumption is the most useful metric to look into. The N100 can have great idle performance in theory, but most data I can find about N100 boxes shows them idling in the 12W-15W range. That is something older enterprise mini desktops have no trouble matching or beating [1]. Especially since roughly the Skylake era (Intel 6th gen), idle power consumption for enterprise PCs has been excellent, but even before then it wasn't bad.

Enterprise vendors like Dell/HP/Lenovo have always optimized for TCO and usually use quite high-quality power-supply circuitry, whereas most N100 mini PCs tend to be built with cheaper components and aren't as optimized for low whole-system power usage.

[1]: I recommend reviewing Serve The Home's TinyMiniMicro project, which often finds the smallest enterprise PC form factors to idle at 8 to 13W, even older ones. Newer systems can get below 7W! https://www.servethehome.com/tag/tinyminimicro/


One can also do things like undervolting to reduce the power draw even more. Modern BIOSes give a lot of freedom for underclocking/undervolting, not just for pushing things to consume more power.


Power usage on these mini PCs is actually pretty decent.

I have a bunch of SFF computers (Dell 7060, HP 600 G4, etc) with i7-8700 or similar CPUs. They all idle around 12 watts.

Most of these mini PCs use the T version of the processors, which are usually 35W TDP.

Power usage will definitely be higher than an N100 (65W TDP vs 6W), but they're a lot more versatile, since you're getting more than double the performance, 2-3x the threads, and an iGPU that can do things like transcoding for Plex and accelerating ML models for Frigate/Scrypted.


The N100's QuickSync has been good enough for me for Plex transcoding FWIW, though maybe your demands are higher than mine in terms of resolution.


Can it do that while also running 4 drives in a ZFS RAID array? I've been thinking about building such a system but haven't decided on the CPU yet, so I'm afraid of getting something too underpowered.


In `top` you won't notice a difference in CPU utilization for the transcoding work with an Intel iGPU (as long as transcoding is being handled by it, of course).

The N100 is definitely powerful enough to run a ZFS RAID array. Depending on what all you'd like to run, it might be enough. Check it out with cpubenchmark's compare feature!

I used a Celeron G4900 (which also has an iGPU) as a Plex server for years, and it's half as powerful as an N100. The Celeron is a fairly slow processor, but for Plex it was enough, since the iGPU did the heavy lifting.


Very interesting, thanks! What about adding a few other server tasks, like automated backups, and running Immich (a Google Photos clone)? I'm seeing a lot of conflicting information online, with some people swearing by the N100 as a home server capable of these tasks, and others saying it's just too slow and something like an i3-13100 is needed. Since I can't exactly build a system and then return the motherboard/CPU if it's too slow, I'm a little afraid of under-speccing it. My goal is to have a 3-5 drive system with ZFS, and to use it for NAS, Jellyfin (with 4K transcodes, because of subtitles; probably never more than 1 movie at a time though), Immich photos, and backups.


You're definitely limiting your upside with an N100, but it might be enough like I said. It'll work, it's just a matter of how fast it'll be with everything you put on it.

I'd highly recommend joining the serverbuilds dot net Discord. They also have a forum with pre-specced NAS build configurations, complete with pricing. People there are very helpful and will give realistic advice.

I think Jellyfin beats Plex on 4K transcoding (tonemapping?) with the iGPU, but FWIW I do not transcode 4K, and I add subtitles just fine. I use an Nvidia Shield, which direct-plays 4K content with the added subtitles. Hearing about transcoding 4K content just to add subtitles is news to me.


I think HTPC hardware transcoding is basically the sweet-spot use case for the N100. It's less good than the alternatives for pushing game emulation performance.


Reusing these boxes instead of having them thrown away and getting a new one built is better for the environment, though.


Does anyone have any useful rules of thumb or heuristics for balancing this trade-off of upfront cost vs. power cost? E.g., how much does an N100 cost to run for a year vs., say, an i5-2400S (the CPU in the first row on the linked site)?


I used to calculate the cost of lightbulbs like this: 1 watt running the whole year, at €0.28/kWh, costs 1 Euro per year. Until someone corrected me: 1 watt running 24/7 is about 8.76 kWh per year, so at €0.28/kWh it actually comes to about 2.5 Euro per year.

In the US electric power might be cheaper. And if it's running only part of the time, you should adjust the calculation.

My desktop/server runs 24/7, so I prefer having a CPU with a 65W TDP over one with a 125W TDP. That 60W difference could add up to nearly 150 Euro per year for me (if it were running at 100% CPU the whole time).
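
For concreteness, here's that arithmetic as a minimal Python sketch; the €0.28/kWh rate and the 65W/125W TDPs are just the example numbers from this comment, so substitute your own:

    # Cost of a continuous draw over a year. The rate and wattages are
    # the example figures from this comment, not universal numbers.
    HOURS_PER_YEAR = 24 * 365     # ~8,760
    RATE_EUR_PER_KWH = 0.28       # assumed electricity price

    def yearly_cost(watts):
        """Euros per year for drawing `watts` continuously."""
        return watts / 1000 * HOURS_PER_YEAR * RATE_EUR_PER_KWH

    print(round(yearly_cost(1), 2))                      # ~2.45 EUR per watt-year
    print(round(yearly_cost(125) - yearly_cost(65), 2))  # ~147 EUR/year difference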


Real-world energy use is nothing like what you see on spec sheets, and not just because manufacturers differ in how they compute TDP. TDP is also not a good indicator of energy use at (or near) idle. With underclocking/undervolting in the BIOS you can get a beefier CPU to outperform smaller CPUs per watt. Because CPUs get really inefficient as they draw more power, undervolted or power-capped high-TDP chips can be much more power efficient in the real world than their low-TDP counterparts.


My NUC13 with an i3 has a nominal 15W TDP, but while idling on a KDE desktop with a browser open to Reuters (1 tab) it hovers around 3-4W (5% CPU usage). If there's REALLY nothing going on (no desktop even) it's 1.0-1.3W (1% CPU usage).

Edit: I should note that there's no fan drawing power because I put it in an Akasa passively cooled case.


I tried to find this out myself. All I could find easily was the TDP of different processors. But I'm not sure if it's a good measure of how much power it will use.


I went down this rabbit hole earlier this year. The best I came up with was to calculate the cost of running at full TDP for the whole year. Full TDP is unrealistic, but it gives us a worst-case "max running cost". Energy for me is roughly $0.12/kWh, so the yearly max running cost for a 35W TDP is $36.79, for 65W it's $68.33, and for 95W it would be $99.86.

I ended up going with an HP EliteDesk 800 G5 Mini (i5-9500T, 35W) off eBay for $100, and it does the stuff I need it to do just fine. According to my current monthly power usage graph, it's averaged 7W, which accounts for $0.61 of this month's power bill.
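
As a sanity check of that $0.61/month figure, a tiny sketch (7W measured average, $0.12/kWh local rate):

    # 7 W average draw at $0.12/kWh over one month (~730 hours)
    watts, rate = 7, 0.12
    hours_per_month = 24 * 365 / 12
    kwh = watts / 1000 * hours_per_month
    print(round(kwh * rate, 2))  # ~$0.61/month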


No, sadly the TDP tells us very little about the idle power cost, which might be where you spend most of your time, depending on the workload.

Just from tweaking my laptop, I've noticed that when it is really idle (or I've intentionally put it in a low-frequency mode), the big power drains are the wireless interfaces (don't forget Bluetooth) and the screen (OLED helps as long as the screen is mostly black). Gotta tweak the whole thing.


The only real way of knowing is to measure it. If you already have a system in place an energy monitoring smart plug can help you calculate the current running costs and help estimate the savings of using a lower-power machine.

When I did this I was surprised by how much - or how little - it cost to run various devices. It's quite addictive.

It's not always accurate, because a lower-power machine doing the same task will often need to work at its full power more often, so the savings may be less. For example, a Raspberry Pi 5 may often be more power efficient than a Pi 4, despite drawing more power at full capacity on paper, because it spends less time at full capacity than the Pi 4 does.
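
As a toy illustration of that race-to-idle effect (the wattages and timings below are invented for the example, not measured Pi figures):

    # A faster box that finishes sooner and drops back to idle can use
    # less energy overall, despite the higher peak draw. Numbers invented.
    def energy_wh(busy_w, idle_w, busy_h, total_h=24):
        return busy_w * busy_h + idle_w * (total_h - busy_h)

    slow = energy_wh(busy_w=6, idle_w=2, busy_h=8)   # 80.0 Wh/day
    fast = energy_wh(busy_w=10, idle_w=2, busy_h=3)  # 72.0 Wh/day
    print(slow, fast)  # the "hungrier" machine wins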

On the other hand, when I upgraded my work PC I found it used less power but I also had to run my office heater more often in winter, as the new PC wasn't as efficient at heating the space.


Yeah, exactly! I suppose it's workload-dependent to a great extent.


If a kWh of power costs $0.30, then 1 watt = $2.63 a year (0.001 kW * 24 hours * 365 days * $0.30/kWh).

So it adds up quite quickly. Saving 20 watts saves you $52 a year.


Agreed, an N100 mini PC can be a great deal. They also tend to be smaller. I added a separate Intel filter that includes a lot of N100s. But it might be better to buy those new, not used.


Power consumption is definitely a big deal. I replaced an old PC that I'd been using as an always-on device with a tiny PC (i7-8700T) and it saved a ton of power. Given that power rates in New England are around $0.30/kWh, saving 50 watts means saving roughly $130/year. I went from around 60 watts to 10 watts at idle (and from 110 watts under load to 50 watts).

The new computer cost me $240 back in late 2022 (with 32GB of RAM and WiFi) so it'll basically pay for itself in electricity savings - and it's 3x faster than what it replaced.

ServeTheHome has some good reviews: https://www.servethehome.com/tag/tinyminimicro/. The tl;dr is that there are good options from Dell, HP, and Lenovo and the differences are fairly minor, but it's a good source if you care about specific information and teardowns.

It's a great little machine: it takes up almost no space, it's almost silent, and it was basically free with the power savings. In fact, once I pass the two-year mark, the new hardware will have cost less than keeping the old one running.

And you can put Proxmox on it as a hypervisor to run multiple OSes or containers.
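
Folding the upfront cost into that comparison, a rough break-even sketch (the $240 price, ~50W saved, and ~$0.30/kWh are the figures from above):

    # Years until continuous power savings repay the purchase price.
    def breakeven_years(upfront, watts_saved, rate_per_kwh):
        yearly_savings = watts_saved / 1000 * 24 * 365 * rate_per_kwh
        return upfront / yearly_savings

    print(round(breakeven_years(240, 50, 0.30), 1))  # ~1.8 years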


If the system draws 65 watts and you pay 12 cents/kWh, then it will cost you right around $68.38 a year to run it at full tilt.

Math:
1 kWh = 1,000 Wh; 1,000 Wh / 65 W = 15.385 hours per kWh
365.25 days/year * 24 hours/day = 8,766 hours/year (accounting for leap days)
8,766 hours/year / 15.385 hours/kWh = 569.8 kWh/year
569.8 kWh/year * $0.12/kWh = $68.38/year

For quick math where accuracy isn't very important: at $0.12/kWh it will cost you ~$1.05/year per watt (65W = $68.38/yr), so every watt you save is about a dollar a year in your pocket.

Of course, there are ways to reduce the energy usage of a system; a computer rarely has to run at 100% 24/7/365 unless it is very underspecced for your use case. Even things as simple as enabling C-states and not maxing out all of the PC's resources will save you many dollars a year.
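
If you want to verify that C-states are actually available and being used, here's a minimal sketch, assuming a Linux box exposing the kernel's standard cpuidle sysfs interface:

    # List CPU0's C-states and the time spent in each, via Linux's
    # cpuidle sysfs interface (standard kernel paths; adjust if needed).
    from pathlib import Path

    base = Path("/sys/devices/system/cpu/cpu0/cpuidle")
    for state in sorted(base.glob("state*")):
        name = (state / "name").read_text().strip()
        usec = int((state / "time").read_text())  # residency in microseconds
        print(f"{name}: {usec / 1e6:.1f} s")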


For those wondering (like me) the normal price for the CHUWI LarkBox X is about $190.


The N100s are everywhere, but I think the N305 with 8 E-cores is the bomb for a home server at slightly more power consumption.


Those are at a price point where other options, like enterprise mini PCs, make more sense.


Well, I guess if you make no effort to understand other people's use cases, you can make sweeping generalizations like that. If you need particular features, specific power consumption, or other factors, a 2-4 generation old used mini might not be the right answer.

Looking at the OP, there are a ton of i5-8500s in the same price range as a new N305 on Amazon (not bargain hunting here; I'm sure you can find either option much cheaper elsewhere). Compared to an N305, an i5-8500 has fewer cores/threads (6), at least 4x the TDP, and a significantly worse GPU. And many people want to buy new. But the used i5 is more expandable, and in particular affords more max memory.

There isn't one uncontested "best".



