Who would buy a Raspberry Pi for $120?

That is indeed a puzzling question, brought about by Raspberry Pi’s introduction of the newest Raspberry Pi 5 model, with 16 GB of RAM.
I spent a couple of weeks testing the new Pi model, and found it does have its virtues. But at only $20 less than a complete GMKTec N100 mini PC with 8 GB of RAM, it’s probably a step too far for most people.
For most, the 2 GB ($50) or 4 GB ($60) Pi 5 is a much better option. Or, if you’re truly budget-conscious and want a well-supported SBC, the Pi 4 still exists and starts at $35. Or a Pi Zero 2 W for $15.
And for stats nerds, the pricing model for Pi 5 follows this polynomial curve almost perfectly:
…very much unlike Apple’s memory and storage pricing for the M4 Mac mini, which follows an equation that ranges from “excellent deal” to “exorbitant overcharge”.
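For what it’s worth, the four list prices already sit on a straight line, which is the simplest polynomial of all. A minimal sketch, assuming the 8 GB model’s $80 list price (the other three prices are quoted above):

```python
# Raspberry Pi 5 list prices (USD) keyed by RAM size in GB.
# The $80 figure for the 8 GB model is the list price at the time of writing.
prices = {2: 50, 4: 60, 8: 80, 16: 120}

def predicted_price(gb):
    """A first-degree polynomial: $40 base price plus $5 per GB of RAM."""
    return 5 * gb + 40

# Every model lands exactly on the line.
for gb, price in prices.items():
    assert predicted_price(gb) == price
print("price(GB) = 5*GB + 40 fits all four models exactly")
```

In other words, RAM on the Pi 5 is priced at a flat $5/GB over a $40 base, with no markup for the larger capacities.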
Performance
Before I get to the reasons why some people might consider spending $120 on a Pi 5, I ran a bunch of benchmarks, and one of the more pertinent results is HPL:
This compares the performance of the 8 GB Pi 5 against the new 16 GB model. For many benchmarks, the biggest difference may be caused by the 16 GB model having the newer, trimmer D0 stepping of the BCM2712. But for some, having more RAM helps, too.
Applications like ZFS can cache more files with more RAM, leading to lower latency and higher bandwidth file copies—in certain conditions.
For all my 16 GB Pi 5 benchmarks, see this follow-up comment on my Pi 5 sbc-reviews thread.
5 Reasons a 16 GB Pi 5 should exist
I distilled my thoughts into a list of five reasons the 16 GB Pi 5 ought to exist:
- Keeping up with the Joneses: Everyone seems to be settling on 16 GB of RAM as the new laptop/desktop baseline—even Apple, a company notoriously stingy on RAM in its products! So having a high-end SBC with the same amount of RAM as a low-end desktop makes sense, if for no other reason than just to have it available.
- LLMs and ‘AI’: Love it or hate it, Large Language Models love RAM. The more, the merrier. The 8 GB Pi 5 can only handle up to an 8 billion parameter model, like `llama3.1:8b`. The 16 GB model can run much larger models, like `llama2:13b`. Whether getting 1-2 tokens/s on such a large model on a Pi 5 is useful is up to you to decide. I posted my Ollama benchmark results in this issue.
- Performance: I already discussed this above, but along with the latest SDRAM tuning the Pi engineers worked on, this Pi is now the fastest and most efficient model yet, especially owing to the newer D0 chip revision.
- Capacity and Consolidation: With more RAM, you can run more apps, or more threads. For example, a Pi 5 with 4 GB of RAM could run one Drupal or WordPress website comfortably. With 16 GB, you could conceivably run three or four websites with decent traffic, assuming you’re not CPU-bound. You could also run more Docker containers or Pimox VMs on the same Pi.
- AAA Gaming: This is, of course, a stretch… but there are some modern AAA games I had trouble with on my eGPU Pi 5 setup; they ran out of system memory on the 8 GB Pi 5, causing thrashing and lockups. For example, Forza Horizon 4 seemed to enjoy using about 8 GB of system RAM in total (alongside the 1 GB or so required by the OS and Steam).
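To put a rough number on the LLM point above: a quantized model’s weights need approximately (parameters × bytes per weight), plus some overhead for the KV cache and runtime buffers. A back-of-the-envelope sketch, where the ~0.5 bytes/weight (4-bit quantization) and 1.2× overhead figures are rough assumptions on my part, not measurements:

```python
def approx_model_ram_gb(params_billion, bytes_per_weight=0.5, overhead=1.2):
    """Rough RAM estimate for a quantized LLM: weight bytes plus ~20%
    overhead for KV cache and buffers (both figures are assumptions)."""
    return params_billion * 1e9 * bytes_per_weight * overhead / 1e9

for name, size_b in [("llama3.1:8b", 8), ("llama2:13b", 13)]:
    print(f"{name}: ~{approx_model_ram_gb(size_b):.1f} GB")
```

By this estimate an 8B model wants around 5 GB and a 13B model around 8 GB, which lines up with why a 13B model plus the OS is a squeeze on the 8 GB Pi but comfortable on the 16 GB model.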
I have a full video covering the 16 GB Pi 5, along with illustrations of some of the above points. You can watch it below: