c0der

Workstation Hardware Requirements in the age of LLMs

Published:

Discover the real memory requirements for running local LLMs with development tools. Analysis of RAM needs for Ollama, Docker, and why 32GB is barely enough.

As Ollama becomes the Docker of the LLM space, most developers' machines simply don't have enough RAM to run both. The idea of local LLMs is appealing because it keeps company knowledge from leaking into external AI services. But while getting started with local LLMs is easy, they are still far from practical.

First, local LLMs require a lot of memory. To run LLMs on localhost, organizations need to set 32 GB of RAM as the minimum requirement, not only for developers but for any knowledge worker. That is barely enough to run the OS, Chrome, Slack, Docker, and a small LLM. Anything less, or any larger model, results in heavy memory swapping that wears out the SSD in the process.
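To see why 32 GB is barely enough, here is a rough RAM budget sketch. The numbers are illustrative assumptions about typical working-set sizes, not measurements:

```python
# Rough RAM budget for a 32 GB developer machine running a local LLM.
# Assumption: every figure below is a typical working-set estimate,
# not a measurement from any specific setup.
budget_gb = {
    "OS + desktop": 4,
    "Chrome": 6,
    "Slack": 1,
    "Docker + containers": 8,
    "IDE": 3,
    "7B model, 8-bit quantized": 8,
}

used = sum(budget_gb.values())
headroom = 32 - used
print(f"used: {used} GB, headroom on a 32 GB machine: {headroom} GB")
```

With these assumptions the machine is already at 30 GB; one large container build or a slightly bigger model pushes it into swap.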

Setting a minimum of 32 GB of RAM is still optimistic; it bets heavily on technological advancement to make local LLMs work. First, LLMs could become more memory efficient. There are many small models, but their performance is poor compared to the commercial models we are used to, such as ChatGPT and Claude. LLMs need to become memory efficient without sacrificing much accuracy.
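To put "memory efficient" in numbers, here is a minimal sketch of how quantization shrinks a model's weight footprint. It counts weights only, ignoring the small per-block overhead that real formats such as GGUF add:

```python
# Weight memory for a 7B-parameter model at common quantization levels.
# Assumption: weights only; KV cache and runtime overhead are excluded.
bits_per_param = {"fp16": 16, "q8": 8, "q4": 4}
params = 7e9  # 7 billion parameters

footprint_gb = {
    name: params * bits / 8 / 1e9  # bits -> bytes -> GB
    for name, bits in bits_per_param.items()
}

for name, gb in footprint_gb.items():
    print(f"{name}: {gb:.1f} GB")
```

Going from fp16 to 4-bit cuts the weights from 14 GB to 3.5 GB, which is exactly why quantized models are the only ones that fit next to a full development stack, and also why their accuracy loss matters so much.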

Advancement in generic models may be hard, so we may bet instead on many small specialized models. Instead of a jack-of-all-trades coding model, models could specialize in one language or one technology. I believe this is the direction that will eventually work: there could be many specialized LLMs, much like Docker images. The industry could also move forward through advances in hardware; as more specialized chips emerge, we can expect progress along that axis.

In conclusion, until breakthroughs are made in LLMs and/or hardware, local LLMs will remain limited. For now, running LLMs in the cloud is the only practical bet.

Tags: local-llm-deployment, hardware-requirements, ai-infrastructure