Ollama running DeepSeek on Android

Credit: Robert Triggs / Android Authority

Cloud AI might be impressive, but I yearn for the added security that only offline, local processing provides, especially in light of DeepSeek reporting user data back to China. Replacing Copilot with DeepSeek on my laptop yesterday made me wonder if I could run large language models offline on my smartphone as well. After all, today’s flagship phones claim to be plenty powerful, pack plenty of RAM, and have dedicated AI accelerators that only the most modern PCs or expensive GPUs can best. Surely, it can be done.

Well, it turns out you can run a condensed version of DeepSeek (and many other large language models) locally on your phone without being tethered to an internet connection. While the responses are not as fast or accurate as those of the full-sized cloud model, the phones I tested can churn out answers at a brisk reading pace, making them very usable. Importantly, the smaller models are still good at assisting with problem-solving, explaining complex topics, and even producing working code, just like their bigger cloud-hosted siblings.
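To give a feel for what talking to one of these on-device models looks like, here is a minimal sketch using Ollama's official Python client. It assumes Ollama is already installed and serving locally on the phone (for example, inside Termux) and that a distilled model such as deepseek-r1:1.5b has been pulled beforehand with `ollama pull` — the exact model tag you use may differ.

```python
# Minimal sketch: chat with a local Ollama model from Python.
# Assumes the Ollama server is running on the device (e.g., in Termux)
# and a distilled DeepSeek model has been pulled, e.g.:
#   ollama pull deepseek-r1:1.5b
# Install the client with: pip install ollama
import ollama

# Stream the reply token by token, so text appears at reading pace
# instead of arriving all at once after generation finishes.
for chunk in ollama.chat(
    model="deepseek-r1:1.5b",  # assumed model tag; substitute your own
    messages=[{"role": "user", "content": "Explain big-O notation briefly."}],
    stream=True,
):
    print(chunk["message"]["content"], end="", flush=True)
print()
```

Streaming matters more on a phone than on a desktop: a small model generating at reading speed feels responsive when text trickles in, but sluggish if you wait for the whole answer.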