Super HN

New Show
   How to run LLMs locally on mobile devices (with Gemma and On-Device AI tools) (annjose.com)
Any model that can run on a mobile device will likely be 8B parameters or smaller and will have very noticeable hallucination problems.