How to run LLMs locally on mobile devices (with Gemma and On-Device AI tools)
(annjose.com)
1 point by annjose 29 minutes ago
Any model that can run on a mobile device will likely be 8B parameters or smaller and will have very noticeable hallucination problems.
by incomingpain 13 minutes ago