Local Llama 3.2 and many more
Chat with AI language models on your Mac, totally offline. Every conversation is completely private and saved locally. No subscriptions, no snooping, complete freedom.
· Customize persona and expertise by changing the system prompt
· Try any GGUF formatted model including CodeLlama, Mistral 7B, and Llama 3
· No internet connection required, everything runs on your device
· Open source and auditable
Requires 8GB+ RAM
This app hasn’t received enough ratings or reviews to display an overview.
Doesn't seem to work with DeepSeek
Aramis720
I installed the DeepSeek model and it just freezes. Help?
Offline
Whenutwe
"Offline resources offer valuable opportunities for exploration and learning. In situations where internet connectivity may be limited or unreliable, utilizing available materials provides an alternative means of accessing information."
Works beautifully on M1 MBP & M2 MBA
tinyapps
Have not had any trouble loading over half a dozen GGUFs downloaded from HF. Big, big thanks to the developer for crafting and sharing this great interface.
Excellent!
KevinEloiseBoge
It runs locally and supports Chinese. It's very nice, I can use it without Internet, and it's safe at the same time. Thanks to the developer.
This release bumps llama.cpp, picking up major speed upgrades from ggerganov and co. as well as compatibility with the latest models.
Version 1.0.11
The developer, Peter Sugihara, indicated that the app’s privacy practices may include handling of data as described below. For more information, see the developer’s privacy policy.
Data Not Collected
The developer does not collect any data from this app.
Privacy practices may vary, for example, based on the features you use or your age.
Accessibility
The developer has not yet indicated which accessibility features this app supports.