- Official website
https://ollama.com/
- Open-source repository
https://github.com/ollama/ollama
- Features
Get up and running with large language models. Run Llama 3, Phi 3, Mistral, Gemma, and other models. Customize and create your own.
Supports local knowledge bases.
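A minimal sketch of the "customize and create your own" feature, assuming Ollama is already installed; the file name Modelfile and the model name my-llama3 are placeholders:
# write a Modelfile that layers a system prompt and a sampling parameter on top of llama3
cat > Modelfile <<'EOF'
FROM llama3
PARAMETER temperature 0.7
SYSTEM "You are a concise assistant."
EOF
# build the customized model from the Modelfile, then run it
ollama create my-llama3 -f Modelfile
ollama run my-llama3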
- Usage
- Download and install Ollama
Download link:
https://ollama.com/download
- Pull a model with Ollama
For example:
ollama pull llama3
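Two related commands may also be useful; the exact tags available depend on the Ollama model library, so the llama3:8b tag below is an assumption:
ollama pull llama3:8b   # pull a specific tag instead of the default
ollama list             # list models already downloaded locally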
- Start using it
For example:
ollama run llama3
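ollama run opens an interactive chat session. Two sketch variations, assuming the llama3 model pulled in the previous step:
ollama run llama3 "Why is the sky blue?"   # one-shot prompt: prints the answer and exits
# inside the interactive session, type /bye to quit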
- Pairing with a front end
You should have at least 8 GB of RAM to run 7B models, 16 GB for 13B models, and 32 GB for 33B models.
This means an ordinary home PC can run a local large model.
Paired with one of the web or desktop clients below, you can chat with the model on your own computer much like using "Doubao" or "ChatGPT" (a curl sketch of the local API these clients talk to follows the list).
- Open WebUI
- Enchanted (macOS native)
- Hollama
- Lollms-Webui
- LibreChat
- Bionic GPT
- HTML UI
- Saddle
- Chatbot UI
- Chatbot UI v2
- Typescript UI
- Minimalistic React UI for Ollama Models
- Ollamac
- big-AGI
- Cheshire Cat assistant framework
- Amica
- chatd
- Ollama-SwiftUI
- Dify.AI
- MindMac
- NextJS Web Interface for Ollama
- Msty
- Chatbox
- WinForm Ollama Copilot
- NextChat with Get Started Doc
- Alpaca WebUI
- OllamaGUI
- OpenAOE
- Odin Runes
- LLM-X (Progressive Web App)
- AnythingLLM (Docker + macOS/Windows/Linux native app)
- Ollama Basic Chat: Uses HyperDiv Reactive UI
- Ollama-chats RPG
- QA-Pilot (Chat with Code Repository)
- ChatOllama (Open Source Chatbot based on Ollama with Knowledge Bases)
- CRAG Ollama Chat (Simple Web Search with Corrective RAG)
- RAGFlow (Open-source Retrieval-Augmented Generation engine based on deep document understanding)
- StreamDeploy (LLM Application Scaffold)
- chat (chat web app for teams)
- Lobe Chat with Integrating Doc
- Ollama RAG Chatbot (Local Chat with multiple PDFs using Ollama and RAG)
- BrainSoup (Flexible native client with RAG & multi-agent automation)
- macai (macOS client for Ollama, ChatGPT, and other compatible API back-ends)
- The Open WebUI interface
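The web and desktop clients listed above talk to Ollama's local HTTP API; a minimal curl sketch, assuming the default port 11434 and the llama3 model pulled earlier:
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'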