Ollama
Install Ollama and run DeepSeek R1
Installation
macOS
brew install ollama
Ubuntu
curl -fsSL https://ollama.com/install.sh | sh
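To confirm the installation, you can query the local server; a minimal Python sketch, assuming the server is already running on the default address http://localhost:11434 and that the requests package is available:
import requests

# The root endpoint returns a short status string when the server is up.
resp = requests.get("http://localhost:11434/")
print(resp.status_code, resp.text)  # expect: 200 Ollama is running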
Usage
ollama run deepseek-r1:14b
API
curl http://localhost:11434/api/generate -X POST -H "Content-Type: application/json" -d '{"model":"deepseek-r1:14b","prompt":"Hello, world!"}'
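The same endpoint can be called from Python; a minimal sketch with the requests package, assuming the default address. /api/generate streams newline-delimited JSON chunks by default, so the response is read line by line:
import json
import requests

payload = {"model": "deepseek-r1:14b", "prompt": "Hello, world!"}

# Each streamed line is one JSON object; the final one has "done": true.
with requests.post("http://localhost:11434/api/generate", json=payload, stream=True) as resp:
    for line in resp.iter_lines():
        if not line:
            continue
        chunk = json.loads(line)
        print(chunk.get("response", ""), end="", flush=True)
        if chunk.get("done"):
            break
print()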
Temporarily allow external access
sudo systemctl stop ollama.service
OLLAMA_HOST=0.0.0.0:11434 ollama serve
Allow external access automatically on startup
sudo systemctl stop ollama.service
sudo systemctl edit ollama.service
### Editing /etc/systemd/system/ollama.service.d/override.conf
### Anything between here and the comment below will become the new contents of the file
# Add the two lines below
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
# Do not touch anything below this line
### Lines below this comment will be discarded
### /etc/systemd/system/ollama.service
# [Unit]
# Description=Ollama Service
# After=network-online.target
#
# [Service]
# ExecStart=/usr/local/bin/ollama serve
# User=ollama
# Group=ollama
# Restart=always
# RestartSec=3
# Environment="PATH=/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games"
#
# [Install]
# WantedBy=default.target
sudo systemctl start ollama.service
From another machine, test whether the API can be reached
curl http://{your_ip_address}:11434/api/generate -X POST -H "Content-Type: application/json" -d '{"model":"deepseek-r1:14b","prompt":"Hello, world!"}'
Python
# pip install ollama
from ollama import Client

# Point the client at the Ollama server; swap localhost for the server's IP
# when calling a remote machine. The custom header is optional.
client = Client(
    host='http://localhost:11434',
    headers={'x-some-header': 'some-value'}
)
response = client.chat(model='deepseek-r1:14b', messages=[
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
])
print(response)
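The client can also stream the reply as it is generated; a short sketch assuming the same model, using the stream=True option of chat() (swap localhost for the server's IP when calling a remote machine):
from ollama import Client

client = Client(host='http://localhost:11434')

# stream=True returns an iterator of partial messages instead of a single response.
stream = client.chat(
    model='deepseek-r1:14b',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
    stream=True,
)
for chunk in stream:
    print(chunk['message']['content'], end='', flush=True)
print()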