GitHub: https://github.com/ollama/ollama
Model page: https://ollama.com/library/llama3.1
Linux installation
1. Download and run the install script
curl -fsSL https://ollama.com/install.sh | sh
2. Edit the service environment variables
If Ollama should run as root, remember to set User and Group to root in the unit file.
vim /etc/systemd/system/ollama.service

[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=root
Group=root
Restart=always
RestartSec=3
Environment="PATH=/root/.nvm/versions/node/v18.20.4/bin:/home/miniconda3/bin:/home/miniconda3/condabin:/usr/lib64/qt-3.3/bin:/root/perl5/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin:/usr/local/mysql/bin"
Environment="OLLAMA_DEBUG=1"
Environment="OLLAMA_HOST=0.0.0.0:11434"
Environment="OLLAMA_KEEP_ALIVE=5h"
Environment="OLLAMA_MAX_LOADED_MODELS=10"
# Environment="OLLAMA_MAX_QUEUE=100"
Environment="OLLAMA_MODELS=/home/data/llm/ollama/models/"

[Install]
WantedBy=default.target

After editing, reload systemd and restart the service so the new environment takes effect:
systemctl daemon-reload
systemctl restart ollama
3. Useful commands
(base) [root@ceph1 ~]# ollama
Usage:
  ollama [flags]
  ollama [command]

Available Commands:
  serve       Start ollama
  create      Create a model from a Modelfile
  show        Show information for a model
  run         Run a model
  pull        Pull a model from a registry
  push        Push a model to a registry
  list        List models
  ps          List running models
  cp          Copy a model
  rm          Remove a model
  help        Help about any command

Flags:
  -h, --help      help for ollama
  -v, --version   Show version information

Use "ollama [command] --help" for more information about a command.
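Because OLLAMA_HOST above exposes the server on port 11434, the same information the CLI gives (e.g. `ollama list`) is also available over the HTTP API. A minimal Python sketch, assuming the service from step 2 is up and reachable at localhost:11434 (the helper names here are our own, not part of Ollama):

```python
import json
import urllib.request

# Assumed host; matches OLLAMA_HOST=0.0.0.0:11434 in ollama.service
OLLAMA_HOST = "localhost:11434"

def tags_url(host: str) -> str:
    """URL of the /api/tags endpoint, which lists locally available models."""
    return f"http://{host}/api/tags"

def list_models(host: str = OLLAMA_HOST) -> list:
    """Fetch local model names from a running Ollama server (server must be up)."""
    with urllib.request.urlopen(tags_url(host)) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

# Example (requires a running server):
#   print(list_models())
```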
4. Run the model
ollama run llama3.1:70b
On the first run, the model is downloaded to the directory set by Environment="OLLAMA_MODELS=/home/data/llm/ollama/models/".
Subsequent runs skip the download and start the model directly.
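Besides the interactive CLI, a pulled model can be queried programmatically through the server's /api/generate endpoint. A minimal sketch, assuming a server at localhost:11434 with llama3.1:70b already pulled (the function names are illustrative, not part of Ollama):

```python
import json
import urllib.request

def generate_payload(model: str, prompt: str) -> dict:
    # stream=False asks the server for a single JSON response
    # instead of a stream of partial chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(host: str, model: str, prompt: str) -> str:
    """Send a prompt to a running Ollama server and return the completion text."""
    req = urllib.request.Request(
        f"http://{host}/api/generate",
        data=json.dumps(generate_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

# Example (requires a running server with the model pulled):
#   print(generate("localhost:11434", "llama3.1:70b", "Hello"))
```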