fix ollama

xuhaifeng 2024-04-17 18:23:16 +08:00
parent 0b24fbde4b
commit fa7287db45
2 changed files with 10 additions and 2 deletions


@@ -194,7 +194,11 @@ We have supported inference with [llama.cpp](https://github.com/ggerganov/llama.
For more parameter adjustments, [see this guide](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md).
**ollama**
We are resolving [this issue](https://github.com/ollama/ollama/issues/2383):
1. [Install ollama](https://github.com/ollama/ollama)
2. Run in the command line:
```
ollama run modelbest/minicpm-2b-dpo
```
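Once the model is running, it can also be queried through ollama's local REST API (served at http://localhost:11434 by default). A minimal sketch from a second shell, assuming the default host and the model tag above:
```
# Send a single non-streaming generation request to the local ollama server
curl http://localhost:11434/api/generate -d '{
  "model": "modelbest/minicpm-2b-dpo",
  "prompt": "Hello, who are you?",
  "stream": false
}'
```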
**fastllm**
1. Install [fastllm](https://github.com/ztxz16/fastllm) (see the build sketch below)
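For reference, a minimal build sketch for fastllm, assuming a Linux toolchain with CMake; the `USE_CUDA` flag follows the fastllm README (use `-DUSE_CUDA=OFF` for a CPU-only build):
```
# Clone and compile fastllm from source
git clone https://github.com/ztxz16/fastllm
cd fastllm
mkdir build && cd build
cmake .. -DUSE_CUDA=ON   # assumes a CUDA toolchain; set OFF for CPU-only
make -j
```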


@@ -201,7 +201,11 @@ MiniCPM supports [llama.cpp](https://github.com/ggerganov/llama.cpp/), [ollama](ht
For more parameter adjustments, [see this guide](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md).
**ollama**
We are resolving [this issue](https://github.com/ollama/ollama/issues/2383):
1. [Install ollama](https://github.com/ollama/ollama)
2. Run in the command line:
```
ollama run modelbest/minicpm-2b-dpo
```
**fastllm**
1. [Compile and install fastllm](https://github.com/ztxz16/fastllm)