diff --git a/README-en.md b/README-en.md
index d02e6eb..23cd075 100644
--- a/README-en.md
+++ b/README-en.md
@@ -194,7 +194,11 @@ We have supported inference with [llama.cpp](https://github.com/ggerganov/llama.
 More parameters adjustment [see this](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
 
 **ollama**
-Solving [this issue](https://github.com/ollama/ollama/issues/2383)
+1. [install ollama](https://github.com/ollama/ollama)
+2. In command line:
+```
+ollama run modelbest/minicpm-2b-dpo
+```
 
 **fastllm**
 1. install [fastllm](https://github.com/ztxz16/fastllm)
diff --git a/README.md b/README.md
index 017f2e9..b3a368e 100644
--- a/README.md
+++ b/README.md
@@ -201,7 +201,11 @@ MiniCPM支持[llama.cpp](https://github.com/ggerganov/llama.cpp/) 、[ollama](ht
 更多参数调整[详见](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
 
 **ollama**
-正在解决[这个问题](https://github.com/ollama/ollama/issues/2383)
+1. [安装ollama](https://github.com/ollama/ollama)
+2. 在命令行运行:
+```
+ollama run modelbest/minicpm-2b-dpo
+```
 
 **fastllm**
 1. [编译安装fastllm](https://github.com/ztxz16/fastllm)
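For reference, once `ollama run modelbest/minicpm-2b-dpo` has pulled the model, it can also be queried non-interactively over ollama's HTTP API. A minimal sketch, assuming the default ollama server address `localhost:11434` and the model tag used in the patch above:

```sh
# Sketch only: assumes the default ollama server (localhost:11434) is running
# and the modelbest/minicpm-2b-dpo model from the patch has been pulled.
curl http://localhost:11434/api/generate -d '{
  "model": "modelbest/minicpm-2b-dpo",
  "prompt": "Hello! Please introduce yourself.",
  "stream": false
}'
```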