From 7b6062eb4d8400b4e1374b4bb9bfc9f3b85ee7bc Mon Sep 17 00:00:00 2001
From: shengdinghu
Date: Tue, 13 Feb 2024 00:02:10 +0800
Subject: [PATCH] change en example

---
 README-en.md | 8 ++++----
 README.md    | 4 ++--
 2 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/README-en.md b/README-en.md
index efddbde..21bb654 100644
--- a/README-en.md
+++ b/README-en.md
@@ -198,16 +198,16 @@ python inference.py --model_path --prompt_path prompts/promp
 
 We have supported inference with [llama.cpp](https://github.com/ggerganov/llama.cpp/) and [ollama](https://github.com/ollama/ollama).
 
-##### llama.cpp
+**llama.cpp**
 
 1. [install llama.cpp](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#build)
-2. download model in gguf format。[link-fp16](https://huggingface.co/runfuture/MiniCPM-2B-dpo-fp16-gguf) [link-q4km](https://huggingface.co/runfuture/MiniCPM-2B-dpo-q4km-gguf)
+2. download model in gguf format. [link-fp16](https://huggingface.co/runfuture/MiniCPM-2B-dpo-fp16-gguf) [link-q4km](https://huggingface.co/runfuture/MiniCPM-2B-dpo-q4km-gguf)
 3. In command line:
 ```
-./main -m ../../model_ckpts/download_from_hf/MiniCPM-2B-dpo-fp16-gguf.gguf --prompt "<用户>写藏头诗,藏头是龙年大吉" --temp 0.3 --top-p 0.8 --repeat-penalty 1.05
+./main -m ../../model_ckpts/download_from_hf/MiniCPM-2B-dpo-fp16-gguf.gguf --prompt "<用户>Write an acrostic poem with the word MINICPM (One line per letter)" --temp 0.3 --top-p 0.8 --repeat-penalty 1.05
 ```
 More parameters adjustment [see this](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
 
-##### ollama
+**ollama**
 
 Solving [this issue](https://github.com/ollama/ollama/issues/2383)

diff --git a/README.md b/README.md
index 93569c7..bc22ab5 100644
--- a/README.md
+++ b/README.md
@@ -204,7 +204,7 @@ python inference.py --model_path --prompt_path prompts/promp
 #### llama.cpp与Ollama推理
 我们支持了[llama.cpp](https://github.com/ggerganov/llama.cpp/) 推理与[ollama](https://github.com/ollama/ollama)推理.
 
-##### llama.cpp
+**llama.cpp**
 1. [安装llama.cpp](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#build)
 2. 下载gguf形式的模型。[下载链接-fp16格式](https://huggingface.co/runfuture/MiniCPM-2B-dpo-fp16-gguf) [下载链接-q4km格式](https://huggingface.co/runfuture/MiniCPM-2B-dpo-q4km-gguf)
 3. 在命令行运行示例代码:
@@ -213,7 +213,7 @@ python inference.py --model_path --prompt_path prompts/promp
 ```
 更多参数调整[详见](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
 
-##### Ollama
+**ollama**
 正在解决[这个问题](https://github.com/ollama/ollama/issues/2383)
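The `--prompt` strings changed by this patch both wrap the user's request in MiniCPM's `<用户>` chat tag before handing it to `./main`. A minimal sketch of building such a prompt string programmatically — the trailing `<AI>` generation tag and the helper name are assumptions here, not something this patch specifies:

```python
def build_prompt(user_query: str) -> str:
    """Build a single-turn MiniCPM-style prompt string.

    The "<用户>" user tag appears in the patched commands above; the
    trailing "<AI>" tag (marking where the model should generate) is
    an assumption about the chat template, not part of this patch.
    """
    return f"<用户>{user_query}<AI>"


if __name__ == "__main__":
    # The English example prompt introduced by this patch:
    prompt = build_prompt(
        "Write an acrostic poem with the word MINICPM (One line per letter)"
    )
    print(prompt)
```

The resulting string would then be passed verbatim as the `--prompt` argument to llama.cpp's `./main`, with the sampling flags shown in the diff (`--temp 0.3 --top-p 0.8 --repeat-penalty 1.05`).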