Mirror of https://github.com/RYDE-WORK/MiniCPM.git, synced 2026-01-19 21:03:39 +08:00
Merge branch 'OpenBMB:main' into main
Commit: 3691af0fd2
@@ -194,7 +194,11 @@ We have supported inference with [llama.cpp](https://github.com/ggerganov/llama.
For more parameter adjustments, [see this](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md).

**ollama**

Resolving [this issue](https://github.com/ollama/ollama/issues/2383)

1. [Install ollama](https://github.com/ollama/ollama)
2. In the command line, run:
```
ollama run modelbest/minicpm-2b-dpo
```
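Besides the interactive CLI above, ollama also serves a local REST API (by default at http://localhost:11434). As a sketch, the JSON body for its documented `/api/generate` endpoint can be built like this; the helper function is ours for illustration, not part of ollama, and a running ollama server is assumed for the commented-out request:

```python
import json
from urllib import request

def build_generate_request(model: str, prompt: str, stream: bool = False) -> str:
    """Build the JSON body for ollama's /api/generate endpoint."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

body = build_generate_request("modelbest/minicpm-2b-dpo", "Hello!")

# To actually send it (requires a running ollama server with the model pulled):
# req = request.Request("http://localhost:11434/api/generate",
#                       data=body.encode(), method="POST")
# print(json.loads(request.urlopen(req).read())["response"])
```

With `stream` set to false, ollama returns the full completion in a single JSON object instead of a stream of chunks.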
**fastllm**

1. Install [fastllm](https://github.com/ztxz16/fastllm)
@@ -201,7 +201,11 @@ MiniCPM supports [llama.cpp](https://github.com/ggerganov/llama.cpp/), [ollama](ht
For more parameter adjustments, [see this](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md).

**ollama**

Resolving [this issue](https://github.com/ollama/ollama/issues/2383)

1. [Install ollama](https://github.com/ollama/ollama)
2. In the command line, run:
```
ollama run modelbest/minicpm-2b-dpo
```

**fastllm**

1. [Compile and install fastllm](https://github.com/ztxz16/fastllm)
@@ -171,6 +171,8 @@ if __name__ == "__main__":
        model_path=model_args.model_name_or_path,
        max_length=training_args.model_max_length,
        use_lora=training_args.use_lora,
        bf16=training_args.bf16,
        fp16=training_args.fp16
    )

    train_dataset = SupervisedDataset(
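The diff adds an `fp16` flag next to the existing `bf16` flag in the model-loading call. As an illustration only (a hypothetical helper, not code from this repo), such paired precision flags typically resolve to a single load dtype, with the two half-precision modes treated as mutually exclusive:

```python
def resolve_dtype(bf16: bool, fp16: bool) -> str:
    """Map mutually exclusive bf16/fp16 flags to a dtype name.

    Hypothetical helper for illustration; the repo's actual
    model-loading code may handle these flags differently.
    """
    if bf16 and fp16:
        raise ValueError("bf16 and fp16 cannot both be enabled")
    if bf16:
        return "bfloat16"
    if fp16:
        return "float16"
    return "float32"  # default: full precision
```

bfloat16 keeps float32's exponent range at reduced mantissa precision, while float16 trades range for mantissa bits, which is why frameworks generally refuse to enable both at once.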