diff --git a/README-en.md b/README-en.md
index 19c9ba8..c8beca8 100644
--- a/README-en.md
+++ b/README-en.md
@@ -407,7 +407,7 @@ python bnb_quantize.py
 model_path="" bnb_path=""
 ```
 
-3. 3. In the MiniCPM/quantize directory, enter the following command in the command line:
+3. In the MiniCPM/quantize directory, enter the following command in the command line:
 ```
 bash quantize_eval.sh
 ```
@@ -416,7 +416,8 @@ python bnb_quantize.py
 
 ## Community
 
-- [LLaMA-Factory](https://github.com/hiyouga/LLaMA-Factory.git): [MiniCPM fine-tuning one-click solution](https://modelbest.feishu.cn/wiki/AIU3wbREcirOm9kkvd7cxujFnMb?from=from_copylink)
+- [xtuner](https://github.com/InternLM/xtuner): [More efficient fine-tuning options of MiniCPM](https://modelbest.feishu.cn/wiki/AIU3wbREcirOm9kkvd7cxujFnMb#AMdXdzz8qoadZhxU4EucELWznzd)
+- [LLaMA-Factory](https://github.com/hiyouga/LLaMA-Factory.git): [MiniCPM fine-tuning one-click solution](https://modelbest.feishu.cn/wiki/AIU3wbREcirOm9kkvd7cxujFnMb#BAWrdSjXuoFvX4xuIuzc8Amln5E)
 - [ChatLLM](https://github.com/foldl/chatllm.cpp): [Run MiniCPM on CPU](https://huggingface.co/openbmb/MiniCPM-2B-dpo-bf16/discussions/2#65c59c4f27b8c11e43fc8796)
 
 
diff --git a/README.md b/README.md
index b9447b4..51eb222 100644
--- a/README.md
+++ b/README.md
@@ -424,7 +424,8 @@ python bnb_quantize.py
 
 ## 开源社区
 
-- [LLaMA-Factory](https://github.com/hiyouga/LLaMA-Factory.git)：[MiniCPM微调一键式解决方案](https://modelbest.feishu.cn/wiki/AIU3wbREcirOm9kkvd7cxujFnMb?from=from_copylink)
+- [xtuner](https://github.com/InternLM/xtuner): [MiniCPM高效率微调的不二选择](https://modelbest.feishu.cn/wiki/AIU3wbREcirOm9kkvd7cxujFnMb#AMdXdzz8qoadZhxU4EucELWznzd)
+- [LLaMA-Factory](https://github.com/hiyouga/LLaMA-Factory.git)：[MiniCPM微调一键式解决方案](https://modelbest.feishu.cn/wiki/AIU3wbREcirOm9kkvd7cxujFnMb#BAWrdSjXuoFvX4xuIuzc8Amln5E)
 - [ChatLLM框架](https://github.com/foldl/chatllm.cpp)：[在CPU上跑MiniCPM](https://huggingface.co/openbmb/MiniCPM-2B-dpo-bf16/discussions/2#65c59c4f27b8c11e43fc8796)
 
 
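The hunks above touch the README's bnb (bitsandbytes) quantization step (`python bnb_quantize.py` with `model_path`/`bnb_path` left for the user to fill in). As a rough illustration only, the sketch below shows how 4-bit bitsandbytes quantization is typically done through the Hugging Face `transformers` integration; the paths are placeholders and this is not the repository's actual `bnb_quantize.py`.

```python
# Minimal sketch of 4-bit bitsandbytes quantization via transformers.
# Assumptions: the checkpoint name and output directory are placeholders,
# and this does not reproduce MiniCPM's bnb_quantize.py exactly.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_path = "openbmb/MiniCPM-2B-dpo-bf16"  # placeholder: full-precision checkpoint
bnb_path = "./minicpm-2b-bnb-4bit"          # placeholder: output directory

# NF4 4-bit quantization with double quantization, computing in bfloat16.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    quantization_config=bnb_config,
    trust_remote_code=True,
    device_map="auto",
)

# Persist the quantized weights and tokenizer to the target directory.
model.save_pretrained(bnb_path)
tokenizer.save_pretrained(bnb_path)
```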