diff --git a/README-en.md b/README-en.md
index 5dc8c92..fc447c4 100644
--- a/README-en.md
+++ b/README-en.md
@@ -44,6 +44,7 @@ We release all model parameters for research and limited commercial use. In futu
 - [Updates](#0)
 - [Downloading](#1)
 - [Quick Start](#2)
+- [Community](#community)
 - [Benchmark](#3)
 - [Deployment on Mobile Phones](#4)
 - [Demo & API](#5)
@@ -55,6 +56,7 @@ We release all model parameters for research and limited commercial use. In futu
 
 ## Update Log
 
+- 2024/02/09 We have included a [Community](#community) section in the README to encourage support for MiniCPM from the open-source community.
 - 2024/02/08 We updated the [llama-format model weights](#llamaformat), which can be loaded into LlamaModel directly. We also supporting llama.cpp and ollama, making it more convenient for everyone to use our model quickly.
 - 2024/02/01 Initial release.
 
@@ -202,6 +204,14 @@ ollama run minicpm
 ```
 (Note: We have noticed that this quantized model has noticable performance decrease and are trying to fix it)
 
+
+
+## Community
+
+- [ChatLLM](https://github.com/foldl/chatllm.cpp): [Run MiniCPM on CPU](https://huggingface.co/openbmb/MiniCPM-2B-dpo-bf16/discussions/2#65c59c4f27b8c11e43fc8796)
+
+
+
 ## Evaluation results
 
diff --git a/README.md b/README.md
index 71b4531..75cbfba 100644
--- a/README.md
+++ b/README.md
@@ -45,6 +45,7 @@ MiniCPM 是面壁智能与清华大学自然语言处理实验室共同开源的
 - [更新日志](#0)
 - [模型下载](#1)
 - [快速上手](#2)
+- [开源社区](#community)
 - [评测结果](#3)
 - [手机部署](#4)
 - [Demo & API 部署](#5)
@@ -56,6 +57,7 @@ MiniCPM 是面壁智能与清华大学自然语言处理实验室共同开源的
 
 ## 更新日志
 
+- 2024/02/09 我们在readme里加入了一个[开源社区](#community)章节,用来收集开源社区对MiniCPM的支持案例。
 - 2024/02/08 我们更新了[llama-format的模型权重](#llamaformat),支持了llama.cpp调用和ollama调用,方便大家更加快捷地使用我们的模型。
 - 2024/02/01 初始发布。
 
@@ -212,6 +214,15 @@ ollama run minicpm
 ```
 (注:我们注意到这个量化后的模型性能有较大损失,正在尝试解决)
 
+
+
+
+## 开源社区
+
+- [ChatLLM框架](https://github.com/foldl/chatllm.cpp):[在CPU上跑MiniCPM](https://huggingface.co/openbmb/MiniCPM-2B-dpo-bf16/discussions/2#65c59c4f27b8c11e43fc8796)
+
+
+
 ## 评测结果
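The 2024/02/08 entry above states that the llama-format weights can be loaded into LlamaModel directly. A minimal sketch of what that could look like with Hugging Face `transformers` is shown below; the repo id `openbmb/MiniCPM-2B-dpo-bf16-llama-format` and the `<用户>`/`<AI>` prompt markers are assumptions for illustration, not taken from this diff.

```python
# Minimal sketch: loading llama-format MiniCPM weights with Hugging Face transformers.
# The repo id below is an assumption for illustration; substitute the llama-format
# checkpoint you actually use.
import torch
from transformers import AutoTokenizer, LlamaForCausalLM

model_path = "openbmb/MiniCPM-2B-dpo-bf16-llama-format"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = LlamaForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # requires `accelerate`; drop it to load on CPU explicitly
)

# Assumed MiniCPM-style chat markers; adjust to the prompt format in the Quick Start.
prompt = "<用户>Tell me about the MiniCPM model.<AI>"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs, max_new_tokens=128, do_sample=True, temperature=0.8, top_p=0.8
)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```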