Mirror of https://github.com/RYDE-WORK/MiniCPM.git (synced 2026-01-19 21:03:39 +08:00)
update readme

Commit ed46d4671c, parent 34ac3a2237

README-en.md
@@ -60,7 +60,7 @@ We release all model parameters for research and limited commercial use.

 <p id="0"></p>

 ## Update Log

-- 2024/04/11 We release [MiniCPM-V 2.0](https://huggingface.co/openbmb/MiniCPM-V-2.0), [MiniCPM-2B-128k](https://huggingface.co/openbmb/MiniCPM-2B-128k), [MiniCPM-MoE-8x2B](https://huggingface.co/openbmb/MiniCPM-MoE-8x2B) and [MiniCPM-1B](https://huggingface.co/openbmb/MiniCPM-1B-sft-bf16)!
+- **2024/04/11 We release [MiniCPM-V 2.0](https://huggingface.co/openbmb/MiniCPM-V-2.0), [MiniCPM-2B-128k](https://huggingface.co/openbmb/MiniCPM-2B-128k), [MiniCPM-MoE-8x2B](https://huggingface.co/openbmb/MiniCPM-MoE-8x2B) and [MiniCPM-1B](https://huggingface.co/openbmb/MiniCPM-1B-sft-bf16)!**
 - 2024/03/16 Intermediate checkpoints were released [here](https://huggingface.co/openbmb/MiniCPM-2B-history)!
 - 2024/02/13 We support llama.cpp
 - 2024/02/09 We have included a [Community](#community) section in the README to encourage support for MiniCPM from the open-source community.
@@ -180,8 +180,8 @@ print(res)
 ```

-#### llama.cpp、Ollama、fastllm、mlx_lm Inference
-We have supported inference with [llama.cpp](https://github.com/ggerganov/llama.cpp/) 、[ollama](https://github.com/ollama/ollama)、[fastllm](https://github.com/ztxz16/fastllm)、、[mlx_lm](https://github.com/ml-explore/mlx-examples). Thanks to [@runfuture](https://github.com/runfuture) for the adaptation of llama.cpp and ollama.
+#### llama.cpp, Ollama, fastllm, mlx_lm Inference
+We have supported inference with [llama.cpp](https://github.com/ggerganov/llama.cpp/), [ollama](https://github.com/ollama/ollama), [fastllm](https://github.com/ztxz16/fastllm), and [mlx_lm](https://github.com/ml-explore/mlx-examples). Thanks to [@runfuture](https://github.com/runfuture) for the adaptation of llama.cpp and ollama.

 **llama.cpp**
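As a sketch of the ollama route mentioned in the paragraph above (not part of the original README): a minimal Modelfile for a locally converted GGUF build of MiniCPM-2B. The `.gguf` filename is hypothetical, and the prompt template assumes MiniCPM-2B's `<用户>…<AI>` chat format.

```
# Minimal ollama Modelfile for a GGUF build of MiniCPM-2B.
# The filename below is hypothetical; point FROM at your own converted model.
FROM ./minicpm-2b-dpo-q4_k_m.gguf

# MiniCPM-2B wraps the user turn in <用户>; the model's reply follows <AI>.
TEMPLATE """<用户>{{ .Prompt }}<AI>"""

# Stop generating if the model starts a new user turn on its own.
PARAMETER stop "<用户>"
PARAMETER temperature 0.8
```

With that file in place, `ollama create minicpm-2b -f Modelfile` followed by `ollama run minicpm-2b` should serve the model locally.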
@@ -680,12 +680,12 @@ MBPP, instead of the hand-verified set.
 * Android, HarmonyOS
   * Adapted based on the open-source framework MLC-LLM.
   * Adapted for the text model MiniCPM and the multimodal model MiniCPM-V.
-  * Support MiniCPM-2B-SFT-INT4、MiniCPM-2B-DPO-INT4、MiniCPM-V.
+  * Support MiniCPM-2B-SFT-INT4, MiniCPM-2B-DPO-INT4, and MiniCPM-V.
   * [Compile and Installation Guide](https://github.com/OpenBMB/mlc-MiniCPM/blob/main/README.md)
 * iOS
   * Adapted based on the open-source framework LLMFarm.
   * Adapted for the text model MiniCPM.
-  * Support MiniCPM-2B-SFT-INT4、MiniCPM-2B-DPO-INT4.
+  * Support MiniCPM-2B-SFT-INT4 and MiniCPM-2B-DPO-INT4.
   * [Compile and Installation Guide](https://github.com/OpenBMB/LLMFarm)

 #### Performance
@@ -61,7 +61,7 @@ MiniCPM is a series of models jointly open-sourced by ModelBest (面壁智能) and the Tsinghua University Natural Language Processing Lab

 <p id="0"></p>

 ## Update Log

-- 2024/04/11 Released [MiniCPM-V-2.0](https://huggingface.co/openbmb/MiniCPM-V-2.0), [MiniCPM-2B-128k](https://huggingface.co/openbmb/MiniCPM-2B-128k), [MiniCPM-MoE-8x2B](https://huggingface.co/openbmb/MiniCPM-MoE-8x2B), and [MiniCPM-1B](https://huggingface.co/openbmb/MiniCPM-1B-sft-bf16)!
+- **2024/04/11 Released [MiniCPM-V-2.0](https://huggingface.co/openbmb/MiniCPM-V-2.0), [MiniCPM-2B-128k](https://huggingface.co/openbmb/MiniCPM-2B-128k), [MiniCPM-MoE-8x2B](https://huggingface.co/openbmb/MiniCPM-MoE-8x2B), and [MiniCPM-1B](https://huggingface.co/openbmb/MiniCPM-1B-sft-bf16)!**
 - 2024/03/16 More than 30 intermediate checkpoints of MiniCPM-2B were released
 - 2024/02/13 Added support for llama.cpp
 - 2024/02/09 We added a [Community](#community) section to the README to collect open-source community support cases for MiniCPM.