fix broken link

TangJingqi 2024-08-16 11:10:30 +08:00
parent 77a34c289c
commit 4f87756c2e


@@ -165,7 +165,7 @@ Through these two rules, we place all previously unmatched layers (and their sub
 ## Muti-GPU
 If you have multiple GPUs, you can set the device for each module to different GPUs.
-DeepseekV2-Chat got 60 layers, if we got 2 GPUs, we can allocate 30 layers to each GPU. Complete multi GPU rule examples [here](ktransformers/optimize/optimize_rules).
+DeepseekV2-Chat got 60 layers, if we got 2 GPUs, we can allocate 30 layers to each GPU. Complete multi GPU rule examples [here](https://github.com/kvcache-ai/ktransformers/blob/main/ktransformers/optimize/optimize_rules/DeepSeek-V2-Chat-multi-gpu.yaml).
 <p align="center">
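The commit only swaps the relative link for an absolute one; the linked rule file is what actually carries the multi-GPU setup. For readers of this diff, here is a minimal sketch of what such a rule file can look like when splitting DeepseekV2-Chat's 60 layers evenly across two GPUs. The match/replace layout and the `generate_device`/`prefill_device` kwargs follow the pattern used by ktransformers optimize rules, but the exact regexes and the `"default"` class name below are illustrative assumptions, not the verbatim contents of DeepSeek-V2-Chat-multi-gpu.yaml.

```yaml
# Sketch of a two-GPU split for DeepseekV2-Chat (60 layers): layers 0-29 on
# cuda:0, layers 30-59 on cuda:1. Regexes and class names are illustrative
# assumptions; consult the linked DeepSeek-V2-Chat-multi-gpu.yaml for the
# real rules.
- match:
    name: "^model\\.layers\\.([0-9]|1[0-9]|2[0-9])\\."   # layers 0-29
  replace:
    class: "default"
    kwargs:
      generate_device: "cuda:0"
      prefill_device: "cuda:0"
- match:
    name: "^model\\.layers\\.([3-5][0-9])\\."            # layers 30-59
  replace:
    class: "default"
    kwargs:
      generate_device: "cuda:1"
      prefill_device: "cuda:1"
```

Rules like these are typically placed near the end of the file so that any layer not caught by an earlier, more specific rule still ends up pinned to one of the two devices.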