mirror of
https://github.com/RYDE-WORK/ktransformers.git
synced 2026-02-04 13:33:12 +08:00
fix broken link
This commit is contained in: parent 77a34c289c, commit 4f87756c2e
@@ -165,7 +165,7 @@ Through these two rules, we place all previously unmatched layers (and their sub
 ## Muti-GPU
 
 If you have multiple GPUs, you can set the device for each module to different GPUs.
 
-DeepseekV2-Chat got 60 layers, if we got 2 GPUs, we can allocate 30 layers to each GPU. Complete multi GPU rule examples [here](ktransformers/optimize/optimize_rules).
+DeepseekV2-Chat got 60 layers, if we got 2 GPUs, we can allocate 30 layers to each GPU. Complete multi GPU rule examples [here](https://github.com/kvcache-ai/ktransformers/blob/main/ktransformers/optimize/optimize_rules/DeepSeek-V2-Chat-multi-gpu.yaml).
 
 <p align="center">
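The passage touched by this diff says DeepseekV2-Chat's 60 layers can be split 30/30 across two GPUs via an optimize-rule file. A minimal sketch of what such a rule could look like is below — the regexes implement the 0–29 / 30–59 split described in the text, but the `class` value and kwarg names are placeholders modeled on the optimize_rules YAML style, not copied from the linked file:

```yaml
# Hypothetical sketch only; see the linked DeepSeek-V2-Chat-multi-gpu.yaml
# for the real rule file. Layers 0-29 go to cuda:0, layers 30-59 to cuda:1.
- match:
    name: "^model\\.layers\\.([0-9]|[12][0-9])\\."   # matches layer indices 0-29
  replace:
    class: "default"          # placeholder class name
    kwargs:
      generate_device: "cuda:0"
      prefill_device: "cuda:0"
- match:
    name: "^model\\.layers\\.([3-5][0-9])\\."        # matches layer indices 30-59
  replace:
    class: "default"          # placeholder class name
    kwargs:
      generate_device: "cuda:1"
      prefill_device: "cuda:1"
```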