RYDE-WORK/ktransformers
mirror of https://github.com/RYDE-WORK/ktransformers.git synced 2026-01-24 07:29:28 +08:00
ktransformers/ktransformers/operators
Latest commit: Atream 477ac28a9c fix-update-flashinfer_wrapper_local_chat (2025-02-25 12:47:31 +00:00)
File                   Last commit                                                                          Date
__init__.py            Initial commit                                                                       2024-07-27 16:06:58 +08:00
attention.py           fix-update-flashinfer_wrapper_local_chat                                             2025-02-25 12:47:31 +00:00
base_operator.py       fix precision bug imported by position_ids in 0.2.0                                  2025-02-17 09:23:14 +00:00
cpuinfer.py            [feature] release 0.1.3                                                              2024-08-28 16:11:43 +00:00
dynamic_attention.py   [feature] release 0.1.3                                                              2024-08-28 16:11:43 +00:00
experts.py             Merge pull request #657 from kvcache-ai/feat-absorb-for-long-prefill                 2025-02-25 16:53:21 +08:00
flashinfer_wrapper.py  fix-update-flashinfer_wrapper_local_chat                                             2025-02-25 12:47:31 +00:00
gate.py                Add data loader to read special weights for fp8; Add special weight process script   2025-02-24 11:34:17 +00:00
linear.py              Merge remote-tracking branch 'upstream/develop-0.2.2' into support-fp8               2025-02-24 11:58:10 +00:00
models.py              support absorb for prefill long context                                              2025-02-25 08:52:02 +00:00
RoPE.py                fix precision bug imported by position_ids in 0.2.0                                  2025-02-17 09:23:14 +00:00
triton_attention.py    Update triton_attention.py                                                           2025-02-15 15:41:01 +08:00