[LLM INFER] Append attn #9244
Merged
Conversation
Thanks for your contribution!
Codecov Report
Attention: Patch coverage is
Additional details and impacted files

@@            Coverage Diff            @@
##           develop    #9244    +/-   ##
=========================================
  Coverage    52.73%   52.74%
=========================================
  Files          661      661
  Lines       107422   107371    -51
=========================================
- Hits         56653    56630    -23
+ Misses       50769    50741    -28

☔ View full report in Codecov by Sentry.
PR types
New features
PR changes
Others
Description
This PR refactors the attention graph for LLM inference. The new append_attn scheme improves performance by 10% to 90% over the old scheme.
Inference is currently supported for llama/qwen/qwen-moe/mixtral.
Usage: in the existing inference scripts, replace the --block_attn flag with --append_attn, as sketched below.
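A minimal sketch of the flag swap, assuming a PaddleNLP-style inference entry point. The script name predictor.py and the --model_name_or_path flag are illustrative assumptions; only --block_attn and --append_attn come from this PR:

# Old scheme: block attention (script and model flag assumed for illustration)
python predictor.py --model_name_or_path meta-llama/Llama-2-7b-chat --block_attn

# New scheme from this PR: same invocation with the flag swapped
python predictor.py --model_name_or_path meta-llama/Llama-2-7b-chat --append_attn

No other changes to the invocation should be needed; the rest of the script's arguments are unaffected by the flag swap.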
TODO: