This commit does not belong to any branch on this repository, and may belong to a fork outside of the repository.
[Inference] Append attn FP8 quant (#9328)
* add fp8 gen files to gitignore
* append_attn support fp8 quant
* Unified FP8 Network
* include cuda_fp8.h
* simplify qwen2 network and FusedBlockMultiTransformerFP8
* simplify llama network and code check
* check fp8 params
* code check
* check
* default config for fp8 gemm
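The bullets above describe adding FP8 quantization support to the append-attention path. As background for reviewers, here is a minimal sketch of per-tensor FP8 (E4M3) quantization, the numerical scheme such kernels typically implement; this is an illustrative assumption, not the PR's actual CUDA kernel, and the function names are hypothetical.

```python
# Minimal sketch of per-tensor FP8 E4M3 quantization (illustrative only;
# the PR implements this on GPU via cuda_fp8.h, not in Python).

E4M3_MAX = 448.0  # largest finite magnitude representable in FP8 E4M3


def compute_scale(values):
    """Per-tensor scale so the largest magnitude maps onto the FP8 range."""
    amax = max((abs(v) for v in values), default=0.0)
    return amax / E4M3_MAX if amax > 0 else 1.0


def quantize_fp8(values, scale):
    """Divide by the scale and clamp to the E4M3 range.

    Rounding to the discrete FP8 grid is omitted for brevity.
    """
    return [max(-E4M3_MAX, min(E4M3_MAX, v / scale)) for v in values]


def dequantize_fp8(qvalues, scale):
    """Recover approximate original values by multiplying the scale back."""
    return [q * scale for q in qvalues]
```

With a per-tensor scale the round trip is exact at the clamp boundaries and approximate elsewhere once FP8 rounding is included; the "default config for fp8 gemm" bullet refers to choosing such scales and GEMM tile settings ahead of time.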