Is your feature request related to a problem? Please describe.
For MoE inference, a mixed-type group GEMM would be very helpful. Currently, CUTLASS appears to support only mixed-type matmul and group GEMM for non-mixed types, but not both together.
Describe the solution you'd like
Group GEMM support for the mixed-type combinations fp16 × int4, fp16 × int8, and e4m3 × int4.
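To make the requested semantics concrete, here is a small NumPy reference sketch (not CUTLASS code) of what a mixed-type group GEMM computes: each group/expert runs an independent GEMM with its own problem shape, where the activations are fp16 and the weights are stored as int8 (or int4) with dequantization scales. The function names `quantize_int8` and `mixed_type_group_gemm` are hypothetical, chosen here for illustration only.

```python
import numpy as np

def quantize_int8(w, axis=0):
    # Hypothetical helper: symmetric per-column int8 quantization.
    # Returns the quantized weights and the fp16 dequantization scales.
    scale = np.abs(w).max(axis=axis, keepdims=True) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale.astype(np.float16)

def mixed_type_group_gemm(a_list, q_list, scale_list):
    # Reference semantics only: one fp16 x int8 GEMM per group.
    # Each group may have a different (M, N, K) problem shape, which is
    # what distinguishes a group GEMM from a plain batched GEMM.
    outs = []
    for a, q, s in zip(a_list, q_list, scale_list):
        w = q.astype(np.float16) * s                  # dequantize int8 -> fp16
        outs.append(a.astype(np.float32) @ w.astype(np.float32))
    return outs
```

A fused kernel would of course dequantize in registers rather than materializing the fp16 weights, but this captures the numerics being asked for: per-group GEMMs where A and B have different storage types.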
This issue has been labeled inactive-30d due to no recent activity in the past 30 days. Please close this issue if no further response or action is needed. Otherwise, please respond with a comment indicating any updates or changes to the original issue and/or confirm this issue still needs to be addressed. This issue will be labeled inactive-90d if there is no activity in the next 60 days.