Keyword "inv_freq" error occurs in the latest version of Transformers, but not in version 4.37.2


Have users reported any discrepancies or performance degradation when moving the CogVLM 8-bit quantized model, which ran without issue on Transformers 4.37.2, to a newer Transformers release? Specifically, have there been notable problems with model functionality, compatibility, or anything else that prevents using the updated version?

The CogVLM 8-bit quantized model was working fine on Transformers 4.37.2, but after upgrading to the new Transformers release, loading it fails with an error mentioning the keyword `inv_freq`.
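Until the underlying incompatibility is resolved, a common workaround is pinning the library to the last known-good release (`pip install transformers==4.37.2`). Below is a minimal, hypothetical sketch of a runtime guard that warns before loading if the installed version is newer than 4.37.2; the `parse_version` helper and the `KNOWN_GOOD` constant are my own illustration, not part of the transformers API:

```python
from importlib import metadata

# Last release reported to load the 8-bit CogVLM model without the
# inv_freq keyword error (per the question above).
KNOWN_GOOD = (4, 37, 2)

def parse_version(v: str) -> tuple:
    """Turn a version string like '4.37.2' into (4, 37, 2) for tuple comparison."""
    return tuple(int(p) for p in v.split(".")[:3] if p.isdigit())

try:
    installed = parse_version(metadata.version("transformers"))
    if installed > KNOWN_GOOD:
        print(f"transformers {installed} is newer than the known-good 4.37.2; "
              "the inv_freq error may appear when loading the model.")
except metadata.PackageNotFoundError:
    print("transformers is not installed")
```

Checking the version up front makes the failure mode explicit instead of surfacing as an opaque keyword error deep inside model loading.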
