Commit 538374f

Enable models regressed because of frontend type alignment changes (openvinotoolkit#26661)
Enable models regressed by the frontend type alignment changes aeaa4f7 & e732ade, which insert additional ConvertLike operations that prevent the SDPAToPagedAttention transformation from working as expected. Fix it by moving ConvertConvertLike after type resolution in FrontEnd::normalize(), so ConvertLike is turned into Convert only after types are resolved and nothing blocks the SDPAToPagedAttention transformation. Therefore, enable katuni4ka/tiny-random-chatglm2 & katuni4ka/tiny-random-glm4 in precommit tests.

List of enabled models:
* THUDM/chatglm2-6b
* THUDM/chatglm3-6b
* katuni4ka/tiny-random-chatglm2
* katuni4ka/tiny-random-glm4

Ticket:
* CVS-152286
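
For illustration, below is a minimal sketch of what ConvertConvertLike does once element types are known. It is not part of the commit; the tiny two-parameter graph and the use of the developer transformations header are assumptions made for the example.

    // Sketch: ConvertLike collapses into Convert once the "like" type is static.
    #include <openvino/core/except.hpp>
    #include <openvino/core/model.hpp>
    #include <openvino/op/convert.hpp>
    #include <openvino/op/convert_like.hpp>
    #include <openvino/op/parameter.hpp>
    #include <openvino/pass/manager.hpp>
    #include "transformations/op_conversions/convert_convertlike.hpp"  // developer package header (assumed available)

    int main() {
        using namespace ov;

        // Tiny graph: cast `data` to whatever element type `like` has.
        auto data = std::make_shared<op::v0::Parameter>(element::i64, Shape{2, 2});
        auto like = std::make_shared<op::v0::Parameter>(element::f32, Shape{1});
        auto cast = std::make_shared<op::v1::ConvertLike>(data, like);
        auto model = std::make_shared<Model>(OutputVector{cast}, ParameterVector{data, like});

        // `like` already has a static element type (f32), so the pass can rewrite
        // ConvertLike into a plain Convert(data, f32), which keeps downstream
        // pattern matchers such as SDPAToPagedAttention working.
        pass::Manager manager;
        manager.register_pass<pass::ConvertConvertLike>();
        manager.run_passes(model);

        // The Result now consumes a Convert node instead of a ConvertLike node.
        OPENVINO_ASSERT(as_type_ptr<op::v0::Convert>(
            model->get_results()[0]->get_input_node_shared_ptr(0)) != nullptr);
        return 0;
    }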
1 parent 5033754 · commit 538374f

2 files changed: +4 -3 lines changed

src/frontends/pytorch/src/frontend.cpp (+1)

@@ -317,6 +317,7 @@ void FrontEnd::normalize(const std::shared_ptr<ov::Model>& model) const {
     manager.register_pass<ov::pass::RemoveMultiSubGraphOpDanglingParamsResults>();
     manager.register_pass<ov::pass::ReverseShapeAndTypeInfer>();
     manager.register_pass<ov::pass::ResolveNameCollisions>(true);
+    manager.register_pass<ov::pass::ConvertConvertLike>();
     manager.run_passes(model);

     // Usually if nn.Module.forward is given as a source model for conversion, there is the first Parameter
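
For context, here is a rough sketch of how the transformation this pass ordering unblocks could be applied to an already converted model. The IR path is hypothetical, and the precommit tests drive SDPAToPagedAttention through their own harness rather than this snippet.

    // Sketch: apply SDPAToPagedAttention to a converted model via the public pass API.
    #include <openvino/openvino.hpp>
    #include <openvino/pass/manager.hpp>
    #include <openvino/pass/sdpa_to_paged_attention.hpp>

    int main() {
        ov::Core core;
        // Hypothetical IR path; any LLM whose SDPA pattern survived conversion works here.
        auto model = core.read_model("tiny-random-chatglm2.xml");

        // With ConvertLike already lowered to Convert during FrontEnd::normalize(),
        // the ScaledDotProductAttention pattern stays recognizable and the
        // transformation can rewrite it into PagedAttention form.
        ov::pass::Manager manager;
        manager.register_pass<ov::pass::SDPAToPagedAttention>();
        manager.run_passes(model);
        return 0;
    }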

tests/model_hub_tests/transformation_tests/models/hf-tiny-random-models-precommit (+3 -3)

@@ -39,8 +39,8 @@ fxmarty/really-tiny-falcon-testing,https://huggingface.co/fxmarty/really-tiny-fa
 Xenova/tiny-random-Phi3ForCausalLM,https://huggingface.co/Xenova/tiny-random-Phi3ForCausalLM
 facebook/opt-125m,https://huggingface.co/facebook/opt-125m
 facebook/opt-350m,https://huggingface.co/facebook/opt-350m
+katuni4ka/tiny-random-chatglm2,https://huggingface.co/katuni4ka/tiny-random-chatglm2
+katuni4ka/tiny-random-glm4,https://huggingface.co/katuni4ka/tiny-random-glm4
 hf-internal-testing/tiny-random-BioGptForCausalLM,https://huggingface.co/hf-internal-testing/tiny-random-BioGptForCausalLM,xfail,No ScaledDotProductAttention operation observed in the graph CVS-145820
 hf-internal-testing/tiny-random-XGLMForCausalLM,https://huggingface.co/hf-tiny-model-private/tiny-random-XGLMForCausalLM,xfail,No ScaledDotProductAttention operation observed in the graph CVS-145820
-katuni4ka/tiny-random-orion,https://huggingface.co/katuni4ka/tiny-random-orion,xfail,No ScaledDotProductAttention operation observed in the graph CVS-145820
-katuni4ka/tiny-random-chatglm2,https://huggingface.co/katuni4ka/tiny-random-chatglm2,xfail,Model references undeclared parameters: beam_idx () CVS-145820
-katuni4ka/tiny-random-glm4,https://huggingface.co/katuni4ka/tiny-random-glm4,xfail,Model references undeclared parameters beam_idx () attention_mask () CVS-145820
+katuni4ka/tiny-random-orion,https://huggingface.co/katuni4ka/tiny-random-orion,xfail,No ScaledDotProductAttention operation observed in the graph CVS-145820
