
Conversation

@andrewgin commented Jan 26, 2026


V5 of the transformers package removes the `is_torch_fx_available` function. It is used in:
* research/llm_reranker/merge/modeling_minicpm_reranker.py
* research/llm_reranker/finetune_for_layerwise/modeling_minicpm_reranker.py
* FlagEmbedding/finetune/reranker/decoder_only/layerwise/modeling_minicpm_reranker.py
* FlagEmbedding/inference/reranker/decoder_only/models/modeling_minicpm_reranker.py

For FlagOpen#1561
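
For the affected modeling files, a minimal compatibility sketch (my assumption of the fix pattern, not the exact diff in this PR) is to guard the import so the code works under both transformers v4 and v5. Since `torch.fx` ships with every torch version modern enough to run transformers v5, probing for the module directly is a reasonable fallback:

```python
import importlib.util

try:
    # transformers < 5 still exposes the helper
    from transformers.utils import is_torch_fx_available
except ImportError:
    # transformers >= 5 removed it; fall back to checking for torch.fx directly
    def is_torch_fx_available() -> bool:
        try:
            return importlib.util.find_spec("torch.fx") is not None
        except ModuleNotFoundError:
            # torch itself is not installed
            return False
```

This keeps the rest of each `modeling_minicpm_reranker.py` unchanged, because every existing `if is_torch_fx_available():` callsite continues to resolve.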
@andrewgin force-pushed the limit_transformers_package branch from a030c83 to 5c30ba9 on January 26, 2026 at 18:21
@hw584521314

Very important information for our newbies.



Development

Successfully merging this pull request may close these issues.

huggingface transformers has removed the is_torch_fx_available function
