How to use TianRW/CEAEval-Model with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, GatedAttenQwen2_5omnithinker

tokenizer = AutoTokenizer.from_pretrained("TianRW/CEAEval-Model")
model = GatedAttenQwen2_5omnithinker.from_pretrained("TianRW/CEAEval-Model")
```
The repository also ships a minimal generation config:

```json
{
  "_from_model_config": true,
  "transformers_version": "4.57.0.dev0"
}
```
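For reference, the two fields above can be inspected programmatically. This is a minimal sketch using only the standard-library `json` module, with the config reproduced inline; the meaning of `_from_model_config` (generation defaults copied from the model config rather than set explicitly for this checkpoint) follows the standard Transformers convention.

```python
import json

# generation_config.json content, reproduced from the repository above.
raw = """
{
  "_from_model_config": true,
  "transformers_version": "4.57.0.dev0"
}
"""

config = json.loads(raw)

# "_from_model_config": true indicates the generation defaults were
# inherited from the model config; no sampling parameters are overridden.
print(config["_from_model_config"])    # True
print(config["transformers_version"])  # 4.57.0.dev0
```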