System Info
- transformers version: 4.57.1
- Python version: 3.13
- OS: Linux and macOS (reproduces on both, so the OS seems irrelevant)
Who can help?
It looks like the same issue is also breaking the AutoTokenizer for Magistral:
```
Traceback (most recent call last):
  File "./debug_mistral.py", line 23, in <module>
    tokenizer = AutoTokenizer.from_pretrained("mistralai/Magistral-Small-2509")
  File "./.pixi/envs/dev/lib/python3.13/site-packages/transformers/models/auto/tokenization_auto.py", line 1156, in from_pretrained
    tokenizer_class_py, tokenizer_class_fast = TOKENIZER_MAPPING[type(config)]
                                               ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^
  File "./.pixi/envs/dev/lib/python3.13/site-packages/transformers/models/auto/auto_factory.py", line 815, in __getitem__
    raise KeyError(key)
KeyError: <class 'transformers.models.mistral3.configuration_mistral3.Mistral3Config'>
```
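From the traceback, the lookup fails because TOKENIZER_MAPPING is keyed by config class and apparently has no entry for Mistral3Config. As a possible workaround (a minimal sketch, not a fix for the underlying bug, and assuming the checkpoint ships a tokenizer.json that the fast tokenizer can load), one can either bypass the auto mapping or register a tokenizer class for Mistral3Config:

```python
from transformers import AutoTokenizer, PreTrainedTokenizerFast, Mistral3Config

# Option 1: skip the AutoTokenizer mapping and load the fast tokenizer directly
# (only works if the repo contains a tokenizer.json).
tokenizer = PreTrainedTokenizerFast.from_pretrained("mistralai/Magistral-Small-2509")

# Option 2: register a tokenizer class for Mistral3Config so that
# AutoTokenizer.from_pretrained no longer raises the KeyError.
AutoTokenizer.register(Mistral3Config, fast_tokenizer_class=PreTrainedTokenizerFast)
tokenizer = AutoTokenizer.from_pretrained("mistralai/Magistral-Small-2509")
```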
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)
Reproduction
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mistralai/Magistral-Small-2509")
```
Expected behavior
AutoTokenizer.from_pretrained("mistralai/Magistral-Small-2509") should return a usable tokenizer instead of crashing with a KeyError.
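Concretely, something like the following round-trip is what I would expect to succeed once the mapping is fixed (a minimal sketch; the prompt string is just an example):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mistralai/Magistral-Small-2509")

# Basic sanity check: encode a prompt and decode it back.
ids = tokenizer("Hello from Magistral!")["input_ids"]
print(tokenizer.decode(ids, skip_special_tokens=True))
```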