Description
I fine-tuned the "unsloth/Llama-3.2-11B-Vision-Instruct" model with Unsloth. After fine-tuning, I saved the merged model with:

model.save_pretrained_merged("Finetuned_Model", tokenizer)

This creates a folder containing the safetensors and JSON files. I then tried to convert it to GGUF, so I can use it locally without a GPU, with:

!python llama.cpp/convert_hf_to_gguf.py Finetuned_Model

but I get the error below:
ERROR:hf-to-gguf:Model MllamaForConditionalGeneration is not supported
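For reference, this is roughly the workflow I used, assuming Unsloth's FastVisionModel API and a LoRA fine-tune; the adapter folder name "lora_model" below is just a placeholder:

```python
# Rough sketch of the workflow, assuming a LoRA fine-tune with Unsloth's
# FastVisionModel API; "lora_model" is a placeholder for the adapter folder.
from unsloth import FastVisionModel

model, tokenizer = FastVisionModel.from_pretrained(
    "lora_model",        # placeholder: folder with the trained LoRA adapters
    load_in_4bit=False,  # load in 16-bit so the merged weights are full precision
)

# Merge the adapters into the base weights and write safetensors + JSON config.
model.save_pretrained_merged("Finetuned_Model", tokenizer)

# Conversion attempt (run from the shell / a Colab cell):
#   python llama.cpp/convert_hf_to_gguf.py Finetuned_Model
# This is where the MllamaForConditionalGeneration error is raised.
```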