### Prerequisites

- [X] I am running the latest code. Mention the version if possible as well.
- [X] I carefully followed the [README.md](https://github.com/ggerganov/llama.cpp/blob/master/README.md).
- [X] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- [X] I reviewed the [Discussions](https://github.com/ggerganov/llama.cpp/discussions), and have a new and useful enhancement to share.

### Feature Description

Add support for InternLM-XComposer and Facebook Chameleon to the main repo and/or `convert_hf_to_gguf.py`.

### Motivation

Supporting these models would broaden the multi-modal options available in llama.cpp.

### Possible Implementation

None.