Uni-MoE: A Unified Multimodal LLM Based on a Sparse MoE Architecture
by Technical Terrence Team
05/25/2024

Unleashing the potential of multimodal large language models (MLLMs) to handle diverse modalities such as speech, text, images, and video ...