Mistral AI Introduces Mixtral 8x7B: A Sparse Mixture of Experts (SMoE) Language Model Transforming Machine Learning

01/14/2024