The generative artificial intelligence (AI) revolution is in full swing, and customers of all sizes and across all industries are leveraging this transformative technology to reshape their businesses. From reimagining workflows to make them more intuitive and easier, to improving decision-making processes through rapid information synthesis, generative AI promises to redefine the way we interact with machines. It's been amazing to see the number of companies launching innovative generative AI applications on AWS using Amazon Bedrock. Siemens is integrating Amazon Bedrock into its Mendix low-code development platform to enable thousands of companies across multiple industries to build and update applications with the power of generative AI. Accenture and Anthropic are collaborating with AWS to help organizations, especially those in highly regulated industries such as healthcare, public sector, banking, and insurance, responsibly adopt and scale generative AI technology with Amazon Bedrock. This collaboration will help organizations like the District of Columbia Department of Health accelerate innovation, improve customer service, and improve productivity while maintaining data privacy and security. Amazon Pharmacy is using generative AI to fill prescriptions quickly and accurately, make customer service faster and more helpful, and ensure that the correct quantities of medications are stocked for customers.
To power so many diverse applications, we recognized early on the need for a diversity of models and options in generative AI. We know that different models excel in different areas, each with unique strengths tailored to specific use cases, which is why we provide customers with access to multiple state-of-the-art large language models (LLMs) and foundation models (FMs) through a unified service: Amazon Bedrock. By providing access to the best models from Amazon, Anthropic, AI21 Labs, Cohere, Meta, Mistral AI, and Stability AI, we allow customers to experiment with, evaluate, and ultimately select the model that offers optimal performance for their needs.
Announcing Mistral Large on Amazon Bedrock
Today, we are excited to announce the next step in this journey with an expanded collaboration with Mistral AI. Mistral AI, a French startup, has quickly established itself as a pioneering force in the generative AI landscape, known for its focus on portability, transparency, and cost-effective model designs that require fewer computational resources to run. We recently announced the availability of the Mistral 7B and Mixtral 8x7B models on Amazon Bedrock, with weights that customers can inspect and modify. Today, Mistral AI is bringing its latest and most capable model, Mistral Large, to Amazon Bedrock, and is committed to making future models accessible to AWS customers. Mistral AI will also use AWS's AI-optimized AWS Trainium and AWS Inferentia chips to build and deploy its future foundation models on Amazon Bedrock, benefiting from the price, performance, scale, and security of AWS. In conjunction with this announcement, starting today, customers can also use Amazon Bedrock in the AWS Europe (Paris) Region. At launch, customers will have access to some of the latest models from Amazon, Anthropic, Cohere, and Mistral AI, expanding their options to support diverse use cases, from text understanding to complex reasoning.
Mistral Large has exceptional language generation and understanding capabilities, making it ideal for complex tasks that require reasoning or highly specialized capabilities, such as synthetic text generation, code generation, retrieval augmented generation (RAG), or agents. For example, customers can create AI agents capable of engaging in articulate conversations, generating nuanced content, and tackling complex reasoning tasks. The model's strengths also extend to coding, with proficiency in code generation, review, and commenting across major coding languages. And Mistral Large's exceptional multilingual performance, covering French, German, Spanish, and Italian in addition to English, presents a compelling opportunity for customers. By offering a model with strong multilingual support, AWS can better serve customers with diverse linguistic needs, promoting global accessibility and inclusivity of generative AI solutions.
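To make the developer experience concrete, here is a minimal sketch of how a customer might invoke Mistral Large through the Amazon Bedrock Runtime API with boto3. The model ID, Region, and request/response fields shown are assumptions based on how other Mistral AI models are exposed on Bedrock, so confirm them against the Amazon Bedrock documentation before use.

```python
# Minimal sketch: invoking Mistral Large via the Amazon Bedrock Runtime API.
# The model ID and request/response fields below are assumptions; confirm them
# against the Amazon Bedrock documentation for Mistral AI models.
import json
import boto3

# Bedrock Runtime client in a Region where the model is available
# (assumption: eu-west-3, the Europe (Paris) Region mentioned above).
client = boto3.client("bedrock-runtime", region_name="eu-west-3")

# Mistral models on Bedrock accept an instruction-formatted prompt.
body = {
    "prompt": "<s>[INST] Write a short Python function that reverses a string, "
              "then explain it in French. [/INST]",
    "max_tokens": 512,
    "temperature": 0.5,
}

response = client.invoke_model(
    modelId="mistral.mistral-large-2402-v1:0",  # assumed model ID for Mistral Large
    body=json.dumps(body),
)

# The response body is a JSON document; Mistral models return generated text under "outputs".
result = json.loads(response["body"].read())
print(result["outputs"][0]["text"])
```

Because every model on Amazon Bedrock is reachable through this same invoke-style API, swapping Mistral Large in or out of an application is largely a matter of changing the model ID and request payload.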
By integrating Mistral Large into Amazon Bedrock, we can offer customers an even wider range of high-performing LLMs to choose from. No single model is optimized for every use case, and to unlock the value of generative AI, customers need access to a variety of models to discover which works best for their business needs. We are committed to continually introducing best-in-class models, giving customers access to the latest and most innovative generative AI capabilities.
"We are excited to announce our collaboration with AWS to accelerate the adoption of our cutting-edge AI technology in organizations around the world. Our mission is to make cutting-edge AI ubiquitous, and to achieve this mission, we want to collaborate with the world's leading cloud provider to distribute our world-class models. We have a long and deep relationship with AWS, and by strengthening it today, we will be able to provide tailor-made AI to builders around the world."
– Arthur Mensch, CEO of Mistral AI
Customers appreciate the choice
Since we first announced Amazon Bedrock, we've been innovating at a rapid pace, adding more powerful features like agents and guardrails. And we've said all along that more exciting innovations will continue to come, including new models. With more model options, customers tell us they can achieve remarkable results:
"The ease of accessing different models from one API is one of Bedrock's strengths. The model options available have been exciting. As new models become available, our AI team can quickly and easily evaluate them to see if they fit our needs. The security and privacy that Bedrock offers make it a great option to meet our AI needs."
– Jamie Caramanica, Senior Vice President of Engineering, CS Disco
"Our top priority today is helping organizations use generative AI to support employees and enhance bots through a variety of applications, such as more robust detection of topics, sentiment, and tone in customer conversations, language translation, content creation and variation, knowledge optimization, automated response highlighting, and summarization. To make it easier to harness the potential of generative AI, we offer our users access to a variety of large language models, including models developed by Genesys and multiple third-party foundation models available through Amazon Bedrock, including Anthropic's Claude, AI21 Labs' Jurassic-2, and Amazon Titan. Together with AWS, we give customers exponential power to create differentiated experiences based on their business needs, while helping them prepare for the future."
– Glenn Nethercutt, Chief Technology Officer, Genesys
As the generative AI revolution continues to unfold, AWS is poised to shape its future, empowering customers across industries to drive innovation, optimize processes, and redefine the way we interact with machines. Together with leading partners like Mistral AI, and with Amazon Bedrock as a foundation, our customers can create even more innovative generative AI applications.
Democratizing access to LLMs and FMs
Amazon Bedrock is democratizing access to cutting-edge LLMs and FMs, and AWS is the only cloud provider that offers the most popular and advanced FMs to customers. The collaboration with Mistral AI represents an important milestone in this journey, further expanding Amazon Bedrock's diverse model offerings and reinforcing our commitment to providing customers with unparalleled choice through Amazon Bedrock. By recognizing that no single model can optimally serve every use case, AWS has paved the way for customers to unlock the full potential of generative AI. Through Amazon Bedrock, organizations can experiment with and leverage the unique strengths of multiple high-performing models, tailoring their solutions to specific needs, industry domains, and workloads. This unprecedented choice, combined with the robust security, privacy, and scalability of AWS, enables customers to harness the power of generative AI responsibly and with confidence, regardless of their industry or regulatory constraints.
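As a small illustration of that model choice, the sketch below uses the Bedrock control-plane API to enumerate the foundation models visible to an account in a Region, a common first step when evaluating candidates for a workload. The provider filter string is an assumption; run the call without it to list models from every provider.

```python
# Minimal sketch: enumerating foundation models available through Amazon Bedrock
# so that each candidate model can be evaluated against a workload.
import boto3

# Control-plane client (model management), distinct from the "bedrock-runtime"
# client used for inference.
bedrock = boto3.client("bedrock", region_name="us-east-1")

# Optionally filter by provider; the exact provider string is an assumption.
models = bedrock.list_foundation_models(byProvider="Mistral AI")

for summary in models["modelSummaries"]:
    print(summary["modelId"], "-", summary.get("modelName", ""))
```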
Resources
- Mistral Large News Blog
- About Amazon blog
- Mistral AI on the Amazon Bedrock product page
About the Author
Swami Sivasubramanian is Vice President of Data and Machine Learning at AWS. In this role, Swami oversees all AWS database, analytics, artificial intelligence, and machine learning services. His team's mission is to help organizations put their data to work with a complete, end-to-end data solution to store, access, analyze, visualize, and predict.