Integrated modeling approach decodes solid-state battery microstructures for better performance

Researchers at Lawrence Livermore National Laboratory (LLNL) have developed a novel, integrated modeling approach to identify and improve key interface and microstructural features in the complex materials typically used in advanced batteries. The work helped unravel the relationship between material microstructure and key properties, and it enables better predictions of how those properties affect battery operation, paving the way …


Deploy DeepSeek-R1 Distilled Llama models in Amazon Bedrock

Open foundation models (FMs) have become a cornerstone of generative AI innovation, enabling organizations to build and customize AI applications while maintaining control over their costs and deployment strategies. By providing high-quality, openly available models, the AI community fosters rapid iteration, knowledge sharing, and cost-effective solutions that benefit both developers and end-users. DeepSeek AI, a …

Parameters vs FLOPs: Scaling Laws for Optimal Sparsity for Mixture-of-Experts Language Models

Scaling the capacity of language models has consistently proven to be a reliable approach for improving performance and unlocking new capabilities. Capacity can be primarily defined by two dimensions: the number of model parameters and the compute per example. While scaling typically involves increasing both, the precise interplay between these factors and their combined contribution …
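The two capacity dimensions named above can be made concrete with a little arithmetic. For a Mixture-of-Experts model, total parameter count and compute per example diverge: only the routed top-k experts (plus any shared layers) are active for a given token, and forward-pass FLOPs scale with the *active* parameters, commonly approximated as about 2 FLOPs per active parameter per token. The sketch below illustrates this split; the specific numbers and the helper names (`active_params`, `flops_per_token`) are illustrative assumptions, not figures from the paper.

```python
def active_params(expert_params: float, shared_params: float,
                  num_experts: int, top_k: int) -> float:
    """Parameters actually used per token in a top-k routed MoE.

    expert_params: total parameters across ALL experts (illustrative)
    shared_params: parameters outside the expert layers (attention, embeddings)
    """
    # Only top_k of num_experts expert blocks fire per token.
    return shared_params + expert_params * top_k / num_experts


def flops_per_token(active: float) -> float:
    # Standard rough approximation: ~2 FLOPs per active parameter
    # for a forward pass (one multiply + one add per weight).
    return 2.0 * active


# Hypothetical example: 64e9 expert params split over 64 experts,
# 8e9 shared params, top-2 routing.
active = active_params(64e9, 8e9, num_experts=64, top_k=2)
print(f"total params:  {64e9 + 8e9:.1e}")   # 7.2e+10
print(f"active params: {active:.1e}")        # 1.0e+10
print(f"fwd FLOPs/tok: {flops_per_token(active):.1e}")  # 2.0e+10
```

The gap between the two numbers is the "sparsity" lever the headline refers to: total parameters grow with the number of experts while compute per example is pinned to the active subset.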


Develop a RAG-based application using Amazon Aurora with Amazon Kendra

Generative AI and large language models (LLMs) are helping organizations across diverse sectors enhance customer experience in ways that would traditionally have taken years of effort. Every organization has data stored in data stores, either on premises or with cloud providers. You can embrace generative AI and enhance customer experience by converting your existing data into …