
Bring legacy machine learning code into Amazon SageMaker using AWS Step Functions

Tens of thousands of AWS customers use AWS machine learning (ML) services to accelerate their ML development with fully managed infrastructure and tools. Customers who have been developing ML models on premises, such as on their local desktops, want to migrate their legacy ML models to the AWS Cloud to take full advantage of …


Optimize PyTorch training performance with Reduction Server on Vertex AI

As deep learning models become increasingly complex and datasets grow larger, distributed training is all but a necessity. Faster training means faster iteration toward your modeling goals. But distributed training comes with its own set of challenges. On top of deciding what kind of distribution strategy you want to use and making changes to …

Mix-and-match kit could enable astronauts to build a menagerie of lunar exploration bots

The Walking Oligomeric Robotic Mobility System, or WORMS, is a reconfigurable, modular, multiagent robotics architecture for extreme lunar terrain mobility. The system could be used to assemble autonomous worm-like parts into larger biomimetic robots that could explore lava tubes, steep slopes, and the moon’s permanently shadowed regions.

Pre-trained Model Representations and their Robustness against Noise for Speech Emotion Analysis

Pre-trained model representations have demonstrated state-of-the-art performance in speech recognition, natural language processing, and other applications. Models such as Bidirectional Encoder Representations from Transformers (BERT) and Hidden-Unit BERT (HuBERT) have enabled the generation of lexical and acoustic representations that benefit speech recognition applications. We investigated the use of pre-trained model representations for estimating dimensional emotions, …