Introducing Accurate Quantized Training (AQT) for accelerated ML training on TPU v5e

AI models continue to get bigger, requiring larger compute clusters with exa-FLOPs (10^18 FLOPs) of computing. While large-scale models continue to unlock new capabilities, driving down the cost of training and serving these models is the key to sustaining the pace of this innovation. Typically, the tensor operations (ops) are the most compute-intensive part of …
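To make the idea behind quantized training concrete, here is a minimal NumPy sketch: scale each matmul operand into the int8 range, multiply in integer arithmetic, and rescale the result back to floating point. This is an illustration only, not the AQT API; the quantize_int8 and int8_matmul helpers and the symmetric per-tensor scaling are assumptions made for this example.

import numpy as np

def quantize_int8(x):
    # Symmetric per-tensor quantization: map the largest magnitude to 127 (illustrative).
    scale = np.max(np.abs(x)) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def int8_matmul(a, b):
    # Multiply int8 operands with int32 accumulation, then dequantize the result.
    qa, sa = quantize_int8(a)
    qb, sb = quantize_int8(b)
    acc = qa.astype(np.int32) @ qb.astype(np.int32)
    return acc.astype(np.float32) * (sa * sb)

a = np.random.randn(128, 256).astype(np.float32)
b = np.random.randn(256, 64).astype(np.float32)
print(np.max(np.abs(int8_matmul(a, b) - a @ b)))  # small quantization error vs. the float matmul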

The Ultimate Guide to Generative AI for Email Marketing

Generative AI, or GenAI, is a game-changing technology that uses machine learning algorithms and large language models to create original content, images, videos, and more, all in response to user prompts. Its exceptional capacity to instantly create content at scale has made it a staple for marketers everywhere. Let’s explore the widespread use …

Image Feature Extraction in OpenCV: Edges and Corners

In the world of computer vision and image processing, the ability to extract meaningful features from images is essential. These features serve as vital inputs for various downstream tasks, such as object detection and classification. There are multiple ways to find these features. The naive way is to count the pixels. But in OpenCV, there …
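As a minimal sketch of the kind of feature extraction the article walks through, the snippet below applies OpenCV's Canny edge detector and Harris corner detector to a grayscale image. The file name and the threshold and detector parameters are placeholders chosen for illustration, not values from the article.

import cv2
import numpy as np

# Load an image in grayscale (path is a placeholder).
img = cv2.imread("sample.jpg", cv2.IMREAD_GRAYSCALE)

# Edges: Canny with example low/high thresholds.
edges = cv2.Canny(img, 100, 200)

# Corners: Harris response map, keep pixels with a strong response.
harris = cv2.cornerHarris(np.float32(img), blockSize=2, ksize=3, k=0.04)
corners = np.argwhere(harris > 0.01 * harris.max())  # (row, col) locations

print(f"edge pixels: {np.count_nonzero(edges)}, corner candidates: {len(corners)}")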

Alternating updates for efficient transformers

Posted by Xin Wang, Software Engineer, and Nishanth Dikkala, Research Scientist, Google Research. Contemporary deep learning models have been remarkably successful in many domains, ranging from natural language to computer vision. Transformer neural networks (transformers) are a popular deep learning architecture that today forms the foundation for most tasks in natural language processing and also …

Putting data storage at the forefront of cloud security

We live in an era of unprecedented technological breakthroughs and opportunities. Recent advances in areas like AI and quantum computing offer transformative potential for businesses, but they may also bring new risks and security challenges. IBM is working to address these challenges and evolving threats by helping organizations support highly secure, resilient, and durable storage through …
