Layer-Wise Data-Free CNN Compression

We present an efficient method for compressing a trained neural network without using any data. Our data-free method requires 14x-450x fewer FLOPs than comparable state-of-the-art methods. We break the problem of data-free network compression into a number of independent layer-wise compressions. We show how to efficiently generate layer-wise training data, and how to precondition the network to maintain accuracy during layer-wise compression. We show state-of-the-art performance on MobileNetV1 for data-free low-bit-width quantization. We also show state-of-the-art performance on data-free…
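The paper's exact data-generation and preconditioning procedures are not reproduced here; the following is a minimal PyTorch sketch of the general idea behind layer-wise data-free compression: each layer is compressed independently by fitting a low-bit copy to mimic the full-precision layer's outputs on synthetic inputs, so no real training data is needed. The uniform symmetric fake-quantizer, Gaussian synthetic inputs, and the names `fake_quant` and `compress_conv_layer` are illustrative assumptions, not the authors' implementation.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

def fake_quant(w, num_bits=4):
    # Uniform symmetric fake-quantization with a straight-through gradient.
    # (Assumed scheme for illustration; the paper may use a different quantizer.)
    qmax = 2 ** (num_bits - 1) - 1
    scale = w.detach().abs().max().clamp(min=1e-8) / qmax
    q = torch.clamp(torch.round(w / scale), -qmax - 1, qmax) * scale
    return w + (q - w).detach()  # gradients flow through w unchanged

def compress_conv_layer(teacher: nn.Conv2d, in_shape, num_bits=4,
                        steps=300, batch=32, lr=1e-3):
    """Fit a low-bit copy of `teacher` to reproduce the teacher's outputs
    on randomly generated inputs -- no real data required."""
    student = copy.deepcopy(teacher)
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    for _ in range(steps):
        x = torch.randn(batch, *in_shape)        # synthetic layer input
        with torch.no_grad():
            target = teacher(x)                  # full-precision reference
        out = F.conv2d(x, fake_quant(student.weight, num_bits), student.bias,
                       stride=teacher.stride, padding=teacher.padding,
                       dilation=teacher.dilation, groups=teacher.groups)
        loss = F.mse_loss(out, target)           # layer-wise reconstruction loss
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():                        # bake in the quantized weights
        student.weight.copy_(fake_quant(student.weight, num_bits))
    return student

# Example: compress one 3x3 conv independently of the rest of the network.
layer = nn.Conv2d(32, 64, kernel_size=3, padding=1)
compressed = compress_conv_layer(layer, in_shape=(32, 28, 28), num_bits=4)
```

Because each layer is fit against its own full-precision output, the problem decomposes into many small independent optimizations, which is what makes the approach so much cheaper in FLOPs than methods that fine-tune the whole network.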