Beating the AI bottleneck: Communications innovation could markedly improve AI training process
Artificial intelligence (AI) is infamous for its resource-heavy training, but a new study may have found a solution: a novel communications system called ZEN that markedly improves how large language models (LLMs) are trained.
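The article does not spell out ZEN's internals here, but communications-side optimizations for distributed training generally work by shrinking the volume of gradient data that workers must exchange at each step. The sketch below is a minimal, hypothetical illustration of one such idea, sparsity-aware gradient exchange, in which a worker sends only the significant gradient entries rather than the full dense tensor. The function names, the threshold, and the payload format are assumptions made for illustration, not ZEN's actual design.

```python
# Hypothetical sketch of sparsity-aware gradient exchange in distributed
# training. The names pack_sparse/unpack_sparse and the threshold are
# illustrative assumptions, not part of any published ZEN API.
import numpy as np

def pack_sparse(grad: np.ndarray, threshold: float = 1e-3):
    """Keep only entries whose magnitude exceeds `threshold`.

    Returns (indices, values): the payload a worker would actually send,
    instead of the full dense gradient.
    """
    idx = np.flatnonzero(np.abs(grad) > threshold)
    return idx, grad[idx]

def unpack_sparse(idx: np.ndarray, vals: np.ndarray, size: int) -> np.ndarray:
    """Rebuild a dense gradient from the sparse payload on the receiver."""
    dense = np.zeros(size)
    dense[idx] = vals
    return dense

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Gradients from large models are often mostly near-zero; simulate
    # a 1%-dense gradient over one million parameters.
    n = 1_000_000
    grad = np.where(rng.random(n) < 0.01, rng.normal(size=n), 0.0)

    idx, vals = pack_sparse(grad)
    recovered = unpack_sparse(idx, vals, n)
    # Reconstruction is exact for kept entries; only sub-threshold
    # noise is dropped.
    assert np.allclose(recovered, grad, atol=1e-3)

    dense_bytes = grad.nbytes
    sparse_bytes = idx.nbytes + vals.nbytes
    print(f"dense payload:  {dense_bytes / 1e6:.1f} MB")
    print(f"sparse payload: {sparse_bytes / 1e6:.2f} MB "
          f"({sparse_bytes / dense_bytes:.1%} of dense)")
```

On this toy example, the sparse payload (indices plus values) comes to roughly 2% of the dense one, which is the kind of reduction that makes synchronization, often the bottleneck in multi-GPU training, substantially cheaper.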