
Last call: Stefan Krawczyk’s ‘Mastering MLOps’ Live Cohort

Last Updated on August 19, 2022. Sponsored Post. This is your last chance to sign up for Stefan Krawczyk’s exclusive live cohort, starting next week (August 22nd). We already have students enrolled from Apple, Amazon, Spotify, Nubank, Workfusion, Glassdoor, ServiceNow, and more. Stefan Krawczyk has spent the last 15+ years …


Why Initialize a Neural Network with Random Weights?

Last Updated on August 15, 2022. The weights of artificial neural networks must be initialized to small random numbers. This is an expectation of the stochastic optimization algorithm used to train the model, called stochastic gradient descent. To understand this approach to problem solving, you must first understand …
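As a minimal sketch (using NumPy, with illustrative layer sizes assumed here rather than anything from the post), small random initialization also breaks the symmetry between units: if every weight started at the same value, every unit would receive the same gradient and learn the same function.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# One fully connected layer: 8 inputs -> 4 units (sizes are illustrative).
n_inputs, n_units = 8, 4

# Small random weights, e.g. Gaussian with a small standard deviation.
weights = rng.normal(loc=0.0, scale=0.1, size=(n_inputs, n_units))
biases = np.zeros(n_units)  # biases can safely start at zero

x = rng.normal(size=n_inputs)                       # one example input
activation = np.maximum(0.0, x @ weights + biases)  # ReLU output of the layer
print(activation)
```

Because each unit starts from different weights, their gradients under stochastic gradient descent differ from the very first update onward.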


When to Use MLP, CNN, and RNN Neural Networks

Last Updated on August 15, 2022. What neural network is appropriate for your predictive modeling problem? It can be difficult for a beginner to the field of deep learning to know what type of network to use. There are so many types of networks to choose from, and new methods being …
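For illustration only (a sketch assuming TensorFlow/Keras, with made-up input shapes, not code from the post), minimal versions of the three network families might look like this:

```python
from tensorflow.keras import Sequential, layers

# MLP: tabular data, one flat feature vector per example.
mlp = Sequential([
    layers.Input(shape=(20,)),            # 20 input features (illustrative)
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])

# CNN: image data, where spatial structure matters.
cnn = Sequential([
    layers.Input(shape=(28, 28, 1)),      # e.g. 28x28 grayscale images
    layers.Conv2D(16, kernel_size=3, activation="relu"),
    layers.Flatten(),
    layers.Dense(1, activation="sigmoid"),
])

# RNN: sequence data, ordered steps over time.
rnn = Sequential([
    layers.Input(shape=(10, 8)),          # 10 time steps, 8 features each
    layers.LSTM(32),
    layers.Dense(1, activation="sigmoid"),
])
```

The input shape is the quickest tell: a flat vector suggests an MLP, a grid with spatial structure suggests a CNN, and an ordered sequence suggests an RNN.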


Difference Between a Batch and an Epoch in a Neural Network

Last Updated on August 15, 2022. Stochastic gradient descent is a learning algorithm that has a number of hyperparameters. Two hyperparameters that often confuse beginners are the batch size and the number of epochs. They are both integer values and seem to do the same thing. In this post, you will discover …
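As a worked example with illustrative numbers (not taken from the post): one epoch is a full pass over the training set, while each batch triggers one weight update within that pass.

```python
n_samples = 200   # training examples (illustrative)
batch_size = 5    # examples per weight update
n_epochs = 1000   # full passes over the dataset

batches_per_epoch = n_samples // batch_size   # 200 / 5 = 40 updates per epoch
total_updates = batches_per_epoch * n_epochs  # 40 * 1000 = 40,000 updates
print(batches_per_epoch, total_updates)
```

So the batch size controls how often the weights change, and the number of epochs controls how many times the algorithm sees each training example.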


Using Depthwise Separable Convolutions in TensorFlow

Last Updated on August 10, 2022. Looking at all of the very large convolutional neural networks such as ResNets, VGGs, and the like, we might ask how we can make these networks smaller, with fewer parameters, while still maintaining the same level of accuracy or even improving …
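As a rough sketch of the idea (assuming TensorFlow/Keras and illustrative shapes), a depthwise separable convolution factors a standard convolution into a per-channel spatial filter followed by a 1x1 pointwise convolution that mixes channels, which cuts the parameter count sharply:

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(32, 32, 64))  # illustrative feature map

standard = tf.keras.layers.Conv2D(128, kernel_size=3, padding="same")
separable = tf.keras.layers.SeparableConv2D(128, kernel_size=3, padding="same")

standard(inputs)   # call on a symbolic input so parameters are created
separable(inputs)

print(standard.count_params())   # 3*3*64*128 + 128 = 73,856
print(separable.count_params())  # 3*3*64 + 1*1*64*128 + 128 = 8,896
```

For the same output shape, the separable layer here needs roughly an eighth of the parameters of the standard convolution.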


Reverse engineering the NTK: towards first-principles architecture design

Deep neural networks have enabled technological wonders ranging from voice recognition to machine translation to protein engineering, but their design and application are nonetheless notoriously unprincipled. The development of tools and methods to guide this process is one of the grand challenges of deep learning theory. In Reverse Engineering the Neural Tangent Kernel, we propose …