Digests » 152
this week's favorite
Rice University computer scientists have demonstrated artificial intelligence (AI) software that runs on commodity processors and trains deep neural networks 15 times faster than platforms based on graphics processors.
I’m asked the same question multiple times a day on social media: “How can I start in machine learning?”. It frequently takes other forms, such as “How can I start for free?” or “How can I start if I don’t have a developer background?”. So I decided to write a complete guide on how to start in machine learning in 2021, from no background at all and for free. To answer these recurring questions quickly, I’ve spent the past year researching resources and saving the best ones in a notepad.
This video discusses the Kullback-Leibler (KL) divergence and explains why it is a natural measure of distance between distributions. The video walks through a simple proof, showing how, with some basic maths, we can get under the hood of the KL divergence and build an intuition for what it measures.
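As a companion to the video, here is a minimal sketch of the discrete KL divergence in plain Python (the distributions and values are illustrative, not from the video):

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions given as lists of probabilities.

    Terms where p_i == 0 contribute nothing, by the convention 0 * log(0) = 0.
    Assumes q_i > 0 wherever p_i > 0, otherwise the divergence is infinite.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two example distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # small positive number: P and Q are close
print(kl_divergence(p, p))  # 0.0 -- the divergence of a distribution from itself
```

Note that KL is not a true metric: it is asymmetric, so D_KL(P‖Q) generally differs from D_KL(Q‖P), which is part of what the video unpacks.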
In recent years, the use of Generative Adversarial Networks (GANs) has become very popular in generative image modeling. While style-based GAN architectures yield state-of-the-art results in high-fidelity image synthesis, computationally, they are highly complex.
My implementation of DeepMind's Perceiver. You can read more about the model on DeepMind's website. I trained an MNIST model, which you can find in models/mnist.pkl. It gets 93.45% accuracy, which is... so-so. At the bottom of this document are some to-dos that might help improve it.