Digests » 78
this week's favorite
Deep Neural Networks (DNNs) have facilitated tremendous progress across a range of applications, including image classification, translation, language modeling, and video captioning. DNN training is extremely time-consuming, making efficient multi-accelerator parallelization essential.
Self-supervised learning opens up a huge opportunity to better utilize unlabelled data while still training in a supervised manner. This post covers many interesting ideas for self-supervised learning tasks on images, videos, and control problems.
A flurry of recent papers in theoretical deep learning tackles a common theme: analyzing neural networks in the infinite-width limit. At first, this limit may seem impractical and even pointless to study. It turns out, however, that neural networks in this regime simplify to linear models governed by a kernel called the neural tangent kernel (NTK), which makes gradient descent straightforward to analyze. Empirical results show that networks in this regime perform worse than practical over-parameterized networks, but the limit still provides theoretical insight into some aspects of neural network training, so it is worth studying.
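The kernel itself is easy to state: the empirical NTK at two inputs is the inner product of the network's parameter gradients at those inputs. A minimal NumPy sketch for a hypothetical one-hidden-layer tanh network (the architecture, width `m`, and 1/√m scaling here are illustrative assumptions, not taken from the papers discussed):

```python
import numpy as np

rng = np.random.default_rng(0)

def ntk_entry(x1, x2, W, v):
    """Empirical NTK K(x1, x2) = <grad_theta f(x1), grad_theta f(x2)>
    for the toy network f(x) = v . tanh(W x) / sqrt(m)."""
    def param_grads(x):
        h = W @ x                # pre-activations, shape (m,)
        a = np.tanh(h)           # hidden activations
        m = len(v)
        gv = a / np.sqrt(m)                            # df/dv
        gW = np.outer(v * (1 - a**2), x) / np.sqrt(m)  # df/dW (tanh' = 1 - tanh^2)
        return np.concatenate([gW.ravel(), gv])
    return param_grads(x1) @ param_grads(x2)

d, m = 3, 512                       # input dim, hidden width (illustrative)
W = rng.standard_normal((m, d))     # random init, as in the infinite-width analysis
v = rng.standard_normal(m)
x1, x2 = rng.standard_normal(d), rng.standard_normal(d)
k = ntk_entry(x1, x2, W, v)
```

The infinite-width results say that as `m` grows, this kernel stays (nearly) fixed during training, which is why gradient descent on the network behaves like kernel regression with the NTK.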
In this article, I want to go into depth on the principles behind using Markov chains and statistics to procedurally generate 3D buildings and other systems.
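The core mechanic of such generators can be sketched as a first-order Markov chain over building modules: count transitions in example sequences, then sample a new sequence from those counts. The module names and example buildings below are made up for illustration and are not taken from the article:

```python
import random
from collections import defaultdict

def build_chain(sequences):
    """Collect observed transitions between successive modules."""
    chain = defaultdict(list)
    for seq in sequences:
        for cur, nxt in zip(seq, seq[1:]):
            chain[cur].append(nxt)  # duplicates encode transition frequency
    return chain

def generate(chain, start, max_length, seed=0):
    """Random walk through the chain until a dead end or max_length."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < max_length:
        options = chain.get(out[-1])
        if not options:             # e.g. "roof" has no outgoing transitions
            break
        out.append(rng.choice(options))
    return out

# Hypothetical training examples: vertical stacks of building modules.
examples = [
    ["foundation", "floor", "floor", "roof"],
    ["foundation", "floor", "balcony", "floor", "roof"],
]
chain = build_chain(examples)
building = generate(chain, "foundation", max_length=8)
```

Because transitions are stored with duplicates, frequent patterns in the examples ("floor" followed by "floor") are sampled proportionally more often, which is the statistical part of the technique.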
Use data pipelines, transfer learning, and BERT to achieve 85% accuracy in sentiment analysis.