Digest #75
this week's favorite
This repository implements a simple Neural Architecture Search (NAS) system in PyTorch, heavily inspired by the work of Barret Zoph & Quoc V. Le (2016).
The success of neural networks has sparked the AI revolution of the last 10 years, from Atari games to Go, Dota, and StarCraft. What many people don't know is that the basic idea behind neural networks has been around since the late 1950s.
This article summarizes a few classical papers on measuring uncertainty in deep neural networks. It's an overview article, but I felt its quality is much higher than that of the typical "getting started with ML" Medium blog post, so people might appreciate it on this forum.
In this lab we will introduce different ways of learning from sequential data. As an example, we will train a neural network to do language modelling, i.e. predicting the next token in a sentence. In the context of natural language processing a token could be a character or a word, but the concepts introduced here apply to all kinds of sequential data, such as protein sequences, weather measurements, audio signals, or monetary transaction histories, just to name a few.
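The core idea in the lab's blurb, predicting the next token from the ones before it, can be sketched in a few lines of PyTorch. This is a minimal illustration, not code from the lab itself: the model name `TinyLM`, the toy corpus, and all sizes are assumptions, and a character is used as the token.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

class TinyLM(nn.Module):
    """Minimal character-level language model: embed tokens, run a GRU,
    and emit logits over the vocabulary for the next token at each step."""
    def __init__(self, vocab_size, emb_dim=16, hidden_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x):               # x: (batch, seq_len) token ids
        h, _ = self.rnn(self.embed(x))  # (batch, seq_len, hidden_dim)
        return self.head(h)             # (batch, seq_len, vocab_size)

# Toy corpus: a highly repetitive string the model can easily learn.
text = "hello world. " * 50
vocab = sorted(set(text))
stoi = {c: i for i, c in enumerate(vocab)}
ids = torch.tensor([stoi[c] for c in text])

# Targets are the inputs shifted by one position: predict the NEXT token.
x, y = ids[:-1].unsqueeze(0), ids[1:].unsqueeze(0)

model = TinyLM(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for step in range(200):
    logits = model(x)
    loss = loss_fn(logits.reshape(-1, len(vocab)), y.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

After a couple of hundred steps the cross-entropy falls well below the uniform baseline of log(vocab size), showing the network has learned the sequence's structure. Swapping the character ids for any other discrete sequence (amino acids, quantized sensor readings) leaves the code unchanged, which is the blurb's point about generality.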
In an April 2018 paper coauthored with collaborators from the University of Washington and DeepMind, the Google-owned artificial intelligence company, Bowman introduced a battery of nine reading-comprehension tasks for computers called GLUE (General Language Understanding Evaluation).