Digests » 75


ai

Minimal implementation of a Neural Architecture Search system

This repository implements a simple Neural Architecture Search (NAS) system in PyTorch. It is heavily inspired by the work of Barret Zoph & Quoc V. Le (2016).
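For a feel of the approach before opening the repo, here is a minimal sketch of the Zoph & Le idea: an RNN controller samples architecture decisions and is updated with REINFORCE so that higher-reward choices become more likely. The search space, the stand-in reward, and all hyperparameters below are illustrative assumptions, not the repository's code.

```python
# Toy NAS controller: sample architecture choices, score them, update with REINFORCE.
import torch
import torch.nn as nn

SEARCH_SPACE = {
    "hidden_units": [16, 32, 64, 128],      # hypothetical choices
    "activation":   ["relu", "tanh", "sigmoid"],
}

class Controller(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.LSTMCell(hidden, hidden)
        self.start = nn.Parameter(torch.zeros(1, hidden))
        self.heads = nn.ModuleList(
            [nn.Linear(hidden, len(v)) for v in SEARCH_SPACE.values()]
        )

    def sample(self):
        h = c = torch.zeros(1, self.start.shape[1])
        x = self.start
        choices, log_probs = [], []
        for head in self.heads:
            h, c = self.rnn(x, (h, c))
            dist = torch.distributions.Categorical(logits=head(h))
            idx = dist.sample()
            choices.append(idx.item())
            log_probs.append(dist.log_prob(idx))
            x = h
        return choices, torch.stack(log_probs).sum()

def proxy_reward(choices):
    # Stand-in for "train the sampled child network and return validation accuracy".
    return float(choices[0] == 3) + 0.5 * float(choices[1] == 0)

controller = Controller()
opt = torch.optim.Adam(controller.parameters(), lr=0.01)
baseline = 0.0
for step in range(200):
    choices, log_prob = controller.sample()
    reward = proxy_reward(choices)
    baseline = 0.9 * baseline + 0.1 * reward      # moving-average baseline
    loss = -(reward - baseline) * log_prob        # REINFORCE objective
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In the real system the proxy reward is replaced by actually training the sampled child network and measuring its validation accuracy, which is what makes the search expensive.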

Uncertainty Quantification in Deep Learning

This article summarizes a few classical papers on measuring uncertainty in deep neural networks. It is an overview piece, but its quality is much higher than the typical "getting started with ML" Medium post, so readers here may appreciate it.
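As a concrete taste of the topic, below is a small sketch of Monte Carlo dropout (Gal & Ghahramani, 2016), one classical technique for estimating predictive uncertainty; whether the article covers this particular method is an assumption, and the model and shapes are made up for illustration.

```python
# Monte Carlo dropout: keep dropout active at test time and average over
# several stochastic forward passes to get a predictive mean and spread.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 64), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(64, 3),
)

def mc_predict(model, x, n_samples=50):
    """Average class probabilities over n_samples stochastic passes."""
    model.train()  # train() keeps the dropout layers sampling masks
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
        )
    return probs.mean(dim=0), probs.std(dim=0)

x = torch.randn(5, 10)            # 5 dummy inputs with 10 features each
mean, std = mc_predict(model, x)
print(mean.shape, std.shape)      # torch.Size([5, 3]) for both
```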

🎥 The Origins of Neural Networks

The success of neural networks has sparked the AI revolution of the last 10 years, from Atari games to Go, Dota, and StarCraft. What many people don't know is that the basic idea of neural networks has been around since the late 1950s.

How to build a RNN and LSTM from scratch with NumPy

In this lab we will introduce different ways of learning from sequential data. As an example, we will train a neural network to do language modelling, i.e. to predict the next token in a sentence. In natural language processing a token could be a character or a word, but the concepts introduced here apply to all kinds of sequential data, such as protein sequences, weather measurements, audio signals, or monetary transaction histories, just to name a few.
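To make the setup concrete, here is a minimal NumPy sketch of a vanilla RNN forward pass for next-token prediction; the shapes, initialisation, and helper names are illustrative assumptions rather than the lab's actual code.

```python
# Vanilla RNN forward pass: update a hidden state from the previous state and
# the current token, then predict a distribution over the next token.
import numpy as np

rng = np.random.default_rng(0)
vocab_size, hidden_size = 20, 16

W_xh = rng.normal(scale=0.1, size=(hidden_size, vocab_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden
W_hy = rng.normal(scale=0.1, size=(vocab_size, hidden_size))   # hidden -> logits
b_h = np.zeros(hidden_size)
b_y = np.zeros(vocab_size)

def one_hot(idx):
    v = np.zeros(vocab_size)
    v[idx] = 1.0
    return v

def softmax(z):
    z = z - z.max()           # numerical stability
    e = np.exp(z)
    return e / e.sum()

def rnn_forward(token_ids):
    """Return a next-token probability distribution for each position."""
    h = np.zeros(hidden_size)
    outputs = []
    for t in token_ids:
        x = one_hot(t)
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)    # recurrent state update
        outputs.append(softmax(W_hy @ h + b_y))   # distribution over next token
    return outputs

probs = rnn_forward([3, 7, 1])
print(probs[-1].argmax())  # the model's guess for the token after the sequence
```

An LSTM replaces the single tanh update with gated updates to a cell state, which is the extension the lab builds up to.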

Machines Beat Humans on a Reading Test. But Do They Understand?

In an April 2018 paper coauthored with collaborators from the University of Washington and DeepMind, the Google-owned artificial intelligence company, Bowman introduced a battery of nine reading-comprehension tasks for computers called GLUE (General Language Understanding Evaluation).