Digest 168

this week's favorite

2020: A year full of amazing AI papers 

A curated list of the latest breakthroughs in AI in 2020, ordered by release date, each with a clear video explanation, a link to a more in-depth article, and code.

A 135B parameter sparse neural network for massively improved search relevance

In this blog post, we are introducing “Make Every feature Binary” (MEB), a large-scale sparse model that complements our production Transformer models to improve search relevance for Microsoft customers using AI at Scale. To make search more accurate and dynamic, MEB better harnesses the power of large data and allows for an input feature space with over 200 billion binary features that reflect the subtle relationships between search queries and documents.
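
The post describes MEB only at a high level; the sketch below illustrates the general idea of a sparse model over binary cross features, not Microsoft's implementation. The hashing trick, the toy feature-space size, and the helper hash_cross_features are assumptions made purely for illustration, and a recent scikit-learn is assumed for loss="log_loss".

# Illustrative sketch only: a sparse linear relevance model over hashed binary
# (query term, document term) cross features, the general "every feature is
# binary" idea. Not Microsoft's MEB code.
import numpy as np
from scipy.sparse import csr_matrix, vstack
from sklearn.linear_model import SGDClassifier

FEATURE_SPACE = 2 ** 20  # toy stand-in for MEB's ~200-billion-feature space

def hash_cross_features(query: str, doc: str) -> csr_matrix:
    """Map each (query term, doc term) pair to one binary feature by hashing.

    Python's built-in hash() is process-salted; a production system would use
    a stable hash function instead.
    """
    cols = sorted({hash((q, d)) % FEATURE_SPACE
                   for q in query.lower().split()
                   for d in doc.lower().split()})
    return csr_matrix((np.ones(len(cols)), ([0] * len(cols), cols)),
                      shape=(1, FEATURE_SPACE))

# Toy click data: (query, document, clicked-or-not).
pairs = [
    ("hotels in paris", "book cheap hotels paris france", 1),
    ("hotels in paris", "paris hilton news and photos", 0),
    ("python list sort", "how to sort a list in python", 1),
    ("python list sort", "python snake care guide", 0),
]
X = vstack([hash_cross_features(q, d) for q, d, _ in pairs])
y = [label for _, _, label in pairs]

# Logistic regression trained with SGD handles very wide, very sparse inputs.
model = SGDClassifier(loss="log_loss").fit(X, y)
print(model.predict(hash_cross_features("hotels in paris", "cheap paris hotels")))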

Continental-scale building detection from high resolution satellite imagery

Identifying the locations and footprints of buildings is vital for many practical and scientific purposes. Such information can be particularly useful in developing regions where alternative data sources may be scarce. In this work, we describe a model training pipeline for detecting buildings across the entire continent of Africa, using 50 cm satellite imagery.
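
The blurb only names the training pipeline; a common formulation for building detection (assumed here, not taken from the paper) is per-pixel binary segmentation of imagery tiles. Below is a minimal PyTorch sketch of one training step, with a toy fully-convolutional model and fake tensors standing in for 50 cm tiles and rasterized footprints.

# Illustrative sketch only: per-pixel building/background segmentation as one
# plausible formulation. This is not the authors' model or data pipeline.
import torch
import torch.nn as nn

class TinySegmenter(nn.Module):
    """A deliberately tiny fully-convolutional stand-in for a real
    segmentation network (e.g. a U-Net)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),  # one logit per pixel: building vs. background
        )

    def forward(self, x):
        return self.net(x)

model = TinySegmenter()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Fake batch standing in for RGB tiles and their building-footprint masks.
images = torch.rand(4, 3, 256, 256)           # B x C x H x W
masks = (torch.rand(4, 1, 256, 256) > 0.5).float()

optimizer.zero_grad()
logits = model(images)
loss = loss_fn(logits, masks)
loss.backward()
optimizer.step()
print(f"toy training-step loss: {loss.item():.3f}")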

DALL·E mini

Generate images from a text prompt in this interactive report: a reproduction of DALL·E on a smaller architecture.

What is small and wide data, and how is big data different?

Whether reading Greek philosophy or listening to songs on the radio, we’re often reminded that the only thing that stays the same is that everything changes. In the realm of research and analytics, one of the most important changes currently influencing individuals, corporations and even politics is a shift in focus away from the concept and capabilities of big data.