
this week's favorite

11 pandas built-in functions you should know

I’ve been using pandas for a few years, and each time I feel I’m typing too much, I google the operation and usually find a shorter way of doing it: a new pandas trick!
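As a taste of the kind of shortcut the post collects, here is a minimal, made-up example (the DataFrame and column names are invented for illustration) where a single pandas built-in, nlargest, replaces a sort-then-head chain:

import pandas as pd

# A small toy DataFrame, invented purely for illustration.
df = pd.DataFrame({
    "city": ["Oslo", "Lima", "Kyoto", "Perth"],
    "population": [709_000, 9_750_000, 1_460_000, 2_140_000],
})

# The long way: sort the whole frame, then keep the top rows.
top_manual = df.sort_values("population", ascending=False).head(2)

# The shorter built-in that does the same thing in one call.
top_builtin = df.nlargest(2, "population")

print(top_builtin)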

Leveraging machine learning for game development

Over the years, online multiplayer games have exploded in popularity, captivating millions of players across the world. This popularity has also exponentially increased demands on game designers, as players expect games to be well-crafted and balanced — after all, it's no fun to play a game where a single strategy beats all the rest.

Expressive power of graph neural networks and the Weisfeiler-Lehman test

Do you have a feeling that deep learning on graphs is a bunch of heuristics that work sometimes and nobody has a clue why? In this post, I discuss the graph isomorphism problem, the Weisfeiler-Lehman heuristic for graph isomorphism testing, and how it can be used to analyse the expressive power of graph neural networks.
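For readers who want the heuristic in concrete terms, here is a minimal sketch of 1-dimensional Weisfeiler-Lehman color refinement (the graphs and function names below are made up for illustration): each node repeatedly re-hashes its own color together with the multiset of its neighbors' colors, and if the resulting color histograms of two graphs differ, the graphs are certainly not isomorphic (the converse does not hold):

from collections import Counter

def wl_histogram(adj, iterations=3):
    # adj: dict mapping node -> list of neighbor nodes.
    colors = {v: 0 for v in adj}  # start every node with the same color
    for _ in range(iterations):
        # Refine: a node's new color depends on its old color and the
        # multiset (here: sorted tuple) of its neighbors' old colors.
        colors = {
            v: hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
            for v in adj
        }
    return Counter(colors.values())

def maybe_isomorphic(adj_a, adj_b, iterations=3):
    # True means 1-WL cannot tell the graphs apart; False means they are
    # certainly non-isomorphic.
    return wl_histogram(adj_a, iterations) == wl_histogram(adj_b, iterations)

# A classic pair 1-WL cannot separate: a 6-cycle vs. two disjoint triangles
# (both graphs are 2-regular).
cycle_6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}

print(maybe_isomorphic(cycle_6, two_triangles))  # True, although they are not isomorphic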

Beyond Weisfeiler-Lehman: using substructures for provably expressive graph neural networks

In this post, I discuss how to design local, computationally efficient, and provably powerful graph neural networks that are not based on the Weisfeiler-Lehman test hierarchy.
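As a rough illustration of the substructure idea (not the architecture from the post), one can pre-count, say, the triangles each node participates in and feed those counts to an ordinary message-passing network as extra node features; since 1-WL cannot tell a 6-cycle from two disjoint triangles, such counts already add expressive power:

from itertools import combinations

def triangle_counts(adj):
    # adj: dict mapping node -> list of neighbors.
    # Returns, for each node, the number of triangles it participates in.
    neighbors = {v: set(adj[v]) for v in adj}
    counts = {v: 0 for v in adj}
    for v in adj:
        for u, w in combinations(neighbors[v], 2):
            if w in neighbors[u]:  # u, v, w are pairwise connected
                counts[v] += 1
    return counts

# The same 2-regular pair as above: a 6-cycle and two disjoint triangles.
cycle_6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}

print(triangle_counts(cycle_6))        # all zeros: a 6-cycle has no triangles
print(triangle_counts(two_triangles))  # every node lies in exactly one triangle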

Factorized layers revisited: Compressing deep networks without playing the lottery

We’re witnessing exciting, emerging research into compressing deep networks to make them less expensive, small enough to store on any device, and more energy efficient. Perhaps the most popular approach to model compression is pruning, in which redundant model parameters are removed, leaving only a small subset of parameters, or a subnetwork. A major drawback of pruning, though, is that it requires training a large model first, which is expensive and resource intensive.
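The alternative the title points to, factorizing weight matrices, can be sketched generically (the shapes, the rank, and the use of a plain truncated SVD are arbitrary choices for illustration, not the method from the paper): replace a dense d_out x d_in matrix W with two thin factors of rank r, shrinking the parameter count from d_out * d_in to r * (d_out + d_in):

import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, rank = 512, 1024, 64

# An artificial weight matrix with mostly low-rank structure plus noise,
# so the truncation has something to capture (real trained layers vary).
W = rng.standard_normal((d_out, rank)) @ rng.standard_normal((rank, d_in))
W += 0.05 * rng.standard_normal((d_out, d_in))

# Truncated SVD: keep the top-`rank` singular directions, so W is
# approximated by U @ V with U of shape (d_out, rank) and V of shape (rank, d_in).
U_full, s, Vt = np.linalg.svd(W, full_matrices=False)
U = U_full[:, :rank] * s[:rank]   # fold the singular values into U
V = Vt[:rank, :]

print("dense params:   ", W.size)            # 512 * 1024
print("factored params:", U.size + V.size)   # 64 * (512 + 1024)
print("relative error: ", np.linalg.norm(W - U @ V) / np.linalg.norm(W))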