Digests » 145
this week's favorite
When we process a scene in front of our eyes, we automatically identify objects as distinct from one another and associate them with definitions. Image recognition is the task of identifying what an image depicts and categorizing it into one of several predefined classes. It is a computer vision technique that allows machines to interpret and categorize what they “see” in images or videos.
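At its simplest, classification into predefined classes amounts to scoring an image's feature vector against each class and picking the highest score. A minimal sketch of that final step, with made-up class names, weights, and features standing in for a trained model:

```python
import math

CLASSES = ["cat", "dog", "car"]  # illustrative class names, not from the article

def softmax(scores):
    # turn raw class scores into probabilities; shift by the max for stability
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(features, weights):
    # weights: one row of coefficients per class (a stand-in for a trained model)
    scores = [sum(w * f for w, f in zip(row, features)) for row in weights]
    probs = softmax(scores)
    best = max(range(len(CLASSES)), key=lambda i: probs[i])
    return CLASSES[best], probs

weights = [[0.9, -0.2], [0.1, 0.8], [-0.5, 0.3]]  # made-up learned weights
label, probs = classify([1.0, 0.5], weights)      # made-up extracted features
```

In practice the feature vector comes from a deep network rather than being hand-built, but the final categorize-into-classes step works the same way.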
Freewire is a Keras-like API for creating optimized, freely wired neural networks that run on CUDA. Freely wired neural networks are defined at the level of individual nodes (or neurons) and their connections, rather than at the level of homogeneous layers. The goal of Freewire is to let an arbitrary DAG of artificial neurons be defined first, with the optimized set of operations compiled at runtime and executed on CUDA.
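To make the idea concrete, here is a sketch of what node-level wiring means: each neuron is created individually with its own inputs, so connections can skip "layers" freely. This is not Freewire's actual API, just an illustration of defining a network as an arbitrary DAG of neurons:

```python
import math

class Node:
    # one artificial neuron with its own fan-in (hypothetical, not Freewire's API)
    def __init__(self, inputs, weights, bias=0.0):
        self.inputs, self.weights, self.bias = inputs, weights, bias

    def forward(self, values):
        # values: dict mapping already-evaluated nodes to their outputs
        z = self.bias + sum(w * values[n] for n, w in zip(self.inputs, self.weights))
        return math.tanh(z)

x = "x"                                   # input placeholder
a = Node([x], [0.5])
b = Node([x], [-1.0], bias=0.2)
c = Node([x, a, b], [0.3, 1.0, -0.7])     # arbitrary fan-in: reads the input AND both nodes

def run(topo_order, x_val):
    # evaluate nodes in a topological order of the DAG
    values = {x: x_val}
    for node in topo_order:
        values[node] = node.forward(values)
    return values

out = run([a, b, c], 1.0)[c]
```

Freewire's contribution, per the description, is compiling such a graph into an optimized set of operations at runtime rather than evaluating node by node as above.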
The purpose of this post is to break down the math behind the Transformer architecture, as well as share some helpful resources and gotchas based on my experience learning about it. We start with an exploration of the sequence transduction literature leading up to the Transformer, after which we dive into the foundational Attention is All You Need paper by Vaswani et al. (2017).
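The mathematical core of that paper is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)·V. A small NumPy sketch of the formula on toy shapes (3 tokens, d_k = 4):

```python
import numpy as np

def attention(Q, K, V):
    # scaled dot-product attention from "Attention is All You Need"
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted sum of values

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))  # toy data
out = attention(Q, K, V)
```

The √d_k scaling keeps the dot products from growing with dimension and pushing the softmax into regions with vanishing gradients, one of the details the post unpacks.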
Deriving graph neural networks (GNNs) from first principles, motivating their use, and explaining how they have emerged along several related research lines.
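The operation those research lines converge on is message passing: each node aggregates its neighbors' features, then applies a shared transformation. A minimal sketch on a toy graph, with a scalar weight standing in for the usual learned matrix:

```python
import math

adj = {0: [1, 2], 1: [0], 2: [0]}                    # toy undirected graph
feats = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0]}

def gnn_layer(adj, feats, w=0.5, b=0.1):
    # one round of message passing: aggregate, then transform
    new = {}
    for v, nbrs in adj.items():
        # aggregate: mean over the node and its neighbors
        group = [feats[v]] + [feats[u] for u in nbrs]
        mean = [sum(col) / len(group) for col in zip(*group)]
        # transform: shared weight + nonlinearity (w, b are illustrative,
        # standing in for a learned weight matrix and bias)
        new[v] = [math.tanh(w * m + b) for m in mean]
    return new

out = gnn_layer(adj, feats)
```

Stacking such layers lets information propagate further across the graph, which is how different GNN variants are derived from this one template.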
A Python library for detecting community structure in graphs. It implements the following algorithms: the Louvain method, the Girvan-Newman algorithm, hierarchical clustering, spectral clustering, and the Bron-Kerbosch algorithm.
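To see what one of those algorithms does, here is Girvan-Newman run via networkx (used here as a stand-in, since the linked library is not named in this blurb); it repeatedly removes the highest-betweenness edge until the graph splits:

```python
import networkx as nx
from networkx.algorithms.community import girvan_newman

# Zachary's karate club: the classic two-community benchmark graph
G = nx.karate_club_graph()

# girvan_newman yields successively finer partitions; take the first split
communities = next(girvan_newman(G))
groups = [sorted(c) for c in communities]
```

The first split recovers the real-world division of the club into two factions, which is why this graph is the standard smoke test for community detection.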