Digests » 71

this week's favorite

Efficient inference for dynamical models

Dynamical systems theory provides a mathematical framework for studying how complex systems evolve over time, such as the neurons in our brains, the global climate system, or engineered cells. But predicting how these systems will behave in the future or how they might respond to disruption requires an accurate model.

Dickey-Fuller test for Time Series Stationarity using Python

The Augmented Dickey-Fuller (ADF) test is a unit root test for stationarity: it checks whether your time series is stationary. A stationary time series is one whose statistical properties, such as mean, variance, and autocorrelation, are constant over time. Such statistics are useful as descriptors of future behavior only if the series is stationary.

Bradley Gram-Hansen

In this post I will go through a powerful Markov Chain Monte Carlo (MCMC) algorithm called Hamiltonian Monte Carlo (HMC) (MC's be in da house) and demonstrate how to implement the algorithm within the PyTorch framework.
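The post implements HMC itself; as a hedged sketch of the core idea (not the author's code), a single HMC transition in PyTorch resamples a momentum, simulates Hamiltonian dynamics with a leapfrog integrator using autograd gradients of the target log-density, and applies a Metropolis accept/reject correction:

```python
import torch

def grad_log_prob(log_prob, q):
    # gradient of the target log-density at q, via autograd
    q = q.detach().requires_grad_(True)
    log_prob(q).backward()
    return q.grad

def hmc_step(log_prob, q, step_size=0.1, n_leapfrog=20):
    # one HMC transition: resample momentum, leapfrog, Metropolis accept
    p = torch.randn_like(q)
    q_new, p_new = q.clone(), p.clone()

    # leapfrog integration: half momentum step, alternating full steps,
    # closing half momentum step
    p_new = p_new + 0.5 * step_size * grad_log_prob(log_prob, q_new)
    for i in range(n_leapfrog):
        q_new = q_new + step_size * p_new
        if i < n_leapfrog - 1:
            p_new = p_new + step_size * grad_log_prob(log_prob, q_new)
    p_new = p_new + 0.5 * step_size * grad_log_prob(log_prob, q_new)

    # Metropolis correction using the Hamiltonian H(q, p)
    def H(q_, p_):
        return -log_prob(q_).detach() + 0.5 * (p_ ** 2).sum()

    if torch.rand(()) < torch.exp(H(q, p) - H(q_new, p_new)):
        return q_new.detach()
    return q.detach()

# usage: sample from a 2-D standard normal
torch.manual_seed(0)
log_prob = lambda x: -0.5 * (x ** 2).sum()
samples, q = [], torch.zeros(2)
for _ in range(1000):
    q = hmc_step(log_prob, q, step_size=0.3, n_leapfrog=10)
    samples.append(q)
samples = torch.stack(samples)
```

The `log_prob`, step size, and leapfrog count here are illustrative choices; in practice they are tuned to the target distribution.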


Transformers 2.0
HuggingFace has just released Transformers 2.0, a library for Natural Language Processing in TensorFlow 2.0 and PyTorch that provides state-of-the-art pre-trained models for the most recent NLP architectures (BERT, GPT-2, XLNet, RoBERTa, DistilBERT, XLM, ...), including several multilingual models.

Natural Language Processing Roadmap

nlp-roadmap is a Natural Language Processing roadmap (mind map) and keyword list for students interested in learning Natural Language Processing. The roadmap covers material from basic probability and statistics to state-of-the-art NLP models.