Digest #111
Take the time to learn something new: get 40% off your entire purchase at manning.com.
this week's favorite
Colab is one of the best products to come from Google. It has made GPUs freely accessible to learners and practitioners like me who otherwise wouldn’t be able to afford a high-end GPU.
Hugging Captions fine-tunes GPT-2, a transformer-based language model by OpenAI, to generate realistic photo captions. All of the transformer stuff is implemented using Hugging Face's Transformers library, hence the name Hugging Captions.
TextAttack is a Python framework for adversarial attacks, data augmentation, and model training in NLP.
Long Short-Term Memory (LSTM) is a popular Recurrent Neural Network (RNN) architecture. This tutorial covers using LSTMs in PyTorch for generating text; in this case, pretty lame jokes.
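The core of such a generator is an embedding, an LSTM, and a sampling loop that feeds each predicted character back in. A minimal, untrained sketch (class and variable names here are illustrative, not the tutorial's):

```python
import torch
import torch.nn as nn

# Toy character-level LSTM of the kind the tutorial builds; the weights
# are random here, so the output is gibberish until trained on jokes.
class CharLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, state=None):
        out, state = self.lstm(self.embed(x), state)
        return self.fc(out), state

vocab = sorted(set("why did the chicken cross the road"))
stoi = {c: i for i, c in enumerate(vocab)}
model = CharLSTM(len(vocab))

# Greedy sampling: predict a character, feed it back, repeat.
idx = torch.tensor([[stoi["w"]]])
state, chars = None, ["w"]
with torch.no_grad():
    for _ in range(10):
        logits, state = model(idx, state)
        idx = logits[:, -1].argmax(dim=-1, keepdim=True)
        chars.append(vocab[idx.item()])
generated = "".join(chars)
print(generated)
```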
Proteins are the workhorses of almost all cellular functions and a core component of life. But despite their versatility, all proteins are built as sequences of the same 20 amino acids. These sequences can be analyzed with tools from NLP. This paper investigates the attention mechanism of a BERT model that has been trained on protein sequence data and discovers that the language model has implicitly learned non-trivial higher-order biological properties of proteins.