this week's favorite
In this blog post we discuss a mental model for RL, based on the idea that RL can be viewed as supervised learning on the “good data”. What makes RL challenging is that, unless you’re doing imitation learning, actually acquiring that “good data” is hard.
PyTorch has become one of the de facto standards for building neural networks, and I love its interface. Yet it can be a little difficult for beginners to get the hang of.
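To give a flavor of that interface, here is a minimal sketch of the standard `nn.Module` pattern; the network name and layer sizes are arbitrary choices for illustration, not taken from the linked tutorial:

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    """A minimal two-layer feed-forward network."""
    def __init__(self):
        super().__init__()
        # Layer sizes (4 -> 8 -> 2) are arbitrary, for illustration only
        self.fc1 = nn.Linear(4, 8)
        self.fc2 = nn.Linear(8, 2)

    def forward(self, x):
        # Define the forward pass; autograd tracks it automatically
        return self.fc2(torch.relu(self.fc1(x)))

net = TinyNet()
out = net(torch.randn(3, 4))  # a batch of 3 inputs with 4 features each
print(out.shape)              # torch.Size([3, 2])
```

Subclassing `nn.Module` and writing a plain Python `forward` method is most of what a beginner needs to learn; optimizers and loss functions plug into the same parameters the module registers.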
The idea is to make short videos that focus on the aspects of NLP that currently work well and are useful. Speech-to-text now works pretty well, so these methods will also be useful for the audio portions of videos.
Recent advances in image generation gave rise to powerful tools for semantic image editing. However, existing approaches can either operate only on a single image or require an abundance of additional information. They are not capable of handling the complete set of editing operations, that is, addition, manipulation, or removal of semantic concepts.
Looking at the machine learning landscape, one of the major trends you can see is the proliferation of projects focused on applying software engineering principles to machine learning. Cortex, for example, recreates the experience of deploying serverless functions, but with inference pipelines. DVC, similarly, implements modern version control and CI/CD pipelines, but for ML.
A practical guide written by the people who do the resume screening: engineering managers and technical recruiters at tech companies. Also, Gergely is giving the book away for free to developers who have lost their jobs due to the current world situation.