Digests » 134
this week's favorite
This session won’t teach you how to become a data scientist. But in this 3-hour on-demand session, you’ll learn how to build on your current developer skills to integrate AI services into your business applications, and set out on your journey toward certification in AI Fundamentals.
As its width tends to infinity, a deep neural network's behavior under gradient descent can become simplified and predictable (e.g. given by the Neural Tangent Kernel (NTK)), if it is parametrized appropriately (e.g. the NTK parametrization). However, we show that the standard and NTK parametrizations of a neural network do not admit infinite-width limits that can learn features; feature learning is crucial for pretraining and transfer learning, as with BERT.
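As a quick illustration of the parametrizations the abstract refers to, here is a minimal sketch (my own, not from the linked paper) contrasting the standard and NTK parametrizations of a single dense layer. In the standard parametrization, weights are initialized with variance 1/n; in the NTK parametrization, weights are initialized with variance 1 and the layer output is rescaled by 1/sqrt(n). Both keep pre-activations O(1) at initialization as the width n grows, but they scale gradient updates differently, which is what drives the different infinite-width limits.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000                    # layer width
x = rng.standard_normal(n)    # fixed input, entries ~ N(0, 1)

# Standard parametrization: W ~ N(0, 1/n), output is W @ x.
W_std = rng.standard_normal(n) / np.sqrt(n)
y_std = W_std @ x

# NTK parametrization: W ~ N(0, 1), output is rescaled by 1/sqrt(n).
W_ntk = rng.standard_normal(n)
y_ntk = (W_ntk @ x) / np.sqrt(n)

# Despite width 10,000, both outputs stay O(1) at initialization.
print(y_std, y_ntk)
```

Running this shows both outputs are order-1 numbers, not growing with n; the paper's point is about what happens *during training* in the two schemes, which this snippet only sets the stage for.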
In this blog post, we focus on Coral, a newly open-sourced SQL translation, analysis, and rewrite engine that we integrate into the Dali Catalog. We use it to virtualize views, making them more accessible, expressive, and understandable to query engines, and to better control their behavior.
Leveraging structure in data is key to making progress in AI, says AI prodigy Gary Marcus. A forward-looking view on Software 2.0, AI chips, robotics, and the future of AI.
This essay provides a broad overview of the sub-field of machine learning interpretability. While not exhaustive, my goal is to review conceptual frameworks, existing research, and future directions.