Digests » 142
this week's favorite
Backpropagation is an algorithm used to adjust parameters of a neural network. It computes the gradient of the loss function with respect to the weights of the network for a single input–output example.
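To make the idea concrete, here is a minimal sketch (not taken from the linked article) of backpropagation for a tiny one-hidden-layer network on a single input-output example; all names, shapes, and values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(3,))        # single input example
y = np.array([1.0])              # single target output

W1 = rng.normal(size=(4, 3))     # hidden-layer weights (illustrative)
W2 = rng.normal(size=(1, 4))     # output-layer weights (illustrative)

# Forward pass
z = W1 @ x                       # hidden pre-activation
h = np.tanh(z)                   # hidden activation
y_hat = W2 @ h                   # prediction
loss = 0.5 * np.sum((y_hat - y) ** 2)

# Backward pass: the chain rule gives the gradient of the loss
# with respect to each weight matrix.
d_yhat = y_hat - y                    # dL/dy_hat
dW2 = np.outer(d_yhat, h)             # dL/dW2
d_h = W2.T @ d_yhat                   # dL/dh
d_z = d_h * (1 - np.tanh(z) ** 2)     # dL/dz (tanh derivative)
dW1 = np.outer(d_z, x)                # dL/dW1
```

A gradient-descent step would then update each weight matrix against its gradient, e.g. `W1 -= lr * dW1`.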
Shapash is a Python library that aims to make machine learning interpretable and understandable by everyone. It provides several types of visualization with explicit labels that anyone can understand.
BudgetML is perfect for practitioners who would like to quickly deploy their models to an endpoint without wasting a lot of time, money, and effort figuring out how to do it end-to-end.
When I started learning Neural Networks from scratch a few years ago, it did not occur to me to just look at some Python code or similar. I found it quite hard to understand all the concepts behind Neural Networks (e.g. Bias, Backpropagation, ...).
Bayesian linear regression is the Bayesian interpretation of linear regression. What does that mean? To answer this question we first have to understand the Bayesian approach. In most of the algorithms we have looked at so far we computed point estimates of our parameters. For example, in linear regression we chose values for the weights and bias that minimized our mean squared error cost function. In the Bayesian approach we don't work with exact values but with probabilities. This allows us to model the uncertainty in our parameter estimates. Why is this important?
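As a concrete illustration (a sketch, not code from the linked article): with a Gaussian prior N(0, 1/alpha · I) on the weights and known noise precision beta, the posterior over the weights is Gaussian with a closed-form mean and covariance. The alpha, beta, and data values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0])                 # illustrative "true" weights
X = rng.normal(size=(50, 2))
y = X @ true_w + rng.normal(scale=0.1, size=50)

alpha, beta = 1.0, 100.0                       # prior precision, noise precision (assumed)
S_N = np.linalg.inv(alpha * np.eye(2) + beta * X.T @ X)  # posterior covariance
m_N = beta * S_N @ X.T @ y                     # posterior mean over the weights

# Unlike a point estimate, the posterior also quantifies uncertainty:
# the predictive variance at a new input x* is 1/beta + x*^T S_N x*.
x_star = np.array([1.0, 1.0])
pred_mean = x_star @ m_N
pred_var = 1.0 / beta + x_star @ S_N @ x_star
```

The posterior mean plays the role of the point estimate from ordinary linear regression, while the covariance `S_N` captures exactly the parameter uncertainty the blurb refers to.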