Latest Posts
-
Influence Functions
Influence functions are a classic technique from robust statistics (1) for understanding and improving machine learning models. By tracing a model’s prediction back to its training data, they identify the training points most responsible for that prediction. This information can be used to improve a model by flagging and removing noisy or irrelevant training data, and to debug it by exposing errors in the training set or in the model’s assumptions. Influence functions are relatively cheap to compute, work with both linear and non-linear models, and have become increasingly popular in machine learning research and practice (2).
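As a concrete sketch, the code below applies the classic upweighting approximation to ridge regression: the effect of removing training point i on the loss at a test point is estimated as (1/n)·∇l_test H⁻¹ ∇l_i, using the full-data Hessian H. All data, names, and hyperparameters here are illustrative, not from any particular post.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ridge-regression problem (synthetic, purely illustrative).
n, d, lam = 40, 3, 0.1
X = rng.normal(size=(n, d))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=n)

def fit(X, y):
    # Closed-form minimiser of (1/n)·Σ(xᵢᵀθ − yᵢ)² + lam·‖θ‖²
    return np.linalg.solve(X.T @ X / len(y) + lam * np.eye(d), X.T @ y / len(y))

theta = fit(X, y)

# Hessian of the regularised objective and per-example loss gradients ∇l_i.
H = 2 * (X.T @ X / n + lam * np.eye(d))
resid = X @ theta - y
grads = 2 * X * resid[:, None]            # one gradient row per training point

# Influence of each training point on the loss at a held-out test point.
x_t, y_t = rng.normal(size=d), 0.0
g_test = 2 * x_t * (x_t @ theta - y_t)
influence = -grads @ np.linalg.solve(H, g_test)   # I(z_i, z_test)

# Predicted change in test loss if point i were removed: −I(z_i, z_test)/n.
pred_delta = -influence / n
```

Ranking training points by `pred_delta` surfaces the examples whose removal would most change the test prediction, without retraining the model n times.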
-
Notes on Forward Forward Algorithm
The forward-forward algorithm is a novel method for training neural networks, proposed as an alternative to backpropagation: each layer is trained locally, with the backward pass replaced by two forward passes, one on positive (real) data and one on negative data.
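A minimal sketch of the core idea, assuming a single ReLU layer trained with the usual "goodness" objective (sum of squared activations, pushed above a threshold for positive data and below it for negative data). The toy data, threshold, and all names below are illustrative; real forward-forward setups embed the label in the input and stack several such layers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy positive/negative data: two Gaussian blobs (illustrative only).
d, n = 10, 200
pos = rng.normal(loc=+1.0, size=(n, d))
neg = rng.normal(loc=-1.0, size=(n, d))

W = rng.normal(scale=0.1, size=(8, d))
b = np.zeros(8)
theta, lr = 2.0, 0.03          # goodness threshold and learning rate

def forward(x):
    pre = x @ W.T + b
    h = np.maximum(pre, 0.0)               # ReLU activations
    return pre, h, (h ** 2).sum(-1)        # goodness = Σ h²

for _ in range(300):
    for x, sign in ((pos, +1.0), (neg, -1.0)):
        pre, h, g = forward(x)
        # Logistic loss on sign·(g − θ); dL/dg in closed form.
        dg = -sign / (1.0 + np.exp(sign * (g - theta)))
        dpre = (dg[:, None] * 2 * h) * (pre > 0)   # chain rule through Σh² and ReLU
        W -= lr * dpre.T @ x / len(x)
        b -= lr * dpre.mean(0)
```

Each layer needs only its own activations to compute this update, which is what lets the method dispense with a global backward pass.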
-
Early detection of 3D printing issues
This Kaggle competition focused on a particular kind of anomaly: under-extrusion. Under-extrusion occurs when a 3D printer doesn’t supply enough filament for the print job, which can leave gaps, weak structures, or incomplete layers in the printed object. The objective was to detect this defect from images captured by a variety of 3D printers.
-
Notes on Predictive Forward-Forward Paper
Predictive Forward-Forward Learning