Methods for conformal/probabilistic prediction (Conformal Predictors, Venn-Predictors) are wrappers around machine learning algorithms that provide guarantees about their predictions; generally, their sole assumption is that the data are exchangeable (weaker than the standard i.i.d. assumption). They were first proposed in the book “Algorithmic Learning in a Random World” (Vovk, Gammerman, Shafer; 2005).
Conformal Predictors (CP) allow limiting the errors committed by a learning algorithm (the “underlying algorithm”) to a desired significance level ε: in a classification setting, where predictions are sets of labels, their accuracy is guaranteed to be at least 1−ε.
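A minimal sketch of an inductive (split) conformal predictor may help fix ideas. Everything here is illustrative: the toy Gaussian data and the nonconformity measure (distance to the class mean) are assumptions made for the example, not part of the methods or libraries mentioned on this page.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data from two Gaussian classes (hypothetical example).
n = 200
X = np.concatenate([rng.normal(-2.0, 1.0, n), rng.normal(2.0, 1.0, n)])
y = np.concatenate([np.zeros(n, dtype=int), np.ones(n, dtype=int)])
perm = rng.permutation(2 * n)
X, y = X[perm], y[perm]

# Inductive (split) CP: a proper training set and a calibration set.
X_tr, y_tr = X[:200], y[:200]
X_cal, y_cal = X[200:], y[200:]

# A trivial "underlying algorithm": the nonconformity of (x, c) is the
# distance of x to the training mean of class c.
means = {c: X_tr[y_tr == c].mean() for c in (0, 1)}

def score(x, c):
    return abs(x - means[c])

# Nonconformity scores of the calibration examples with their true labels.
cal_scores = np.array([score(x, c) for x, c in zip(X_cal, y_cal)])

def prediction_set(x_new, eps):
    """All labels whose conformal p-value exceeds the significance level eps."""
    labels = []
    for c in (0, 1):
        a = score(x_new, c)
        p = (np.sum(cal_scores >= a) + 1) / (len(cal_scores) + 1)
        if p > eps:
            labels.append(c)
    return labels

print(prediction_set(-2.0, 0.05))  # a point deep inside class 0
print(prediction_set(0.0, 0.05))   # a point between the two classes
```

Under exchangeability, the set `prediction_set(x, eps)` misses the true label with probability at most `eps`; ambiguous points are handled by emitting a larger set rather than by being wrong more often.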
Venn-Predictors (VP) output a set of probability distributions on the labels as a prediction for a new object; one of these distributions is guaranteed to be perfectly calibrated.
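The following sketch shows the mechanics of a very simple Venn predictor. The taxonomy (assigning an example to the class whose training mean is nearest) and the toy data are assumptions made for illustration; real Venn taxonomies are chosen to suit the underlying algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D data from two Gaussian classes (hypothetical example).
n = 100
X = np.concatenate([rng.normal(-2.0, 1.0, n), rng.normal(2.0, 1.0, n)])
y = np.concatenate([np.zeros(n, dtype=int), np.ones(n, dtype=int)])
perm = rng.permutation(2 * n)
X, y = X[perm], y[perm]

X_tr, y_tr = X[:100], y[:100]
X_cal, y_cal = X[100:], y[100:]

# Venn taxonomy: an example's category is the class whose training
# mean is closest to it.
means = {c: X_tr[y_tr == c].mean() for c in (0, 1)}

def taxonomy(x):
    return min((0, 1), key=lambda c: abs(x - means[c]))

def venn_predict(x_new):
    """One probability distribution over the labels per candidate label."""
    dists = {}
    k = taxonomy(x_new)
    cats = np.array([taxonomy(x) for x in np.append(X_cal, x_new)])
    for y_cand in (0, 1):
        # Tentatively label the new object y_cand and add it to the bag.
        ys = np.append(y_cal, y_cand)
        in_cat = ys[cats == k]  # labels sharing the new object's category
        dists[y_cand] = np.array([(in_cat == c).mean() for c in (0, 1)])
    return dists

for y_cand, dist in venn_predict(-2.0).items():
    print(y_cand, dist)
```

Each candidate label yields one empirical distribution of labels within the new object's category; the guarantee is that at least one of the output distributions is perfectly calibrated.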
- Main Python implementation of Conformal Prediction: https://github.com/donlnz/nonconformist
- My Rust implementation of Conformal Prediction-related methods, aiming for correctness and performance: https://github.com/gchers/random-world
- (Unmaintained) My old Python implementation of Conformal Prediction: https://github.com/gchers/cpy
A list of works on CP:
- Majority vote ensembles of conformal predictors. Machine Learning, 2018.
- Exchangeability martingales for selecting features in anomaly detection. In Proceedings of the Seventh Workshop on Conformal and Probabilistic Prediction and Applications, 2018.
- Conformal Clustering and Its Application to Botnet Traffic. In Statistical Learning and Data Sciences, Third International Symposium (SLDS 2015), Egham, UK, April 20-23, 2015, Proceedings.