awesome-confidence-calibration

An awesome list of papers on confidence calibration.

  • [ICML 2017] On Calibration of Modern Neural Networks paper (temperature scaling; see the sketch after this list)

  • [ALMC 1999] Probabilistic Outputs for Support Vector Machines and Comparisons to Regularized Likelihood Methods paper

  • [ICLR 2020] Distance-Based Learning from Errors for Confidence Calibration paper

  • [arXiv 2019] Confidence Calibration for Convolutional Neural Networks Using Structured Dropout paper

  • [ICML 2016] Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning paper code

  • [NIPS 2020] Improving Model Calibration with Accuracy versus Uncertainty Optimization paper code

  • [AISTATS 2011] Approximate Inference for the Loss-Calibrated Bayesian paper

  • [arXiv 2018] Loss-Calibrated Approximate Inference in Bayesian Neural Networks paper

  • [CVPR 2019] Learning for Single-Shot Confidence Calibration in Deep Neural Networks through Stochastic Inferences paper

  • [ICML 2005] Predicting Good Probabilities with Supervised Learning paper

  • [KDD 2002] Transforming Classifier Scores into Accurate Multiclass Probability Estimates paper

  • [NIPS 2017] On Fairness and Calibration paper

  • [arXiv 2016] Approximating Likelihood Ratios with Calibrated Discriminative Classifiers paper

  • [NIPS 2020] Calibrating Deep Neural Networks using Focal Loss paper

  • [NIPS 2019] Verified Uncertainty Calibration paper code

  • [CVPR 2021] Improving Calibration for Long-Tailed Recognition paper code

  • [CVPR 2021] Post-hoc Uncertainty Calibration for Domain Drift Scenarios paper code

  • [NIPS 2017] Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles paper

  • [NIPS 2019] Addressing Failure Prediction by Learning Model Confidence paper code

  • [ICML 2018] Accurate Uncertainties for Deep Learning Using Calibrated Regression paper

  • [ICLR 2018] Training Confidence-calibrated Classifiers for Detecting Out-of-Distribution Samples paper code

  • [arXiv 2021] On the Calibration and Uncertainty of Neural Learning to Rank Models paper code

  • [Classic] Isotonic Regression paper paper paper (see the scikit-learn sketch after this list)

  • [NIPS 2019] Beyond Temperature Scaling: Obtaining Well-Calibrated Multi-Class Probabilities with Dirichlet Calibration paper

  • [ICML 2001] Obtaining Calibrated Probability Estimates from Decision Trees and Naive Bayesian Classifiers paper

  • [AISTATS 2017] Beyond Sigmoids: How to Obtain Well-Calibrated Probabilities from Binary Classifiers with Beta Calibration paper code

  • [arXiv 2015] Binary Classifier Calibration Using an Ensemble of Near Isotonic Regression Models paper

  • [ECML PKDD 2019] Non-parametric Bayesian Isotonic Calibration: Fighting Over-Confidence in Binary Classification paper code

  • [AAAI 2015] Obtaining Well Calibrated Probabilities Using Bayesian Binning paper code

  • [arXiv 2021] Distribution-Free Calibration Guarantees for Histogram Binning without Sample Splitting paper code

  • [ICML 2012] Predicting Accurate Probabilities with a Ranking Loss paper
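
Two recurring techniques in the list above lend themselves to tiny reference implementations. First, temperature scaling from the ICML 2017 entry: a single scalar T > 0 is fitted on held-out logits by minimizing negative log-likelihood, and the result is usually reported as expected calibration error (ECE), the binning metric from the AAAI 2015 entry. The sketch below uses NumPy/SciPy; the function names, the search bounds, and the 15-bin setting are illustrative assumptions, not code from any listed paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def softmax(logits):
    z = logits - logits.max(axis=1, keepdims=True)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_temperature(val_logits, val_labels):
    """Fit a scalar T > 0 by minimizing NLL on held-out (logits, labels)."""
    def nll(T):
        probs = softmax(val_logits / T)
        return -np.mean(np.log(probs[np.arange(len(val_labels)), val_labels] + 1e-12))
    # Bounded 1-D search; the (0.05, 10) range is an assumed, generous bracket.
    return minimize_scalar(nll, bounds=(0.05, 10.0), method="bounded").x

def expected_calibration_error(probs, labels, n_bins=15):
    """ECE: bin predictions by confidence, average |accuracy - confidence| gaps."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():  # weight each bin's gap by its share of samples
            ece += in_bin.mean() * abs(correct[in_bin].mean() - conf[in_bin].mean())
    return ece

# Usage sketch: fit T on validation logits, then score calibrated test probabilities.
# T = fit_temperature(val_logits, val_labels)
# print(expected_calibration_error(softmax(test_logits / T), test_labels))
```

Second, the binary calibration maps behind the Platt (ALMC 1999), isotonic regression, and KDD 2002 entries can be sketched with scikit-learn; the synthetic scores here are placeholders purely for illustration.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
val_scores = rng.uniform(-3.0, 3.0, 500)   # stand-in classifier scores
val_labels = (rng.uniform(size=500) < 1.0 / (1.0 + np.exp(-val_scores))).astype(int)
test_scores = rng.uniform(-3.0, 3.0, 200)

# Platt scaling: fit a sigmoid sigma(a*s + b) to held-out scores.
platt = LogisticRegression().fit(val_scores.reshape(-1, 1), val_labels)
platt_probs = platt.predict_proba(test_scores.reshape(-1, 1))[:, 1]

# Isotonic regression: fit a monotone, piecewise-constant calibration map.
iso = IsotonicRegression(out_of_bounds="clip").fit(val_scores, val_labels)
iso_probs = iso.predict(test_scores)
```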

Resources

  • Statistical Decision Theory and Bayesian Analysis book

  • A Tutorial on Learning With Bayesian Networks paper

Applications

  • [AAAI 2021] Learning to Cascade: Confidence Calibration for Improving the Accuracy and Computational Cost of Cascade Inference Systems paper
