
Uncertainty metrics #3

Open
@deebuls

Description

https://openreview.net/pdf?id=XOuAOv_-5Fx

UNCERTAINTY CALIBRATION ERROR: A NEW METRIC FOR MULTI-CLASS CLASSIFICATION

The paper was rejected; see the reviews at https://openreview.net/forum?id=XOuAOv_-5Fx

The paper discusses the different metrics used for classification uncertainty estimation, the problems with them, and proposes a better metric.

The paper gives a complete picture of the available metrics in Chapter 3:

  1. Expected Calibration Error
  2. Adaptive Calibration Error
  3. Negative Log-Likelihood
  4. Brier Score
  5. Maximum Mean Calibration Error
  6. Uncertainty Calibration Error -> the new metric proposed by the paper

Python code for all these metrics can be found in the repository below. Some of them are already being used by @satwick.

https://github.com/JeremyNixon/uncertainty-metrics-1
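For reference, here is a minimal sketch of the two most standard metrics from the list above, Expected Calibration Error and the Brier score, written from their textbook definitions (not taken from the paper or the linked repository, so names and binning details are my own choices):

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE: bin samples by top-1 confidence, then average the
    |accuracy - confidence| gap per bin, weighted by bin size."""
    probs = np.asarray(probs, dtype=float)
    labels = np.asarray(labels)
    confidences = probs.max(axis=1)        # top-1 confidence per sample
    predictions = probs.argmax(axis=1)     # predicted class per sample
    accuracies = (predictions == labels).astype(float)

    ece = 0.0
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            gap = abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
            ece += in_bin.mean() * gap     # weight by fraction of samples
    return ece

def brier_score(probs, labels):
    """Multi-class Brier score: mean squared error between the
    predicted probability vector and the one-hot true label."""
    probs = np.asarray(probs, dtype=float)
    onehot = np.eye(probs.shape[1])[np.asarray(labels)]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))
```

A perfectly calibrated, fully confident classifier scores 0 on both; e.g. `expected_calibration_error([[1.0, 0.0]], [0])` and `brier_score([[1.0, 0.0]], [0])` both return 0.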
