# Addons - Metrics

## Maintainers

| Submodule                    | Maintainers             | Contact Info        |
|:-----------------------------|:------------------------|:--------------------|
| cohens_kappa                 | Aakash Nain             | [email protected] |
| f_scores                     | Saishruthi Swaminathan  | [email protected] |
| r_square                     | Saishruthi Swaminathan  | [email protected] |
| multilabel_confusion_matrix  | Saishruthi Swaminathan  | [email protected] |

## Contents

| Submodule                    | Metric                        | Reference       |
|:-----------------------------|:------------------------------|:----------------|
| cohens_kappa                 | CohenKappa                    | Cohen's Kappa   |
| f_scores                     | F1 micro, macro and weighted  | F1 Score        |
| r_square                     | RSquare                       | R-Square        |
| multilabel_confusion_matrix  | Multilabel Confusion Matrix   | mcm             |

## Contribution Guidelines

### Standard API

To conform to the current API standard, all metrics must (see the sketch after this list):

* Inherit from `tf.keras.metrics.Metric`.
* Register as a Keras global object so it can be serialized properly.
* Add the addon to the `py_library` in this sub-package's BUILD file.
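
A minimal sketch of what this looks like, assuming a hypothetical `SimpleMeanError` metric and TensorFlow's `tf.keras.utils.register_keras_serializable` decorator for registration (the registration helper actually used in this repository may differ):

```python
import tensorflow as tf


# Hypothetical example metric; not part of this sub-package.
@tf.keras.utils.register_keras_serializable(package="Addons")
class SimpleMeanError(tf.keras.metrics.Metric):
    """Tracks the mean absolute difference between y_true and y_pred."""

    def __init__(self, name="simple_mean_error", dtype=tf.float32, **kwargs):
        super().__init__(name=name, dtype=dtype, **kwargs)
        # Running sum of absolute errors and number of elements seen so far.
        self.total = self.add_weight("total", initializer="zeros")
        self.count = self.add_weight("count", initializer="zeros")

    def update_state(self, y_true, y_pred, sample_weight=None):
        y_true = tf.cast(y_true, self.dtype)
        y_pred = tf.cast(y_pred, self.dtype)
        errors = tf.abs(y_true - y_pred)
        self.total.assign_add(tf.reduce_sum(errors))
        self.count.assign_add(tf.cast(tf.size(errors), self.dtype))

    def result(self):
        # Mean absolute error; returns 0 if no samples have been seen.
        return tf.math.divide_no_nan(self.total, self.count)
```

Registering the class with Keras' serialization machinery is what allows models that reference the metric by name to be saved and reloaded with `tf.keras.models.load_model`.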

### Testing Requirements

* Simple unittests that demonstrate the metric is behaving as expected (see the sketch after this list).
* When applicable, run all unittests with TensorFlow's `@run_in_graph_and_eager_modes` (for test method) or `run_all_in_graph_and_eager_modes` (for `TestCase` subclass) decorator.
* Add a `py_test` to this sub-package's BUILD file.
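
A sketch of such a test for the hypothetical `SimpleMeanError` metric above; `run_all_in_graph_and_eager_modes` lives in TensorFlow's internal `test_util` module, so the import path shown here is an assumption and may change across TensorFlow versions:

```python
import tensorflow as tf
from tensorflow.python.framework import test_util

# SimpleMeanError is the hypothetical metric sketched in the previous section.


@test_util.run_all_in_graph_and_eager_modes
class SimpleMeanErrorTest(tf.test.TestCase):
    """Simple unittests demonstrating expected metric behavior."""

    def test_perfect_predictions_give_zero_error(self):
        metric = SimpleMeanError()
        self.evaluate(tf.compat.v1.variables_initializer(metric.variables))
        self.evaluate(metric.update_state([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
        self.assertAllClose(0.0, self.evaluate(metric.result()))

    def test_known_error(self):
        metric = SimpleMeanError()
        self.evaluate(tf.compat.v1.variables_initializer(metric.variables))
        self.evaluate(metric.update_state([0.0, 0.0], [1.0, 3.0]))
        # Mean absolute error is (1 + 3) / 2 = 2.
        self.assertAllClose(2.0, self.evaluate(metric.result()))


if __name__ == "__main__":
    tf.test.main()
```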

### Documentation Requirements

* Update the table of contents in this sub-package's README.