Development

This section describes how to integrate a new component, such as an algorithm, auditor, or distance metric, into the codebase:

  • Auditors
  • Distances
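As a rough illustration of the extension pattern the pages above describe, a new component typically subclasses a shared base class so that the rest of the package can use it through a common interface. The sketch below is a pure-Python analogy only: the base-class name `Distance`, the method name `forward`, and the signatures are illustrative assumptions, not inFairness's actual API — consult the Auditors and Distances pages for the real interfaces.

```python
from abc import ABC, abstractmethod
import math


# Hypothetical base class illustrating the plug-in pattern; the real
# inFairness base classes and method signatures may differ.
class Distance(ABC):
    """Interface every distance metric is assumed to implement."""

    @abstractmethod
    def forward(self, x, y):
        """Return the distance between two feature vectors."""
        raise NotImplementedError


class EuclideanDistance(Distance):
    """Example of a new plug-in metric: plain Euclidean distance."""

    def forward(self, x, y):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))


# Callers (e.g. an auditor or training algorithm) would interact with
# the metric only through the shared interface, so a new subclass
# integrates without changes elsewhere.
metric = EuclideanDistance()
print(metric.forward([0.0, 0.0], [3.0, 4.0]))  # 5.0
```

The key design point is that downstream code depends only on the abstract interface, so adding a component is a matter of subclassing and registering it, not modifying existing algorithms.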
Copyright © 2022, IBM Research