FairMOE: counterfactually-fair mixture of experts with levels of interpretability.

Joe Germino, Nuno Moniz, Nitesh V. Chawla
Published in: Machine Learning (2024)
Keyphrases
  • mixture model
  • levels of abstraction
  • case study
  • expert advice
  • high level
  • probabilistic model
  • expectation maximization
  • prediction accuracy
  • domain experts