FairMOE: counterfactually-fair mixture of experts with levels of interpretability.
Joe Germino
Nuno Moniz
Nitesh V. Chawla
Published in: Mach. Learn. (2024)
Keyphrases
mixture model
levels of abstraction
case study
expert advice
high level
probabilistic model
expectation maximization
prediction accuracy
domain experts