A combinatorial multi-armed bandit approach to correlation clustering.

Francesco Gullo, Domenico Mandaglio, Andrea Tagarelli
Published in: Data Min. Knowl. Discov. (2023)
Keyphrases
  • correlation clustering
  • multi armed bandit
  • multi armed bandits
  • reinforcement learning
  • hierarchical clustering
  • hard constraints
  • map inference
  • constrained clustering
  • graphical models
  • regret bounds
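
The title and keyphrases point to a combinatorial multi-armed bandit (CMAB) formulation of correlation clustering, where edge similarities are unknown and learned from feedback while clusterings are repeatedly produced. The sketch below is not the authors' algorithm; it is a minimal illustrative loop, assuming UCB1-style optimistic edge estimates and a pivot-style clustering step, with all names (`true_prob`, `ucb`, `pivot_cluster`) hypothetical.

```python
import random
import math
from collections import defaultdict

# Toy instance: nodes and unknown edge "agreement" probabilities (hypothetical setup,
# not taken from the paper).
random.seed(0)
nodes = list(range(8))
edges = [(u, v) for i, u in enumerate(nodes) for v in nodes[i + 1:]]
true_prob = {e: random.random() for e in edges}  # hidden from the learner

pulls = defaultdict(int)      # times each edge (base arm) has been observed
rewards = defaultdict(float)  # cumulative observed agreements per edge

def ucb(e, t):
    """Optimistic (UCB1-style) estimate of the edge agreement probability."""
    if pulls[e] == 0:
        return 1.0
    mean = rewards[e] / pulls[e]
    return min(1.0, mean + math.sqrt(2 * math.log(t + 1) / pulls[e]))

def pivot_cluster(weights):
    """Pivot-style correlation clustering on thresholded edge estimates."""
    remaining = set(nodes)
    clusters = []
    while remaining:
        p = random.choice(sorted(remaining))
        cluster = {p} | {v for v in remaining
                         if v != p and weights[tuple(sorted((p, v)))] >= 0.5}
        clusters.append(cluster)
        remaining -= cluster
    return clusters

for t in range(200):
    # Super-arm selection: cluster using optimistic edge estimates.
    est = {e: ucb(e, t) for e in edges}
    clusters = pivot_cluster(est)
    # Observe bandit feedback on the intra-cluster edges that were "played".
    for c in clusters:
        for u in c:
            for v in c:
                if u < v:
                    e = (u, v)
                    pulls[e] += 1
                    rewards[e] += 1.0 if random.random() < true_prob[e] else 0.0

# Cluster once more with the empirical means learned so far.
final_est = {e: rewards[e] / max(pulls[e], 1) for e in edges}
print("final clustering:", pivot_cluster(final_est))
```

This is only a sketch of the general CMAB-plus-clustering interaction pattern suggested by the keyphrases (multi-armed bandits, correlation clustering, regret bounds); the paper's actual oracle, feedback model, and regret analysis should be consulted in the original source.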