Self-Distillation into Self-Attention Heads for Improving Transformer-based End-to-End Neural Speaker Diarization.

Ye-Rin Jeoung, Jeong-Hwan Choi, Ju-Seok Seong, Jehyun Kyung, Joon-Hyuk Chang
Published in: INTERSPEECH (2023)