CorrMAE: Pre-training Correspondence Transformers with Masked Autoencoder.

Tangfei Liao, Xiaoqin Zhang, Guobao Xiao, Min Li, Tao Wang, Mang Ye
Published in: CoRR (2024)