Correspondence Attention Transformer: A Context-Sensitive Network for Two-View Correspondence Learning.
Jiayi Ma, Yang Wang, Aoxiang Fan, Guobao Xiao, Riqing Chen
Published in: IEEE Trans. Multim. (2023)
Keyphrases
context sensitive
learning process
learning algorithm
multiple task learning
supervised learning
reinforcement learning
learning tasks
inductive transfer
active learning
inductive learning
prior knowledge
probabilistic model
network structure
multi task learning
multiple tasks