CS-UM6P at SemEval-2021 Task 1: A Deep Learning Model-based Pre-trained Transformer Encoder for Lexical Complexity.
Nabil El Mamoun
Abdelkader El Mahdaouy
Abdellah El Mekki
Kabil Essefar
Ismail Berrada
Published in: SemEval@ACL/IJCNLP (2021)
Keyphrases
deep learning
pre-trained
unsupervised learning
machine learning
natural language processing
training data
data sets
semantic relations
image segmentation
supervised learning
training examples
mental models
control signals