Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction
Christoph Alt, Marc Hübner, Leonhard Hennig. Published in: ACL (1) (2019)
Keyphrases
- fine-tuning
- language model
- relation extraction
- pre-trained
- language modeling
- automatic extraction
- information extraction
- named entities
- training data
- training examples
- supervised learning
- domain-specific
- n-gram
- information retrieval
- semantic relations
- probabilistic model
- named entity recognition
- query expansion
- retrieval model
- speech recognition
- question answering
- semi-supervised
- unsupervised learning
- semantic features
- learning algorithm
- context-sensitive
- relevance model
- active learning
- labeled data
- co-occurrence
- machine learning
- unlabeled data
- feature space
- natural language
- feature extraction
- decision trees
- feature selection