Transformer-Based LM Surprisal Predicts Human Reading Times Best with About Two Billion Training Tokens
Byung-Doh Oh
William Schuler
Published in: CoRR (2023)