
ByT5: Towards a Token-Free Future with Pre-trained Byte-to-Byte Models.

Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel
Published in: Trans. Assoc. Comput. Linguistics (2022)
Keyphrases
  • pre-trained
  • real time
  • neural network
  • image sequences
  • training data
  • pairwise
  • viewpoint
  • probabilistic model