ByT5: Towards a Token-Free Future with Pre-trained Byte-to-Byte Models
Linting Xue
Aditya Barua
Noah Constant
Rami Al-Rfou
Sharan Narang
Mihir Kale
Adam Roberts
Colin Raffel
Published in:
Trans. Assoc. Comput. Linguistics (2022)
Keyphrases
pre-trained
real-time
neural network
image sequences
training data
pairwise
viewpoint
probabilistic model