Goro Kobayashi
Publication Activity (10 Years)
Years Active: 2020-2024
Publications (10 Years): 12
Top Topics
Word Frequency
Language Model
Automatic Summarization
Context Sensitive
Top Venues
CoRR
EMNLP
ACL (Findings)
Publications
Goro Kobayashi, Tatsuki Kuribayashi, Sho Yokoi, Kentaro Inui: Analyzing Feed-Forward Blocks in Transformers through the Lens of Attention Maps. ICLR (2024)
Goro Kobayashi, Tatsuki Kuribayashi, Sho Yokoi, Kentaro Inui: Transformer Language Models Handle Word Frequency in Prediction Head. ACL (Findings) (2023)
Mengyu Ye, Tatsuki Kuribayashi, Jun Suzuki, Goro Kobayashi, Hiroaki Funayama: Assessing Step-by-Step Reasoning against Lexical Negation: A Case Study on Syllogism. CoRR (2023)
Hiroto Kurita, Goro Kobayashi, Sho Yokoi, Kentaro Inui: Contrastive Learning-based Sentence Encoders Implicitly Weight Informative Words. EMNLP (Findings) (2023)
Goro Kobayashi, Tatsuki Kuribayashi, Sho Yokoi, Kentaro Inui: Transformer Language Models Handle Word Frequency in Prediction Head. CoRR (2023)
Goro Kobayashi, Tatsuki Kuribayashi, Sho Yokoi, Kentaro Inui: Feed-Forward Blocks Control Contextualization in Masked Language Models. CoRR (2023)
Mengyu Ye, Tatsuki Kuribayashi, Jun Suzuki, Goro Kobayashi, Hiroaki Funayama: Assessing Step-by-Step Reasoning against Lexical Negation: A Case Study on Syllogism. EMNLP (2023)
Hiroto Kurita, Goro Kobayashi, Sho Yokoi, Kentaro Inui: Contrastive Learning-based Sentence Encoders Implicitly Weight Informative Words. CoRR (2023)
Goro Kobayashi, Tatsuki Kuribayashi, Sho Yokoi, Kentaro Inui: Incorporating Residual and Normalization Layers into Analysis of Masked Language Models. CoRR (2021)
Goro Kobayashi, Tatsuki Kuribayashi, Sho Yokoi, Kentaro Inui: Incorporating Residual and Normalization Layers into Analysis of Masked Language Models. EMNLP (1) (2021)
Goro Kobayashi, Tatsuki Kuribayashi, Sho Yokoi, Kentaro Inui: Attention Module is Not Only a Weight: Analyzing Transformers with Vector Norms. CoRR (2020)
Goro Kobayashi, Tatsuki Kuribayashi, Sho Yokoi, Kentaro Inui: Attention is Not Only a Weight: Analyzing Transformers with Vector Norms. EMNLP (1) (2020)