TaiYu Cheng
Publication Activity (10 Years)
Years Active: 2019-2022
Publications (10 Years): 7
Top Topics
Design Methodology
Neural Network Training
Critical Path
Energy Minimization
Top Venues
IEICE Trans. Fundam. Electron. Commun. Comput. Sci.
PATMOS
ASP-DAC
DATE
Publications
TaiYu Cheng, Yutaka Masuda, Jun Nagayama, Yoichi Momiyama, Jun Chen, Masanori Hashimoto: Activation-Aware Slack Assignment Based Mode-Wise Voltage Scaling for Energy Minimization. IEICE Trans. Fundam. Electron. Commun. Comput. Sci. (3) (2022)
Yutaka Masuda, Jun Nagayama, TaiYu Cheng, Tohru Ishihara, Yoichi Momiyama, Masanori Hashimoto: Low-Power Design Methodology of Voltage Over-Scalable Circuit with Critical Path Isolation and Bit-Width Scaling. IEICE Trans. Fundam. Electron. Commun. Comput. Sci. (3) (2022)
Yutaka Masuda, Jun Nagayama, TaiYu Cheng, Tohru Ishihara, Yoichi Momiyama, Masanori Hashimoto: Critical Path Isolation and Bit-Width Scaling Are Highly Compatible for Voltage Over-Scalable Design. DATE (2021)
TaiYu Cheng, Masanori Hashimoto: Minimizing Energy of DNN Training with Adaptive Bit-Width and Voltage Scaling. ISCAS (2021)
TaiYu Cheng, Yutaka Masuda, Jun Nagayama, Yoichi Momiyama, Jun Chen, Masanori Hashimoto: Mode-wise Voltage-scalable Design with Activation-aware Slack Assignment for Energy Minimization. ASP-DAC (2021)
TaiYu Cheng, Yutaka Masuda, Jun Chen, Jaehoon Yu, Masanori Hashimoto: Logarithm-approximate floating-point multiplier is applicable to power-efficient neural network training. Integr. 74 (2020)
TaiYu Cheng, Jaehoon Yu, Masanori Hashimoto: Minimizing Power for Neural Network Training with Logarithm-Approximate Floating-Point Multiplier. PATMOS (2019)