Transformer-BLS: An efficient learning algorithm based on multi-head attention mechanism and incremental learning algorithms.
Rongrong Fu, Haifeng Liang, Shiwei Wang, Chengcheng Jia, Guangbin Sun, Tengfei Gao, Dan Chen, Yaodong Wang
Published in: Expert Syst. Appl. (2024)
Keyphrases
- learning algorithm
- attention mechanism
- machine learning algorithms
- training data
- machine learning
- back propagation
- batch mode
- active learning
- visual attention
- learning scheme
- learning problems
- incremental learning
- training examples
- supervised learning
- generalization error
- real time
- learning tasks
- reinforcement learning
- fault diagnosis
- fuzzy logic
- neural network
- learning process
- image processing
- saliency map
- denoising
- sample complexity
- hypothesis space
- perceptron algorithm
- high quality