Optimizing a Hardware Network Stack to Realize an In-Network ML Inference Application
Marco Hartmann, Lukas Weber, Johannes Wirth, Lukas Sommer, Andreas Koch
Published in: H2RC@SC (2021)
Keyphrases
- abstraction layer
- network structure
- computer networks
- network architecture
- peer to peer
- application level
- low cost
- maximum likelihood
- network resources
- network design
- network model
- link prediction
- content addressable
- resource manager
- network size
- data transfer
- data flow
- communication cost
- real time
- network traffic
- complex networks
- cloud computing
- expectation maximization
- learning algorithm
- neural network