Limits on the Performance Benefits of Multithreading and Prefetching
Beng-Hong Lim, Ricardo Bianchini
Published in: SIGMETRICS (1996)
Keyphrases
- prefetching
- multithreading
- cache misses
- response time
- access patterns
- parallel computing
- access latency
- caching scheme
- highly efficient
- computational power
- shared memory
- message passing
- coarse grained
- memory efficient
- parallel machines
- distributed memory
- scheduling algorithm
- parallel processing
- parallel algorithm
- data structure
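The keyphrases center on the paper's subject: hiding shared-memory access latency, for which it compares multithreading against prefetching. As a minimal sketch of the prefetching side, the loop below issues software prefetches ahead of the access stream so cache misses overlap with computation. This is an illustrative example, not code from the paper: the GCC/Clang builtin `__builtin_prefetch` and the prefetch distance `PREFETCH_DIST` are assumptions chosen for the sketch.

```c
#include <stddef.h>

/* Illustrative prefetch distance (in elements); a real value
   would be tuned to the machine's miss latency. */
#define PREFETCH_DIST 16

/* Sum an array while prefetching PREFETCH_DIST elements ahead,
   so the miss latency of a[i + PREFETCH_DIST] overlaps with the
   additions on earlier elements. */
long sum_with_prefetch(const long *a, size_t n) {
    long sum = 0;
    for (size_t i = 0; i < n; i++) {
        if (i + PREFETCH_DIST < n)
            __builtin_prefetch(&a[i + PREFETCH_DIST], /*rw=*/0, /*locality=*/1);
        sum += a[i];
    }
    return sum;
}
```

The result is identical to a plain sum; only the memory-system behavior differs, which is why such techniques trade instruction overhead for reduced stall time rather than changing program output.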