
A Primer on Hardware Prefetching

About A Primer on Hardware Prefetching

Since the 1970s, microprocessor-based digital platforms have been riding Moore's law, which allows transistor density to double for the same area roughly every two years. However, whereas microprocessor fabrication has focused on increasing instruction execution rates, memory fabrication technologies have focused primarily on increasing capacity, with negligible gains in speed. This divergent performance trend between processors and memory has led to a phenomenon referred to as the "Memory Wall." To overcome the memory wall, designers have resorted to a hierarchy of cache memory levels, which rely on the principle of memory access locality to reduce the observed memory access time and the performance gap between processors and memory. Unfortunately, important workload classes exhibit adverse memory access patterns that baffle the simple policies built into modern cache hierarchies for moving instructions and data across cache levels. As a result, processors often spend much time idling on demand fetches of memory blocks that miss in the higher cache levels. Prefetching, which predicts future memory accesses and issues requests for the corresponding memory blocks in advance of explicit accesses, is an effective approach to hiding memory access latency. A myriad of prefetching techniques have been proposed, and nearly every modern processor includes some hardware prefetching mechanism targeting simple and regular memory access patterns. This primer offers an overview of the various classes of hardware prefetchers for instructions and data proposed in the research literature, and presents examples of techniques incorporated into modern microprocessors.
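To make the idea of targeting "simple and regular memory access patterns" concrete, the following minimal C sketch models a stride prefetcher, one common class of hardware prefetcher. It is purely illustrative and not taken from the book; the table layout, sizes, and names (rpt_entry, on_access, TABLE_SIZE, BLOCK_SIZE) are assumptions chosen for the example.

/*
 * Minimal, purely illustrative sketch (not from the book): a stride
 * prefetcher keyed by the load instruction's PC. For each load it
 * remembers the last address and stride; when the same stride repeats,
 * it issues a prefetch for the next predicted block.
 */
#include <stdio.h>
#include <stdint.h>

#define TABLE_SIZE 64   /* entries in the reference prediction table (assumed) */
#define BLOCK_SIZE 64   /* cache block size in bytes (assumed)                 */

typedef struct {
    uint64_t pc;         /* load instruction address used as the tag */
    uint64_t last_addr;  /* last data address observed for this load */
    int64_t  stride;     /* last observed address delta              */
} rpt_entry;

static rpt_entry table[TABLE_SIZE];

/* Observe one demand access; possibly issue one prefetch. */
static void on_access(uint64_t pc, uint64_t addr)
{
    rpt_entry *e = &table[pc % TABLE_SIZE];

    if (e->pc == pc) {
        int64_t stride = (int64_t)(addr - e->last_addr);
        if (stride != 0 && stride == e->stride) {
            /* The pattern is regular: fetch the next block ahead of demand. */
            uint64_t next = addr + (uint64_t)stride;
            printf("prefetch block 0x%llx\n",
                   (unsigned long long)(next - next % BLOCK_SIZE));
        }
        e->stride = stride;
    } else {
        /* Allocate a fresh entry for a load we have not seen before. */
        e->pc = pc;
        e->stride = 0;
    }
    e->last_addr = addr;
}

int main(void)
{
    /* A loop streaming through an array with a fixed 64-byte stride. */
    for (uint64_t i = 0; i < 8; i++)
        on_access(0x400123, 0x10000000ULL + i * 64);
    return 0;
}

After two accesses establish the 64-byte stride, each further access triggers a prefetch of the next block, hiding part of the memory latency behind useful work; real hardware prefetchers add confidence counters, prefetch distance, and throttling on top of this basic scheme.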

  • Language: English
  • ISBN: 9783031006159
  • Binding: Paperback
  • Pages: 68
  • Published: June 2, 2014
  • Dimensions: 191 x 5 x 235 mm
  • Weight: 147 g





