PC Hardware

Dual Inline Memory Modules

These newer modules look much like SIMMs, but come in a package with 168 pins and have a different wiring structure, so that one card can form a complete bank. These are the memory packages used on virtually all new motherboards.

Cache Memory

To cache is to set something aside, or to store for anticipated use. Mass storage is much slower than RAM, and RAM is much slower than the CPU. Caching, in PC terms, is the holding of recently or frequently used code or data in a special memory location for rapid retrieval. Speed is everything when it comes to computers. The high-speed memory chip generally used for caching is called SRAM.

SRAM

SRAM (static RAM) does not use capacitors to store ones and zeroes. Instead, SRAM uses a special circuit called a flip-flop.

The advantages of SRAM are that it is fast and it does not have to be refreshed, because it uses the flip-flop circuit to store each bit. A flip-flop circuit toggles on or off and then retains its state, whereas a standard DRAM cell requires constant refreshing to maintain its charge.
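For readers who think in code, the following is a rough software model of an SR latch built from two cross-coupled NOR gates, the basic flip-flop idea behind an SRAM cell. It is purely an illustrative sketch, not real circuitry, and all of the names are invented; the point is that the stored bit feeds back on itself and so holds its value without any refresh:

#include <stdio.h>

/* Software model of an SR latch: two cross-coupled NOR gates.
 * Once the bit is set, the feedback between the gates holds it. */
static int nor(int a, int b) { return !(a || b); }

int main(void) {
    int q = 0, qn = 1;              /* the latch outputs: Q and not-Q */
    int s = 1, r = 0;               /* pulse Set to store a 1 */

    for (int i = 0; i < 4; i++) {   /* let the cross-coupled gates settle */
        q  = nor(r, qn);
        qn = nor(s, q);
    }
    s = 0;                          /* release the input... */
    for (int i = 0; i < 4; i++) {   /* ...and the bit stays put */
        q  = nor(r, qn);
        qn = nor(s, q);
    }
    printf("stored bit Q = %d (held by feedback, no refresh needed)\n", q);
    return 0;
}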

The main disadvantage of SRAM is that it is more expensive than DRAM.

Internal Cache (L1)

Starting with the 486 chips, a cache has been included on every CPU. This original on-board cache is known as Level 1 (L1) or internal cache. All commands for the processor go through the cache. The cache stores a backlog of commands so that, if a wait state is encountered, the CPU can continue processing using commands from the cache. The cache also keeps a copy of any code that has been read, available for the CPU to reuse. This eliminates the need to wait while the data is fetched again from DRAM.
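As a rough illustration of the idea (a minimal sketch, not how any real CPU's L1 cache is organized; the names and sizes here are invented), a simple direct-mapped cache lookup can be modeled like this: if the requested address is already in the cache, the data comes back immediately; otherwise it is fetched from slow DRAM and a copy is kept for next time.

#include <stdio.h>

#define CACHE_LINES 8   /* a deliberately tiny cache, just for illustration */

/* One cache line: the address it mirrors, the data, and a valid flag. */
struct line { unsigned addr; int data; int valid; };
static struct line cache[CACHE_LINES];

static int dram[256];   /* stand-in for slow main memory */

int read_word(unsigned addr) {
    struct line *l = &cache[addr % CACHE_LINES];    /* direct-mapped index */
    if (l->valid && l->addr == addr)
        return l->data;                             /* hit: no trip to DRAM */
    int value = dram[addr];                         /* miss: slow DRAM fetch */
    l->addr = addr;                                 /* keep a copy for next time */
    l->data = value;
    l->valid = 1;
    return value;
}

int main(void) {
    dram[42] = 7;
    printf("first read  (miss): %d\n", read_word(42));
    printf("second read (hit):  %d\n", read_word(42));
    return 0;
}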

External Cache (L2)

Additional cache can be added to most computers, depending on the motherboard. This cache is mounted directly on the motherboard, outside the CPU. The external cache is also called Level 2 (L2) and works the same way as L1, but it is larger. On some motherboards, L2 can also be added or expanded. When installing any L2 cache, be sure to check the CMOS setup and enable the cache.

Write-Back vs. Write-Through

As mentioned, the primary purpose of a cache is to speed the movement of data between RAM and the CPU. Some caches immediately write all data through to RAM, even if it means hitting a wait state. This is called write-through cache, shown in Figure 7.7.

Figure 7.7 Write-through cache

Some caches hold the data for a time and send it to RAM later. This is called write-back cache, shown in Figure 7.8.

Figure 7.8 Write-back cache

Write-back caches are harder to implement but are much more powerful than write-through caches, because the CPU does not have to stop and wait for the slower RAM. However, write-through caches are less expensive.
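The difference between the two policies can be sketched in a few lines of C. This is purely illustrative, with a made-up one-value "cache" and invented names; it only shows when DRAM gets updated under each policy.

#include <stdio.h>

static int dram_value;     /* stand-in for slow main memory */
static int cached_value;   /* the copy held in the fast cache */
static int dirty;          /* write-back only: is the cached copy newer than DRAM? */

/* Write-through: every write goes straight to DRAM, even if the CPU
 * must sit through a wait state while the slower memory catches up. */
void write_through(int value) {
    cached_value = value;
    dram_value   = value;  /* immediate (slow) update of DRAM */
}

/* Write-back: the write lands only in the fast cache; the copy is marked
 * dirty, and DRAM is brought up to date later, when the line is flushed. */
void write_back(int value) {
    cached_value = value;
    dirty = 1;             /* DRAM is stale until the next flush */
}

void flush_cache(void) {
    if (dirty) {
        dram_value = cached_value;
        dirty = 0;
    }
}

int main(void) {
    write_through(9);
    printf("after write-through: cache=%d dram=%d\n", cached_value, dram_value);

    write_back(5);
    printf("after write-back:    cache=%d dram=%d dirty=%d\n",
           cached_value, dram_value, dirty);
    flush_cache();
    printf("after flush:         cache=%d dram=%d dirty=%d\n",
           cached_value, dram_value, dirty);
    return 0;
}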