What is computer memory?
Computer memory refers to the electronic holding place for instructions and data from which the processor can read quickly.
Describe and illustrate the memory hierarchy
*See notes for pic
Describe each of the 5 parameters of memory
What are the two main categories of memory
1. Primary Memory (Main Memory)
• The memory unit that communicates directly with the CPU is called main memory.
• The primary memory allows the computer to store data for immediate manipulation and to keep track of what is currently being processed.
• It is volatile in nature (this applies to RAM): when the power is turned off, the contents of primary memory are lost forever.
• Divided into RAM and ROM
2. Secondary Memory (Auxiliary Memory/Storage Devices)
• The secondary memory stores much larger amounts of data and information for extended periods of time.
• Data in secondary memory cannot be processed directly by the CPU; it must first be copied into primary storage, i.e., RAM.
• Secondary storage is used to store data and programs when they are not being processed.
• It is also non-volatile in nature.
• Due to this, data remains in secondary storage as long as it is not overwritten or deleted by the user.
• It is a permanent storage device.
Briefly describe RAM
It is also known as read/write memory, as it allows the CPU to read data and instructions from it as well as write into it.
• RAM is used for the temporary storage of input data, output data and intermediate results.
• RAM is a microchip implemented using semiconductors.
What are the two categories of RAM
i. Dynamic RAM (DRAM): It is made up of memory cells, where each cell is composed of one capacitor and one transistor. DRAM must be refreshed continually to retain information.
• The refresh operation occurs automatically thousands of times per second. DRAM is slower and less expensive than SRAM and occupies less space on the computer's motherboard.
ii. Static RAM (SRAM): It retains data as long as power is provided to the memory chip.
• It need not be refreshed periodically. SRAM uses multiple transistors for each memory cell and does not use a capacitor.
• SRAM is often used as cache memory.
What is EDO DRAM
Extended Data Output Dynamic RAM (EDO DRAM) is a type of RAM chip.
• It improves the time to read content from memory and enhances the method of access.
Describe ROM
What are the categories of ROM
a. Programmable ROM (PROM): It is also non-volatile in nature. Once a PROM has been programmed, its contents can never be changed; it is a one-time programmable device. PROMs are manufactured blank and can be programmed at wafer, final test, or in system.
• These types of memories are found in video game consoles, mobile phones, implantable medical devices and high-definition multimedia interfaces.
b. Erasable Programmable ROM (EPROM): It is similar to PROM, but it can be erased by exposure to strong ultraviolet light and then rewritten. So it is also known as Ultraviolet Erasable Programmable ROM (UV EPROM).
c. Electrically Erasable Programmable ROM (EEPROM): It is similar to EPROM, but it can be erased and rewritten electrically; the burning process is reversible by exposure to electric pulses.
Describe the cache memory
(This is also sometimes viewed as a major type of memory on its own, giving three types: cache memory, primary/main memory and secondary memory.)
• A CPU hardware cache is a smaller memory, located closer to the processor, that stores recently referenced data or instructions so that they can be quickly retrieved if needed again.
• By reducing costly reads and writes that access the slower main memory, caching has an enormous impact on the performance of a CPU.
• Cache memory is very expensive, so it is small in size.
• Generally, computers have cache memory of sizes 256 KB to 2 MB.
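The hit/miss behaviour described in the bullets above can be sketched with a hypothetical dictionary-based cache in front of a stand-in "slow" main-memory read; the function names and data are illustrative, not a real API.

```python
# Illustrative sketch of the caching idea: a small dictionary acts as the
# fast cache in front of a slow lookup (names and data are made up).

def slow_main_memory_read(address):
    # Stand-in for a slow main-memory access.
    return address * 2  # pretend this is the stored data

cache = {}          # small, fast storage
hits = misses = 0

def read(address):
    global hits, misses
    if address in cache:          # cache hit: fast path
        hits += 1
        return cache[address]
    misses += 1                   # cache miss: go to main memory
    value = slow_main_memory_read(address)
    cache[address] = value        # keep a copy for next time
    return value

read(10); read(10); read(20)
print(hits, misses)  # → 1 2: the repeated read of address 10 hit the cache
```

A repeated access to the same address is served from the dictionary without touching the slow path, which is exactly why caching reduces costly main-memory reads.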
List some other memories affiliated with the main memory
• Flash Memory is a kind of semiconductor-based non-volatile rewritable memory used in digital cameras, mobile phones, printers, etc.
• Virtual Memory is a technique that allows the execution of processes that are not completely in main memory.
• One major advantage of this scheme is that programs can be larger than main memory.
• This technique frees programmers from the concerns of memory storage limitations.
• A buffer is temporary physical storage used to hold data while it moves from one place to another during the execution of a process.
List some examples of Secondary memory devices
What are some of the ways in which the cache has been improved
The first caches were off-chip, or external.
• These were soon replaced by on-chip cache memories typically made from SRAM.
• To improve performance further, these on-chip caches were split into instruction and data partitions.
• Cache partitions led to the birth of multi-level cache hierarchies, where processor cores have their own small, private cache (L1) that sits above a larger shared cache (L2), with some processors including a third cache level (L3) and occasionally a fourth (L4).
What is the principle of locality of reference and how does it relate to caching
What are the subsets of locality
Advantages of Bigger Blocks
——————————————————————–
• Using bigger blocks means more data transferred per I/O call, and therefore faster data transfer from disk to memory.
Disadvantages of bigger Blocks
———————————————————————-
• If row access is predominantly random, bigger blocks increase the possibility of contention in the buffer cache: with the same amount of memory in the buffer cache as with small blocks, fewer buffers fit, so more memory is needed to keep the same number of buffers cached.
• If you have high transactional concurrency on a segment, using bigger blocks only makes the contention even higher.
Small Block
The advantage of small blocks is that they reduce block contention, and they work well where rows are small or row selectivity is highly random.
The disadvantage of small blocks is their relatively larger overhead.
What is the difference between caches and virtual memories
1. Objective: Cache memory increases CPU access speed. Virtual memory increases main memory capacity.
2. Memory Unit: Cache memory is a physical memory unit and is very fast to access. Virtual memory is a technique that involves the hard disk and is slower to access.
3. Management: The CPU and related hardware manage cache memory. The operating system manages virtual memory.
4. Size: Cache memory is small in size. Virtual memory is much larger than cache memory.
5. Operation: Cache memory keeps recently used data. Virtual memory holds the parts of programs that cannot be accommodated in main memory.
Describe cache size and performance
There are several motivations for minimizing cache size. The larger the cache, the larger the number of gates needed to address it; as a result, large caches end up being slightly slower than small ones. The available chip and board area also limits cache size.
Describe direct mapping
This is the simplest form of mapping. One block from main memory maps into only one possible line of cache memory. As there are more blocks of main memory than there are lines of cache, many blocks in main memory can map to the same line in cache memory.
To implement this function, use the following formula:
α = β % γ
where α is the cache line number, β is the block number in main memory, γ is the total number of lines in cache memory and % being the modulus operator.
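The modulus formula above can be sketched directly in code; the cache sizes and block numbers below are illustrative.

```python
# Sketch of the direct-mapping formula: cache line = block number mod
# total number of cache lines (α = β % γ above).
def direct_map(block_number, num_cache_lines):
    return block_number % num_cache_lines

# With 8 cache lines, blocks 3, 11 and 19 all map to line 3, showing how
# many main-memory blocks share a single cache line.
print(direct_map(3, 8), direct_map(11, 8), direct_map(19, 8))  # → 3 3 3
```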
What is the disadvantage of direct mapping
The main disadvantage of using this type of mapping is that there is a fixed cache location for any given block in main memory. If two blocks of memory sharing the same cache line are being continually referenced, cache misses would occur and these two blocks would continuously be swapped, resulting in slower memory access due to the time taken to access main memory (or the next level of memory).
Describe associative mapping
This type of mapping overcomes the disadvantage of direct mapping by permitting each main memory block to be loaded into any line of cache. To do so, the cache control logic interprets a memory address as a tag and a word field. The tag uniquely identifies a block in main memory. The primary disadvantage of this method is that to find out whether a particular block is in cache, all cache lines would have to be examined. Using this method, replacement algorithms are required to maximize its potential.
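The "examine all cache lines" cost mentioned above can be sketched as a linear scan over tag fields; representing each line as a (tag, data) pair is an assumption for illustration, not real cache hardware (which compares all tags in parallel).

```python
# Sketch: in a fully associative cache, every line must be checked for a
# matching tag. Lines are modelled as (tag, data) pairs; None = empty line.
def find_block(cache_lines, tag):
    for index, line in enumerate(cache_lines):
        if line is not None and line[0] == tag:
            return index        # hit: block found in this line
    return None                 # miss: tag not present in any line

lines = [(7, "a"), None, (42, "b"), (3, "c")]
print(find_block(lines, 42))  # → 2 (hit in line 2)
print(find_block(lines, 99))  # → None (miss: every line was examined)
```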
Describe Set Associative mapping
This type of mapping is designed to utilize the strengths of the previous two mappings, while minimizing the disadvantages. The cache is divided into a number of sets containing an equal number of lines. Each block in main memory maps into one set in cache memory similar to that of direct mapping. Within the set, the cache acts as associative mapping where a block can occupy any line within that set. Replacement algorithms may be used within the set.
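The two-stage lookup described above can be sketched as follows; the set count and block numbers are illustrative assumptions.

```python
# Sketch of set-associative indexing: a block first maps to one set
# (direct-mapping style), then may occupy any line within that set.
def set_index(block_number, num_sets):
    return block_number % num_sets

# Assume a 2-way set-associative cache with 4 sets (8 lines total):
# blocks 5 and 13 land in the same set but can occupy either of its
# 2 lines, so they need not evict each other immediately.
print(set_index(5, 4), set_index(13, 4))  # → 1 1 (both map to set 1)
```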
List the four most common replacement algorithms
Direct mapping needs no replacement algorithm, since each block maps to exactly one line. For associative and set-associative mapping, however, an algorithm is needed; the four most common are least recently used (LRU), first in first out (FIFO), least frequently used (LFU) and random replacement. For maximum speed, the algorithm is implemented in hardware.
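One widely used replacement policy, least recently used (LRU), can be sketched with an ordered dictionary; the class name, capacity and keys below are illustrative assumptions, not a hardware implementation.

```python
# Hypothetical sketch of LRU replacement for a small fully associative
# cache, using an OrderedDict to track recency (oldest entry first).
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.lines = OrderedDict()   # key -> data, least recent first

    def access(self, key, data):
        if key in self.lines:
            self.lines.move_to_end(key)         # mark as most recently used
        else:
            if len(self.lines) >= self.capacity:
                self.lines.popitem(last=False)  # evict least recently used
            self.lines[key] = data

cache = LRUCache(2)
cache.access("A", 1); cache.access("B", 2)
cache.access("A", 1)          # A becomes most recently used
cache.access("C", 3)          # evicts B, the least recently used
print(list(cache.lines))      # → ['A', 'C']
```

Touching "A" just before inserting "C" is what saves it from eviction: the policy always discards the block that has gone unused the longest.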
Explain the differences between L1 and L2 cache
L1 has a smaller memory capacity than L2. Also, L1 can be accessed faster than L2. L1 is usually built into the chip, while L2 is soldered onto the motherboard very close to the chip.