# CACHE MEMORY:

- Cache memory is a small, fast memory located closest to the CPU.
- Recently used instructions and data are kept in the cache.
- The cache holds the data the CPU needs next in order to perform its task.
- If the active portions of the program and data are placed in a fast small memory, the average memory access time can be reduced, thus reducing the total execution time of the program.
- Such a fast small memory is referred to as a cache memory.
- It is placed between the CPU and main memory.
- Although the cache is only a small fraction of the size of main memory, a large fraction of memory requests will be found in the fast cache memory because of the locality of reference property of programs.

## The basic operation of the cache is as follows:

- When the CPU needs to access memory, the cache is examined first.
- If the word is found in the cache, it is read from the fast memory (a cache hit).
- If the word addressed by the CPU is not found in the cache (a cache miss), main memory is accessed to read the word.
- A block of words containing the one just accessed is then transferred from main memory to cache memory.
- The block size may vary from one word (the one just accessed) to about 16 words adjacent to the one just accessed.

## Importance of Cache Memory:

- **Speed:** Cache memory significantly reduces the time to access data compared to accessing it directly from RAM.
- **Cost:** Cache memory is more expensive per byte than RAM but less expensive than CPU registers.
- **Efficiency:** By storing frequently accessed data, cache helps speed up repetitive tasks.

## Cache Replacement Policies:

- When the cache is full, the system needs a policy to decide which block to replace. Common policies include:
  - **Least Recently Used (LRU):** Replaces the block that has not been used for the longest time.
  - **First-In-First-Out (FIFO):** Replaces the oldest block in the cache.
  - **Random:** Randomly selects a block to replace.
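The hit/miss behaviour and the LRU replacement policy described above can be sketched in a few lines of Python. This is a toy fully associative cache (the `LRUCache` class and its block-number keys are illustrative, not a hardware model):

```python
from collections import OrderedDict

class LRUCache:
    """Toy fully associative cache with LRU replacement.

    Keys are block numbers; capacity is counted in blocks.
    An OrderedDict's insertion order tracks recency of use.
    """
    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()
        self.hits = 0
        self.misses = 0

    def access(self, block):
        if block in self.blocks:
            self.hits += 1
            self.blocks.move_to_end(block)       # mark as most recently used
        else:
            self.misses += 1
            if len(self.blocks) >= self.capacity:
                self.blocks.popitem(last=False)  # evict least recently used
            self.blocks[block] = True

cache = LRUCache(capacity=2)
for block in [1, 2, 1, 3, 2]:
    cache.access(block)
# 1 miss, 2 miss, 1 hit, 3 miss (evicts 2), 2 miss (evicts 1)
print(cache.hits, cache.misses)  # → 1 4
```

Note how the third access to block 1 hits only because of temporal locality: the block was still resident from an earlier access.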
## DIAGRAM OF CACHE MEMORY:

A diagram depicting the hierarchy of cache memory from CPU to Main Memory, with each level labelled as L1, L2 and L3 cache.

## Cache Access Time:

- The cache memory access time is less than the access time of main memory by a factor of 5 to 10.
- The cache is the fastest component in the memory hierarchy and approaches the speed of CPU components.
- If the CPU finds that the addressed data is not available in the cache, it must fetch the data from main memory.

## CPU ↔ CACHE ↔ MAIN MEMORY

- **1:** Word transfer occurs between the CPU and the cache.
- **2:** Block transfer occurs between the cache and main memory.

## Write Policies

- **1: Write-Through**
  - Data is written to both the cache and main memory simultaneously.
  - Ensures data consistency but is slower.
- **2: Write-Back**
  - Data is written to the cache and only updated in main memory when the cache block is replaced.
  - Faster but more complex, as the system must ensure the data in memory stays consistent.

# Advantages of Cache Memory:

- **Faster Data Access:** Since cache memory is faster than RAM, it speeds up data access.
- **Improved CPU Performance:** Reduces the time the CPU spends waiting for data.
- **Reduced Latency:** Minimizes the delay in accessing frequently used data.

# Disadvantages of Cache Memory:

- **Cost:** Cache memory is more expensive per byte compared to other types of memory like RAM.
- **Limited Size:** Cache memory is small, and once it fills up, data has to be replaced, which can occasionally lead to delays.

# MAIN MEMORY:

- Main memory, also known as primary memory, is a volatile memory that provides fast storage and retrieval of data.
- It temporarily stores data that is being executed by the CPU (Central Processing Unit).
- This data includes both the instructions (code) needed for processing and the results of these processed instructions.

## Memory Addressing

- Memory addressing is a fundamental concept in computer architecture, referring to the process by which a computer identifies and accesses specific locations in its memory.
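The difference between the two write policies can be sketched with plain dictionaries standing in for the cache and main memory (the function names and the dirty-set bookkeeping are illustrative, not a real controller design):

```python
def write_through(cache, memory, addr, value):
    """Write-through: update the cache and main memory together."""
    cache[addr] = value
    memory[addr] = value

def write_back(cache, dirty, addr, value):
    """Write-back: update only the cache and mark the block dirty."""
    cache[addr] = value
    dirty.add(addr)

def evict(cache, dirty, memory, addr):
    """On eviction, a dirty block must be flushed to main memory."""
    value = cache.pop(addr)
    if addr in dirty:
        memory[addr] = value
        dirty.discard(addr)

cache, memory, dirty = {}, {0x10: 0}, set()
write_through(cache, memory, 0x10, 7)  # memory sees 7 immediately
write_back(cache, dirty, 0x10, 9)      # memory still holds the stale 7
evict(cache, dirty, memory, 0x10)      # flushing the dirty block makes it 9
```

The window between `write_back` and `evict`, during which memory holds a stale value, is exactly why write-back needs the extra consistency machinery mentioned above.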
- A memory address is a unique identifier that corresponds to a specific byte or group of bytes in the computer's RAM (Random Access Memory).

## Functions of Main Memory:

- **1: Temporary Storage:** It stores the data and instructions that are required by the CPU for processing.
- **2: High-Speed Data Access:** Since the CPU interacts directly with main memory, it must be fast enough to keep up with the processing speed of the CPU.
- **3: Volatile Storage:** Main memory loses its data when the system is powered off, unlike non-volatile storage devices.
- **4: Immediate Execution:** It allows the processor to quickly retrieve and execute data and instructions while programs run.

# Types of Main Memory:

- **1: RAM (Random Access Memory):**
  - **Dynamic RAM (DRAM):** The most common type of RAM used in computers. It needs to be refreshed thousands of times per second, but it is cost-effective and used for general-purpose computing.
  - **Static RAM (SRAM):** Does not need to be refreshed and is faster than DRAM, but it is more expensive and is used for specific applications like CPU caches.
- **2: ROM (Read-Only Memory):**
  - Non-volatile memory that retains data even when the computer is turned off.
  - It is used to store the firmware or BIOS, the essential programs for starting and running the computer.

# Types of ROM:

- **PROM (Programmable ROM):** Can be programmed once by the user after manufacturing.
- **EPROM (Erasable Programmable ROM):** Can be erased and reprogrammed using ultraviolet light.
- **EEPROM (Electrically Erasable Programmable ROM):** Can be reprogrammed using electrical signals, making it more flexible.

## STRUCTURE OF MAIN MEMORY:

A diagram depicting a memory address and data blocks, with the CPU fetching and executing the data.

## MEMORY HIERARCHY:

A diagram depicting a hierarchy of memory systems:

1. CPU
2. Cache memory
3. Main memory
4. Secondary storage

# Memory Access Methods:

- Main memory can be accessed using different methods, which include:
  - **1: Random Access:** The CPU can directly access any memory cell with its unique address. Both RAM and ROM support random access.
  - **2: Sequential Access:** Data is accessed in sequential order. This method is common in storage media like tapes but not in RAM.
  - **3: Direct Access:** Allows data to be accessed directly but with restrictions. Used mainly in magnetic disks, where the read/write head moves to the data's location.

# MEMORY MANAGEMENT:

- **Paging:** Divides memory into fixed-size pages and stores them in memory frames.
  - It avoids external fragmentation and allows for better memory utilization.
- **Segmentation:** Divides memory into variable-size segments, such as code, data, and stack segments.
  - Segmentation is user-visible, unlike paging.

## Memory Allocation Techniques:

- **1: Single Contiguous Allocation:** All memory is allocated to a single process.
- **2: Partitioned Allocation:** Divides memory into fixed or dynamic partitions for processes.
- **3: Virtual Memory:** A technique that allows programs to use more memory than physically available by using disk space to simulate extra memory.

# Advantages of Main Memory:

- **High-Speed Data Access:** Provides fast access to data compared to secondary storage.
- **Supports Multitasking:** Allows multiple processes to run simultaneously by allocating memory dynamically.
- **Efficient Data Processing:** Reduces latency and speeds up CPU operations by storing active data.

# Disadvantages of Main Memory:

- **Volatility:** Loses data when the system is turned off.
- **Limited Capacity:** RAM has limited storage compared to secondary storage like hard drives.
- **Cost:** Faster types of memory like SRAM are expensive, so larger capacities are cost-prohibitive.
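The paging scheme described under memory management can be illustrated with a short sketch: a virtual address is split into a page number (translated by the page table) and an offset within the page. This is a minimal example assuming power-of-two page sizes, as on real hardware; the function name and the 4 KiB page size are illustrative choices:

```python
def split_virtual_address(vaddr, page_size):
    """Split a virtual address into (page number, offset within page).

    Assumes page_size is a power of two, so the split is just a
    division: the high bits select the page, the low bits the offset.
    """
    page = vaddr // page_size
    offset = vaddr % page_size
    return page, offset

# With 4 KiB (4096-byte) pages, address 0x1234 lands in page 1:
page, offset = split_virtual_address(0x1234, 4096)
print(page, hex(offset))  # → 1 0x234
```

Because every page is the same fixed size, any free frame can hold any page, which is why paging avoids the external fragmentation that variable-size segments suffer from.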
