What is RAM | Random Access Memory

RAM, which is an abbreviation for Random Access Memory, is a computer hardware component that is typically found on the motherboard and serves as the CPU’s working memory. When you turn on your computer, the CPU uses it to hold data, programmes, and programme results. Because it is the computer’s read-write memory, information can be both written to it and read from it.

RAM is a volatile memory: it does not retain data or instructions indefinitely. When you turn on or reboot your computer, data and instructions from your storage drive are loaded into the RAM. When you start a programme, the operating system (OS) and that particular software are loaded into RAM, often from a hard disc drive (HDD) or solid-state drive (SSD). The CPU then uses this information to complete the tasks at hand. As soon as you turn off the computer, the data in the RAM is lost. In other words, data is retained in the RAM only as long as the computer is running and is erased when the machine is shut down. The advantage of loading data into RAM is that reading data from RAM is significantly faster than reading it from the hard drive.

In layman’s terms, we may say that RAM is analogous to a person’s short-term memory and that hard drive storage is analogous to a person’s long-term memory. Short-term memory retains information for a short period of time, while long-term memory retains information for an extended period of time. Short-term memory can be refreshed by referring to knowledge stored in the brain’s long-term memory. A computer works the same way: when the RAM is full, the CPU accesses the hard disc to overwrite old data in RAM with fresh data. It is similar to reusable scratch paper on which you may jot down notes, figures, and other information in pencil. When you run out of space on the paper, you erase the information you no longer require; likewise, when the RAM fills up, data that is no longer required is discarded and replaced with fresh data from the hard drive that is needed for the present processes.

RAM is available in two configurations: as individual chips mounted directly on the motherboard, or as a group of chips on a small board (a memory module) that plugs into the motherboard. RAM serves as a computer’s primary, or main, memory. Compared with other types of storage, such as a hard disc drive (HDD), solid-state drive (SSD), or optical drive, it is far faster to write to and read from.

The size or storage capacity of the RAM is one of the most important factors affecting the performance of a computer. If the computer does not have enough RAM to run the operating system and software programmes, it will perform slower than it should; increasing the amount of RAM will generally improve performance. Information stored in RAM is accessed in any order (at random), rather than sequentially like information saved on a CD or hard disc. As a result, its access time is significantly shorter.

History of RAM:

● The Williams tube, developed in 1947, was the first form of RAM. It used cathode ray tubes to store data as electrically charged spots on the tube’s face.
● Magnetic-core memory, also created in 1947, was the second form of random access memory. It was built from small magnetised metal rings (cores) with wires threaded through each ring. Each ring stored a single bit of information, which could be retrieved at any moment.
● In 1968, Robert Dennard, working at IBM’s Thomas J. Watson Research Center, invented the random access memory that we know today as solid-state memory.
● His invention is known as dynamic random access memory (DRAM), and it stores each data bit using a transistor paired with a capacitor. Because the capacitor’s charge leaks away, a steady supply of electricity, in the form of periodic refreshing, is necessary to keep each bit stable.
● The Intel 1103, released in October 1970, was the world’s first commercially available DRAM chip.
● In 1993, Samsung released the KM48SL2000, the first commercial synchronous DRAM (SDRAM).
● DDR SDRAM was first made commercially accessible in 1996.
● In 1999, RDRAM became accessible for use in computer systems.
● DDR2 SDRAM was first made available for purchase in 2003.
● DDR3 SDRAM was first made available for purchase in June 2007.
● DDR4 memory was first made available on the market in September 2014.
RAM may be classified into the following types:

Integrated RAM chips may be divided into two categories:

● Static RAM (SRAM)
● Dynamic RAM (DRAM)
1) Static RAM (SRAM):

Static random access memory (SRAM) is a type of random access memory that preserves its data bits for as long as it is supplied with electricity. This type of RAM is composed of memory cells and is referred to as static RAM because, unlike dynamic RAM, it does not require periodic refreshing: its cells hold their state directly rather than in a leaking capacitor. As a result, it is significantly quicker than DRAM.

Each memory cell is a flip-flop, created by a specific arrangement of transistors, and stores one bit of data. The majority of current SRAM memory cells are constructed from six CMOS transistors and do not contain any capacitors. SRAM chips have access times as low as 10 nanoseconds, which is extremely fast. DRAM, on the other hand, often has an access time greater than 50 nanoseconds.

Furthermore, because it does not require a pause between accesses, its cycle time is significantly shorter than that of DRAM. Thanks to these advantages, it is generally utilised for system cache memory, high-speed registers, and small memory banks such as the frame buffer on graphics cards.

This speed is due to the six-transistor structure of the static RAM cell, which keeps current flowing in one direction or the other (0 or 1). Because the cell holds its state directly, it can be written and read immediately, with no waiting for a capacitor to fill or drain. While early asynchronous static RAM chips performed read and write operations sequentially, newer synchronous static RAM chips overlap read and write operations to provide greater performance.

The disadvantage of static RAM is that, for the same amount of storage, its memory cells take up more space on a chip than DRAM memory cells, since each cell uses more components than a DRAM cell. As a result, it offers less memory per chip.

2) Dynamic RAM (DRAM):

DRAM (dynamic random access memory) is likewise made up of memory cells. It is an integrated circuit (IC) built from millions of extremely small transistors and capacitors. Each transistor is paired with a capacitor to create a very compact memory cell, so millions of cells fit on a single memory chip. A DRAM memory cell therefore consists of one transistor and one capacitor, with each cell representing, or storing, a single bit of data in its capacitor.

This bit of data is stored in the capacitor, either as a 0 or as a 1. The transistor, which is also part of the cell, serves as a switch, allowing the circuitry on the memory chip to read the capacitor or change its state as needed.

The capacitor must be recharged at regular intervals in order to keep its charge constant. This is the reason it is referred to as dynamic RAM: it must be refreshed on a regular basis in order to keep its data, or it will forget what it is holding. A refresh circuit, connected to the memory, achieves this by rewriting the data many hundreds of times per second. Accessing a DRAM memory cell takes around 60 nanoseconds.

As an analogy, a capacitor may be thought of as a box that holds electrons. To store a “1” in a memory cell, the box is filled with electrons; to store a “0”, it is left empty. The disadvantage is that the box leaks: in a matter of moments, the entire box empties out. For dynamic memory to function properly, the CPU or memory controller must recharge all of the capacitors before they discharge. To do this, the memory controller reads each value from the memory and then writes it back. This process, known as refreshing the memory, occurs automatically thousands of times per second. As a result, this sort of RAM must be dynamically refreshed on a regular basis.
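The leaky-box-plus-refresh cycle described above can be sketched as a toy model. This is purely illustrative, not how real hardware is built; the leak factor and read threshold are arbitrary assumptions chosen to make the behaviour visible:

```python
# Toy model of a DRAM cell: a "leaky box" of charge that must be
# refreshed before the charge drains below the read threshold.
# (Illustrative only; the 0.7 leak factor is an arbitrary assumption.)

class DramCell:
    def __init__(self, bit: int):
        self.charge = float(bit)   # 1.0 = full ("1"), 0.0 = empty ("0")

    def leak(self) -> None:
        self.charge *= 0.7         # charge drains away between refreshes

    def read(self) -> int:
        return 1 if self.charge > 0.5 else 0

    def refresh(self) -> None:
        # Read the still-recoverable bit and rewrite it at full charge,
        # just as the memory controller's refresh pass does.
        self.charge = float(self.read())

refreshed = DramCell(1)
refreshed.leak()
refreshed.refresh()        # refreshed in time: the bit survives
print(refreshed.read())    # 1

neglected = DramCell(1)
for _ in range(3):         # never refreshed: the charge leaks away
    neglected.leak()
print(neglected.read())    # 0
```

The point of the sketch is the ordering constraint: `refresh()` only works while `read()` can still recover the bit, which is why real refresh circuits run on a strict schedule.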

Types of DRAM include:

i) Asynchronous DRAM:

This kind of DRAM has a clock that is not synchronised with the CPU’s. Consequently, the disadvantage of this RAM is that the CPU cannot predict the precise time at which data will be available from the RAM on the input-output bus. This limitation was solved by the next generation of RAM, referred to as synchronous DRAM.

ii) Synchronous DRAM:

SDRAM (Synchronous DRAM) first appeared on the market in late 1996. SDRAM was designed to keep the RAM clock synchronised with the CPU clock. This enables the CPU, or more accurately the memory controller, to know the exact clock cycle, and hence the number of cycles, after which the data will become available on the bus. As a result, the CPU does not have to wait between memory accesses, which improves the memory’s effective read and write speeds. This type of memory is also known as single data rate SDRAM (SDR SDRAM) because data is transferred at a single data rate: once per clock cycle, on each rising edge.


DDR RAM is the name given to the next generation of synchronous DRAM chips. It was created to overcome the limitations of SDRAM and came into use in PC memory in the early 2000s. In DDR SDRAM (also known as DDR RAM), data is transferred twice during each clock cycle: once on the positive (rising) edge and once on the negative (falling) edge. It is therefore referred to as double data rate SDRAM.
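The single versus double data rate distinction above comes down to how many clock edges carry data. A minimal sketch, using the 133 MHz clock that is typical of this era of memory:

```python
# Each clock cycle has a rising and a falling edge. SDR SDRAM
# transfers data only on the rising edge; DDR SDRAM uses both.

def transfers_per_second(clock_hz: int, double_data_rate: bool) -> int:
    edges_used_per_cycle = 2 if double_data_rate else 1
    return clock_hz * edges_used_per_cycle

clock = 133_000_000  # 133 MHz, a typical SDR/DDR1-era clock
print(transfers_per_second(clock, double_data_rate=False))  # 133000000
print(transfers_per_second(clock, double_data_rate=True))   # 266000000
```

At the same clock frequency, DDR simply doubles the number of usable edges, which is the entire source of its 2x data rate.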

DDR SDRAM is available in several generations, the first of which is DDR1, followed by DDR2, DDR3, and finally DDR4. Today, the majority of the memory that we use in our computers, laptops, smartphones, and other devices is either DDR3 or DDR4 RAM. DDR SDRAM is available in the following configurations:


DDR1 SDRAM:

DDR1 SDRAM is the first generation of double data rate SDRAM. Its supply voltage was decreased from SDRAM’s 3.3 V to 2.5 V, reducing power consumption. Data is transferred on both the rising and falling edges of the clock cycle, so instead of 1 bit, 2 bits are pre-fetched on each clock cycle; this is referred to as a 2-bit pre-fetch. It generally operates at clock frequencies from 133 MHz to 200 MHz, with a few exceptions.

Furthermore, the data rate at the input-output bus is double the clock frequency, since data is transferred on both the rising and falling edges. So, if a DDR1 RAM runs at 133 MHz, the data rate is twice that: 266 megatransfers per second.


DDR2 SDRAM:

DDR2 is an improved version of DDR1. Because it operates at 1.8 V rather than 2.5 V, it is more efficient. To double the data rate of the previous generation, the number of bits pre-fetched on each cycle was increased from 2 to 4. The RAM’s internal bus width was expanded to match: if the input-output bus is 64 bits wide, for example, the internal bus width is 128 bits. As a result, a single access can carry twice as much data as before.

DDR3 SDRAM:

In this version, the voltage was dropped even further, from 1.8 V to 1.5 V. Because the number of bits pre-fetched per cycle was raised from 4 to 8, the data rate is double that of the previous generation; equivalently, the internal data bus width of the RAM was doubled again.


DDR4 SDRAM:

In this iteration, the operating voltage was dropped even further, from 1.5 V to 1.2 V, although the number of bits pre-fetched per cycle remains the same as in the previous generation, at 8. Instead, the internal clock frequency of the RAM was doubled compared with the previous generation. With a memory array running at 400 MHz, for example, the clock frequency of the input-output bus is four times higher, at 1600 MHz, and the transfer rate is 3200 megatransfers per second.
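The pre-fetch and clock figures quoted across the DDR generations above fit one simple relationship, sketched here; the example array clocks are illustrative values, not the only speeds each generation shipped at:

```python
# Per data pin: each internal array access pre-fetches `prefetch` bits,
# so the transfer rate is array clock x pre-fetch depth. The I/O bus
# moves two transfers per cycle (both edges), so it runs at half the
# transfer rate.

def transfer_rate_mts(array_clock_mhz: int, prefetch: int) -> int:
    return array_clock_mhz * prefetch

def io_clock_mhz(array_clock_mhz: int, prefetch: int) -> float:
    return transfer_rate_mts(array_clock_mhz, prefetch) / 2

# Illustrative (array clock MHz, pre-fetch bits) per generation:
generations = {
    "DDR1": (133, 2),   # -> 266 MT/s, I/O bus at the array clock
    "DDR2": (200, 4),   # -> 800 MT/s, I/O bus at 2x the array clock
    "DDR3": (200, 8),   # -> 1600 MT/s, I/O bus at 4x the array clock
    "DDR4": (400, 8),   # -> 3200 MT/s, I/O bus at 1600 MHz
}
for name, (clk, pf) in generations.items():
    print(name, transfer_rate_mts(clk, pf), "MT/s,",
          io_clock_mhz(clk, pf), "MHz I/O clock")
```

Note how DDR4 reaches 3200 MT/s not by deepening the pre-fetch beyond DDR3’s 8 bits, but by doubling the array clock, exactly as described above.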





