An Illustrated History of Computer Memory
1. Introduction

Computer memory is much more than DRAM or Flash; today's omnipresent memory technologies are the result of a long evolution. Let me take you more than 160 years back in time to revisit the milestones of computer memory technology - and products you may never have heard of.

2. 1834: Punch Cards - Computer Memory

There is a time in your life when you start feeling old. That feeling has a tendency to surface more often as time goes on. Punch cards are just one more example for me: the first science books we used in school in the early 1980s still illustrated punch cards as the medium of choice to store data "digitally". Of course, punch cards are much older than that. Charles Babbage, an English mathematician, is credited with the idea of the programmable computer. He worked on his "Analytical Engine" from 1834 until his death in 1871. The device used punch cards as read-only memory. The total memory integrated in the Analytical Engine was the equivalent of 675 bytes.

3. 1932-1945: Drum Memory, Neon Lamps and Delay Line Memory

Austrian IT engineer Gustav Tauschek invented the first widely used computer memory, called drum memory. While he created drum memory in 1932, it took more than 20 years for the technology to be generally adopted. The decade following the invention of drum memory also saw several other, albeit short-lived, memory ideas. For example, Konrad Zuse used mechanical sliding metal memory in his Z3, considered to be the first Turing-complete computer. In 1939, Helmut Schreyer, who worked with Zuse on the Z3, invented neon lamp-based memory. In 1942, the Atanasoff-Berry Computer combined capacitors mounted on two revolving drums with punch cards as its memory solution. The famed ENIAC introduced delay line memory built from mercury tubes and nickel wire. By the 1950s, drum memory dominated early computers, with delay lines and Williams-Kilburn tube memory (developed in 1947, patent #2,951,176), which used a cathode ray tube to store data, serving as alternatives.

4. 1951: Magnetic Core Memory

Magnetic core memory was the second major milestone in modern computer memory technology to be widely adopted. While it is generally believed that all key components of the technology had been developed by 1951, the idea traces back to MIT's Jay Forrester, who first described the technology in 1949. In parallel, An Wang was working on magnetic core memory with a slightly different approach. The basic principle of core memory was the "core", a small ring of ferrite that could be magnetized in one of two directions, allowing each core to store one bit of digital information, either a 1 or a 0. From the 1960s up to the early 1970s, non-volatile magnetic core memory was the default memory technology in more than 90 percent of all computers globally. Wang received the initial patent (#2,708,722) with 34 claims for magnetic core memory, which IBM acquired for $500,000 shortly after its approval in 1955. MIT challenged the patent claims and engaged in a lengthy patent battle with IBM. IBM eventually paid MIT $13 million for the rights to magnetic core memory in 1964.

5. 1952: Ultrasonic Memory

One of the wrong turns in memory development was the ultrasonic memory used in the EDVAC. The system featured 1,024 44-bit words of ultrasonic memory (equivalent to 5.5 KB), while the ENIAC later received a fresh core memory module.
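The 5.5 KB figure follows directly from the stated word count and word width; a quick sanity check in Python, using only the numbers from the text above:

```python
# EDVAC ultrasonic memory capacity, from the figures above
words = 1024            # 44-bit words of delay line storage
bits_per_word = 44

total_bits = words * bits_per_word    # 45,056 bits
total_bytes = total_bits / 8          # 5,632 bytes
print(total_bytes / 1024)             # 5.5 (KB)
```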


6. 1952: FeRAM - Computer Memory

FeRAM, or Ferroelectric RAM, still feels somewhat exotic today, but its history dates back to 1952, when MIT graduate student Dudley Allen Buck described the principle of FeRAM in his master's thesis, entitled Ferroelectrics for Digital Information Storage and Switching. It took more than 30 years for the idea to be picked up again: the technology was developed into a complete concept in 1991 at NASA's Jet Propulsion Laboratory. The FeRAM in production today is largely based on technology developed by IP (intellectual property) company Ramtron, while Fujitsu, IBM and Texas Instruments are the most significant producers of FeRAM at this time.

7. 1953: Selectron Tube - Computer Memory

Developed between 1946 and 1953 by Jan Rajchman at RCA, the selectron tube was the final shot at making tube memory popular, but it never made it into production due to the popularity of magnetic core memory. The selectron tube was 10 inches long and measured 3 inches in diameter. The device used an indirectly heated cathode running up the middle, surrounded by two separate sets of wires, and offered a storage capacity of 256 bits in the proposed production device, but was claimed to be able to scale up to 4,096 bits.

8. 1964: Static Random Access Memory (SRAM)

The first traces of SRAM date back to 1964, when a 64-bit Metal Oxide Semiconductor (MOS) static RAM was developed at Fairchild Semiconductor. However, the breakthrough came when Intel developed its first 256-bit static RAM (SRAM), the 1101 chip, in 1969 and formally launched it in 1971. The 1101 was the world's first MOS memory in mass production and the first to use MOS silicon gate technology.

Unlike DRAM, SRAM does not need to be periodically refreshed and retains its data as long as power is applied; it is still volatile, however, and loses its content when disconnected from an active power source. SRAM cells today are typically built from MOSFETs in CMOS processes. SRAMs are also used as a stepping-stone technology to develop new semiconductor production processes.


9. 1966: Dynamic Random Access Memory (DRAM)

Robert Dennard invented DRAM, the foundation of the computer memory we use today, at the IBM Thomas J. Watson Research Center in 1966/1967. He received the DRAM patent, #3,387,286, in 1968. The document describes a one-transistor DRAM cell.

10. 1969: Phase-Change Memory

Phase-change memory is still in its nascent stages today, more than 50 years after its invention. In his 1969 dissertation, Charles Sie of Iowa State University explained that a phase-change memory device would be "feasible" by integrating chalcogenide film with a diode array. Some work had been done prior to that by Stanford Ovshinsky at Energy Conversion Devices, who believed that the properties of chalcogenide glasses could be exploited as a memory technology. Intel co-founder Gordon Moore also published a paper describing phase-change memory in 1970. Today, the opportunity in PRAM is mainly pursued by Samsung.

11. 1970: The First DRAMs - Computer Memory

The first known DRAM chip was a 256-bit device created by Lee Boysel at Fairchild Semiconductor in 1968. Boysel later went on to found Four-Phase Systems in 1969 and developed 1,024-bit and 2,048-bit DRAMs. Intel released the 1103, the industry's first mass-produced DRAM device, in 1970. The 1,024-bit chip was massively successful and became the world's best-selling semiconductor chip by 1972. In that year, Intel already had more than 1,000 employees and revenue of more than $23 million. HP's 9800 series was the first commercially available computer to integrate the 1103. IBM followed and used it in the System 370/158.

The 1103 is remembered as the magnetic core memory killer and suddenly offered the industry an opportunity to store vast amounts of information on a single chip. DRAM began overtaking magnetic core memory in the second half of the 1970s.

12. 1971: Erasable Programmable Read Only Memory (EPROM)

The first EPROM (sometimes also referred to as EROM) was invented by Dov Frohman of Intel in 1971 (patent #3,660,819). The first produced device was the 1702, with a 256x8-bit structure and a total capacity of 2,048 bits. An EPROM is a non-volatile memory device, i.e., a memory that retains its content even when the connected power is switched off. The technology uses arrays of floating-gate transistors that can be individually programmed. Data is erased via exposure to an ultraviolet light source. In 1978, George Perlegos at Intel developed the Intel 2816, the first EEPROM (Electrically Erasable Programmable Read Only Memory), which built on the foundation of EPROMs but leveraged a thin gate oxide layer to enable erasure without the need for an ultraviolet light source.

13. 1984: Flash - Computer Memory

Toshiba's Fujio Masuoka invented Flash sometime in the early 1980s. Most sources today point to the 1984 International Electron Devices Meeting, at which Masuoka discussed and detailed flash for the first time; he is credited with the invention of both NOR and NAND Flash.

The first widely known Flash memory device was produced by Intel in 1988. The 256 KB chip came in the size of a shoebox. Of course, Flash is used today in virtually every mass storage solution, ranging from chips in our phones to USB sticks and SSDs. What is most remarkable about the technology is how far Flash has scaled. In the early 2000s, there were discussions about when the technology would be replaced, potentially by Ovonic Unified Memory (OUM), nanocrystals, MRAM, FeRAM, PFRAM, PCRAM, or Nanotube RAM (NRAM). So far, Flash is still going strong and will not be replaced anytime soon.

14. 1989: MRAM - Computer Memory

An exotic and very specialized memory technology today, MRAM is no longer believed to be a potential successor to Flash. The first steps in developing MRAM devices were made by IBM in 1989 and throughout the 1990s, when the company explored the "giant magnetoresistive" effect in thin-film structures. In 2000, IBM created a joint venture with Infineon to bring MRAM into production, which resulted in a 128 Kbit MRAM chip built on a 180 nm process. To this day, MRAM is still waiting for its mainstream breakthrough, but it has found customers in smart cards, cellular base stations, as well as aerospace and military scenarios.

15. 1993: Synchronous DRAM - Computer Memory

The general idea of SDRAM was described in the 1970s, shortly after mass-production of DRAMs began. However, the inflection point for SDRAM, the basic technology we use for the RAM in our computers today, was the development of the Samsung KM48SL2000 in 1993. It was the first compelling SDRAM device and helped replace all other DRAM types in computers by about 2000. The device achieved 100 MHz speeds in a JEDEC-standardized PC100 configuration. Other types of SDRAM included PC66 on the low end and PC133 on the high end.

16. 1996: Double Data Rate (DDR) SDRAM

As the successor to SDRAM, the DDR SDRAM standard was developed at JEDEC between 1996 and 2000. DDR (1) debuted at bus clocks of 133, 166 and 200 MHz; since data is transferred on both the rising and falling clock edges, the corresponding modules carry the version numbers DDR-266, DDR-333 and DDR-400. The first DDR SDRAM specification was released in June 2000. DDR memory devices are not backwards compatible with SDRAM.

The second generation, DDR2, followed in 2003 and was available with clock rates of 200, 266, 333, 400 and 533 MHz (DDR2-400, DDR2-533, DDR2-667, DDR2-800, DDR2-1066). The current generation, DDR3, was generally released in 2006 and is available in 400, 533, 667 and 800 MHz versions (DDR3-800, DDR3-1066, DDR3-1333 and DDR3-1600). DDR4 devices are currently expected to be released in 2014.
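The naming schemes above follow simple arithmetic: the effective transfer rate is twice the bus clock (data moves on both clock edges), and peak module bandwidth, used in the PC-xxxx module name, is the transfer rate times the 8-byte (64-bit) width of a standard module bus. A small sketch in Python (the helper name is mine, not from any standard):

```python
def ddr_numbers(bus_clock_mhz):
    """Derive the DDR naming figures from the bus clock.

    DDR transfers data on both clock edges, so the effective
    transfer rate (in MT/s, the "DDR-xxx" number) is twice the
    bus clock. With a standard 64-bit (8-byte) module bus, peak
    bandwidth in MB/s (the "PC-xxxx" number) is that rate times 8.
    """
    transfers = 2 * bus_clock_mhz      # MT/s
    bandwidth = transfers * 8          # MB/s
    return transfers, bandwidth

# DDR-400: 200 MHz bus -> 400 MT/s -> 3,200 MB/s (sold as PC-3200)
print(ddr_numbers(200))    # (400, 3200)
# DDR2-800: 400 MHz bus -> 800 MT/s -> 6,400 MB/s (PC2-6400)
print(ddr_numbers(400))    # (800, 6400)
```

The same arithmetic extends to DDR3: a 200 MHz memory clock with an 8x prefetch yields DDR3-1600 and a PC3-12800 module.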

17. 1999: Rambus DRAM (RDRAM) - Computer Memory

RDRAM, or DRDRAM (Direct Rambus DRAM), is a type of synchronous dynamic RAM and the most famous example of a failed attempt to replace SDRAM. It was developed in the late 1990s by Rambus and was intended to launch in concert with Intel's Pentium 4 processor. Intel threw its weight behind the technology and invested $300 million in Samsung's production capacity to support the launch of the Pentium 4 in 2000. The main advantage of Rambus memory was greater bandwidth than SDRAM and the emerging DDR SDRAM. However, Intel canceled the effort and limited Rambus support to a few server- and workstation-targeted platforms due to industry opposition: Rambus memory was expensive to produce, in part because of high licensing fees, and Rambus antagonized potential customers when it sued them over IP infringements in the SDRAM and DDR space. The use of Rambus memory was later briefly discussed for graphics cards, but remained limited to some server and networking devices, as well as serving as the main memory technology in Sony's PS3 game console.