IBM Corp. has devised a way to triple the amount of memory stored on microchips and double the performance of processors by replacing a space-hungry type of memory, SRAM, with an embedded DRAM that takes up far less room on a slice of silicon.
According to the company, the eDRAM chip "vastly improves microprocessor performance in multi-core designs and speeds the movement of graphics in gaming, networking, and other image intensive, multi-media applications." Shown as a 65 nm concept at the ongoing International Solid State Circuits Conference (ISSCC), the eDRAM, IBM claims, exceeds the performance of the conventional SRAM typically used for on-die CPU cache in about one-third the space and with one-fifth the standby power. With a random cycle time of 2 ns and a latency of 1.5 ns, IBM is now convinced that DRAM is ready to be integrated into the CPU, with the goal of replacing SRAM.

So far, eDRAM has been used in applications ranging from supercomputing to gaming: IBM's BlueGene/L system uses eDRAM as its L3 cache, while game consoles such as the GameCube, the Xbox 360 and the Wii use eDRAM as embedded memory for their graphics processors.
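To put the numbers IBM quoted into perspective, here is a minimal back-of-the-envelope sketch in Python. Only the one-third area, one-fifth standby power, 1.5 ns latency and 2 ns cycle-time figures come from the announcement; the 8 MB SRAM baseline and the 3 GHz clock used for the cycle conversion are illustrative assumptions.

```python
# Rough comparison based on the figures IBM cited at ISSCC.
# The SRAM baseline capacity and the CPU clock are assumptions for
# illustration, not values from IBM's announcement.

SRAM_L3_MB          = 8        # assumed SRAM L3 capacity in a given die area
AREA_RATIO          = 1 / 3    # eDRAM reportedly needs ~1/3 the SRAM area
STANDBY_POWER_RATIO = 1 / 5    # eDRAM standby power relative to SRAM
EDRAM_LATENCY_NS    = 1.5      # latency quoted by IBM
EDRAM_CYCLE_NS      = 2.0      # random cycle time quoted by IBM
CPU_CLOCK_GHZ       = 3.0      # assumed processor clock for the conversion

# The same silicon area holds roughly three times as much eDRAM as SRAM.
edram_l3_mb = SRAM_L3_MB / AREA_RATIO

# Convert the quoted eDRAM timings into CPU clock cycles at the assumed clock.
latency_cycles = EDRAM_LATENCY_NS * CPU_CLOCK_GHZ
cycle_cycles   = EDRAM_CYCLE_NS * CPU_CLOCK_GHZ

print(f"Cache in the same area: {SRAM_L3_MB} MB SRAM vs ~{edram_l3_mb:.0f} MB eDRAM")
print(f"Standby power: ~{STANDBY_POWER_RATIO:.0%} of the SRAM baseline")
print(f"eDRAM latency: {EDRAM_LATENCY_NS} ns ≈ {latency_cycles:.1f} cycles at {CPU_CLOCK_GHZ} GHz")
print(f"eDRAM random cycle: {EDRAM_CYCLE_NS} ns ≈ {cycle_cycles:.1f} cycles at {CPU_CLOCK_GHZ} GHz")
```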
Compared to conventional DRAM modules, integrating the DRAM on the processor die offers a key performance advantage: it eliminates the need to drive I/O signals to external memory chips. eDRAM manufacturers such as NEC also believe that the technology will make its way into mobile applications, thanks to the low power consumption of eDRAM devices.
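The payoff of keeping data on-die rather than going over the I/O pins can be sketched with a simple average-access-time model. Only the 1.5 ns eDRAM latency comes from IBM's figures; the 60 ns off-chip round trip and the hit rates below are illustrative assumptions.

```python
# Toy average-memory-access-time (AMAT) model: accesses that hit the on-die
# eDRAM cache stay on the chip, misses pay an assumed off-chip DRAM round trip.

EDRAM_L3_LATENCY_NS = 1.5    # on-die eDRAM access, as quoted by IBM
OFFCHIP_DRAM_NS     = 60.0   # assumed round trip to external DRAM over the I/O pins

def amat(l3_hit_rate: float) -> float:
    """Average time per access for a given on-die cache hit rate."""
    return l3_hit_rate * EDRAM_L3_LATENCY_NS + (1 - l3_hit_rate) * OFFCHIP_DRAM_NS

for hit_rate in (0.0, 0.5, 0.8, 0.95):
    print(f"L3 hit rate {hit_rate:.0%}: average access time {amat(hit_rate):.1f} ns")
```

Under these assumed numbers, even a modest hit rate in a large on-die cache cuts the average access time to a fraction of the off-chip figure, which is the gain a denser eDRAM cache is meant to deliver.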