Phase-change Memory Could Open Doors to Better Data Storage and Computing
Pushing beyond the von Neumann bottleneck is difficult for memory devices. One option is phase-change memory (PCM), which Stanford and IBM are researching for data storage and computing.
Despite the demand for better memory, embedded memory solutions have struggled to meet design standards at 28 nm and smaller footprints. Couple that with the conventional von Neumann data bottleneck, and you have some serious computer architecture constraints.
A possible solution is phase-change memory, a non-volatile random-access memory that retains stored data even when power is removed and can perform computation without the data ever leaving the memory itself.
PCM's architecture. A PCM can read and write data at low voltages, an advantage over traditional volatile memory-based architectures. Image used courtesy of STMicroelectronics
The fundamentals of PCM date back to the 1960s, and STMicroelectronics currently holds patents on PCM technology down to the 28 nm node.
Advantages of PCM include data retention, low power consumption, and robust performance at operating temperatures up to 165 degrees Celsius.
The beauty of PCM research is that altering the material's molecular or atomic structure allows developers to build on the original concept and achieve smaller geometries in next-gen nanotechnology.
This article will go over some recent advancements and research on pushing PCM into the future; before diving in, however, let's briefly talk about a hang-up with scaling memory devices: the von Neumann bottleneck.
Going Beyond Von Neumann Architectures
In 1945, mathematician and physicist John von Neumann developed a computer architecture that consisted of a single, shared memory for programs and data. This technology included a single bus for memory access, an arithmetic unit, and a program control unit. Nearly all CPUs have been based on this architecture.
A general overview of the von Neumann architecture. Image used courtesy of JavaTpoint
However, what if architectures of processors and computers were to deviate from this fundamental model?
There may be new levels of computing that can be reached through an alternative design, appropriately deemed the non-von Neumann architecture.
Taking inspiration from the human brain, the non-von Neumann computational approach creates a system that performs computing in memory to alleviate data traffic build-up. Redundant data movement is eliminated when computation and storage happen within the memory itself. PCMs can capitalize on this data-centric computing architecture to address the traditional von Neumann bottleneck.
Stanford’s Flexible PCM Research
Electrical engineering research at Stanford University spans physical technology, information systems, hardware systems, and renewable energy.
Within the EE department is Pop's Lab, led by Stanford Professor Eric Pop. The lab explores nanoelectronics and nanoscale energy conversion technology. Recently, Professor Pop's team found that phase-change memory devices can be bent to meet the demands of smaller, flexible nanotechnology without sacrificing PCM's standard characteristics.
Stanford's PCM device. Screenshot used courtesy of Khan et al.
These PCMs can be fabricated on plastic, paper, flex glass, or metal foil to achieve their bending characteristics. Stanford's researchers found that a superlattice PCM layered on a flexible polyimide substrate maintains its low- and high-resistance states even after 100 bending cycles.
The structure exhibited ultra-low switching current density along with low thermal conductivity. These results indicate that flexible PCMs could achieve multi-level storage capabilities and be a promising option for in-memory computing applications.
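To illustrate what "multi-level" means in practice, here is a minimal Python sketch of a cell whose resistance can be programmed to four distinct levels, storing two bits instead of one. The resistance values and thresholds below are purely illustrative assumptions, not measured figures from the Stanford work:

```python
# Illustrative sketch: a multi-level PCM cell stores more than one bit
# by distinguishing several resistance levels, not just "low" and "high".
# All resistance values here are made up for illustration.

import bisect

# Hypothetical programmed resistance levels (ohms) for a 2-bit cell:
LEVELS = [1e3, 1e4, 1e5, 1e6]   # four levels -> 2 bits per cell
# Midpoint-style thresholds used to classify a readout into a level:
THRESHOLDS = [3e3, 3e4, 3e5]

def read_bits(resistance_ohms: float) -> int:
    """Map a measured resistance to a 2-bit value (0..3)."""
    return bisect.bisect_left(THRESHOLDS, resistance_ohms)

# A readout of 12 kOhm falls between 3 kOhm and 30 kOhm -> level 1.
print(read_bits(12e3))  # -> 1
```

The design choice worth noting: the more levels a cell can reliably hold apart, the more bits it stores, which is why stable, well-separated resistance states after bending matter for density.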
According to Professor Pop, the current density needed to switch his team's flexible memory cells is 10 to 100 times lower than in most other reported phase-change memory. Additionally, the memory cells maintain their performance when bent.
Flexible PCMs could yield denser, more energy-efficient storage and may be the missing key for handling larger volumes of data as the world enters a new era of neural networks and quantum computing, which is why IBM Research is also looking into PCM technology.
IBM Explores In-memory Computing with PCM Research
IBM, a highly recognized and respected name in cloud computing, artificial intelligence (AI), quantum computing, and exploratory science and materials, has continued research into ways to cope with global data and information that will continue to grow exponentially.
Global data storage reached 33 zettabytes in 2018, and meeting next-generation computing demands will require new approaches. Knowing this, IBM's researchers are responding by analyzing new materials that could enable energy-efficient architectures.
A general overview of in-memory computing, which IBM hopes PCM could enable. Image used courtesy of Hazelcast
A promising solution for IBM is PCM, since it can store and erase data based solely on changes taking place in its atomic structure.
As mentioned, bottlenecks inherent to von Neumann architectures continue to limit data-intensive workloads such as AI and deep learning applications. PCMs could become excellent candidates to break past von Neumann constraints; however, they have some delicate design challenges.
When PCM crystals are heated, they physically soften, which can quickly erase stored information. Aggressive cooling wouldn't help much either: the hardened material slows the rate at which information can be stored and fetched, leaving the design right back where traditional von Neumann machines started, with inefficient energy usage and bottlenecked data traffic.
To further their research, IBM has found that PCMs composed of a germanium-antimony-tellurium (GST) alloy can withstand significant changes in temperature without losing their physical properties. The alloy requires little energy to switch and resists the unintended melting that can lead to data loss.
At IBM's Zurich facility, in-memory and neuromorphic computing are the leading approaches for applying PCMs to real applications. Going back to the fundamentals, in-memory computing seeks a memory that can itself perform tasks and arithmetic operations.
With PCMs as the building blocks, these memory devices could both store data and allow matrix-vector operations to be performed in place.
An example of these low-level computations is solving systems of linear equations. PCMs could also implement spike-based algorithms for neural networks in deep learning applications.
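As a rough conceptual illustration (not IBM's actual implementation), in-memory matrix-vector multiplication works by storing matrix entries as device conductances, applying the input vector as voltages, and reading the accumulated currents off each output line. A minimal Python simulation of that idea, with made-up conductance and voltage values:

```python
# Conceptual sketch of matrix-vector multiplication on a PCM crossbar.
# Matrix entries are stored as cell conductances G; the input vector is
# applied as voltages V. By Ohm's and Kirchhoff's laws, the current on
# each output line is I_i = sum_j G[i][j] * V[j], so the
# multiply-accumulate happens "inside the memory" rather than in a CPU.
# All values below are illustrative, not from IBM's hardware.

def crossbar_matvec(G, V):
    """Simulate the analog currents read out of a PCM crossbar."""
    return [sum(g * v for g, v in zip(row, V)) for row in G]

G = [[1.0, 2.0],
     [3.0, 4.0]]   # conductances encoding a 2x2 matrix
V = [1.0, 0.5]     # input voltages encoding the vector

print(crossbar_matvec(G, V))  # -> [2.0, 5.0]
```

Because every row's multiply-accumulate happens simultaneously in the analog domain, the data never travels to a separate processor, which is exactly the von Neumann traffic this article describes eliminating.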
IBM continues to dive deeper into exploratory multi-level PCM technology to help larger amounts of data travel smoothly through a new bridge of memory and processing.
Interested in other memory advancements? Learn more in the articles down below.
What’s Trending in Memory Technology in 2021?
Samsung Shoots for High-performance Computing with 2.5D High-bandwidth Memory
Is Software Going to Revolutionize Memory? IBM Digs into Software-defined Storage