News

Data Buffers to the Rescue for Bandwidth-Burdened Server and Cloud Applications

September 11, 2020 by Jake Hertz

This week, Renesas released its newest product: a data buffer for DDR5 aimed at empowering a new generation of high-performance computing applications.

Recent developments in real-time analytics, ML/AI, and high-performance computing have pushed the limits of server memory and bandwidth requirements. This trend is only expected to continue as the IoT produces massive amounts of data and cloud computing and AI applications grow in ubiquity.

 

Nielsen’s Law of internet bandwidth. Image used courtesy of the Nielsen Norman Group

 

Many solutions have been proposed to address this growing bandwidth demand. One of the leading approaches engineers have adopted is the use of data buffers.

 

What are Data Buffers?

Put simply, a data buffer is a memory component used to temporarily hold data while it is being moved from one place to another.

Much like caches, buffers are used to address performance slowdowns that stem from memory access times. Specifically, a processor runs much faster than its attached I/O peripherals. This mismatch is undesirable because a significant portion of processor time ends up wasted waiting for the peripherals to respond.

This is where buffers come in handy.

 

Centralized vs. distributed buffers in DDR4 LRDIMM and DDR3 LRDIMM. Image used courtesy of ITD and Samsung

 

Buffers, which are generally located physically on the memory module, hold data that is to be forwarded to I/O devices. This allows the CPU to continue processing requests in the meantime and later retrieve the data from the buffer, which is considerably faster than accessing the RAM itself.
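
To make the idea concrete, here is a minimal sketch in Python (purely illustrative and not tied to any particular memory hardware): a bounded FIFO queue stands in for the buffer, a fast producer stands in for the CPU, and a deliberately slow consumer stands in for an I/O peripheral. All names and timings here are assumptions chosen for the example.

```python
# Illustrative sketch only: a FIFO buffer decoupling a fast producer (the
# "CPU") from a slow consumer (the "I/O peripheral"). The producer deposits
# data into the buffer and moves on immediately; the consumer drains it at
# its own pace.
import queue
import threading
import time

buf = queue.Queue(maxsize=64)      # bounded buffer, like fixed on-module storage

def cpu_producer(n_items):
    for i in range(n_items):
        buf.put(f"word-{i}")       # returns as soon as the item is buffered
    buf.put(None)                  # sentinel: nothing more to send

def slow_io_consumer():
    while True:
        item = buf.get()
        if item is None:
            break
        time.sleep(0.01)           # simulate a slow peripheral "write"

t = threading.Thread(target=slow_io_consumer)
t.start()
start = time.time()
cpu_producer(32)
print(f"producer finished in {time.time() - start:.3f} s (it did not wait for the I/O)")
t.join()
```

Because the buffer absorbs the burst, the producer finishes almost immediately while the consumer keeps draining data in the background; the same decoupling is what lets a CPU hand data off and keep working.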

 

Benefits of Data Buffers

The original idea of the buffer was to prevent data congestion between an incoming port and an outgoing port during a transfer. Since their adoption, buffers have come to serve many purposes and provide many benefits. These include: 

  • Improving capacity by allowing one controller to connect to many DRAM devices
  • Improving bandwidth by cleaning up system signal integrity, enabling higher operating data rates
  • Improving reliability by allowing buffered data to be checked with ECC before being sent to the CPU (a simplified illustration follows this list)
  • Reducing demand on the CPU by performing logical operations on data before it is sent to the CPU
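
As a rough illustration of the ECC point above, the toy Hamming(7,4) sketch below (in Python) shows how redundant check bits let buffered data be verified, and a single flipped bit corrected, before that data is handed onward. Real DDR5 modules use considerably wider and more elaborate ECC schemes; the function names and bit layout here are illustrative assumptions, not part of any DDR specification.

```python
# Toy Hamming(7,4) single-error-correcting code: 4 data bits protected by
# 3 parity bits. Illustrative only; real memory ECC uses wider code words.

def hamming74_encode(d1, d2, d3, d4):
    p1 = d1 ^ d2 ^ d4                      # covers codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4                      # covers codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4                      # covers codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]    # codeword positions 1..7

def hamming74_check(code):
    c = list(code)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3        # 0 = clean, else 1-based error position
    if syndrome:
        c[syndrome - 1] ^= 1               # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]], syndrome

word = hamming74_encode(1, 0, 1, 1)        # data "buffered" with check bits attached
word[5] ^= 1                               # simulate a single-bit error while buffered
data, pos = hamming74_check(word)
print(data, "- error corrected at position", pos)   # [1, 0, 1, 1] ... position 6
```

The point is not the specific code but the workflow: the data sits in the buffer with its check bits, is verified (and, if needed, corrected) there, and only then travels on to the CPU.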

 

A Recent Example: Renesas’ New Data Buffer Hits the Market 

Recognizing the growing need for and importance of data buffers in data center applications, Renesas recently released its newest product.

The DDR5 data buffer, 5DB0148, enables significantly higher speeds and lower latency for load-reduced dual inline memory modules (LRDIMMs). In fact, Renesas claims that the first generation of DDR5 LRDIMMs based on Renesas components enables a bandwidth increase of more than 35 percent over DDR4 LRDIMMs operating at 3200 MT/s.

 

Graphic of the new 5DB0148. Image used courtesy of Renesas

 

Especially beneficial for heavily loaded systems, the buffer relies on a combination of capacitive load reduction, data alignment, and signal recovery techniques. With this new buffer, server motherboards with a large number of memory channels and complex routing topologies can operate at maximum speed, even when fully populated with high-density memory.

 

Buffers May Alleviate Bandwidth Demands

Already available for sampling, the 5DB0148 will hopefully help servers keep pace with increasing bandwidth requirements and enable the coming generations of AI/ML applications. 

On this point, Renesas' VP of data center business Rami Sethi explains, “Our DDR5 data buffers are critical to enabling high-performance DRAM solutions such as LRDIMMs, alternative high-density modules, and heterogeneous memory solutions, all of which empower a new generation of high-performance computing applications.”