News

AI-Accelerating Software Gives FPGAs a Leg Up on GPUs

June 25, 2020 by Jake Hertz

Xilinx recently tapped a startup for its AI accelerating software platform. In what ways might this platform add extra benefits for FPGAs in contrast to GPUs?

It’s no secret that artificial intelligence is advancing rapidly and making its way into virtually every field. AI imposes a unique workload on its compute platforms, one where parallelization is crucial. For this reason, GPUs, which generally boast over 1000 cores, have become preferable to CPUs when running neural networks, according to Medium contributor Connor Shorten.

Xilinx posits that in some scenarios, FPGAs are more useful than GPUs on account of their massive parallelism and fast re-programmability. In an interview with AAC, founder and CEO of Mipsology Ludovic Larzul claimed that FPGAs can offer three to four times the performance of GPUs. This makes FPGAs an especially attractive platform for artificial intelligence applications.

While FPGAs are a useful solution in such scenarios, the seeming tradeoff here is that programming an FPGA requires knowledge of esoteric hardware description languages (HDLs) such as VHDL or Verilog.

Enter Mipsology's Zebra platform. Mipsology claims that its Zebra software is a neural network accelerator that can be deployed to FPGAs without needing HDL specialists and without needing to change your current setup. 

Zebra: A Platform That Makes FPGAs Accessible

Zebra was conceived as a means of replacing the CPUs/GPUs in AI compute platforms while making the change invisible to the AI scientist. This means that for them, there would be no changes to their work. As far as these scientists are concerned, they are still working on a GPU-like device; it just happens to be five times faster, according to Mipsology.

Stack on GPU vs. stack on Zebra. Image used courtesy of Xilinx

Larzul says that while GPUs and FPGAs are built on the same underlying technology, GPUs typically pack in more transistors. Despite the processing advantages of more transistors, GPUs also face shorter lifespans, higher heat output (and greater demand for cooling), and higher power consumption. 

"Zebra-backed FPGAs give a lot of freedom [to designers]," Larzul explains. "Transitioning is easy and it can fit in systems with many constraints. Zebra . . . creates the possibility to deploy without changing the framework. No extra R&D work to redesign the neural networks is needed."

The company website explains that designers can deploy the software without expertise in underlying hardware technology or compilation tools. As Larzul outlined, this means programmers won't need to change the neural network, framework, training, or application. 
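To make that claim concrete: the idea is that the application's inference code never references the accelerator at all, so swapping the compute backend underneath it requires no changes to the model or the surrounding code. The sketch below is purely illustrative — Zebra's actual integration is proprietary, and the model, function names, and `ACCEL_BACKEND` variable here are hypothetical — but it shows the principle the article describes.

```python
import os
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

class TinyNet:
    """Stand-in for a trained network (weights here are hypothetical)."""
    def __init__(self, w1, w2):
        self.w1, self.w2 = w1, w2

    def forward(self, x):
        # A simple two-layer forward pass: (batch, 4) -> (batch, 2)
        return relu(x @ self.w1) @ self.w2

def run_inference(model, batch):
    # The application calls the same API regardless of what hardware
    # executes it. A drop-in accelerator would intercept *below* this
    # line; everything above it (network, framework, application code)
    # stays unchanged. Here we just run on NumPy either way.
    backend = os.environ.get("ACCEL_BACKEND", "cpu")  # e.g. "cpu" or "fpga"
    return model.forward(batch), backend

rng = np.random.default_rng(0)
net = TinyNet(rng.standard_normal((4, 8)), rng.standard_normal((8, 2)))
out, backend = run_inference(net, rng.standard_normal((3, 4)))
print(out.shape, backend)  # prints: (3, 2) cpu
```

The design point is that backend selection lives outside the application (here, an environment variable), which is what lets the AI scientist keep treating the system as a GPU-like device.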

For this reason, AAC contributor Gary Elinoff recently included the Zebra platform in his roundup of resources that are making FPGAs more accessible to non-FPGA specialists than ever before.

 

Xilinx Taps Zebra for Its FPGAs

Xilinx recently announced that it is shipping Zebra with the latest build of its Alveo U50 cards for the data center. This news, however, doesn’t mark the first time the two companies have teamed up. Xilinx already has a list of FPGAs with Zebra deployed, including its Alveo U200 and Alveo U250 boards. 

Alveo U250 data center accelerator card, which now deploys Zebra. Image (modified) used courtesy of Xilinx

Larzul says, “Zebra delivers the highest possible performance and ease-of-use for inference acceleration. With the Alveo U50, Xilinx and Mipsology are providing AI application developers with a card that excels across multiple apps and in every development environment.”

 

Implications of an FPGA That Outpaces GPUs

Making FPGAs easier to use and more accessible will likely have profound implications for data centers and for AI applications more broadly. In AI workloads, FPGAs generally run faster and consume less power than conventional GPUs, which in turn means fewer cooling resources and less maintenance in data centers. 

As far as Mipsology goes, there is no sign of the company slowing down anytime soon. It already has 12 patents pending and continues to work closely with Xilinx. Its software is also compatible with third-party accelerators, including some from Western Digital and Advantech.

Do you work with FPGAs or GPUs for AI-based applications? What are your greatest design challenges and how do you overcome them? Share your experience in the comments below.