Which is faster, FPGA or GPU?

Asked 1 year ago, Updated 1 year ago, 84 views

Which is better, a GPU or an FPGA, for implementing large-scale (or small-scale) image processing, machine learning, or artificial intelligence?
Personally, I don't think even an ultra-high-performance FPGA can beat a GPU in processing speed. Pipelining seems difficult, and I suspect there are physical delays, making it hard work.
What do you think? I'd love to hear your opinions.

gpu fpga

2022-09-29 22:46

2 Answers

It depends on the case.

First, FPGAs and GPUs come in many variants, just as CPUs range from low to high performance. Moreover, their purposes differ: an FPGA is meant to implement programmable circuits, while a GPU is meant to perform computations for graphics (although it can also be used for GPGPU). There are also many different ways to write the source code that each kind of hardware runs.
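As a rough illustration of the GPGPU programming model mentioned above (my own sketch, not part of the answer): a GPU applies the same operation to many data elements at once. The hypothetical NumPy snippet below mimics that data-parallel style on a CPU with a simple per-pixel brightness operation:

```python
import numpy as np

# Hypothetical sketch: in the data-parallel (GPGPU) style, one operation
# is applied uniformly to every pixel at once, rather than looping.
def brighten(image, gain):
    # Widen to int32 first so multiplication cannot overflow uint8,
    # then clamp back into the valid 0..255 pixel range.
    out = image.astype(np.int32) * gain
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.array([[10, 200],
                [100, 50]], dtype=np.uint8)
print(brighten(img, 2))  # [[ 20 255], [200 100]]
```

On a real GPU each pixel's multiply-and-clamp would run on its own thread; this is the kind of regular, uniform workload where GPUs excel.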

Therefore, the answer to "Which is faster?" changes depending on the problem setting and the actual system design. In other words, the premise of this question is too broad.

Speed comparisons between FPGAs and GPUs are also a topic of academic papers, and you will find a number of results by searching https://scholar.google.co.jp with the keywords "FPGA GPU". For example, Asano, S., Maruyama, T., Yamaguchi, Y., "Performance comparison of FPGA, GPU and CPU in image processing" (2009) shows that whether the FPGA or the GPU is faster depends on the problem. Other studies use FPGAs and GPUs in combination.

Finally, from another point of view, there are attempts to build GPUs inside FPGAs, so the two cannot be completely separated. For example, https://github.com/jbush001/NyuziProcessor is a GPGPU project that runs on an FPGA.


2022-09-29 22:46

An FPGA is an electronic circuit on which any computational circuit can be implemented programmably. Even a GPU can be implemented on an FPGA. If you reproduce a GPU and then tune it at the electronic-circuit level, it can be faster than an (existing) GPU, depending on the algorithm running on it.
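As a rough sketch of what circuit-level tuning can buy (my own illustration, not from the answer): FPGA designs are often deeply pipelined, so each stage does a small amount of work every clock cycle, and after an initial fill latency one finished result emerges per cycle. A hypothetical Python simulation of a three-stage pipeline:

```python
# Hypothetical sketch (not real HDL): simulate an FPGA-style pipeline
# where each stage holds one register and all stages advance together
# on every "clock cycle".
def run_pipeline(samples, stages):
    n = len(stages)
    regs = [None] * n              # one pipeline register per stage
    results = []
    # Feed the inputs, then n bubbles to flush the pipeline.
    for x in list(samples) + [None] * n:
        for i in reversed(range(n)):   # shift from the last stage back
            src = regs[i - 1] if i > 0 else x
            regs[i] = stages[i](src) if src is not None else None
        if regs[-1] is not None:       # a result leaves the pipeline
            results.append(regs[-1])
    return results

# Three tiny combinational stages; latency is 3 cycles, but once the
# pipeline is full, one finished value comes out per cycle.
stages = [lambda v: v + 1, lambda v: v * 2, lambda v: v - 3]
print(run_pipeline([1, 2, 3], stages))  # [1, 3, 5]
```

This per-cycle throughput, tailored exactly to one algorithm, is what a hand-tuned circuit offers; the trade-off is the design effort the question alludes to.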

I think the question comes down to whether it is worth doing so. I am not yet aware of computational algorithms that gain a real advantage over those built on the GPGPU architecture (such as deep learning); such algorithms may well exist, but I think they are still at the research stage.


2022-09-29 22:46



© 2024 OneMinuteCode. All rights reserved.