How important are many of the new GPU algorithms that are being published? [closed]

I have done some GPU programming in the past and realized that, unlike with typical sequential algorithms, where for example you may have an O(n log n) solution to a problem which you can then implement efficiently, on GPUs it's more like you start from a sequential algorithm and then make it faster by applying parallelization wherever possible.
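Just to make concrete what I mean by "applying parallelization", here is a minimal sketch of my own in CUDA, using a made-up element-wise example (not taken from any particular paper): the sequential loop does O(n) work on one core, and the kernel does the same O(n) work spread over many threads, so only the constant factor changes.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Sequential version: one core walks the whole array.
    void scale_cpu(const float *in, float *out, float k, int n) {
        for (int i = 0; i < n; ++i)
            out[i] = k * in[i];
    }

    // GPU version: the same O(n) work, but each thread handles one element.
    __global__ void scale_gpu(const float *in, float *out, float k, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)                        // guard threads past the end of the array
            out[i] = k * in[i];
    }

    int main() {
        const int n = 1 << 20;
        float *in, *out;
        cudaMallocManaged(&in,  n * sizeof(float));   // unified memory, for brevity
        cudaMallocManaged(&out, n * sizeof(float));
        for (int i = 0; i < n; ++i) in[i] = float(i);

        int threads = 256;
        int blocks  = (n + threads - 1) / threads;    // enough blocks to cover n
        scale_gpu<<<blocks, threads>>>(in, out, 2.0f, n);
        cudaDeviceSynchronize();

        printf("out[42] = %f\n", out[42]);            // expect 84.0
        cudaFree(in);
        cudaFree(out);
        return 0;
    }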

Most papers on GPU algorithms reach conclusions of the kind "we can achieve a 3x/4x/etc. speed-up".

For example:

http://www.sciencedirect.com/science/article/pii/S0022000012000992

So let's suppose we have a sequential algorithm that runs in O(2^n) time. Does it achieve anything to create a GPU algorithm that speeds it up by, let's say, a factor of 10?

It's like saying you achieved a time complexity of O(2^n / 10), which for a big n is still essentially O(2^n), so the achievement looks worthless.
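To spell out why the constant disappears (my own working, nothing from the linked paper): by the definition of big-O, any fixed divisor is absorbed into the hidden constant.

    % Dividing by 10 only changes the constant c in "f(n) <= c * g(n)
    % for all n >= n0"; the growth class itself is unchanged.
    \[
      \frac{2^{n}}{10} \;=\; \tfrac{1}{10}\cdot 2^{n} \;\in\; \Theta\!\bigl(2^{n}\bigr),
      \qquad\text{so}\qquad
      O\!\left(\tfrac{2^{n}}{10}\right) \;=\; O\!\left(2^{n}\right).
    \]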

The same applies to all kinds of complexities, I guess.

So I'm not sure exactly how important these GPU algorithms are when they usually only reduce the running time by a constant factor.

gpu


Solution 1:[1]

In the sense of conceptual breakthroughs in computer science, you are correct: constant factors are not important. But in terms of actual practical use, a factor of 10 is a huge deal! The reason is that n is often fixed or growing very slowly, and the difference between 10 hours of running time and 1 hour, in terms of feedback on what you're doing, is immense.

To take an example from my own experience: in particle physics you tend to deal with data samples whose size is effectively fixed on timescales of several months - that's how long it takes to get a new batch of events from your detector, through the processing pipeline, and into the hands of a graduate student. Now you want to run your analysis on that data several hundred times, with minor changes between each run - especially debugging changes; you want rapid feedback. The difference between getting that feedback ten times a day or once a day can easily be the difference between graduating or not.

In the specific case of 2^n, admittedly, you'd better hope your data size doesn't increase at all. But for more common cases like n^2, a factor of 10 can be quite useful - three times the data for the same processing time is pretty cool.
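For the record, the arithmetic behind that "three times" figure is a back-of-the-envelope sketch assuming the running time is T(n) = c n^2 (the numbers are illustrative):

    % With a 10x speedup, the largest input m that fits in the same
    % wall-clock budget as the original input n satisfies:
    \[
      \frac{c\,m^{2}}{10} \;=\; c\,n^{2}
      \quad\Longrightarrow\quad
      m \;=\; \sqrt{10}\,n \;\approx\; 3.16\,n .
    \]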

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Rolf Andreassen