**Problem Detail:** From the point of view of asymptotic behavior, what is considered an "efficient" algorithm? What is the standard/reason for drawing the line at that point? Personally, I would think that anything I might naively call "sub-polynomial", i.e. $f(n) = o(n^2)$ such as $n^{1+\epsilon}$, would be efficient, and anything which is $\Omega(n^2)$ would be "inefficient". However, I've heard anything of any polynomial order called efficient. What's the reasoning?

###### Asked By : Robert S. Barnes

#### Answered By : adrianN

That depends on context. In theoretical computer science, usually every polynomial-time algorithm is considered 'efficient'. In approximation algorithms, for example, a runtime of $n^{1/\epsilon^{1/\epsilon}}$ would be considered efficient, even though it won't be usable in practice for any reasonable value of $\epsilon$. An algorithm for SAT that runs in $n^{2^{100}}$ would be an amazing breakthrough.

In classic algorithmics, i.e. algorithms from the '80s and before, runtimes below $n^3$ or so (think matrix multiplication, min-cost matching, flows, linear programming) are considered efficient. They are still considered efficient by most people, I'd say. Of course an $n^2$ algorithm is not considered efficient if an $n\log n$ algorithm is known, as for sorting, for example.

Nowadays there is a trend towards sublinear algorithms or streaming algorithms that are able to deal with terabytes of data. Try using matrix multiplication to compute the PageRank of all pages in Google's index. That won't work.

Of course, while certainly useful, the asymptotic runtime of an algorithm doesn't tell the whole story. There are algorithms that have good asymptotic runtime, but constants so huge that they effectively can't be used. Ever. Lipton calls them Galactic Algorithms. Robert Sedgewick even states that worst-case bounds are "often useless for prediction, often useless for guarantees" and that "worst-case analysis is useless for predicting performance" in his talk Putting the Science Back Into Computer Science.
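The gap between these growth rates is easy to see numerically. A minimal sketch (in Python, which is not part of the original answer) comparing raw operation counts for $n\log n$, $n^2$, and $n^3$, with all constants ignored as in asymptotic analysis:

```python
import math

# Compare operation counts for common growth rates at increasing input sizes.
# These are raw counts, not timings; constant factors are ignored.
for n in [10**3, 10**6, 10**9]:
    n_log_n = n * math.log2(n)
    n_sq = n**2
    n_cube = n**3
    print(f"n = {n:>13,}:  n log n ≈ {n_log_n:.1e}   "
          f"n^2 = {n_sq:.1e}   n^3 = {n_cube:.1e}")
```

At $n = 10^9$ (roughly the scale of a web index), $n\log n$ is around $3\times10^{10}$ operations, which is feasible on modern hardware, while $n^2 = 10^{18}$ and $n^3 = 10^{27}$ are hopeless, which is why "polynomial" alone stops being a useful notion of efficient at that scale.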

###### Best Answer from StackOverflow

Question Source : http://cs.stackexchange.com/questions/10472
