Problem Detail: So I have the following pseudo-code:
Function(n):
    for (i = 4 to n^2):
        for (j = 5 to floor(3*i*log(i))):
            // Some math that executes in constant time
So far, I know that to find this I will be calculating $\sum_{i=4}^{n^2}\sum_{j=5}^{3i \log_{2} i} C$ where $C$ is a constant, but I am completely lost as to how to proceed past the first summation which, if I'm not mistaken, will give me $3C \cdot \left(\sum_{i=4}^{n^2} i \log_{2} i - 5(n^2 - 4)\right)$, but from here I'm lost. I don't need the exact running time, just the asymptotic complexity. All help is greatly appreciated! I realize that this might be a duplicate, but I haven't been able to find a nested for loop problem of this nature anywhere…
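As a sanity check on the summation bounds, the loops can be translated directly into Python and the constant-time body counted (a sketch added here for illustration; `count_ops` is a made-up name, and `log` is taken as base 2 to match the question):

```python
import math

def count_ops(n):
    """Count how many times the constant-time body executes
    for the nested loops in the question."""
    count = 0
    for i in range(4, n * n + 1):
        upper = math.floor(3 * i * math.log2(i))
        for j in range(5, upper + 1):
            count += 1          # stands in for the constant-time math
    return count
```

For example, with `n = 2` the outer loop runs only for `i = 4`, the inner bound is `floor(3 * 4 * 2) = 24`, and the body runs 20 times.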
Asked By : furlessxp
Answered By : Paresh
$$F(n) = \sum\limits_{i = 4}^{n^2} \sum\limits_{j = 5}^{3i\log i} C = C\sum\limits_{i = 4}^{n^2}(3i\log i - 4) = 3C\sum\limits_{i = 4}^{n^2} i \log i - \Theta(n^2)$$

For asymptotic analysis, we can start the summation from $i = 1$ instead of $4$. Let

$$\begin{align}
T(n) &= \sum\limits_{i = 1}^{n^2} i\log i = \sum\limits_{i = 1}^{n^2}\log i^i \\
&= \log 1^1 + \log 2^2 + \log 3^3 + \dots + \log (n^2)^{n^2} \\
&\le n^2 \cdot \log (n^2)^{n^2} \quad \text{(there are $n^2$ terms)} \\
&= 2n^4 \log n \\
&= O(n^4 \log n)
\end{align}$$

Now, since $f(x) = x\log x$ is a continuous increasing function on $x \in [1, \infty)$, you can verify that:
$$\int\limits_{1}^{n^2} x\log x \, dx \le \sum\limits_{x = 1}^{n^2} x\log x$$

(changing $i$ to $x$ simply because $x$ looks better when integrating). According to Wolfram,

$$\int\limits_{1}^{n^2} x\log x \, dx = \frac{1}{4}\left(n^4 \log\left(\frac{n^4}{e}\right) + 1\right) = n^4 \log\left(\frac{n}{\sqrt[4]{e}}\right) + \frac{1}{4}$$

Therefore, $\sum\limits_{x = 1}^{n^2} x\log x = \Omega(n^4\log n)$. Combining the upper and lower bounds, we have $T(n) = \Theta(n^4\log n)$. Overall, $F(n) = \Theta(n^4\log n) - \Theta(n^2) = \Theta(n^4 \log n)$. So, your loops have complexity $\Theta(n^4 \log n)$.

Remarks:
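The $\Theta(n^4 \log n)$ conclusion can be checked numerically: the ratio $T(n) / (n^4 \log_2 n)$ settles toward a constant as $n$ grows (a quick Python check, not part of the original answer; base-2 logs are used throughout):

```python
import math

def T(n):
    # T(n) = sum_{i=1}^{n^2} i * log2(i); the i = 1 term is 0
    return sum(i * math.log2(i) for i in range(1, n * n + 1))

# The ratio creeps up toward 1, consistent with T(n) = Theta(n^4 log n)
ratios = [T(n) / (n**4 * math.log2(n)) for n in (8, 16, 32)]
```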
1. You can show the lower and upper bounds in a similar way by observing that: $\int\limits_{1}^{n^2} x\log x \, dx \le \sum\limits_{x = 1}^{n^2} x\log x \le \int\limits_{0}^{n^2}(x+1)\log(x+1) \, dx$
2. For the above inequalities, you can use the method of Riemann sums (the left and right Riemann sums give the two bounds). Draw unit-width rectangles under an increasing function to see these bounds.
3. I have done some implicit simplifications, such as ignoring the constant $3C$ factor, or starting the summation from $1$ instead of $4$, since we are dealing with $\Theta$. I have also made the common but slight abuse of notation when equating or subtracting $\Theta$ terms, without affecting the answer.
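The sandwich inequality in remark 1 can be verified numerically using the closed forms of the two integrals (a sketch added here for illustration, with base-2 logs; $\int x \ln x \, dx = \tfrac{x^2}{2}\ln x - \tfrac{x^2}{4}$ gives both antiderivatives):

```python
import math

def partial_sum(N):
    # sum_{x=1}^{N} x * log2(x)
    return sum(x * math.log2(x) for x in range(1, N + 1))

def lower_integral(N):
    # integral_1^N x*log2(x) dx = (N^2/2 * ln N - N^2/4 + 1/4) / ln 2
    return (N * N / 2 * math.log(N) - N * N / 4 + 0.25) / math.log(2)

def upper_integral(N):
    # integral_0^N (x+1)*log2(x+1) dx = integral_1^{N+1} u*log2(u) du
    M = N + 1
    return (M * M / 2 * math.log(M) - M * M / 4 + 0.25) / math.log(2)

N = 10_000
assert lower_integral(N) <= partial_sum(N) <= upper_integral(N)
```

Both integrals grow as $\Theta(N^2 \log N)$, which with $N = n^2$ is exactly the $\Theta(n^4 \log n)$ bound above.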
Question Source : http://cs.stackexchange.com/questions/9714