[Solved]: Maxima of diagonals in a column-wise and row-wise sorted matrix

Problem Detail: Let $\{a_i\}$ and $\{b_i\}$ be non-decreasing sequences of non-negative integers. How fast can one find $$c_j=\max_{0\leq i<j}\{a_i+b_{j-i-1}\}$$ for all $0\leq j\leq n-1$? Naively, it takes $O(n^2)$ time, but I'm hoping monotonicity can help here. It's easy to observe that $\{c_i\}$ is also non-decreasing.

If we consider the matrix $M$ where $M_{i,j} = a_i+b_j$, then it is a matrix sorted in both the row and the column direction, and we are searching for the maximum element on every diagonal. However, for an arbitrary column-wise and row-wise sorted matrix, this problem requires $\Omega(n^2)$ time. Proof: let all the entries below the main diagonal be $\infty$, and let the entries on the $k$th diagonal be random numbers from the interval $(k,k+1)$. Reading any entry then provides no information about any other entry, so all $\Omega(n^2)$ entries must be read.

Edit: This problem is much harder than I anticipated. We can model it as a convolution over the $(\min,+)$ semiring (take the dual and search for the min instead of the max), which can be solved in $O\left(\frac{n^2}{\log n}\right)$ time according to an answer by Ryan Williams on MathOverflow. That approach does not use the fact that the sequences are non-decreasing, though.
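To make the definitions concrete, here is a minimal sketch of the naive $O(n^2)$ baseline described above, together with the matrix view $M_{i,j}=a_i+b_j$. The function name `max_plus_conv` is mine, not from the question; $c_0$ is an empty maximum and is reported as `None`.

```python
def max_plus_conv(a, b):
    """Naive O(n^2) baseline: c[j] = max over 0 <= i < j of a[i] + b[j-i-1].

    c[0] is an empty maximum, reported here as None.
    """
    n = len(a)
    c = []
    for j in range(n):
        terms = [a[i] + b[j - i - 1] for i in range(j)]
        c.append(max(terms) if terms else None)
    return c


# Example with non-decreasing inputs (values chosen arbitrarily).
a = [0, 1, 3, 3]
b = [2, 2, 4, 7]
c = max_plus_conv(a, b)

# Equivalent matrix view: M[i][j] = a[i] + b[j] is sorted along every row
# and every column, and c[k] is the maximum over the anti-diagonal
# i + j = k - 1.
M = [[x + y for y in b] for x in a]
for k in range(1, len(a)):
    assert c[k] == max(M[i][k - 1 - i] for i in range(k))
```

Note that $c_1, c_2, \dots$ come out non-decreasing, as observed in the question.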

Asked By : Chao Xu

Answered By : jbapple

Great question! "Distance Transforms of Sampled Functions" by Felzenszwalb and Huttenlocher shows how to compute this in $O(n \lg n)$ time. "Necklaces, Convolutions, and X+Y" by Bremner et al. gives an $O\!\left(\frac{n^2}{\lg^2 n/(\lg\lg n)^3}\right)$ algorithm for this problem on the real RAM and an $O(n\sqrt{n})$ algorithm in the nonuniform linear decision tree model.

Question Source: http://cs.stackexchange.com/questions/18211