
I recently had an interview at which I was asked to solve the following problem: you have a sorted array of integers, and you need to find whether there are 3 numbers that sum to 0. The brute force solution gives $O(n^3)$ complexity. Given that the array is sorted, we can iterate over 2 numbers and find the third one using binary search, which gives $O(n^2\log(n))$ complexity. In fact, there exists a solution with $O(n^2)$ complexity, and apparently this is the best we can get.
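For reference, the $O(n^2)$ solution I have in mind is the standard two-pointer scheme: fix the smallest element of the triple, then move two pointers toward each other from the ends of the remaining subarray. A minimal Python sketch (the function name is my own):

```python
def has_three_sum_zero(arr):
    """Return True if some three entries of the sorted list `arr` sum to 0."""
    n = len(arr)
    for i in range(n - 2):
        lo, hi = i + 1, n - 1
        while lo < hi:
            s = arr[i] + arr[lo] + arr[hi]
            if s == 0:
                return True
            if s < 0:
                lo += 1   # sum too small: advance the left pointer
            else:
                hi -= 1   # sum too large: retreat the right pointer
    return False
```

For each fixed `i` the two pointers make at most $n$ moves in total, so the whole scan is $O(n^2)$. For example, `has_three_sum_zero([-3, -1, 0, 1, 2])` returns `True` (via $-3 + 1 + 2 = 0$), while `has_three_sum_zero([-5, 1, 2])` returns `False`.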

I was wondering whether there is a way to show in advance that $O(n^2)$ is the best possible, and I came up with the following considerations. They are based on the postulate that if the solution of a problem contains $I$ bits of information, then the minimum number of operations required to obtain it is $\Omega(2^I)$. Indeed, if we have $n$ equally probable independent answers, i.e., $I=\log(n)$, the only way to solve the problem is to try each of them, i.e., to spend $O(n)$ operations. Here are the considerations applied to the problem above.
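To make the postulate concrete, here is a small numeric illustration (not part of the original argument): locating a target in an unstructured list by equality probes takes up to $n$ probes in the worst case, even though the answer itself carries only $\log_2(n)$ bits.

```python
import math

def probes_to_find(arr, target):
    """Count the equality probes a linear scan uses to locate `target`.

    With no structure to exploit, each failed probe rules out only one
    of the n possible positions.
    """
    for probes, x in enumerate(arr, start=1):
        if x == target:
            return probes
    return len(arr)

n = 1024
arr = list(range(n))                 # pretend the order carries no usable structure
worst = probes_to_find(arr, n - 1)   # target sits in the last position: 1024 probes
bits = math.log2(n)                  # the answer is only 10 bits of information
```

So the worst case costs $n = 2^I$ probes where $I = \log_2(n)$, which is the shape of the lower bound the postulate asserts.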

1) Let us imagine that the array is unsorted. In this case any combination of 3 indices can be a solution, and therefore the amount of information in the answer is $3\log(n)$.

2) Let us assume that the best sorting algorithm has complexity $O(n\log(n))$, so we can consider $\log(n\log(n))$ to be the amount of information it gives us about the array. In other words, the entropy of the array is decreased by this amount.

3) Subtracting one from the other, we get $3\log(n) - \log(n\log(n)) = 3\log(n) - \log(n) - \log(\log(n)) = 2\log(n) - \log(\log(n))$. This is the amount of information in the solution of the problem.

4) According to the postulate, the algorithm's complexity must be at least $\Omega(2^{2\log(n) - \log(\log(n))}) = \Omega(n^2/\log(n))$. This is consistent with the fact that there exists a solution with complexity $O(n^2)$, but not one with $O(n\log(n))$.

What do you think? Could you advise me on something similar to read?