Algorithmic idea using a greedy strategy
Question
So I just finished taking a test and thought I would ask you guys about a question that stumped me. I was asked to come up with an algorithmic idea, using a greedy strategy, for finding two entries in an ascending array of n numbers such that A[i] + A[j] = T
, where T is a given target value: we want to see whether any two numbers add up to it. We are to keep it to O(n)
. I wasn't sure how to do this properly.
What I suggested was to check whether A[i] + A[j] > T;
if it wasn't, move on to A[i + 1] + A[j + 1],
and so on until I found a sum greater than T. Once I found such a sum, I would go from A[0] to A[j - 1] and check whether any of those entries produced a match. I don't think my idea is O(n), though. Any thoughts?
Solution
The idea is that, for each big value A[i] starting at the end of the array, you search for the smallest value of j such that A[i] + A[j] = T, if such a j exists. If the array were unordered, this would be an O(n^2) loop nest.
The greedy part is that you never need to "go backward" with the value of j, because the values of A are ascending. That is, when you decrease i, any remaining correct value of j must be at least as large as the current one, so each index moves in only one direction and the total work is O(n).
In code:
i = n - 1;                      /* index of the current "big" value   */
j = 0;                          /* index of the current "small" value */
while (j <= i) {                /* use j < i if the indices must differ */
    if (A[i] + A[j] < T)
        j++;                    /* sum too small: advance j */
    else if (A[i] + A[j] > T)
        i--;                    /* sum too large: retreat i */
    else {
        printf("A[%d] + A[%d] = %d\n", i, j, T);
        break;
    }
}
OTHER TIPS
I'm not sure if this qualifies as exactly a greedy strategy, but the standard solution to this problem is to maintain a left index L and a right index R. Initially these are set to 1 and n. If A[L] + A[R] < T then increment L by 1. If A[L] + A[R] > T then decrease R by 1. Once you reach L = R, if you haven't found a solution then no solution exists.