Question

I have a sequence of integers {a1, a2, ..., an} and I want to divide it into the minimum number of "increasing subsequences". For example: let the sequence be {10, 30, 20, 40}; then the answer would be 2. In this case, the first increasing subsequence would be {10, 30, 40} and the second would be {20}. I am able to do it with an O(N^2) algorithm, but I'm particularly interested in an O(N log N) solution.


Answer

This can be done with a simple greedy algorithm.

Keep an array of the subsequences found so far, sorted in decreasing order by the value of the last integer in each subsequence. Initially the array is empty.

Take the next not-yet-processed element of the input sequence and find the subsequence whose last integer is smaller than this element and, among all such subsequences, is the largest. Use binary search here (the last integers are sorted) to get an overall complexity of O(N log N). If no such subsequence exists, add a new one at the end of the array. Append the element to the chosen subsequence; note that this keeps the array of last integers sorted. Repeat while there are unprocessed elements in the sequence.
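Assuming "increasing" means strictly increasing, the greedy described above can be sketched in Python as follows (the function name `min_increasing_subsequences` is mine; the last elements are stored negated so the standard `bisect` module, which expects ascending order, can do the binary search):

```python
from bisect import bisect_right

def min_increasing_subsequences(seq):
    """Greedily partition seq into the fewest strictly increasing subsequences."""
    subsequences = []  # each entry is one increasing subsequence
    neg_tails = []     # negated last elements, kept in ascending order,
                       # i.e. the real last elements are in decreasing order
    for x in seq:
        # First index whose last element is strictly smaller than x;
        # since the last elements are sorted decreasingly, this is also
        # the largest such last element.
        i = bisect_right(neg_tails, -x)
        if i == len(neg_tails):
            # No existing subsequence can take x: start a new one.
            subsequences.append([x])
            neg_tails.append(-x)
        else:
            # Extend the chosen subsequence; its last element becomes x,
            # which keeps neg_tails sorted.
            subsequences[i].append(x)
            neg_tails[i] = -x
    return subsequences

print(min_increasing_subsequences([10, 30, 20, 40]))
# → [[10, 30, 40], [20]]
```

For the example from the question, the function returns two subsequences, matching the expected answer.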

License: CC BY-SA with attribution