Question

So the problem is to determine whether x appears in a matrix whose rows and columns are each sorted in ascending order.

Example:

1 2 3
4 5 6
7 8 9

I'm interested in the time complexity of the divide-and-conquer solution to this problem. I've googled it but only found the O(m+n) solution, and also this: Search a sorted 2D matrix. But a comment on the answer there says that "T(R) = 3T(R/4)", so why is f(R) = 0?

The question is: what is the complexity of the divide-and-conquer solution according to the master theorem? And in T(R) = 3T(R/4) + f(R), what should f(R) be?

if it helps, this is my divide and conquer solution :

bool find_x_in_matrix(int x, int mat[3][3], int top_row, int top_col, int bot_row, int bot_col) {
    // Empty region: x cannot be here.
    if (top_row > bot_row || top_col > bot_col)
        return false;

    int mid_row = (top_row + bot_row) / 2;
    int mid_col = (top_col + bot_col) / 2;

    if (mat[mid_row][mid_col] == x)
        return true;

    if (mat[mid_row][mid_col] > x) {
        // x is smaller than the middle element, so it cannot be in the
        // bottom-right quadrant; search the other three.
        return find_x_in_matrix(x, mat, top_row, mid_col, mid_row - 1, bot_col) ||     // top-right
               find_x_in_matrix(x, mat, top_row, top_col, mid_row - 1, mid_col - 1) || // top-left
               find_x_in_matrix(x, mat, mid_row, top_col, bot_row, mid_col - 1);       // bottom-left
    }

    // x is larger than the middle element, so it cannot be in the
    // top-left quadrant; search the other three.
    return find_x_in_matrix(x, mat, top_row, mid_col + 1, mid_row, bot_col) ||         // top-right
           find_x_in_matrix(x, mat, mid_row + 1, top_col, bot_row, mid_col) ||         // bottom-left
           find_x_in_matrix(x, mat, mid_row + 1, mid_col + 1, bot_row, bot_col);       // bottom-right
}

edit to clarify solution :

algorithm:

1. Compare the middle element of the matrix to the search value.
2. If the value equals m(i,j), return true.
3. If the value is smaller, search for it in the top-right, top-left, and bottom-left quadrants of the matrix.
4. If the value is larger, search for it in the top-right, bottom-right, and bottom-left quadrants of the matrix.


Solution 2

I don't know if this is right, but I'm applying the master theorem to

T(R) = 3T(R/4) + Θ(1)

Here a = 3 and b = 4, so R^(log4(3)) ≈ R^0.79. Since f(R) = Θ(1) grows polynomially slower than R^(log4(3)), this is actually case 1 of the master theorem, not case 2 (case 2 would require f(R) = Θ(R^(log4(3)) * log^k(R)), which a constant is not). The time complexity is therefore

T(R) = Θ(R^(log4(3))) ≈ Θ(R^0.79)

with no extra log factor.

OTHER TIPS

The recurrence relation

T(R) = 3T(R/4) + c

holds because at each step you discard one quadrant, a quarter of the search space, and recurse on the remaining three quadrants, each of size R/4.

According to the Wikipedia article on the master theorem, f(n) is the cost of the work done outside the recursive calls, which includes the cost of dividing the problem and the cost of merging the solutions to the subproblems.

Here I think that work is just a constant: f(n) may not be zero, but it is a constant value (one comparison plus a little index arithmetic) and doesn't depend on the size of the search space.

EDIT:

I'm not sure how to use the master theorem for this, but if we unroll the recurrence relation we get

T(n) = 3^2 * T(n/4^2) + c(1 + 3)

Going ahead,

T(n) = 3^k * T(n/4^k) + c(3^0 + 3^1 + ... + 3^(k-1))

This is where I am stuck. Can we reduce the RHS? I've forgotten my high school math.

I stand to be corrected though.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow