
Let m, n be integers such that 0 <= m, n < N.

Define:

  Algorithm A: Computes m + n in time O(A(N))

  Algorithm B: Computes m*n in time O(B(N))

  Algorithm C: Computes m mod n in time O(C(N))

Using any combination of Algorithms A, B and C, describe an algorithm for N x N matrix addition and matrix multiplication with entries in Z/NZ. Also indicate the algorithm's run time in big-O notation.

Attempt at solution:

For N x N addition in Z/NZ: let A and B be N x N matrices over Z/NZ with entries a_{ij} and b_{ij}, where i, j in {0,1,...,N-1}, i is the row index and j the column index. Also, let A + B = C.

Step 1. Run Algorithm A to get a_{ij} + b_{ij} = c_{ij}   in time O(A(N))

Step 2. Run Algorithm C to get c_{ij} mod N in time O(C(N))

Repeat Steps 1 and 2 for all i, j in {0,1,...,N-1}.

This means that we have to repeat Steps 1 and 2 N^2 times, so the total run time is estimated by

 N^2 [ O(A(N)) + O(C(N)) ] = O(N^2 A(N)) + O(N^2 C(N)) = O(N^2 A(N) + N^2 C(N)).
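
To make the addition algorithm concrete, here is a minimal sketch in plain Python (the function name matrix_add_mod and the example values are mine, purely for illustration; the built-in + and % operators stand in for Algorithms A and C):

    def matrix_add_mod(A, B, N):
        # Entrywise addition of two N x N matrices over Z/NZ.
        # For each pair (i, j): Step 1 computes a_ij + b_ij (Algorithm A),
        # Step 2 reduces the result mod N (Algorithm C). Both steps run
        # N^2 times, giving O(N^2 A(N) + N^2 C(N)) in total.
        C = [[0] * N for _ in range(N)]
        for i in range(N):
            for j in range(N):
                s = A[i][j] + B[i][j]   # Algorithm A: one addition
                C[i][j] = s % N         # Algorithm C: one reduction mod N
        return C

    # Example with N = 3:
    A = [[0, 1, 2], [2, 2, 1], [1, 0, 0]]
    B = [[1, 2, 2], [0, 1, 1], [2, 2, 0]]
    print(matrix_add_mod(A, B, 3))      # [[1, 0, 1], [2, 0, 2], [0, 2, 0]]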

For the multiplication algorithm I just replaced Step 1 by Algorithm B and got the total run time to be O(N^2 B(N) + N^2 C(N)), just as above.

Please tell me if I am approaching this problem correctly, especially with the big-O notation.

Thanks.

For matrix multiplication your algorithm is wrong: (A*B)_{i,j} != A_{i,j} * B_{i,j}. In fact, it is proven one cannot multiply matrices better than Omega(n^2 log n). – amit Feb 24 at 8:42

1 Answer


Your algorithm for matrix multiplication is wrong and will yield a wrong answer, since (A*B)_{i,j} != A_{i,j} * B_{i,j} (except in some special cases, such as the zero matrix).
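
To make the difference concrete, here is a small (hypothetical) 2 x 2 example over Z/5Z in plain Python, comparing the entrywise product with the true matrix product:

    # Entrywise product vs. true matrix product, 2 x 2 example over Z/5Z.
    A = [[1, 2],
         [3, 4]]
    B = [[0, 1],
         [1, 0]]

    entrywise = [[A[i][j] * B[i][j] % 5 for j in range(2)] for i in range(2)]
    # entrywise    == [[0, 2], [3, 0]]

    true_product = [[sum(A[i][k] * B[k][j] for k in range(2)) % 5
                     for j in range(2)] for i in range(2)]
    # true_product == [[2, 1], [4, 3]]  -- a different matrix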

I assume the goal of the question is not to implement an efficient matrix multiplication algorithm, since that is a hard and still actively studied problem, so I will answer for the naive implementation of matrix multiplication.

For any indices i,j:

(AB)_{i,j} = Sum_k( A_{i,k} * B_{k,j} )
           = A_{i,1} * B_{1,j} + A_{i,2} * B_{2,j} + ... + A_{i,k} * B_{k,j} + ... + A_{i,n} * B_{n,j}

As you can see, for each pair i,j there are n multiplications and n-1 additions. Regarding the number of invocations of C: it depends on whether you need to invoke it after each addition or only once when you are done (which really depends on how many bits you have available to represent each number), so for each pair i,j you might need anywhere from one to 2n-1 invocations.

This gives a total complexity of (assuming 2n-1 modulus operations for each (i,j) pair; if fewer are needed, as explained above, adjust accordingly):

O(n^3*A + n^3*B + n^3*C)
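
As a rough sketch in plain Python (the function name is mine; this version applies Algorithm C after every multiplication and addition, i.e. the 2n-1 reductions per entry assumed above):

    def matrix_mul_mod(A, B, N):
        # Naive n x n matrix multiplication over Z/NZ.
        # For each (i, j): n multiplications (Algorithm B), n-1 additions
        # (Algorithm A), and up to 2n-1 reductions mod N (Algorithm C),
        # giving O(n^3*A + n^3*B + n^3*C) overall.
        n = len(A)
        C = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                acc = 0
                for k in range(n):
                    prod = (A[i][k] * B[k][j]) % N   # Algorithm B, then C
                    acc = (acc + prod) % N           # Algorithm A, then C
                C[i][j] = acc
        return C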

As a side note, a good sanity check that your algorithm is incorrect: it is proven that matrix multiplication cannot be done better than Omega(n^2 log n) (Raz, 2002), and the best currently known algorithm runs in roughly O(n^2.37).

Hey amit, thanks a lot for your help. For the multiplication algorithm we are allowed to assume that we only have to do the modulus once for each (i,j) pair. So, would that change the complexity of the algorithm to O(n^2*A + n^2*B + n^2*C)? Also, is the complexity of my addition algorithm, O(N^2 A(N) + N^2 C(N)), correct? Thanks again. – Rose S Feb 24 at 11:19
    
@RoseS No, in that case it will be O(n^3*A + n^3*B + n^2*C). The complexity of addition seems correct to me. – amit Feb 24 at 11:20
    
Ah yes, because we are still doing n multiplications and n-1 additions for each of the n^2 entries. Thank you. – Rose S Feb 24 at 11:25
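
For reference, if (as assumed in the comments) one reduction per (i, j) pair suffices, for example because the accumulator can hold the full unreduced sum, only the placement of the mod changes, and the count of C invocations drops to n^2. A minimal sketch under that assumption:

    def matrix_mul_mod_deferred(A, B, N):
        # Same naive multiplication, but Algorithm C is invoked only once
        # per entry: accumulate the full sum, then reduce mod N at the end.
        # Cost: O(n^3*A + n^3*B + n^2*C).
        n = len(A)
        C = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                acc = 0
                for k in range(n):
                    acc += A[i][k] * B[k][j]   # n multiplications, n-1 additions
                C[i][j] = acc % N              # single reduction per (i, j)
        return C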
