Suppose we have an algorithm that carries out N^2 operations for an input of size N, and that a computer takes 1 microsecond (1/1,000,000 of a second) to carry out one operation. How long does the algorithm take to run on an input of size 3000?

(A) 90 seconds
(B) 9 seconds
(C) 0.9 seconds
(D) 0.09 seconds
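
A minimal Python sketch of the arithmetic, assuming the algorithm performs exactly N^2 operations and each operation costs a fixed 1 microsecond (the function name `runtime_seconds` is illustrative, not from the original question):

```python
def runtime_seconds(n: int, seconds_per_op: float = 1e-6) -> float:
    """Running time in seconds for an algorithm doing n**2 operations."""
    operations = n ** 2              # 3000**2 = 9,000,000 operations
    return operations * seconds_per_op

print(runtime_seconds(3000))         # 9.0 seconds, i.e. option (B)
```

Since 3000^2 = 9,000,000 operations and each takes 10^-6 seconds, the total is 9,000,000 × 10^-6 = 9 seconds, so option (B) is correct.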