```
j = 2
while (j < n) {
    k = 2
    while (k < n) {
        Sum += a[k]*b[k]
        k = k * sqrt(k)
    }
    j += 5*j/3
}
```

and

```
for (int j = 1 to n) {
    int k = j
    while (k < n) {
        Sum += a[k]*b[k]
        k += n1/3
    }
}
```


+1 vote

Best answer

My answer would be O(log n * log log n) for #1 and O(n^{5/3}) for #2. I don't know about other approaches; my way of quantifying time complexity is to count the total number of times each line of code executes.

Explanation:

#1: for the inner loop

```
while (k < n) {
    Sum += 1;
    k = k * sqrt(k);
}
```

How many iterations are executed depends on k: it starts at 2 and goes 2^{3/2}, (2^{3/2})^{3/2} = 2^{(3/2)^2}, ..., 2^{(3/2)^z}, until it exceeds n. The value z here is the number of iterations; setting 2^{(3/2)^z} = n gives (3/2)^z = log n, so z = log_{3/2}(log n) = O(log log n).
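As a quick empirical check (my own sketch, not part of the answer), we can simulate the inner loop and compare the iteration count against log_{3/2}(log_2 n):

```python
import math

def inner_iterations(n):
    """Count iterations of: k = 2; while k < n: k = k * sqrt(k)."""
    k, count = 2.0, 0
    while k < n:
        k = k * math.sqrt(k)  # k -> k^(3/2)
        count += 1
    return count

# The count should track log_{3/2}(log_2 n), i.e. O(log log n).
for n in (10**3, 10**6, 10**12):
    print(n, inner_iterations(n), round(math.log(math.log(n, 2), 1.5), 2))
```

Even for n = 10^{12} the loop runs only a handful of times, which matches the doubly-logarithmic bound.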

For the outer loop,

```
while (j < n) {
    k = 2;
    // inner loop, takes log log n
    ...
    j += 5*j/3;
}
```

Similarly, j starts at 2; since j += 5*j/3 sets j = j + 5j/3 = (8/3)j, it goes 2, (8/3) * 2, (8/3)^{2} * 2, ..., (8/3)^{z} * 2, until it exceeds n. So z = log_{8/3}(n/2), which is O(log n).

The time complexity of the whole algorithm is the product of the two iteration counts, which is O(log n * log log n).
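To see the product bound concretely (again a sketch of mine, not from the answer), we can count how many times the innermost `Sum += ...` line actually executes in algorithm #1:

```python
import math

def total_inner_updates(n):
    """Count executions of the innermost line in algorithm #1."""
    total = 0
    j = 2.0
    while j < n:
        k = 2.0
        while k < n:
            total += 1          # one execution of 'Sum += ...'
            k = k * math.sqrt(k)  # k -> k^(3/2)
        j += 5 * j / 3          # j -> (8/3) j
    return total

# The total equals (outer iterations) x (inner iterations),
# i.e. roughly log_{8/3}(n/2) * log_{3/2}(log_2 n).
print(total_inner_updates(10**6))
```

For n = 10^{6} the outer loop runs 14 times and the inner loop 8 times per pass, so the line executes 112 times; it is the product of two tiny logarithmic factors, not anything polynomial in n.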

#2: **I take k += n1/3 as k += n**^{1/3}. Again, for the inner loop, k starts at j and goes j + n^{1/3}, j + 2 * n^{1/3}, ..., j + z * n^{1/3}, until it exceeds n, so z = (n - j)/n^{1/3}, which is at most n^{2/3}. The outer loop runs j from 1 to n, so it executes n times. Bounding each inner loop by n^{2/3} (summing the exact counts only changes the constant), the whole algorithm is O(n^{5/3}).

We can't calculate each algorithm's exact cost by hand, because constants are dropped along the way, every logarithm base is treated as base 2, and so on. However, some simple math tells us the order of growth, and that is exactly what O() expresses.

...