Analysis of Algorithms: Time Complexity


https://math.stackexchange.com/questions/1865688/what-is-the-computational-complexity-of-newton-raphson-method-to-find-square-roo
The average theoretical complexity is about log2(N) iterations. Two notes, however:
First, log2(N) is just the highest-order term. The actual number of steps to convergence looks more like
constant term + coefficient*log2(N).
Second, there is some nuance in the exact floating-point computations: convergence might occur a step sooner or later, depending on when x and y agree to within "machine epsilon" (i.e. so close that they can't be distinguished).
In short, don't worry: the number of steps to convergence roughly follows log2(N), and with more measurements the fit to log2(N) only gets better.
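A minimal sketch of this measurement (my own, not from the answer), assuming the plain Newton-Raphson update x <- (x + N/x)/2 with initial guess x0 = N and a relative tolerance standing in for machine epsilon; the printed step counts grow roughly in proportion to log2(N):

#include <stdio.h>
#include <math.h>

/* Sketch: Newton-Raphson square root starting from x0 = N, counting
   iterations until successive estimates agree within a relative
   tolerance (a stand-in for machine epsilon). */
static int newton_sqrt_steps(double N) {
    double x = N, y = (x + N / x) / 2.0;
    int steps = 1;
    while (fabs(x - y) > 1e-12 * x) {
        x = y;
        y = (x + N / x) / 2.0;
        steps++;
    }
    return steps;
}

int main(void) {
    for (double N = 10; N <= 1e12; N *= 100)
        printf("N = %.0e  steps = %d  log2(N) = %.1f\n",
               N, newton_sqrt_steps(N), log2(N));
    return 0;
}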

https://www.cs.cornell.edu/courses/cs3110/2012sp/lectures/lec20-master/lec20.html

Recursion Trees

A recursion tree is useful for visualizing what happens when a recurrence is iterated. It diagrams the tree of recursive calls and the amount of work done at each call.
For instance, consider the recurrence
T(n) = 2T(n/2) + n^2.
The recursion tree for this recurrence has a root doing n^2 work, with two children for the subproblems of size n/2, each doing (n/2)^2 work, and so on down the tree.
In this case it is straightforward to sum across each row of the tree to obtain the total work done at a given level: at depth i there are 2^i nodes, each doing (n/2^i)^2 work, for a total of n^2/2^i at that level.
This is a geometric series with ratio 1/2, so the total over all levels is O(n^2) (at most about twice the root's n^2). The depth of the tree does not really matter here; the amount of work at each level is decreasing so quickly that the total is only a constant factor more than the root.
Recursion trees can be useful for gaining intuition about the closed form of a recurrence, but they are not a proof (in fact it is easy to get the wrong answer with a recursion tree, as with any method that involves "..." kinds of reasoning). As we saw last time, a good way of establishing a closed form for a recurrence is to make an educated guess and then prove by induction that the guess is indeed a solution. Recursion trees are a good way of generating that guess.
Let's consider another example,
T(n) = T(n/3) + T(2n/3) + n.
Expanding out the first few levels: the root does n work, and its children handle subproblems of size n/3 and 2n/3, together doing another n of work, so each full level of the tree contributes at most n.
Note that the tree here is not balanced: the longest path is the rightmost one, and its length is log_{3/2}(n). Since every level contributes at most n work and there are O(log n) levels, our guess for the closed form of this recurrence is O(n log n).
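A quick numerical check of that guess (a sketch, assuming base case T(n) = 1 for n <= 1): the ratio T(n) / (n * log2(n)) stays bounded as n grows.

#include <stdio.h>
#include <math.h>

/* Sketch: evaluate T(n) = T(n/3) + T(2n/3) + n directly and compare
   it with n*log2(n) to support the O(n log n) guess. */
static double T(double n) {
    if (n <= 1) return 1;
    return T(n / 3) + T(2 * n / 3) + n;
}

int main(void) {
    for (double n = 1000; n <= 1e6; n *= 10)
        printf("n = %.0f  T(n) = %.0f  T(n)/(n*log2(n)) = %.2f\n",
               n, T(n), T(n) / (n * log2(n)));
    return 0;
}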

https://www.geeksforgeeks.org/analysis-algorithm-set-4-master-method-solving-recurrences/
T(n) = aT(n/b) + f(n) where a >= 1 and b > 1
There are three cases:
1. If f(n) = Θ(n^c) where c < log_b(a), then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^c) where c = log_b(a), then T(n) = Θ(n^c * log n).
3. If f(n) = Θ(n^c) where c > log_b(a), then T(n) = Θ(f(n)).

How does this work?
The master method is essentially derived from the recursion tree method. If we draw the recursion tree of T(n) = aT(n/b) + f(n), the work done at the root is f(n) and the total work done at all the leaves is Θ(n^(log_b a)). The height of the recursion tree is log_b(n).
Master Theorem
In the recursion tree method we add up the total work done. If the work done at the leaves is polynomially more than at the root, the leaves dominate and the result is the work done at the leaves (Case 1). If the work done at the leaves and at the root is asymptotically the same, the result is the height multiplied by the work done at any one level (Case 2). If the work done at the root is asymptotically more, the result is the work done at the root (Case 3).
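A minimal sketch of this case analysis for f(n) = Θ(n^c), using a made-up helper master_case(a, b, c) that simply compares c with log_b(a):

#include <stdio.h>
#include <math.h>

/* Hypothetical helper (illustration only): classify
   T(n) = a*T(n/b) + Theta(n^c) by the three cases above. */
static void master_case(double a, double b, double c) {
    double crit = log(a) / log(b);               /* log_b(a) */
    if (fabs(c - crit) < 1e-9)
        printf("a=%g b=%g c=%g -> Case 2: Theta(n^%g * log n)\n", a, b, c, c);
    else if (c < crit)
        printf("a=%g b=%g c=%g -> Case 1: Theta(n^%g)\n", a, b, c, crit);
    else
        printf("a=%g b=%g c=%g -> Case 3: Theta(n^%g)\n", a, b, c, c);
}

int main(void) {
    master_case(2, 2, 1);   /* merge sort: T(n) = 2T(n/2) + n  -> Theta(n log n) */
    master_case(2, 2, 2);   /* T(n) = 2T(n/2) + n^2            -> Theta(n^2)     */
    master_case(8, 2, 2);   /* T(n) = 8T(n/2) + n^2            -> Theta(n^3)     */
    return 0;
}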

http://www.geeksforgeeks.org/interesting-time-complexity-question/
void fun(int n)
{
    for (int i = 1; i <= n; i++)
    {
        // inner loop advances j in steps of i, so it runs about n/i times
        for (int j = 1; j < n; j += i)
        {
            // Some O(1) task
        }
    }
}
For i = 1, the inner loop is executed n times.
For i = 2, the inner loop is executed approximately n/2 times.
For i = 3, the inner loop is executed approximately n/3 times.
For i = 4, the inner loop is executed approximately n/4 times.
…
For i = n, the inner loop is executed approximately n/n times.
So the total number of inner-loop iterations is about (n + n/2 + n/3 + … + n/n),
which becomes n * (1/1 + 1/2 + 1/3 + … + 1/n).
The important thing about the series (1/1 + 1/2 + 1/3 + … + 1/n), the nth harmonic number, is that it is Θ(log n). So the time complexity of the above code is Θ(n log n).
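A small empirical check of this harmonic-sum count (a sketch): count the actual inner-loop iterations and compare with n*ln(n).

#include <stdio.h>
#include <math.h>

/* Sketch: count the inner-loop iterations of the nested loops above
   and compare the total with n*ln(n). */
int main(void) {
    for (int n = 1000; n <= 1000000; n *= 10) {
        long long ops = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j < n; j += i)
                ops++;
        printf("n = %7d  ops = %10lld  n*ln(n) = %10.0f\n",
               n, ops, n * log((double)n));
    }
    return 0;
}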
O(N)
http://geeksquiz.com/algorithms-analysis-of-algorithms-question-3/
What is the time complexity of fun()?
int fun(int n)
{
  int count = 0;
  for (int i = n; i > 0; i /= 2)
     for (int j = 0; j < i; j++)
        count += 1;
  return count;
}
(C) O(n)

For an input integer n, the outer-loop variable i takes the values n, n/2, n/4, …, 1, and for each such i the inner loop runs i times. So the time complexity T(n) can be written as

T(n) = O(n + n/2 + n/4 + … + 1) = O(n), since this geometric series sums to less than 2n.
The value of count is also n + n/2 + n/4 + … + 1, i.e. roughly 2n.
http://geeksquiz.com/algorithms-analysis-of-algorithms-question-13/
void fun(int n, int arr[])
{
    int i = 0, j = 0;
    for(; i < n; ++i)
        while(j < n && arr[i] < arr[j])
            j++;
}
(A) O(n)
Note that the variable j is not reset for each value of i; it only ever increases. So across the entire run of the outer loop, the inner loop executes at most n times in total, and the overall complexity is O(n).
http://geeksquiz.com/algorithms-analysis-of-algorithms-question-5/
int fun(int n)
{
  int count = 0;
  for (int i = 0; i < n; i++)
     for (int j = i; j > 0; j--)
        count = count + 1;
  return count;
}
(B) Theta (n^2)
The time complexity can be calculated by counting the number of times the expression “count = count + 1;” is executed. It is executed 0 + 1 + 2 + 3 + … + (n-1) times.
Time complexity = Theta(0 + 1 + 2 + … + (n-1)) = Theta(n*(n-1)/2) = Theta(n^2)

http://geeksquiz.com/algorithms-analysis-of-algorithms-question-24/
int recursive (int n)
{
   if (n == 1)
     return (1);
   else
     return (recursive (n-1) + recursive (n-1));
}
(D) O(2^n)
Explanation: the recurrence for the above program is:
  T(n) = 2T(n-1) + c
  T(1) = c1.
Let us solve it.
  T(n) = 2(2T(n-2) + c) + c        = 4T(n-2) + 3c
  T(n) = 4(2T(n-3) + c) + 3c       = 8T(n-3) + 7c
  T(n) = 8(2T(n-4) + c) + 7c       = 16T(n-4) + 15c
  ...
  T(n) = (2^(n-1))T(1) +  (2^(n-1) - 1)c

  T(n) = O(2^n)
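An instrumented sketch of the same function, counting calls to confirm the Θ(2^n) growth (the counter variable is my addition):

#include <stdio.h>

static long long calls;   /* counts invocations of recursive() */

/* Sketch: same structure as the function above, instrumented to count
   calls; the count is 2^n - 1, i.e. Theta(2^n). */
static int recursive(int n) {
    calls++;
    if (n == 1)
        return 1;
    return recursive(n - 1) + recursive(n - 1);
}

int main(void) {
    for (int n = 5; n <= 25; n += 5) {
        calls = 0;
        recursive(n);
        printf("n = %2d  calls = %lld  2^n = %lld\n", n, calls, 1LL << n);
    }
    return 0;
}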
http://www.geeksforgeeks.org/data-structures-and-algorithms-set-11/
What is the time complexity of the following recursive function:
int DoSomething (int n)
{
  if (n <= 2)
    return 1;
  else 
    return (DoSomething (floor(sqrt(n))) + n);
}
(D) Θ(log log n)
The recurrence relation for DoSomething() is
  T(n) = T(√n) + C1, if n > 2
We have ignored the floor() part as it doesn’t matter here if it’s a floor or ceiling.
  Let n = 2^m,  T(n) = T(2^m)
  Let T(2^m) =  S(m)

  From the above two, T(n) = S(m)

  S(m) = S(m/2) + C1  /* This is simply binary search recursion*/
  S(m) = O(log m)
       = O(log log n)   /* since n = 2^m, i.e. m = log n */

  Now, let us go back to the original recurrence T(n):
  T(n) = S(m)
       = O(log log n)
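An instrumented sketch of DoSomething(), counting recursive calls and comparing with log2(log2(n)) (the counter is my addition):

#include <stdio.h>
#include <math.h>

static int calls;   /* counts invocations of DoSomething() */

/* Sketch: the number of calls grows like log log n. */
static int DoSomething(int n) {
    calls++;
    if (n <= 2)
        return 1;
    return DoSomething((int)floor(sqrt((double)n))) + n;
}

int main(void) {
    int ns[] = {16, 256, 65536, 1000000000};
    for (int i = 0; i < 4; i++) {
        calls = 0;
        DoSomething(ns[i]);
        printf("n = %10d  calls = %d  log2(log2(n)) = %.1f\n",
               ns[i], calls, log2(log2((double)ns[i])));
    }
    return 0;
}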
In the following C function, let n >= m.
http://mathworld.wolfram.com/EuclideanAlgorithm.html
int gcd(int n, int m)
{
  if (n % m == 0) return m;
  n = n % m;
  return gcd(m, n);
}
How many recursive calls are made by this function?
(A) Θ(log n)
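An instrumented sketch of gcd(), counting recursive calls on consecutive Fibonacci pairs (the classical worst case for the Euclidean algorithm); the count grows proportionally to log(n):

#include <stdio.h>
#include <math.h>

static int calls;   /* counts invocations of gcd() */

/* Sketch: same Euclidean recursion as above, instrumented. */
static int gcd(int n, int m) {
    calls++;
    if (n % m == 0) return m;
    return gcd(m, n % m);
}

int main(void) {
    long long f1 = 1, f2 = 2;             /* consecutive Fibonacci numbers */
    while (f2 < 1000000000LL) {
        calls = 0;
        gcd((int)f2, (int)f1);
        printf("n = %10lld  calls = %3d  log2(n) = %.1f\n",
               f2, calls, log2((double)f2));
        long long f3 = f1 + f2;
        f1 = f2;
        f2 = f3;
    }
    return 0;
}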
Difficult
http://geeksquiz.com/algorithms-analysis-of-algorithms-question-18-3/
 int unknown(int n) {
    int i, j, k = 0;
    for (i  = n/2; i <= n; i++)
        for (j = 2; j <= n; j = j * 2)
            k = k + n/2;
    return k;
 }
The outer loop runs n/2 + 1, i.e. Θ(n), times. The inner loop runs Θ(log n) times (note that j is multiplied by 2 in every iteration). So the statement “k = k + n/2;” runs Θ(n log n) times. Each execution increases k by n/2, so the final value of k is (n/2) * Θ(n log n), which is Θ(n^2 log n).
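A quick check of this count (a sketch; the variables are widened to long long here so k does not overflow): the ratio k / (n^2 * log2(n)) stays near a constant (roughly 1/4).

#include <stdio.h>
#include <math.h>

/* Sketch: same loops as unknown() above, with wider types. */
static long long unknown_ll(long long n) {
    long long i, j, k = 0;
    for (i = n / 2; i <= n; i++)
        for (j = 2; j <= n; j = j * 2)
            k = k + n / 2;
    return k;
}

int main(void) {
    for (long long n = 1000; n <= 1000000; n *= 10) {
        long long k = unknown_ll(n);
        printf("n = %7lld  k = %15lld  k/(n^2*log2(n)) = %.3f\n",
               n, k, (double)k / ((double)n * (double)n * log2((double)n)));
    }
    return 0;
}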
http://geeksquiz.com/tag/analysis-of-algorithms-recurrences/
Master Theorem
http://geeksquiz.com/algorithms-analysis-of-algorithms-recurrences-question-11/
Change of variables for T(n) = T(√n) + Θ(lg lg n): let m = lg n and S(m) = T(2^m). The recurrence becomes S(m) = S(m/2) + Θ(lg m). The (extended) Case 2 of the master theorem applies, giving S(m) = Θ(lg^2 m), i.e. T(n) = Θ((lg lg n)^2).

http://staff.ustc.edu.cn/~csli/graduate/algorithms/book6/chap03.htm
For positive integers n, the nth harmonic number is
  H(n) = 1 + 1/2 + 1/3 + … + 1/n = ln(n) + O(1).

Miscs
http://geeksquiz.com/algorithms-analysis-of-algorithms-question-19-2/
Time complexity of fun1() can be written as
T(n) = T(n-1) + C, which is O(n).
Time complexity of fun2() can be written as
T(n) = 2T(n-1) + C, which is O(2^n).
What does it mean when we say that an algorithm X is asymptotically more efficient than Y?
(B) X will be a better choice for all inputs except small inputs
The worst-case time complexity is always greater than or equal to the average-case time complexity.
(C) A(n) = O(W(n)) 

Consider the following program fragment for reversing the digits in a given integer to obtain a new integer. Let n = D1 D2 … Dm (its decimal digits, written most significant first).
int n, rev;
rev = 0;
while (n > 0)
{
   rev = rev*10 + n%10;
   n = n/10;
}
The loop invariant condition at the end of the ith iteration is: (GATE CS 2004)
(A) n = D1 D2 … D(m-i) and rev = Dm D(m-1) … D(m-i+1)
We can get it by taking an example like n = 54321. After 2 iterations, rev would be 12 and n would be 543.
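A minimal run of the fragment on that example (a sketch), printing n and rev after each iteration to check the invariant:

#include <stdio.h>

int main(void) {
    int n = 54321, rev = 0, iter = 0;
    /* Reverse the digits, tracing the invariant after each iteration
       (e.g. after 2 iterations: n = 543, rev = 12). */
    while (n > 0) {
        rev = rev * 10 + n % 10;
        n = n / 10;
        printf("after iteration %d: n = %d, rev = %d\n", ++iter, n, rev);
    }
    return 0;
}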
