Showing posts with label Theory. Show all posts

Induction and Recursive Algorithms


http://en.wikipedia.org/wiki/Mathematical_induction
Mathematical induction is a form of direct proof, and it is done in two steps. The first step, known as the base case, is to prove the given statement for the first natural number. The second step, known as the inductive step, is to prove that the given statement for any one natural number implies the given statement for the next natural number.

Mathematical induction in this extended sense is closely related to recursion. Mathematical induction, in some form, is the foundation of all correctness proofs for computer programs.

Another variant, called complete induction (or strong induction or course of values induction), says that in the second step we may assume not only that the statement holds for n = m but also that it is true for all n less than or equal to m.
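To make the contrast concrete, here is a classic strong-induction argument, sketched in LaTeX (the example is mine, not from the quoted article):

```latex
% Claim: every integer n >= 2 has a prime factor.
\textbf{Base case:} $n = 2$ is itself prime.
\textbf{Inductive step (strong):} assume every $m$ with $2 \le m < k$
has a prime factor. If $k$ is prime, it is its own prime factor.
Otherwise $k = ab$ with $2 \le a < k$; by the hypothesis $a$ has a
prime factor $p$, and $p \mid a \mid k$.
% Note the step uses the statement for all smaller m, not just k - 1.
```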

http://www.cs.sfu.ca/~bbart/225/induction1-1.html
The technical term for this is the inductive step: to use the verified smaller structures to verify the larger structures. Each of the smaller structures is called an inductive hypothesis.

In a simple induction, like this one, the inductive step shows that the nth case implies the (n + 1)st case.

For simple induction, you only have to verify one basis case: n = 1. For strong induction, there may be several cases: whatever your inductive step doesn't cover. The process of verifying these cases is known as the basis step. The combination of the inductive step and the basis step is a proof by induction.

Part 2: How Induction Relates to Computer Science
Induction is about mathematical structures which are self-similar.

Induction and Recursion
Recursion is an algorithmic technique where you solve a problem by using the solutions to smaller instances of the same problem.

This sort of recursive reasoning, where you break down the large case into smaller cases, is known as the top-down approach.

Induction and Iteration
The other way to look at induction is by starting with case 1, the basis case. Then, by using the inductive step, case 1 implies case 2.

This sort of recursive reasoning, where you use the smaller cases to build up to the large case, is known as the bottom-up approach. 
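The two directions can be made concrete with a tiny example (function names are mine, not from the quoted notes): the same sum 1 + 2 + ... + n computed top-down by recursion and bottom-up by iteration.

```cpp
// Top-down: solve sum(n) in terms of the smaller instance sum(n - 1).
long long sumTopDown(int n) {
    if (n == 0) return 0;            // basis case
    return sumTopDown(n - 1) + n;    // case n - 1 implies case n
}

// Bottom-up: start from the basis case and let each case imply the next.
long long sumBottomUp(int n) {
    long long s = 0;                 // case 0
    for (int i = 1; i <= n; ++i)
        s += i;                      // case i - 1 implies case i
    return s;
}
```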

Loop Invariants
Insertion Sort is an incremental sort. Each loop begins with a sorted A[0..i-1], and the element A[i] is joined to it such that the result is a sorted A[0..i]. In other words:
At the beginning of each loop, A[0..i-1] is a sorted permutation of the first i elements of A[], and at the end of each loop A[0..i] is a sorted permutation of the first i+1 elements of A[].
This is a loop invariant about the progress of the algorithm.

Once you have declared a loop invariant, your next goal is to prove it by induction. Why? Because the byproduct of your proof will be a proof of correctness of the algorithm.
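To see how such an invariant becomes a proof obligation in code, here is a sketch of insertion sort with the invariant stated as a runtime assert (the assert placement is my illustration, not from the quoted notes):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Insertion sort with its loop invariant checked at run time:
// before each iteration, A[0..i-1] is a sorted permutation of the
// first i elements of the original array.
void insertionSort(std::vector<int>& A) {
    for (std::size_t i = 1; i < A.size(); ++i) {
        // Invariant (initialization/maintenance): A[0..i-1] is sorted.
        assert(std::is_sorted(A.begin(), A.begin() + i));
        int key = A[i];
        std::size_t j = i;
        while (j > 0 && A[j - 1] > key) {
            A[j] = A[j - 1];  // shift larger elements one slot right
            --j;
        }
        A[j] = key;           // invariant restored: A[0..i] is sorted
    }
}
```

At termination i = A.size(), so the invariant says the whole array is sorted, which is exactly the post-condition.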


Invariants and Recursive Algorithms
Recursive algorithms often solve problems by the Divide & Conquer paradigm, which involves three stages: divide, conquer, and combine.
  • Divide the problem into 2 or more smaller subproblems, whose solutions are strongly related to the solution of the original problem.
  • Conquer by calling the recursive algorithm on each subproblem.
  • Combine together the subproblem solutions to form the solution to the original problem.
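The three stages can be sketched with merge sort (a minimal illustration of my own; the exact implementation behind the analysis below is not given in the source):

```cpp
#include <algorithm>
#include <vector>

// Merge sort on A[lo..hi): divide the range in half, conquer each
// half recursively, combine by merging the two sorted halves.
void mergeSort(std::vector<int>& A, int lo, int hi) {
    if (hi - lo <= 1) return;            // base case: already sorted
    int mid = lo + (hi - lo) / 2;        // divide
    mergeSort(A, lo, mid);               // conquer left half
    mergeSort(A, mid, hi);               // conquer right half
    std::vector<int> merged;             // combine: merge sorted halves
    merged.reserve(hi - lo);
    int i = lo, j = mid;
    while (i < mid && j < hi)
        merged.push_back(A[i] <= A[j] ? A[i++] : A[j++]);
    while (i < mid) merged.push_back(A[i++]);
    while (j < hi)  merged.push_back(A[j++]);
    std::copy(merged.begin(), merged.end(), A.begin() + lo);
}
```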
Running Time Analysis of Merge Sort
Inductive Step
Assume T(n) ≤ c⋅n log n holds for every n < k, and prove T(k) ≤ c⋅k log k.
Case 1: k is even
T(k) ≤ T(⌈k/2⌉) + T(⌊k/2⌋) + ak
= T(k/2) + T(k/2) + ak
≤ c⋅(k/2) log(k/2) + c⋅(k/2) log(k/2) + ak      (Inductive Hypothesis)
= c⋅k log(k/2) + ak
= c⋅k (log k − log 2) + ak
= c⋅k log k − ck log 2 + ak
≤ c⋅k log k                     (when c ≥ a / log 2)
Case 2: k is odd
T(k) ≤ T(⌈k/2⌉) + T(⌊k/2⌋) + ak
= T((k + 1)/2) + T((k − 1)/2) + ak
≤ c⋅((k + 1)/2) log((k + 1)/2) + c⋅((k − 1)/2) log((k − 1)/2) + ak      (Inductive Hypothesis)
= (c/2)⋅[(k + 1) log(k + 1) + (k − 1) log(k − 1)] − ck log 2 + ak
If you're still paying attention, you'll notice the symmetry with Case 1. Except for the first term, the two expressions are identical, and so if only you could show
(k + 1) log(k + 1) + (k − 1) log(k − 1) ≤ 2k log k,
you would be home. Unfortunately, this inequality is false, and you can prove it is false by using calculus, or perhaps even mathematical induction. But it turns out that the left hand side is not that much bigger than the right hand side, which means that you can utilize a portion of the − ck log 2 term to reduce its value by just enough to satisfy the inequality.
Back to the action:
T(k) ≤ (c/2)⋅[(k + 1) log(k + 1) + (k − 1) log(k − 1)] − ck log 2 + ak
≤ (c/2)⋅[(k + 1) log(k + 1) + (k − 1) log(k + 1)] − ck log 2 + ak
= c⋅k log(k + 1) − ck log 2 + ak
= c⋅k log(k + 1) − ck log(4/3) − ck log(3/2) + ak
= c⋅k[log(k + 1) − log(4/3)] − ck log(3/2) + ak
= c⋅k log(3(k + 1)/4) − ck log(3/2) + ak
≤ c⋅k log k − ck log(3/2) + ak      (when 3(k + 1)/4 ≤ k, i.e., k ≥ 3)
≤ c⋅k log k                                  (when c ≥ a / log(3/2))
So, no matter whether k is even or odd, T(k) ≤ c⋅k log k, whenever k ≥ 3 and c ≥ max(a / log 2, a / log(3/2)) = a / log(3/2).
[You're almost there. There's one last detail to take care of.]



http://sydney.edu.au/engineering/it/~jchan3/soft1001/jme/recursion/recursion.html

http://www.johndcook.com/blog/2012/08/14/understanding-recursion-ii/
Recursion is about solving a problem in terms of smaller versions of itself. The process ends because the problems get smaller until the process reaches some problem small enough to be solved directly.


Recursion is the process of solving a problem in terms of smaller versions of the same problem. Since the problem gets smaller each time, the process eventually terminates in a problem (the “base case”)  that can be solved directly. Be sure of three things:

The problem gets smaller each time.
You include a solution for the base case.
Each case is handled correctly.

Loop and Recursion Invariant


http://touch.ethz.ch/recursion.pdf
http://www.cs.sfu.ca/CourseCentral/125/johnwill/Packet14.pdf
A loop invariant is a statement that is true on every iteration of the loop
● usually asserted at the beginning of the loop
● usually parametrized by the loop index
A good loop invariant should indicate the progress of the algorithm
● the invariant should carry all state information, loop to loop
● the invariant should imply the post-condition (the goal of the algorithm) at the end of the last loop

Use mathematical reasoning to capture the behaviour of an
algorithm:
● State invariants at various checkpoints
● Show that the invariant holds:
○ at the first checkpoint
○ during execution between checkpoints
● Conclude that the post-condition holds
○ the invariant holds at / after the last checkpoint
Initialization
Maintenance
Termination
Q. This works pretty well for simple iteration,
but what if your algorithm has no loops?
● Invariants are very powerful for recursive programs

Invariants and Recursion
Rule of Thumb: You may assume the invariant holds for any smaller case

X. http://www.drdobbs.com/cpp/invariants-for-binary-search-part-1-a-si/240169169


https://www.eecs.yorku.ca/course_archive/2013-14/W/2011/lectures/09%20Loop%20Invariants%20and%20Binary%20Search.pdf
● Maintain a sublist.
● If the key is contained in the original list, then the key is contained in the sublist.
● It is faster not to check if the middle element is the key. Simply continue.
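A minimal sketch of this sublist invariant (function name and interface are mine), assuming a sorted array: the middle element is never compared for equality; the loop only narrows the sublist that could contain the key.

```cpp
#include <vector>

// Invariant: if key occurs in the sorted array A at all, it occurs
// in the sublist A[lo..hi). The middle element is not tested for
// equality; we simply continue narrowing until one candidate is left.
int binarySearch(const std::vector<int>& A, int key) {
    int lo = 0, hi = (int)A.size();
    while (hi - lo > 1) {
        int mid = lo + (hi - lo) / 2;
        if (A[mid] <= key) lo = mid;  // key, if present, is in A[mid..hi)
        else               hi = mid;  // key, if present, is in A[lo..mid)
    }
    return (lo < (int)A.size() && A[lo] == key) ? lo : -1;
}
```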
http://www.cs.cornell.edu/courses/cs2112/2015fa/lectures/lec_loopinv/index.html
https://reprog.wordpress.com/2010/04/25/writing-correct-code-part-1-invariants-binary-search-part-4a/

http://www.cnblogs.com/youngforever/p/3200081.html
We use loop invariants to help us understand why an algorithm is correct. We must show three things about a loop invariant:
·         Initialization: It is true prior to the first iteration of the loop.
·         Maintenance: If it is true before an iteration of the loop, it remains true before the next iteration.
·         Termination: When the loop terminates, the invariant gives us a useful property that helps show that the algorithm is correct.

Comparing the loop invariant with mathematical induction:

When the first two properties hold, the loop invariant is true prior to every iteration of the loop. Note the similarity to mathematical induction, where to prove that a property holds, you prove a base case and an inductive step. Here, showing that the invariant holds before the first iteration is like the base case, and showing that the invariant holds from iteration to iteration is like the inductive step.

The third property is perhaps the most important one, since we are using the loop invariant to show correctness. It also differs from the usual use of mathematical induction, in which the inductive step is used infinitely; here, we stop the "induction" when the loop terminates.



http://analgorithmaday.blogspot.in/2011/02/loop-invariant-to-prove-correctness.html
This technique is like theorem proving: you assert that some property holds throughout, and the algorithm is correct only if that property actually holds. There are several popular ways to prove a theorem,

   - proof by induction
   - proof by contradiction
Examples
  • Insertion sort
    • loop invariant: for A[0..j-1], A[0] <= A[1] <= … <= A[j-1], where j-1 is the end of the sorted prefix.
  • Linear search
    • the loop invariant, which is always true: the key is not found in A[0..j-1], where j is the index we have searched up to so far. If j >= n, we have passed the end of A, so the key is not in the array.
  • Binary search
    • end is the last index, start is the first index of an array A, key is the value to search for.
    • loop invariant: if the key is in A at all, it lies in A[start..end]; equivalently, A[j] < key for every j before start and A[j] > key for every j after end, where start and end are updated from the mid index at each iteration.
Important points
  • Loop invariants are very useful for proving the correctness of a program, and they can be used for recursive as well as loop-based algorithms.
  • Invariants are basically ASSERTs in our code. An assert verifies a function's precondition, but loop invariants can be used to check preconditions, postconditions, and the returned value, which also makes them useful for unit testing.
  • Understanding the loop invariant of an algorithm catches bugs in code. This is the most important use of loop invariants.
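The linear-search bullet above maps directly onto code; here is a sketch with the invariant written where an ASSERT would go (function name mine):

```cpp
#include <vector>

// Linear search. The loop invariant -- "key does not occur in
// A[0..j-1]" -- holds before every iteration; when j reaches
// A.size(), the invariant implies the key is not in A at all.
int linearSearch(const std::vector<int>& A, int key) {
    for (std::size_t j = 0; j < A.size(); ++j) {
        // Invariant: key is not in A[0..j-1].
        if (A[j] == key) return (int)j;
    }
    return -1;  // invariant at termination: key not found anywhere
}
```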


http://www.drdobbs.com/cpp/loop-invariants-abbreviate-induction-pro/240169015
This example suggests a general technique for writing loops:
  • Define an invariant that describes the loop's behavior, independently of how many times it executes.
  • Choose a condition for the while statement that, when combined with the invariant, yields the behavior we want.
  • Write the body of the loop in a way that maintains the invariant while working toward eventually making the while condition false, so that the loop will always terminate.
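A small instance of this technique (the example is mine): define the invariant result * base^exp == x^n, choose the while condition exp != 0, and write a body that maintains the invariant while driving exp to zero. When the loop exits, the invariant yields result == x^n.

```cpp
// Fast exponentiation derived from an invariant.
// Invariant: result * base^exp == x^n  (in exact integer arithmetic)
long long power(long long x, unsigned n) {
    long long result = 1, base = x;
    unsigned exp = n;
    while (exp != 0) {                 // condition chosen so that
        if (exp & 1) result *= base;   // exiting gives result == x^n
        base *= base;
        exp >>= 1;                     // progress toward termination
    }
    return result;  // exp == 0, so the invariant gives result == x^n
}
```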
http://whiley.org/2013/01/29/understanding-loop-invariants-in-whiley/
http://homepages.ius.edu/RWISMAN/C455/html/notes/Chapter2/LoopInvariantProof.htm

Loop invariant proofs must show three parts:
  1. Initialization - the loop invariant holds prior to executing the loop, after initialization.
  2. Maintenance - if the loop invariant holds before an iteration, it still holds at the end of that iteration.
  3. Termination - the loop eventually terminates.


What we have here, then, is a recursion invariant, that is, an invariant property that is guaranteed to be preserved between recursive calls. The way to prove a recursion invariant is basically the same as we would a loop invariant: initiation, maintenance and termination. But instead of thinking in terms of how a loop changes certain variables, we think of states and the relationship between consecutive states.


In particular, we are interested in the first state (the initiation), the last state (the termination, which can be straightforwardly transformed into the final result), and the inductive step of generating a new state from the current one, assuming the current one satisfies the invariant. No mutation of variables and no notion of time to worry about; just the sequence of states.

And it gets even better: we don’t even need to think about the sequence itself. It suffices to establish the relationship between input and output. In terms of code, we have to establish the relationship between the first and second parameters in line 2 with the first and second parameters in lines 7 and 8.

To sum up, the next time you write code to solve some problem, try to think about what property your algorithm keeps throughout its execution. And, if possible, try to develop a (tail-)recursive version of it, so that you can prove that it works with much more elegance and simplicity. The key to understanding how an algorithm changes things is to observe what it does not change.

Euclid's Algorithm + Extended Euclidean Algorithm + Linear Congruences + Chinese Remainder Theorem


http://www.nowamagic.net/academy/detail/40110119
Euclid's algorithm:
Euclid's algorithm, also known as the method of successive division, computes the greatest common divisor of two integers a and b:
gcd(a, b) = gcd(b, a mod b)
Extended Euclidean algorithm
From the plain Euclidean algorithm we can derive the extended Euclidean algorithm: for non-negative integers a and b that are not both 0, with gcd(a, b) denoting their greatest common divisor, there must exist a pair of integers x, y such that gcd(a, b) = ax + by.
In other words, given two integers a and b, there is always a pair of integers x, y with ax + by = gcd(a, b). This theorem is also known as Bezout's identity.
void gcdEx(const int a,const int b,int& x,int& y){
    if (b){
//      gcdEx(b,a % b,x,y);
//      int t = x;
//      x = y;
//      y = t - (int)(a / b) * y;
        // The four lines above simplify to the two below
        gcdEx(b,a % b,y,x);
        y -= (int)(a / b) * x;
    } else {
        x = 1;
        y = 0;
    }
}
http://www.nowamagic.net/academy/detail/40110122
Given the equation 47x + 30y = 1, find integer solutions x, y.
Let the extended Euclidean algorithm be gcdEx(a, b, x, y), where a and b are given and x and y are to be found. Then gcdEx(a, b, x, y) computes as follows:
To explain the process:
1. On the way down, the plain Euclidean algorithm produces a series of intermediate values by repeated division: starting from gcdEx(47, 30, x, y) with a = 47 and b = 30, we get gcd(47, 30) = gcd(b, a mod b) = gcd(30, 17), and so on until b = 0.
2. When b = 0, we know x = 1 and y = 0. Once these two values are known, we can start substituting backward according to the extended Euclidean algorithm. Remember the relations? Here they are again:
x1 = y2
y1 = x2 - (int)(a / b) * y2
3. Treat the x, y one level up as x1, y1 and the "old" x, y below as x2, y2; by the relations derived from the Euclidean algorithm, x1 = y2 and y1 = x2 - (a / b) * y2, where a and b are the a, b of the current gcdEx call. When the recursion unwinds back to the first call, the resulting x and y are the answer, namely x = -7, y = 11.
The extended Euclidean algorithm keeps shrinking b until b equals 0, and then recovers x and y by working backward. Since b keeps shrinking, the recursion must reach b = 0 at some point, so it terminates; after that, the relations above are applied in reverse.
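The back-substitution can be verified mechanically. This sketch reuses the gcdEx routine quoted earlier; the assertion values follow the worked example above.

```cpp
// Extended Euclid, as in the routine quoted earlier in the post:
// computes x, y with a*x + b*y == gcd(a, b).
void gcdEx(int a, int b, int& x, int& y) {
    if (b) {
        gcdEx(b, a % b, y, x);  // x and y are swapped in the recursive call
        y -= (a / b) * x;       // then y1 = x2 - (a/b) * y2
    } else {
        x = 1;                  // gcd(a, 0) = a = a*1 + 0*0
        y = 0;
    }
}
```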

http://www.nowamagic.net/academy/detail/40110128
小珂是一名初中生,她现在很苦恼,因为老师布置了一个让她苦恼的作业,你能不能帮助她呢?题目信息如下:
已知二元一次方程 a*x+b*y=n,判断这个二元一次方程有没有整数解,x、y为未知数,其中a、b、n都为整数且不等于零,同时满足0 < a,b,n < 2^16-1。
根据扩展欧几里德算法我们知道,对于与不完全为 0 的非负整数 a 和 b,gcd(a,b)表示 a 和 b 的最大公约数,那么存在唯一的整数 x、y,使得 gcd(a,b)=ax+by。
那么问题现在就简单了,就是比较 gcd(a,b) 与 n,如果他们模为0,那么就 a*x+b*y=n 就有解,否则无解。
扩展欧几里德算法的应用主要有以下三方面:
(1)求解不定方程;
(2)求解模线性方程(线性同余方程);
(3)求解模的逆元;
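The divisibility test above can be sketched in a few lines (the helper name is mine; std::gcd requires C++17):

```cpp
#include <numeric>  // std::gcd (C++17)

// a*x + b*y = n has integer solutions iff gcd(a, b) divides n
// (a consequence of Bezout's identity).
bool hasIntegerSolution(int a, int b, int n) {
    return n % std::gcd(a, b) == 0;
}
```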

