# Week 1

2. Karatsuba Multiplication Algorithm
3. Sorting
    1. How does sorting work when all elements are distinct?
    2. How does sorting work when some elements repeat?
    3. How does sorting work when all elements are equal?
    4. Implement Merge-Sort, considering each of these cases separately.
4. Merge-Sort
    1. How does the Merge-Sort implementation differ when elements repeat?
    2. Running time: at most 6n * (log2(n) + 1) operations for an input of size n.
    3. Height of the recursion tree: log2(n) + 1 levels.
5. y = log(n) grows far more slowly than y = n; log(n) < n for all n >= 1.
6. Implementations: Merge-Sort, basic sorts, Karatsuba (if possible).
7. Asymptotic Analysis
    1. Coarse enough to suppress implementation details, yet fine enough to differentiate between algorithms.
8. Big-O Notation
    1. T(n) = O(F(n)) if and only if there exist constants c, n0 > 0 such that T(n) <= c * F(n) for all n >= n0.
9. Big-Omega Notation
    1. T(n) = Omega(F(n)) if and only if there exist constants c, n0 > 0 such that T(n) >= c * F(n) for all n >= n0.
10. Big-Theta Notation
    1. T(n) = Theta(F(n)) if and only if T(n) = O(F(n)) and T(n) = Omega(F(n)).
    2. Equivalently: there exist constants c1, c2, n0 > 0 such that c1 * F(n) <= T(n) <= c2 * F(n) for all n >= n0.
11. Little-o Notation
    1. T(n) = o(F(n)) if and only if for every constant c > 0, there exists a constant n0 such that T(n) <= c * F(n) for all n >= n0.
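The Merge-Sort items above can be sketched as follows (my own minimal illustration, not a prescribed implementation). Note that using `<=` in the merge step keeps equal elements in their original relative order (a stable sort), so inputs with repeated or all-equal elements need no special handling:

```python
def merge_sort(a):
    # Base case: lists of length 0 or 1 are already sorted.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # sort left half recursively
    right = merge_sort(a[mid:])   # sort right half recursively
    # Merge the two sorted halves. The `<=` comparison takes from the
    # left half on ties, which makes the sort stable.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])   # at most one of these two
    out.extend(right[j:])  # extends is non-empty
    return out
```

The recursion halves the input each level, giving the log2(n) + 1 levels noted above.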

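The Karatsuba algorithm listed above can be sketched like this (a sketch of my own, assuming nonnegative integers; the splitting scheme and names are illustrative):

```python
def karatsuba(x, y):
    # Base case: small numbers are multiplied directly.
    if x < 10 or y < 10:
        return x * y
    # Split each number at digit position m: x = a*10^m + b, y = c*10^m + d.
    m = max(len(str(x)), len(str(y))) // 2
    p = 10 ** m
    a, b = divmod(x, p)
    c, d = divmod(y, p)
    # Three recursive multiplications instead of four:
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    # Gauss's trick: (a+b)(c+d) - ac - bd = ad + bc.
    ad_plus_bc = karatsuba(a + b, c + d) - ac - bd
    return ac * p * p + ad_plus_bc * p + bd
```

The point of the trick is reducing four sub-multiplications to three, which is what beats the grade-school algorithm asymptotically.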
## Principles for Design and Analysis of Algorithms

1. Design algorithms for worst-case scenarios.
    1. Best fit when designing general-purpose sub-routines.
    2. If the target application or domain is known, average-case analysis and benchmarks may be considered; this requires domain knowledge.
    3. Worst-case analysis is usually easier than average-case analysis.
2. Constants are not important.
    1. Constants depend more on the architecture, compiler, and programmer.
    2. The granularity achieved at this level is perfectly acceptable.
3. Focus on asymptotic analysis.
    1. Focus on running time for large input sizes.

Fast algorithm = one whose worst-case running time grows slowly with increasing input size.
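As a sanity check on the Big-O definition, the merge-sort operation count 6n * (log2(n) + 1) from the notes can be verified numerically to be O(n * log2(n)), using the hand-picked witnesses c = 12 and n0 = 2 (my own choice of constants):

```python
import math

def T(n):
    # Merge-sort operation count from the notes: 6n * (log2(n) + 1)
    return 6 * n * (math.log2(n) + 1)

c, n0 = 12, 2
# Check T(n) <= c * n * log2(n) for a range of n >= n0.
# This holds because 6n <= 6n * log2(n) whenever log2(n) >= 1, i.e. n >= 2.
for n in range(n0, 5000):
    assert T(n) <= c * n * math.log2(n)
```

This is only a spot check over a finite range; the short argument in the comment is what makes the bound hold for all n >= 2.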