Detailed schedule (Fall 2023)

Week 1: Warmup (bubble sort, insertion sort, selection sort) and analysis.

August 30-September 1

Objectives:

  • Review searching (linear search, binary search) and simple sorting (bubble sort, selection sort, insertion sort), along with their analysis (see the sketches after this list)
  • Basics of algorithm analysis, big-Oh notation, best-case and worst-case analysis
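
As a concrete reference for the warmup material, here is a minimal Python sketch of binary search and insertion sort. It is an illustrative rendering, not the course's official pseudocode.

    def binary_search(a, target):
        """Return an index of target in sorted list a, or -1 if absent."""
        lo, hi = 0, len(a) - 1
        while lo <= hi:
            mid = (lo + hi) // 2       # midpoint of the current range
            if a[mid] == target:
                return mid
            elif a[mid] < target:
                lo = mid + 1           # discard the left half
            else:
                hi = mid - 1           # discard the right half
        return -1                      # O(lg n) iterations in the worst case

    def insertion_sort(a):
        """Sort list a in place: O(n^2) worst case, O(n) on sorted input."""
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0 and a[j] > key:   # shift larger elements right
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key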

Resources:


Week 2: Asymptotic Notation and Summations

September 4-8

Objectives:

  • Relevance of analysis in practice, as well as its assumptions and limitations
  • Definitions of O(), Ω(), Θ()
  • Rate of growth of common functions
  • Finding the rate of growth of a function
  • Comparing the order of growth of two arbitrary functions f(n), g(n)
  • Analyzing basic algorithm running times using O(), Ω(), Θ()
  • Arithmetic and geometric summations and their Θ() bound
  • Recognizing arithmetic and geometric summations in different forms and giving Θ() bounds (see the worked example after this list)
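
To make the last two objectives concrete: the arithmetic summation 1 + 2 + ... + n = n(n+1)/2 is Θ(n²), while the geometric summation 1 + 2 + 4 + ... + 2^n = 2^(n+1) − 1 is Θ(2^n). More generally, the geometric summation with ratio c is Θ(c^n) when c > 1, Θ(n) when c = 1, and Θ(1) when 0 < c < 1; in each case the bound is determined by the largest term.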

Resources:


Week 3: Mergesort and Recurrences

September 11-15

Objectives:

  • Mergesort: how it works, why it works, and its running time analysis (see the sketch after this list)
  • Express the running time of recursive algorithms using recurrences
  • Solve recurrences by repeated iteration
  • Recognize broad classes of recurrences (logarithmic, linear, Θ(n lg n), exponential)
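
As a companion to these objectives, here is a minimal Python sketch of mergesort (illustrative, not the course's official pseudocode). Its running time satisfies the recurrence T(n) = 2T(n/2) + cn; unrolling k times gives 2^k T(n/2^k) + k·cn, and stopping at k = lg n yields Θ(n lg n).

    def merge_sort(a):
        """Return a sorted copy of a; T(n) = 2T(n/2) + Theta(n) = Theta(n lg n)."""
        if len(a) <= 1:                # base case: already sorted
            return a
        mid = len(a) // 2
        left = merge_sort(a[:mid])     # solve two half-size subproblems
        right = merge_sort(a[mid:])
        # merge: repeatedly take the smaller front element -- Theta(n) work
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged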

Resources:


Week 4: Quicksort and Heapsort

September 18-22

Objectives:

  • Lomuto partition, Quicksort, Randomized-Quicksort and analysis (see the sketch after this list)
  • Min/max-binary heap and operations (deleteMin/Max, insert, heapify, buildheap) along with analysis
  • Heapsort, and why it sorts in place
  • Using the heap as a tool to solve new problems
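
A minimal Python sketch of Lomuto partition and randomized quicksort (illustrative only; the course's pseudocode may differ):

    import random

    def lomuto_partition(a, lo, hi):
        """Partition a[lo..hi] around pivot a[hi]; return the pivot's final index."""
        pivot = a[hi]
        i = lo - 1                     # boundary of the "<= pivot" region
        for j in range(lo, hi):
            if a[j] <= pivot:
                i += 1
                a[i], a[j] = a[j], a[i]
        a[i + 1], a[hi] = a[hi], a[i + 1]
        return i + 1

    def randomized_quicksort(a, lo=0, hi=None):
        """Sort a[lo..hi] in place; expected O(n lg n) over the random pivot choices."""
        if hi is None:
            hi = len(a) - 1
        if lo < hi:
            r = random.randint(lo, hi)   # random pivot guards against bad inputs
            a[r], a[hi] = a[hi], a[r]
            p = lomuto_partition(a, lo, hi)
            randomized_quicksort(a, lo, p - 1)
            randomized_quicksort(a, p + 1, hi)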

Resources:


Week 5: Sorting lower bound. Sorting without comparisons.

September 25-29

Objectives:

  • Can a sorting algorithm do better than Θ(n lg n) in the worst case? Understand the comparison-based sorting lower bound, when it applies, and what assumptions it makes
  • Non-comparison sorting: Understand BucketSort and CountingSort, their analysis and assumptions (a CountingSort sketch follows this list)
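
A minimal Python sketch of CountingSort, which sorts n integers in the range 0..k in Θ(n + k) time; it assumes the keys are small integers (illustrative only):

    def counting_sort(a, k):
        """Sort a list of integers in range 0..k; Theta(n + k) time, stable."""
        count = [0] * (k + 1)
        for x in a:                    # count occurrences of each key
            count[x] += 1
        for v in range(1, k + 1):      # prefix sums: count[v] = # of keys <= v
            count[v] += count[v - 1]
        out = [None] * len(a)
        for x in reversed(a):          # scan right-to-left to keep the sort stable
            count[x] -= 1
            out[count[x]] = x
        return out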

Resources:


Week 6: Exam 1 review and exam 1. Selection.

October 2-6

Objectives:

  • The selection problem
  • Quick-Select in expected O(n) time (see the sketch after this list)
  • An elegant and ingenious algorithm (median-of-medians) for selection that runs in O(n) in the worst case
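
A minimal Python sketch of Quick-Select with a random pivot (illustrative; the worst-case O(n) algorithm replaces the random pivot with a median-of-medians pivot):

    import random

    def quick_select(a, k):
        """Return the k-th smallest element of a (k = 1 is the minimum).
        Expected O(n) time with a random pivot."""
        a = list(a)                    # work on a copy
        lo, hi = 0, len(a) - 1
        while True:
            if lo == hi:
                return a[lo]
            r = random.randint(lo, hi)   # random pivot gives expected O(n)
            a[r], a[hi] = a[hi], a[r]
            pivot, i = a[hi], lo - 1     # Lomuto partition, as in quicksort
            for j in range(lo, hi):
                if a[j] <= pivot:
                    i += 1
                    a[i], a[j] = a[j], a[i]
            a[i + 1], a[hi] = a[hi], a[i + 1]
            p = i + 1                    # pivot's final (0-based) position
            if k - 1 == p:
                return a[p]
            elif k - 1 < p:
                hi = p - 1               # keep only the side containing rank k
            else:
                lo = p + 1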

Resources:


What do you do when you want to solve a problem and you don’t know where to start? Although there are no recipes, there are some techniques that come up frequently.

Week 7: Divide-and-conquer

October 11-13 (half week)

Objectives:

  • The divide-and-conquer technique
  • Karatsuba’s integer multiplication algorithm (see the sketch after this list)
  • Strassen’s matrix multiplication
  • Using D&C to solve new problems
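
A minimal Python sketch of Karatsuba's algorithm, which replaces four recursive multiplications with three, giving roughly n^1.585 digit operations (illustrative only):

    def karatsuba(x, y):
        """Multiply non-negative integers x and y recursively."""
        if x < 10 or y < 10:           # base case: a single-digit factor
            return x * y
        m = max(len(str(x)), len(str(y))) // 2
        shift = 10 ** m
        a, b = divmod(x, shift)        # x = a*10^m + b
        c, d = divmod(y, shift)        # y = c*10^m + d
        ac = karatsuba(a, c)
        bd = karatsuba(b, d)
        # (a+b)(c+d) - ac - bd = ad + bc, saving one of the four products
        mid = karatsuba(a + b, c + d) - ac - bd
        return ac * shift * shift + mid * shift + bd

    print(karatsuba(1234, 5678))       # prints 7006652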

Resources:


Week 8, 9: Dynamic Programming

October 16-27

Objectives:

  • The dynamic programming technique
  • Examples: Fibonacci, board game, rod cutting and knapsack (a rod-cutting sketch follows this list)
  • Justification of correctness via optimal substructure, and analysis
  • Using DP to solve new problems
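
As one concrete instance of the technique, a minimal Python sketch of bottom-up rod cutting (illustrative; the price table is the classic textbook example):

    def rod_cut(prices, n):
        """Max revenue from cutting a rod of length n, where prices[i] is the
        price of a piece of length i (prices[0] unused). Theta(n^2) time."""
        best = [0] * (n + 1)           # best[j] = optimal revenue for length j
        for j in range(1, n + 1):
            # optimal substructure: a first piece of length i, then an
            # optimal solution for the remaining length j - i
            best[j] = max(prices[i] + best[j - i] for i in range(1, j + 1))
        return best[n]

    prices = [0, 1, 5, 8, 9, 10, 17, 17, 20, 24, 30]   # lengths 1..10
    print(rod_cut(prices, 7))          # prints 18 (e.g. cut as 1 + 6)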

Resources:


Week 10: The Greedy technique.

October 30-November 3

Objectives:

  • The greedy technique via the activity selection problem, including the correctness justification (see the sketch after this list)
  • Using the greedy technique to solve new problems
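
A minimal Python sketch of greedy activity selection: sort by finish time and always take the earliest-finishing compatible activity (illustrative only):

    def select_activities(activities):
        """Pick a maximum-size set of pairwise non-overlapping activities,
        given as (start, finish) pairs. O(n lg n) time, dominated by the sort."""
        result, last_finish = [], float("-inf")
        # greedy choice: the compatible activity that finishes earliest
        for start, finish in sorted(activities, key=lambda a: a[1]):
            if start >= last_finish:
                result.append((start, finish))
                last_finish = finish
        return result

    print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9),
                             (6, 10), (8, 11), (8, 12), (2, 14), (12, 16)]))
    # prints [(1, 4), (5, 7), (8, 11), (12, 16)]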

Resources:



Week 11: Review and exam 2

November 6, 10

Resources: exam2-review

  • Friday Nov 10: Exam 2 in class

Week 11.5, 12, 13.5: Graphs Basics. BFS and DFS and their applications. Topological order. Dynamic programming on DAGs.

Nov 9, 14, 16, 17, 21

Objectives:

  • Graph representation: adjacency list and adjacency matrix
  • Basic concepts: sparse, complete, dense, trees, paths, connectivity, connected components, reachability, strongly connected components
  • Breadth-first and depth-first traversals, their complexity, and their properties
  • Topological order and two algorithms for computing a topological order (one is sketched after this list)
  • Dynamic programming on DAGs
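
As one concrete example, a minimal Python sketch of Kahn's in-degree algorithm, one of the two standard ways to compute a topological order (illustrative only):

    from collections import deque

    def topological_order(n, edges):
        """Topological order of a DAG with vertices 0..n-1, given as a list
        of directed edges (u, v). Theta(V + E) time."""
        adj = [[] for _ in range(n)]   # adjacency-list representation
        indeg = [0] * n
        for u, v in edges:
            adj[u].append(v)
            indeg[v] += 1
        q = deque(v for v in range(n) if indeg[v] == 0)   # sources first
        order = []
        while q:
            u = q.popleft()
            order.append(u)
            for v in adj[u]:           # removing u frees up its successors
                indeg[v] -= 1
                if indeg[v] == 0:
                    q.append(v)
        if len(order) < n:             # leftover vertices lie on a cycle
            raise ValueError("graph has a cycle, no topological order")
        return order

    print(topological_order(4, [(0, 1), (0, 2), (1, 3), (2, 3)]))  # [0, 1, 2, 3]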

Resources:


Nov 22-26: Thanksgiving break!


Week 14: Shortest paths on DAGs, Dijkstra and Bellman-Ford.

November 27 - December 1

We discuss shortest paths on graphs and see some of the nicest algorithms in computer science: Dijkstra’s and Bellman-Ford’s algorithms. While describing them we try to understand some common principles that guided their design. We’ll see that Bellman-Ford’s algorithm uses dynamic programming and Dijkstra’s is a greedy algorithm, making these nice applications of the techniques we studied earlier in the semester.

Objectives:

  • Shortest paths (SP) on DAGs via dynamic programming
  • Dijkstra’s SP algorithm on graphs without negative weights (see the sketch after this list)
  • Bellman-Ford’s SP algorithm on general graphs
  • Correctness and analysis
  • Finding negative cycles
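
A minimal Python sketch of Dijkstra's algorithm with a binary heap (illustrative; it assumes non-negative weights and represents the graph as an adjacency dict):

    import heapq

    def dijkstra(adj, source):
        """Single-source shortest paths with non-negative edge weights.
        adj[u] is a list of (v, w) pairs; returns a dict of distances.
        O((V + E) lg V) with a binary heap."""
        dist = {source: 0}
        pq = [(0, source)]             # (distance estimate, vertex)
        while pq:
            d, u = heapq.heappop(pq)
            if d > dist.get(u, float("inf")):
                continue               # stale entry: u was already finalized
            for v, w in adj[u]:        # relax every edge out of u
                if d + w < dist.get(v, float("inf")):
                    dist[v] = d + w
                    heapq.heappush(pq, (d + w, v))
        return dist

    adj = {0: [(1, 4), (2, 1)], 1: [(3, 1)], 2: [(1, 2), (3, 5)], 3: []}
    print(dijkstra(adj, 0))            # {0: 0, 1: 3, 2: 1, 3: 4}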

Resources:


Week 15: The Minimum Spanning Tree (MST)

December 4-8

Another fundamental problem on graphs is computing an MST. We discuss some properties of MSTs which will give us intuition for how to compute an MST efficiently. We’ll glance at two well-known algorithms, Prim’s and Kruskal’s, which are both greedy algorithms much in the spirit of Dijkstra. Their correctness follows from a neat result called the Cut Theorem.

Objectives:

  • MST (minimum spanning tree) and properties
  • Kruskal’s and Prim’s algorithms, and the Cut Theorem which captures their correctness (a Kruskal sketch follows this list)
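
A minimal Python sketch of Kruskal's algorithm with a simple union-find (illustrative only):

    def kruskal(n, edges):
        """Kruskal's MST on vertices 0..n-1; edges are (weight, u, v) triples.
        Returns the chosen edges. O(E lg E) time, dominated by the sort."""
        parent = list(range(n))        # union-find with path compression

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]   # halve the path as we climb
                x = parent[x]
            return x

        mst = []
        for w, u, v in sorted(edges):  # consider edges in increasing weight
            ru, rv = find(u), find(v)
            if ru != rv:               # safe by the Cut Theorem: no cycle formed
                parent[ru] = rv
                mst.append((w, u, v))
        return mst

    print(kruskal(4, [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 2, 3), (5, 1, 3)]))
    # prints [(1, 0, 1), (2, 2, 3), (3, 1, 2)]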

Resources:


Exam 3: [final exam slot as posted in Polaris]