# Dynamic programming

Julián Mestre

School of Information Technologies, The University of Sydney


Algorithms and Complexity

Dynamic Programming – Julián Mestre

A counting problem

How many ways are there to go up a staircase with n steps when you can go up one or two steps at a time?

Let F(n) be this number. Then
– F(0) = 1
– F(1) = 1
– F(2) = 2
– F(3) = 3
– F(4) = 5

What’s the pattern?


A counting problem

To see the pattern, condition on the last move:
– There are F(n-1) ways of ending with a 1-step move
– There are F(n-2) ways of ending with a 2-step move

F(n) = F(n-1) + F(n-2) for n > 1


    def fib(n):
        if n <= 1:
            return 1
        else:
            return fib(n - 1) + fib(n - 2)


Time complexity

Let T(n) be the running time of our algorithm. Then

T(n) = T(n-1) + T(n-2) + O(1) for n > 1

which is at least Ω(2^(n/2))

What can be done to speed up the algorithm?

Reuse computation!
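One direct way to reuse computation is memoization: cache each F(i) the first time it is computed, so the recursion tree collapses to n distinct calls. A minimal sketch using Python’s `functools.lru_cache`:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Each distinct n is computed once; repeated calls hit the cache.
    if n <= 1:
        return 1
    return fib(n - 1) + fib(n - 2)
```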


Iterative algorithm


    def fib(n):
        M = [1] * (n + 1)  # array of length n + 1; base cases M[0] = M[1] = 1
        for i in range(2, n + 1):
            M[i] = M[i - 1] + M[i - 2]
        return M[n]
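Since each entry depends only on the previous two, the table can be replaced by two variables; a sketch of this constant-space variant:

```python
def fib(n):
    # Keep only the last two values, F(i-2) and F(i-1).
    prev, curr = 1, 1  # F(0), F(1)
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr
```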


Weighted Interval Scheduling


Motivation:
– Users submit requests to use some common resource (e.g., a classroom)
– Each request has a time window in which the resource is needed
– Users cannot share the resource
– Users have priorities

Input:
– Set of weighted intervals {I1, I2, …, In} where Ii = (s(i), f(i)) with weight w(i) > 0

Task:
– Find a maximum-weight subset of pairwise non-intersecting intervals

Note: if all weights are 1, this is the ordinary (unweighted) interval scheduling problem.


The structure of optimum

Assume the intervals are sorted so that f(1) ≤ f(2) ≤ … ≤ f(n), and let OPT(i) be an optimal solution for intervals {1, …, i}

Obs 1: If interval n ∉ OPT(n), then OPT(n) = OPT(n-1)

Obs 2: If interval n ∈ OPT(n), then OPT(n) = {n} ∪ OPT(p(n)), where p(i) is the largest index j such that f(j) ≤ s(i)

Let M(i) be the value of the optimal solution, with M(0) = 0. Then

M(i) = max { M(i-1), w(i) + M(p(i)) } for i > 0


Recursive algorithm


    def MWIS(intervals, w):
        def helper(i):
            if i == 0:
                return 0
            return max(helper(i - 1), w[i] + helper(p[i]))

        sort intervals in increasing f-value
        compute p-values for each interval
        return helper(n)


Time complexity

Let T(n) be the running time of the algorithm. On an instance where p(i) = i − 2 for each i > 2 (each interval overlaps only the next one), we get

T(n) ≥ T(n-1) + T(n-2) + O(1)

which, as we already saw, is exponential!


Iterative algorithm


    def MWIS(intervals, w):
        n = len(intervals)
        sort intervals in increasing f-value
        compute p-values for each interval
        M = array of length n + 1
        M[0] = 0
        for i in range(1, n + 1):
            M[i] = max(M[i - 1], w[i] + M[p[i]])
        return M[n]


Time complexity

Sorting the intervals takes O(n log n) time

Computing the p-values can be done in O(n log n) time
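One way to get that bound (a sketch, assuming the intervals are (start, finish) pairs already sorted by finish time and 1-indexed via a dummy entry at position 0): binary-search each start time against the sorted list of finish times with the standard `bisect` module.

```python
from bisect import bisect_right

def compute_p(intervals):
    # intervals[1..n] are (start, finish) pairs sorted by finish time;
    # intervals[0] is a dummy placeholder so indices match the recurrence.
    finish = [f for _, f in intervals[1:]]
    p = [0] * len(intervals)
    for i in range(1, len(intervals)):
        s = intervals[i][0]
        # p[i] = largest j whose finish time is at most s(i).
        p[i] = bisect_right(finish, s)
    return p
```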

There are n iterations, each taking O(1) time, so computing M takes O(n) time

Overall the algorithm runs in O(n log n) time


Recovering the optimal solution

    def MWIS(intervals, w):
        def helper(i):
            if i == 0:
                return []
            if M[i] == M[i-1]:
                return helper(i - 1)
            else:
                return helper(p[i]) + [i]

        sort intervals in increasing f-value
        compute p-values for each interval
        compute M-values
        return helper(n)
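Putting the pieces together, a runnable sketch of the whole pipeline (sorting by finish time, p-values via binary search, the M table, and the traceback); the 1-indexing with dummy entries and the helper names are my own choices, not from the slides:

```python
from bisect import bisect_right

def mwis(intervals, w):
    # intervals: list of (start, finish) pairs; w: parallel list of weights.
    order = sorted(range(len(intervals)), key=lambda i: intervals[i][1])
    ivs = [None] + [intervals[i] for i in order]  # 1-indexed by finish time
    wt = [0] + [w[i] for i in order]
    n = len(intervals)

    # p[i] = largest j with f(j) <= s(i), found by binary search.
    finish = [f for _, f in ivs[1:]]
    p = [0] + [bisect_right(finish, ivs[i][0]) for i in range(1, n + 1)]

    # Fill the DP table bottom-up.
    M = [0] * (n + 1)
    for i in range(1, n + 1):
        M[i] = max(M[i - 1], wt[i] + M[p[i]])

    # Traceback: recover the chosen intervals as original indices.
    sol, i = [], n
    while i > 0:
        if M[i] == M[i - 1]:
            i -= 1
        else:
            sol.append(order[i - 1])
            i = p[i]
    return M[n], sorted(sol)
```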


Longest increasing subsequence


Input:
– Unsorted array A with n numbers

Task:
– Find the longest sequence of indices i1 < i2 < … < ik such that A[i1] < A[i2] < … < A[ik]
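The slide leaves the algorithm as an exercise; one standard O(n²) DP sketch, where L[i] is the length of the longest increasing subsequence ending at position i, follows the same recipe:

```python
def lis_length(A):
    n = len(A)
    if n == 0:
        return 0
    # L[i] = length of the longest increasing subsequence ending at A[i].
    L = [1] * n
    for i in range(n):
        for j in range(i):
            if A[j] < A[i]:
                L[i] = max(L[i], L[j] + 1)
    return max(L)
```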


Knapsack


Motivation:
– You need to design an investment portfolio for yourself
– There are n investment options, each with an associated profit and cost
– Investment decisions are binary (all or nothing)
– Given your budget, find the best set of investments

Input:
– Set of pairs (w1, v1), (w2, v2), …, (wn, vn) and knapsack capacity W

Task:
– Find S ⊆ {1, …, n} maximizing v(S) subject to w(S) ≤ W
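The DP for this problem is not spelled out on the slide; a sketch of the classic O(nW) table (assuming integer weights), where M[i][c] is the best value achievable with the first i items within capacity c:

```python
def knapsack(items, W):
    # items: list of (weight, value) pairs; W: integer knapsack capacity.
    n = len(items)
    # M[i][c] = max value using the first i items with capacity c.
    M = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        w, v = items[i - 1]
        for c in range(W + 1):
            M[i][c] = M[i - 1][c]  # option 1: skip item i
            if w <= c:             # option 2: take item i, if it fits
                M[i][c] = max(M[i][c], v + M[i - 1][c - w])
    return M[n][W]
```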


Recap: Dynamic programming (DP)

DP algorithms have three distinctive ingredients:
– A big subproblem is broken up into smaller subproblems
– The solution of a subproblem can be expressed recursively
– There is an ordering of the subproblems from “small” to “large” such that to solve a subproblem we only need the solutions to “smaller” subproblems

Correctness depends on the correctness of the recurrence

Time complexity is usually dominated by (# of DP states) × (time to fill in one state)
