*In mathematics and computer science, dynamic programming (DP) is a method of solving problems exhibiting the properties of overlapping subproblems and optimal substructure that takes much less time than naive methods.* (from Wikipedia)

DP is a powerful technique. It is a three-step approach: break the problem into smaller sub-problems (states), solve them, and then use the solutions to the sub-problems to solve the original problem. The sub-problems are themselves divided recursively into further sub-problems until we reach a base case.

The classic **coin denomination** problem is a good example. Given a set of coins V = {V1, V2, ..., Vn}, you have to find the least number of coins required to make a sum S. Say you have coins with denominations V = {1, 3, 5} and you need to find the least coins for the sum S = 11. If you know the least coins for the sums S - Vj, i.e. 10 (11-1), 8 (11-3) and 6 (11-5), then you can find it for S = 11. The sums S - Vj are the sub-problems for this problem.

Coins(i) = 0 for i = 0

Coins(i) = min(Coins(i - Vj) + 1) over all Vj in V with Vj <= i, for i > 0

```
coinDenominations(coins V, sum S):
    coins[1 to S] = infinity
    deno[1 to S] = undefined
    coins[0] = 0
    for i = 1 to S:
        for each coin Vj in V with Vj <= i:
            if coins[i - Vj] + 1 < coins[i]:
                coins[i] = coins[i - Vj] + 1
                deno[i] = Vj
```

The least number of coins for S is read from coins[], and the actual coins are recovered from deno[]. For S = 11, the answer is 3 coins (1, 5, 5). Time efficiency is O(S*n), where S is the required sum and n is the number of coins.
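The pseudocode above can be sketched in Python as follows. The function name and the reconstruction helper are my own; the DP table `coins` and the back-pointer table `deno` follow the pseudocode directly.

```python
import math

def coin_denominations(V, S):
    """DP over sums 0..S: coins[i] = least coins needed for sum i,
    deno[i] = the last coin used to reach sum i (for reconstruction)."""
    coins = [0] + [math.inf] * S
    deno = [None] * (S + 1)
    for i in range(1, S + 1):
        for v in V:
            if v <= i and coins[i - v] + 1 < coins[i]:
                coins[i] = coins[i - v] + 1
                deno[i] = v
    # Walk deno[] backwards to recover the actual coins used.
    used, i = [], S
    while i > 0 and deno[i] is not None:
        used.append(deno[i])
        i -= deno[i]
    return coins[S], used
```

For V = {1, 3, 5} and S = 11 this returns a count of 3, with the coins (1, 5, 5) as described above.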

*The greedy approach would not work for this problem in some cases. For example, with coins V = {1, 7, 10} and S = 14, the greedy approach would first take the coin with the maximum denomination, 10, and end up with (10, 1, 1, 1, 1). But for 14, the answer should be (7, 7).*
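To make the greedy failure concrete, here is a hedged sketch (function names are mine) that pits the obvious greedy strategy against the DP recurrence on exactly this case:

```python
import math

def greedy_coins(V, S):
    """Greedy: repeatedly take the largest coin that still fits."""
    used = []
    for v in sorted(V, reverse=True):
        while S >= v:
            used.append(v)
            S -= v
    return used

def dp_min_coins(V, S):
    """The DP recurrence: coins[i] = min over Vj of coins[i - Vj] + 1."""
    coins = [0] + [math.inf] * S
    for i in range(1, S + 1):
        for v in V:
            if v <= i:
                coins[i] = min(coins[i], coins[i - v] + 1)
    return coins[S]

# greedy_coins([1, 7, 10], 14) -> [10, 1, 1, 1, 1], i.e. 5 coins
# dp_min_coins([1, 7, 10], 14) -> 2, i.e. (7, 7)
```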

Below is another problem, where we want to find the **longest increasing subsequence** of given numbers. In the case below, the numbers in **bold** form the longest one, i.e. *3, 4, 6, 7* and not *3, 4, 8*.

`i:   1 2 3 4 5 6`

A[]: 5, **3**, **4**, *8*, **6**, **7**

The approach in this problem is to maintain seqCount[] for every element; initially it is 1, i.e. the element by itself. previous[] holds the previous number in the sequence for any number. For any number at index i in the list, we compare seqCount against all the predecessors j of i, i.e. j = i-1 down to 1. If we find a better seqCount, we update it and also previous[]. For example, at i = 4, A = 8 and seqCount = 3. At i = 5, A = 6 and seqCount = 1 initially, since 6 < 8. Then we compare 6 to 4 (whose seqCount is 2), and so we get seqCount as 3.

seqCount(i) = 1 initially for all i

seqCount(i) = max(1 + seqCount(j)) for j < i and A(i) > A(j)

```
longestIncreasingSubsequence(int[] A):
    seqCount = new int[A.length]
    seqCount[..] = 1
    previous = new int[A.length]
    previous[..] = -1
    for i = 0; i < A.length; i++:
        for j = i - 1; j >= 0; j--:
            if A[j] < A[i]:
                if seqCount[j] + 1 > seqCount[i]:
                    seqCount[i] = seqCount[j] + 1
                    previous[i] = j
```
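The same algorithm in Python, with the subsequence reconstructed by walking previous[] back from the best endpoint (the reconstruction step is my addition; the DP itself follows the pseudocode):

```python
def longest_increasing_subsequence(A):
    """O(n^2) DP: seq_count[i] = length of the longest increasing
    subsequence ending at A[i]; previous[i] links back one step."""
    n = len(A)
    seq_count = [1] * n
    previous = [-1] * n
    for i in range(n):
        for j in range(i - 1, -1, -1):
            if A[j] < A[i] and seq_count[j] + 1 > seq_count[i]:
                seq_count[i] = seq_count[j] + 1
                previous[i] = j
    # Reconstruct by following previous[] from the index with the best count.
    end = max(range(n), key=lambda i: seq_count[i])
    result = []
    while end != -1:
        result.append(A[end])
        end = previous[end]
    return result[::-1]
```

For the example above, `longest_increasing_subsequence([5, 3, 4, 8, 6, 7])` yields `[3, 4, 6, 7]`.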

Time efficiency is O(n^2). We can do better: instead of scanning the predecessors of i sequentially (`for j = i-1 down to 1`), once we find the *largest number less than A[i]*, we can break out of the inner loop. This can be done using a binary tree, where the lookup for the *largest number less than A[i]* takes O(log n) time, so the overall efficiency would be O(n log n).
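A common way to reach O(n log n) — a variant of the idea above, using binary search over a sorted "tails" array rather than a binary tree — computes the length of the LIS like this (a sketch; it returns only the length, not the sequence):

```python
from bisect import bisect_left

def lis_length(A):
    """O(n log n) length-only variant: tails[k] holds the smallest
    possible tail of an increasing subsequence of length k + 1.
    Each element either extends the longest subsequence found so far
    or improves (lowers) one of the tails."""
    tails = []
    for x in A:
        pos = bisect_left(tails, x)  # leftmost tail >= x
        if pos == len(tails):
            tails.append(x)          # x extends the longest subsequence
        else:
            tails[pos] = x           # x improves an existing tail
    return len(tails)
```

On the example `[5, 3, 4, 8, 6, 7]` this gives 4, matching the O(n^2) version.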

DP has polynomial complexity but the issue is usually space and not time. I might update this article as I encounter interesting complex problems that I am able to solve. :P

[Hat tip to D at topcoder]
