http://www.ics.uci.edu/~eppstein/161/960229.html – Longest common subsequence
Book Chapter : http://www.sce.carleton.ca/faculty/chinneck/po/Chapter15.pdf
More on Dynamic Programming
Dynamic Programming is a technique that trades space for time: it stores previously computed results (which may be needed again) in memory, avoiding the need to recompute them every time.
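As a minimal sketch of this space-for-time trade, here is the classic Fibonacci example (not from the notes above, just an illustration): the naive recursion recomputes the same subproblems exponentially often, while a cached version computes each distinct argument once.

```python
from functools import lru_cache

def fib_naive(n):
    # Recomputes the same subproblems over and over (exponential time).
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    # Each distinct argument is computed once and stored in the cache.
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(50))  # fast; fib_naive(50) would take far too long
```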
An example of a problem that can be solved with this technique is the longest common subsequence (LCS) problem.
Say we want to calculate the length of the LCS of two strings.
LCS("ABCD", "BD") = LCS("ABC", "B") + "D"
If the last characters match (here "D"), compute the LCS of the remaining portions and append that character. If the last characters do not match, remove the last character from one of the strings, match the rest against the other string, and keep the better of the two choices.
To find the length, we can write it as a recursive function
LCS("ABCD", "BD") = 1 + LCS("ABC", "B")  (since the last characters match)
LCS("ABCD", "BDM") = max(LCS("ABC", "BDM"), LCS("ABCD", "BD"))  (since the last characters don't match)
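The recurrence above translates directly into a recursive function; a minimal sketch (function name `lcs` is my own choice):

```python
def lcs(a, b):
    # Base case: an empty string has LCS length 0 with anything.
    if not a or not b:
        return 0
    if a[-1] == b[-1]:
        # Last characters match: count it and recurse on the rest.
        return 1 + lcs(a[:-1], b[:-1])
    # Last characters differ: drop the last char of one string or the other.
    return max(lcs(a[:-1], b), lcs(a, b[:-1]))

print(lcs("ABCD", "BD"))   # 2 ("BD")
print(lcs("ABCD", "BDM"))  # 2
```

As written this is the plain recursion, with no caching yet; it makes many calls with identical arguments.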
The recursion makes a lot of calls, most of which have identical arguments. Note that LCS can be called with at most (M+1)(N+1) distinct arguments, since each argument is determined by the length of a prefix of one of the two strings. That is, there are only polynomially many subproblems, so we can apply the dynamic programming technique.
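Since there are only (M+1)(N+1) subproblems, we can fill a table bottom-up instead of recursing. A sketch of this tabulated version (names `lcs_dp` and `table` are my own):

```python
def lcs_dp(a, b):
    m, n = len(a), len(b)
    # table[i][j] = LCS length of the prefixes a[:i] and b[:j];
    # (M+1)(N+1) entries, each computed exactly once.
    table = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                # Last characters of the prefixes match: extend the diagonal.
                table[i][j] = 1 + table[i - 1][j - 1]
            else:
                # Otherwise drop the last character of one prefix or the other.
                table[i][j] = max(table[i - 1][j], table[i][j - 1])
    return table[m][n]

print(lcs_dp("ABCD", "BDM"))  # 2
```

This runs in O(MN) time and space, versus the exponential time of the plain recursion.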
For dynamic programming to apply, the principle of optimality must be satisfied: for the global problem to be solved optimally, each subproblem must be solved optimally. (Not all optimization problems satisfy this; sometimes it is better to lose a little on one subproblem to make a big gain on another.) In addition, there must be only polynomially many subproblems.
Other problems that can be solved using the DP technique include:
All-pairs shortest paths
Matrix chain multiplication
Regular expression matching
Longest ascending subsequence
Reliable machine design
Independent set in a graph
Edit distance (insert, delete, replace)
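The last item, edit distance, follows the same prefix-table pattern as LCS. A sketch under the standard unit-cost model (insert, delete, replace each cost 1; the name `edit_distance` is my own):

```python
def edit_distance(a, b):
    m, n = len(a), len(b)
    # d[i][j] = minimum edits to turn the prefix a[:i] into b[:j]
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i  # delete all i characters
    for j in range(n + 1):
        d[0][j] = j  # insert all j characters
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1  # replace only if chars differ
            d[i][j] = min(d[i - 1][j] + 1,        # delete a[i-1]
                          d[i][j - 1] + 1,        # insert b[j-1]
                          d[i - 1][j - 1] + cost) # replace (or keep)
    return d[m][n]

print(edit_distance("kitten", "sitting"))  # 3
```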