Wednesday, June 10, 2015


I've been doing some old Google Code Jam problems for fun. One paradigm that comes up repeatedly as a useful tool is Dynamic Programming (DP). In the problem "Welcome to Code Jam", for example, the running time of a naive approach grows dramatically with large datasets. A DP algorithm, on the other hand, solves the problem handily in O(mn) time, where m is the number of characters in the problem text and n is the number of characters in "welcome to code jam".
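If I recall the problem correctly, it asks how many times the phrase appears in the text as a subsequence (modulo 10000, since the judge wants the last four digits). A sketch of the standard O(mn) DP, under that assumption, might look like this:

```python
def count_subsequences(text, target="welcome to code jam", mod=10**4):
    # dp[j] = number of ways to form the first j characters of target
    # as a subsequence of the portion of text scanned so far.
    dp = [0] * (len(target) + 1)
    dp[0] = 1  # the empty prefix can always be formed in exactly one way
    for ch in text:
        # Iterate j backwards so each text character is used at most once
        # per prefix length.
        for j in range(len(target), 0, -1):
            if ch == target[j - 1]:
                dp[j] = (dp[j] + dp[j - 1]) % mod
    return dp[len(target)]

print(count_subsequences("welcome to code jam"))  # the phrase itself: 1 way
```

Each character of the text touches at most n entries of the dp array, giving the O(mn) bound.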

So what does the difference in running time actually look like, between a naive and DP algorithm? In this case, one can conceive of visualizing the number of array accesses required for each logical step of the respective algorithms. As a start, I wrote a Python module to help with logging array accesses (in a slightly more generalized manner, with support for tracking basic interactions across a number of types using Abstract Base Classes):