## Monday, January 27, 2014

### Project Euler 26 - 35

I find myself using more functional-style constructs these days, e.g. recursion, list comprehensions, and functions whose single-line returns check multi-clause conditions. There's an elegance to the simple, piecemeal building blocks. My next big project is to learn Haskell...

## Friday, January 24, 2014

### Project Euler 21-25

Two of the problems require finding the sum of all proper divisors of a number. The algorithm needs to be reasonably efficient, since it is iterated over a non-trivial set of numbers (e.g. all integers from 1 to 10,000).

The approach I've implemented runs well enough (about 1 second for all integers from 1 to 10,000), but I suspect there's a much faster approach to part 2.

- As an optimization step, pre-generate a list of primes via a sieve. Decompose each number into its prime factors, e.g. [2, 2, 7] for the number 28.
- For each number, generate the list of proper divisors from the list of prime factors via the following steps:
  - Generate the power set of the list of prime factors. Trim the head (the empty set) and the tail (whose product is the number itself). Using [2, 2, 7] as an example, the trimmed power set is [[2], [2, 2], [7], [2, 7]].
  - For each list in the power set, take the product of its elements. Including the number 1 as a divisor, this results in [1, 2, 4, 7, 14].

Given the list of proper divisors, take its sum, e.g. 1 + 2 + 4 + 7 + 14 = 28.
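The steps above can be sketched in Python. This is a minimal sketch, not the code I actually ran; the function names are my own, and the prime list is hard-coded rather than sieved:

```python
from itertools import combinations
from math import prod

def prime_factors(n, primes):
    """Decompose n into its prime factors, e.g. 28 -> [2, 2, 7]."""
    factors = []
    for p in primes:
        while n % p == 0:
            factors.append(p)
            n //= p
        if n == 1:
            break
    return factors

def proper_divisors(n, primes):
    """Proper divisors of n, via products over subsets of its prime factors."""
    factors = prime_factors(n, primes)
    divisors = {1}
    # Power set, minus the empty set and the full set (whose product is n itself).
    # A set collects the products, since repeated factors yield duplicate subsets.
    for r in range(1, len(factors)):
        for subset in combinations(factors, r):
            divisors.add(prod(subset))
    return sorted(divisors)

primes = [2, 3, 5, 7]  # enough for this example; pre-generate via a sieve in practice
print(proper_divisors(28, primes))  # -> [1, 2, 4, 7, 14]
print(sum(proper_divisors(28, primes)))  # -> 28, so 28 is a perfect number
```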

### Project Euler 11-20

## Wednesday, January 22, 2014

### Project Euler 1-10 & IPython

I've started working my way through the Project Euler problems. Even in the first ten problems, it's been interesting to see how brute-force search approaches compare to solutions more oriented in number theory.
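Problem 1 (whose statement is public: sum the multiples of 3 or 5 below 1000) already shows the contrast. A sketch of the two approaches, with names of my own choosing:

```python
# Brute-force search: test every number below the limit.
brute = sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0)

# Number theory: inclusion-exclusion over closed-form arithmetic series.
def series_sum(k, limit):
    """Sum of positive multiples of k below limit: k * (1 + 2 + ... + m)."""
    m = (limit - 1) // k
    return k * m * (m + 1) // 2

closed_form = series_sum(3, 1000) + series_sum(5, 1000) - series_sum(15, 1000)
assert brute == closed_form  # both give 233168
```

The closed form runs in constant time regardless of the limit, where the search scales linearly.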

For this kind of modular problem solving, IPython has proved a fantastic environment to work in for a number of reasons:

- Simple text markdown + MathJax support for referencing problem text
- The "cells" allow for easy recording and reference of intermediate results
- The "magic" functions like "%timeit" and "%prun" are useful for simple timing and profiling
- Integrated packages/languages are easily accessible -- so far I've used SymPy and Cython

## Thursday, January 9, 2014

### Top Four

Evernote has a new programming challenge, which I found to be more interesting than the other three posted previously:

> Given an unordered list of N elements, write a function to find the top four elements of the given list. Make sure your function runs in linear time O(N).

The solution is trivial in O(N log N), i.e. the optimal running time of comparison-sort algorithms; it's not possible to sort the entire list any faster in a deterministic way. But because only the top four elements are required, we just need to identify the fourth-largest element in O(N) time.

So, a selection algorithm. The fourth largest element can be found in linear time by partitioning the input list recursively, using the median of medians to identify optimal pivot points. I recall that this algorithm was taught in the early weeks of Algorithms, Part I -- and it's still the most surprising application of divide-and-conquer that I've seen so far.

Once the fourth largest element is identified, returning the top four elements follows immediately in linear time.
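The approach above can be sketched in Python. This is a minimal, non-in-place version (it copies sublists, trading space for clarity), and the function names are my own:

```python
def median_of_medians(xs):
    """Approximate median: median of the medians of groups of five."""
    if len(xs) <= 5:
        return sorted(xs)[len(xs) // 2]
    groups = [xs[i:i + 5] for i in range(0, len(xs), 5)]
    medians = [sorted(g)[len(g) // 2] for g in groups]
    return median_of_medians(medians)

def select(xs, k):
    """k-th smallest element of xs (0-indexed), deterministic O(N)."""
    pivot = median_of_medians(xs)  # guarantees a balanced-enough partition
    lo = [x for x in xs if x < pivot]
    eq = [x for x in xs if x == pivot]
    if k < len(lo):
        return select(lo, k)
    if k < len(lo) + len(eq):
        return pivot
    hi = [x for x in xs if x > pivot]
    return select(hi, k - len(lo) - len(eq))

def top_four(xs):
    """The four largest elements of xs, in descending order, in O(N) time."""
    if len(xs) <= 4:
        return sorted(xs, reverse=True)
    fourth = select(xs, len(xs) - 4)  # the fourth-largest value
    top = sorted((x for x in xs if x > fourth), reverse=True)
    return top + [fourth] * (4 - len(top))  # pad in case of duplicate values

print(top_four([3, 1, 4, 1, 5, 9, 2, 6]))  # -> [9, 6, 5, 4]
```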

## Tuesday, January 7, 2014

### Beauty and Learning

I've just completed Peter Norvig's class on Udacity, the Design of Computer Programs.

The course surprised me more than any other I've started or completed so far, primarily due to the artfulness of Norvig's insights and solutions. We often think of computer science (and science in general) as highly logical and structured, in contrast to the more creative, fluid nature of the arts. But programming languages, like natural languages, can produce an infinite set of utterances, and some are bound to be better than others.

What is "better"? The quality of computer programs can, of course, be measured by objective measures like correctness, running time, and space efficiency. More often than not, programs that satisfy all three are optimized for simplicity (thought not always readability). Simplicity is a good measure. When people talk about quality, though, they also refer to related but distinct things like elegance and beauty. It's even in written into the Python language:

>>> import thisWhen I think of what I've learned in the class, it's hard to point to specific things like data structures or algorithms. But the class has enhanced my appreciation for the depth of knowledge and skill that underlies good programming. There are murmurings of dissatisfaction in the Udacity forums and in various other reviews, primarily around the perceived difficulty and ambiguity of some of the course materials. Maybe the frustration is only natural; unlike most textbook material, Norvig's solutions are downright unexpected and beautiful, a result of what I can only image is tens of thousands of hours of experience and critical thinking. Just the sort of learning he advocates in Teach Yourself Programming in Ten Years.

The Zen of Python, by Tim Peters

Beautiful is better than ugly.

[...]
