Sunday, June 21, 2020

What goes on in a computer in a millisecond?

In a 2001 Q&A session (around 1h30m), Knuth is asked about unsolved challenges in the field of computer science, à la Hilbert's problems. Knuth responds by referring to a prompt he posed to the World Computer Congress in 1989, in a keynote titled "Theory and Practice, IV" (arXiv has a version, which appears to include the slides as well):

"Make a thorough analysis of everything your computer does during one second of computation"

Does the prompt require adjustment for context, some thirty years later? Knuth noted the practical challenges of monitoring and interpreting the computer's activity, and that difficulty would appear to remain on the same order of magnitude today. On the other hand, Knuth associated 1 second with approximately 250,000 instructions. Given that CPU clock speeds have jumped from MHz to GHz (Wikipedia notes clock speeds in 1989 were in the 16-35 MHz range, in line with Knuth's approximation), maybe today's equivalent of the question is:

"What goes on in a computer in a millisecond"?

In the same keynote, Knuth provides a set of guiding questions, including whether the observed activities are correct and how they correspond to theory -- that is, whether they reflect existing theory, and whether new theory would yield practical improvements.

Despite the daunting scope of the problem, the fundamental idea of mapping theory onto atomic observations of computing practice seems widely applicable and useful for students and practitioners alike.