Code that isn't tested == code that is broken.
— w00t Dude (@w00tDude) February 1, 2011
Corollary. Multi-threaded code that is tested != code that works.
— w00t Dude (@w00tDude) February 1, 2011
Just in case the Twitter links stop working:
Code that isn't tested == code that is broken.
Corollary. Multi-threaded code that is tested != code that works.
Wayne Wooten
Ah, the joys of parallel programming.
I did my undergraduate degree at Manchester University in the 1980s. In my “Parallel Computing” class we had a lecture about the Manchester Dataflow Machine – a parallel computing architecture that was going to take over the world.
I did my master’s degree at the University of Washington in the 2000s. In my “Parallel Computing” class we had a lecture about the Manchester Dataflow Machine, and why it didn’t take over the world.*
In those 20 years it seems like we came up with a lot of ideas for parallel computing, without ever hitting on the single idea that ties everything together. For serial computing we have the von Neumann architecture, and even though we now have caches and branch prediction and pipelining and more, we can still regard the basic machine as a von Neumann machine.
The most practical approach I have seen is the use of parallel programming patterns. Depending on who you talk to, and how they group the patterns, there are somewhere between 13 and 20 of these. I am partial to Dr. Michael McCool’s explanation of the patterns:
Parallel Programming Talk 82 – Michael McCool
Structured Parallel Programming with Deterministic Patterns
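To make the idea concrete, here is a minimal sketch of one of the best-known patterns, the "map" pattern: apply an independent function to every element of a collection, letting a pool of workers run the applications in parallel. This is an illustrative example of the general pattern, not code from McCool's materials; the function and worker count are made up for the sketch.

```python
# Minimal sketch of the "map" parallel pattern: each element is transformed
# independently, with no shared mutable state, so the result is deterministic
# no matter how the scheduler interleaves the workers.
from concurrent.futures import ThreadPoolExecutor

def transform(x):
    # Purely per-element work; any side-effect-free function fits the pattern.
    return x * x

def parallel_map(fn, items, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Executor.map preserves input order in its results.
        return list(pool.map(fn, items))

if __name__ == "__main__":
    print(parallel_map(transform, range(8)))  # → [0, 1, 4, 9, 16, 25, 36, 49]
```

Because the pattern forbids shared mutable state, it sidesteps exactly the class of bugs the tweets above joke about: a correct map is correct under every interleaving.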
* I am being cruel for comic effect. It isn’t necessary for a machine (particularly not a research machine) to take over the world in order to be groundbreaking, useful and educational. The dataflow model is still a very relevant one – we used a dataflow architecture for hooking together graphs of image processing algorithms on the Adobe Image Foundation project. Search for “Photoshop oil paint filter” for examples produced by a dataflow graph of 13 image processing kernels.
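The dataflow model in miniature: nodes are kernels, edges are data dependencies, and a node runs as soon as its inputs are ready, so the schedule falls out of the graph rather than program order. This is a toy sketch of the general idea, not Adobe Image Foundation's actual API; the graph shape and the stand-in "kernels" are invented for illustration.

```python
# Toy dataflow evaluator: graph maps node name -> (kernel, [input node names]).
# Each node is computed once, after its dependencies, mimicking how a dataflow
# runtime fires kernels when their inputs become available.
def run_graph(graph, inputs):
    results = dict(inputs)  # seed with source values

    def evaluate(name):
        if name not in results:
            fn, deps = graph[name]
            results[name] = fn(*(evaluate(d) for d in deps))
        return results[name]

    return {name: evaluate(name) for name in graph}

# Hypothetical kernels standing in for real image-processing operations.
graph = {
    "blur":    (lambda img: img + 1,  ["source"]),
    "edges":   (lambda img: img * 2,  ["source"]),
    "combine": (lambda a, b: a + b,   ["blur", "edges"]),
}
print(run_graph(graph, {"source": 10})["combine"])  # → 31
```

In a real dataflow runtime, independent nodes like "blur" and "edges" could execute in parallel, since neither depends on the other's output.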