I used to take a very ambitious approach to computer algorithms that create structure. If I wanted to, for example, construct a level surface from volumetric data, I would carefully build a coherent structure as the algorithm ran, maintaining order at every step.
This would generally require some pretty gnarly code, with lots of dynamic data pointers and intricate logic in the inner loop. In a way it was all quite beautiful, but also fragile and difficult to maintain. A single ill-considered change to the code could bring the whole delicate structure crashing down.
I notice that these days I take a different approach, which might be called mess/cleanup. I run a first pass without worrying too much about structure. In that pass I record just enough info along the way so that I can do the rest in a second pass.
That second pass is where I build all the actual structure, using the info I’d recorded earlier. The result ends up being the same, but the process is very different.
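As a minimal sketch of what this two-pass pattern can look like (not my actual level-surface code; all names here are hypothetical): the first pass just dumps out a messy "triangle soup" with duplicated vertices, recording no structure at all, and the second pass builds the coherent indexed mesh from what was recorded.

```python
def messy_pass(cells):
    """Pass 1: record just enough info -- raw triangles from each
    cell, with no attempt to share vertices or maintain order."""
    soup = []
    for cell_triangles in cells:
        # stand-in for, e.g., extracting surface pieces per voxel
        soup.extend(cell_triangles)
    return soup

def cleanup_pass(soup):
    """Pass 2: build the actual structure -- an indexed mesh in
    which each distinct vertex is stored exactly once."""
    index_of = {}   # vertex -> index in the vertices list
    vertices = []
    triangles = []
    for tri in soup:
        ids = []
        for v in tri:
            if v not in index_of:
                index_of[v] = len(vertices)
                vertices.append(v)
            ids.append(index_of[v])
        triangles.append(tuple(ids))
    return vertices, triangles

# Two cells whose triangles happen to share an edge:
cells = [
    [((0, 0), (1, 0), (0, 1))],
    [((1, 0), (1, 1), (0, 1))],
]
vertices, triangles = cleanup_pass(messy_pass(cells))
print(len(vertices), triangles)
```

The inner loop of the first pass stays trivial because it never has to ask "have I seen this vertex before?"; all of that bookkeeping is deferred to the second pass, where it becomes a simple dictionary lookup.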
Doing it this way, I find that my code is much smaller and more compact, and a lot easier to read. There are fewer errors, and debugging is a piece of cake.
So it seems that a little messiness is a useful thing when you’re writing tricky algorithms. Not too much of a mess, but just enough.
map/reduce?