A past girlfriend and I would occasionally (cheerfully) quibble over the optimal strategy for extracting toothpaste. It occurred to me recently that the disagreement was fundamentally about amortized vs. worst-case complexity.
Being lazy, I tend to squeeze the toothpaste out of the front of the tube, optimizing the time spent in the moment and reducing the degree of control required, since pressure is exerted near the toothbrush. She would carefully squeeze the tube from the back, maintaining a flat region that would slowly grow as the toothpaste emptied. The main advantage of her strategy is that toothpaste is always at hand, and every iteration is fast. In contrast, squeezing from the front pushes toothpaste both out of the tube and backwards towards the other end. Occasionally, this must be fixed by rolling the back of the tube forwards. Each fix-up step takes much longer than a normal toothpaste extraction, but the average time spent might be lower since a normal squeeze is faster. We never did the experiment, so I'm not sure which strategy actually wins on average.
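The trade-off can be made concrete with a toy cost model. All the numbers below are invented for illustration: suppose a lazy front-squeeze costs 1 unit, a careful back-squeeze costs 2.5, and every 8th front-squeeze triggers a 10-unit fix-up.

```python
def front_squeeze_costs(n_squeezes, squeeze_cost=1.0, fixup_cost=10.0, fixup_every=8):
    """Front strategy: cheap squeezes, plus an expensive fix-up every few squeezes."""
    costs = []
    for i in range(1, n_squeezes + 1):
        cost = squeeze_cost
        if i % fixup_every == 0:
            cost += fixup_cost  # occasionally roll the tube forward
        costs.append(cost)
    return costs

def back_squeeze_costs(n_squeezes, squeeze_cost=2.5):
    """Back strategy: every squeeze is slower but uniform -- no fix-ups ever."""
    return [squeeze_cost] * n_squeezes

front = front_squeeze_costs(80)
back = back_squeeze_costs(80)

print(sum(front) / len(front), max(front))  # amortized 2.25 per squeeze, worst case 11.0
print(sum(back) / len(back), max(back))     # steady 2.5 per squeeze, worst case 2.5
```

Under these made-up constants the front strategy wins on average (2.25 vs. 2.5 per squeeze) while the back strategy wins in the worst case (2.5 vs. 11.0), which is exactly the amortized vs. worst-case tension; with different constants the average could go the other way, which is why the experiment would have mattered.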
The real problem, however, arises with heterogeneous parallel computation: if two people squeeze a toothpaste tube from the front, their thresholds for when to perform a fix-up step will almost certainly differ (I had significantly more finger strength). The result is unfair: the person with the lower threshold will end up doing most of the work. Relatedly, in a parallel context the time lost during a fix-up step can be amplified, since more than one agent can stall waiting for a fix-up on a shared resource to complete. Incidentally, in contrast to the computational case, for toothpaste it might be even worse if the fix-up step falls to the second person.
Some of these points definitely came up, but I don’t remember the details. The important thing is that I immediately caved. :)