There’s something qualitatively important that happens when the error on T⟨A⟩ (aka ϵ) is smaller than the amount of time it would take event A to cause anything else to happen (e.g. smaller than one network latency): we can then be sure that any event timestamped before T⟨A⟩ cannot have been caused by A. This is a rather magical property.
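
A minimal sketch of that reasoning, assuming a hypothetical `Timestamp` type that carries a reported time together with its error bound ϵ (the names `Timestamp`, `definitely_before`, and `cannot_be_caused_by` are illustrative, not from any particular library):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Timestamp:
    """A clock reading whose true time lies somewhere in [t - eps, t + eps]."""
    t: float    # reported time, in seconds
    eps: float  # uncertainty bound (epsilon), in seconds

    @property
    def earliest(self) -> float:
        return self.t - self.eps

    @property
    def latest(self) -> float:
        return self.t + self.eps


def definitely_before(a: Timestamp, b: Timestamp) -> bool:
    """True only if a's uncertainty interval ends before b's begins,
    so a happened before b regardless of clock error."""
    return a.latest < b.earliest


def cannot_be_caused_by(b: Timestamp, a: Timestamp, min_propagation: float) -> bool:
    """True if, even in the worst case for both error bounds, b happened
    before any effect of a could have propagated (min_propagation is the
    minimum time for A to cause anything, e.g. one network latency)."""
    return b.latest < a.earliest + min_propagation
```

With this framing, an event B timestamped before T⟨A⟩ is provably not caused by A whenever the combined uncertainty of the two readings is smaller than the propagation delay; if ϵ is not bounded, neither check can ever return a trustworthy answer.
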

I’m suspicious of any distributed system design that uses time without talking about the range of errors on the time estimate (i.e. any design that assumes ϵ == 0, or even one that admits ϵ ≥ 0 but never bounds it).

www.joshbeckman.org/notes/633072574