Both Flann and Keysight use a thickness of 9.63 mm for the quarter-wave shim in their WR90 cal kits. Yet when I do the calculation I find the phase shift of such a shim is 56.96 degrees at 8.2 GHz, rising to 121.74 degrees at 12.4 GHz, so the worst-case departure from the optimal 90 degrees is 33.04 degrees.
If instead a thickness of 9.700 mm is used, the phase shift is 57.37 degrees at 8.2 GHz and 122.63 degrees at 12.4 GHz, so the maximum departure from the optimal 90 degrees is only 32.63 degrees.
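For anyone who wants to check my arithmetic, here is a short Python sketch of the TE10 guide-wavelength calculation I used. The 22.86 mm broad-wall dimension is the WR90 standard; the relative permittivity of air of about 1.0006 is my assumption for standard conditions.

```python
import math

C0 = 299_792_458.0     # speed of light in vacuum, m/s
A_WR90 = 22.86e-3      # WR90 broad-wall dimension, m (standard value)
EPS_AIR = 1.0006       # approximate relative permittivity of air (assumption)

def guide_wavelength(f_hz, a=A_WR90, eps_r=EPS_AIR):
    """TE10 guide wavelength in rectangular waveguide at frequency f_hz."""
    lam = C0 / (f_hz * math.sqrt(eps_r))   # wavelength in the filling medium
    lam_c = 2.0 * a                        # TE10 cutoff wavelength
    return lam / math.sqrt(1.0 - (lam / lam_c) ** 2)

def shim_phase_deg(t_m, f_hz):
    """Phase shift in degrees through a shim of thickness t_m."""
    return 360.0 * t_m / guide_wavelength(f_hz)

for t in (9.63e-3, 9.70e-3):
    p_lo = shim_phase_deg(t, 8.2e9)
    p_hi = shim_phase_deg(t, 12.4e9)
    print(f"t = {t*1e3:.2f} mm: {p_lo:.2f} deg at 8.2 GHz, "
          f"{p_hi:.2f} deg at 12.4 GHz, "
          f"max departure {max(90 - p_lo, p_hi - 90):.2f} deg")
```

The phase angles this prints agree with the figures above to within a few hundredths of a degree (the small residual depends on exactly what value you take for the permittivity of air).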
I know the difference is pretty insignificant, but I'm puzzled why this should be so. Someone at HP must have sat down and worked out the optimal thickness. Maybe it was in the days of slide rules, before modern computers existed.
I have taken into account the permittivity of air and the wavelength in waveguide, but ignored the waveguide losses. Ignoring the permittivity of air cannot account for the difference either. I thought perhaps the original designer was a radio ham and chose the thickness to give a 90 degree delay at the SSB calling frequency of 10.368 GHz, but that's not so!
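Where my 9.700 mm figure comes from: if the design goal is to balance the departures at the two band edges, we want 90 - phase(8.2 GHz) = phase(12.4 GHz) - 90, i.e. the two phases should sum to 180 degrees. Since the phase is proportional to thickness, that gives a closed-form answer. A self-contained sketch (same assumed WR90 dimension and air permittivity as before):

```python
import math

C0 = 299_792_458.0     # speed of light in vacuum, m/s
A_WR90 = 22.86e-3      # WR90 broad-wall dimension, m (standard value)
EPS_AIR = 1.0006       # approximate relative permittivity of air (assumption)

def guide_wavelength(f_hz, a=A_WR90, eps_r=EPS_AIR):
    """TE10 guide wavelength in rectangular waveguide at frequency f_hz."""
    lam = C0 / (f_hz * math.sqrt(eps_r))
    return lam / math.sqrt(1.0 - (lam / (2.0 * a)) ** 2)

# Balance the band-edge departures: phase(8.2) + phase(12.4) = 180.
# With phase = 360 * t / lambda_g this gives
#   t * (1/lg1 + 1/lg2) = 1/2   =>   t = lg1 * lg2 / (2 * (lg1 + lg2))
lg1 = guide_wavelength(8.2e9)
lg2 = guide_wavelength(12.4e9)
t_balanced = lg1 * lg2 / (2.0 * (lg1 + lg2))
print(f"balanced-shim thickness = {t_balanced*1e3:.3f} mm")   # ~9.70 mm
```

So whatever criterion HP actually used, the simple "equal departure at both band edges" criterion lands on about 9.70 mm, not 9.63 mm.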