Could someone please point me to a definition of how the jitter field in ntpq's output is calculated on Linux?
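
For what it's worth, my current understanding, from reading the clock_filter routine described in RFC 5905, is that the per-peer jitter is the RMS of the differences between the selected (lowest-delay) sample's offset and the offsets of the other samples in the eight-entry clock filter, clamped below by the system precision. A rough Python sketch of that reading (the sample values are made up, and this is my assumption about the definition, not a confirmed one):

```python
import math

def peer_jitter(offsets, precision=1e-6):
    """Sketch of RFC 5905's peer jitter: offsets[0] is the offset of
    the selected (lowest-delay) filter sample, the rest are the other
    samples' offsets. All values are in seconds."""
    if len(offsets) < 2:
        return precision
    ss = sum((offsets[0] - o) ** 2 for o in offsets[1:])
    # RMS of the differences, clamped below by the system precision.
    return max(math.sqrt(ss / (len(offsets) - 1)), precision)

# Eight made-up filter offsets in seconds, bouncing around zero.
samples = [0.0002, -0.0101, 0.0123, -0.0087, 0.0140, -0.0095, 0.0110, -0.0130]
print(peer_jitter(samples) * 1000, "ms")  # ~11 ms for these samples
```

If that reading is right, it would explain why a clock that is on average within a millisecond can still show double-digit jitter: the figure reflects sample-to-sample spread, not the mean offset. But I'd like to confirm it against an authoritative definition.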

We are monitoring our Linux servers. After much work they are ticking along nicely with sub-millisecond offsets, but the jitter figure stays stubbornly around 14 ms. We do get occasional short periods of higher offsets, but there appears to be no correlation between the delay, the offset, and the jitter.
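
For context, the monitoring itself is nothing exotic; it just scrapes the system peer line from ntpq, roughly like this sketch (which assumes the default `ntpq -pn` billboard layout, where delay, offset, and jitter are the last three columns, in milliseconds, and the selected peer is flagged with `*`):

```python
import subprocess

def syspeer_stats():
    """Return (delay, offset, jitter) in ms for the selected peer,
    or None if no peer is currently selected."""
    out = subprocess.run(["ntpq", "-pn"], capture_output=True, text=True).stdout
    for line in out.splitlines():
        if line.startswith("*"):  # '*' marks the system peer
            delay, offset, jitter = (float(f) for f in line.split()[-3:])
            return delay, offset, jitter
    return None

print(syspeer_stats())
```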

Have there been any known problems with the jitter calculation?

Any insight gratefully received.