Silly Rabbit - absolutely accurate times are a security problem. CPU designers (even way back in Alpha @ DEC) intentionally introduced clock jitter, just to prevent total predictability. For x86, I think if you performed 3-4 back-to-back timer reads, saved the values into registers, and only reported them upon completion, you would find that the time deltas are NOT exactly the same.
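Something along these lines, as a rough sketch of the experiment I mean (assuming GCC/Clang on x86 with the __rdtsc() intrinsic; serializing instructions like CPUID/RDTSCP are left out for brevity):

```c
/* Read the TSC a few times back to back, keep the raw values in locals
 * (which the compiler will typically hold in registers), and only print
 * the deltas afterwards so the printing doesn't perturb the measurement. */
#include <stdio.h>
#include <stdint.h>
#include <x86intrin.h>

int main(void) {
    uint64_t t0 = __rdtsc();
    uint64_t t1 = __rdtsc();
    uint64_t t2 = __rdtsc();
    uint64_t t3 = __rdtsc();

    /* If the clock were perfectly predictable, these deltas would all be
     * identical; in practice they usually aren't. */
    printf("deltas: %llu %llu %llu\n",
           (unsigned long long)(t1 - t0),
           (unsigned long long)(t2 - t1),
           (unsigned long long)(t3 - t2));
    return 0;
}
```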
Do you have any sources for this? My googling skills are failing me. I'm surprised early x86 designers (whom I assume you're including) were aware of security issues with accurate clocks; I certainly wasn't until this millennium :D I would rather guess that observed clock jitter is explained by interrupts or some such. Not saying you're wrong, I'd just like to learn more.