I wish they were in universal use. I'm constantly cleaning up a colleague's code where he's made many (now incorrect) assumptions about what "int", "short", and "long" mean. Even pointing him to the standard didn't help; I just got back his stock answer, "It works on X, so why do we need to worry?" To which I reply, "Because we've changed architectures and compilers since then, and now your code is frequently broken." You'd think he'd learn after a while, but some people are too stubborn.
I started when sizeof(int) = 2, sizeof(long) = 4. I've seen int go from smaller than a long to equal to a long (both 4 bytes), and then back to not equal (sizeof(long) = 8).
And if that colleague won't learn, that sounds like an issue in need of a bit of active management...
Unfortunately he's got the ear of the management. He's a good guy, and actually a good programmer. He's one of those folks who can churn out 1000 lines of (mostly) functioning code in a day. It's not well written, but it does the job, and that's what management wants to see (bad management, but hard to fight). But he's lousy about learning new things and is stuck in his ways.
Once an idea enters his head it's hard to dislodge it. He learned programming at a mediocre CS school (despite the expensive price tag, which it deserves for its medical and law schools, not its engineering disciplines). And when something works, it's hard to convince him that it's wrong (that is, correct only by coincidence) or unstable (in this case, sizeof(int) and related issues). He isn't "technically" (one of his favorite words) wrong in many instances, but problems will arise when we change OS, compiler, or CPU.
I'm slowly converting him to seeing things my way. My main contribution has been to mentor the junior developers. Things will change as their contributions worm their way into our projects.